ER&L 2012: Working For You — NISO Standards and Best Practices

[Photo: wood, water, and stone in the courtyard of the conference hotel]

Speaker: Nettie Lagace

NISO is more than just the people at the office — it’s the community that comes together to address the problems that we have, and they’re always soliciting new work items. What are your pain points? Who needs to be involved? How does it relate to existing efforts?

A NISO standard is more formal and goes through a rigorous process to be adopted. A NISO recommended practice is more edgy and discretionary. There are a lot of standards and best practices that have not been adopted; whether something takes hold often depends on the feasibility of implementation.

Get involved if you care about standards and best practices!

Speakers: Jamene Brooks-Kieffer & John Law

They’re talking about the Open Discovery Initiative. Discovery systems have exploded, and there are now opportunities to come together to create some efficiencies through standards. The working group includes librarians, publishers, and discovery service providers.

The project has three main goals: identify stakeholder needs & requirements, create recommendations and tools to streamline the process, and provide effective means of assessment. Deliverables include a standard vocabulary, which the group has found they need just to communicate, and a NISO recommended practice. They plan to have the initial draft by January 2013 and finish the final draft by May 2013.

There will be an open teleconference on Monday.

Speaker: Oliver Pesch

SUSHI was one of the first standards initiatives to use a more agile development process rather than the traditional cycle of review and revision every 5-7 years. The downside of a fixed standard is that you have to think of every possible outcome up front, because you may not get a chance to address it again, and with electronic standards you have to be able to move quickly. So, they focused on the core problem and allowed the standard to evolve through ongoing maintenance.

SUSHI support is now a requirement for COUNTER compliance, and it has been adopted by approximately 40 content providers. The MISO client is a free tool you can download and use to harvest SUSHI reports.
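
If you’re curious what a tool like the MISO client is actually doing when it harvests usage data, here’s a rough sketch of a SUSHI report request in Python. SUSHI is a SOAP web service; the endpoint URL, requestor and customer IDs, and report details below are placeholders, and the element and namespace names are my best approximation of the SUSHI/COUNTER schemas, so treat this as an illustration and check your provider’s documentation rather than copying it verbatim.

```python
# Rough sketch only: a SUSHI GetReport call is an XML envelope POSTed to the
# provider's SUSHI endpoint. Endpoint, IDs, dates, and report name below are
# placeholders; element/namespace names approximate the SUSHI and COUNTER-SUSHI
# schemas and should be checked against the spec and your provider's docs.
import requests

SUSHI_ENDPOINT = "https://sushi.example.com/SushiService"  # hypothetical endpoint

envelope = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:sus="http://www.niso.org/schemas/sushi"
               xmlns:cou="http://www.niso.org/schemas/sushi/counter">
  <soap:Body>
    <cou:ReportRequest>
      <sus:Requestor><sus:ID>my-requestor-id</sus:ID></sus:Requestor>
      <sus:CustomerReference><sus:ID>my-customer-id</sus:ID></sus:CustomerReference>
      <sus:ReportDefinition Name="JR1" Release="4">
        <sus:Filters>
          <sus:UsageDateRange>
            <sus:Begin>2012-01-01</sus:Begin>
            <sus:End>2012-01-31</sus:End>
          </sus:UsageDateRange>
        </sus:Filters>
      </sus:ReportDefinition>
    </cou:ReportRequest>
  </soap:Body>
</soap:Envelope>"""

# Some providers also require a SOAPAction header; check their documentation.
response = requests.post(
    SUSHI_ENDPOINT,
    data=envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8"},
    timeout=60,
)
response.raise_for_status()

# The response is a COUNTER report wrapped in a SOAP envelope; save it for parsing.
with open("jr1-2012-01.xml", "wb") as f:
    f.write(response.content)
```

In practice you’d let the MISO client or your ERM system handle this, but seeing the shape of the request makes it clearer what “SUSHI support” actually asks of a content provider.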

Part of SUSHI’s success comes from being a part of NISO and the integration with COUNTER, but they’re not done. They’re doing a lot of work to promote interoperability, and they have published the SUSHI Server Test Mode recommended practice. They’re also preparing for release 4 of COUNTER and making adjustments as needed, and they’re publishing a COUNTER SUSHI Implementation Profile to standardize how the implementation is interpreted across providers.

Questions/Comments:
Major concern about content providers who also have web-scale discovery solutions that don’t share content with each other — will that ever change? That’s not in the scope of the ODI’s work, which is more about how the players work together in better ways, not about whether or not content providers participate.

Why is SUSHI not more commonly available? Some really big players aren’t there yet. How fast is this going to happen? A lot of work is in development but not live yet. What drives development is us. [Except we keep asking for it, and nothing happens.] If you allow it to be okay for it not to be there, then it won’t be.

JISC created a template letter for member libraries to send to publishers, and that is making a difference.

ER&L 2010: Beyond Log-ons and Downloads – meaningful measures of e-resource use

Speaker: Rachel A. Fleming-May

What is “use”? Is it an event? Something that can be measured (with numbers)? Why does it matter?

We spend a lot of money on these resources, and use is frequently treated as an objective measure for evaluating the value of the resource. But we don’t really understand what use is.

A primitive concept is something that can’t be boiled down to anything smaller – we just know what it is. Use is frequently treated like a primitive concept – we know it when we see it. To measure use we focus on inputs and outputs, but what do those really say about the nature/value of the library?

This gets more complicated with electronic resources that can be accessed remotely. Patrons often don’t understand that they are using library resources when they use them. “I don’t use the library anymore, I get most of what I need from JSTOR.” D’oh.

Funding is based on assessments and outcomes – how do we show that? The money we spend on electronic resources is not going to get any smaller. ROI studies tend to focus on funded research, not on electronic resources as a whole.

Use is not a primitive concept. When we talk about use, it can be an abstract concept that covers all use of library resources (physical and virtual). Our research often doesn’t specify what we are measuring as use.

Use as a process is the total experience of using the library, from asking reference questions to finding a quiet place to work to accessing resources from home. It is the application of library resources/materials to complete a complex/multi-stage process. We can do observational studies of the physical space, but it’s hard to do them for virtual resources.

Most of our research tends to focus on use as a transaction – things that can be recorded and quantified, but are removed from the user. When we look only at the transaction data, we don’t know anything about why the user viewed/downloaded/searched the resource. Because they are easy to quantify, we over-rely on vendor-supplied usage statistics. We think that COUNTER assures some consistency in measures, but there are still many grey areas (e.g. a database time-out followed by a re-connect is counted as an additional session, inflating the numbers).

We need to shift from focusing on isolated instances of downloads and reference desk questions to focusing on the aggregate of the process from the user’s perspective. Stats are only one component of this. This is where public services and technical services need to work together to gain a better understanding of the whole. This will require administrative support.

John Law’s study of undergraduate use of resources is a good example of how we need to approach this. Fleming-May thinks that the findings from that study have generated more progress than previous studies that were focused on more specific aspects of use.

How do we do all of this without invading the privacy of the user? Make sure that your studies are well thought-out and approved by your institution’s review board.

Transactional data needs to be combined with other information to make it valuable. We can see that a resource is being used or not used, but we need to look deeper to see why and what that means.
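
As a purely hypothetical illustration of what “combined with other information” can look like at the most basic level, the sketch below joins download counts with subscription costs to get cost-per-use. The file names and columns are invented for the example, and even this combined number still can’t tell you why a title was or wasn’t used; that’s the deeper question.

```python
# Hypothetical example: join COUNTER-style download counts with subscription costs
# to compute cost-per-use. File names and column headings are invented for this
# sketch; the result is still transactional and says nothing about why a title
# was (or wasn't) used.
import csv

# downloads.csv columns: title,downloads
with open("downloads.csv", newline="") as f:
    downloads = {row["title"]: int(row["downloads"]) for row in csv.DictReader(f)}

# costs.csv columns: title,annual_cost
with open("costs.csv", newline="") as f:
    costs = {row["title"]: float(row["annual_cost"]) for row in csv.DictReader(f)}

for title, cost in sorted(costs.items()):
    uses = downloads.get(title, 0)
    if uses:
        print(f"{title}: {uses} downloads, cost per use ${cost / uses:.2f}")
    else:
        print(f"{title}: no recorded use (annual cost ${cost:,.2f})")
```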

As a profession, are we prepared to do the kind of analysis we need to do? Some places are using anthropologists for this. A few LIS programs are requiring a research methods course, but it’s only one class and many don’t get it. This is a great continuing education opportunity for LIS programs.

CiL 2008: What Do Users Really Do in Their Native Habitat?

Speakers: Pascal Lupien and Randy Oldham

Unsubstantiated assumptions about Millennials cause libraries to make poor choices in providing services and resources. Lupien and Oldham spent some time studying how students actually use the tools we think they use. They used typical survey and focus group methodologies, which make for rather boring presentation recaps, so I won’t mention them.

The study found that only 9% of students used PDAs, and those who did tended to be older students. 69% of students had cell phones, but only 17% of them had ever used them to browse the Internet. 93% of students have used a chat client, and most have used one for academic purposes several times per week. 50% of users had never used online social network applications for academic group work.

The focus groups found that students preferred email over online social networks for group work. Students are more willing to share the results of their work with friends than with other classmates.

42% of students had never played online games, and men were three times more likely to play than women. Only 4.1% were involved with online virtual worlds like World of Warcraft and Second Life.

The survey respondents indicated they were more likely to go to the library’s website first rather than Google. The focus groups also confirmed this, in addition to indicating that the library had the best sources of information despite being the most difficult to manage.

Students are reluctant to mix personal and academic computing. The uptake on online social networks for academic use has been slow, but will likely increase, and we have to ask, “is this the best use of our resources and time?” Our priorities need to be more on improving the services we already offer, such as our websites and search tools. “Rather than looking at technologies & trying to find a use for them in our environment, we should determine what our students need & seek solutions to meet those needs.”


Speaker: John Law

ProQuest conducted a study of seven universities across North America and the United Kingdom, involving 60 students. As with Lupien and Oldham’s study, they conducted it anonymously. Observations were conducted in a variety of locations, from the library to dorm rooms, and they used software similar to web conferencing tools to capture the remote sessions.

Law gave an anecdote of a fourth year student who did all the things librarians want students to do when doing research, and when he was asked why, the student gave all the right answers. Then, when he was asked how long he had been doing his research that way, he indicated something like six weeks, after a librarian had come to his class to teach them about using the library’s resources. Library instruction works.

Course instructors are also influential. “My English instructor told me to use JSTOR.”

Brand recognition is fine, but it doesn’t necessarily affect the likelihood that resources will be used more or less.

Students use abstracts to identify relevant articles, even when the full text is available. They’re comfortable navigating several different search engines, but less so at locating relevant resources on library websites. Users don’t always understand what the search box is searching (books, articles, etc.), which can be discouraging. The A-Z databases page is too unwieldy for most users, particularly when they’re starting their research.

Students are using Google for their research, but mainly for handy look-ups and not as a primary research tool. Those who do use Google as a primary research tool do so because they aren’t as concerned with quality, are insufficiently aware of library e-resources, or have had bad experiences with library e-resources.

Librarians, students use Google and Wikipedia the same way you do. (We know you all use those tools, so don’t even try to deny it.)

Students laughed at surveyors when asked how they use online social networks for academic purposes.
