NASIG 2013: Getting to the Core of the Matter — Competencies for New E-Resources Librarians


Speakers: Roën Janyk (Okanagan College) & Emma Lawson (Langara College)

Two new-ish librarians talk about applying their LIS training to the real world, and using the Core Competencies as a framework for identifying the gaps they encountered. They wanted to determine if the problem is training or if eresources/serials management is just really complicated.

Collection development, cataloging (both MARC and Dublin Core), records management, and digital management were covered in their classes, though they needed more on institutional repository management.

They did not cover licensing at all, so everything they learned was on the job, comparing different documents. They also learned that the things librarians look for in contracts are not what college administrators are concerned about. In addition, the details of budgeting and where that information should be stored were fuzzy, and it took some time to gather that in their jobs. And, as with many positions, if institutional memory (and logins) is not passed on, a lot of time will be spent recreating it. For LIS programs, they wish they had learned more about the details of use statistics and their application, as well as resource format types and the quirks that come with them.

They had classes about information technology design and bigger-picture topics, but not enough about the relationship between the library and IT or the kinds of information technology found in libraries now. Some courses focused on less relevant technology and the history of technology, and the higher-level courses had too steep a learning curve to attract LIS students.

For the core competency on research analysis and application, we need to be able to gather appropriate data and present the analysis to colleagues and superiors in a way that they can understand it. In applying this, they ran into questions about comparing eresources to print, deciding when to keep a low-use resource, and other common criteria for comparing collections besides cost/use. In addition, there needs to be more taught about managing a budget, determining when to make cancelation or format change decisions, alternatives to subscriptions, and communicating all of this outside of the library.

Effective communication touches on everything that we do. It requires that you frame situations from someone else’s viewpoint. You need to document everything and be able to clearly describe the situation in order to troubleshoot with vendors. Be sympathetic to the frustrations of users encountering the problems.

Staff supervision may range from teams with no managerial authority to staff who report to you. ER librarians have to be flexible and work within a variety of departmental/project frameworks, and even if they do have management authority, they will likely have to manage projects that involve staff from other departments/divisions/teams. They did not find that the library management course was very applicable. The project management class was much more useful. One main challenge is staff who have worked in the library for a long time, and change management or leadership training would be very valuable, as would conversations about working with unionized staff.

In the real world being aware of trends in the profession involves attending conferences, participating in webinars/online training, and keeping up with the literature. They didn’t actually see an ERMS while in school, nor did they work with any proprietary ILS. Most of us learn new things by talking to our colleagues at other institutions. MLS faculty need to keep up with the trends as well, and incorporate that into classes — this stuff changes rapidly.

They recommend that ILS and ERMS vendors collaborate with MLS programs so that students have some real-world applications they can take with them to their jobs. Keep courses current (what is actually being used in libraries) and constantly evaluate the curriculum, which is beyond what ALA requires for accreditation. More case studies and real-world experiences in applied courses. The collection development course was too focused on print collection analysis and did not cover electronic resources.

As a profession, we need more sessions at larger, general conferences that focus on electronic resources so that we’re not just in our bubble. More cross-training in the workplaces. MLS programs need to promote eresources as a career path, instead of just the traditional reference/cataloger/YA divides.

If we are learning it all on the job, then why are we required to get the degrees?

ER&L 2012: Knockdown/Dragout Webscale Discovery Service vs. Niche Databases — Data-Driven Evaluation Methods


Speaker: Anne Prestamo

You will not hear the magic rationale that will allow you to cancel all your A&I databases. The last three years of analysis at her institution have resulted in only two cancelations.

Background: she was a science librarian before becoming an administrator, and has a great appreciation for A&I searching.

Scenario: a subject-specific database with low use had been accessed on a per-search basis, but going forward it would be sole-sourced and subscription based. Given that, their cost per search was going to increase significantly. They wanted to know if Summon would provide a significant enough overlap to replace the database.

Arguments: it’s key to the discipline, specialized search functionality, unique indexing, etc. — but there’s no data to support how these unique features are being used. Subject searches in the catalog were only 5% of what was being done, and most of them came from staff computers. So, are our users actually using the controlled vocabularies of these specialized databases? Finally, librarians think they just need to promote these more, but sadly, that ship’s already sailed.

Beyond usage data, you can also look at overlap with your discovery service, and also identify unique titles. For those, you’ll need to consider local holdings, ILL data, impact factors, language, format, and publication history.

Once they did all of that, they found that 92% of the titles were indexed in their discovery service. The depth of the backfile may be an issue, depending on the subject area. Also, you may need to look at the level of indexing (cover to cover vs. selective). In the end, of the 8% of titles not included, they owned most in print and they were rather old. 15% of that 8% had impact factors, which may or may not be relevant, but it is something to consider. And most of the titles were non-English. They also found that there were no ILL requests for the non-owned unique titles, and less than half were scholarly and currently being published.
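The overlap analysis described above can be sketched in a few lines of set arithmetic. This is a minimal illustration, not the speaker's actual workflow; the title lists and holdings here are hypothetical.

```python
# Sketch of a discovery-overlap analysis: compare a niche database's title
# list against the discovery service's index, then profile the unique
# remainder against local print holdings. All data below is made up.

def overlap_report(db_titles, discovery_titles, print_holdings):
    indexed = db_titles & discovery_titles        # covered by discovery
    unique = db_titles - discovery_titles         # needs closer review
    return {
        "pct_indexed": round(100 * len(indexed) / len(db_titles), 1),
        "unique_titles": sorted(unique),
        "unique_owned_in_print": sorted(unique & print_holdings),
    }

db = {"Journal A", "Journal B", "Journal C", "Journal D"}
discovery = {"Journal A", "Journal B", "Journal C"}
print_held = {"Journal D"}

report = overlap_report(db, discovery, print_held)
print(report["pct_indexed"])  # 75.0
```

The unique titles are the ones that then need the manual checks the speaker lists: ILL history, impact factors, language, and publication status.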

Delicious is still tasty to me


I’ve been seeing many of my friends and peers jump ship and move their social/online bookmarks to other services (both free and paid) since the Yahoo leak about Delicious being in the sun-setting category of products. Given the volume of outcry over this, I was pretty confident that either Yahoo would change their minds or someone would buy Delicious or someone would replicate Delicious. So, I didn’t worry. I didn’t freak out. I haven’t even made a backup of my bookmarks, although I plan to do that soon just because it’s good to have backups of data.

Now the word is that Delicious will be sold, which is probably for the best. Yahoo certainly didn’t do much with it after they acquired it some years ago. But, honestly, I’m pretty happy with the features Delicious has now, so really don’t care that it hasn’t changed much. However, I do want it to go to someone who will take care of it and continue to provide it to users, whether it remains free or becomes a paid service.

I looked at the other bookmark services out there, and in particular those recommended by Lifehacker. Frankly, I was unimpressed. I’m not going to pay for a service that isn’t as good as Delicious, and I’m not going to use a bookmarking service that isn’t integrated into my browser. I didn’t have much use for Delicious until the Firefox extension, and now it’s so easy to bookmark and tag things on the fly that I use it quite frequently as a universal capture tool for websites and gift/diy ideas.

The technorati are a fickle bunch. I get that. But I can’t help feeling disappointed in how quickly they jumped ship and stayed on the raft even when it became clear that it was just a leaky faucet and not a hole in the hull.

IL 2010: Adding Value – CIO Insights

speakers: Mike Ridley, Donna Scheeder, & Jim Peterson (moderated by Jane Dysart)

Ridley sees his job as leveraging information and economics to move the institution forward. Scheeder combines information management and technology to support their users. Peterson is from a small, rural library system where he manages all of the IT needs. (regarding his director: “I’m the geek, she’s the wallet.”)


Ridley

“I just want to remind you that if you think my comments are a load of crap, that’s a good thing.” Mike Ridley, referencing yesterday’s keynote about the hidden treasure of bat guano in libraries.

Information professionals have ways of thinking about how we do what we do, but our user populations have different perspectives. The tribal identities can be challenging when it comes to communicating effectively.

The information age is over. We’ve done that. But we’re still hanging on to it, even though everyone is in the information business. We need to leave that metaphor behind.

This is the age of imagination. What can we do differently? How will we change the rules to make a better world?

Open organizations are the way to go. Command and control organizations won’t get us to where we need to be in this age of imagination. We need to be able to fail. We are completely ignorant of how this will play out, and that opens doors of possibilities that wouldn’t otherwise be there.


Scheeder

It’s challenging to balance the resource needs of diverse user groups. You can add value to information by deeply understanding your users, your resources, and the level of risk that is acceptable.

There’s a big movement towards teleworking in the government. This can change your culture and the way you deliver services. Also, the proliferation of mobile devices among the users creates challenges in delivering content to them.

There’s a constant push and pull among the disciplines to get what they want.

Finally, security requirements make outside collaboration difficult. They want to be open, but they also have to protect the assets they were entrusted with.


Peterson

We all have computers, servers, and patrons, so under the hood we’re all the same.

IT’s ability to cut power-consumption costs can really help you out. Technology upgrades will increase productivity and decrease energy costs. In general, if it’s generating heat, it’s wasting electricity. Open source software can save on those costs, particularly if you have tech support that can manage it.

IT is more than just the geek you call when you have a tech problem. We’re here to help you save money.

Dysart’s questions

What’s the future of libraries?

Scheeder: The screen is the library now, so the question is where do we want the library. The library should be where people have their “dwell time.”

Ridley: The internet is going to get so big that it will disappear as a separate entity. Libraries will be everywhere, no matter what you’re doing. The danger is that libraries may disappear, so we need to think about value in that sphere.

Peterson: Libraries of the future are going to be most valuable as efficient information providers.


Tips for financing resources?

Peterson: Show a solid business model for the things you need.

Scheeder: Figure out how the thing you want to do aligns with the greater good of the organization. Identify how the user experience will improve. Think like the decision-makers and identify the economic reality of the organization.

Ridley: Prefers “participant” to “user”. Make yourself visible to everyone in your organization. Bridge the gap between tribes.

Anything else?

Peterson: If we don’t talk to our legislators then we won’t have a voice and they won’t know our needs.

Scheeder: Information professionals have the opportunity to optimize content to be findable by search engines, create taxonomies, and manage the digital lifecycle. We need to do better about preserving the digital content being created every moment.

Ridley: Go out and hire someone like Peterson. We need people who can understand technology and bridge the divide between IT and users.

CIL 2010: Conversations with the Archivist of the United States

Speakers: “Collector in Chief” David Ferriero interviewed by Paul Holdengräber

Many people don’t know what the archivist does. They often think that the National Archives are a part of the Library of Congress. In fact, the agency is separate.

Ferriero is the highest-ranking librarian in the administration; the position usually goes to a historian or someone with connections to the administration. He was surprised to get the appointment, having expected to head the IMLS instead.

He is working to create a community around the records and how they are being used. His blog talks about creating citizen archivists. In addition, he is working to declassify 100 million documents a year. There is an enormous backlog of these documents going back to WWII. Each record must be reviewed by the agency that initially classified it, and there are 2,400 classification guides that are supposed to be reviewed every five years, but around 50% of them have not been.

You can’t have an open government if you don’t have good records. When records are created, they need to be ready to migrate formats as needed. There will be a meeting between the chief information officers and the record managers to talk about how to tackle this problem. These two groups have historically not communicated very well.

He’s also working to open up the archives to groups that we don’t often think of being archive users. There will be programs for grade school groups, and more than just tours.

Large digitization projects with commercial entities lock up content for periods of time, including national archives. He recognizes the value that commercial entities bring to the content, but he’s concerned about the access limitations. This may be a factor in what is decided when the contract with Ancestry.com is up.

“It’s nice having a boss down the street, but not, you know, in my face.” (on having not yet met President Obama)

Ferriero thinks we need to save smarter and preserve more digital content.

ER&L 2010: Usage Statistics for E-resources – is all that data meaningful?

Speaker: Sally R. Krash, vendor

Three options: do it yourself, gather and format to upload to a vendor’s collection database, or have the vendor gather the data and send a report (Harrassowitz e-Stats). Surprisingly, the second solution was actually more time-consuming than the first because the library’s data didn’t always match the vendor’s data. The third is the easiest because it’s coming from their subscription agent.

Evaluation: review cost data; set cut-off point ($50, $75, $100, ILL/DocDel costs, whatever); generate list of all resources that fall beyond that point; use that list to determine cancellations. For citation databases, they want to see upward trends in use, not necessarily cyclical spikes that average out year-to-year.
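The cut-off evaluation described above amounts to a cost-per-use filter. Here is a minimal sketch of that step; the resource names, prices, and threshold are hypothetical, and in practice you would also apply the trend checks the speaker mentions.

```python
# Sketch of a cost-per-use cut-off evaluation: flag any resource whose
# cost per use exceeds a chosen threshold (e.g. $50/$75/$100 or your
# ILL/DocDel cost). All figures below are made up.

resources = [
    {"title": "Database A", "cost": 5000, "uses": 250},
    {"title": "Database B", "cost": 3000, "uses": 20},
    {"title": "Journal C",  "cost": 400,  "uses": 2},
]

CUTOFF = 75  # dollars per use

def cancellation_candidates(resources, cutoff):
    flagged = []
    for r in resources:
        # Zero-use resources are flagged unconditionally.
        cpu = r["cost"] / r["uses"] if r["uses"] else float("inf")
        if cpu > cutoff:
            flagged.append((r["title"], round(cpu, 2)))
    return flagged

print(cancellation_candidates(resources, CUTOFF))
# [('Database B', 150.0), ('Journal C', 200.0)]
```

The flagged list is a starting point for review, not an automatic cancellation list, which matches how the speaker describes using it.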

Future: Need more turnaway reports from publishers, specifically journal publishers. COUNTER JR5 will give more detail about article requests by year of publication. COUNTER JR1 & BR1 combined report – don’t care about format, just want download data. Need to have download information for full-text subscriptions, not just searches/sessions.

Speaker: Benjamin Heet, librarian

He is speaking about the University of Notre Dame’s statistics philosophy. They collect JR1 full-text downloads – they’re not into database statistics, mostly because federated search messes them up. Impact factors and Eigenfactors are hard to evaluate. He asks, “can you make questionable numbers meaningful by adding even more questionable numbers?”

At first, he was downloading the spreadsheets monthly and making them available on the library website. He started looking for a better way, whether that was to pay someone else to build a tool or do it himself. He went with the DIY route because he wanted to make the numbers more meaningful.

Avoid junk in, junk out: HTML vs. PDF download counts depend on the platform setup. Pay attention to outliers to watch for spikes that might indicate unusual use by an individual. The reports often contain bad or duplicate data.
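The outlier check mentioned above can be sketched as a simple z-score screen on monthly downloads. This is one possible approach, not the speaker's tool, and the numbers are hypothetical.

```python
# Sketch of an outlier screen: flag months whose download count sits far
# above the rest, which might indicate a crawler or one heavy user.
# The usage figures below are made up.

from statistics import mean, stdev

def spike_months(monthly_downloads, z_cutoff=2.0):
    counts = list(monthly_downloads.values())
    mu = mean(counts)
    sigma = stdev(counts)
    return [month for month, n in monthly_downloads.items()
            if sigma and (n - mu) / sigma > z_cutoff]

usage = {"Jan": 120, "Feb": 110, "Mar": 130,
         "Apr": 2400, "May": 125, "Jun": 115}
print(spike_months(usage))  # ['Apr']
```

A flagged month still needs human judgment: a spike could be a class assignment rather than abuse, as the later speaker notes.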

CORAL Usage Statistics – local program gives them a central location to store user names & passwords. He downloads reports quarterly now, and the public interface allows other librarians to view the stats in readable reports.

Speaker: Justin Clarke, vendor

Harvesting reports takes a lot of time and requires some administrative costs. SUSHI is a vehicle for automating the transfer of statistics from one source to another. However, you still need to look at the data. Your subscription agent has a lot more data about the resources than just use, and can combine the two together to create a broader picture of the resource use.

Harrassowitz starts with acquisitions data and matches the use statistics to that. They also capture things like publisher changes and title changes. Cost per use is not as easy as simple division – packages confuse the matter.
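To illustrate why packages confuse cost per use: one invoice covers many titles, so any per-title figure depends on an allocation rule you have to choose. The sketch below splits the package cost equally across titles before dividing by downloads; this is one possible rule (not Harrassowitz's method), and the titles and numbers are hypothetical.

```python
# Sketch: per-title cost per use under an equal-split allocation of a
# package invoice. A different allocation rule (e.g. by list price)
# would give different numbers. All data below is made up.

def per_title_cost_per_use(package_cost, downloads_by_title):
    share = package_cost / len(downloads_by_title)  # equal split per title
    return {title: round(share / n, 2) if n else None
            for title, n in downloads_by_title.items()}

pkg = per_title_cost_per_use(
    9000, {"Journal X": 600, "Journal Y": 30, "Journal Z": 0})
print(pkg)  # {'Journal X': 5.0, 'Journal Y': 100.0, 'Journal Z': None}
```

The zero-use title gets `None` rather than a number, since dividing by zero use is exactly the kind of "simple division" problem the speaker warns about.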

High use could be the result of class assignments or hackers/hoarders. Low use might reflect politically motivated purchases or new-department support. You need a reference point of cost. Pricing from publishers seems to have no rhyme or reason, and your price is not necessarily the list price. Multi-year analysis and subject-based analysis look at local trends.

Rather than usage statistics, we need useful statistics.