battle decks

my #erl15 Battle Decks topic

I participated in my first Battle Decks competition at ER&L this year. I almost did last year, and this time a friend encouraged me to put my name in the hat, so I did.

Somewhat to my surprise, I was not nervous as I waited for my name to be chosen to present next (the order was random; names were drawn from a bag). Rather, I was eagerly awaiting my turn, because I knew I could pull it off, and pull it off well.

This confidence is not some arrogance I carry with me all the time. I’ve got impostor syndrome in spades when it comes to conference presentations and the like. Battle Decks, however, is not a presentation on a topic I’m supposed to know more about but secretly suspect I know less about than the audience. The audience is more in the dark than I am, and my job isn’t to inform so much as to entertain.

Improv I can do. I spent a few seasons with the improv troupe in college, and while I was certainly not remarkable or talented, I did learn a lot about “yes, and”. My “yes, and” in Battle Decks was the slides: no matter what came up, I took it and connected it back to the topic, and vice versa.

One slide came up that was so dense with text or imagery or something that it just couldn’t register in the split second I looked at it. I turned back to the audience and found I had nothing to say, so I looked at it again and then apologized, explaining that my assistant had put together the slide deck and I wasn’t sure what this one was supposed to be about. It brought the laughs, and on I went.

I would like to take this opportunity to thank Jesse Koennecke for organizing the event, as well as Bonnie Tijerina, Elizabeth Winters, and Carmen Mitchell for judging the event. And, of course, thanks also to April Hathcock for sharing the win with me.

#erl15 Battledecks Monday
photo by Sandy Tijerina

ER&L 2015 – All Things Distributed: Collaborations Beyond Infrastructure

Collaboration
photo by Chris Lott

Speaker: Robert McDonald, Indiana University

Never use a fad term for your talk because it will make you look bad ten years later.

We’re not so tied up in having to build hardware stacks anymore. Now, that infrastructure is available through other services.

[insert trip down memory lane of the social media and mobile phones of 2005-2006]

We need to rethink how we procure IT in the library. We need better ways of thinking this through before committing to something we may not be able to get out of. Market shifts happen.

Server interfaces are much more user friendly now, particularly when you’re using something like AWS. However, bandwidth is still a big cost. Similar infrastructures, though, can mean better sharing of tools and services across institutions.

How much does your library invest in IT? What percentage of your overall budget is that? How do you count that number? How much do we invest in open source or collaborative ventures that involve IT?

Groups have a negativity bias, which can have an impact on meetings. The outcomes need to be positive in order to move an organization forward.

Villanova opted to spend their funds on a locally developed discovery layer (VUFind), rather than dropping that and more on a commercial product. The broader community has benefitted from it as well.

Kuali OLE has received funding and support from a lot of institutions. GOKb is a sister project to develop a better knowledgebase to manage electronic resources, partnering with NCSU and their expertise on building an ERMS.

[Some stuff about HathiTrust, which is a members-only club my institution is not a part of, so I kind of tuned out.]

Something something Hydra and Avalon media system and Sufia.

Forking an open project means that no one else can really use the fork, and that the local institution is on its own for maintaining it.

In summary, consider if you can spend money on investing in these kinds of projects rather than buying existing vendor products.

ER&L 2015 – Discovery Systems: Building a Better User Experience

discovery
photo by lecates

Speakers: Michael Fernandez, University of Mary Washington; Kirsten Ostergaard, Montana State University; Alex Homanchuk, OCAD University
Moderator: Kelsey Brett, University of Houston

AH:
Specialized studio art and design programs. Became a university in 2002, which meant the library needed to expand to support liberal arts programming. They had seen limited use of a limited number of resources and wanted to improve the visibility and exposure of the other parts of their collections.

MF:
Mid-sized liberal arts institution that is primarily undergraduate. The students want something convenient and easy to use, and they aren’t concerned with where the content is coming from. The library wanted to expose the depth of their collections.

KO:
Strong engineering and agriculture programs. Migrated to Primo from Summon recently. They had chosen Summon a few years ago for reasons similar to those noted by the other panelists. The decision to move was due in part to a statewide consortium, and the University of Montana’s decision also played a part.

AH:
They looked at how well their resources were represented in the central index. They had a lot of help from other Ontario institutions by learning from their experiences. There was also a provincial RFI from a few years ago to help inform them. They were already using the KB that would power the discovery service, so it was easier to implement. Reference staff strongly favored one particular solution, in part due to some of the features unique to it.

They began implementing in late January and planned a soft launch for March, which they felt was enough time for staff training (both back end and front end). It was a slightly rough start because they implemented with Summon 2, and in the midst of this ProQuest also moved to a new ticketing system.

MF:
They did trials. They looked at costs, including setup fees and rate increases and potential discounts. They looked at content coverage and gaps. They looked at the functionality of the user interface and search relevancy for general and known item resources. They looked at the systems aspects, including ILS integration and other systems via API, and the frequency and timeliness of catalog updates.

They opted to not implement the federated searching widgets in EDS to search the ProQuest content. Instead, they use the database recommender to direct students to relevant, non-EBSCO databases.

KO:
They wanted usability similar to what they had in Summon, and similar coverage. The timeline for implementation was longer than they initially planned, in part due to the consortial setup and decisions about how content would be displayed at different institutions. This gets complicated with ebook licenses that are for specific institutions only. They had to remove deduplication of records, which makes for slightly messy search results, but the default search is only for local content.

They had to migrate their eresources to a new KB, and the collections don’t always match up. They are conducting an audit of the data. They still have 360 Core while they are migrating to SFX.

AH:
The implementation team included representatives from across the library, which helped with buy-in. Feedback responsiveness was important, too. Staff and faculty comments influenced their decisions about user interface options. Instruction librarians vigorously promoted it, particularly in the first-year courses.

MF:
Similar to the previous speaker’s experience.

KO:
They wanted to make sure the students were comfortable with the familiar, but also to market the new and different functionality and features of Primo. They promoted the changes through the newsletter, table tents, the library homepage, a press release, and the Friends of the Library newsletter.

AH:
Launched a web survey to get user feedback. The reception has been favorable, with the predictable issues. They’ve seen a bump in the use of their materials in general, but a decline in the multidisciplinary databases. The latter is due in part to a lower priority of those resources in the rankings and a lack of IEDL (index-enhanced direct linking) for that content.

MF:
They surveyed the staff and student assistants during the trials. The students indicated that there is a learning curve for discovery systems, and they were using the facets. They also use Google Analytics to analyze usage and to determine which days see lower use, for scheduling the catalog update.

KO:
There hasn’t been any feedback from the website form. Staff report errors. They have done some user testing in the library with known-item and general searches. They are working on the branding, taking Ex Libris out and putting in more MSU.

ER&L 2015 – Link Resolvers and Analytics: Using Analytics Tools to Identify Usage Trends and Access Problems

Google Analytics (3rd ed)

Speaker: Amelia Mowry, Wayne State University

Setting up Google Analytics on a link resolver:

  1. Create a new account in Analytics and enter the base URL of your link resolver; this will generate your tracking ID.
  2. Add the tracking code to the header or footer in the branding portion of the link resolver.
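
Once the branding edit is in place, it's worth confirming the snippet actually made it into the rendered page. Here is a minimal sketch of such a check; the resolver URL and tracking ID are placeholders, not real values:

```python
import urllib.request

# Placeholders -- substitute your institution's resolver base URL and the
# tracking ID that Analytics generated for the new account.
RESOLVER_URL = "https://resolver.example.edu/"
TRACKING_ID = "UA-XXXXXXX-1"

# Fetch a resolver page and check that the tracking ID appears in the HTML,
# i.e., that the branding header/footer edit took effect.
html = urllib.request.urlopen(RESOLVER_URL).read().decode("utf-8", "replace")
print("tracking code present:", TRACKING_ID in html)
```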

Google Analytics was designed for business. A visitor spending a lot of time on a business site is a good sign, but not necessarily on a library site. Brief interactions are counted as bounces, which is bad for business, but a long time spent on a link resolver page could be a sign of confusion or frustration rather than success.

The base URL refers to several different pages the user interacts with. Google Analytics, by default, doesn’t distinguish them. This can hide some important usage and trends.

Using custom reports, you can tease out some specific pieces of information. This is where you can filter down to specific kinds of pages within the link resolver tool.
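
As a rough illustration of the kind of breakdown she described, the sketch below assumes you have exported a custom report to CSV with a "Page" column holding the request path and query string; the file name, column name, and the reliance on the OpenURL genre parameter are my assumptions, not part of her talk:

```python
import csv
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Tally resolver requests by the kind of page hiding behind the shared
# base URL, using the query string from an exported "Page" column.
counts = Counter()
with open("resolver_pages.csv", newline="") as f:
    for row in csv.DictReader(f):
        params = parse_qs(urlparse(row["Page"]).query)
        # OpenURL requests usually carry a genre (article, book, journal...);
        # requests without one get bucketed separately.
        counts[params.get("genre", ["(no genre)"])[0]] += 1

for page_type, n in counts.most_common():
    print(f"{page_type}: {n}")
```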

You can create views filtered by IP ranges, which she used to separate use by computers in the library from use by computers outside it. IP data is not collected by default, so if you want to do this, set it up at the beginning.
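
If you do capture IPs, the in-building/out-of-building split is also straightforward to compute outside of Analytics. In this sketch the network ranges and sample addresses are made up for illustration:

```python
from ipaddress import ip_address, ip_network

# Hypothetical ranges for the library's public workstations.
LIBRARY_NETS = [ip_network("10.1.0.0/16"), ip_network("192.0.2.0/24")]

def in_library(ip: str) -> bool:
    """Return True if the client IP falls in one of the library's ranges."""
    addr = ip_address(ip)
    return any(addr in net for net in LIBRARY_NETS)

# Example: bucket a list of client IPs pulled from your analytics data.
clients = ["10.1.4.22", "198.51.100.7", "192.0.2.15"]
print(sum(in_library(ip) for ip in clients), "of", len(clients), "in library")
```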

To learn where users were coming to the link resolver from, she created another custom report with parameters that would include the referring URLs. She also created a custom view that included the error parameter “SS_Error”. Some came from LibGuides pages, some from the catalog, and some from databases.
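
A similar post-processing sketch works for the referrer data. The "Full Referrer" and "Page" column names are guesses about the export layout; SS_Error is the error parameter she mentioned:

```python
import csv
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Group referring URLs by host and count how many led to an error page.
referrers, errors = Counter(), Counter()
with open("resolver_referrals.csv", newline="") as f:
    for row in csv.DictReader(f):
        ref = row["Full Referrer"]
        # Analytics referrers often omit the scheme, so fall back to
        # splitting on "/" when urlparse finds no network location.
        host = urlparse(ref).netloc or ref.split("/")[0]
        referrers[host] += 1
        if "SS_Error" in parse_qs(urlparse(row["Page"]).query):
            errors[host] += 1

for host, n in referrers.most_common(10):
    print(f"{host}: {n} visits, {errors[host]} errors")
```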

Ask specific and relevant questions of your data. Apply filters carefully and logically. Your data is a starting point for improving your service.

Google Analytics, 3rd edition, by Ledford, Tyler, and Teixeira (Wiley) is a good resource, though it is business-focused.