ER&L 2015 – Discovery Systems: Building a Better User Experience

[Photo: "discovery" by lecates]

Speakers: Michael Fernandez, University of Mary Washington; Kirsten Ostergaard, Montana State University; Alex Homanchuk, OCAD University
Moderator: Kelsey Brett, University of Houston

AH:
Specialized studio art and design programs. Became a university in 2002, which meant the library needed to expand to support liberal arts programming. Users were making limited use of a limited number of resources, and the library wanted to improve the visibility and exposure of the other parts of its collections.

MF:

Mid-sized liberal arts institution that is primarily undergraduate. The students want something convenient and easy to use, and they aren’t concerned with where the content is coming from. The library wanted to expose the depth of their collections.

KO:
Strong engineering and agriculture programs. Migrated from Summon to Primo recently. They had chosen Summon a few years ago for reasons similar to those noted by the other panelists. The decision to move was driven in part by a statewide consortium, and the University of Montana's decision also played a role.

AH:
They looked at how well their resources were represented in the central index. They got a lot of help from other Ontario institutions and learned from their experiences. A provincial RFI from a few years ago also helped inform the decision. They were already using the knowledge base that would power the discovery service, so it was easier to implement. Reference staff strongly favored one particular solution, in part due to some of the features unique to it.

They began implementing in late January and planned a soft launch for March, which they felt was enough time for staff training (both back and front end). It was a slightly rough start because they implemented with Summon 2.0, and in the midst of this ProQuest also moved to a new ticketing system.

MF:
They did trials. They looked at costs, including setup fees, rate increases, and potential discounts. They looked at content coverage and gaps. They looked at the functionality of the user interface and at search relevancy for both general and known-item searches. They looked at the systems aspects, including ILS integration, integration with other systems via API, and the frequency and timeliness of catalog updates.

They opted to not implement the federated searching widgets in EDS to search the ProQuest content. Instead, they use the database recommender to direct students to relevant, non-EBSCO databases.

KO:
They wanted usability and coverage similar to what they had in Summon. The timeline for implementation was longer than they initially planned, in part due to the consortial setup and decisions about how content would be displayed at different institutions. This gets complicated with ebook licenses that are for specific institutions only. They had to turn off deduplication of records, which makes for slightly messy search results, but the default search is only for local content.

They had to migrate their eresources to a new knowledge base, and the collections don't always match up, so they are conducting an audit of the data. They still have 360 Core while they are migrating to SFX.

AH:
The implementation team included representatives from across the library, which helped with getting buy-in. Being responsive to feedback was important, too: staff and faculty comments influenced their decisions about user interface options. Instruction librarians vigorously promoted the new system, particularly in first-year courses.

MF:
Similar to the previous speaker’s experience.

KO:
They wanted to make sure the students were comfortable with the familiar, but also to market the new and different functionality and features of Primo. They promoted these through the newsletter, table tents, the library homepage, a press release, and the Friends of the Library newsletter.

AH:
Launched a web survey to get user feedback. The reception has been favorable, with the predictable issues. They've seen a bump in the use of their materials in general, but a decline in the multidisciplinary databases. The latter is due in part to the lower priority of those resources in the relevancy rankings and a lack of index-enhanced direct linking (IEDL) for that content.

MF:
They did surveys of the staff and student assistants during the trials. The students indicated that there is a learning curve for discovery systems, and they were using the facets. They also use Google Analytics to analyze usage and to determine which days see lower use, for scheduling the catalog update.

KO:
There hasn't been any feedback from the website form; staff report errors. They have done some user testing in the library with known-item and general searches. They are working on the branding to take Ex Libris out and put in more MSU.
