Moving Up to the Cloud, a panel lecture hosted by the VCU Libraries

“Sky symphony” by Kevin Dooley

“Educational Utility Computing: Perspectives on .edu and the Cloud”
Mark Ryland, Chief Solutions Architect at Amazon Web Services

AWS has been part of revolutionizing the start-up industry (e.g. Instagram, Pinterest) because start-ups no longer carry the cost of building server infrastructure in-house. Cloud computing in the AWS sense is utility computing: pay for what you use, scale up and down easily, and keep local control of how your products work. In the traditional world, you have to pay for the capacity to meet your peak demand, but in the cloud computing world, you can scale up and down based on what is needed at that moment.
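
To make the utility-pricing argument concrete, here is a rough back-of-the-envelope comparison; all of the numbers are invented for illustration and were not part of the talk.

```python
# All numbers below are hypothetical, purely for illustration.
hours_per_month = 730
peak_servers = 20            # capacity needed to survive the busiest hour
average_servers = 4          # what the workload actually needs most of the time
cost_per_server_hour = 0.10  # assumed hourly rate per server

# Traditional model: provision (and pay for) peak capacity around the clock.
provisioned_for_peak = peak_servers * hours_per_month * cost_per_server_hour

# Utility model: pay only for the capacity actually used each hour.
pay_for_what_you_use = average_servers * hours_per_month * cost_per_server_hour

print(f"Provisioned for peak:  ${provisioned_for_peak:,.2f}/month")
print(f"Pay for what you use:  ${pay_for_what_you_use:,.2f}/month")
```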

Economies and efficiencies of scale come in many forms. Some are obvious: the storage, computing, and networking equipment supply chain; internet connectivity and electric power; and data center siting, redundancy, etc. Some are less obvious: security and compliance best practices; data center internal innovations in networking, power, etc.

AWS and .EDU: EdX, Coursera, Texas Digital Library, Berkeley AMP Lab, Harvard Medical, University of Phoenix, and an increasing number of university/school public-facing websites.

He expects that cloud computing utilities will come to function much like the electric grid: just plug in and use it.


“Libraries in Transition”
Marshall Breeding, library systems expert

We've already seen the shift from print to electronic in academic journals, and we're heading that way with books. Our users' expectations of how they interact with libraries are changing, and the library as a space is evolving to meet that, along with library systems.

Web-based computing is better than client/server computing. We expect social computing to be integrated into the core infrastructure of a service, rather than add-ons and afterthoughts. Systems need to be flexible for all kinds of devices, not just particular types of desktops. Metadata needs to evolve from record-by-record creation to bulk management wherever possible. MARC is going to die, and die soon.

How are we going to help our researchers manage data? We need the infrastructure to help us with that as well. Semantic web — what systems will support it?

Cooperation and consolidation of library consortia; state-wide implementations of SaaS library systems. Our current legacy ILSes are holding libraries back from moving forward and providing the services our users want and need.

A true cloud computing system comes with web-based interfaces, externally hosted, subscription OR utility pricing, highly abstracted computing model, provisioned on demand, scaled according to variable needs, elastic.


“Moving Up to the Cloud”
Mark Triest, President of Ex Libris North America

Currently, libraries are working with several different systems (ILS, ERMS, digital repositories, etc.), duplicating data and workflows, and not always very accurately or efficiently, but that has been the only way to handle different kinds of data and needs. Ex Libris started in 2007 to change this, beginning with conversations with librarians. Their solution is a single system with unified data and workflows.

They are working to lower the total cost of ownership by reducing IT needs, minimizing administration time, and adding new services to increase productivity. Right now there are 120+ institutions worldwide that are implementing or have gone live with Alma.

Automated workflows allow staff to focus on the exceptions and reduce the steps involved.

Descriptive analytics are built into the system, with plans for predictive analytics to be incorporated in the future.

Future: collaborative collection development tools, like joint licensing and consortial ebook programs; infrastructure for ad-hoc collaboration


“Cloud Computing and Academic Libraries: Promise and Risk”
John Ulmschneider, Dean of Libraries at VCU

When they first looked at Alma, they had two motivations and two concerns. They were not planning or thinking about it until they were approached to join the early adopters. All academic libraries today are seeking to discover and exploit new efficiencies. The growth of cloud-resident systems and data requires academic libraries to reinvigorate their focus on their core mission. Cloud-resident systems are creating massive change throughout our institutions, and managing and exploiting pervasive change is a serious challenge. We also need to deal with the security and durability of data.

Cloud solutions shift resources from supporting infrastructure to supporting innovation.

Efficiencies are not just nice things; they are absolutely necessary for academic libraries. We are obligated to upend long-held practice if in doing so we gain assets for practice essential to our mission. We must focus recovered assets on the core library mission.

Agility is the new stability.

Libraries must push technology forward in areas that advance their core mission. Infuse technology evolution for libraries with the values and needs of libraries. Libraries must invest assets as developers, development partners, and early adopters, and insist on discovery and management tools that are agnostic regarding data sources.

Managing the change process is daunting, but we're already well down the road. It's not entirely new, but it does involve a change in culture to create a pervasive institutional agility for all staff.

my love/hate relationship with reading books

“ALA Read mini-poster” by me

This year I participated in the "set your own challenge" book reading thinger on Goodreads. Initially, I set mine at 25, as a little stretch goal from my average of 19 books per year over the past four years. But I was doing so well in the early part of the year that I increased it to 30. The final total was 27, but I'm part-way through several books that I just didn't have time to finish as the clock ticked down to the end of the year.

What worked well for me this time: audiobooks. I read more of them than paper books this year, and it forced me to expand to a variety of topics and styles I would not have patience for in print.

What failed me this time: getting hung up on a book I felt obligated to finish but that didn't excite me, so I kept avoiding it. To be fair, part of what turned me off was that on disc two, I accidentally set my car's CD player to shuffle. This is great for adding some variety to music listening, but it made for confusing and abrupt transitions from one topic to another.

I read a lot of non-fiction, because that works better in audio format for me, and I read more audio than printed (either in paper or electronic) books. For 2013, I’d like to read more fiction, which means reading more in print. Which means making time for my “must read the whole book cover to cover” method of reading fiction.

audiobook: 20
print book: 7
ebook: 0

fiction: 5
non-fiction: 22

books read in 2012




I created something delicious last weekend

chocolate salted caramel peanut butter cookies

Some friends host a cookie exchange party every year, and they have a panel of judges determine which ones are the best. I decided to do something a little different this year, rather than following a basic recipe for the same old, same old. I started thinking about it shortly after Thanksgiving, which may be why I decided to take my inspiration from the turducken.

I began with a basic peanut butter cookie dough (mine came from the Better Homes and Gardens New Cook Book), which I chilled while I ran some errands, and then made a chocolate ganache (warning: that recipe makes far more than you really need for this). I'd picked up some salted caramels from Trader Joe's recently, and I chilled them in the freezer before chopping each into three pieces.

Next, I shaped the peanut butter cookie dough into a log and divided it into 24 slices. Carefully, I shaped and flattened each slice into a cookie round, as thin as I could make it while keeping it from falling apart. I spooned some ganache onto a round, added a piece of the salted caramel, and then put another flattened round on top. I sealed the edges together, making a little pie/turnover out of the cookie, and then placed that carefully on the baking sheet. They baked beautifully and spread out more than I was expecting, so the second batch was spaced a bit more.

Ultimately, they did not win the competition, but I received an honorable mention and plenty of compliments. Well worth the effort.

Does anyone have suggestions for what to do with a bowl full of well-refrigerated chocolate ganache?

thankful for Thanksgiving

“Banana boat tragedy” by Robbie V

Next week is a two-day work week, and my schedule for those two days is almost completely wide open. This means, if all goes well, I might actually recover from being away for Charleston last week and being away two days this week for meetings. There are about 50 action items on my list, ranging from a few minutes' attention to a few hours' attention. And that's just the "must deal with now" stuff. Forget doing any of my ongoing projects.

The blessing and curse of travel — you get to do cool things, see cool places, and meet cool people, but then you spend several days of work hell trying to atone for the sin of not being there.

carpe diem

Mac & Cheese is a vegetable!

I told a friend yesterday that I felt like I didn’t carpe enough diem at Charleston Conference. It was my first time attending, and I didn’t have a good sense of the flow. I wasn’t prepared for folks to be leaving so early on Saturday, I didn’t know about the vendor showcase on Wednesday until after I made my travel arrangements, and I felt like I didn’t make the most of the limited time I had.

Next time will be better. And yes, there will be a next time, but maybe after a year or two. I understand from some regulars that the plenary sessions were below average this year, which matched my disappointment. Now that I know there is little vetting of the concurrent sessions, I will be more particular in my choices next time, and hopefully select sessions where the content matches my expectations based on the abstracts.

The food in Charleston definitely met my expectations. I had tasty shrimp & grits a couple times, variations on fried chicken nearly every day, and a yummy cup of she crab soup. Tried a few local brews, and a dark & stormy from a cool bar that brews their own ginger beer. I’d go back for the food for sure.

Charleston 2012: EWWW!: Electronic Resources in the 21st Century (or How I Learned to Stop Worrying about the Catalog and Love the MARC Records Service)

“15/52 : Titanic” by Eric Constantineau

Speakers: Ladd Brown, Andi Ogier, and Annette Bailey, Virginia Tech

Libraries are not about the collections anymore, they’re about space. The library is a place to connect to the university community. We are aggressively de-selecting, buying digital backfiles in the humanities to clear out the print collections.

Guess what? We still have our legacy workflows. They were built for processing physical items. Then eresources came along, and there were two parallel processes. Ebooks have the potential of becoming a third process.

Along with the legacy workflows, they have a new Dean who is forward-thinking. The Dean says it's time to rip off the bandaid. (Titanic = old workflow; iceberg = eresources; people in lifeboats = technical resources team) Strategic plans are living documents kept on top of the desk and not in the drawer.

With all of this in mind, acquisitions leaders began meeting daily in a group called Eresources Workflow Weekly Work, planning the changes they needed to make. They did process mapping with Sharpies and Post-its, and they involved everyone in the library who had anything to do with eresources. After lots of meetings, position descriptions began to emerge.

Electronic Resource Supervisor is the new title for the former book and serials acquisitions heads; the rest of the positions weren't clear from the description.

They had a MARC record service for ejournals, but after this reorganization process, they realized they needed the same for ebooks, and that it could be handled by the same folks.

Two person teams were formed based on who did what in the former parallel processes, and they reconfigured their workspace to make this more functional. The team cubes are together, and they have open collaboration spaces for other groupings.

They shifted focus from maintaining MARC records in their ILS to maintaining accurate title lists and data in their ERMS. They’re letting the data from the ERMS populate the ILS with appropriate MARC records.

They use some Python scripts to help move data from system to system, and more staff are being trained to support it. They’re also using the Google Apps portal for collaborative projects.
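
The scripts themselves weren't shown, but a minimal sketch of that kind of glue work might look like the following; the filenames, column names, and tab-delimited format are all assumptions for illustration, not Virginia Tech's actual setup.

```python
import csv

# Hypothetical filenames and column names, for illustration only.
ERMS_EXPORT = "erms_ebook_titles.tsv"   # title list exported from the ERMS
ILS_LOAD_FILE = "ils_ebook_load.tsv"    # file handed off to the ILS loader

with open(ERMS_EXPORT, newline="", encoding="utf-8") as src, \
        open(ILS_LOAD_FILE, "w", newline="", encoding="utf-8") as dest:
    reader = csv.DictReader(src, delimiter="\t")
    writer = csv.writer(dest, delimiter="\t")
    writer.writerow(["title", "isbn", "url", "provider"])
    for row in reader:
        # Skip titles the ERMS marks as inactive so they never reach the ILS.
        if row.get("status", "").lower() != "active":
            continue
        writer.writerow([
            row["title"].strip(),
            row.get("isbn", ""),
            row["url"],
            row.get("provider", ""),
        ])
```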

They wanted to take risks, make mistakes, fail quickly, but also see successes come quickly. They needed someplace to start, and to avoid reinventing the wheel, so they borrowed heavily from the work done by colleagues at James Madison University. They also hired Carl Grant as a consultant to ask questions and facilitate cross-departmental work.

Big thing to keep in mind: Administration needs to be prepared to allow staff to spend time learning new processes and not keeping up with everything they used to do at the same time. And, as they let go of the work they used to do, please tell them it was important or they won’t adopt the new work.

Charleston 2012: bX Usage-Based Services

“Blogger recommend Sign” by Davich Klinadung

Speaker: Christine Stohn, bX project manager

There are two components — the recommender and hot articles.

This began in 2009 with the article recommender, and as of this year, it's used by over 1,100 institutions. This year they added the hot articles service, with "popularity reports." And there is a mobile app for the hot articles service. Behind the scenes, there is the bX Data Lab, where they run experiments and quality control. They're also interested in data mining researchers who might want to take the data and use it for their own work.

The data for bX comes from SFX users who actively contribute the data from user clicks at their institutions. It’s content-neutral, coming from many institutions.

bX is attempting to add some serendipity to searches that by definition require some knowledge of what you are looking for. When you find something from your searching, the bX recommender will find other relevant articles for you, based on what other people have used in the past. The hot articles component will list the most used articles from the last month that are on the same topic as your search result.
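
The underlying algorithm wasn't described in detail; the sketch below only illustrates the general idea of recommending from co-occurring usage, with made-up session data, and is not necessarily how bX actually works.

```python
from collections import Counter
from itertools import combinations

# Toy usage log: each inner list is the set of articles one user clicked
# through to in a single session (article IDs are made up).
sessions = [
    ["art-101", "art-202", "art-303"],
    ["art-101", "art-202"],
    ["art-202", "art-404"],
]

# Count how often each pair of articles is used together in a session.
co_use = Counter()
for session in sessions:
    for a, b in combinations(sorted(set(session)), 2):
        co_use[(a, b)] += 1

def recommend(article_id, top_n=5):
    """Return the articles most often used alongside the given one."""
    scores = Counter()
    for (a, b), count in co_use.items():
        if a == article_id:
            scores[b] += count
        elif b == article_id:
            scores[a] += count
    return scores.most_common(top_n)

print(recommend("art-101"))  # [('art-202', 2), ('art-303', 1)]
```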

It currently works only with articles, but they are collecting data on ebooks that may eventually lead to the ability to recommend them as well.

The hot articles component is based on HILCC subjects that have been assigned to journal titles, so it’s not as precise as the recommender.

You can choose to limit the recommendations to only your holdings, but that limits the discovery. You can have indicators that show whether the item is available locally or not.

It’s available in SFX, Primo, Scopus, and the Science Direct platform. Hot articles can be embedded in LibGuides.

Altmetrics will probably be incorporated to enhance the recommender service.

They are looking at article metrics calculated as a percentile rank per topic, which is more relevant today than the citations that may come five years down the road. It’s based on usage through SFX and bX, but not direct links or DOI links.
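
The exact formula wasn't given, but a percentile rank within a topic could be computed along these lines; the topic label, article IDs, and usage counts below are invented for illustration.

```python
# Invented usage counts for articles, grouped by topic, for illustration.
usage_by_topic = {
    "oncology": {"art-1": 40, "art-2": 15, "art-3": 90, "art-4": 15},
}

def percentile_rank(topic, article_id):
    """Percent of articles in the topic with usage at or below this one's."""
    counts = usage_by_topic[topic]
    target = counts[article_id]
    at_or_below = sum(1 for c in counts.values() if c <= target)
    return 100.0 * at_or_below / len(counts)

print(percentile_rank("oncology", "art-3"))  # 100.0 -- most-used in its topic
print(percentile_rank("oncology", "art-2"))  # 50.0
```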

Charleston 2012: Wasted Words? Current Trends in CD Policies

“Dad’s Desk II” by Chris Jagers

Speakers: Matt Torrence, Audrey Powers, & Megan Sheffield, University of South Florida

Are collection development policies viable today? In order to answer this, they sent out a survey to ARL libraries to see if they are using them or experimenting with something else. They were also interested to know when and how data is being used in the process.

The survey results will be published in the proceedings. I will note anything here that seems particularly interesting, but it looks like all they are doing now is reading that to us.

Are collection development policies being used? Yes, sort of. Although most libraries in the survey do have them, they tend to be used for accreditation and communication, and often they are not consistently available either publicly or internally.

What are the motivations for using collection development policies? Tends to be more for external/marketing than for internal workflows.

They think that a collection development "philosophy" may be a more holistic response to the changing nature of collection development.

Speakers: two people from the University of Arkansas at Little Rock, but they had four names on the PPT, and I didn’t catch who was who

They recently decided to revise their collection development policy/guidelines based on a recommendation from a strategic planning ARL Collection Analysis Project. They also had quite a few new librarians who needed to work with faculty selectors.

They did a literature review and gathered information on practices from peer institutions. They actually talked to the Office of Institutional Research about data on academic degree programs. And, like students, they looked online to see if they could borrow from existing documents.

One thing they took away from the review of what other libraries have out there was that they needed to have the document live on the web, and not just on paper in a binder in someone’s office.

Policies/guidelines should be continuously updated, flexible, acknowledge consortia memberships, acknowledge new formats, and strike a balance between being overly detailed and too general.

They see that the project has had some benefits, not only to themselves but also to provide a guide for current and future users of the policies. It is also a valuable tool for transmitting institutional memory.
