Moving Up to the Cloud, a panel lecture hosted by the VCU Libraries

“Sky symphony” by Kevin Dooley

“Educational Utility Computing: Perspectives on .edu and the Cloud”
Mark Ryland, Chief Solutions Architect at Amazon Web Services

AWS has helped revolutionize the start-up world (e.g., Instagram, Pinterest) because start-ups no longer bear the cost of building server infrastructure in-house. Cloud computing in the AWS sense is utility computing: pay for what you use, scale up and down easily, and keep local control of how your products work. In the traditional world, you have to pay for the capacity to meet your peak demand, but in the cloud computing world, you can scale up and down based on what is needed at that moment.
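A rough way to see the utility-pricing point in numbers (a minimal sketch; the demand curve and hourly rate are invented for illustration, not actual AWS pricing):

```python
# Toy comparison: provisioning for peak demand vs. paying only for use.
# The demand figures and the hourly rate are made up for illustration.

hourly_demand = [2, 2, 3, 5, 12, 40, 18, 6, 3, 2, 2, 2]  # servers needed per hour
rate = 0.10                                               # hypothetical $/server-hour

# Traditional model: run enough servers for the busiest hour, all day long.
peak_cost = max(hourly_demand) * rate * len(hourly_demand)

# Utility model: pay only for the servers actually used each hour.
utility_cost = sum(hour * rate for hour in hourly_demand)

print(f"provisioned for peak: ${peak_cost:.2f}")    # $48.00
print(f"pay for what you use: ${utility_cost:.2f}") # $9.70
```

The gap only widens as the peak gets spikier relative to typical demand.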

Cloud providers gain economies and efficiencies of scale in many ways. Some are obvious: the supply chain for storage, computing, and networking equipment; internet connectivity and electric power; and data center siting, redundancy, etc. Some are less obvious: security and compliance best practices, and internal data center innovations in networking, power, etc.

AWS and .EDU: EdX, Coursera, Texas Digital Library, Berkeley AMP Lab, Harvard Medical, University of Phoenix, and an increasing number of university/school public-facing websites.

He expects that cloud computing utilities will come to function much like the electric grid: just plug in and use it.


“Libraries in Transition”
Marshall Breeding, library systems expert

We’ve already seen the shift from print to electronic in academic journals, and we’re heading that way with books. Our users’ expectations of how they interact with libraries are changing, and the library as a space is evolving to meet them, along with library systems.

Web-based computing is better than client/server computing. We expect social computing to be integrated into the core infrastructure of a service, rather than as add-ons and afterthoughts. Systems need to be flexible for all kinds of devices, not just particular types of desktops. Metadata needs to evolve from record-by-record creation to bulk management wherever possible. MARC is going to die, and die soon.
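A small illustration of the bulk-management idea (a sketch only; the record structure is hypothetical, not real MARC or any particular ILS format). One rule applied to a whole batch replaces thousands of hand edits:

```python
# Sketch of bulk metadata management: apply one change to an entire batch of
# records instead of editing each record by hand. Hypothetical record layout.

records = [
    {"id": "b1001", "title": "Example title 1", "url": "http://old.example.edu/item/1"},
    {"id": "b1002", "title": "Example title 2", "url": "http://old.example.edu/item/2"},
]

def migrate_urls(batch, old_prefix, new_prefix):
    """Rewrite a URL prefix across every record in the batch."""
    for rec in batch:
        if rec["url"].startswith(old_prefix):
            rec["url"] = new_prefix + rec["url"][len(old_prefix):]
    return batch

migrate_urls(records, "http://old.example.edu", "https://new.example.edu")
for rec in records:
    print(rec["id"], rec["url"])
```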

How are we going to help our researchers manage data? We need the infrastructure to help us with that as well. Semantic web — what systems will support it?

Cooperation and consolidation of library consortia; state-wide implementations of SaaS library systems. Our current legacy ILSs are holding libraries back from being able to move forward and provide the services our users want and need.

A true cloud computing system has web-based interfaces, is externally hosted, offers subscription or utility pricing, uses a highly abstracted computing model, is provisioned on demand, scales to variable needs, and is elastic.


“Moving Up to the Cloud”
Mark Triest, President of Ex Libris North America

Currently, libraries are working with several different systems (ILS, ERMS, DRs, etc.), duplicating data and workflows, not always very accurately or efficiently; until now, that was the only way to handle different kinds of data and needs. Ex Libris set out in 2007 to change this, beginning with conversations with librarians. Their solution is a single system with unified data and workflows.

They are working to lower the total cost of ownership by reducing IT needs, minimizing administration time, and adding new services to increase productivity. Right now, 120+ institutions worldwide are in the process of implementing Alma or have gone live with it.

Automated workflows allow staff to focus on the exceptions and reduce the steps involved.
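One way to picture that exception-based approach (a sketch only; the order data and the validation rule are invented, not Alma’s actual workflow engine):

```python
# Sketch of an automated, exception-driven workflow: records that pass the
# checks flow through with no human steps; only the exceptions reach staff.
# The data and the rule are invented for illustration.

orders = [
    {"id": 1, "isbn": "9780306406157", "fund": "HIST"},
    {"id": 2, "isbn": "", "fund": "HIST"},           # missing ISBN -> exception
    {"id": 3, "isbn": "9781492056355", "fund": ""},  # missing fund -> exception
]

auto_processed, exceptions = [], []
for order in orders:
    if order["isbn"] and order["fund"]:
        auto_processed.append(order)   # handled automatically
    else:
        exceptions.append(order)       # routed to staff for review

print(f"{len(auto_processed)} processed automatically, {len(exceptions)} for staff review")
```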

Descriptive analytics are built into the system, with plans for predictive analytics to be incorporated in the future.

Future: collaborative collection development tools, like joint licensing and consortial ebook programs, and infrastructure for ad-hoc collaboration.


“Cloud Computing and Academic Libraries: Promise and Risk”
John Ulmschneider, Dean of Libraries at VCU

When they first looked at Alma, they had two motivations and two concerns; they were not planning or even thinking about a move until they were approached to join the early adopters. The motivations: all academic libraries today are seeking to discover and exploit new efficiencies, and the growth of cloud-resident systems and data requires academic libraries to reinvigorate their focus on core mission. The concerns: cloud-resident systems are creating massive change throughout our institutions, and managing and exploiting pervasive change is a serious challenge; we also need to deal with the security and durability of data.

Cloud solutions shift resources from supporting infrastructure to supporting innovation.

Efficiencies are not just nice things; they are absolutely necessary for academic libraries. We are obligated to upend long-held practice if, in doing so, we gain assets for practice essential to our mission. We must focus recovered assets on the core library mission.

Agility is the new stability.

Libraries must push technology forward in areas that advance their core mission. Infuse technology evolution for libraries with the values and needs of libraries. Libraries must invest assets as developers, development partners, and early adopters. Insist on discovery and management tools that are agnostic regarding data sources.

Managing the change process is daunting, but we’re already well down the road. It’s not entirely new, but it does involve a change in culture to create a pervasive institutional agility for all staff.

house concert, of sorts

The Kennedy Center has a streaming video recording of Wishing Chair’s performance from last week. Almost a full hour of bliss.

torrent

I have a new obsession hobby. This past weekend, I discovered BitTorrent, a cooperative distribution protocol for sharing large files. In my case, it’s a way to get concert recordings (of artists who approve of that sort of thing) without having to set up a snail mail trade with someone and hope it works out for the best. It’s a fast way to distribute shows, too. I was able to download a show yesterday that was recorded the day before!

The key to scalable and robust distribution is cooperation. With BitTorrent, those who get your file tap into their upload capacity to give the file to others at the same time. Those that provide the most to others get the best treatment in return. (“Give and ye shall receive!”)

Cooperative distribution can grow almost without limit, because each new participant brings not only demand, but also supply. Instead of a vicious cycle, popularity creates a virtuous circle. And because each new participant brings new resources to the distribution, you get limitless scalability for a nearly fixed cost.
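A back-of-the-envelope sketch of that virtuous circle (the bandwidth numbers are invented, and this isn’t the actual BitTorrent algorithm; it just shows supply growing with demand):

```python
# Toy illustration of cooperative distribution: every peer that joins to
# download also contributes upload capacity, so supply grows with demand.
# Bandwidth figures are invented for illustration.

SEED_UPLOAD = 10.0   # Mbps from the original uploader
PEER_UPLOAD = 1.0    # Mbps each downloader offers back to the swarm

def swarm_capacity(peers: int) -> float:
    """Total upload bandwidth available to serve the swarm."""
    return SEED_UPLOAD + peers * PEER_UPLOAD

for peers in (1, 10, 100, 1000):
    print(f"{peers:>5} peers -> {swarm_capacity(peers):8.1f} Mbps of supply")

# A plain client/server setup stays stuck at SEED_UPLOAD no matter how many
# downloaders show up, so per-user throughput collapses as popularity grows.
```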

So, what’s the library world application of this kind of protocol? The obvious one that comes to my mind is distribution of electronic publications. If part of the difficulty of publishing electronically is the cost of maintaining the backfiles, then perhaps publishers should look into a BitTorrent-type distribution model where everyone who is interested in the electronic content shares the burden of distributing it on demand.

I’m not sure if this would work in libraryland, but it certainly makes it a lot easier to share concert recordings.
