NASIG 2011: Gateway to Improving ERM System Deliverables – NISO’s ERM Data Standards and Best Practices Review

Speaker: Bob McQuillan

I had notes from this session that were lost to a glitch in the iPad WordPress app. Essentially, it was an overview of why the ERM Data Standards and Best Practices Review working group was created followed by a summary of their findings. The final report will be available soon, and if the grid/groupings of ERM standards and best practices that Bob shared in his presentation are included in the report, I would highly recommend it as a clear and efficient tool to identify the different aspects of ERMS development and needs.

NASIG 2011: Managing Ebook Acquisition — the Coordination of “P” and “E” Publication Dates

Speakers: Sarah Forzetting & Gabrielle Wiersma

They are sending bib records to their book supplier weekly in order to eliminate duplication across formats and with other ebook packages. This might be helpful for libraries that purchase ebooks through publisher platforms as well as through their vendor.

One of the challenges of ebook acquisition is that publishers delay publication or embargo access on aggregators to support print book sales. Fortunately, the gap between print and ebook publication is shrinking: the average delay has gone down from 185 days to 21 since 2008.

For certain profiles in the approval plan, Coutts will set aside books that match for a certain period of time until the ebook is available. If the ebook is not available in that time, they will ship the print. If the librarian does not want to wait for the ebook, they can stop the wait process and move forward with the print purchase right away.

Part of the profile setup for e-preferred or print-preferred not only includes the subject areas, but also content type. For example, some reference works are more useful in electronic format.

Oh, my! They have their PDA set up so that two uses trigger a purchase. I should find out what constitutes a use.

NASIG 2011: Polishing the Crystal Ball — Using Historical Data to Project Serials Trends and Pricing

Speakers: Steve Bosch & Heather Klusendorf

The Library Journal periodicals price survey was developed in partnership with EBSCO after ALA pulled the old column to publish it in American Libraries. There is a similar price survey done by AALL for law publications.

There is a difference between a price survey and a price index. A price survey is a broad look at prices across many titles, while a price index controls the categories and titles included so that changes can be compared over time.

[The next bit was all about the methodology behind making the LJ survey. Not why I am interested, so not really taking notes on it.]

Because of the challenge of getting pricing for ejournals, the survey is based mainly on print prices. That being said, the trends in pricing for print are similar to those for electronic.

Knowing the trends for pricing in your specific set of journals can help you predict what you need to budget for. While there are averages across the industry, they may not be accurate depending on the mix in your collection. [I am thinking that this means that the surveys and indexes are useful for broad picture looks at the industry, but maybe not for local budget planning?]

It is important to understand what goes into a pricing tool and how it resembles or departs from local conditions in order to pick the right one to use.

Budgets for libraries and higher education are not in “recovery.” While inflation calmed down last year, prices are on the rise this year, with an estimated increase of 7-8%. The impact may be larger than at the peak of the serials pricing crisis in the 1990s. Libraries will have less buying power, users will have fewer resources, and publishers will have fewer customers.
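To make the buying-power point concrete, here is a back-of-the-envelope sketch; all figures are hypothetical illustrations, not numbers from the session:

```python
# Back-of-the-envelope: what a flat budget buys under steady serials inflation.
# All figures are hypothetical illustrations, not numbers from the session.

def buying_power(inflation_rate: float, years: int) -> float:
    """Fraction of year-0 purchasing power left after `years` of price
    inflation when the budget stays flat."""
    return 1 / (1 + inflation_rate) ** years

# A flat budget facing 7.5% annual serials inflation:
for year in range(1, 6):
    print(f"year {year}: {buying_power(0.075, year):.1%} of original buying power")
```

At 7.5% inflation, a flat budget loses roughly 30% of its purchasing power within five years, which is why several years of sustained increases can hurt more than a single crisis-era spike.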

Why is the inflation rate for serials so much higher than the consumer price index inflation rate? There has been an expansion of higher education, which adds to the amount of stuff being published. The rates of return for publishers are pretty much normal for their industry. There isn’t any one reason why.

NASIG 2011: Books in Chains

Speaker: Paul Duguid

Unlike the automotive brand wars, tech brand wars still require a level of coordination and connectivity among the competitors. Intel, Windows, and Dell can all be in one machine, and it became a competition as to which component motivated the purchase.

The computer/tech supply chain is odd. The most important and difficult component to replace is the hard drive, and yet most of us don’t know who makes the drives in our computers. It makes a huge difference in profit when your name is out front.

Until the mid 1800s, the wine sold had the retailer name on it, not the vineyard. Eventually, that shifted, and then shifted again to being sold by the name of the varietal.

In the book supply chain there are many links, and the reader who buys the book may not see any of the names involved; at different points in history, different links were the brand under which the book was sold. Mark Twain and Rudyard Kipling tried to trademark their names so that publishers could not abuse them.

In academia, degrees are an indication of competency, and the institution behind the degree is a part of the brand. Certification marks began with unions in the US, and business schools were among the first to go out and register their names. However, it gets tricky when the institution conferring the degrees is also taking in fees from students. Is it certification or simply selling the credentials?

Who brands in publishing? We think the author, but outside of fiction, that starts to break down. Reference works are generally branded by the publisher. Reprint series are branded by the series. Romances are similar. Do we pay attention to who wrote the movie, TV series, or even newspaper article?

What happens when we go digital? The idealist’s view is that information wants to be free. The pragmatic view is that information needs to be constrained. Many things that are constraints are also resources. The structure and organization of a newspaper has much to do with the paper it is on. Also, by limiting to what fits on the paper, it conveys an indication of importance if it makes it into print. Free information suffers from a lack of filters to make the important bits rise to the top.

We think of technologies replacing each other, but in fact they tend to create new niches by taking away some but not all of the roles of the old tech. What goes and what stays is what you see as integral.

NASIG 2011: Reporting on Collections

Speakers: Sandy Hurd, Tina Feick, & John Smith

Development begins with internal discussion, a business case, and a plan for how the data will be harvested. And discussion may need to include the vendors who house or supply the data, like your ILS or ERM.

Product development on the vendor side can be prompted by several things, including specific needs, competition, and items in an RFP. When customers ask for reports, they need to determine if it is a one-time thing, something that can be created by enhancing what they already have, or something they aren’t doing yet. There may be standards, but collaborative data is still custom development between two entities, every time.

Have you peeked under the rug? The report is only as good as the data you have. How much cleanup are you willing to do? How can your vendor help? Before creating reports, think about what you have to solve and what you wish you could solve, statistics you need, the time available to generate them, and whether or not you can do it yourself.

There are traditional reporting tools like spreadsheets, and increasingly there are specialized data storage and analysis tools. We are looking at trends, transactional data, and projections, and we need this information on demand and more frequently than in the past. And the data needs to be interoperable. (Dani Roach is quietly shouting, “CORE! CORE!”) Ideally, we would be able to load relevant data from our ERMS, acquisitions modules, and other systems.

One use of the data can be to see who is using what, so properly coded patron records are important. The data can also be essential for justifying the redistribution of resources. People may not like what they hear, but at least you have the data to back it up.

The spreadsheets are not the reports. They are the data.

NASIG 2011: Using Assessment to Make Collection Development Decisions

Speakers: Mary Ann Trail & Kerry Chang FitzGibbon

It is not in the interest of faculty to recommend cutting journal titles, because doing so may be perceived as an admission that a title is not needed. When relying on faculty input for collection decisions, the collection can become skewed when certain faculty are more vocal than others.

When a new director arrived in 2000, they began to use more data to make decisions. And, the increase in aggregator databases and ejournals changed what was being collected. In addition to electronic publishing, electronic communication has changed the platform and audience for faculty communicating with each other and administrators, which can be both good and bad for library budgets.

In 2005, after some revision of collection methods, cancellations, and reallocation, they went to a periodicals allocation formula. This didn’t work out as well as expected, and was abandoned in 2008.

As a part of their assessment projects in 2008, they looked at the overlap between print and electronic titles to see if they could justify canceling the print in order to address the budget deficit. Most importantly, they wanted to proactively calm the faculty, who were already upset about past cancellations, with assurances that they would not lose access to the titles.

They used their ERMS to generate an overlap analysis report, and after some unnecessarily complicated exporting and sorting, she was able to identify overlaps with their print collection. Then she identified the current subscriptions before going to the databases to verify that access was correct and to note any embargo information. This was then combined with budget line, costs, and three years of usage (both print and electronic for non-aggregator access).
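The overlap check described above can be approximated with a short script once the ERMS export and the print subscription list are reduced to ISSNs. A minimal sketch, with entirely hypothetical titles, costs, and embargo data:

```python
# Sketch of a print/electronic overlap analysis: find print subscriptions
# that are also held electronically, flagging any embargo.
# All titles and figures are hypothetical examples.

print_subscriptions = {
    "0028-0836": {"title": "Nature", "cost": 5000},
    "0036-8075": {"title": "Science", "cost": 1200},
    "1234-5678": {"title": "Print-Only Review", "cost": 300},
}

electronic_holdings = {
    "0028-0836": {"embargo_months": 0},
    "0036-8075": {"embargo_months": 12},
}

# Set intersection on ISSNs finds the format-change candidates.
overlap = sorted(set(print_subscriptions) & set(electronic_holdings))
for issn in overlap:
    p = print_subscriptions[issn]
    embargo = electronic_holdings[issn]["embargo_months"]
    note = f"{embargo}-month embargo" if embargo else "no embargo"
    print(f"{p['title']} ({issn}): candidate for format change, {note}")
```

In practice the two dictionaries would come from the ERMS export and the acquisitions system rather than being typed in, but the core of the analysis is just this set intersection plus the embargo check.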

They met their budget target by canceling the print journals, and they used the term “format change” instead of cancel when they communicated with faculty. Faculty showed more support for this approach, and were more willing to advocate for library funds.

Did they consider publications that have color illustrations or other materials that are better in print? Yes, and most of them were retained in print.

Did they look at acquiring other databases to replace additional print cancellations? No, not with their funding situation.

What was the contingency plan for titles removed from the aggregator? They would resubscribe if the faculty asked for it, but the funds would likely come from the monograph budget.

NASIG 2011: Leaving the Big Deal – Consequences and Next Steps

Speaker: Jonathan Nabe

His library has left the GWLA Springer/Kluwer and Wiley-Blackwell consortial deals, and a smaller consortial deal for Elsevier. The end result is a loss of access to just under 2000 titles, but most of those titles had fewer than one download per month in the year prior to departure. So, they feel that ILL is a better value than subscription for them.

Because of the hoops users must jump through for ILL, he thinks ILL requests indicate more of a real need than downloads of content directly available to the user. Because they retain archival access, withdrawing from the deals only impacts current volumes, and the time period has been too short to truly determine the impact, as they left the deals in 2009 and 2010. However, his conclusion based on the low number of ILL requests is that the download stats are inflated by incidental use, repeat use, convenience, and linking methods.
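The subscription-versus-ILL reasoning can be sketched as a simple cost comparison. The function and figures below are hypothetical illustrations, not the library's actual numbers:

```python
# Compare the cost of keeping a subscription against paying per ILL request
# after cancellation. All figures are hypothetical illustrations.

def cheaper_to_cancel(subscription_cost: float,
                      ill_cost_per_request: float,
                      expected_ill_requests: int) -> bool:
    """True if expected ILL spending after cancellation is less than
    the subscription cost."""
    return ill_cost_per_request * expected_ill_requests < subscription_cost

# A title with fewer than one download per month (say 10 per year): even if
# every download became a $25 ILL request, that is $250 against a $2000
# subscription.
print(cheaper_to_cancel(2000, 25, 10))  # True
```

The speaker's point is that the real number of ILL requests falls well below the download count, so the actual post-cancellation cost is even lower than this worst-case assumption.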

The other area of impact is reaction and response, and so far they have had only three complaints. It could be because faculty are sympathetic, or it could be because they haven’t needed the current content, yet. They have used this as an opportunity to educate faculty about the costs. They also opened up cancellations from the big publishers, spreading the pain more than they could in the past.

In the end, they saved the equivalent of half their monograph budget by canceling the big deals and additional serials. Will the collection be based on the contracts they have or by the needs of the community?

Moving forward, they have hit some issues. One is that a certain publisher will impose a 25% content fee to go title by title. Another is that title-by-title purchasing put them back at list price, which is much higher than the capped prices they had under the deal. They were able to alleviate some issues through negotiation and by agreeing to multi-year deals that begin with the refreshed lists of titles.

The original GWLA deal with Springer allowed for LOCKSS as a means of archival access. However, Springer took the stance that they would not work with LOCKSS, so the lawyers got involved over the apparent breach of contract. In the end, Springer agreed to abide by the terms of the contract and make their content available for LOCKSS harvesting.

Make sure you address license issues before the end of the terms.

Speaker: David Fowler

They left the Elsevier and Wiley deals for their consortia. They had pursued cost-saving measures in the past by eliminating duplication of format and high-cost, low-use titles, but in the end, they had to consider their big deals.

The first thing they eliminated was the pay per use access to Elsevier due to escalating costs and hacking abuse. The second thing they did was talk to OSU and PSU about collaborative collection development, including a shared collection deal with Elsevier. Essentially, they left the Orbis Cascade deal to make their own.

Elsevier tried to negotiate with the individual schools, but they stood together and were able to reduce the cancellations to 14% thanks to a reduced content fee. So far, the two-year deal has been good, they are working on a four-year deal, and they won't exceed their 2009 spend until 2014.

They think that the ILL increase has more to do with their WorldCat Local implementation, and few Elsevier titles were requested. Some faculty are concerned about the loss of low-use, high-cost titles, so they are considering a library-mediated pay-per-view option.

The Wiley deal was through GWLA, and when it came to the end, they determined that they needed to cancel titles that were not needed anymore, which meant leaving the deal. They considered going the same route they did with Elsevier, but were too burnt out to move forward. Instead, they have a single-site enhanced license.

We cannot continue to do business as usual. They expect to have to do a round of cancellations in the future.

NASIG 2011: Science Re-Imagined

Speaker: Adam Bly

He spoke from the perspective that science has the potential to improve the state of the world. We are in a moment of discovering things we have never seen before. We also have the capability to manipulate nature, which is prompting a need for an ethical framework. And science is now being done in cultures that have rich scientific histories but are not part of the Western traditions.

The amount of data we are now creating is producing a moment of incredible opportunity. Information scientists have the opportunity to influence the literacy of society so that we can take ownership of and understand the data created, particularly that data about ourselves.

Science has the potential to change the world but only if two conditions exist. We need to think of science literacy as something for every single person on this planet. The second condition is open science — science can’t be proprietary for a company or a country.

We need a new philosophy of scientific literacy. Science is not just about its output. It is a way of thinking. Science is a lens through which we can solve the world’s problems. Science must engage through culture and ideas.

Art has a role in exploring the possibilities of science. Take the book Flatland as an example of exploring something we have trouble understanding.

We didn’t grow up hating science, because we didn’t know it was science. As children, we were all scientists by our actions and experimentation. It is when we associate science with exams and challenging tasks that we begin to hate it.

The future of science is open, not because it should be, but because it needs to be in order to progress. Nature does not recognize our disciplines; everything is complex and connected. Science is not a closed system: 65% of scientists cite literature as an influence on their work.

The web that was created by science is not ideal for science. It is disorganized, fragmented, and inefficient. Information is organized by disciplines, decisions are made on lagging indicators, and innovations are driven to preserve a business model rather than to advance research. Scientists deserve better, and society needs better.

Scientists aren’t waiting. They hack things. They fix things. They don’t wait on the product cycle.

Open science that could work would be based on a digital core that does not rely on the limitations of paper, with a mandated free flow of information, subsidized peer review, open standards and interoperability, knowledge derived from information, and modern metrics.

The information of today is complex and vast, and we need a new way to visualize it. We are bringing design and computer science together. Visualizing.org hopes to connect open data with open design.

There is no such thing as “open access,” “open data,” or “open science” — we solve society’s problems with science as a tool and a lens. A 21st century renaissance is science integrated with and not standing in opposition to other forces.