NASIG 2013: Collaboration in a Time of Change


Speaker: Daryl Yang

Why collaborate?

Despite how popular Apple products are today, the company almost went bankrupt in the 1990s. Experts believe that despite its innovation, a lack of collaboration led to this near-downfall. iTunes, the iPod, and the iPad all require working with many outside developers, and that collaboration is a big part of why Apple came back.

Microsoft started off as very open to collaboration and innovation from outside of the company, but that is not the case now. In order to get back into the groove, they have partnered with Nokia to enter the mobile phone market.

Collaboration can create commercial success, innovation, synergies, and efficiencies.

What change?

The amount of information generated now is vastly more than has ever been collected in the past. It is beyond our imagination.

How has library work changed? We still manage collections and access to information, but the way we do so has evolved with the ways information is delivered. We have had to strengthen our negotiation skills, as every transaction is now negotiated individually based on our customer profile. We have also needed to reorganize our structures and workflows to meet the changing needs of our institutions and the information environment.

Deloitte identified ten key challenges faced by higher education: funding (public, endowment, and tuition), rivalry (competing globally for the best students), setting priorities (appropriate use of resources), technology (infrastructure & training), infrastructure (classroom design, offices), links to outcomes (graduation to employment), attracting talent (and retaining them), sustainability (practicing what we preach), widening access (MOOC, open access), and regulation (under increasing pressure to show how public funding is being used, but also maintaining student data privacy).

Libraries say they have too much stuff on shelves, more of it is available electronically, and it keeps coming. Do we really need to keep both print and digital when there is a growing pressure on space for users?

The British Library Document Supply Centre plays an essential role in delivering physical content on demand, but the demand is falling as more information is available online. And, their IT infrastructure needs modernization.

These concerns sparked the conversations that created UK Research Reserve (UKRR) and an evaluation of print journal usage. Users prefer print for in-depth reading, and the humanities and social sciences still make heavy use of print materials compared to the sciences. At least, that was the case 5-6 years ago when UKRR was created.

Ithaka S+R, JISC, and RLUK sent out a survey to faculty about print journal use, and they found that this is still largely true. They also discovered that even faculty who are comfortable with electronic journal collections would not be happy to see print collections discarded. There was clearly a demand that some library, if not their own, maintain a collection of hard copies of journals. Individual libraries don’t all have to keep them, but SOMEONE has to.

It is hard to predict research needs in the future, so it is important to preserve content for that future demand, and make sure that you still own it.

UKRR’s initial objectives were to de-duplicate low-use journals and allow their members to release space and realize savings/efficiency, and to preserve research material and provide access for researchers. They also want to achieve cultural change — librarians/academics don’t like to throw away things.

So far, they have examined 60,700 holdings, and of those, only 16% have been retained. They intend to keep at least three copies among the membership, so the low retention rate reflects a significant amount of overlap in holdings across the member schools.
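
A rough sketch of the kind of scarcity check this implies, purely as my own illustration (the member names, titles, and threshold handling are assumptions, not UKRR’s actual system): a holding offered for de-duplication can only be released if at least three copies would remain across the membership.

```python
# Illustrative sketch (not UKRR's actual system): decide which offered holdings
# can be released while keeping at least three copies across the membership.
from collections import Counter

MIN_COPIES = 3  # assumed retention threshold, per the talk

def release_decisions(offered_holdings):
    """offered_holdings: list of (member, journal_title) tuples offered for de-duplication.
    Returns a decision ('release' or 'retain') for each offered copy."""
    copies = Counter(title for _, title in offered_holdings)
    decisions = {}
    for member, title in offered_holdings:
        # A copy may be released only if more than MIN_COPIES are currently held.
        if copies[title] > MIN_COPIES:
            decisions[(member, title)] = "release"
            copies[title] -= 1  # one fewer copy remains after this release
        else:
            decisions[(member, title)] = "retain"
    return decisions

holdings = [("Imperial", "J. Foo"), ("UCL", "J. Foo"), ("Leeds", "J. Foo"),
            ("York", "J. Foo"), ("Imperial", "J. Bar"), ("UCL", "J. Bar")]
for key, decision in release_decisions(holdings).items():
    print(key, decision)
```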

NASIG 2013: Adopting and Implementing an Open Access Policy — The Library’s Role


Speaker: Brian Kern

The open access policy was developed late last year and adopted/implemented in March. It has been live for 86 days, so he’s not an expert, but he has learned a lot in the process.

His college is small, and he expects fewer than 40 publications to be submitted per year; they are using the institutional repository to manage this.

They have cut about 2/3 of their journal collections over the past decade, preferring publisher package deals and open access publications. They have identified the need to advocate for open access as a goal of the library. They are using open source software where they can, hosted and managed by a third party.

The policy borrowed heavily from others, and it is a rights-retention mandate in the style of Harvard. One piece of advice they had was to not focus on the specifics of implementation within the policy.

The policy states that the license will be granted automatically, but waivers are available for embargoes or publisher prohibitions. There are no restrictions on where faculty can publish, and authors are encouraged to remove restrictive language from their contracts using an author addendum. Even with a waiver, all articles are deposited to at least a “closed” archive. The policy stipulates that only peer-reviewed articles are in scope, and it is not concerned with which version of the article is deposited. Anything published or contracted to be published before the adoption date is not required to comply, but authors can comply if they want to.
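
To make those rules concrete, here is a small, hypothetical model of the deposit logic as described in the talk; the adoption date and the function itself are my own assumptions, not the college’s actual workflow.

```python
# Hypothetical model of the policy rules described in the talk.
from datetime import date

ADOPTION_DATE = date(2013, 3, 1)  # assumed; the talk only says the policy was adopted in March

def deposit_decision(peer_reviewed, contracted_date, waiver_requested):
    """Return how an article is handled under the policy as described."""
    if not peer_reviewed:
        return "out of scope (policy covers peer-reviewed articles only)"
    if contracted_date < ADOPTION_DATE:
        return "exempt (published/contracted before adoption); deposit optional"
    if waiver_requested:
        return "waiver granted; deposit to closed archive"
    return "license granted automatically; deposit any version to the open repository"

print(deposit_decision(peer_reviewed=True, contracted_date=date(2013, 5, 1), waiver_requested=False))
```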

The funding, as one may expect, was left out. The library is going to cover the open access fees, with matching funds from the provost. Unused funds will be carried over year to year.

This was presented to the faculty as a way to ensure that their rights are respected when they publish their work. Nothing was said about the library’s traditional concerns of saving money and opening up access to local research output.

The web hub will include the policy, a FAQ, recommended author addenda based on the publisher, funding information, and other material related to the process. The faculty will be self-depositing, with review/editing by Kern.

They have a monthly newsletter/blog to let the campus know about faculty and student publications, so they are using this to identify materials that should be submitted to the collection. He’s also using Stephen X. Flynn’s code to identify OA articles via SHERPA/RoMEO to find the ones already published that can be used to populate the repository.
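
Flynn’s code isn’t reproduced here, but a minimal sketch of the kind of SHERPA/RoMEO lookup involved might look like this; the endpoint and XML element names are assumptions based on the legacy RoMEO API, and a registered API key may be required.

```python
# Minimal sketch (not Flynn's actual code): look up a journal's archiving policy
# in SHERPA/RoMEO by ISSN. Endpoint and element names are assumptions.
import requests
import xml.etree.ElementTree as ET

ROMEO_API = "http://www.sherpa.ac.uk/romeo/api29.php"  # assumed legacy endpoint

def romeo_colour(issn):
    """Return the RoMEO 'colour' (green/blue/yellow/white) for a journal, or None."""
    resp = requests.get(ROMEO_API, params={"issn": issn}, timeout=30)
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    colour = root.find(".//publisher/romeocolour")
    return colour.text if colour is not None else None

# e.g. flag journals that allow some form of self-archiving as deposit candidates
if romeo_colour("0028-0836") in ("green", "blue", "yellow"):
    print("Candidate for repository deposit")
```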

They are keeping the senior projects closed in order to keep faculty/student collaborations private (and faculty research data offline until they publish).

They have learned that the policy depends on faculty seeing open access as a reality and on the library keeping faculty informed of the issues. They were not prepared for how quickly the policy would get through and submissions would begin. Don’t expect faculty to be copyright lawyers. Keep the submission process as simple as possible, and allow alternatives like email or paper.

NASIG 2012: Managing E-Publishing — Perfect Harmony for Serialists

Presenters: Char Simser (Kansas State University) & Wendy Robertson (University of Iowa)

Iowa looks at e-publishing as an extension of the central mission of the library. This covers not only text, but also multimedia content. After many years of ad-hoc work, they formed a department to be more comprehensive and intentional.

Kansas really didn’t do much with this until they had a strategic plan that included establishing an open access press (New Prairie Press). This also involved reorganizing personnel to create a new department to manage the process, which includes the institutional repository. The press includes not only their own publications, but also hosts publications from a few other sources.

Iowa went with BEPress’ Digital Commons to provide both the repository and the journal hosting. Part of why they went this route for their journals was that they already had the platform for their repository, and they approach it more as a hosting platform than as a press/publisher. This means they did not need to add staff to support it, although they did add responsibilities to existing staff on top of their other work.

Kansas is using Open Journal Systems hosted on a commercial server due to internal politics that prevented it from being hosted on the university server. All of their publications are Gold OA, and the university/library is paying all of the costs (~$1700/year, not including the .6 FTE staff hours).

Day in the life of New Prairie Press — most of the routine stuff at Kansas involves processing DOI information for articles and works-cited, and working with DOAJ for article metadata. The rest is less routine, usually involving journal setups, training, consultation, meetings, documentation, troubleshooting, etc.

The admin back-end of OJS allows Char to view the site as if she were different types of users (editor, author, etc.) so she can troubleshoot issues for them. Rather than maintaining a test site, they have a “hidden” journal on the live site that they use to test functions.

A big part of her daily work is submitting DOIs to CrossRef and going through the backfile of previously published content to identify and add DOIs to the works-cited. The process is very manual, and the error rate is high enough that automation would be challenging.
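
For what it’s worth, a semi-automated lookup against the CrossRef REST API might look something like the sketch below; the score threshold is an arbitrary assumption, and as noted above, matches would still need human review.

```python
# Sketch of a semi-automated DOI lookup for a works-cited entry via the
# CrossRef REST API. The score threshold is an arbitrary assumption;
# low-confidence matches fall back to manual checking.
import requests

def find_doi(citation_text, min_score=80):
    resp = requests.get(
        "https://api.crossref.org/works",
        params={"query.bibliographic": citation_text, "rows": 1},
        timeout=30,
    )
    resp.raise_for_status()
    items = resp.json()["message"]["items"]
    if items and items[0].get("score", 0) >= min_score:
        return items[0]["DOI"]
    return None  # no confident match; check by hand

citation = "Author, A. (2005). Example article title. Example Journal 12(3), 45-67."  # made-up entry
print(find_doi(citation))
```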

Iowa does have some subscription-based titles, so part of the management involves keeping up with a subscriber list and IP addresses. All of the titles eventually fall into open access.
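
As a purely hypothetical illustration of that kind of subscriber check (not Iowa’s actual setup; the institutions and IP ranges are made up):

```python
# Hypothetical illustration of IP-based access checks for subscription titles.
from ipaddress import ip_address, ip_network

# Subscriber list: institution -> registered IP ranges (documentation-only example ranges)
SUBSCRIBERS = {
    "Example University": [ip_network("192.0.2.0/24")],
    "Example College": [ip_network("198.51.100.0/25")],
}

def has_access(client_ip):
    """Return True if the client IP falls within any subscriber's registered ranges."""
    addr = ip_address(client_ip)
    return any(addr in net for nets in SUBSCRIBERS.values() for net in nets)

print(has_access("192.0.2.42"))   # True
print(has_access("203.0.113.9"))  # False
```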

Most of the work at Iowa has been with retrospective content — taking past print publications and digitizing them. They are also concerned with making sure the content follows current standards that are used by both library systems and Google Scholar.

There is more. I couldn’t take notes and keep time towards the end.

reason #237 why JSTOR rocks

For almost two decades, JSTOR has been digitizing and hosting core scholarly journals across many disciplines. Currently, their servers store more than 1,400 journals from the first issue to a rolling wall of anywhere from 3-5 years ago (for most titles). Some of these journals date back several centuries.

They have backups, both digital and virtual, and they’re preserving metadata in the most convertible/portable formats possible. I can’t even imagine how many servers it takes to store all of this data. Much less how much it costs to do so.

And yet, in the spirit of “information wants to be free,” they are making the pre-copyright content open and available to anyone who wants it. That’s content published in the United States before 1923, and before 1870 for everything else. Sure, it’s not going to be very useful for some researchers who need more current scholarship, but JSTOR hasn’t been about new stuff so much as preserving and making accessible the old stuff.

So, yeah, that’s yet another reason why I think JSTOR rocks. They’re doing what they can with an economic model that is responsible, and making information available to those who can’t afford it or are not affiliated with institutions that can purchase it. Scholarship doesn’t happen in a vacuum, and innovators and great minds aren’t always found solely in wealthy institutions. This is one step towards bridging the economic divide.

NASIG 2011: Leaving the Big Deal – Consequences and Next Steps

Speaker: Jonathan Nabe

His library has left the GWLA Springer/Kluwer and Wiley-Blackwell consortial deals, as well as a smaller consortial deal for Elsevier. The end result is a loss of access to just under 2,000 titles, but most of those titles had fewer than one download per month in the year prior to departure, so they feel ILL is a better price than subscription for them.
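
A back-of-the-envelope version of that comparison, with all prices being hypothetical assumptions rather than figures from the talk:

```python
# Back-of-the-envelope comparison of subscription cost per use versus ILL.
# All dollar figures are hypothetical; the talk gave usage rates, not prices.
subscription_cost = 1200.0   # assumed annual subscription price for one title
downloads_per_year = 10      # roughly "fewer than one download per month"
ill_cost_per_request = 25.0  # assumed average cost to fill one ILL request

cost_per_download = subscription_cost / downloads_per_year
print(f"Subscription cost per download: ${cost_per_download:.2f}")    # $120.00
print(f"ILL cost per request:           ${ill_cost_per_request:.2f}")  # $25.00
# At this usage level, filling every request through ILL is far cheaper
# than maintaining the subscription.
```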

Because of the hoops users must jump through for ILL, he thinks those requests indicate more of a real need than downloads of content available directly to the user. Because they retain archival access, withdrawing from the deals only affects current volumes, and the time period has been too short to truly determine the impact, as they left the deals in 2009 and 2010. However, his conclusion, based on the low number of ILL requests, is that the download stats overstate real use due to incidental use, repeat use, convenience, and linking methods.

The other area of impact is reaction and response, and so far they have had only three complaints. It could be because faculty are sympathetic, or it could be because they haven’t needed the current content, yet. They have used this as an opportunity to educate faculty about the costs. They also opened up cancellations from the big publishers, spreading the pain more than they could in the past.

In the end, they saved the equivalent of half their monograph budget by canceling the big deals and additional serials. Will the collection be shaped by the contracts they have or by the needs of the community?

Moving forward, they have hit some issues. One is that a certain publisher will impose a 25% content fee to go title by title. Another is that title-by-title purchasing put them back at list price, which is much higher than the capped prices they had under the deal. They were able to alleviate some of these issues by negotiating multi-year deals that begin with refreshed title lists.

The original GWLA deal with Springer allowed for LOCKSS as a means of archival access. However, Springer took the stance that they would not work with LOCKSS, so the lawyers got involved over the apparent breach of contract. In the end, Springer agreed to abide by the terms of the contract and make their content available for LOCKSS harvesting.

Make sure you address license issues before the end of the terms.

Speaker: David Fowler

They left the Elsevier and Wiley deals for their consortia. They had taken cost-saving measures in the past, eliminating duplicate formats and high-cost, low-use titles, but in the end they had to consider their big deals.

The first thing they eliminated was pay-per-use access to Elsevier, due to escalating costs and hacking abuse. The second thing they did was talk to OSU and PSU about collaborative collection development, including a shared collection deal with Elsevier. Essentially, they left the Orbis Cascade deal to make their own.

Elsevier tried to negotiate with the individual schools, but they stood together and were able to reduce the cancellations to 14%, thanks to a reduced content fee. So far, the two-year deal has been good, they are working on a four-year deal, and they won’t exceed their 2009 spend until 2014.

They think the increase in ILL has more to do with their WorldCat Local implementation, as few Elsevier titles were requested. Some faculty are concerned about the loss of low-use, high-cost titles, so they are considering a library-mediated pay-per-view option.

The Wiley deal was through GWLA, and when it came to an end, they determined that they needed to cancel titles they no longer needed, which meant leaving the deal. They considered going the same route they did with Elsevier, but were too burnt out to move forward. Instead, they have a single-site enhanced license.

We cannot continue to do business as usual. They expect to have to do a round of cancellations in the future.