NASIG 2010: Publishing 2.0: How the Internet Changes Publications in Society

Presenter: Kent Anderson, JBJS, Inc

Medicine 0.1: in dealing with the influenza outbreak of 1837, a physician administered leeches to the chest, James’s powder, and mucilaginous drinks, and it worked (much like “take two aspirin and call me in the morning”). All of this was written up in a medical journal as a way to share information with peers. Journals have been the primary means of communicating scholarship, but what a journal is has become more abstract with the addition of non-text content and metadata. Add in indexes and other portals for accessing the information, and readers have changed the way they access and share information in journals. “Non-linear” access to information is increasing exponentially.

Even as technology made publishing easier and more widespread, it was still producers delivering content to consumers. But, with the advent of Web 2.0 tools, consumers now have tools that in many cases are more nimble and accessible than the communication tools that producers are using.

Web 1.0 was a destination. Documents simply moved to a new home, and “going online” was a process separate from anything else you did. However, as broadband access increases, the web becomes more pervasive and less a destination. The web becomes a platform that brings people, not documents, online to share information, consume information, and use it like any other tool.

Heterarchy: a system of organization replete with overlap, multiplicity, mixed ascendancy, and/or divergent but coexistent patterns of relation

Apomediation: mediation by agents who are not interposed between users and resources, but who stand by to guide a consumer to high-quality information without playing a role in the acquisition of the resources (e.g., Amazon product reviewers)

NEJM uses terms by users to add related searches to article search results. They also bump popular articles from searches up in the results as more people click on them. These tools improved their search results and reputation, all by using the people power of experts. In addition, they created a series of “results in” publications that highlight the popular articles.

It took a little over a year to reach a million Twitter authors, but about 600 years to reach the same number of book authors. And these are literate, savvy users. Twitter and Facebook account for 1.45 million views of the New York Times (and this is a number from several years ago) — imagine what they could do for your scholarly publication. Oh, and the NYT has a social media editor now.

Blogs are growing four times as fast as traditional media. The top ten media sites include blogs, and traditional media sources now use blogs as well. Blogs can be diverse or narrow, their coverage varies (and does not have to be immediate), they can be verifiably accurate, and they are interactive. Blogs level the media playing field, in part by watching the watchdogs, and they tend to investigate more than the mainstream media does.

It took AOL five times as long as the iPhone to reach twenty million users. Consumers are increasingly adding “toys” to their collection of ways to get to digital/online content. When the NEJM went on the Kindle, more than just physicians subscribed. Getting content into easy-to-access places and onto the “toys” that consumers use will increase your reach.

Print digests are struggling because they teeter on the brink of the daily divide. Why wait for the news to get stale, collected, and delivered a week/month/quarter/year later? Our audiences are transforming: they don’t think of information as analogue, delayed, isolated, or tethered. It has to evolve into something digital, immediate, integrated, and mobile.

From the Q&A session:

The article container will be here for a long time. Academics use the HTML version of the article, but the PDF (static) version is their security blanket and archival copy.

Where does the library fit as a source of funds when the focus is more on end users? Publishers are looking for other sources of income as library budgets decrease (e.g., Kindle editions, product differentiation), and they are looking to other purchasing centers at institutions.

How do publishers establish the cost of these 2.0 products? It’s essentially what the market will bear, with some adjustments. Sustainability is a grim perspective. Flourishing is much more positive, and not necessarily any less realistic. Equity is not a concept that comes into pricing.

The people who bring the tremendous flow of information under control (i.e. offer filters) will be successful. One of our tasks is to make filters to help our users manage the flow of information.

NASIG 2010: Let the Patron Drive: Purchase on Demand of E-books

Presenters: Jonathan Nabe, Southern Illinois University-Carbondale and Andrea Imre, Southern Illinois University-Carbondale

As resources have dwindled over the years, libraries want to make sure every dollar spent is going to things patrons will use. Patron-driven acquisition (PDA) means you’re only buying things that your users want.

With Coutts’ MyiLibrary, they have access to over 230,000 titles from more than 100 publishers, but they’ve set up some limitations and parameters (LC class, publication year, price, readership level) to determine which titles will be made available to users for the PDA program. You can select additional titles after the initial setup, so the list is constantly being revised and enhanced. And they were able to upload their holdings to eliminate duplication.

[There are, of course, license issues that you should consider for your local use, as with any electronic resource. Ebooks come with different sorts of use concerns than journals, but by now most of us are familiar with them. Nevertheless, those of us in the session were blessed with a brief overview of these concerns. I recommend a literature search if this interests you.]

They opted for a deposit account to cover the purchases, and when a title is purchased, they add a purchase order to the bibliographic record already in the catalog. (Records for available titles in the program are added to the catalog to begin with, and titles are purchased after they have been accessed three times.)

[At this point, my attention waned even further. More interested in hearing about how it’s working for them than about the processes they use to set up and manage it, as I’m familiar with how that’s supposed to work.]

They’ve spent over $54,000 since November 2008 and purchased 470 titles (approx $115/title on average). On average, 95 pages are viewed per purchased title, which is a stat you can’t get from print. Half of the titles have been used after the initial purchase, and over 1,000 titles were accessed once or twice (prior to purchase and not enough to initiate purchase).
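The per-title average above is just the total spend divided by the number of titles purchased; a quick sketch (using the totals reported in the talk) confirms the approximate figure:

```python
# Back-of-the-envelope check of the cost-per-title figure reported above.
# The dollar amounts are the totals from the presentation, not live data.
total_spent = 54000        # dollars spent since November 2008
titles_purchased = 470

avg_cost_per_title = total_spent / titles_purchased
print(f"${avg_cost_per_title:.2f} per title")  # roughly $115/title
```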

Social sciences and engineering/technology are the high users, with music and geography at the low end. Notably, other librarians have pushed back against PDA more than users have; in their case, the humanities librarian decided this wasn’t a good process and withdrew those titles from the program.

During the same time period, they purchased almost 17,000 print titles, and, due to outside factors that delayed purchases, 77% of those titles have never circulated. Only 1% circulated more than four times. [Hard to compare the two, since an ebook may be viewed several times by one person referring back to it, while a print book only has the checkout stat and no way to count the number of times it is “viewed” in the same way.]

Some issues to consider:

  • DRM (digital rights management) can cause problems with using the books for classroom/course reserves. DRM also often prevents users from downloading the books to preferred portable, desktop, or other ebook readers. There are also problems with incompatible browsers or operating systems.
  • Discovery options also provide challenges. Some publishers are better than others at making their content discoverable through search tools.
  • ILL is non-existent for ebooks. We’ve solved this for ejournals, but ebooks are still a stumbling block for traditional borrowing and lending.
  • There are other ebook purchasing options, and the “big deal” may actually be more cost-effective: it provides the wide access options at a lower per-book cost.
  • Archival copies may not be provided, and when they are, there are issues with preservation and access that shift long-term storage from free to an undetermined cost.

one less totebag this year

The North American Serials Interest Group (NASIG) is meeting at Rancho Las Palmas in Palm Springs (CA) this year, and rather than having a bunch of new totebags made for carrying around conference swag, they’re holding a contest that asks participants to bring bags from past conferences. Any bag will do, whether it be from a NASIG conference or not, and there are four categories to compete in: oldest NASIG bag, oldest conference bag in general, bag from a conference furthest from Palm Springs, and ugliest bag.

I don’t have any that could compete well in this contest, so I won’t be participating other than bringing the plain old NASIG-branded canvas tote bag I bought last year. It’s easier to get the netbook and other swag in and out of that between sessions than it is to use my backpack.

ER&L 2010: Comparison Complexities – the challenges of automating cost-per-use data management

Speakers: Jesse Koennecke & Bill Kara

We have the use reports, but it’s harder to pull in the acquisitions information because of the systems it lives in and the different subscription/purchase models. Cornell had a cut in staffing and an immediate need to assess their resources, so they began to triage cost-per-use statistics requests. They are not doing systematic or comprehensive reviews of all usage and cost per use.

In the past, they have tried doing manual preparation of reports (merging files, adding data), but that’s time-consuming. They’ve had to set up processes to consistently record data from year to year. Some vendor solutions have been partially successful, and they are looking to emerging options as well. Non-publisher data, such as link resolver use data and proxy logs, might be sufficient for some resources, or could add a layer to the COUNTER information to help explain some use. All of this has required certain skill sets (databases, spreadsheets, etc.).

Currently, they are working on managing expectations. They need to define the product that their users (selectors, administrators) can expect on a regular basis, what they can handle on request, and what might need a cost/benefit decision. In order to get accurate time estimates for the work, they looked at 17 of their larger publisher-based accounts (not aggregated collections) to get an idea of patterns and unique issues. As an unfortunate side effect, every time they look at something, they get an idea of even more projects they will need to do.

The matrix they use includes: paid titles v. total titles, differences among publishers/accounts, license period, cancellations/swaps allowed, frontfile/backfile, payment data location (package, title, membership), and use data location and standard. Some of the challenges with usage data include non-COUNTER compliance or no data at all, multiple platforms for the same title, combined subscriptions and/or title changes, titles transferred between publishers, and subscribed content v. purchased content. Cost data depends on the nature of the account and the nature of the package.

For packages, you can divide the single line item by the total use, but that doesn’t help the selectors assess the individual subset of titles relevant to their areas/budgets. This gets more complicated when you have packages and individual titles from a single publisher.
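The allocation problem described above can be sketched in a few lines: a package has one invoice line, so the only way to give selectors a per-title figure is to apportion that cost by each title's share of total use. This is a hypothetical illustration, not Cornell's actual process; the function name, package cost, and per-title use counts are all invented.

```python
def allocate_package_cost(package_cost, uses_by_title):
    """Split a single package cost across titles in proportion to use."""
    total_use = sum(uses_by_title.values())
    return {
        title: package_cost * uses / total_use
        for title, uses in uses_by_title.items()
    }

# Invented example: a $10,000 package whose three titles split 500 total uses.
uses = {"Journal A": 300, "Journal B": 150, "Journal C": 50}
costs = allocate_package_cost(10000, uses)
# Journal A carries 60% of the use, so it is assigned 60% of the cost.
```

This simple proportional split still leaves the problem noted above: it says nothing about which subset of titles belongs to which selector's budget, and it breaks down when individual subscriptions overlap the package.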

Future possibilities: better automated matching of cost and use data, with some useful data elements such as multiple cost or price points, and formulas for various subscription models. They would also like to consolidate accounts within a single publisher to reduce confusion. Also, they need more documentation so that it’s not just in the minds of long-term staff. 

ER&L 2010: Patron-driven Selection of eBooks – three perspectives on an emerging model of acquisitions

Speaker: Lee Hisle

They have the standard patron-driven acquisitions (PDA) model through Coutts’ MyiLibrary service. What’s slightly different is that they are also working on a pilot program with a three-college consortium with a shared collection of PDA titles. After the second use of a book, they are charged 1.2-1.6 times the list price of the book for a 4-SU, perpetual access license.

Issues with ebooks: fair use is replaced by license terms and software restrictions; ownership has been replaced by licenses, so if Coutts/MyiLibrary were to go away, the library would have to renegotiate with the publishers; there is a need for an archiving solution for ebooks, much like Portico for ejournals; ILL is neither feasible nor permissible; there is the potential for exclusive distribution deals; and there are device limitations (computer screens v. ebook readers).

Speaker: Ellen Safley

Her library has been using EBL on Demand. They are only buying 2008-current content within specific subjects/LC classes (history and technology), and they purchase on the second view. Because they only purchase a small subset of what they could, the number of records they load fluctuates, but isn’t overwhelming.

After a book has been browsed for more than 10 minutes, a pay-per-view purchase is initiated. After eight months, they found that more people used the books at the pay-per-view level than at the purchase level (i.e., more than once).
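A minimal sketch of the trigger rules as described: a browse over ten minutes counts as a paid (pay-per-view) use, and the second paid use triggers the purchase. The function and its inputs are invented for illustration; only the thresholds come from the talk.

```python
def classify_event(browse_minutes, prior_paid_views):
    """Return the billing event, if any, for one browsing session.

    Hypothetical model of the EBL-style rules described in the session:
    browsing for 10 minutes or less is free, the first long browse is
    billed pay-per-view, and the second one triggers a purchase.
    """
    if browse_minutes <= 10:
        return "free browse"
    if prior_paid_views >= 1:
        return "purchase"       # second paid view buys the book
    return "pay-per-view"

print(classify_event(5, 0))     # free browse
print(classify_event(12, 0))    # pay-per-view
print(classify_event(15, 1))    # purchase
```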

They’re also piloting an Ebrary program. They had to deposit $25,000 for the six-month pilot, then select from over 100,000 titles. They found that the sciences used the books heavily, but there were indications that humanities titles were popular as well.

The difficulty with this program is an overlap between selector print order requests and PDA purchases. It’s caused a slight modification of their acquisitions flow.

Speaker: Nancy Gibbs

Her library had a pilot with Ebrary. They were cautious about jumping into this, but because it was coming from their approval plan vendor, it was easier to match it up. They culled the title list of 50,000 titles down to 21,408, loaded the records, and enabled them in SFX. But they did not advertise it at all, and gave no indication on the user end when a book was purchased.

Within 14 days of starting the project, they had spent all $25,000 of the pilot money. Of the 347 titles purchased, 179 were also owned in print, but those print copies had only 420 circulations among them. The most heavily used purchased title is also owned in print and has had only two circulations. The purchases leaned towards STM, political science, and business/economics, with some humanities.

The library’s tech services staff were a bit overwhelmed by the number of records in the load. The MARC records lacked OCLC numbers, which they would need in the future. They did not remove the records after the trial ended because of other, more pressing needs, but that caused frustration among users, and they do not recommend it.

They were surprised by how quickly they went through the money. If they had advertised, she thinks they may have spent the money even faster. The biggest challenge they had was culling through the list, so in the future running the list through the approval plan might save some time. They need better match routines for the title loads, because they ended up buying five books they already had in electronic format from other vendors.

Ebrary needs to refine circulation models to narrow down subject areas. YBP needs to refine some BISAC subjects, as well. Publishers need to communicate better about when books will be made available in electronic format as well as print. The library needs to revise their funding models to handle this sort of purchasing process.

They added the records to their holdings on OCLC so that they would appear in Google Scholar search results. So, even though they couldn’t loan the books through ILL, there is value in adding the holdings.

They attempted to make sure that the books in the list were not textbooks, but there could have been some, and professors might have used some of the books as supplementary course readings.

One area of concern is the potential of compromised accounts that may result in ebook pirates blowing through funds very quickly. One of the vendors in the room assured us they have safety valves for that in order to protect the publisher content. This has happened, and the vendor reset the download number to remove the fraudulent downloads from the library’s account.

customer service

My car was broken into last week. After I got over the initial shock and disbelief, I focused on getting the window repaired and dealing with the cleanup. The thief stole my GPS (which I’d had for about three months) and the Sony eReader Touch that was sent to me to review over the next few months (which I’d had for about a week). Replacement costs for the stolen items are around $450. The window cost a bit more than the $250 deductible from my insurance. I’m still waiting to hear what the insurance company will do about the property loss.

When I let Sony’s PR folks know I wouldn’t be able to write the reviews, their immediate response was sympathy for my situation and an inquiry into whether they could send me a replacement. Several days later, I have received notification that I will indeed be getting a replacement from them. The cost of the reader is nominal for Sony compared to the publicity they’re likely to get by me writing about it, so it’s probably no skin off their nose to send another one, but it sure means a lot to me that they did.

This got me to thinking about libraryland and our customer service practices. Most libraries aren’t multinational companies with huge revenues, but the way we handle situations like this with our users can have an impact on our relationships with them. What would you do if one of your users came to you with a story of their car getting broken into and the library books they checked out were stolen? Would you believe them? Would your policies allow you to waive any fines or replacement costs for the lost books?

IL2009: Technology: The Engine Driving Pop Culture-Savvy Libraries or Source of Overload?

Speaker: Elizabeth Burns

Technology and pop culture drive each other. Librarians sometimes assume that people using technology like smart phones in libraries are wasting time, both theirs and ours, but we really don’t know how they are using tech. Librarians need to learn how to use the tech that their user community employs, so don’t hinder your staff by limiting what tech they can use while in the workplace.

Libraries also have the responsibility to inform users of the services and technology available to them. Get the tools, learn how to use them, and then get to work building things with them.

Your library’s tech trendspotting group needs more than just the techie people. Get the folks who aren’t as excited about the shiny to participate and ask questions. Don’t let the fear of Betamax stop you – explore new devices and delivery methods now rather than waiting to find out if they have longevity. You never know what’s going to stick.

Speaker: Sarah Houghton-Jan

"Information overload is the Devil"

Some people think that it didn’t exist before mobile phones and home computers, but the potential has always existed. Think about the piles of books you’ve acquired but haven’t read yet. Information overload is all of the piles of things you want to learn but haven’t yet.

"We have become far more proficient in generating information than we are in managing it…"

Librarians are more equipped to handle information overload than most others. Manage your personal information consumption with the same kind of tools and skills you use in your professional life.

Some of the barriers to dealing with information overload are: lack of time (or a perceived lack of time), lack of interest or motivation, not being encouraged/threatened by management, not knowing where to start, and frustration with past attempts. Become like the automatic towel dispensers that have the towels already dispensed and ready to be torn off as needed.

Inventory your inputs and devices. Think before you send/subscribe. Schedule yourself, including unscheduled work and tasks. Use downtime (bring tech that helps you do it). Stay neat. Keep a master waiting list of things that other people "owe" you, and then periodically follow up on them. Weed, weed, and weed again. Teach others communication etiquette (and stick to it). Schedule unplugged times, and unplug at will.

RSS/Twitter overload: Limit your feeds and following, and regularly evaluate them. Use lists to organize feeds and Twitter friends. Use RSS when applicable, and use it to send you reminders.

Interruptive technology (phone, IM, texts, Twitter, etc): Use them only when they are appropriate for you. Check it when you want to, and don’t interrupt yourself. Use your status message. Lobby for IM or Twitter at your workplace (as an alternative to phone or email, for the status message function & immediacy). Keep your phone number private. Let it ring if you are busy. Remember that work is at work and home is at home, and don’t mix the two.

Email: Stop "doing email" — start scheduling email scanning time, use it when appropriate, and deal with it by subject. Keep your inbox nearly empty and filter your messages. Limit listservs. Follow good email etiquette. Delete and archive, and keep work and personal email separate.

Physical items: Just because you can touch it, doesn’t mean you should keep it. Cancel, cancel, cancel (catalogchoice.org). Weed what you have.

Multimedia: Choose entertainment thoughtfully. Limit television viewing and schedule your entertainment time. Use your commute to your benefit.

Social networking: Schedule time on your networks. Pick a primary network and point other sites towards it. Limit your in-network IM.

Time & stress management: Use your calendar. Take breaks. Eliminate stressful interruptions. Look for software help. Balance your life and work to your own liking, not your boss’s or your spouse’s.

[Read Lifehacker!]

IL2009: Mobile Content Delivery in the Enterprise

Speaker: Britt Mueller

Often, there are more librarians whose organizations loan ebook readers to their users than librarians who own or use ebook readers themselves. Devices are driving all of the changes in the content, and we need to pay attention to that.

General Mills launched their ebook reader lending program in the fall of 2008 with six Kindles pre-loaded with content and attached to a purchasing card registered with each device. They’ve had over 120 loans over the past year with a wait list (two-week loan periods).

Qualcomm launched a similar program at around the same time, but they went with four types of ereaders: the Kindle, Sony 505, Bookeen Cybook, and Irex Iliad. They’ve had over 500 loans over the past year with a wait list, and they’ve updated the devices with newer models as they were released.

One of the down sides to the devices is that there is no enterprise model. Users have to go through the vendor to get content, rather than getting the content directly from the library. Users liked the devices but wanted them to be as customized to their individual preferences and yet still shareable, much like borrowing other devices like laptops and netbooks from the library.

There is a uniform concern among publishers and vendors for how to track/control usage in order to pay royalties, which makes untethering the content problematic. There is a lack of standardization in format, which makes converting content to display on a wide range of devices problematic as well. And finally, the biggest stumbling block for libraries is a lack of an enterprise model for acquiring and sharing content on these devices.

Implications for the future: integration into the ILS, staff time to manage the program, cost, and eventually, moving away from lending devices and moving towards lending the content.

Internet Librarian 2009 begins

Yesterday was my first time touching California soil (I had previously spent some time in LAX, but I don’t think that counts), and I have to say, Monterey is as beautiful as everyone says it is. Also, the Crown & Anchor is a fantastic place to gather with friends who arrived and left through the evening last night. Good times.

I arrived too late this morning to get a seat at the opening keynote session with Vint Cerf, Chief Internet Evangelist for Google, so I stood in the back and listened for most of it. Look around and you’ll probably find some good write-ups; it was also streamed live, and the recording is available on Ustream. Pay attention to the Ustream channel to catch more of IL 2009!

This afternoon, I will be co-presenting on some of (IMHO) the best tools for collaboration using cloud computing resources. We have our presentation posted on SlideShare already, if you’re interested (and that way, you don’t have to be there and see how nervous I can be when speaking in front of a group of people who are probably smarter than me).

o hai!

No posts since July? No, this blog isn’t dead yet, but it’s certainly on life support. I am finally admitting that with all of the other venues for getting my thoughts and opinions out to the world, this blog is falling by the wayside.

I like having a space to share things that need more than 140 characters, but as you can see, I don’t always have time to make use of it. I encourage you to keep this in your feed reader, as I do intend to continue to do summary notes of conference sessions I attend, and occasionally throw in a book review or a rant about some current news item.

If you want to keep up with my other activities, check the sidebar for links and such. I’m also on Twitter, although I’ve been relatively quiet there, too.
