NASIG 2009: What Color Is Your Paratext?

Presenter: Geoffrey Bilder, CrossRef

The title is a reference to What Color Is Your Parachute?, a book about job hunting and changing careers, which is relevant to what the serials world is facing in both personnel and content. Paratext is added content that prepares the audience/reader for the meat of the document. We are very good at controlling and evaluating credibility, which is important when conveying information via paratext.

The internet is fraught with false information, which undermines credibility. The publisher’s value is being questioned because so much of their work can be done online at little or no cost, and what can’t be done cheaply is being questioned. Branding is increasingly hidden by layers like Google, which provide content without indicating the source. The librarian’s problem is similar to the publisher’s. Our value is being questioned when the digital world is capable of managing some of our work through distributed organizational structures.

“Internet Trust Anti-Pattern” — a system starts out as being a self-selected core of users with an understanding of trust, but as it grows, that can break down unless there is a structure or pervasive culture that maintains the trust and authority.

Local trust is that which is achieved through personal acquaintance and is sometimes transitive. Global trust extends through proxy, which transitively extends trust to “strangers.” Local is limited and hard to expand, and global increases systemic risk.

Horizontal trust occurs among equals with little possibility of coercion. Vertical trust occurs within a hierarchy, and coercion can be used to enforce behavior, which could lead to abuse.

Internet trust is in the local and horizontal quadrant. Scholarly trust falls in the vertical and global quadrant. It’s no wonder we’re having trouble figuring out how to do scholarship online!

Researchers have more to read and less time to read it, and the volume is increasing rapidly. We need to remember that authors and readers are the same people. The amazing ways that technology has opened up communication are also causing the overload. We need something to help identify credible information.

Dorothea Salo wrote that, for people who place so much credibility in authoritative information, we don’t do a very good job of identifying it. She blames librarians, but publishers have a responsibility, too. Heuristics are important for identifying who the intended audience is meant to be.

If you find a book at a bargain store, the implication is that it is going to be substantially less authoritative than a book from a grand, old library. (There are commercial entities selling leather bound books by the yard for buyers to use to add gravitas to their offices and personal libraries.) Scholarly journals are dull and magazines are flashy & bright. Books are traditionally organized with all sorts of content that tells academics whether or not they need to read them (table of contents, index, blurbs, preface, bibliography, etc.).

If you were to black out the text of a scholarly document, you would still be able to identify the parts displayed. You can’t do that very well with a webpage.

When we evaluate online content, we look at things like the structure of the URL and where it is linked from. In the print world, citations and footnotes were essential clues to following conversations between scholars. Linking can do that now, but the convention is still more formal. Logos can also tell us whether or not to put trust in content.

Back in the day, authors were linked to printers, but that led to credibility problems, so publishers stepped in. Authors and readers could trust that the content was accurate and properly presented. Now it’s not just publishers — titles have become brands. A journal’s reputation is almost more important than who publishes it.

How do we help people learn and understand the heuristics for identifying scholarly information? The processes for putting out credible information are partially hidden — the reader or librarian doesn’t know or see the steps involved. We used to not want to know, but now we do, particularly since it allows us to differentiate between the good players and the bad players.

The idea of the final version of a document needs to be buried. Even in the print world (with errata and addenda) we were deluding ourselves in thinking that any document was truly finished.

Why don’t we have a peer reviewed logo? Why don’t we have something that assures the reader that the document is credible? Peer review isn’t necessarily perfect or the only way.

How about a Version of Record record? Show us what was done to a document to get it to where it is now. For example, look at Creative Commons. They have a logo that indicates something about the process of creating the document which leads to machine-readable coding. How about a CrossMark that indicates what a publisher has done with a document, much like what a CC logo will lead to?

Knowmore.org created a Firefox plugin that monitors content and provides icons flagging companies and websites for various reasons. Oncode is a way of identifying organizations that have signed a code of conduct. We could do this for scholarly content.

Tim Berners-Lee is actively advocating for ways to overlay trust measures on the internet. It was originally designed by academics who didn’t need them, but, as in the internet trust anti-pattern, the “unwashed masses” have corrupted that trust.

What can librarians and publishers do to recreate the heuristics that have been effective in print? We are still making facsimiles of print in electronic format. How are we going to create the tools that will help people evaluate digital information?

NASIG 2009: ERMS Integration Strategies – Opportunity, Challenge, or Promise?

Speakers: Bob McQuillan (moderator), Karl Maria Fattig, Christine Stamison, and Rebecca Kemp

Many people have an ERM, some are implementing one, but few (in the room) consider themselves finished. ERMS present new opportunities and challenges with workflow and staffing, and the presenters intend to provide some insight for those in attendance.

At Fattig’s library, their budget for electronic is increasing as print is decreasing, and they are also running out of space for their physical collections. Their institution’s administration is not supportive of increasing space for materials, so they need to start thinking about how to stall or shrink their physical collection. In addition, they have had reductions in technical services staffing. Sound familiar?

At Kemp’s library, she notes that about 40% of her time is spent on access setup and troubleshooting, which is an indication of how much of their resources are allocated to electronic resources. Is it worth it? They know that many of their online resources are heavily used. Consortial “buying clubs” make big deals possible, opening up access to more resources than they could afford on their own. Electronic is a good alternative to adding more volumes to already overloaded shelves.

Stamison (SWETS) notes that they have seen a dramatic shift from print to electronic. At least two-thirds of the subscriptions they handle have an electronic component, and most libraries are going e-only when possible. Libraries tell them that they want their shelf space. Also, many libraries are going direct to publishers for the big deals, with agents getting involved only for EDI invoicing (cutting into the agent’s income). Agents are now investing in new technologies to assist libraries in managing e-collections, including implementing access.

Kemp’s library had a team of three to implement Innovative’s ERM. It took a change in workflow and incorporating additional tasks with existing positions, but everyone pulled through. Like libraries, Stamison notes that agents have had to change their workflow to handle electronic media, including extensive training. And, as libraries have more people working with all formats of serials, agents now have many different contacts within both libraries and publishers.

Fattig’s library also reorganized some positions. The systems librarian, acquisitions librarian, and serials & electronic resources coordinator all work with the ERMS, pulling from the Serials Solutions knowledge base. They have also contracted with someone in Oregon to manage their EZproxy database and WebBridge coverage load. Fattig notes that it takes a village to maintain an ERMS.

Agents with electronic gateway systems are working to become COUNTER compliant, and are heavily involved with developing SUSHI. Some are also providing services to gather those statistics for libraries.

Fattig comments that usage statistics are serials in themselves. At his library, they maintained a homegrown system for collecting usage statistics from 2000 to 2007, then tried Serials Solutions’ 360 Counter for a year, and are now using an ERM/homegrown hybrid. They created their own script to clean up the files because, as we all know, COUNTER compliance means something different to each publisher. Fattig thinks that database searches are their most important statistics for evaluating platforms. They use their federated search statistics to weight the statistics from those resources (these will be broken out under COUNTER release 3).
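A cleanup script of the kind Fattig describes mostly amounts to mapping each publisher’s idiosyncratic column headers onto one canonical set. Here is a minimal sketch in Python; the header aliases are hypothetical examples, not a real publisher mapping:

```python
import csv
import io

# Hypothetical aliases: different publishers label the same
# COUNTER JR1-style columns in slightly different ways.
HEADER_ALIASES = {
    "journal": "Title",
    "journal title": "Title",
    "title": "Title",
    "publisher": "Publisher",
    "total ft requests": "Total Requests",
    "ft total": "Total Requests",
    "total": "Total Requests",
}

def normalize_counter_csv(raw_text):
    """Map each vendor's header variants onto one canonical set of keys."""
    reader = csv.DictReader(io.StringIO(raw_text))
    rows = []
    for row in reader:
        clean = {}
        for key, value in row.items():
            canonical = HEADER_ALIASES.get(key.strip().lower())
            if canonical:  # drop columns we don't recognize
                clean[canonical] = value.strip()
        rows.append(clean)
    return rows
```

Each vendor file would be run through a normalizer like this before the numbers are loaded into the ERM or a homegrown database.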

Kemp has not been able to import their use stats into ERM. One of their staff members goes in every month to download stats, and the rest come from ScholarlyStats. They are learning to make XML files out of their Excel files and hope to use the cost per use functionality in the future.
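The Excel-to-XML step Kemp mentions can be handled with the Python standard library alone, assuming the spreadsheet is first saved out as CSV; the element names below are illustrative, not a real ERM import schema:

```python
import csv
import io
import xml.etree.ElementTree as ET

def usage_csv_to_xml(raw_text):
    """Turn rows of (Title, Month, Count) usage data into a simple XML document."""
    root = ET.Element("UsageReport")
    for row in csv.DictReader(io.StringIO(raw_text)):
        item = ET.SubElement(root, "Item")
        ET.SubElement(item, "Title").text = row["Title"]
        ET.SubElement(item, "Month").text = row["Month"]
        ET.SubElement(item, "Count").text = row["Count"]
    return ET.tostring(root, encoding="unicode")
```

From there, the XML can be fed to whatever import routine the ERM exposes, and cost-per-use reports become a matter of joining these counts against invoice data.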

Fattig: “We haven’t gotten SUSHI to work in some of the places it’s supposed to.” Todd Carpenter from NISO notes that SUSHI compliance is a requirement of COUNTER 3.

For the next 12-18 months, Fattig expects that they will complete the creation of license and contact records, import all usage data, and implement SUSHI where they can. They will continue to work with their consortial tool, implement a discovery layer, and document everything. He plans to create a “cancellation ray gun and singalong blog” — a tool that takes criteria and generates suggested cancellation reports.

Like Fattig, Kemp plans to finish loading all of the data about license and contacts, also the coverage data. Looking forward to eliminating a legacy spreadsheet. Then, they hope to import COUNTER stats and run cost/use reports.

Agents are working with ONIX-PL to assist libraries in populating their ERMS with license terms. They are also working with CORE to assist libraries with populating acquisitions data. Stamison notes that agents are working to continue to be liaisons between publishers, libraries, and system vendors.

Dan Tonkery notes that he’s been listening to these conversations for years. No one is serving libraries very well. Libraries are working harder to get these things implemented, while also maintaining legacy systems and workarounds. “It’s too much work for something that should be simple.” Char Simser notes that we need to convince our administrations to move more staff into managing eresources as our budgets are shifting more towards them.

Another audience member notes that his main frustration is the lack of cooperation between vendors/products. We need a shared knowledge base like we have a shared repository for our catalog records. This gets tricky with different package holdings and license terms.

Audience question: When will the ERM become integrated into the ILS? Response: System vendors are listening, and the development cycle is dependent on customer input. Every library approaches their record keeping in different ways.

NASIG 2009: Ensuring Perpetual Access to Online Subscriptions

Presenters: Judy Luther (moderator), Ken DiFiore, Nancy Gibbs (contributed, but not able to attend), Selden Lamoureux, Victoria Reich, Heather Ruland Staines, and Kim Steinle

Librarians have asked publishers to ensure perpetual access for paid subscribers, a reversal of the print arrangement, in which librarians preserved access to the physical copies in their buildings. It took publishers some time to address this, and it continues to evolve.

Libraries, and academic libraries in particular, have proven staying power. Librarians, working with publishers, are responsible for providing access to information.

There are two issues with online perpetual access: maintaining access to content when a subscription ends and maintaining access to content when a publisher ceases to exist. Libraries have the resources to be custodians of the content through digital preservation practices, both alone and in community.

How are librarians & publishers managing expectations for the financial impact of moving to online-only content? First, make sure that there is some sort of preservation guarantee. You will need to start shifting staff from managing stacks/materials to managing online, which will impact training needs. There are cost savings in space and opening up the building to other relevant uses. Peter McCracken suggests that we should emphasize the service and benefits of online content.

Libraries purchase the delivery mechanism, not the content. Owning a print archive does not mean that online is automatically free. And, it’s not just the content that you get with online. The user interface is an essential component, and it isn’t cheap to develop or maintain.

The most important thing that librarians need to do is read the license agreement. If it doesn’t specify what happens to content if a subscription is canceled or if the publisher ceases to exist, negotiate for that to be added. CYA, folks! If you pay for it, make sure you’re really going to get it.

CIL 2009: Open Access: Green and Gold

Presenter: Shane Beers

Green open access (OA) is the practice of depositing a document and making it available on the web. Most frequently, these are peer-reviewed research and conference articles. This is not self-publishing! OA repositories allow institutions to store and showcase their research output, thus increasing its visibility within the academic community.

Institutional repositories are usually managed by either DSpace, Fedora, or EPrints, and there are third-party external options using these systems. There are also a few subject-specific repositories not affiliated with any particular institution.

The "serials crisis" means that most libraries cannot subscribe to every journal their researchers need. OA eliminates this problem by making relevant research available to anyone who needs it, regardless of economic barriers.

A 2008 study showed that less than 20% of all scientific articles published were made available in a green or gold OA repository. Self-archiving is at a low 15%, and incentives to do so increase it only by 30%. Researchers and their work habits are the greatest barriers that OA repository managers encounter. The only way to guarantee 100% self-archiving is with an institutional mandate.

Copyright complications are also barriers to adoption. Post-print archiving is the most problematic, particularly as publishers continue to resist OA and prohibit it in author contracts.

OA repositories are not self-sustaining. They require top-down dedication and support, not only for the project as a whole, but also for the equipment/service and staff costs. A single "repository rat" model is rarely successful.

The future? More mandates, peer-reviewed green OA repositories, expanding repositories to encompass services, and integration of OA repositories into the workflow of researchers.

Presenter: Amy Buckland

Gold open access is about not having price or permission barriers. No embargoes, with immediate post-print archiving.

The Public Knowledge Project’s Open Journal Systems is an easy tool for creating an open journal that includes all the capabilities of online multimedia. For example, First Monday uses it.

Buckland wants libraries to become publishers of content by making the platforms available to the researchers. Editors and editorial boards can come from volunteers within the institution, and authors just need to do what they do.

Publication models are changing. Many granting agencies are requiring OA components tied to funding. The best part: everyone in the world can see your institution’s output immediately!

Installation of the product is easy — it’s getting the word out that’s hard.

Libraries can make the MARC records freely available, and ensure that the journals are indexed in the Directory of Open Access Journals.

Doing this will build relationships between faculty and the library. Libraries become directly involved in the research output of faculty, which makes libraries more visible to administrators and budget decision-makers. University presses are struggling, but even though they are focused on revenue, OA journal publishing could enhance their visibility and status. Also, if you publish OA, the big G (and other search engines) will find it.

CIL 2009: Social Network Profile Management

Speaker: Greg Schwartz

Who are you online? Identity is what you say about you and what others say about you. However, it’s more than just that. It includes the things you buy, the tools you use, the places you spend your time, etc.

You do not own your online identity. You can’t control what people find out about you, but you can influence it.

  1. Own your user name. Pick one and stick to it. Even better if you can use your real name. (checkusernames.com)
  2. Join the conversation. Develop your identity by participating in social networks.
  3. Listen. Pay attention to what other people are saying about you.
  4. Be authentic. Ultimately, social networking is about connecting your online identity to your in-person identity.

Speaker: Michael Porter

MP was the project manager for the social tools on WebJunction. It’s designed to be for librarians and library staff.

If you are representing your organization online, be yourself, but also be sensitive to how that could be perceived. Share your library success stories!

Speaker: Sarah Houghton-Jan

Library online identities should be created with a generic email address, should be up-to-date, and should allow comment and interaction with users. Keep the tone personable.

Don’t use multiple identities. Make sure that someone is checking the contact points. You’ll get better results if you disperse the responsibility for library online identities across your institution rather than relying on one person to manage it all.

Speaker: Amanda Clay Powers

People have been telling their stories for a long time, and online social networks are just another tool for doing that. Some people are more comfortable with this than others. It’s our role to educate people about how to manage their online identities; however, our users don’t always know that librarians can help them.

On Facebook, you can manage social data by creating friends lists. This functionality is becoming more important as online social networks grow and expand.

virtual services in libraries

This started as a comment response to David Lee King’s admonition, but by the time I got to paragraph three, I decided it would be better to post it here instead.

My library (small private academic university) offers both IM and email reference services. There is a note on the IM page of the website which states, “Users at the Main Service Desk have priority over IM users. IM users are taken in a first-come, first served order. If you would prefer not to wait, you may always email a librarian.” Essentially, this is the only way we can manage IM reference service with one person handling it at the same time they are answering questions at the desk and responding to email queries. So far, our users have been understanding, and IM reference makes up approximately 10% of our reference interactions.

I don’t see this as discriminating against our virtual users. Anyone in customer service will tell you that the person standing in front of you takes priority. The culture of IM is such that a delay in responding is acceptable, if not expected. Chat doesn’t mean that you drop everything else — we’re all used to multi-tasking while having an online conversation. Chat provides a faster back and forth than email, which is why many prefer it for reference interactions, but that doesn’t mean they expect instantaneous service.

The libraries with explicitly stated response times that DLK points out are large institutions serving large populations. My library can get away with fast response times because we might get one or two IM questions an hour, at most. Larger populations result in more questions, and depending on how in-depth those questions are, it may take several hours or longer to respond with all of the information the user is seeking. I often conclude a basic IM reference transaction by providing the student with the contact information for their subject librarian and the personal appointment request form. Some research needs can’t be met exclusively in an online environment.

I get what DLK is trying to say, and I agree that we should treat our online users with the same courtesy we do our in-person users. However, the limitations in online reference tools, staffing, and resources all combine to make it difficult to create a virtual library utopia. We should strive for it, yes, but making librarians feel even more guilty because they can’t do it (for whatever reason) is not going to improve the situation.

LITA 2008: Web Site Redesign – Perspectives from the Field, Panel Discussion

Panelists: Robin Leech (Oklahoma State University Libraries), Amelia Brunskill (Dickinson College), Edward M. Corrado (Binghamton University), Elizabeth Black (Ohio State University Libraries), Russell Schelby (Ohio State University Libraries)
Moderator: Mary LaMarca (Dartmouth College Library)

Black & Schelby:

When they began the project two years ago, the website was large and maintained by 100 content submitters, most of whom had limited coding expertise. Selected and implemented a Web Content Management System, and created a team of technical experts with both coding and project management skills. Black consciously focused on team development activities in addition to the projects the team worked on.

The team made a commitment to security, usability, maintainability, and data preservation of the website content. As a part of the data preservation, they were careful to document everything from architecture to passwords.

 

Brunskill:

Four years ago, Academic Technology, Library, and Information Services merged to become one division. The website was initially integrated, but then user feedback caused it to be broken out into separate divisions again. After a few years, the library wanted to make some changes, so they did a usability study, which resulted in some menu and vocabulary changes. Then, they began to plan for a much larger redesign.

To solve the communication problem, they set up a blog, charged unit representatives to report back to their units, and circulated usability data among all library staff. The usability studies also served as a buffer for touchy political situations, since the users are a neutral party.

 

Leech:

Developed two teams. The usability team informed the web redesign team, with only the library webmaster serving on both. Suggested that usability team read Don’t Make Me Think by Steve Krug.

 

Note: I had to leave early because I could not stop coughing. The hotel HVAC was not playing nicely with my cold.

more degrees for the same pay

In a recent Chronicle article, Todd Gilman complains about the lack of job postings for librarian subject specialists who have secondary master’s or doctoral degrees. While I think he makes valid points for why subject specialists should have post-graduate education in their fields of study, particularly if they are in tenure-track positions, I think he misses the mark as to why libraries are hiring folks without those degrees.

In that job posting and many others, the most attention paid to subject expertise (in the form of a master’s or Ph.D.) is a brief mention in the list of “preferred” qualifications. That is a strong indication that the hiring institution will settle for less — much less. In fact, I’m told that in a number of recent hires, Ph.D.’s and M.A.’s — some with years of professional experience working in top academic libraries in addition to having an MLIS — have been passed over in favor of candidates straight out of library school whose only previous degree was a bachelor’s.

Were they passed over because they asked for more compensation than what the institution was willing to pay? I suspect that may play a much larger role than what Mr. Gilman is giving it.

Libraries are usually the first target for budget cuts, and one of the biggest expenses in a library is staff salaries. Someone who has post-graduate degrees beyond the MLS will likely expect to be compensated for the additional skills and knowledge they bring to the job. University administrators either don’t understand or don’t care about the value that these folks add to collections and instruction, and as a result, they are unwilling to meet the compensation demands of these “better qualified” candidates. Recent graduates in any field will cost the university less in the salary department, and that short-term benefit is the only one that (mostly short-timer) administrators care about.

Given all that, would you go through the trouble of getting a second master’s degree or a doctoral degree, knowing that unless you are already in a tenure-track position with fair compensation, it is unlikely that you’ll be paid any more than you are already? Probably not, unless you were particularly passionate about research in your field of study.

Even so, that research might not help you with tenure, as some colleagues of mine discovered when their institution’s tenure requirements changed so that only scholarship in their primary field (read: library science) counted towards tenure and post-tenure review. Never mind that they had focused most of their scholarly research in their secondary subject specialties.

All of the above is why I took myself out of the tenure-track world. I have no interest (at this time) in becoming a subject specialist in anything but what I do every day: librarianship. I’m happy to let others make decisions about content, so long as they let me focus on my areas of expertise, such as delivery platforms, access, and licensing issues.

CiL 2008: What Do Users Really Do in Their Native Habitat?

Speakers: Pascal Lupien and Randy Oldham

Unsubstantiated assumptions about Millennials cause libraries to make poor choices in providing services and resources. Lupien and Oldham spent some time studying how students actually use the tools we think they use. They used typical survey and focus group methodologies, which make for rather boring presentation recaps, so I won’t mention them.

The study found that only 9% of students used PDAs, and those who did tended to be older students. 69% of students had cell phones, but only 17% of them had ever used them to browse the Internet. 93% of students had used a chat client, and most had used one for academic purposes several times per week. 50% of users had never used online social network applications for academic group work.

The focus groups found that students preferred email over online social networks for group work. Students are more willing to share the results of their work with friends than with other classmates.

42% of students had never played online games, and men were three times more likely to play them than women. Only 4.1% were involved with online virtual worlds like World of Warcraft and Second Life.

The survey respondents indicated they were more likely to go to the library’s website first rather than Google. The focus groups also confirmed this, in addition to indicating that the library had the best sources of information despite being the most difficult to manage.

Students are reluctant to mix personal and academic computing. The uptake on online social networks for academic use has been slow, but will likely increase, and we have to ask, “is this the best use of our resources and time?” Our priorities need to be more on improving the services we already offer, such as our websites and search tools. “Rather than looking at technologies & trying to find a use for them in our environment, we should determine what our students need & seek solutions to meet those needs.”


Speaker: John Law

ProQuest conducted a study of seven universities across North America and the United Kingdom, involving 60 students. As with Lupien and Oldham’s study, it was conducted anonymously. Observations took place in a variety of locations, from the library to dorm rooms, with remote sessions captured using a program similar to web conferencing software.

Law gave an anecdote of a fourth year student who did all the things librarians want students to do when doing research, and when he was asked why, the student gave all the right answers. Then, when he was asked how long he had been doing his research that way, he indicated something like six weeks, after a librarian had come to his class to teach them about using the library’s resources. Library instruction works.

Course instructors are also influential. “My English instructor told me to use JSTOR.”

Brand recognition is fine, but it doesn’t necessarily affect the likelihood that resources will be used more or less.

Students use abstracts to identify relevant articles, even when the full text is available. They’re comfortable navigating several different search engines, but less adept with library websites when locating relevant resources. Users don’t always understand what the search box is searching (books, articles, etc.), which they can find discouraging. The A-Z databases page is too unmanageable for most users, particularly when starting their research.

Students are using Google for their research, but mainly for handy look-ups and not as a primary research tool. Those who use Google as a primary research tool do so because they aren’t as concerned with quality or are insufficiently aware of library eresources or have had bad experiences with library eresources.

Librarians, students use Google and Wikipedia the same way you do. (We know you all use those tools, so don’t even try to deny it.)

Students laughed at surveyors when asked how they use online social networks for academic purposes.

CiL 2008 Keynote: Libraries Solve Problems!

Speaker: Lee Rainie, Director of the Pew Internet & American Life Project

[Prez of InfoToday, in his introduction, announced that 2202 attendees are registered for this conference with 49 states (no one from Wyoming), Puerto Rico, D.C., and 18 countries (7 Canadian provinces) represented. 186 speakers and moderators this year!]

[House-keeping note from Jane Dysart: The men’s restroom on the ballroom level is now a women’s restroom, so the guys will have to go up to the exhibit level. There was much rejoicing.]

Rainie began by apologizing for not originally including librarians as stakeholders in the work of PIALP. This year, his new grant proposal lists librarians at the top, which was well received by the audience. He thanked librarians for their active involvement with the Pew project.

Bloggers were thanked for raising awareness of the Pew project, and for praising Rainie’s past presentations. Yay, bloggers! New media rocks. “Blogging is about community and connection as much as it is about publishing.”

In 2000, studies showed that most Internet connections were via dial-up, and no one was using wireless. In 2007, more than 50% of Americans accessed the Internet via broadband, and 62% connected via wireless, whether through computers or cell phones. Wireless connectivity is decreasing the digital divide, and it is also responsible for the resurgence of the value of email. “The reports of the death of email are premature.”

Information and communication technology tools are now so interconnected that it’s changing the way we think about information storage and retrieval. The Internet is becoming our storage device, which we access through various portals such as cell phones, TiVo, and yes, computers.

39% of online teens share their creative content through sites like Facebook, Flickr, and YouTube. 33% of college students blog, and 54% read them. However, many are blogging through social networking tools or course management tools and they don’t necessarily identify them as blogs. Avatars are now considered to be creative content, which is something I hadn’t thought about before.

In recent grant-funded research, supported by IMLS and conducted in partnership with IUC, PIALP looked at how people get information from government sources to solve problems. 79.5% of the adults surveyed had, in the past two years, had an information need that could have been satisfied by information from government agencies. Gen Y (18-30) were the most likely to have visited a library in their search for information (62%), followed closely by Gen X (31-42) at 58%. (Psst… 60% of online teens use the Internet at libraries, up from 36% in 2000!)

Don’t listen to the naysayers who claim that the Internet is killing libraries. Public library users are more likely to be Internet users. Those who are information seekers are more likely to be adventurous in exploring information sources. Broadband users are also more likely to use a public library, and there is no difference in the patronage of libraries based on ethnicity. Young adults are more likely to visit a library to solve a problem than any other age group!

Users talked to library staff to solve their information needs slightly more often than they used the technology provided by the library — the top two ways they found solutions to their problems. Gen Y users are the generation most likely to return to a library. Rainie thinks that because Gen Y users have been required to use libraries for school projects, they have seen how libraries have grown and changed over the years to meet their needs, and so they have a good feeling about libraries as a source for solving their problems.

Rainie’s take-away message is that libraries need to do more publicity about how they can solve problems. “The people who know you best are the ones that keep coming back.” Let’s tell our success stories to more than just each other, which we already do a pretty good job of. Give our fans the tools to evangelize and provide feedback, and they can have a significant impact on raising awareness of libraries. Create a comfortable environment for “un-patrons” so that they aren’t afraid to ask questions and learn the technology. Become a node in social networks. (For example, Facebook apps for searching library resources or communicating with reference librarians may not be as unwanted as we might think they are.)

Rainie is an engaging speaker, and I look forward to hearing from him in the future.
