Several years ago, I started the UKSG blog
to report on the organization’s annual conference, which provides a
forum for publishers and librarians to network and share strategic and
practical ideas. Between 2006 and 2012, I enthused on the blog about
topics including metrics, publishing evolution, innovation, research assessment, user behaviour and workflows. All
those topics still fascinate me today (expect more on all of these from
my Touchpaper postings) – and they were all covered again at UKSG this year. But this year – shock, horror! – I wasn’t blogging about them; my role for UKSG has changed, and others are carrying the blog torch now.
This frees me up to take a more reflective look at what I have
learned at UKSG, rather than trying to capture it all in real time for
those who can’t attend. So – here on “my” new blog, TBI’s Touchpaper – is my snapshot of another great conference:
1. Let go of “publish or perish”. Accept “open or broken”.
UK academics’ submissions to REF 2020 (the process by which
government evaluates academic institutions) *must* be OA at the point of
publication. That is surely the game-changer that will mean, from this
point on, academics will be trying to submit their best work to a
publication that supports immediate OA. We may not yet have completely
worked out the kinks, but events have overtaken us; it’s time to
satisfice – adopting an imperfect model, refining it as we go. The lack
of additional government funding for article processing charges (APCs)
means that this particular mandate will have to be met as much by
“green” self-archiving OA as by “gold” version-of-record OA. Both
publishers and higher education institutions need to be sure that they
have a clear strategy for both. (More from Phil Sykes’ opening plenary)
2. Information resources should be SO much more intelligent.
We were all blown away by student Josh Harding’s
vision of textbooks that “study me as I study them” – using learning
analytics to identify a student’s strengths and weaknesses, comparing
this to other students, adapting content as a consequence, reminding the
student to study (“telling me I’m about to forget something because I
haven’t looked back at it since I learned it”) and generally responding
to the fact that we learn not just by reading, but also by hearing,
speaking, and (inter)acting with information. (The highlight of the
conference – Josh’s talk is must-see inspiration for all publishers’ product development and innovation.)
3. Authors need help to tell better stories about their research.
With increased pressure to justify funding, and the need to
communicate more effectively with business and the general public,
researchers need to be able to highlight what’s interesting about, and
demonstrate the impact of, their work. Journal articles are but one node
in a disaggregated network that, taken together, makes up the picture of their
research. That network needs to be more cohesively visible. At the
moment, the journal article is the hub but it doesn’t do a great job of
opening up the rest of the network. I think publishers’ futures will be
shaped by the extent to which they help academics surface / tell that
whole story. (More from Paul Groth and Mike Taylor‘s breakout on altmetrics).
Click here for the original article and more from Charlie Rapple
Monday, 29 April 2013
Tuesday, 23 April 2013
Breakout C: PIRUS: a new COUNTER standard for article level and institutional repository usage statistics
Peter's slides: http://www.slideshare.net/UKSG/shepherd-pirus-april-2013
Ross's slides: http://www.slideshare.net/UKSG/mac-intyre-irusukuksgapril2013
This talk was given by Peter Shepherd (COUNTER) and Ross MacIntyre (MIMAS). Peter first told us about PIRUS (Publisher and Institutional Repository Usage Statistics) and its draft code of practice - a full report on the project is available at the given link. PIRUS was developed for many reasons: overwhelmingly, an increasing demand for statistics, but also the growth in the number of electronic journals in general and in the number of journal articles held in institutional repositories, together with the desire to track their usage. COUNTER has now implemented XML compatibility and the SUSHI (Standardized Usage Statistics Harvesting Initiative) protocol, both of which should make reliable usage statistics down to article level easier to collect and report.
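To make the article-level idea concrete: SUSHI harvesting delivers usage reports as XML, which can then be aggregated per article. The sketch below is purely illustrative - the XML structure, element names and DOIs are invented for the example, not the actual COUNTER/PIRUS schema - but it shows the kind of per-DOI roll-up the code of practice is aiming at:

```python
import xml.etree.ElementTree as ET

# A hypothetical, simplified article-level usage report. The element names
# and DOIs here are invented for illustration; the real COUNTER/PIRUS XML
# schema is more elaborate.
report = """
<Report>
  <Item doi="10.1000/j.example.2013.001"><Count>42</Count></Item>
  <Item doi="10.1000/j.example.2013.002"><Count>17</Count></Item>
  <Item doi="10.1000/j.example.2013.001"><Count>8</Count></Item>
</Report>
"""

def article_usage(xml_text):
    """Aggregate full-text download counts per DOI across report items."""
    totals = {}
    root = ET.fromstring(xml_text)
    for item in root.findall("Item"):
        doi = item.get("doi")
        totals[doi] = totals.get(doi, 0) + int(item.findtext("Count"))
    return totals

print(article_usage(report))
# {'10.1000/j.example.2013.001': 50, '10.1000/j.example.2013.002': 17}
```

Summing across items with the same DOI is also where the code of practice's duplicate-counting guidelines would bite when the same article is served via gateways or aggregators.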
The draft code of practice, an outcome of the project, addresses how statistics are collected. It is meant to "provide the specifications and tools that will allow COUNTER-compliant publishers, repositories and other organizations to record and report usage statistics at the individual article level that are credible, compatible and consistent. COUNTER-compliant publishers may build on the existing COUNTER tools to do so, while an alternative approach is provided for non-COUNTER compliant repositories, which is tailored to their systems and capabilities". It covers many aspects and measures, amongst them: the article types and versions to be counted; the data elements to be measured and their definitions; the content and format of usage reports; requirements for data processing; requirements for auditing; and guidelines to avoid duplicate counting when intermediary gateways and aggregators are used. Peter went into detail about each aspect of the code of practice (illustrated in his slides), then addressed the next steps, including using feedback to develop a definitive code of practice which publishers will be invited to implement. There should also be consolidation of usage data from different publisher sources, as well as development of the Central Clearing House.
There are 17 repositories involved at present, with almost 72,000 items and 2.5 million downloads (over 100,000 downloads this month!). These are pioneering sites sending data to IRUS (from both ePrints and DSpace repositories - interaction with Fedora repositories is in development), with others in the pipeline. MIMAS are currently working on ingest scripts, a portal UI (which will be basic until informed choices can be made) and publicising IRUS-UK. Community engagement has been very helpful when defining and evaluating user requirements - the project staff have been speaking to authors and repository managers, and also running IRUS-UK webinars aimed at both repository managers and tech managers. The final project webinar will be held in mid-July. Information can be found on their website (linked above).
Monday, 22 April 2013
A post from Graham Steel on the 2013 conference
Please find Graham Steel's post with links here:
http://figshare.com/blog/UKSG_-_Connecting_The_Knowledge_Community/80
Thanks Graham!
Friday, 19 April 2013
Electronic resources and ILL - a self-contradiction?
In this breakout session Helle Brink of Aalborg University, Denmark
talked about the inter-library loan system in Denmark, specifically
focussing on access to electronic material.
Firstly, Helle gave a brief overview of the library system in Denmark, which has:
- Royal Library;
- State and University Library;
- 98 public libraries;
- 6 regional libraries;
- 6 big research/university libraries;
- 11 university college libraries;
- 30 institutional libraries;
- 250 smaller research libraries.
There is an automated ILL system in place in Denmark. The delivery service covers 93% of its libraries, and also includes Norway and Sweden. There were approximately 3.9 million requests in 2012 and approximately 3 million items delivered. Helle noted that some people do not agree with titles being driven around Denmark, but many think that this is better than having the titles sitting on shelves.
DanBib holds information on all materials found in Danish libraries, including electronic resources if they are Danish. However, it is an individual library's decision to make foreign e-resources visible in DanBib, at least at article level. All Danish electronic resources are accessible, but foreign-published e-resources are not, and there is almost no sharing of electronic resources. IFLA has produced guidelines for ILL, and the Danish guidelines were updated last year.
DanBib is produced by the DBC, which is publicly funded. There are two ways of searching and requesting using DanBib: Netpunkt.dk, for professional access, and library.dk, for public access.
When you search DanBib, the search results page shows the options available to you to access the e-book. The table of contents, the abstract, and 35 pages or one chapter of the book can all be offered through ILL.
E-articles can also be requested; approximately 65 million electronic articles are searchable in DanBib. A pilot project has just been set up to send articles from library to library, whereby a printed version of an article is sent to the local library for the requester to pick up. In 10 months, 16,000 articles have been delivered in this way, though this is a very small fraction of the overall number of articles. 155 libraries all over Denmark are involved in the pilot. The articles are delivered from the State and University Library, the Danish legal deposit library, to the local libraries - not electronically, but in print.
At Aalborg University, licensing and ILL rights details are manually added to the bib record, which takes a lot of time. Helle remarked that the way this information is recorded in bib records needs to be standardised.
Another project was set up in 2007 whereby the library pays fees to the Denmark Copyright Agency, which allows the State and University Library to scan material from about 30,000 Danish and foreign journals. The scanned copies are then put into an archive for re-use, and any copies requested through library.dk are sent directly to the user in Denmark. Under this initiative, approximately 150,000 copies have been sent out and approximately 298,000 have been put in the archive.
New models for partial access need to be explored, including:
- Walk in use
- Pay-per-view;
- Reading - no downloading or printing;
- Voucher solutions, e.g. 10 articles per year;
- ILL access e.g. after 3 months
Helle mentioned that only the British Library requires a copyright declaration, which means getting a signature from the requester and keeping it for 5 years; this is why they try not to use the BL for ILL. Someone in the audience asked how long it takes for a requester to get the material. Helle replied that it takes one day to find and scan an item, and 3-5 days for a physical item to be delivered.
I thought that this was a really interesting session. Learning about how other countries deal with specific issues is always a good thing.
Wednesday, 17 April 2013
PDA checklist for academic libraries
In this breakout session Karin Byström and Karin Perols talked about a Swedish project to develop a checklist of important factors for academic libraries to consider before starting with Patron Driven Acquisition (PDA).
PDA is a method of buying resources offered by e-book aggregators: un-owned e-books are shown in the local catalogue and are then loaned/rented or purchased under certain conditions set by the library.
The project ran from February to November 2012 and its purpose was to create a base of knowledge that could be useful to other libraries interested in implementing PDA. This would be done by:
- Collecting earlier experiences;
- Carrying out a PDA vendor survey (completed by Dawson, EBSCO, ebrary, EPL and MyiLibrary);
- Having a test period (April - September);
- Creating the checklist;
- Publishing a report in December 2012.
Possible PDA objectives were identified as: better collections; better service; saving money; and replacing manual purchasing.
Creating a PDA profile
This involves limiting the titles offered. This can be done in a number of ways, according to: subject categories; publishing year; language; publishers; classification; readership level; price cap; keywords (using include/exclude).
Which limits does your library want and which profile settings are important? Choose your vendor in accordance with your requirements.
In this project, the university libraries of Malmö and Uppsala set the threshold at 3 loans before a purchase was made, whereas at Södertörn it was 2 loans. At the end of the project it was concluded that for Uppsala it would have been better to require 2 loans rather than 3 before a title is purchased.
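The loan-before-purchase threshold can be pictured as a simple counter per title. The sketch below is purely illustrative - the trigger behaviour, function names and title IDs are my own invention, not any vendor's actual implementation:

```python
# Illustrative PDA trigger: a title is short-term loaned for its first N
# uses and bought on the use that crosses the threshold. The threshold
# values and IDs are hypothetical, not a real vendor model.
PURCHASE_AFTER_LOANS = 3  # the project used 3 at Malmö/Uppsala, 2 at Södertörn

def record_use(loan_counts, owned, title_id, threshold=PURCHASE_AFTER_LOANS):
    """Register one use of a title and return the action this model takes."""
    if title_id in owned:
        return "use of owned copy"
    loan_counts[title_id] = loan_counts.get(title_id, 0) + 1
    if loan_counts[title_id] > threshold:
        owned.add(title_id)  # threshold crossed: the library now buys it
        return "purchase"
    return "short-term loan"

loans, owned = {}, set()
actions = [record_use(loans, owned, "ebook-123") for _ in range(5)]
print(actions)
# ['short-term loan', 'short-term loan', 'short-term loan', 'purchase', 'use of owned copy']
```

Seen this way, Uppsala's conclusion amounts to saying the purchase should have fired one loan earlier.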
PDA functionality
What functionality is required? Borrowing; loans; mediated function; number of loans per person/per day; interface layout; multiple accounts.
Which PDA model and settings are important? Choose your vendor in accordance with your requirements.
Vendor collection
Look at: the readership level; type of books; publishers; updates to the collection.
Check if the collection from the vendor meets the library's needs.
Accessibility
Think about how the PDA titles will be made accessible: making PDA e-books visible through the local catalogue, union catalogue, etc.; getting MARC records supplied; use of a link resolver; authentication.
Consider where to make your e-books visible. Try to avoid a separate platform login.
E-book functionality
Look at the platform; use of DRM; downloading; mobile interface; speech synthesis; simultaneous users.
Managing the collection
- De-duplication (only against other e-books). A few members of the audience mentioned the fact that de-duplication is a major problem, especially as some titles have two records, one for the print and one for the electronic version;
- Unique e-ISBNs (some e-books have different e-ISBNs according to publisher, even though the book itself is the same!);
- Managing titles already purchased;
- Updates.
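Because the same e-book can carry different e-ISBNs on different platforms, de-duplication has to fall back on something other than the identifier. This sketch (with invented records and a deliberately naive title-plus-publisher key) shows the idea:

```python
# Illustrative de-duplication sketch: the same e-book may have different
# e-ISBNs per platform, so we match on a normalised title + publisher key
# instead of the identifier. The sample records are invented.
def dedupe(records):
    seen, unique = set(), []
    for rec in records:
        key = (rec["title"].strip().lower(), rec["publisher"].strip().lower())
        if key not in seen:
            seen.add(key)
            unique.append(rec)  # keep the first record seen for each work
    return unique

catalogue = [
    {"title": "Library Metrics", "publisher": "Acme Press", "eisbn": "9781000000011"},
    {"title": "Library Metrics ", "publisher": "acme press", "eisbn": "9781000000028"},
    {"title": "Open Discovery", "publisher": "Beta Books", "eisbn": "9781000000035"},
]
print(len(dedupe(catalogue)))  # 2
```

A real implementation would need fuzzier matching (editions, subtitles, print vs electronic records), which is exactly why the audience flagged de-duplication as a major problem.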
Support
What are the library's wishes and demands regarding support, e.g. start-up help and response times?
Statistics
What statistics are needed? Is it important to be able to separate out the use of PDA titles from 'ordinary' titles?
Economy
Look at: Budget (Decide what you can spend at the beginning of the process.); price model; economy reports; invoices; deposit.
What does the vendor's price model include?
Organisation
Look at: workflow; involvement of all staff; competency development; coordination; assessment.
Analyse how PDA will affect workflows and identify possible bottlenecks. Analyse the need for organisation.
A short English version of the report with full checklist is available (http://www.kb.se/dokument/Bibliotek/projekt/Slutrapporter%202012/PDA%20English.pdf).
Conclusion
There are both pros (e.g. getting users involved in choosing library material) and cons (e.g. unpredictability) to PDA. You will need to learn as you go and be prepared for change. Although academic libraries will have their differences, it is hoped that the experiences of the Swedish university libraries will help them to prepare and plan for PDA and minimise any possible problems.
Tuesday, 16 April 2013
Towards an 'open' approach for discovery
In this breakout session Kamran Naim from USAID (the US Agency for International Development) provided an overview of work funded by the organisation on research discovery that is specific to Africa but also has potential global application.
Kamran started by talking about the need to support the global research system. Global research depends on information and collaboration between the developing world and others. Good research depends on good access to information and information sharing.
The USAID Bureau of Science and Technology was established in 2012 to help support research services in Africa. Access to research in Africa is limited, so African researchers are not participating in the global scientific conversation - a lost opportunity for them. There are 27,000 papers from Africa per year - fewer than from the Netherlands. In Sub-Saharan Africa research output is falling, though there are positive trends in Kenya: during 2000-2010 research output went up, although growth has been slower for papers by Kenyan authors alone, who depend on collaborative programmes. There are no citations of research by Kenyan and collaborative researchers. There is also a need to improve the visibility of African research.
Availability of research is not such a big problem. There are a number of access programmes, e.g. the UN's Research4Life, and other local initiatives. Of the top 20 journals representing 22 core disciplines, 80% are available - not unlike in Europe. Kenya has its own consortia and has about 75% of the top journals. Access to law articles is the biggest problem, as most research is US-based and has little relevance to African law. Maths and geology research is also a problem, but it is getting better.
Accessibility is an issue. Internet infrastructure in Africa is improving - the KENET network covers 90+ institutions in Kenya - but there are still constraints: there are still not many computers on campus, and demand outstrips supply.
Usability, i.e. locating and downloading resources, is a particular issue. Library resources are more complex, and poor usability stems from poor understanding of how to use those resources.
One example is the JKUAT library, which has a list of electronic resources split into a number of silos by category. Separate websites - e.g. AGORA, OARE and HINARI, e-book resources, research gateways, OA resources, institutional OPACs and databases - involve separate log-ins and passwords at institutional level. There are severe penalties if a password is abused, which makes this a big issue: librarians are very protective of the passwords. There is very little interoperability between these silos, and people are frustrated when the library doesn't have access to a journal.
When looking for resources, convenience wins: people go first to Google, and library resources go unused. Gateway services are the brokers of access. Users expect full-text delivery, customisation and ease of use. The ACARDIA study, undertaken in 2010, showed a low level of awareness and understanding of information resources - only 40% of respondents had high or good awareness.
A project was set up to address poor usability through consolidation of resources and provision of remote access to them. The aim was to enhance usage, engage researchers and students more, and support access programmes. Multiple means of authentication and issues of scale (250 institutions in 4 countries: Malawi, Kenya, Rwanda and Tanzania) had to be addressed.
The solution was to try to achieve web-scale 'open discovery', making Research4Life, DOAJ, CiteSeer, arXiv, institutional repositories and other free content available, thereby increasing the visibility of African as well as international research. This was achieved by building an open index based on what institutions could access. LibHub was chosen as the discovery system: there is a single search box, and the system authenticates students when they click through. It is fully searchable and can be integrated within the OPAC and library website via APIs, and it is a resource management system which gives complete control to librarians. In Kenya the collection includes 17,765 journals from over 200 publishers and 35,219 books, searchable through a single search interface.
Case study
The Iraq Virtual Science Library (IVSL) was launched in 2005. At the beginning the website had a very poor interface; this was updated in 2009 when LibHub was implemented. IVSL usage was 70,000 article downloads per month; now it is 90,000. There have been dramatic results with one interface. Iraqi research output in general has also increased - hopefully, IVSL has had a part to play in this.
Some Problems
Work needs to be done on the following:
- Exorbitant prices for publisher metadata.
- Competing interests - aggregators policies.
- Sustainability - local management.
- Local hosting - improved, high-speed searching.
- Authentication - for remote access.
- Mobile delivery - making resources available on smart phones across Africa.
Discussions are taking place to extend access to the system. There has also been work on using VIVO, a networking tool for scientists, whose collaboration features enable research cooperation and communication. Technical training is also needed, with the building of MOOCs to promote greater research capacity.
How to participate
- Advocate for open communication.
- Libraries - review metadata provision policies. You shouldn't have to pay for it.
- Publishers - participate in access programmes and global networks; reach out to African scientists.
I really enjoyed this very interesting breakout session. It was great to hear what steps have been taken to successfully improve accessibility and usability of information resources. Much can be learnt from the work done in Africa.
Butterflies, Publishers and Librarians - final UKSG plenary session
We started out with 'free-range archivist' Jason Scott of the Archive Team and the Internet Archive (home of the Wayback Machine). His talk, "The twenty-year butterflies: which web cookies have stuck to the internet's pan?", can be watched on YouTube - and I'd highly recommend watching it, as it made me laugh while also making me think hard about the ephemeral quality of digital data. As Jason said, he deals in emotions rather than academia and citations!
Jason started off with the 'Bang with Friends' Facebook app - surely the first time this has ever been discussed at UKSG! - and how it uses the full infrastructure of Facebook to facilitate hookups. He went on to discuss the evolution of internet access - the 1983 CompuServe ad touting electronic mail with the line 'we call it email' made me laugh! Then he showed us a montage of 'bad idea' ads and promises across the years: think babies 'preserved' in cellophane, children's wallpaper impregnated with DDT, floppy disks guaranteed for 100 years from 1983, text messages sent from 30 years in the future - you name it!
He used all of these examples to illustrate how much of the world now consists of ephemeral machine/person interfaces. In effect we are cyborgs - and our recovery from that starts with admitting it to ourselves. And just as collectors confuse rarity with expense, in the digital world there is no 'gone', there is only 'forgotten'. The Archive Team was started by people bemoaning the wholesale loss of content when sites close down. Their philosophy is to grab everything from closing sites as fast as they can, so that nothing is lost to the whims of corporate hosts. Their three virtues are rage, paranoia and kleptomania, and they have downloaded 500 terabytes of information since they started!
It was frustrating to see his list of sites that had closed down, deleting so much personal information and data. He focused on the 2009 GeoCities shutdown (artists are now studying those sites via the Archive Team's copies, though curators bemoan the lack of metadata), as well as the Tabblo photo site, where people thank the Team for saving their only copies of family photographs. (I loved his idea of replacing 'cloud' in any statement with 'clown' - 'saving to the clown' - which may feel just as darkly comical someday when those sites close down.) Also, when Twitter decided to shut down Posterous, the Archive Team downloaded all the data so fast that Posterous asked them to stop! They do not make friends with sites, but they don't feel that matters if data is being saved. It is also worth considering Jason's issues with URL shorteners, which he feels are the worst recent internet idea - what happens if the link to the link doesn't work? The Archive Team is saving as many of the original links and their shorteners as it can to keep this information from being lost - though I have no idea how this would be implemented, I appreciate the effort.
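To make the 'link to the link' problem concrete, here is a hypothetical sketch (the shortener domain, slugs and mapping are all invented): if a dead shortener's slug-to-target mapping was archived in time, short links in saved documents can be rewritten to the preserved targets.

```python
# Hypothetical sketch: rewrite links on a defunct URL shortener to their
# archived targets, so "the link to the link" keeps working.
# (The domain, slugs and mapping here are all invented.)
import re

def rewrite_short_links(text, archived_map, short_domain="short.example"):
    """Replace links on a dead shortener with their archived targets;
    slugs that were never captured are left untouched."""
    pattern = re.compile(r"https?://" + re.escape(short_domain) + r"/(\w+)")

    def resolve(match):
        slug = match.group(1)
        return archived_map.get(slug, match.group(0))

    return pattern.sub(resolve, text)

snapshot = {"abc123": "https://example.org/original-article"}
doc = "See https://short.example/abc123 and https://short.example/zzz999"
print(rewrite_short_links(doc, snapshot))
# the first link is resolved; the uncaptured slug zzz999 stays as-is
```

However the real archiving is done, the essential asset is that slug-to-target snapshot, taken before the shortener disappears.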
In conclusion he showed us his twenty-year butterfly: a smudged, pixelated image, recreated on an emulator, of how an old Macintosh paint program looked on a poor monitor! Just as someone still thinks that image is worth celebrating, there will always be someone who thanks the Archive Team for what they have done and will do. We also need to remember that human beings are both the best and the most destructive force on earth - if anything still exists, it's because humans decided to do nothing. We should encourage inactivity - we don't know what the sites that were deleted might have become in 50 years. And Jason celebrated his friend Aaron Swartz - like the Archive Team, he worked in quasi-legal areas to do what he thought was right in preserving data.
T. Scott Plutchak of the University of Alabama at Birmingham then gave his talk 'Publishers and librarians: we share the same values, why are we fighting?'. Those of us at UKSG are some of the luckiest people alive - we are living in an era of innovation and we are part of it (this is our Gutenberg moment!). We are constantly reinventing the way research is done and how it is documented - there are myriad capabilities and opportunities in front of us. However, there are many challenges too: technological, cultural and social. In this era, publishers and librarians look at each other through caricatures; they are intimately connected, yet separated by those caricatures and by a lack of communication. He said that libraries are advocates of social media [though I'm not sure this is true across the board], whereas publishing is the antithesis of social-media networking - publishers make sure they get things 'right', and it doesn't have to be quick. Also, publishers tend to hoard their information where librarians of course want to circulate it - librarians are devoted to the exchange of information without the barriers of market forces, whereas publishers see market forces as a means of exchanging information. C. P. Snow's 'The Two Cultures' elaborates on a similar 'parallel worlds' split between the sciences and the humanities.
We are missing the chance to create new forms of scholarly communication through ignorance and a lack of trusted spaces. However, when you push the buyer-seller relationship out of the way, publishers and librarians have a great deal in common. Knowledge will make us better negotiators, partners and collaborators - this doesn't mean we always have to agree, because we won't! He gave advice to librarians: cool it with the emotions, attend some publisher conferences, be educated by your knowledgeable faculty - and buy a publisher a drink! He then said to publishers: you are lousy at telling your own story; be more transparent, be more open about your mission and decision-making, and it's worth sitting in on some library conference sessions to hear what librarians are really talking about (when they're not discussing pricing with you!). Willingness and openness are needed to realise that we are all good-hearted people who can work better together! This talk can be watched on YouTube and the slides are on the UKSG Slideshare page here.
Monday, 15 April 2013
Finch Forward: the evolution of OA
The opening plenary session
of this year’s UKSG conference was about open access. This is clearly a hot topic
at the moment, and the speakers emphasised the need for librarians to work with
other partners in steering the course of events towards the best possible goal
for everyone. I’ll summarise and paraphrase the talks below.
First up was Phil Sykes from
the University of Liverpool, whose talk ‘Open Access gets tough’ gave a brief
overview of the open access policy landscape in the UK over the last few years,
before moving on to what we can do next. Phil opened with an analogy between
the development of open access over the past two years and a puppy – it has
grown from an obedient friend quietly hanging out in the corner into a snarling
beast that we can’t ignore.
The proportion of UK research
publications that are open access has been slowly and steadily increasing for a
while now. This began to change in early 2011 as the UK government showed
that it was committed to open access when the Minister for Universities and
Science David Willetts commissioned the Finch Report, saying that the issue was
not ‘whether’, but ‘when’ and ‘how’ open access will become standard.
The Finch Report recommended
a hybrid environment with a strong focus on gold open access with article
processing charges (APCs), and Creative Commons CC-BY licenses. In light of
this report the two main funders of UK research, the RCUK and the Higher
Education Funding Councils, have announced their intentions to implement some
of the Finch Report’s recommendations by introducing a mandate that all
research that they fund should be open access. This is happening in stages, and
with allowances for some proportion of articles to be green OA rather than
gold, but the policy changes indicate that perhaps it is inevitable that all
the UK’s publicly funded research will be open access soon.
Phil cautioned us not to be
too certain about this because it is by no means inevitable. There is still a
lot of work to be done to make this happen, and Phil believes that librarians
can take a leading role. The window of opportunity may disappear if the
political landscape changes so we need to ‘make use of the improbable
opportunity we have now’. If, as a community, we don’t provide the right
support and ‘intelligent advocacy’, full open access might not come about.
Phil concluded by saying that
we are privileged to be at this transitional moment in democratizing access to
knowledge, but the exact nature of the change is not inevitable. This was an inspiring speech to open the conference
with and set the tone for the rest of the day’s talks on open access.
Fred Dylla of the American
Institute of Physics provided us with a US perspective in his talk ‘The evolving view of public access to the results of publicly funded research in the US’. Speaking as a physicist turned publisher, he pointed out that
publishers and librarians need to remember that they are working towards a
shared goal of public access to research. He then outlined the developments in
open access that have taken place in the US, which are somewhat different to events
in the UK.
There has been a lot of US government
interest in public access to publicly funded research but the difficulty of
passing any legislation in the current political climate has meant that it
falls to funding agencies to develop policies on public access. While the
intricacies of US funding policy were lost on me, the talk was a good reminder
that all countries are following a different path towards open access and the
UK is now moving faster than most.
The final talk of this
plenary session was by Jill Emery from Portland State University on ‘Mining for gold: identifying the librarian’s toolkit for managing hybrid OA’. Beginning with
a quote that the ‘mission of libraries is to improve society through
facilitating knowledge creation in their communities’ (R. David Lankes, Atlas of new librarianship), Jill talked
about the need for librarians to collaborate with other partners, whether they
are publishers or academics, and bring our traditional skills to bear on new
opportunities. For example, no library can now try and collect everything that
is published, so libraries can let go of that aim and focus their collections
locally.
A theme of this talk, and the
session overall, was that open access advocacy doesn’t need to be antagonistic
to publishers because they are a valuable and vital part of the knowledge
ecosystem. Jill also highlighted the fact that open access requires investment,
management, and collaboration at the institutional level – we can’t silo open
access as a ‘library thing’ because it involves so many different institutional
departments. One role libraries could take is to support the payment of APCs,
because they have an institutional overview (e.g. experience in budgeting
fairly across subject disciplines).
Having said that, are the
high APCs associated with hybrid open access publishing justified by the
prestige and impact that publishers provide? Librarians need to be questioning
this. We need to find out what we can negotiate on, such as publisher discounts
of APC fees being subtracted from big deal subscription costs; but we also need
to always bear in mind the requirements of academics who place much more
emphasis on prestige than cost.
The new digital students, or, “I don’t think I have ever picked up a book out of the library to do any research – all I have used is my computer.”
Plenary session by Lynn Silipigni Connaway, OCLC Research
Connaway's talk collected quotes and information from several studies from the US, UK, Australia and New Zealand.
Connaway introduces the talk by saying that "it used to be that the user built the workflow around the library, now we must build our services around the users' workflow. It used to be that resources were scarce and attention abundant. Libraries were the only game in town. Today, attention is scarce and resources are abundant". Learning of the opinions and behaviours of users through these studies provides an opportunity to rebuild the library around the user's needs.
Main themes drawn from the studies
- Convenience is King
Unsurprisingly, users prefer to use Google for its speed and convenience. Students commented on how easy it was to use the internet to find resources instead of driving to the library. Connaway explained that convenience changes with the context and situation, so we need to provide resources and services in a variety of formats, 24/7.
- Easily satisfied
Students use the resources that are delivered to them most easily - such as the first few results on a search engine search. Websites such as the British Library catalogue and WorldCat have found their biggest traffic to be pushed through from search engines, rather than users going direct to the sites.
User behaviors
- Power browsing: scanning small chunks of information, viewing the first few pages, no real reading.
- Squirreling: short basic searches, downloading content for later in a hoarding behavior.
- Self-taught: users' information skills are self-taught or learnt from other students and faculty.
- Lack of understanding of copyright and open access from both students and faculty.
Views of the library
- Inconvenient: limited hours, distance to library
- Physical materials: users ultimately associate library with physical books - users don't associate e-resources with the library.
- Website hard to navigate.
Connaway drew on the results of these studies to make the following recommendations:
- Improve OPACs (this is already happening at many libraries) and create seamless route from discovery to content delivery.
- Advertise resources and value of services.
- Provide help at the time of need - Chat and IM that appears on the search screen when users have problems with searching, based on the same model as retail websites.
- Design systems with users in mind, modeled on popular services.
- Focus on relationship building.
Another recurring theme of the talk, which was not explicitly tackled in the recommendations, was the opinion that online resources are part of an information 'black market'. Students felt they had to hide their use of online resources - for example, using Google Books to locate material but citing the print copies. This kind of secretive behaviour is an interesting phenomenon, and I feel librarians should be able to talk honestly with users about the resources they are already using (e.g. Wikipedia) and understand the motivation behind the behaviour rather than simply directing them elsewhere scornfully.
Connaway's talk was quite an eye-opener. Although, as librarians, we are very aware of the popularity of search engines such as Google and the influence they have on information-seeking behaviour, listening to direct quotes from users reveals the severity of the situation. Connaway explained that the only way we would believe her findings was if she quoted directly from the participants. Although it is tempting to hear some student quotes with dismay, it is more important that we use them to influence our service delivery in the future.
The video of the talk is available here.
Evelyn Jamieson
Missing the Conference Already? Join our Webinar Series!
Following the success of our introductory webinars, we are delighted to introduce the first part of our main webinar series. These are a fantastic alternative to our face-to-face seminars, giving you the information you need, without the travel!
Sessions are priced at just £50+VAT each, however discounts are available for multiple bookings made at the same time:
- One session: £50+VAT
- Two sessions: £45+VAT each (10% saving)
- Three sessions or more: £40+VAT each (20% saving)
Webinar sessions and dates are:
17th April 2013, 2pm: Managing e-content in an Academic library - hosted by Louise Cole, Kingston University
24th April 2013, 2pm: The business of publishing ejournals - hosted by James Pawley, Sage
29th April 2013, 10am: The role of subscription agents - hosted by Richard Steeden, LM Information Delivery
29th April 2013, 2pm: The importance of Intermediaries - hosted by Jane Wright, Swets
8th May 2013, 10am: Introduction to Access management - hosted by Mark Williams, JISC Collections
8th May 2013, 2pm: Shared Services, KB+ - hosted by Liam Earney, JISC Collections
This is a fantastic opportunity to listen to expert speakers. You will also receive a link to the recorded version of your chosen webinars to watch and listen back at a time most convenient for you.
To book your place on one or more webinars, visit www.uksg.org/webinars2013-1
If you have any questions, please contact Amelia (amelia@uksg.org).
E-journals and long-term availability: an overview and panel discussion on the archiving infrastructure to meet the needs of users
This afternoon break-out session tackled one of the issues that has been weighing heavily on the minds of librarians for some years now - that of long-term accessibility and archiving of e-journals. The popularity of the topic was reflected in a very full seminar room, and the high attendance enabled some lively debate and thought-provoking questions.
The session started with Fred Guy from EDINA outlining the causes of the current challenges, in essence explaining the differences between print and electronic journal subscriptions. Although for some attendees this information will have been nothing new, I felt it set the scene perfectly and it also allowed Fred to explain why a robust and reliable archiving infrastructure will be essential for ensuring long-term access to e-journals. This led on nicely to an explanation of the Keepers Registry (www.thekeepers.org), a project funded by JISC which aims to collate information about the existing archiving arrangements for e-journals. Fred explained that various agencies have signed up to become stewards of digital content whereby, in addition to taking on responsibility for actually archiving e-journals, these agencies also submit metadata to the Keepers Registry about all of the journals in their programmes. In short, the Keepers Registry is an aggregation of metadata supplied by numerous archiving agencies. The metadata supplied includes bibliographic information such as journal title and ISSN (checked against the ISSN Register), the name of the agency responsible for archiving, and holdings information about which volumes have been archived. For the librarian, this means that a simple web search will reveal whether a particular title is already being archived and who is responsible for doing so.
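As a toy illustration of this kind of aggregation (the record layout and field names below are my own invention, not the Registry's actual schema), merging per-agency holdings reports by ISSN is what makes the single-lookup view possible:

```python
# Toy sketch of metadata aggregation: each archiving agency reports which
# volumes of a journal it preserves; grouping those reports by ISSN lets a
# librarian see at a glance whether a title is archived, and by whom.
# (Invented record layout - not the Keepers Registry's real schema.)
from collections import defaultdict

def build_registry(agency_reports):
    """agency_reports: iterable of (agency, issn, title, volumes) tuples."""
    registry = defaultdict(list)
    for agency, issn, title, volumes in agency_reports:
        registry[issn].append({"agency": agency,
                               "title": title,
                               "volumes": sorted(volumes)})
    return registry

reports = [
    ("UK LOCKSS", "1234-5678", "Journal of Examples", {1, 2, 3}),
    ("CLOCKSS",   "1234-5678", "Journal of Examples", {2, 3, 4}),
]
registry = build_registry(reports)
# One lookup answers: is this title archived, and who holds which volumes?
for entry in registry["1234-5678"]:
    print(entry["agency"], entry["volumes"])
```

The overlap between agencies is a feature, not a bug: redundant holdings are exactly what long-term preservation relies on.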
Adam Rusbridge (also from EDINA) then gave an overview of one of the archiving agencies, UK LOCKSS, that is currently submitting data to the Keepers Registry, illustrating the role that libraries can have in this process. The UK LOCKSS programme (whose name is an acronym for 'Lots Of Copies Keep Stuff Safe') helps libraries to build local archives of their own digital content by providing them with the specialist tools and software required to do so. Provided the library has perpetual access rights to its content, this data can then be transferred to the Keepers Registry and thereby shared among the community.
The second half of the session comprised an invited panel being asked to consider two pertinent questions. The panel - Joanne Farrant (University of Cambridge), Bill Barker (LSE), David Prosser (RLUK) and Lorraine Estelle (JISC) - again helped to put the issues into context and show how things are turning out in practice. The first question posed to them was "How has the introduction of e-journal preservation services helped librarians withdraw print collections and focus on e-journals?". There was unanimous agreement that e-journal preservation services have given libraries the confidence firstly to cancel print subscriptions and secondly to dispose of back-runs. For example, Bill Barker shared that his institution has been able to cancel many print subscriptions provided three criteria were met: no cost difference, ease of access, and storage of the e-version in a secure archive with guaranteed perpetual access. Similarly, David commented that the widespread removal of duplicated print runs has only been possible because e-journal preservation projects have given librarians faith in e-journal storage and preservation. Powerful statistics support these statements: UKRR has already released 15km of shelving through de-duplication schemes and hopes to increase this to 100km by 2015. The freeing up of such vast areas obviously offers huge opportunities for libraries.
The second question set to the panel was "How can institutions, community bodies and service providers best work together to ensure sustainable, long-term initiatives?" The main themes in the panel's answers included uniformity, standardisation and the need for even more collaboration. Anne voiced her concern that there are already multiple archiving agencies, whereas David suggested that the lack of uniformity between the terms and conditions of different publishers (and even between different titles from the same publisher) was a more pressing issue. Day-to-day concerns were also raised, in particular the need for full coverage of an entire journal run (at the moment coverage may be incomplete, with some years missing). The session finished with Bill pointing out the great opportunity that libraries now have to incorporate this wealth of archiving information into the new library management systems being developed.
Talking to other attendees after the session I know that I was not the only one to have found it highly informative. Many of us came away with a sense of reassurance that an awful lot of people are working very hard to ensure continued access to e-journals.
Priyanka Petinou's Conference Programme
Priyanka Petinou from the University of Gloucestershire has put together a great document based on our very own conference programme, with direct links to the conference videos and slides.
Download her document here.
Thanks Priyanka!
"Great Expectations": how libraries are changing to meet student needs
In this breakout session Liz Waller and Sarah Thompson of the University of York gave an overview of the initiatives taking place at York, Durham and Newcastle University Library Services to meet the increasing expectations of students. A lot of work is now being done in this area, as students have to pay fees (£9,000 pa in 2012/13) and students' needs are changing. Increasingly, students believe that course costs should cover things like travel between campuses, software and course booklets.
At the University of York a number of surveys are undertaken, including the National Student Survey (NSS), which is a big driver for change. (There is actually only one question in this survey which specifically mentions library services!)
Liz and Sarah wanted to discover how other libraries are trying to improve the student experience, so they emailed other library services to find out, among other things:
- Are libraries buying more resources?
- Are they deploying new strategies?
- To what extent are student fees a driver?
- Is there any difference in the service provided?
- Has additional funding been made available?
There were 23 responses from a broad range of library services.
New Resources
17 libraries have purchased new information resources. Examples are: reading list materials; more e-books; targeting purchases to support specific subject areas; introduction of a fast-track student request system; reading list software purchased to manage lists; print textbooks given to students.
New Strategies
16 libraries are deploying new strategies to deliver content to students. Examples are: buying print textbooks for first year students; providing pre-loaded devices to students, e.g. Kindles; implementation of a resource discovery tool; e-first policy for books.
Service Improvements
22 out of the 23 libraries improved service provision in some way this year. Examples are: longer opening hours; refurbishment; improved wi-fi; free/reduced printing; consideration of abolishing fines; improved IT support in library buildings; improved student-library liaison.
It seems that a lot of the changes libraries are making have been planned anyway. However, higher student fees seem to have given the opportunity for more dialogue between all groups involved and money is occasionally forthcoming to make these changes.
Case Studies
Durham has done the following things:
- 24/7 opening during exam term. Looking into doing this during vacations.
- More books (student-led).
- More PCs. Laptop loans.
- More study spaces.
Durham have found that the NSS has been the biggest driver. Use of Patron Driven Acquisition (PDA) is strong. They are now collecting statistics from the entry gates.
Newcastle has done the following things:
- 24/7 opening all term time. There is explicit student need for this, as well as wanting to keep up with the competition.
- Removal of stock to off-site storage.
- Strong move to silent study. Space design is being driven by students, working in partnership with the library.
- Use of Primo.
There has been excellent feedback from students.
York has done the following things:
- Refurbishment.
- 24/7 opening.
- Improved communications and marketing. Libraries are not normally good at doing this. The library at York uses Twitter and Facebook. News is distributed in different ways.
- Bought subject specific resources. Individual discussion with departments.
- Service developments, including flexible loans, room-booking systems and LibGuides. There are now specific staff in the library who can answer any questions users have quickly. 'Grab and go' questionnaires have also been undertaken in the library to find out first-hand the issues customers have.
- Fast-track purchase of heavily requested materials.
- Made changes to the loans system (inspired by the University of Sheffield). Now no fines unless an item is requested; 4 week rolling loan period; situational borrower categories (looking at behaviours); dynamic loan periods (longer time given to part-time students).
- Fast-track orders.
- Electronic text service (digitising).
- More e-books available, including all key texts.
- Free inter-library loans for essential books.
The impact has been positive. The library scored 82% in the NSS, compared to 74% the year before.
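The loan rules York describes (no fines unless an item is requested, a 4-week rolling loan, longer dynamic periods for part-time students) can be sketched in code. This is a hypothetical illustration only: the period lengths and borrower-category names below are my assumptions, not York's actual configuration.

```python
from datetime import date, timedelta

# Illustrative values -- assumptions, not York's real settings.
STANDARD_LOAN_WEEKS = 4    # the 4-week rolling loan period
PART_TIME_LOAN_WEEKS = 8   # "dynamic" longer period for part-time students

def due_date(issued: date, borrower_category: str) -> date:
    """Return a loan's due date, varying the period by borrower category."""
    weeks = PART_TIME_LOAN_WEEKS if borrower_category == "part-time" else STANDARD_LOAN_WEEKS
    return issued + timedelta(weeks=weeks)

def fine_due(days_overdue: int, item_requested_by_other: bool) -> bool:
    """No fines accrue unless another borrower has requested the item."""
    return item_requested_by_other and days_overdue > 0
```

The interesting design point is that the fine depends on demand for the item rather than lateness alone, which matches the Sheffield-inspired model described above.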
Some Thoughts
The session concluded with the following questions, which libraries need to think about.
- Has this work given us more leverage?
- Are students more demanding?
- Has student behaviour changed?
- Do we understand the impact of the things we are trying?
- Are we giving money back or are we investing in the future?
- Should we see students as partners rather than customers or consumers?
- Do we have to do more with less? Is there conflicting pressure to reduce recurrent costs across the University?
Breakout A: ALMA@UEL and Intota@Huddersfield: implementing a next-generation library management system
Adjoa's presentation: http://www.slideshare.net/UKSG/adjoa-boateng-18475762
Dave's presentation: http://www.slideshare.net/UKSG/ukgs2013-dave-pattern
Adjoa Boateng talked about the University of East London’s experiences as an early adopter of ALMA (ExLibris’ new cloud-based library services platform), while Dave Pattern of the University of Huddersfield discussed his institution’s project to look at Intota as the basis for a library management system for them.
UEL, with over 28,000 students across 3 campuses, has been running ALMA live since August 2012 – Adjoa shared their implementation timetable with us from March 2012 onwards, including sandbox access, migration of data and various training periods. Much of the work was done with ExLibris, though their own in-house systems team also provided and continues to provide local expertise. They had a fairly strict deadline as they were unable to work on-campus from late July to mid-August due to the London Olympics (Adjoa did say this was a great way to test ALMA’s flexible working!).
UEL made the switch to ALMA to improve their technology, innovation and global links while reducing their costs. Adjoa emphasised that it was important to bring the whole team on board with the full process, while still identifying their individual responsibilities and workflows and planning training accordingly for project groups.
As for their thoughts post-integration, it may still be too early to give a definitive response. Adjoa stressed the importance of communication and flexibility in dealing with changing functionality and revised workflows – they have had to be adept at ‘thinking outside the box’. She talked about what had gone well (mentioning data migration, SFX migration, ALMA configuration, integration with Primo and core functionality for most products), but she also went into detail about what hadn’t gone well and how they and ExLibris were dealing with these issues. For example, self-service was blocked by IT due to patron loader issues and security concerns – as 90% of UEL checkouts had been self-service before the switch, this caused major knock-on effects and a need to re-educate their users; it is still not totally sorted. She also talked about issues with British Library interloans, hold requests, renewals, fines and email notifications, as well as back-end problems with e-resource workflow and serials check-in, but said some of these problems were not just ALMA issues but also the result of decisions they made during data migration.
The ExLibris project team and developers have been very helpful, with weekly post-implementation project calls, as well as fortnightly calls with project managers and developers; they even staged a 13-day meeting on the UEL campus with local staff to look at resolution of the problems listed above. Fixes have been included in monthly releases, with hot fixes issued as necessary. Any outstanding problems are being looked at in priority order (with ILL issues being a top priority), and they are also looking toward the future for further development.
Dave Pattern then started his presentation on the JISC HIKE project… or, as his first slide put it, "40 slides and a kitten!". ‘HIKE’ is an acronym for ‘Huddersfield Intota KB+ Evaluation’ – Huddersfield’s need for a new library management system has led to this project considering whether Intota might suit their needs, while also allowing them to implement cultural change and move the library forward. Dave also made reference to JISC Knowledge Base+ (or KB+) - a shared community service which will help UK libraries manage their e-resources more efficiently - and pointed us to Liam Earney's plenary on Wednesday for more information. The HIKE project blog linked above has further information, including workflows, with the final report to be added soon.
Huddersfield still uses SirsiDynix Horizon as implemented in 1995 – they went through a full tender process in 2005 but didn't buy anything, and there has been limited development on their version of Horizon since 2007. Horizon has a new version now, but as Huddersfield want to change anyway they are resisting movement; this is causing problems with the legacy system, including security concerns. In addition to Horizon, they have used Intellident RFID for self-service since 2006, and they also use Serials Solutions Summon as implemented in 2009 (previously Ex Libris MetaLib & SFX). Dave talked about the problems with traditional monolithic library systems: limited interoperability, no APIs, fixed workflows, duplicated effort, and designs centred on print. He quoted his Huddersfield colleague Graham Stone, who pointed out that they needed to understand current workflows, as well as the problems and frustrations their staff and users feel, before they could identify what to choose in a library management system and what to expect from it.
Intota is still in development (no demos or screenshots were included) and should be fully implemented in 2014. However, Huddersfield is an existing Serials Solutions customer (360 Knowledge Base etc.) and thus already works with them; Serials Solutions were also interested in working with KB+, so they welcomed the opportunity to study the integration of KB+ with their commercial system. Huddersfield also wanted to look at a new system as it developed to see what might be possible! The HIKE project aims to evaluate the projected functionality of Intota and its APIs, as well as putting together case studies and evaluating the suitability of Intota for UK HE as a whole. It is also a useful opportunity for them to make a thorough analysis of acquisition workflows for both print and e-resources – no one knew everything about all the workflows, so it is useful to document them, with a view to change and improvement, and to build a wish list for a new system (including multi-tenancy secure SaaS, a linked data model, a central knowledge base, streamlined workflows for print and electronic, open APIs etc.). Dave also mentioned Jill Emery and Graham Stone's TERMS (Techniques for Electronic Resource Management) project and how it influenced HIKE – this was being discussed at another breakout session at UKSG.
Huddersfield wants to move from batch to real-time processes, getting away from duplicating data and effort. Dave went through the old workflows for ordering e-books and how they could be improved and sped up to the benefit of the enquirer. Whatever system they go with, it has been useful putting their 'electronic house' in order and realising they can throw away their 'legacy baggage'. Further automation and interoperability should free up staff time for more interesting jobs and collaboration – staff would be less tied to certain roles. Intota offers a total approach from discovery to the back room which could work well with this new model – the final project report with more information is imminent, watch this space...
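To make the batch-versus-real-time contrast concrete: instead of queuing e-book orders for an overnight batch load, an open API lets the order be placed (and a discovery record created) the moment a request arrives. This is a hypothetical sketch only – the endpoint, payload fields and authentication scheme below are invented for illustration and are not Intota's actual API.

```python
import json
from urllib import request

# Invented base URL for the sketch -- not a real LMS endpoint.
API_BASE = "https://lms.example.ac.uk/api/v1"

def build_order_request(isbn: str, requested_by: str, api_key: str) -> request.Request:
    """Build a real-time e-book order as a single HTTP POST (not sent here)."""
    payload = json.dumps({"isbn": isbn, "requested_by": requested_by}).encode()
    return request.Request(
        f"{API_BASE}/orders",
        data=payload,
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {api_key}"},
        method="POST",
    )

# Sending is then a one-liner at the moment the enquiry arrives:
#   with request.urlopen(build_order_request(isbn, user, key)) as resp:
#       order = json.load(resp)
# rather than writing the request to a file for an overnight batch job.
```

The point of the design is that each step (order, record creation, notification) becomes an immediate API call rather than duplicated data entry across systems.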