JISC HIKE Project Workshop – 26th February

Dave Pattern opened the workshop with a welcome to all the participants over coffee before introducing Jane Burke from Serials Solutions. Jane presented an overview of Intota to the workshop. She began by discussing how recent changes in the format of the library’s collections – the move to a predominantly e-based collection, the subsequent revision of acquisitions, and the increased purchase of packages over individual titles – have meant that we are now using yesterday’s systems and tools to do today’s jobs. With the old LMSs and their corresponding workflows designed around the acquisition, maintenance and discovery of print material, the move towards e-resources means that they are increasingly not fit for purpose. Jane then gave an update on the development of Intota, announcing that they hope to have the Assessment module ready for customers in 2013 and the full release of Intota in 2014. She finished by giving a demonstration of the proposed workflow for acquiring a resource.

Damyanti Patel from JISC Collections then spoke to the workshop about KB+. She opened with a discussion of the rationale behind KB+ and how it developed out of a recognition of the need for accurate data and subscription lists, and a realisation that every Journals Librarian across the UK was duplicating work as they all tried to maintain an accurate list. She then provided an update on the subscriptions currently on KB+: the team started by populating the site with NESLi2 collections but have quickly moved on to looking at non-JISC and non-NESLi2 collections. Damyanti finished by talking about the future of KB+: how they are hoping to add historical data to the site, work with international partners, improve integration with other systems such as ELCAT, JUSP and 360, and also expand KB+ to cover ebooks.

Damyanti has blogged about her day here: knowledgebaseplus.wordpress.com/2013/03/04/jisc-hike-project-workshop/

Dave Pattern and Graham Stone then presented an overview and update on the HIKE project – eprints.hud.ac.uk/16837/.

The afternoon session focused on three main areas of discussion: workflows, cultural change, and APIs and interoperability. Having done a lot of work around these areas for Huddersfield, we were interested to see whether other institutions were experiencing the same issues or, if they were having different issues, what these were, so we could factor them into our evaluation.


The intended outcome of this discussion was that the HIKE team would understand other institutions’ workflows, the pain areas where they felt efficiencies could be made, and how the new systems of KB+ and Intota could help them.

Integration between the Library Management System, Reading List Software and Registry

This was raised by a number of different institutions as an area where they felt efficiencies could be made. At the moment many LMSs have no integration with their reading list software, registry or book suppliers, so staff have to manually check the reading lists before placing an order with the supplier.

The University of Leicester reported basic integration between their LMS, reading list software (Talis Aspire) and book supplier (Coutts). Academics create or edit their reading lists in Aspire, which uses Z39.50 to link each reading list item to a catalogue record as the list is being created. These lists are then reviewed by the Librarians who make the purchasing decisions; using a link on Talis Aspire, the Librarians can click straight through to Coutts Oasis to place their order. Orders are then loaded overnight on to their LMS, SirsiDynix Symphony, via EDI. Although this integration developed by Talis has helped reduce the amount of time spent checking the reading lists and inputting book orders manually, staff are increasingly hoping for a completely automated process. It was agreed that, ideally, the reading lists created by the Academics, with items marked either essential or suggested, would generate automatic orders based on a formula designed by the Librarians, drawing on information from Registry about the number of students enrolled on the module. The orders would not only go direct to the supplier but would also create an order record within the LMS so it would be possible to identify items that were on order.
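To make the discussion concrete, here is a minimal sketch of the kind of ordering formula the Librarians might design. Everything in it – the function name, the importance levels and the readers-per-copy ratios – is an invented illustration, not part of Talis Aspire, Intota or any real system.

```python
# Illustrative ordering formula: copies to buy for a reading-list item,
# given module enrolment from Registry and the academic's importance flag.
# The ratios (1 copy per 10 essential readers, 1 per 30 suggested) are
# placeholders a library would tune for itself.

def copies_to_order(students_enrolled, importance, copies_held=0):
    """Suggest how many copies to order for one reading-list item.

    importance: 'essential' or 'suggested', as marked on the list.
    copies_held: copies already in stock, deducted from the suggestion.
    """
    ratio = {"essential": 10, "suggested": 30}[importance]
    wanted = -(-students_enrolled // ratio)  # ceiling division
    return max(wanted - copies_held, 0)
```

Any real formula would also need the cross-subject overlap caveat raised below factored in, since two modules sharing a text would otherwise each trigger their own order.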

During this discussion a few points were raised that must be considered when developing or implementing this integrated and automated process. Firstly, this process would not take into account any cross-over between subjects, such as English and History or Maths and Physics, where students have traditionally shared books. This could result in a large number of surplus books. Secondly, this automated procurement process would need to be considered when developing the integration between Intota and a financial system.

Other systems

Many of the delegates raised the lack of interoperability between the LMS and a wide variety of systems as a particular pain point. These included all of the above, as well as subscription agents, publishers and email. One major problem was the inability to record this kind of information in the LMS – of course, this is something that KB+ is offering.

The lack of interoperability has led some to by-pass the LMS completely.

Knowledge Base +

An issue raised regarding KB+ was that it was not yet embedded into current workflows – this could cause a problem even if libraries subscribed: if it is not part of the workflow, it won’t get used. This is something for HIKE to consider when looking at the ‘dream workflow’.

A point was also raised regarding the amount of human intervention in the current workflows, and whether KB+ could offer rules to put into place to improve efficiencies and prevent human error.


Another area where it was agreed that duplication of work and the risk of error could be reduced was within the financial workflows. Like us, the majority of the institutions duplicate all their financial accounting in both the LMS and their institution’s financial management system, and have the same problems that we have outlined in earlier posts. It was agreed that interoperability between Intota and the financial system is highly sought after. Again, a number of points were raised that would need to be considered when developing and implementing this integration, namely how the system would deal with:

  • the top slicing of budgets, which frequently occurs in Libraries
  • the split responsibilities of different subject areas between different Librarians; and
  • the subscription to multi-year deals and the commitment of money through the years.
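On the multi-year point, here is a sketch of what committing money through the years might look like in data terms. The fixed annual uplift is an assumption for illustration only; real deals vary, and nothing here reflects how Intota or any financial system actually models commitments.

```python
# Illustrative spread of a multi-year deal's commitment across financial
# years, with an assumed fixed annual price uplift.

def multi_year_commitments(start_year, years, annual_price, uplift=0.0):
    """Return committed spend per financial year for a multi-year deal.

    uplift: assumed fractional annual price increase (e.g. 0.05 for 5%).
    """
    return {start_year + i: round(annual_price * (1 + uplift) ** i, 2)
            for i in range(years)}
```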

Acquisitions workflow

One of the issues raised when looking at the acquisitions workflows was the marked difference between supplier databases, along with an ongoing problem with out of print books.

It was also suggested that, in the next national book contracts, technology needs to be a driver for choosing the contracts and that more attention needs to be paid to workflows; it was even suggested that, with Intota, EDI might be surplus to requirements!

Finally, nobody had a solution for the back office pain of dealing with the huge files involved with PDA.

Cultural Change

Throughout this project we have been aware that the implementation of either or both of KB+ and Intota would lead to significant cultural change, and that if the implementation of these systems is to be successful, how this change is managed will be important. Therefore the theme of our second discussion was cultural change – we were interested in finding out what issues delegates thought their colleagues might have with the change, and how these changes could be managed.

One of the main concerns people felt their colleagues would have was that the automation of many of the processes that make up their jobs would lead to de-skilling, loss of knowledge and less interesting jobs. Others thought that colleagues would be unsettled by the change in responsibilities and tasks, as it may require additional training and the learning of new skills. Other factors that could contribute to colleagues feeling unsettled and anxious about such changes are changes in routine, a lack of control and a feeling of incompetence.

After identifying the factors that could cause worry and apprehension about the implementation of these new systems, we moved on to discuss how such a change could be managed to alleviate many of these concerns and allow the systems to be introduced successfully. Everyone agreed that the most important contribution to change management was to ensure that everyone was comfortable with the change and that staff at all levels had the relevant information, were fully involved and actively participating from the beginning.

It was felt that this could be enabled through a series of workshops which members of staff could come to and identify for themselves where there is duplication, a high risk of error and the points of pain in the old system, and then help to define how the new system would bring benefits. It was felt that such a workshop would only work in an environment where staff feel comfortable coming forward to express their concerns and anxieties about the new systems without being criticised or judged – staff need the opportunity to moan. One suggestion at this point was the use of an external moderator for such workshops. It was also suggested that these workshops should continue after the implementation and evolve into a user group where staff regularly evaluate the system and provide feedback to the company about possible developments. Staff need to understand the journey and help to identify the skills gaps.

It was suggested that we need to evolve people into new jobs. One way of offering reassurance to staff would be to show how the time freed up by the automation of processes would be used; it was suggested that this was not just about giving staff mundane tasks but about giving them the opportunity to develop themselves through participation in projects, etc., and to show how this would benefit the library. The timing of the installation of a new system was also believed to play an important part in how the change is perceived by staff. While implementation at the busiest period of the year was not recommended, it was thought that it should happen during a moderately busy period in order to demonstrate the effectiveness and benefits of the new system.

However, one group wondered to what extent have we already moved towards change anyway? At least two universities present have gone down the route of having a single team that swaps tasks, and others were thinking about doing the same thing. This was also linked back to the discussion on workflows – a possibility is that we could adopt one workflow for all resources – would this lead to one team, or would this spread things out too thinly? Do we still need experts in certain areas?

Another useful point was that many ‘back room’ teams have been dealing with change for some time – the biggest impact may actually be on the subject teams as their role may change, e.g. PDA vs. subject librarians’ orders. It was felt that these teams need to be engaged from the outset as there is a clear tension between the need to do more outreach work and ordering resources at a granular level.

It was also pointed out that we live in a constantly changing and developing world, and it is important that institutions and workflows have enough flexibility to keep pace with these developments. Therefore it is important that, using all the ideas above, we create an environment that is safe, comfortable and open to change. Intota is part of a suite of changes and it is our responsibility to adapt to them.

Finally it was argued that things take time and we have learnt many lessons already from our implementations of Summon. However, if we don’t make our processes more efficient it’s only a matter of time before somebody else does!


In the final session of the day, groups attempted to come up with lists of APIs and stuff they wish they had – or would want in a new system:

  • To talk to the financial system
  • Less duplication of effort – we are always trying to reconcile things
  • A wish list system
  • A way of reporting problems to all systems without having to re-write the same query three times
  • To be intelligent about students, e.g. on placement – linking student records to the student information system
  • An integrated VLE
  • Integration with every operating system in the University!
  • Active directory – smartcards etc.
  • Integrated with reading lists
  • RFID – Can we GPS track the orders
  • Notification of reservations
  • Could we give more information than just ‘on order’ or ‘reserved’  – e.g. use supply times from vendors to say when an item is expected?
  • Integrated ILL – not just with the British Library, but other local libraries too
  • Ethos!!!

Some thoughts on what we don’t want!

  • FTP
  • EDI
  • Imports and exports
  • Student records in the LMS – no duplication of data!
  • Single point of failure, e.g. staff who own important pieces of information

However, before we got too carried away, we also thought that removing the library catalogue completely might be a step too far for some – back to evolving our users’/staff needs through cultural change.

With thanks to all who attended the HIKE workshop for their invaluable thoughts and feedback!

Meeting with Finance to discuss interoperability between Intota and Agresso

We recently met with some colleagues in the University Finance department to discuss the procurement process for books. We covered our current workflows, an ideal workflow and the possible interoperability between Intota and Agresso that would be needed to facilitate this.

We began by discussing the current workflows (which can be seen in the previous blog post ‘Analysis of ordering processes in the acquisitions team at Huddersfield’) and agreed that the areas we have highlighted as pressure points that could be streamlined were definitely areas we needed to consider rationalising. We also looked at how our present workflow represents a “financial danger zone” for the University, as it could lead to delays in the financial commitment for outstanding orders. Given the current period of economic austerity, and the fact that Agresso is used for budget monitoring across the whole University, it is crucial that Agresso has accurate and real-time information available at all times for the University’s Senior Management team and their strategic planning.

In light of this we moved on to discuss a possible workflow between Intota and the eMarketplace portal from Agresso in order to ensure reliable information.

It was proposed that Intota could be set up on eMarketplace as a supplier and that to order books we would log into Agresso, select eMarketplace as the procurement option and then punch out to Intota as a supplier. Once in Intota we could search for the resource we would like to buy using title, author, ISBN, etc. and this would then search all the different suppliers and return our results. We would then be able to select the items we would like to purchase and place them in a basket. After selecting all the items we would like to buy we could then return to Agresso and retrieve our ‘shopping’, pulling all the items we have placed in our basket back into Agresso. This would create a Purchase Order with each individual item having its own line. At this point it would be possible to select the correct Cost Centre and Nominal to charge the item to or split the price between different Nominals if needed.
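A rough sketch of the shape the retrieved ‘shopping’ might take, assuming one Purchase Order line per basket item carrying its own financial coding. All the field names here are invented for illustration; they are not Agresso’s or Intota’s actual schema.

```python
# Hypothetical conversion of a punch-out basket into purchase-order lines,
# each line carrying the Cost Centre and Nominal chosen back in Agresso.

def basket_to_po_lines(basket, cost_centre, nominal):
    """basket: list of dicts with 'isbn', 'title', 'price', 'supplier'.
    Returns one PO line per item, numbered from 1 (the line number also
    matters later, e.g. for RFID receiving)."""
    lines = []
    for n, item in enumerate(basket, start=1):
        lines.append({
            "line": n,
            "isbn": item["isbn"],
            "title": item["title"],
            "price": item["price"],
            "supplier": item["supplier"],
            "cost_centre": cost_centre,
            "nominal": nominal,
        })
    return lines
```

Splitting a single item’s price across several Nominals, as mentioned above, would need each line to carry a list of (nominal, share) pairs instead of one Nominal.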

We then raised the possibility of splitting the book Nominal in Agresso down further to reflect the different subjects’ budgets. Horizon (our current LMS) offers the opportunity for each subject to have its own fund code, and each year there can be up to thirty different fund codes. Our colleagues assured us that this would be possible in Agresso by using the categories within a Nominal; we would just need to inform them of the names of the different sub-sections of the Nominal we would like them to set up.

Once the ‘shopping’ has been pulled back into Agresso and assigned to the correct Cost Centre and Nominal, it would be sent to the budget holder for approval. Once the whole order has been approved, it is sent to the supplier or suppliers and the money is committed on Agresso. Upon receiving the books into the Acquisitions department we would need to receive the items on Agresso; this would allow the electronic invoice sent by the supplier to be paid automatically, by either BACS or credit card depending on the preference of the institution. However, the payment method for the supplier would have to have been set up in advance, when the supplier was created, and for payment by credit card to be possible the supplier must have the facility to accept online payments. Agresso also has the functionality, providing the correct fields are known, to send a file to update Intota and mark the items received and available. This file could be programmed to update Intota at regular intervals, the frequency of which can be determined by the institution. It was noted that this workflow would not create an order record within Intota and that the item would only be recorded on Intota after it had been received in Agresso. Is this a problem? Would we need to know which books are on order? After a brief discussion we decided that this was something we would need to discuss further; however, it may possibly be something for Intota to consider – the ability to create an order record from the ‘shopping basket’ which is exported to Agresso.

We then looked at RFID receiving, currently in practice at UCLan, and questioned whether the workflow above would still be possible if this was introduced. Our colleagues said that Agresso can currently read HTML and barcodes, therefore it may be possible for it to read the information in an RFID tag to receive the item. However, it was stressed that the line number of the order would have to be programmed into the tag in order for Agresso to receive the item and reconcile the financial information.

Finally, we discussed reporting, budget management and planning options. It was agreed that if all the information in Agresso was accurate and in real time, it would be possible for those who manage the budgets to continue using Agresso. However, it was pointed out that few Librarians use Agresso and that they may feel more comfortable accessing the information they need in Intota. The web version of Agresso offers a homepage which can display real-time information relating to selected budgets, either as figures or as graphs; we therefore wondered whether it would be possible for Intota to pull this information, via an API, into the Intota dashboard. If the information was pulled each time the user logs on, it would ensure the figures were accurate.
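As a sketch of what those dashboard figures might involve, assuming the allocated/committed/spent numbers have already been pulled from Agresso (no such API is documented here, so this only shows the derivation, not the fetch):

```python
# Illustrative budget summary for an Intota-style dashboard. The field
# names are invented; in practice the figures would come from Agresso
# via an API call made each time the user logs on.

def summarise_budget(figures):
    """figures: {'allocated': ..., 'committed': ..., 'spent': ...} in GBP.
    Adds the remaining balance, the number a budget holder checks first."""
    remaining = figures["allocated"] - figures["committed"] - figures["spent"]
    return {**figures, "remaining": remaining}
```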


In order for Agresso to pay the correct supplier after pulling the ‘shopping’ in from Intota via the eMarketplace, the suppliers would need to have been set up in advance. Intota would also need to provide unique identifiers for its different suppliers, so that the supplier of each item placed in the basket can be identified and that information brought back into Agresso to enable the correct supplier to be paid.

We currently charge the servicing costs and MARC records to different Nominals; therefore we would need this pricing information to be available through Intota, and for it to be pulled back into Agresso through eMarketplace, in order for us to assign it to the correct Nominal.

KB+ and the renewals process

After the release of the ‘renewals’ feature in KB+ in December 2012, which aims to facilitate the renewals process undertaken by Journals teams throughout the UK HE sector, we thought it would be useful to provide some feedback to the KB+ team. The feature intends to simplify the journals renewals process, specifically big deal renewals – both mid multi-year deal renewals and the renewal of the big deals themselves. The renewals tool will help to maintain an accurate list of titles within the different subscriptions we subscribe to, identify the dates we have access to, and flag any core titles within the collection. It is also hoped that, as each year’s subscription details are uploaded into the system, KB+ will provide historical documentation of all the titles we have subscribed to, the access we have to them, identification of the core titles with any changes tracked, and a record of changes in publishers and titles through the years.

The ‘renewals’ feature on KB+ allows you to compare your current subscription, with the core titles clearly identified, against the proposed renewals package from the publisher. It is also possible to compare these with any other journals packages offered, and you are not restricted by publisher: for example, you could compare Package A year X with year Y, but also with Package B, etc. The comparison is clearly displayed in a colour-coded spreadsheet and identifies the titles available in the package, highlighting those titles that are missing from the previous year’s collection and any new titles that have entered the journals package. This enables us to identify any issues at a glance. However, we wondered if it would be possible for the spreadsheet also to include why some of the titles are missing – for example if they have ceased publication, transferred to another publisher, combined with another journal or split into two journals – or whether this information would be something each individual library would need to chase up after the title’s absence from the journal package has been drawn to its attention by KB+. Additionally, we wondered if it would be possible for the spreadsheet to identify which journal titles are hybrid (subscription and OA) and which are OA. We don’t want to be paying twice for an OA title, so we could do with seeing the percentage of OA content.
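The comparison itself boils down to simple set logic. Here is a hedged sketch using plain Python sets – it mirrors what the colour-coded spreadsheet shows, but it is our illustration, not anything KB+ actually exposes:

```python
# Illustrative package comparison in the spirit of the KB+ renewals screen:
# which titles are unchanged, which have dropped out of the new offer, and
# which are new to it.

def compare_packages(last_year, this_year):
    """last_year, this_year: iterables of journal titles.
    Returns sorted lists so the output is stable, like a spreadsheet."""
    last, this = set(last_year), set(this_year)
    return {
        "unchanged": sorted(last & this),
        "missing": sorted(last - this),   # would be colour-flagged
        "new": sorted(this - last),
    }
```

The reasons a title is missing (ceased, transferred, merged, split) are exactly what set arithmetic cannot tell you, which is why we raised the question above.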

Once this comparison spreadsheet has been downloaded from the renewals section, it is possible for you to amend the details for the coming year, e.g. add the core titles, to reflect your holdings before uploading the spreadsheet to KB+ to record the coming year’s holdings. For this document to be used as a record of historical entitlements and to reflect accurately the current holdings of the institution, additional information – the format of the title (print, electronic or both) and the dates of access – would also need to be included in the comparison spreadsheet. We would also like to be able to use the data in the spreadsheet relating to the renewal as a master version that could be uploaded (perhaps automatically) into Serials Solutions to amend the knowledgebase to reflect our new holdings accurately.

The KB+ ‘renewals’ feature will greatly help the Journals teams by:

  • gathering the information on the titles that have changed title or publisher, ceased or are new in the package
  • displaying it in an easy to read and interpret colour-coded spreadsheet
  • highlighting the titles which have changed from last year.

It will save time and automate one of the tedious, manual jobs that has to be undertaken each year. We also think it could be improved even further if JUSP (Journals Usage Statistics Portal) information were also included on the comparison spreadsheet. This would mean that if a title was missing from the collection during the coming year, we would be able to evaluate the impact this might have on our institution by analysing the usage stats from the previous year; e.g. if a title had only a handful of downloads in the previous year we might not mind losing it from the collection, whereas if a ‘non-core’ title was highly accessed we might want to consider a subscription with the new publisher. In addition, it would be good to see the percentage of movement in a given deal in order to assess the stability of a package – the downtime would be good too – but we may be asking a little too much with that one!!

Finally, we’d like to thank Liam Earney at JISC Collections (and our project team!) who has already taken some of these points forward 🙂

The Acquisitions workflow, Intota and Dawson Books

We recently met with Dawson Books to discuss our current acquisitions workflows (see our workflows blog posts), our ideal workflows and their implications for Intota, the Acquisitions team at Huddersfield and Dawson Books.

We began by discussing the information that would need to be provided by Dawson Books via Intota in order for us to make an informed decision on the items we would be purchasing. The details that we would ideally like to be visible are the:

  • format of the item
  • supplier
  • estimated delivery date
  • price

The price we would like to see would not be the list price; it would need to be the overall cost of the item including servicing, delivery, VAT and discount. However, at a later stage we would also want to see the price broken down, in order to assign the costs to different budgets: at Huddersfield we pay for shelf-ready processing out of a different budget to the actual book itself.
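As an illustration of the overall cost versus the list price, a small sketch. The single VAT rate applied across the whole total is a simplification (in practice printed books are zero-rated while servicing and e-books attract VAT), and the function and its parameters are our invention, not Dawson’s or Intota’s.

```python
# Illustrative 'landed' price: discounted list price plus servicing and
# delivery, with a single VAT rate applied for simplicity. Real UK VAT
# treatment differs between printed books, e-books and servicing.

def landed_price(list_price, discount_pct, servicing, delivery, vat_rate):
    """Overall cost of one item as a library would want it displayed.

    discount_pct: supplier discount as a percentage (e.g. 10 for 10%).
    vat_rate: fractional rate applied to the whole total (e.g. 0.2).
    """
    net = list_price * (1 - discount_pct / 100)
    return round((net + servicing + delivery) * (1 + vat_rate), 2)
```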

For e-books the following additional information would be required:

  • licence information
  • access criteria (how many users can have access, other options for more users)
  • purchase model (outright, credit-based, availability as a subscription through a collection)

Dawson Books confirmed that they would be able to supply all of this information; however, Intota would need to find a standard way of displaying the data from all of the different suppliers.

National Book Contract

Adherence to the National Book Contract was raised as an issue at this stage, as we would only want to see information about the suppliers we hold a contract with. Therefore it was suggested that Intota would need a series of default settings, amendable by libraries, to ensure that if the book is available from any of the chosen suppliers they are shown immediately, and the search is only widened if there are no results from the chosen suppliers or the library manually chooses to widen it.
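The default-and-widen behaviour described above can be sketched in a few lines; the data shapes and names are assumptions for illustration, not Intota’s actual settings model:

```python
# Illustrative supplier filtering: show only contracted suppliers' offers
# by default, and fall back to the full list when they have no stock or
# the library explicitly widens the search.

def offers_to_show(all_offers, preferred_suppliers, widen=False):
    """all_offers: list of dicts each with a 'supplier' key.
    preferred_suppliers: set of supplier names under contract."""
    preferred = [o for o in all_offers if o["supplier"] in preferred_suppliers]
    if preferred and not widen:
        return preferred
    return all_offers
```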


We then looked at the ordering process; one suggestion was to automate the whole process, perhaps creating a profile for each Librarian defaulting to specific loan types, etc. Unfortunately this would not be possible for individual orders at Huddersfield because there are too many variations. However, it is hoped that once our MyReading project is fully developed there will be formulae which automatically create orders rather than going to the subject teams for approval, thus making efficiencies in the workflow (see the previous blog post Patron-Driven Acquisition for more information).

Book reports

Book reports are currently supplied to us by Dawson Books via EDI or email and, with the exception of cancellations, very little is done with them. Dawson Books thought it would be possible to supply this information directly to Intota via a feed. It was then suggested that these feeds could appear on a dashboard on an individual’s homepage, alongside reports from all the suppliers. Due to the number of reports received, it was decided that this would need to be customised so that reports only go to the relevant people; this led to the suggestion that reports from all suppliers could be routed directly to the relevant people based on the fund codes the items/resources are paid from. Further to this, it was suggested that the reports would need to go to more than one person, as otherwise there could be problems during periods of absence.

While discussing the idea of a dashboard on individuals’ homepages, it was also suggested that a general overview could be presented at login: for example, a graph could show projected spend against actual spend, with the amounts left in the budget, committed and spent visible immediately when the individual logs on. It was thought that making this information more visible may make the budget easier to manage and may help stabilise spend throughout the year. In addition, longer reports would need to be exportable as a CSV file from Intota so that we could use them for other purposes if required; custom and standard reports would also need to be displayed on screen in HTML (or another appropriate format) if a quick check of the figures was all that was required.
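The routing of reports by fund code might look something like this in data terms – a sketch only, with invented field names, including the suggestion that each fund code maps to more than one person to cover absences:

```python
# Illustrative routing of supplier reports to staff inboxes by fund code.

def route_reports(reports, fund_owners):
    """reports: list of dicts each with a 'fund_code' key.
    fund_owners: maps a fund code to a list of recipients, so a report
    can reach more than one person to cover periods of absence."""
    inbox = {}
    for report in reports:
        for person in fund_owners.get(report["fund_code"], []):
            inbox.setdefault(person, []).append(report)
    return inbox
```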

MARC records

The discussion then turned to the importing of MARC records. At Huddersfield we currently import our MARC records when we receive the books on to the system and pay Dawson Books. However, it is thought that with Intota we will be able to pull the records from the cloud at the time of order. This will potentially save HEIs money, as we will no longer be paying the supplier for the records: access to them will be part of the subscription to Intota. Dawson Books, however, highlighted the possibility of poor records. Although it was acknowledged that the majority of records would be of a high standard, it was brought to our attention that books purchased pre-publication, and e-books, often have poor quality records. Something for Serials Solutions to consider when developing Intota?


This naturally turned our attention towards e-books; it currently takes around 48 hours after ordering for the record and link to become available on the catalogue. With users expecting instantaneous access to e-books through Amazon and iTunes, this immediate access, alongside real-time invoicing, must be available to HE institutions through Intota. Dawson Books explained that one of the main reasons for the delay in access to the book was the creation of the catalogue record, upon which it was suggested that with the implementation of Intota such a record may not be needed. If the front end of Intota is Summon, surely this will remove the need for a catalogue record, as the title will only need to be switched on in the Knowledge Base to be searchable through Summon?

Payments and our financial system

At Huddersfield we have moved to credit card payment for book orders rather than the traditional purchase order route – we would be interested to know if anyone else does this? We had a discussion about the interaction that would be required between us, Dawson Books, Intota and our financial system, Agresso, in order for the process to run smoothly and create efficiencies.

It is hoped that Intota will be able to work with Agresso and that, during the ordering process, we will be able to assign a Cost Centre and Nominal to each purchase, which will then be visible in both Intota and Agresso. Following on from this, we wondered if, after all the items on a delivery note/invoice had been received on Intota, either manually or via RFID, it would be possible to send a notification to Dawson Books via Intota instructing them to take payment from our credit card. If each credit card transaction pulled in by Agresso could retain the Cost Centre and Nominal information input during the ordering process, it would fully automate the payment system and significantly reduce the amount of work. However, although we use the credit card to pay, it was acknowledged that other HEIs use purchase orders; therefore we turned our discussion to the purchase order workflow. Similarly, it was thought that if a notification could be sent to Agresso to mark the items on a purchase order as delivered, after they had been received on Intota, then once the invoice was input into the system by the Finance department, Agresso could immediately pay the invoice by BACS.

Out of print books

It was agreed that a more efficient way of ordering and supplying out of print items must be found. There are a number of issues surrounding the order and supply of out of print items through the library’s approved book suppliers, such as:

  • out of print items not always being listed on the book supplier’s database, even though in a number of cases they are able to obtain them
  • if the items do appear, there can be inadequate information and sometimes no price
  • the cost of these items can be considerably higher
  • the speed of supply can often be a lot slower
  • the current process generally leads to confusion amongst the subject teams as to the best place to obtain the item from, which often means lots of emails/phone calls between the subject teams, the Acquisitions team and Dawson Books.

One suggestion was a two-tiered ordering system whereby, if we were to find a copy with an out of print distributor through Intota (via an out of print supplier option?), we could select to purchase that option but then request that it be processed by Dawson Books. While it was agreed that this was a good idea, we were unsure how it would work in practice – for example, how would Dawson Books receive and purchase the item from the out of print distributor? – and agreed that further discussion would be needed if we were to pursue this.

Additional features

Finally, we looked at some additional features that would be helpful for HEIs and whether it would be possible to incorporate these into Intota direct from the book suppliers. The first item mentioned was the possibility of being able to see if an item you are considering is already on order, has been previously supplied or is currently sitting in a basket for your institution. Dawson Books confirmed that this information could be supplied to Intota and that Serials Solutions would need to find a way of displaying it.
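The kind of check we have in mind might look something like the following sketch. It assumes a hypothetical feed of order data from the book supplier (the sets of ISBNs below are invented); exactly what data Dawson Books could supply, and how Serials Solutions would display it, remains to be worked out:

```python
# Hypothetical snapshot of supplier-held data for one institution.
on_order = {"9780000000001"}
previously_supplied = {"9780000000002"}
in_basket = {"9780000000003"}


def order_status(isbn: str) -> str:
    """Return a warning status for an item being considered for purchase,
    checked in order of likely importance to the selector."""
    if isbn in on_order:
        return "already on order"
    if isbn in previously_supplied:
        return "previously supplied"
    if isbn in in_basket:
        return "already in a basket"
    return "not held"


print(order_status("9780000000002"))  # previously supplied
```

A warning of this sort shown at the point of selection would help prevent accidental duplicate orders across subject teams.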

On a similar theme, the possibility of a reporting feature that could remember what you had searched for was discussed and thought to be advantageous. For example, if you had looked for a number of e-books within a collection (publisher or aggregated), it was hoped that the system would be able to notify you and recommend that you purchase the collection. Following on from this, the possibility of e-books working like JISC Collections database deals was considered, whereby if you buy a database and part way through the subscription it becomes part of a national deal, you are credited with the difference against the old subscription. Could e-books and national deals work in a similar way – could you get a discount from a nationally negotiated e-books deal if you had already purchased a number of titles outright, and could Intota provide this information at the point of need – possibly listing those national deals in the drop-down of suppliers? Something for KB+ and JISC Collections to look into further, perhaps?

ONIX-PL, JISC Collections and 360 Resource Manager

Over the past few weeks we have been adding our data and licences to KB+ – and blogging about our experiences. Our thoughts are now turning to how we get the data into both KB+ and 360 Resource Manager; ideally we only want to load the data once. This is not such a big deal for the actual holdings – in fact we plan to load some of the information into KB+ and some into 360 Resource Manager and import between the two in order to get the most efficient workflow – but that is a whole new blogpost…

A stumbling block in getting our head around the best way to test and use the two systems has been the way we go about adding licences. We are loading them into KB+ to add to the ones already there, but how do we get those into 360 Resource Manager?

Our partners in the HIKE project, JISC Collections, have been working hard in recent years to create ONIX-PL expressions of licences for e-journals, databases and archives.

ONIX-PL was developed with the needs of the academic library community in mind, since it was hoped that library and ERM systems vendors would adopt the standard as a way of improving the provision of licence information available through their products and services.

JISC Collections also recently developed the JISC Electronic Licence Comparison and Analysis Tool (ELCAT), which includes all of the 170 past and present licences that were created in ONIX-PL and makes them available for review, comparison and download. We blogged about a possible extension to this last week.

The trouble was that JISC Collections were way ahead of the game and the ERM systems couldn’t receive the licences in the ONIX-PL format.

So we are delighted that JISC Collections today announced that after a great deal of work with Serials Solutions to map ONIX-PL licence expressions to 360 Resource Manager, all JISC Collections licence agreements will be put to the use they were originally intended for – populating an ERM and allowing us to access the data where we need it!

We are now looking forward to testing this as part of the HIKE project.

Lorraine Estelle, CEO of JISC Collections states, “This is an important milestone for our work with ONIX-PL. We have always felt that the full value of ONIX-PL will only be realised once it has been adopted by a significant number of publishers and systems vendors. We hope that the work undertaken by Serials Solutions will act as a spur to other systems vendors and publishers to work with us to adopt this standard and include high quality licence information in their products.”

Mark Bide, Executive Director of EDItEUR, added “with the inclusion of JISC Collections licence data in 360 Resource Manager, we are starting to see the potential of ONIX-PL to improve the availability and quality of licence information throughout the supply chain. As EDItEUR begins a review of ONIX-PL we look forward to working with JISC Collections, academic libraries, publishers and systems vendors to build on this work and improve the usability of ONIX-PL.”

Thanks to Liam Earney for permission to re-hash his press release!

For further information about:

Serials Solutions and 360 Resource Manager go to http://www.serialssolutions.com

JISC elcat go to http://www.jisc-collections.ac.uk/Librarian-Tools/ElCat

Knowledge Base+ go to http://www.kbplus.ac.uk

EDItEUR go to http://www.editeur.org/2/About/

KB+ Feedback – 3

While inputting all of the licences at Huddersfield, it has come to our attention that sometimes, where we have locally agreed licences for content outside of a NESLi2 deal, there is also a NESLi2 licence with the publisher. An example would be a publisher where we have an individual e-only title and associated licence – in this case we would never enter a NESLi2 deal for the whole package.

In these cases it would be beneficial to be able to compare the two licences. If we found that NESLi2 had negotiated a better deal than our individual agreement (e.g. walk-in users, or unlimited rather than simultaneous users), it would enable us to try to negotiate a better deal at renewal. Now we could just do that anyway – we can easily view the NESLi2 licence at the JISC Collections website – but wouldn’t it be great if we could use something like ELCAT (Electronic Licence Comparison & Analysis Tool) to do the comparison for us? At the moment it only allows the comparison of nationally agreed licences in ONIX-PL format. Therefore we thought it would be useful to have a comparison tool within KB+ (or as an extension of ELCAT), which would allow us to compare any licence on KB+, whether nationally or individually agreed, in any format, with another.

And of course it would allow JISC Collections to see if we had a better deal than NESLi2!
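A comparison tool of the sort we are imagining could, at its simplest, line up the terms of the two licences side by side and flag the differences. This sketch uses entirely made-up field names and values for illustration – real licence terms are far messier, which is exactly why structured formats like ONIX-PL matter:

```python
# Hypothetical licence terms, reduced to simple key/value pairs.
our_licence = {
    "walk_in_users": "not permitted",
    "concurrent_users": "5 simultaneous users",
}
nesli2_licence = {
    "walk_in_users": "permitted",
    "concurrent_users": "unlimited",
}


def compare_licences(a: dict, b: dict) -> dict:
    """Pair up each term across the two licences so that differences
    are easy to spot at renewal time."""
    terms = sorted(set(a) | set(b))
    return {t: (a.get(t, "not stated"), b.get(t, "not stated")) for t in terms}


for term, (ours, national) in compare_licences(our_licence, nesli2_licence).items():
    print(f"{term}: ours = {ours!r}, NESLi2 = {national!r}")
```

Even this crude pairing would surface the cases where the nationally negotiated terms beat a local agreement, which is the information we would want in hand before a renewal conversation.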

KB+ Feedback – 2

While inputting the Huddersfield data into KB+ there were a couple of areas where we felt unsure as to which data should be entered and which needed further explanation. The areas needing clarification were:

  • How to identify core journals – One of our first tasks was to identify the individual journal titles that form our core subscription. However, we were unsure as to whether we should mark titles we have previously subscribed to but have now cancelled as core and there was no guidance within KB+.
  • Journal title start date – Similarly, when identifying and editing the core journal titles information we were unsure as to whether the start date of the individual titles should be the earliest year the core subscription gives access to (e.g. 2005) or whether it should be the first year we have access to the journal (e.g. 1996, because we have access through the NESLi2 package).

Although these are both small points, it is important that all parties have a clear understanding of what the data is, because the aim of KB+ is to reduce the time and cost of managing Electronic Resource Management data by having its management and maintenance done centrally and shared or co-ordinated across the HE academic library community. By working to agreed standards and definitions it will be easier to share and understand information across the library community, because the data will be understood by all.

KB+ Feedback – 1

Over the last couple of weeks we have slowly been populating the KB+ database with the subscription and licence information relevant to the University of Huddersfield. While doing this we have come up with a couple of points of feedback for the KB+ team.

Possible link between the subscription and the related licence information and .pdf of the licence

While linking the Wiley Online Library Full Collection subscription with its relevant licence we realised that, once you have gone into the actual subscription page and can see the journal title entitlements, there is no direct link to the licence in its PDF format or to the licence properties.

The screenshot shows the page where the entitlements of an individual subscription are shown and where there is no link to the relevant licence.

Therefore, on the page shown above in the screenshot, perhaps it would be possible to have a link to the licence properties and the PDF of the licence that relates to the subscription. We think this would benefit users as they would easily be able to look up an individual journal and link to the licence that it is controlled by. Although there is a link from the licence to the subscription it controls, we also think a link the other way would be beneficial, as there is a chance users will know the journal that they would like to check access rights, etc. for but will be unsure of the licence it is controlled by.

Search-bar on the subscriptions home page

We have recently been searching for journal titles to identify them within KB+ as our core titles and found it frustrating that, before we could look for the journal title we wanted, we had to know which subscription package it was part of. Once we knew which subscription the journal title was part of, the search was fairly quick as we could use the search/filter bar (see screenshot below) on the individual subscription page. A search bar on the subscriptions home page itself, covering all titles, would remove this extra step.

Change the default setting for the individual subscription homepage

While populating KB+ with data we found it strange that when you go in to an individual subscription it automatically hides the subscription detail. This is quite important, as it is here that you can see which licence the subscription is controlled by.

Therefore we wondered if it would be possible to change the default setting of this subscription homepage so that the subscription detail is automatically open and the details are visible. Perhaps there could also be a link that jumps you down to the titles information, so that if you do not want to see the subscription detail you could jump straight to the individual titles?