Answers to popular questions about the impact case studies.

On the questionnaire, question 1 has a check-box answer that allows more than one answer to be selected. Yet question 2 starts “from that single use…” Won’t this cause confusion?

The questions have been tested on 214 library users and this did not appear to cause confusion. However, we will keep this in mind and review it in the future.

Aren’t we just cherry-picking good stories for marketing purposes?

This was a real concern for the TaF, which is why we spent so long looking at the existing research. The evidence has identified these impacts, so the quantitative questionnaire will identify how often these known impacts occur.

This gives a context for the qualitative case studies, and for the activity data that is collected nationally. It is true that if an interview or case study finds a negative impact we wouldn’t select it for advocacy or marketing, but we could use it for service development or “lessons learned”.

Why is there no option for comments, or “other”? Isn’t it all a bit leading?

We spent lots of time debating this. A key request from library staff was for a short, practical questionnaire. Some libraries don’t collect impact evidence because they don’t have time to handle the responses. The questionnaire therefore doesn’t collect any free text.

This is why we spent so long looking at the existing research. The evidence has identified these impacts (and no others that can’t be mapped to them), so the quantitative questionnaire will identify how often known impacts occur. This gives a context for the qualitative case studies, and for the activity data that is collected nationally.

How long should libraries wait for impacts to happen, and when should services send out the questionnaire?

This is very much a local decision, as it depends on the circumstances and the particular impact of interest. A recent study (Brettle et al 2016) sent out the first questionnaire immediately following the contact, then a reminder six weeks later.

Can libraries use their own questions or tools as well?

We have been quite rigorous in looking at the existing evidence and existing tools, and have striven to ensure that the data collected maps to the current evidence base regarding health library impact.

In the past, slightly different wording or outcome categories have made comparison between studies difficult. This has been compounded by small sample sizes, suggesting that the evidence base is weak.

Using the questions as they stand will make comparison and collation of data possible, and allow us to develop the evidence base at a national level. Libraries can add more specific questions (or develop their own), but ideally these should be selected from existing tools (such as the others listed in the toolkit) to enable comparison with other studies and to help us build the evidence base at local and national levels.

How will the data from the core questions be collated?

Based on experience piloting the questionnaire, we are currently looking at regional or patch collection of the quantitative data using a shared survey software account. The process will be tried in two areas (Kent, Surrey and Sussex, and Yorkshire and Humber). Some individual libraries are also trying out the questionnaire locally.

How will the case studies be collated?

Case studies can be submitted by knowledge and library services at any point during the year.

What happens after they are submitted?

You should receive a confirmation message from the blog stating that your case study has been submitted. After this, the case studies are assessed by a panel made up of KLS staff from across the country, who check each case study against the checklist provided on the blog.

If we are unable to accept the case study for any reason, you will receive a message detailing the reasons for this. Successful case studies are uploaded to the public-facing area of the blog, and Knowledge for Healthcare leads are notified of examples added from their regions.

As case studies are generally assessed on a monthly basis, you should allow up to 10 weeks following submission before you hear anything.

Will the interview record sheets be collated?

No. They are really for best practice. The results could then be fed into case studies, which is what we need at national level for K4H. Ideally, the in-depth interviews would be used in a robust research study so the results could be published. This research needs to continue in order to identify new impacts, or changes in what library clients value.

My library clients work in local government and their role isn’t included in the list for question 5.

You could add locally relevant options. We did consider using “other”, but this is best avoided as free text complicates the data crunching.

Are there any plans to develop tools to capture impact from Focus Groups?

The Impact Interviews templates could be adapted to work for Focus Groups. If you decide to go down this route, please share with us any modified tools you create and your experiences too.

Is any guidance available to help capture consent?
How do I obtain consent to record impact interviews with colleagues?

The interview consent form has been updated (August 2017) to allow for the capture of consent specifically for recording of interviews.

I am not including quotes from named individuals in my case study. Why do I have to tick the button to say I have consent?

The form is set up on the assumption that personally attributable data is included, so if this is not the case with your case study then obviously you don't need to get consent from any third parties.

Just check that if you are quoting anyone without actually giving their name, their identity cannot be traced in other ways (e.g. through job titles such as "Head of Nursing").

If all the information contained in the case study is from you/library staff, then you can read this as you/library staff giving consent for us to use it.

Can I still collect impact data now that the General Data Protection Regulation (GDPR) is in place?

We sought advice on this and can confirm that GDPR should not stop services collecting impact information relating to their service.

Collection of impact data is an inherent part of the library service itself. If you receive the services of the library, then you are asked about their impact.

The Quality Improvement Framework makes it clear that gathering impact data is an essential element of an NHS Library and Knowledge service. Hence, provision of the service and collection of impact data pertaining to it are two parts of a single continuum of service.

If you feel that impact data collection is a distinct activity in its own right, and cannot be justified as above, then legitimate interest would be a reasonable basis for processing the data.

However: 

  • if recipients actively object to any impact information requests then they should not be approached in the future
  • to avoid confusion any requests for impact data should not be sent alongside promotional or marketing information

Page last reviewed: 15 June 2021