for your search in support of systematic reviews.

General points

Searching databases

The database searches must be broad in scope.

Search strategies must be highly sensitive (broad) so that no relevant research is missed. They should contain a combination of:

  • subject headings
  • free text search terms

Indexing terms (subject headings) are often exploded so that narrower terms in the hierarchy are also retrieved.

Look in the MeSH hierarchy to see which terms fall below your chosen term, as this will determine whether it is appropriate to explode your chosen term to include them.
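
For example, a single concept block in Ovid Medline might look like this (the heading and text words are illustrative only, and exact syntax varies between interfaces):

  1. exp Diabetes Mellitus/
  2. (diabet* or IDDM or NIDDM).ti,ab.
  3. 1 or 2

Line 1 is an exploded subject heading, which also retrieves narrower headings such as Diabetes Mellitus, Type 2/; line 2 searches free text terms in the title and abstract; line 3 combines the two approaches with OR.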

For free text search terms, take synonyms and plurals into consideration by truncating terms, and include spelling variations such as American and British English spellings.
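
In Ovid syntax, for instance, truncation and wildcards can be used along these lines (wildcard characters differ between platforms, so check the help pages of the interface you are using):

  • child* retrieves child, children, childhood, etc.
  • randomi#ed retrieves randomised and randomized (# stands in for exactly one character)
  • p?ediatric retrieves paediatric and pediatric (? stands in for zero or one character)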

Relevant floating subheadings and adjacency operators can also extend your coverage.

See the list of Medline and Embase subheadings.
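
Illustrative Ovid-style examples of both techniques:

  • dt.fs. searches the "drug therapy" subheading attached to any subject heading (a floating subheading)
  • (breast adj3 cancer*).ti,ab. finds "breast" within three words of "cancer" or "cancers", in either order, so it can also pick up phrases such as "cancer of the breast"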

Selecting search terms

Select a database in which to test your search terms.

This would normally be Medline as it has broad coverage.

If your topic is very specialised and there is a key database for it, you may wish to run a test search on that database as well.

Run searches for specific terms related to your topic and make a note of relevant search terms that arise.

Check the relevance of an indexing term by clicking on the term to see its scope note.

You may also find the “Used for” list within the scope note handy for identifying other text words to include in your strategy.
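
For instance, a quick test search along these lines (terms purely illustrative) can surface candidate headings and text words:

  1. (heart attack* or myocardial infarct*).ti,ab.
  2. Myocardial Infarction/

Browsing the indexing of a few clearly relevant records in set 1, together with the scope note and "Used for" entries for the heading in set 2, will often suggest further synonyms to add to the strategy.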

Text mining software can also help to identify relevant search terms. Useful tools include:

  • PubMed PubReMiner
  • VOSviewer
  • MeSH on Demand
  • Yale MeSH Analyzer
  • Medline Transpose
  • litsearchr
  • SWIFT-Review
  • Polyglot Search Translator
  • nowGlobe
  • citationchaser
  • RCT Tagger
  • Connected Papers

Further information is available on text mining sources.

Additional insights on using text mining software can be found in Investigation of text-mining methodologies to aid the construction of search strategies in systematic reviews of diagnostic test accuracy-a case study.

Comprehensiveness

The aim is to avoid missing any potentially relevant studies that address the review question, as missed studies could weaken or invalidate the findings.

The search should maximize the retrieval, or “recall”, of articles rather than the precision of the set retrieved.

It is therefore usual, when scanning through the search results, to see quite a high number of references that are off target.

From this broad set of results, the researcher selects the studies to be included in the review, discarding those that are not relevant during this process.

This approach helps to reduce the chance of important articles being missed. See 4.4.3 Sensitivity versus precision within the Cochrane Handbook.
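
As a rough guide, the Cochrane Handbook uses broadly these definitions:

  sensitivity (recall) = number of relevant records retrieved / total number of relevant records in existence
  precision = number of relevant records retrieved / total number of records retrieved

A highly sensitive search therefore tends to have low precision, which is why many of the retrieved records will be discarded at the screening stage.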

Language and publication bias

Language and publication bias can creep in when limiting the parameters of the search.

Language bias

To avoid language bias, results should not be limited to English.

Publication bias

To avoid publication bias, conference abstracts should not be excluded from the database results.

Conference abstracts give information on the latest work, which is yet to be formally published, and help to identify authors or research establishments worth contacting. They extend the search beyond what is available electronically.

Grey literature

In addition to following up on conference abstracts, grey literature should also be searched.

Helpful guidance on searching grey literature is available.

Population/Problem, Intervention, Comparison, Outcomes

Search strategies should ideally follow a structured framework such as the PICO (Population/Problem, Intervention, Comparison, Outcomes) structure.

However, if PICO is not a natural fit for the topic, other structures such as SPIDER and ECLIPS(E) are available.

For more on these, see Andrew Booth's Alternative Question Structures for Different types of systematic review.
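
As a minimal sketch, a PICO-shaped strategy outline in Ovid-style syntax could look like this (headings and terms purely illustrative; the bracketed concept labels are annotations, not search terms):

  1. exp Diabetes Mellitus, Type 2/ or (type 2 diabet* or T2DM).ti,ab.  [population]
  2. exp Exercise/ or (exercis* or physical activity).ti,ab.  [intervention]
  3. 1 and 2

The comparison and outcome concepts are often left out of the strategy altogether, since searching for them tends to reduce sensitivity.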

Chapter 4 of the Cochrane Handbook for Systematic Reviews of Interventions (Searching for and selecting studies) covers designing the search strategy.

Please note that chapter 4 is currently being updated as part of the 2023 version 6.4 of the handbook.

MECIR manual: Searching for studies (C24-C38).

The PRESS checklist (Peer Review of Electronic Search Strategies) can be useful as it lists the elements a robust search strategy should contain. See table 1.

Retraction searching

Prior to publication, a retraction search should be performed on eligible or included studies to ensure that there have been no post-publication amendments to those studies. Such amendments include expressions of concern, errata, corrigenda and retractions.
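
In Ovid Medline, for example, post-publication amendments can often be picked up via publication types, broadly along these lines (the exact publication type labels vary between databases and over time, so check the interface you are using):

  1. retracted publication.pt. or retraction of publication.pt.
  2. published erratum.pt. or expression of concern.pt.
  3. 1 or 2

The result can then be combined with a set containing the eligible or included studies (for example, searched by title, author or record ID).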

Guidance on how to include post-publication amendments in search strategies can be found in retraction searching sources.

Documentation and recording the searches

The key guidance for reporting the searches is PRISMA-S.

To be compliant you will need to do the following (an illustrative record for a single database search is sketched after this list):

  • document all strategies; the requester needs to provide details of these both in the review write-up and in the protocol submission
  • record the number of records retrieved for each database search prior to the removal of duplicates
  • save all strategies on the database servers
  • re-run the searches for the requester prior to completing and writing up the review, as several months will normally have passed during this process
  • for grey literature searches, keep a record of the search statements used if it is not possible to save them online
  • keep a back-up Word file of all the search strategies used for easy reference
  • write up the search methods for publication, as the paper will need to include a paragraph on the methods used to conduct the search
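
As a sketch, the record kept for a single database search might look like the following (the details shown are placeholders, not a required template):

  Database: Medline (Ovid), 1946 to present
  Date searched: [date]
  Searched by: [name]
  Strategy: [saved search name; full line-by-line strategy kept in the back-up file]
  Records retrieved: [number, before de-duplication]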

The article Exploring issues in the conduct of website searching and other online sources for systematic reviews: how can we be systematic? covers issues relating to searching and recording search activity on resources outside academic bibliographical databases.

It is vital that accurate records are kept to inform this paragraph. The paragraph should include:

  • where you searched
  • how you derived the search terms (e.g. whether you identified relevant terms from known relevant studies)
  • whether you used text mining software such as PubMed PubReMiner or VOSviewer
  • whether you asked a colleague to peer review your search

It is worth including a sentence stating that the search was developed and carried out by a librarian/information professional.
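
Purely as illustrative wording (names, databases and dates are placeholders): "The search strategies were developed and run by an information specialist on [date] in Medline (Ovid) and [other databases]. Search terms were derived from known relevant studies and from text mining tools (PubMed PubReMiner), and the final strategies were peer reviewed by a second information professional using the PRESS checklist. Full strategies are reported in Appendix [x]."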

Co-authorship may be appropriate where you have made a substantial contribution to the research paper.