These are some of the books, reports, articles and blogposts that we found most helpful in undertaking the UK’s metrics review. Please feel free to send us additional suggestions.
This is not intended to be a comprehensive list, and many more sources can be found in our Literature Review.
Books
Cronin, B. & Sugimoto, C. R. (eds.) (2014) Beyond Bibliometrics: Harnessing Multidimensional Indicators of Scholarly Impact (MIT Press)
http://www.amazon.co.uk/dp/0262525518
Priem, J. (2014) ‘Altmetrics’ (Open access chapter from Beyond Bibliometrics: Harnessing Multidimensional Indicators of Scholarly Impact)
http://arxiv.org/abs/1507.01328
Dahler-Larsen, P. (2012). The Evaluation Society. Stanford University Press
http://www.amazon.co.uk/gp/product/B010CLCKLI
Hazelkorn, E. (2015) Rankings and the Reshaping of Higher Education: The Battle for World-Class Excellence (2nd edition). Palgrave.
http://www.amazon.co.uk/dp/1137446668
Porter, T. M. (1995) Trust in Numbers: The Pursuit of Objectivity in Science and Public Life. Princeton University Press.
http://www.amazon.co.uk/dp/0691029083
Vinkler, P. (2010) The evaluation of research by scientometric indicators. Oxford, Chandos Publishing.
http://www.amazon.co.uk/gp/product/1843345722
Reports
Council of Canadian Academies. (2012). Informing Research Choices: Indicators and Judgment. Retrieved 6 December 2014.
www.scienceadvice.ca/uploads/eng/assessments%20and%20publications%20and%20news%20releases/science%20performance/scienceperformance_fullreport_en_web.pdf
NISO Altmetrics Standards Project White Paper
http://www.niso.org/apps/group_public/document.php?document_id=13295&wg_abbrev=altmetrics
US Research Universities Futures Consortium (2013). The current state and recommendations for meaningful academic research metrics among American research universities.
www.researchuniversitiesfutures.org/us-research-metrics-working-group-current-state-and-recommendations-oct2013.pdf
Technopolis. (2014). Measuring scientific performance for improved policy making.
www.europarl.europa.eu/RegData/etudes/etudes/join/2014/527383/IPOL-JOIN_ET(2014)527383(SUM01)_EN.pdf
Research Information Network (2015) Scholarly communication and peer review: The current landscape and future trends. A report commissioned by the Wellcome Trust.
www.wellcome.ac.uk/stellent/groups/corporatesite/@policy_communications/documents/web_document/wtp059003.pdf
Nuffield Council on Bioethics. (2014). The culture of scientific research in the UK.
http://nuffieldbioethics.org/wp-content/uploads/Nuffield_research_culture_full_report_web.pdf
European Commission (2014) ‘Science 2.0’ consultation
http://ec.europa.eu/research/consultations/science-2.0/background.pdf
Small Advanced Economies Initiative (2014) Broadening the Scope of Impact: Defining, assessing and measuring impact of major public research programmes, with lessons from 6 small advanced economies. SAEI.
www.smalladvancedeconomies.org/wp-content/uploads/SAEI_Impact-Framework_Feb_2015_Issue2.pdf
King’s College London and Digital Science. (2015). The nature, scale and beneficiaries of research impact: An initial analysis of Research Excellence Framework (REF) 2014 impact case studies.
www.hefce.ac.uk/media/HEFCE,2014/Content/Pubs/Independentresearch/2015/Analysis,of,REF,impact/Analysis_of_REF_impact.pdf
Crossick, G. (2015) Monographs and Open Access: A report to HEFCE.
www.hefce.ac.uk/pubs/rereports/Year/2015/monographs/
Montgomery, L. (2013). Metrics challenges for monographs.
www.knowledgeunlatched.org/2013/04/metrics-challenges-for-monographs/
Royal Society. (2012) Science as an Open Enterprise. The Royal Society Science Policy Centre report 02/12
https://royalsociety.org/~/media/policy/projects/sape/2012-06-20-saoe.pdf
Articles
Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., and Rafols, I. (2015) ‘Bibliometrics: The Leiden Manifesto for research metrics.’ Nature. 520, 429-431.
www.nature.com/news/bibliometrics-the-leiden-manifesto-for-research-metrics-1.17351
Lawrence, P.A. (2007) ‘The mismeasurement of science’. Current Biology. 17 (15), R583–R585.
http://www.cell.com/current-biology/references/S0960-9822(07)01516-3
Bilder, G., Lin, J. and Neylon, C. (2015). Principles for Open Scholarly Infrastructure-v1
http://dx.doi.org/10.6084/m9.figshare.1314859
http://cameronneylon.net/blog/principles-for-open-scholarly-infrastructures/
Costas, R., Zahedi, Z. and Wouters, P. (2014) ‘Do altmetrics correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective.’ arXiv preprint arXiv:1401.4321
http://arxiv.org/abs/1401.4321
Thelwall, M., Haustein, S., Larivière, V. and Sugimoto, C. (2013). ‘Do altmetrics work? Twitter and ten other candidates.’ PLOS ONE, 8(5), e64841. DOI:10.1371/journal.pone.0064841
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0064841
Rafols, I., Leydesdorff, L., O’Hare, A., Nightingale, P., and Stirling, A. (2012). How journal rankings can suppress interdisciplinary research: A comparison between innovation studies and business & management. Research Policy. 41 (7), 1262-1282.
https://www.researchgate.net/profile/Andy_Stirling/publication/254200906_How_journal_rankings_can_suppress_interdisciplinary_research_A_comparison_between_Innovation_Studies_and_Business__Management/links/0a85e538743ba4e319000000.pdf
Priem, J., Taraborelli, D., Groth, P. and Neylon, C. (2010) Altmetrics: a manifesto.
http://altmetrics.org/manifesto
Halevi, G. and Colledge, L. (2014). Standardizing research metrics and indicators – perspectives and approaches. Research Trends. 39.
www.researchtrends.com/issue-39-december-2014/standardizing-research-metrics-and-indicators/
Adams, J. and Gurney, K. (2014). Evidence for excellence: Has the signal overtaken the substance? An analysis of journal articles submitted to the RAE2008. Digital Science.
www.digital-science.com/resources/digital-research-report-evidence-for-excellence-has-the-signal-overtaken-the-substance/
Bandrowski, A., Brush, M., Grethe, J.S. et al. (2015). The Resource Identification Initiative: A cultural shift in publishing [v1; ref status: awaiting peer review]. F1000Research, 4:134. DOI:10.12688/f1000research.6555.1
http://f1000r.es/5fj
Moed, H. and Halevi, G. (2014). Research assessment: Review of methodologies and approaches. Research Trends. 36, March 2014.
www.researchtrends.com/issue-36-march-2014/research-assessment/
Neylon, C. and Wu, S. (2009). Article-level metrics and the evolution of scientific impact. PLOS Biol 7(11). DOI: 10.1371/journal.pbio.1000242
http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.1000242
Lee, C. J., Sugimoto, C. R., Zhang, G., and Cronin, B. (2013). Bias in peer review. Journal of the American Society for Information Science and Technology. 64 (1), 2–17. DOI:10.1002/asi.22784.
http://onlinelibrary.wiley.com/doi/10.1002/asi.22784/epdf
Larivière, V., Ni, C., Gingras, Y., Cronin, B. and Sugimoto, C. R. (2013) ‘Bibliometrics: Global gender disparities in science’. Nature. 504, 211-213.
www.nature.com/news/bibliometrics-global-gender-disparities-in-science-1.14321
Moss-Racusin, C. A., Dovidio, J. F., Brescoll, V. L., Graham, M. J. and Handelsman, J. (2012) ‘Science faculty’s subtle gender biases favor male students’. PNAS vol.109, no.41, pp.16474-16479.
www.pnas.org/content/109/41/16474.full.pdf
Donovan, C. (2007). The qualitative future of research evaluation. Science and Public Policy. 34 (8), 585–597. DOI:10.3152/030234207X256538
Derrick, G. E., and Pavone, V. (2013). Democratising research evaluation: Achieving greater public engagement with bibliometrics-informed peer review. Science and Public Policy. 40 (5), 563–575. DOI:10.1093/scipol/sct007.
Nightingale, P., and Scott, A. (2007). Peer review and the relevance gap: ten suggestions for policy-makers. Science and Public Policy. 34 (8), 543–553. DOI:10.3152/030234207X254396.
Priem, J., Piwowar, H., and Hemminger, B. (2012). ‘Altmetrics in the wild: Using social media to explore scholarly impact.’
http://arXiv.org/html/1203.4745v1
Russell, R. (2012). Adoption of CERIF in Higher Education Institutions in the UK: A landscape study.
www.ukoln.ac.uk/isc/reports/cerif-landscape-study-2012/CERIF-UK-landscape-report-v1.0.pdf
Blogposts
LSE Impact Blog (2014) ‘Reading list for the HEFCE Metrics Review’
http://blogs.lse.ac.uk/impactofsocialsciences/2014/04/03/reading-list-for-hefcemetrics/
Bishop, D. (2013) ‘An alternative to REF’
http://deevybee.blogspot.co.uk/2013/01/an-alternative-to-ref2014.html
Bishop, D. (2014) ‘Some thoughts on use of metrics in university assessment’
http://deevybee.blogspot.co.uk/2014/10/some-thoughts-on-use-of-metrics-in.html
Curry, S. (2014). ‘Debating the role of metrics in research assessment.’
http://occamstypewriter.org/scurry/2014/10/07/debating-the-role-of-metrics-in-research-assessment/
Curry, S. (2012). ‘Sick of impact factors.’ Reciprocal Space blog.
http://occamstypewriter.org/scurry/2012/08/13/sick-of-impact-factors/
Thelwall, M. (2014) ‘Five Ways We Could Use Altmetrics in the Next REF’
http://www.socialsciencespace.com/2014/11/five-ways-we-could-use-altmetrics-in-the-next-ref/
Colquhoun, D. (2014) ‘Should metrics be used to assess research performance? A submission to HEFCE’
www.dcscience.net/2014/06/18/should-metrics-be-used-to-assess-research-performance-a-submission-to-hefce/
Sabaratnam, M. and Kirby, P. (2014) ‘Why Metrics Cannot Measure Research Quality: A Response to the HEFCE Consultation’
http://thedisorderofthings.com/2014/06/16/why-metrics-cannot-measure-research-quality-a-response-to-the-hefce-consultation/
Martin, J. (2015) ‘Imagine there’s new metrics’
https://cubistcrystal.wordpress.com/2015/01/12/imagine-theres-new-metrics-its-easy-if-you-try/
Martin, J. (2015). ‘Merit and demerit’
https://cubistcrystal.wordpress.com/2015/06/06/merit-and-demerit/
Priego, E. (2015) ‘More on metrics for the arts and humanities’
https://epriego.wordpress.com/2015/01/16/hefcemetrics-more-on-metrics-for-the-arts-and-humanities/
Fuller, S. (2014) ‘Is the fear of metrics symptomatic of a deeper malaise?’ LSE Impact Blog
http://blogs.lse.ac.uk/impactofsocialsciences/2014/06/25/fear-of-metrics-hefce-fiefdoms/
Sayer, D. (2014) ‘Time to abandon the gold standard?’ LSE Impact Blog.
http://blogs.lse.ac.uk/impactofsocialsciences/2014/11/19/peer-review-metrics-ref-rank-hypocrisies-sayer/
Websites
HEFCE
http://www.hefce.ac.uk/
San Francisco Declaration on Research Assessment
www.ascb.org/dora
NISO Alternative Assessment Metrics Initiative
http://www.niso.org/topics/tl/altmetrics_initiative/
ORCID
http://orcid.org/
Snowball Metrics
http://www.snowballmetrics.com/
Centre for Science and Technology Studies (CWTS)
www.cwts.nl/Home