
Diverse and Important Ways Evaluation can Support and Advance Citizen Science

Open Access | Jul 2022

References

  1. Adams, WC. 2015. Conducting semi-structured interviews. Handbook of Practical Program Evaluation, 4: 492–505. DOI: 10.1002/9781119171386.ch19
  2. Alkin, MC and King, JA. 2016. The historical development of evaluation use. American Journal of Evaluation, 37(4): 568–579. DOI: 10.1177/1098214016665164
  3. Alkin, MC and King, JA. 2017. Definitions of evaluation use and misuse, evaluation influence, and factors affecting use. American Journal of Evaluation, 38(3): 434–450. DOI: 10.1177/1098214017717015
  4. Becker-Klein, R, Davis, C, Phillips, T, DelBianco, V, Grack Nelson, A and Christian Ronning, E. (in review). Using a shared embedded assessment tool to understand participant skills: Processes and lessons learned. Citizen Science: Theory and Practice.
  5. Becker-Klein, R, Peterman, K and Stylinski, C. 2016. Embedded assessment as an essential method for understanding public engagement in citizen science. Citizen Science: Theory and Practice, 1(1). DOI: 10.5334/cstp.15
  6. Bequette, M, Cardiel, CL, Cohn, S, Kollmann, EK and Lawrenz, F. 2019. Evaluation capacity building for informal STEM education: Working for success across the field. New Directions for Evaluation, 161: 107–123. DOI: 10.1002/ev.20351
  7. Bowser, A, Cooper, C, De Sherbinin, A, Wiggins, A, Brenton, P, Chuang, T-R, Faustman, E, Haklay, M and Meloche, M. 2020. Still in need of norms: The state of the data in citizen science. Citizen Science: Theory and Practice, 5(1). DOI: 10.5334/cstp.303
  8. Brandon, PR and Fukunaga, LL. 2014. The state of the empirical research literature on stakeholder involvement in program evaluation. American Journal of Evaluation, 35(1): 26–44. DOI: 10.1177/1098214013503699
  9. Brown, ED and Williams, BK. 2019. The potential for citizen science to produce reliable and useful information in ecology. Conservation Biology, 33(3): 561–569. DOI: 10.1111/cobi.13223
  10. Bundi, P, Frey, K and Widmer, T. 2021. Does evaluation quality enhance evaluation use? Evidence & Policy: A Journal of Research, Debate and Practice, 17(4): 661–687. DOI: 10.1332/174426421X16141794148067
  11. Burgess, HK, DeBey, LB, Froehlich, HE, Schmidt, N, Theobald, EJ, Ettinger, AK, HilleRisLambers, J, Tewksbury, J and Parrish, JK. 2017. The science of citizen science: Exploring barriers to use as a primary research tool. Biological Conservation, 208: 113–120. DOI: 10.1016/j.biocon.2016.05.014
  12. Cousins, JB. 2020. Collaborative approaches to evaluation: Principles in use (Vol. 3). Sage Publications. DOI: 10.4135/9781544344669
  13. Cousins, JB and Chouinard, JA. 2012. Participatory evaluation up close: An integration of research-based knowledge. North Carolina: Information Age Publishing, Inc.
  14. D’Ostie-Racine, L, Dagenais, C and Ridde, V. 2016. A qualitative case study of evaluation use in the context of a collaborative program evaluation strategy in Burkina Faso. Health Research Policy and Systems, 14(37). DOI: 10.1186/s12961-016-0109-0
  15. Daigneault, PM. 2014. Taking stock of four decades of quantitative research on stakeholder participation and evaluation use: A systematic map. Evaluation and Program Planning, 45: 171–181. DOI: 10.1016/j.evalprogplan.2014.04.003
  16. Froncek, B and Rohmann, A. 2019. “You get the great feeling that you’re being heard but in the end you realize that things will be done differently and in others’ favor”: An experimental investigation of negative effects of participation in evaluation. American Journal of Evaluation, 40(1): 19–34. DOI: 10.1177/1098214018813447
  17. Grack Nelson, A, King, JA, Lawrenz, F, Reich, C, Bequette, M, Pattison, S, Kollmann, EK, et al. 2019. Using a complex adaptive systems perspective to illuminate the concept of evaluation capacity building in a network. American Journal of Evaluation, 40(2): 214–230. DOI: 10.1177/1098214018773877
  18. King, JA and Alkin, MC. 2019. The centrality of use: Theories of evaluation use and influence and thoughts on the first 50 years of use research. American Journal of Evaluation, 40(3): 431–458. DOI: 10.1177/1098214018796328
  19. King, JA and Stevahn, L. 2012. Interactive evaluation practice: Mastering the interpersonal dynamics of program evaluation. Los Angeles, CA: Sage Publications. DOI: 10.4135/9781452269979
  20. Miles, MB, Huberman, AM and Saldaña, J. 2018. Qualitative data analysis: A methods sourcebook. Los Angeles, CA: Sage Publications.
  21. National Academies of Sciences, Engineering, and Medicine. 2018. Learning Through Citizen Science: Enhancing Opportunities by Design. Washington, DC: The National Academies Press. DOI: 10.17226/25183
  22. Patton, MQ. 1997. Toward distinguishing empowerment evaluation and placing it in a larger context. Evaluation Practice, 18(2): 147–163. DOI: 10.1016/S0886-1633(97)90019-2
  23. Patton, MQ. 2020. Evaluation use theory, practice, and future research: Reflections on the Alkin and King AJE series. American Journal of Evaluation, 41(4): 581–602. DOI: 10.1177/1098214020919498
  24. Peterman, K, Becker-Klein, R, Stylinski, C and Grack Nelson, A. 2017. Exploring embedded assessment to document scientific inquiry skills within citizen science. In: Herodotou, C, Sharples, M and Scanlon, E (eds.), Citizen Inquiry: Synthesising Science and Inquiry Learning, 63–82. London: Routledge. DOI: 10.4324/9781315458618
  25. Peterman, K, Del Bianco, V, Grover, A, Davis, C and Rosser, H. (in review). Hiding in Plain Sight: Secondary Analysis of Data Records as a Method for Learning about Citizen Science Projects and Volunteers’ Skills. Citizen Science: Theory and Practice.
  26. Peterman, K and Gathings, MJ. 2019. Using a community-created multisite evaluation to promote evaluation use across a sector. Evaluation and Program Planning, 74: 54–60. DOI: 10.1016/j.evalprogplan.2019.02.014
  27. Phillips, TB, Porticella, N, Constas, M and Bonney, R. 2018. Articulating and measuring individual learning outcomes from citizen science: A conceptual model. Citizen Science: Theory and Practice, 3(2). DOI: 10.5334/cstp.126
  28. Schaefer, T, Kieslinger, B, Brandt, M and van den Bogaert, V. 2021. Evaluation in citizen science: The art of tracing a moving target. In: Vohland, K, Land-Zandstra, A, Ceccaroni, L, et al. (eds.), The Science of Citizen Science, 495–514. Cham, Switzerland: Springer. DOI: 10.1007/978-3-030-58278-4_25
  29. Shaw, J and Campbell, R. 2014. The “process” of process use: Methods for longitudinal assessment in a multisite evaluation. American Journal of Evaluation, 35(2): 250–260. DOI: 10.1177/1098214013506780
  30. Stevahn, L and King, JA. 2016. Facilitating interactive evaluation practice: Engaging stakeholders constructively. Evaluation and Facilitation, 2016(149): 67–80. DOI: 10.1002/ev.20180
  31. Stylinski, CD, Peterman, K, Phillips, T, Linhart, J and Becker-Klein, R. 2020. Assessing science inquiry skills of citizen science volunteers: A snapshot of the field. International Journal of Science Education, Part B, 10(1): 77–92. DOI: 10.1080/21548455.2020.1719288
  32. West, S and Pateman, R. 2016. Recruiting and retaining participants in citizen science: What can be learned from the volunteering literature? Citizen Science: Theory and Practice, 1(2): 15. DOI: 10.5334/cstp.8
DOI: https://doi.org/10.5334/cstp.482 | Journal eISSN: 2057-4991
Language: English
Submitted on: Dec 12, 2021
Accepted on: Jun 14, 2022
Published on: Jul 18, 2022
Published by: Ubiquity Press
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2022 Cathlyn Davis, Veronica Del Bianco, Karen Peterman, Andrea Grover, Tina Phillips, Rachel Becker-Klein, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.