References
- Allen, S and Peterman, K. 2019. Evaluating informal STEM education: Issues and challenges in context. New Directions for Evaluation, 2019(161): 17–33. DOI: 10.1002/ev.20354
- American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. 2014. Standards for educational and psychological testing. Washington, DC: American Educational Research Association.
- Ashford-Rowe, K, Herrington, J and Brown, C. 2014. Establishing the critical elements that determine authentic assessment. Assessment and Evaluation in Higher Education, 39(2): 205–222. DOI: 10.1080/02602938.2013.819566
- Beatty, PC and Willis, GB. 2007. Research synthesis: The practice of cognitive interviewing. Public Opinion Quarterly, 71(2): 287–311. DOI: 10.1093/poq/nfm006
- Becker-Klein, R, Peterman, K and Stylinski, C. 2016. Embedded assessment as an essential method for understanding public engagement in citizen science. Citizen Science: Theory and Practice, 1(1): 8. DOI: 10.5334/cstp.15
- Bell, P, Lewenstein, B, Shouse, AW and Feder, MA. 2009. Learning science in informal environments: People, places, and pursuits, 140. Washington, DC: National Academies Press.
- Burgess, HK, DeBey, LB, Froehlich, HE, Schmidt, N, Theobald, EJ, Ettinger, AK, HilleRisLambers, J, Tewksbury, J and Parrish, JK. 2017. The science of citizen science: Exploring barriers to use as a primary research tool. Biological Conservation, 208: 113–120. DOI: 10.1016/j.biocon.2016.05.014
- Davis, C, Del Bianco, V, Peterman, K, Grover, A, Phillips, T and Becker-Klein, R. 2022. Diverse and important ways evaluation can support and advance citizen science. Citizen Science: Theory and Practice, 7(1): 30. DOI: 10.5334/cstp.482
- Eberbach, C and Crowley, K. 2009. From everyday to scientific observation: How children learn to observe the biologist’s world. Review of Educational Research, 79(1): 39–68. DOI: 10.3102/0034654308325899
- Fenichel, M and Schweingruber, HA. 2010. Surrounded by science: Learning science in informal environments. Washington, DC: National Academies Press.
- Fu, AC, Kannan, A and Shavelson, RJ. 2019. Direct and unobtrusive measures of informal STEM education outcomes. New Directions for Evaluation, 2019(161): 35–57. DOI: 10.1002/ev.20348
- Fu, AC, Kannan, A, Shavelson, RJ, Peterson, L and Kurpius, A. 2016. Room for rigor: Designs and methods in informal science education evaluation. Visitor Studies, 19(1): 12–38. DOI: 10.1080/10645578.2016.1144025
- Grack Nelson, A, Goeke, M, Auster, R, Peterman, K and Lussenhop, A. 2019. Shared measures for evaluating common outcomes of informal STEM education experiences. New Directions for Evaluation, 2019(161): 59–86. DOI: 10.1002/ev.20353
- Hussar, K, Schwartz, S, Bioselle, E and Noam, GG. 2008. Toward a systematic evidence-base for science in out-of-school time. Available at: http://ncil.spacescience.org/images/stem-in-libraries/evaluation/Toward-Systematic-EvidenceBase-Science.pdf.
- Kim, YJ, Murai, Y and Chang, S. 2021. Implementation of embedded assessment in maker classrooms: Challenges and opportunities. Information and Learning Sciences, 122(3/4): 292–314. DOI: 10.1108/ILS-08-2020-0188
- Learning Activation Lab. 2018. Tools: Measures and data collection instruments. Available at: http://activationlab.org/tools/ (Accessed: December 5, 2022).
- National Academies of Sciences, Engineering, and Medicine. 2018. Learning through citizen science: Enhancing opportunities by design. Washington, DC: The National Academies Press. DOI: 10.17226/25183
- National Research Council. 2009. Learning science in informal environments: People, places, and pursuits. Washington, DC: National Academies Press.
- National Research Council. 2010. Surrounded by science: Learning science in informal environments. Washington, DC: National Academies Press.
- National Research Council. 2015. Identifying and supporting productive STEM programs in out-of-school settings. Washington, DC: The National Academies Press.
- Newman, I, Lim, J and Pineda, F. 2013. Content validity using a mixed methods approach: Its application and development through the use of a table of specifications methodology. Journal of Mixed Methods Research, 7(3): 243–260. DOI: 10.1177/1558689813476922
- Noam, GG, Allen, PJ, Shah, AM and Triggs, B. 2017. Innovative use of data as game changer for OST programs. The growing out-of-school time field: Past, present, and future, 161–176.
- Peterman, K, Becker-Klein, R, Stylinski, C and Grack Nelson, A. 2017. Exploring embedded assessment to document scientific inquiry skills within citizen science. In Citizen Inquiry, 63–82. Routledge. DOI: 10.4324/9781315458618-5
- Peterman, K, Del Bianco, V, Grover, A, Davis, C and Rosser, H. 2022. Hiding in plain sight: Secondary analysis of data records as a method for learning about citizen science projects and volunteers’ skills. Citizen Science: Theory and Practice, 7(1). DOI: 10.5334/cstp.476
- Phillips, T, Ferguson, M, Minarchek, M, Porticella, N, Bonney, R, Tessaglia-Hymes, D, Nguyen, L, Shirk, J, Garibay, C, Haley-Goldman, K, Heimlich, J, Lewenstein, B and Ellenbogen, K. 2014. User’s guide for evaluating learning outcomes from citizen science. Ithaca, NY: Cornell Laboratory of Ornithology.
- Richards, KAR and Hemphill, MA. 2018. A practical guide to collaborative qualitative data analysis. Journal of Teaching in Physical Education, 37(2): 225–231. DOI: 10.1123/jtpe.2017-0084
- Serrell, B. 1998. Paying attention: Visitors and museum exhibitions. American Association of Museums.
- Shavelson, RJ, Zlatkin-Troitschanskaia, O and Mariño, JP. 2018. International performance assessment of learning in higher education (iPAL): Research and development. In Assessment of Learning Outcomes in Higher Education, 193–214. Cham: Springer International Publishing. DOI: 10.1007/978-3-319-74338-7_10
- Sloane, K, Wilson, M and Samson, S. 1996. Designing an embedded assessment system: From principles to practice. Berkeley, CA: University of California.
- Stylinski, CD, Peterman, K, Phillips, T, Linhart, J and Becker-Klein, R. 2020. Assessing science inquiry skills of citizen science volunteers: A snapshot of the field. International Journal of Science Education, Part B, 10(1): 77–92. DOI: 10.1080/21548455.2020.1719288
- Wiggins, G. 1990. The case for authentic assessment. Practical Assessment, Research, and Evaluation, 2(1): 2.
- Wilson, M and Sloane, K. 2000. From principles to practice: An embedded assessment system. Applied Measurement in Education, 13(2): 181–208. DOI: 10.1207/S15324818AME1302_4
