
Using Shared Embedded Assessment Tools to Understand Participant Skills: Processes and Lessons Learned

Open Access | Apr 2023

References

  1. Allen, S and Peterman, K. 2019. Evaluating informal STEM education: Issues and challenges in context. New Directions for Evaluation, 2019(161): 17–33. DOI: 10.1002/ev.20354
  2. American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. 2014. Standards for educational and psychological testing. Washington, DC: American Educational Research Association.
  3. Ashford-Rowe, K, Herrington, J and Brown, C. 2014. Establishing the critical elements that determine authentic assessment. Assessment and Evaluation in Higher Education, 39(2): 205–222. DOI: 10.1080/02602938.2013.819566
  4. Beatty, PC and Willis, GB. 2007. Research synthesis: The practice of cognitive interviewing. Public Opinion Quarterly, 71(2): 287–311. DOI: 10.1093/poq/nfm006
  5. Becker-Klein, R, Peterman, K and Stylinski, C. 2016. Embedded assessment as an essential method for understanding public engagement in citizen science. Citizen Science: Theory and Practice, 1(1): 8. DOI: 10.5334/cstp.15
  6. Bell, P, Lewenstein, B, Shouse, AW and Feder, MA. 2009. Learning Science in Informal Environments: People, Places, and Pursuits. Washington, DC: National Academies Press.
  7. Burgess, HK, DeBey, LB, Froehlich, HE, Schmidt, N, Theobald, EJ, Ettinger, AK, HilleRisLambers, J, Tewksbury, J and Parrish, JK. 2017. The science of citizen science: Exploring barriers to use as a primary research tool. Biological Conservation, 208: 113–120. DOI: 10.1016/j.biocon.2016.05.014
  8. Davis, C, Del Bianco, V, Peterman, K, Grover, A, Phillips, T and Becker-Klein, R. 2022. Diverse and important ways evaluation can support and advance citizen science. Citizen Science: Theory and Practice, 7(1): 30. DOI: 10.5334/cstp.482
  9. Eberbach, C and Crowley, K. 2009. From everyday to scientific observation: How children learn to observe the biologist’s world. Review of Educational Research, 79(1): 39–68. DOI: 10.3102/0034654308325899
  10. Fenichel, M and Schweingruber, HA. 2010. Surrounded by Science: Learning Science in Informal Environments. Washington, DC: National Academies Press.
  11. Fu, AC, Kannan, A and Shavelson, RJ. 2019. Direct and unobtrusive measures of informal STEM education outcomes. New Directions for Evaluation, 2019(161): 35–57. DOI: 10.1002/ev.20348
  12. Fu, AC, Kannan, A, Shavelson, RJ, Peterson, L and Kurpius, A. 2016. Room for rigor: Designs and methods in informal science education evaluation. Visitor Studies, 19(1): 12–38. DOI: 10.1080/10645578.2016.1144025
  13. Grack Nelson, A, Goeke, M, Auster, R, Peterman, K and Lussenhop, A. 2019. Shared measures for evaluating common outcomes of informal STEM education experiences. New Directions for Evaluation, 2019(161): 59–86. DOI: 10.1002/ev.20353
  14. Hussar, K, Schwartz, S, Boisselle, E and Noam, GG. 2008. Toward a systematic evidence-base for science in out-of-school time. Available at: http://ncil.spacescience.org/images/stem-in-libraries/evaluation/Toward-Systematic-EvidenceBase-Science.pdf.
  15. Kim, YJ, Murai, Y and Chang, S. 2021. Implementation of embedded assessment in maker classrooms: Challenges and opportunities. Information and Learning Sciences, 122(3/4): 292–314. DOI: 10.1108/ILS-08-2020-0188
  16. Learning Activation Lab. 2018. Tools: Measures and data collection instruments. Available at: http://activationlab.org/tools/ (Accessed: December 5, 2022).
  17. National Academies of Sciences, Engineering, and Medicine. 2018. Learning Through Citizen Science: Enhancing Opportunities by Design. Washington, DC: The National Academies Press. DOI: 10.17226/25183
  18. National Research Council. 2009. Learning science in informal environments: People, places, and pursuits. Washington, DC: National Academies Press.
  19. National Research Council. 2010. Surrounded by science: Learning science in informal environments. Washington, DC: National Academies Press.
  20. National Research Council. 2015. Identifying and supporting productive STEM programs in out-of-school settings. Washington, DC: The National Academies Press.
  21. Newman, I, Lim, J and Pineda, F. 2013. Content validity using a mixed methods approach: Its application and development through the use of a table of specifications methodology. Journal of Mixed Methods Research, 7(3): 243–260. DOI: 10.1177/1558689813476922
  22. Noam, GG, Allen, PJ, Shah, AM and Triggs, B. 2017. Innovative use of data as game changer for OST programs. In The growing out-of-school time field: Past, present, and future, 161–176.
  23. Peterman, K, Becker-Klein, R, Stylinski, C and Grack Nelson, A. 2017. Exploring embedded assessment to document scientific inquiry skills within citizen science. In Citizen Inquiry, 63–82. Routledge. DOI: 10.4324/9781315458618-5
  24. Peterman, K, Del Bianco, V, Grover, A, Davis, C and Rosser, H. 2022. Hiding in plain sight: Secondary analysis of data records as a method for learning about citizen science projects and volunteers’ skills. Citizen Science: Theory and Practice, 7(1). DOI: 10.5334/cstp.476
  25. Phillips, T, Ferguson, M, Minarchek, M, Porticella, N, Bonney, R, Tessaglia-Hymes, D, Nguyen, L, Shirk, J, Garibay, C, Haley-Goldman, K, Heimlich, J, Lewenstein, B and Ellenbogen, K. 2014. User’s guide for evaluating learning outcomes from citizen science. Ithaca, NY: Cornell Laboratory of Ornithology.
  26. Richards, KAR and Hemphill, MA. 2018. A practical guide to collaborative qualitative data analysis. Journal of Teaching in Physical Education, 37(2): 225–231. DOI: 10.1123/jtpe.2017-0084
  27. Serrell, B. 1998. Paying attention: Visitors and museum exhibitions. Washington, DC: American Association of Museums.
  28. Shavelson, RJ, Zlatkin-Troitschanskaia, O and Mariño, JP. 2018. International performance assessment of learning in higher education (iPAL): Research and development. In Assessment of Learning Outcomes in Higher Education, 193–214. Cham: Springer International Publishing. DOI: 10.1007/978-3-319-74338-7_10
  29. Sloane, K, Wilson, M and Samson, S. 1996. Designing an embedded assessment system: From principles to practice. Berkeley, CA: University of California.
  30. Stylinski, CD, Peterman, K, Phillips, T, Linhart, J and Becker-Klein, R. 2020. Assessing science inquiry skills of citizen science volunteers: A snapshot of the field. International Journal of Science Education Part B, 10(1): 77–92. DOI: 10.1080/21548455.2020.1719288
  31. Wiggins, G. 1990. The case for authentic assessment. Practical Assessment, Research, and Evaluation, 2(1): 2.
  32. Wilson, M and Sloane, K. 2000. From principles to practice: An embedded assessment system. Applied Measurement in Education, 13(2): 181–208. DOI: 10.1207/S15324818AME1302_4
DOI: https://doi.org/10.5334/cstp.487 | Journal eISSN: 2057-4991
Language: English
Submitted on: Dec 20, 2021
Accepted on: Jan 10, 2023
Published on: Apr 26, 2023
Published by: Ubiquity Press
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2023 Rachel Becker-Klein, Cathlyn Davis, Tina B. Phillips, Veronica Del Bianco, Amy Grack Nelson, Evelyn Christian Ronning, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.