
Evaluating Citizen Science: Moving beyond Output Measures to Learner Behaviors, Interests, and Motivations

Open Access | July 2024


Introduction

Citizen science, which involves members of the public in ongoing scientific research, has gained momentum in recent years. It is often defined as a form of research collaboration that engages citizens, who are non-professional scientists, in scientific research projects involving data collection, analysis, and dissemination (Conrad and Hilchey 2011; Crowston and Wiggins 2011; Dickinson et al. 2012; Haklay 2012). The evaluation of these programs is an emerging area of study.

Engaging the public in scientific research has several potential societal and individual benefits. Citizen science projects provide educational experiences that can enhance participants’ knowledge about a species or topic, heighten awareness of a topic, and increase scientific literacy (Bonney et al. 2009a; Bonney et al. 2014; Briggs, Stedman, and Krasny 2014). By engaging volunteers in collaborative scientific research, citizen science projects often aim to increase learning outcomes among the volunteers who participate in such programs.

While citizen science projects have been successful in advancing scientific knowledge and achieving specific research or program outcomes, more focus is now being placed on evaluating the impacts of citizen science on the participants involved (Brossard et al. 2005; Phillips et al. 2018). Previous research has predominantly focused on measuring engagement in citizen science through output measures, such as the number of participants or the amount of data collected (Phillips et al. 2018). A new focus is being given to evaluating the learning outcomes of participants. In particular, the field of citizen science has begun to ask questions such as “who are the volunteers?”, “what are the outcomes of participation?”, and “in what ways does volunteer experience shape desired outcomes?”

Water quality monitoring is one of the largest citizen science activities in the United States (U.S.), with a significant number of programs (Grudens-Schuck and Sirajuddin 2019). Owing to reduced government budgets and increasing environmental concerns, citizen involvement in natural resource management has become crucial (Conrad and Hilchey 2011). Volunteer monitoring efforts now support agencies by collecting data for reports, publications, and policy decisions (Stepenuck and Genskow 2017). More than 26 states sponsor volunteer monitoring programs (Overdevest, Orr, and Stepenuck 2004). Recent evaluations of water monitoring citizen science focus on analyzing benefits to volunteers, motivations, and participation outcomes (Church et al. 2018; Kirschke et al. 2023). Among these studies, there is a variety of outcomes, limited sample sizes, and intricate theoretical frameworks, which collectively pose challenges to drawing broad generalizations across the field of study. Assessment of these outcomes in relation to water monitoring citizen science generally lacks a unified framework of study.

In Oklahoma, Blue Thumb is a state-sponsored citizen science program focusing on water quality monitoring, operating as the educational arm of the Water Quality Division of the Oklahoma Conservation Commission. Serving as the primary initiative to address the Non-Point Source Pollution Management Plan’s education objective, Blue Thumb trains volunteers for monthly stream monitoring (Blue Thumb 2019). With a goal of “protection through education,” the program empowers citizens to become conservation leaders in the state, boasting over 80 monitored streams, 138 active volunteers, and 8,000 volunteer hours logged in one year. Blue Thumb stands as a model citizen science program, actively promoting educational outcomes statewide (Oklahoma Conservation Commission Water Quality Division 2020).

While Blue Thumb annual reports have traditionally emphasized programmatic successes over specific participant outcomes, they do highlight volunteer recruitment and retention metrics, including the number of new volunteers who have completed training, participation in educational events, and the frequency of monthly monitoring. While programmatic evaluations are essential for funding requirements, assessing individual participant outcomes is crucial for program continuity. Focusing on participant outcomes in evaluations can inform improvements in program training, data collection, and volunteer recruitment and retention. A needs assessment by Blue Thumb administrators aimed to understand how participation influences pro-environmental behaviors, the initial motivations of volunteers, and how participation may alter these motivations over time (Olson and Colston 2020).

Evaluation of outcomes from participation in citizen science is often listed as a “high priority” for practitioners and program coordinators, yet it is commonly rated to also be one of the largest challenges (Phillips et al. 2014; Kieslinger et al. 2017). Published evaluations of participant outcomes are limited in quantity, lack consistent definitions and desired categories of participant outcomes (Phillips et al. 2018), rely on pre- and post-test self-reported measures (Peter et al. 2019), and use newly created survey items instead of shared frameworks and methodologies (Roche et al. 2020). This research applied a common evaluative framework to measure how participation in Blue Thumb influenced volunteers’ environmental behaviors, scientific interests, and motivations for volunteerism.

Evaluation of participant outcomes

Despite recognizing the importance of evaluation, the field of citizen science struggles to measure learning and participant impacts, often lacking comprehensive evaluations that align with available theories and tools (Brossard et al. 2005). This inconsistency hampers understanding program effectiveness and reach. Few articles have categorized participant outcomes quantifiably, contributing to challenges in cross-programmatic research (Friedman et al. 2008; National Research Council 2009; Bonney et al. 2009a,b; Jordan et al. 2011; Phillips et al. 2012; Bonney et al. 2016; Bonney et al. 2014). “A Framework for Articulating and Measuring Individual Learning Outcomes from Participation in Citizen Science” (Phillips et al., 2018) addresses this by re-conceptualizing and synthesizing learning outcomes into quantifiable categories, providing standardized vocabulary and evaluation instruments to facilitate research in citizen science (Roche et al. 2020). The framework encompasses content, process, and nature of science knowledge; interest in science and the environment; skills; self-efficacy for science and the environment; behavior and stewardship; and motivation. Its development aims to offer a standardized approach for measuring citizen science learning outcomes.

Common assessment items are crucial for consistent evaluation in citizen science. The Cornell Lab of Ornithology’s Developing, Validating, and Implementing Situated Evaluation Instruments Project (DEVISE) offers practitioners a standardized set of evaluation tools aligned with the Phillips et al. (2018) framework (https://www.birds.cornell.edu/citizenscience/measuring-outcomes/). These instruments aim to establish uniformity in evaluation methodology across diverse projects (Phillips et al. 2015; Phillips et al. 2017; Porticella et al. 2017a,b). Blue Thumb conducted a needs assessment survey, with staff rating major activities by participant engagement and intended learning outcomes (Olson and Colston 2020). Brossard et al. (2005) recommend focusing on a program’s main desired outcomes rather than attempting to evaluate all listed outcomes. The study identified three major learning outcomes for Blue Thumb’s evaluation: behaviors and stewardship, motivation, and interest. The subsequent section defines each outcome and presents related evaluation tools (Table 1), followed by a literature review of methodologies used in previous evaluations of these outcomes.

Table 1

Description of Phillips et al. (2018) definitions for three major learning outcomes with links to survey items. DEVISE: Developing, Validating, and Implementing Situated Evaluation Instruments Project.

CONSTRUCT | DEFINITIONS AND BLUE THUMB EXAMPLES | DEVISE SCALES
Behavior and Stewardship | Measurable behaviors that result from engagement in citizen science projects but are external to protocol or skills of the specific citizen science project. | General Environmental Stewardship Scale
Interest in Science and Nature | The degree to which an individual assigns personal relevance to a topic or endeavor, and their actions taken toward pursuing that endeavor. | Interest in Science and Nature Scale
Motivation for Participation in Citizen Science; Motivation for Doing and Learning Science | Factors that activate, direct, and sustain goal-directed behaviors and are the “whys” that explain what drives participation. | Motivation for Participation in Citizen Science Scale; Motivation for Doing and Learning Science Scale

Behavior and stewardship

Behavior change, especially environmental stewardship, is often considered a key outcome in environmental citizen science programs, as indicated by surveys of biodiversity projects (Peter et al. 2019; Phillips et al. 2018). Projects like Blue Thumb involve regular hands-on experiences in nature, such as monthly site assessments and bi-annual collections, and such experiences have been shown to enhance a sense of connection with nature (Pocock et al. 2023). This connection strongly correlates with specific behavior changes, including positive attitudes towards the environment, pro-environmental behaviors, and engagement in policy making (Phillips et al. 2018; Wells and Lekies 2012; Brossard et al. 2005; Heimlich and Ardoin 2008; Merenlender et al. 2016; Santori et al. 2021; Jordan et al. 2012). Although Blue Thumb aims for behavior changes like increased water stewardship and environmental activism, formal assessments of volunteers are lacking (Olson and Colston 2020). With variations in projects and their associated behaviors, there is no universal method for monitoring behavioral outcomes in citizen science. Although scales exist for measuring environmental stewardship in general, many initiatives create project-specific questionnaires to assess desired behaviors.

Interest in science and the environment

An individual’s interest in science is a key driver of pursuing science jobs, sustaining lifelong involvement, creating a science identity, and laying the foundation for future engagement (Falk, Storksdieck, and Dierking 2007; Maltese and Tai 2010; Phillips et al. 2018). According to studies, interest is a component that affects involvement levels and helps to keep highly active volunteers on board (De Moor et al. 2019; Nov, Arazy, and Anderson 2014). Many challenges exist to capturing interest and engaging volunteers in a particular project: The voluntary nature of citizen science, personal circumstance, lack of compensation, and competition with other priorities may influence engagement from volunteers (Geoghegan et al. 2016; Frensley et al. 2017). While interest and engagement are desired outcomes, there is currently little information on how a citizen science project should measure them. In many cases, quantitative engagement measures (e.g., amount of time dedicated to an activity, number of activities participated in, number of submitted contributions over time) are used to reflect participant interest in a project.

Motivation for science and the environment

Motivation, an attitudinal construct, involves goal setting to achieve specific behaviors or desired outcomes (Phillips et al. 2018). Studies on citizen science agree that motivation is multifaceted, can change over time, and is linked to volunteer retention and project engagement (Alender 2016; Hajibayova 2020; Porticella et al. 2017a,b; Rotman et al. 2012; Raddick et al. 2009). Psychological theories, such as Self-Determination Theory (SDT), are valuable for understanding motivations in citizen science (Richter et al. 2021; Miller et al. 1988). SDT categorizes motivation into intrinsic (self-satisfaction) and extrinsic (rewards or societal pressures) types, providing insights into why participants engage in citizen science projects (Ryan and Deci 2000a,b; Porticella et al. 2017a,b). Increasingly used for evaluating motivations, SDT helps identify elements crucial for recruiting, retaining, and engaging citizen scientists (Wu et al. 2016; Nakayama et al. 2019; Richter et al. 2021). SDT predicts that intrinsic motivations lead to sustained engagement more than extrinsic motivations (Phillips et al. 2018). Recent studies highlight SDT’s role in explaining the psychological aspects of volunteers’ motivations in environmental monitoring programs (Maund et al. 2020; Pateman et al. 2021).

Study Purpose

The purpose of this study is to evaluate how length of participation in a citizen science program influences learning outcomes related to behaviors, interests, and motivations of volunteers. This study attempts to answer the following research question: To what extent, and in what ways, does length of participation influence volunteers’ behaviors, motivations, and interests as they relate to water quality monitoring?

Methods

Survey design

This study utilized a quantitative survey of participants with open-ended qualitative questions at the end (Appendix A). The survey items were intended to assess the identified participant outcomes determined during the previous focus groups with Blue Thumb Staff (i.e., behavior and stewardship, interest, and motivation). These surveys were modified versions of participant outcomes surveys created by Cornell Lab of Ornithology in their DEVISE project. We chose these because they have been previously tested for validity and used in other evaluation publications (Bonney et al. 2016; Phillips et al. 2015; Phillips et al. 2017; Porticella et al. 2017a,b).

Scales used in this research include: the General Environmental Stewardship Scale (GESS), the Interest in Science and Nature Scale (ISNS), the Motivation for Participation in Citizen Science Scale (MPCS), and the Motivation for Doing and Learning Science Scale (MDLSS).

  • The GESS includes 6 different survey items and produces a range of totaled responses from 0 to 42, where higher scores indicate higher levels of pro-environmental behaviors. The GESS had a Cronbach’s alpha score of 0.73 in this study.

  • The ISNS scores participants from 1 to 5, where scores closest to 5 indicate higher levels of interest in science. In this study, Cronbach’s alpha indicated strong internal consistency with a value of 0.96.

  • The MPCS and MDLSS scores were ranked from 1 to 5. An average of individual responses was calculated, and scores closest to 5 indicated higher levels of motivation. Secondarily, motivations were also classified as either intrinsic or extrinsic on each scale according to the original survey items, and average responses for each type of motivation were calculated. Mean extrinsic responses were subtracted from mean intrinsic responses, as suggested by Phillips et al. (2017), to produce a “total response,” where positive scores indicate predominately intrinsic motivations and negative scores indicate predominately extrinsic motivations. The MPCS exhibited a Cronbach’s alpha of 0.67 for intrinsic items and 0.79 for extrinsic items. The MDLSS had a Cronbach’s alpha of 0.71 for intrinsic items and 0.68 for extrinsic items.
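The “total response” computation above (mean intrinsic rating minus mean extrinsic rating) can be sketched in a few lines of Python. The item names and ratings below are hypothetical illustrations, not actual DEVISE items; the real item wording and intrinsic/extrinsic groupings are defined in Phillips et al. (2017).

```python
# Sketch of the motivation "total response" score: mean intrinsic rating
# minus mean extrinsic rating, so positive => predominately intrinsic.
# Item names and values here are hypothetical, not actual DEVISE items.
def motivation_total(responses, intrinsic_items, extrinsic_items):
    mean = lambda keys: sum(responses[k] for k in keys) / len(keys)
    return mean(intrinsic_items) - mean(extrinsic_items)

# One volunteer's 1-5 Likert responses (hypothetical)
responses = {
    "enjoy_learning_science": 5,
    "care_about_streams": 5,
    "course_credit": 3,
    "resume_building": 2,
}
score = motivation_total(
    responses,
    intrinsic_items=["enjoy_learning_science", "care_about_streams"],
    extrinsic_items=["course_credit", "resume_building"],
)
print(score)  # 2.5 -> predominately intrinsic
```

A volunteer whose extrinsic ratings exceeded their intrinsic ratings would receive a negative total under the same computation.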

Surveys were adapted for Blue Thumb by aligning terminology and streamlining questions. Participants took less than 15 minutes on average to complete the entire survey. To ensure validity, participants had to answer all scale items, and attention-check questions were included in each scale to ensure careful reading before responding.
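The internal-consistency (Cronbach’s alpha) values reported for each scale follow the standard formula: alpha = k/(k−1) × (1 − Σ item variances / variance of total scores). A minimal pure-Python sketch, using a made-up response matrix rather than the study’s data:

```python
# Minimal Cronbach's alpha computation. The response matrix below is
# invented for illustration; it is not the study's survey data.
def cronbach_alpha(items):
    """items: one list of responses per survey item, aligned across respondents."""
    k, n = len(items), len(items[0])
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[r] for item in items) for r in range(n)]
    return (k / (k - 1)) * (1 - sum(var(i) for i in items) / var(totals))

# Two perfectly consistent items yield the maximum alpha of 1.0
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]))
```

Values near 1 indicate that the scale items move together across respondents, as with the ISNS alpha of 0.96 reported above.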

Qualitative questions at the end of each survey item were designed to provide open-ended responses that probed participants further about each scale/learning outcome. These questions focused on what environmental behaviors each participant engages in, how training/monitoring has changed their interests, their driving motivations for originally signing up for a citizen science program, and what motivations or interests sustain their continued participation in Blue Thumb. Qualitative questions were intended to enhance meaning or give context to the previous quantitative survey results.

Sampling and survey distribution

Blue Thumb volunteers were categorized as “new volunteers” or “experienced volunteers” through purposeful sampling (Palinkas et al. 2019). New volunteers were aged 18 or older, had recently joined Blue Thumb, had completed the two-day training, and had not yet begun stream monitoring. Experienced volunteers, also 18 or older, had completed the same training and had at least three months of active monitoring with data entry at a stream site. Different survey distribution methods were employed based on volunteer category. New volunteers received in-person surveys at the end of Blue Thumb training events. Experienced volunteers had two survey options: a) a digital survey through Qualtrics™ survey software (Qualtrics 2020), or b) hard copies distributed at semi-annual events for those with limited computer access. Experienced volunteers were given additional open-ended reflective questions that were not included in new volunteer surveys.

Data analysis

Quantitative survey data were analyzed using SPSS software to produce descriptive statistics, and a chi-squared test explored demographic differences by volunteer experience level. Differences in learning outcomes between new and experienced volunteers were assessed with non-parametric methods (Mann-Whitney U, p < 0.05) because the data were ordinal and not normally distributed. Post-hoc analyses were conducted for the Motivation for Participation in Citizen Science Scale, comparing intrinsic and extrinsic participation scale items along with total participation motivation scores. Qualitative data analysis, facilitated by QDA Miner software, employed structural coding to tag responses with “behavior,” “interest,” or “motivation” and identify frequencies. In the final step, quantitative and qualitative data were compared to provide a more comprehensive understanding of the results: survey scale scores were compared with qualitative comments from individual volunteers to analyze how participants’ open-ended responses explained their scores for behavior, interest, and motivation (Creswell et al. 2011).
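The Mann-Whitney U statistic underlying these comparisons is computed from pooled ranks; as a sketch (the study itself ran the test in SPSS, and the scores below are illustrative, not the study’s data):

```python
# Mann-Whitney U via rank sums, assigning average ranks to ties.
# Illustrative sketch only; the study computed this test in SPSS.
def mann_whitney_u(group_a, group_b):
    pooled = sorted(group_a + group_b)
    avg_rank = {}  # value -> average 1-based rank (tied values share a rank)
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        avg_rank[pooled[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    rank_sum_a = sum(avg_rank[v] for v in group_a)
    # U for group A: rank sum minus its minimum possible rank sum
    return rank_sum_a - len(group_a) * (len(group_a) + 1) / 2

# Complete separation between groups gives U = 0 for the lower-scoring group
print(mann_whitney_u([3.1, 3.4, 3.8], [4.2, 4.5, 4.9]))  # 0.0
```

Small U values indicate that one group’s scores sit systematically below the other’s, which is then converted to the Z and p values reported in Table 3.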

Results

Demographic summary

The full dataset contained new volunteer (n = 41) and experienced volunteer (n = 33) responses, for a total of 74 participants (Table 2). However, not all participants answered every question; thus, sample sizes vary among demographic parameters. Demographically, new volunteers were typically younger and still in college, whereas experienced volunteers were older and had already completed college (Table 2). No meaningful difference in gender was noted between experience levels, with approximately two-thirds of volunteers identifying as female. Overall, Blue Thumb volunteers are a highly educated group with existing science training. A chi-square test of independence showed no significant association between experience level and demographic data.

Table 2

Demographic data between new (N = 41) and experienced volunteers (N = 33).

DEMOGRAPHIC | NEW | EXPERIENCED | FULL SAMPLE
Sex
    Female | 26 | 22 | 48
    Male | 15 | 9 | 24
    Nonbinary | 0 | 2 | 2
Age
    ≤ 24 | 19 | 8 | 27
    25–39 | 12 | 8 | 20
    40–60 | 3 | 5 | 8
    60+ | 4 | 7 | 11
Education
    K–12 | 2 | 0 | 2
    High school | 1 | 0 | 1
    Some college | 11 | 2 | 13
    Associates | 3 | 5 | 8
    Bachelors | 12 | 10 | 22
    Masters | 8 | 9 | 17
    Doctorate | 3 | 6 | 9
Career
    College professor | 1 | 5 | 6
    College student | 17 | 10 | 27
    Government employee | 0 | 6 | 6
    Group/other | 3 | 2 | 5
    Hobbyist | 1 | 0 | 1
    K–12 student | 3 | 0 | 3
    K–12 teacher | 1 | 2 | 3
    Landowner | 4 | 3 | 7
    Scientist | 8 | 5 | 13
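The chi-square test of independence reported above compares observed counts against the counts expected if experience level and a demographic category were independent. A minimal sketch of the statistic, using invented counts rather than the Table 2 data:

```python
# Chi-square statistic for a contingency table (rows = experience level,
# columns = a demographic category). The counts below are invented.
def chi_square_stat(table):
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Identical category distributions across groups give a statistic of 0
print(chi_square_stat([[10, 20], [10, 20]]))  # 0.0
```

A statistic near zero, as in this study’s demographic comparison, means the observed counts closely match what independence would predict.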

Comparing new and experienced volunteers

To assess the impact of length of participation from the Likert scale data, Mann-Whitney U tests were conducted in SPSS to compare survey results for desired outcomes (behavior, interest, and motivation) between new (level “N”) and experienced (level “E”) volunteer groups (Table 3). Motivation for Participation in Citizen Science was significantly higher for experienced volunteers compared with new volunteers (p = 0.03). There were no statistically significant differences between new and experienced volunteers for behavior and stewardship, interest in science, and motivation for doing/learning science.

Table 3

Comparison of all survey scores between new and experienced volunteers with a Mann-Whitney U test. N = new and E = experienced volunteers. Significance denoted by *. Behavior was scored 0–42, where higher scores indicate higher levels of pro-environmental behaviors. Interest was scored between 1 and 5, where scores closest to 5 indicate higher levels of interest in science. Extrinsic motivation scores were subtracted from intrinsic motivation scores, where positive scores indicate predominately intrinsic motivations.

OUTCOME | LEVEL | N | MEDIAN | STD. DEVIATION | Z | p-VALUE
Behavior and Stewardship | N | 41 | 28.3 | 0.93 | –1.66 | 0.10
 | E | 33 | 31.58 | 1.26 | |
Interest in Science | N | 41 | 4.09 | 1.40 | –0.06 | 0.96
 | E | 33 | 4.66 | 0.59 | |
Motivation for Citizen Science | N | 41 | 0.61 | 0.63 | –1.99 | 0.03*
 | E | 33 | 0.95 | 0.68 | |
Motivation for Doing/Learning Science | N | 41 | 1.50 | 0.90 | –0.10 | 0.92
 | E | 33 | 1.55 | 0.85 | |

On the Behavior and Stewardship Scale, volunteers averaged a total behavioral score of 29.77 out of 36 possible points (Table 3), indicating high levels of pro-environmental behaviors. On the survey items, volunteers reported that they were currently engaged (averages above a score of 4, “I am currently doing this”) in all the listed pro-environmental actions (Figure 1). Qualitative responses from all volunteers revealed additional categories of environmental behaviors not captured on the survey items, such as sustainable shopping (19%), energy use (15%), alternative transportation (14%), plastic reduction (14%), composting (12%), stream clean ups (9%), and dietary changes (9%).

Figure 1

General Environmental Stewardship Scale survey items response scores for all Blue Thumb volunteers.

For the qualitative questions given only to experienced Blue Thumb volunteers, 13 participants indicated a change in either attitude or behavior based on their participation. Several experienced volunteers described how advocacy and education had become new and important behaviors for them. A majority of responses in this category reported participation in additional activities, including educational presentations at regional conferences, virtual environmental webinars, and other citizen science events such as BioBlitz, iNaturalist, and Project WET. A total of four participants mentioned they had not noticed any changes in attitude or behavior since joining Blue Thumb; these volunteers indicated that they had already established pro-environmental attitudes and behaviors before joining.

On the Interest in Nature Scale, volunteers had a median response score of 4.41 (Table 3) indicating high levels of interest in nature and science. On the survey items, volunteers reported that they were interested (averages approaching a score of 4, “Agree”) in all the listed topics of interest excluding “interested in making tables or reports” (Figure 2). All volunteers were asked which aspects of stream-monitoring they were most interested in within an open-ended survey question. The most reported interests included biological observations of macroinvertebrates (31%) and fish (30%), chemistry (24%), and being in nature (23%).

Figure 2

Interest in Nature Scale survey items response scores for all Blue Thumb volunteers.

Experienced volunteer responses described how, if at all, their interest had changed since joining Blue Thumb. Slightly over half of experienced volunteers (18 responses) reported in open-ended questions that their perceived level of interest had increased over time. In some cases, these were increases in specific interests (such as more interest in data interpretation and education of others) or increased interest in participation (more dedication to consistent, routine monitoring of a stream). A total of 8 experienced respondents reported no overall change in interest, indicating either that they had high levels of interest before joining Blue Thumb or that their interest had remained consistent since they began. One experienced volunteer said, “My interests have not changed significantly. I have always cared for every aspect of environmental science.”

Qualitative responses from experienced volunteers also identified barriers or challenges that made it more difficult for them to explore their interests. The largest of these challenges was time: Experienced volunteers felt that though they wanted to engage in topics beyond the required stream monitoring aspect of Blue Thumb, they did not have enough personal time to do so. One volunteer described their barriers to interest as follows: “I would like to be more involved beyond stream monitoring with Blue Thumb but have trouble finding the time.”

Two modified motivational scales were used: the Motivation for Doing and Learning Science Scale (Porticella et al. 2017a,b) and the Motivation for Participation in Citizen Science Scale (Phillips et al. 2017). On the Motivation for Doing and Learning Science Scale, new volunteers’ median response score was 1.50, whereas experienced volunteers’ median response was 1.55, where positive scores indicate intrinsic motivations (Table 3). On the survey items, volunteer responses were above 4 (“I agree”) for intrinsic items, and less than 4 for extrinsic items (Figure 3). Significant differences were found between new and experienced volunteers on the Motivation for Participation in Citizen Science Scale (Table 3). New volunteers’ median response score was 0.61, whereas experienced volunteers’ median response score was 0.95, indicating higher levels of intrinsic motivation in experienced volunteers. The intrinsic motivation score was not significantly different between new and experienced volunteers (p = 0.86), but extrinsic motivations and overall motivations for participation were both statistically significant (p = 0.02 for both) (Figure 4; Table 4).

Figure 3

Motivation for Doing and Learning Science Scale survey items response scores for all Blue Thumb volunteers.

Figure 4

Comparison of Motivation for Participation in Citizen Science scale items response scores between new and experienced volunteers, where * denotes significant differences.

Table 4

Comparison of Motivation for Participation in Citizen Science Scale survey items response scores between new and experienced volunteers. Significance denoted by *.

MOTIVATION PARAMETER | LEVEL | N | MEDIAN | STD. DEVIATION | p-VALUE
Intrinsic motivation | N | 41 | 4.62 | 0.46 | 0.86
 | E | 33 | 4.60 | 0.48 |
Extrinsic motivation | N | 41 | 4.01 | 0.64 | 0.02*
 | E | 33 | 3.65 | 0.65 |
Motivation sum | N | 41 | 0.61 | 0.63 | 0.02*
 | E | 33 | 0.96 | 0.66 |

Qualitative questions related to motivations for participating in Blue Thumb and/or monitoring a stream revealed several different categories of motivations, ranging from interest in learning about water quality to promoting education (Table 5). Volunteers commonly expressed more than one motivation as their reason for participation. Experienced volunteers reported motivation largely attributed to personal feelings or place-based attachments. Volunteers who had continuously monitored at the same site for the duration of their participation mentioned the connection they felt to these places. Many experienced volunteers described how “contributing” to something was associated with positive feelings, or a sense of responsibility and duty. New volunteers reported more motivation for creating/maintaining social relationships between themselves, friends and family, or other Blue Thumb volunteers. Some volunteers, usually new volunteers and younger college students, described participation in Blue Thumb as a way of gaining career-specific skills.

Table 5

Response categories of qualitative data describing all volunteers’ (N = 74) main motivations for participating in Blue Thumb.

DESCRIBE YOUR MAIN MOTIVATION(S) FOR PARTICIPATING IN BLUE THUMB.
RESPONSES | EXAMPLE QUOTES | N
Interest in learning about water quality | “To learn about what factors in my local watershed influence stream health and share with the community the differences we can make”; “I want to learn more about water quality monitoring and gain an understanding about how things around me work” | 17
Personal feelings and attachments | “I have a deep passion for the environment and feel a duty to monitor and preserve these fragile ecosystems”; “I was motivated because I thought it would be something fun to do and I enjoy fishing, science, and statistics” | 15
Contribution | “I desire to contribute to sustainability through action”; “It’s empowering to feel like I’m having a direct impact on the environment around me” | 13
Social relationships | “Personally engage in an established organization and surround myself with like-minded people”; “I want to be an example to my family of how to do this… I want to be a part of a bigger community involved in water monitoring” | 12
Conservation practices | “I want to contribute to water conservation in Oklahoma and see how water quality changes”; “It’s something to do that helps the environment […] promotes conservation” | 12
Connection to land | “I want to help protect water quality on family land that we own”; “I love the stream […] It means a lot to me and my community” | 10
Career experience/skills | “Eventually I want a career in something environmental so I thought this was a good place to get started”; “Interest in learning skills that could be useful in moving into a new career” | 7
Education | “I want to help educate and create awareness of water quality issues” | 7

Discussion

This research applied a framework for participant outcome evaluation from Phillips et al. (2018), along with shared survey items, to assess how length of participation in Blue Thumb influenced participants’ behaviors, interests, and motivations. Contrary to expectations, comparisons between new and experienced volunteers did not reveal significant differences for behaviors, interest in science, or motivation for doing/learning science. Instead, all participants, regardless of experience level, had positive outcomes. Additionally, no significant relationship existed between any of the four measured outcome categories and years of experience, further suggesting that volunteers were likely interested and motivated before joining and remained so throughout their volunteer careers. One possible explanation for this lack of differences is the selective nature of who participates in citizen science (Crall et al. 2013; West et al. 2021; Trumbull, Bonney, Bascom, and Cabral 2000). Blue Thumb volunteers came from a strong science background, with higher levels of education that tended toward the biological/environmental sciences, and they engaged in environmentally friendly actions before joining.

Both new and experienced Blue Thumb volunteers exhibited positive scores on intrinsic motivation scales, emphasizing the significance of internal drives. Blue Thumb volunteers reported being motivated by personal feelings and local attachments. Similar studies have shown that these motivations correlate with long-term engagement, improved environmental attitudes, and behavioral changes (Church et al. 2018; Deci, Ryan, and Koestner 2001; Domroese and Johnson 2017; Jacobson et al. 2012). A notable distinction emerged between new and experienced volunteers in their motivations for citizen science participation. New volunteers, often college students pursuing science careers, leaned toward extrinsic motivators, while experienced volunteers favored intrinsic motivators tied to personal feelings and place attachments. This aligns with SDT and prior citizen science studies highlighting the enduring impact of intrinsic motivations (Nov et al. 2014; Phillips et al. 2019). Similar differentiations were observed in studies comparing younger with older volunteers and new with experienced participants (Clary and Snyder 1991; Jacobsen et al. 2012; Richter et al. 2021; Pateman et al. 2021), underscoring the dynamic nature of motivation throughout a volunteer’s engagement in a citizen science project.

The study underscores the limitation of relying solely on quantitative survey items to assess context-specific differences among volunteer groups. Whereas quantitative scores for behavior, interest, and motivation did not yield significant differences, qualitative responses provided valuable insights. These insights included details about volunteers’ environmental behaviors, shifts in interests among seasoned participants, and the motivating factors behind joining Blue Thumb. Advocating for a mixed-methods approach, this study aligns with previous research emphasizing the importance of combining quantitative and qualitative methods to gain a comprehensive understanding of program dynamics (Diaz et al. 2021; Palinkas et al. 2019; Phillips et al. 2014; Phillips et al. 2017).

Although the anticipated differences were not evident in the results, the evaluation process proved valuable for Blue Thumb administrators. The survey data unveiled a compelling narrative about highly engaged Blue Thumb volunteers: They actively participate in pro-environmental behaviors, maintain a strong interest in various science topics, and exhibit motivation not only for citizen science but also for broader environmental action. The evaluation results prompted Blue Thumb administrators to implement program enhancements such as additional training on stream biology (macroinvertebrates, fish), guidance on accessing and interpreting program data, and the planning of a state-wide appreciation event to foster connections among Blue Thumb volunteers. This collaborative approach highlights the advantages of integrating the evaluated program into the assessment process, encouraging continuous reflection on best practices in citizen science programs.

Limitations

A crucial aspect of this study's findings is the geographic locations of Blue Thumb's training events. Throughout the research period, many training sessions took place at public educational institutions, ranging from regional colleges to public libraries. These venues primarily attracted the local population, characterized by younger and more highly educated demographics. The geographic ties of new participants to program-hosted events may introduce bias into the volunteer base. Notably, most volunteers at Blue Thumb training events did not represent rural areas or underrepresented groups. The sample was therefore more likely to capture volunteers with scientific backgrounds and prior engagement in environmental activities, limiting the generalizability of the scale items. A directly comparable control group for new and experienced volunteers was not feasible. The differing sampling methodologies may have introduced sampling bias, and the limited sample size (74 participants: 41 new volunteers and 33 experienced volunteers) could limit the study's validity, despite a relatively high response rate among the 138 active Blue Thumb participants.

Additionally, the scale items used in this study were developed by the Cornell Lab of Ornithology. The original scale documentation marked motivation items as intrinsic or extrinsic and provided guidance on interpreting these data. The authors' reliance on these recommended interpretations may introduce limitations, particularly in how composite scores were calculated across the behavior, interest, and motivation scales. Notable discrepancies arise in how scores were processed: behavior scores were summed, interest scores were averaged, and motivation scores were derived by subtracting intrinsic from extrinsic scores, potentially oversimplifying the relationship between these motivations and raising concerns about the validity of the findings.
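The three score-processing rules described above (behaviors summed, interests averaged, motivation computed as extrinsic minus intrinsic) can be sketched in a few lines. The function names and Likert-style item values below are hypothetical illustrations, not the actual Cornell Lab scale items:

```python
# Minimal sketch of the composite-score rules described in the text.
# Item values are invented for illustration only.

def behavior_score(responses):
    """Behavior items were summed (e.g., 1 = behavior reported, 0 = not)."""
    return sum(responses)

def interest_score(responses):
    """Interest items were averaged (e.g., 1-5 Likert ratings)."""
    return sum(responses) / len(responses)

def motivation_score(intrinsic, extrinsic):
    """Motivation was derived by subtracting intrinsic from extrinsic
    scores, so negative values indicate a more intrinsic profile."""
    intrinsic_avg = sum(intrinsic) / len(intrinsic)
    extrinsic_avg = sum(extrinsic) / len(extrinsic)
    return extrinsic_avg - intrinsic_avg

# One hypothetical volunteer:
behavior = behavior_score([1, 0, 1, 1])              # 3 of 4 behaviors
interest = interest_score([4, 5, 3, 5])              # mean interest 4.25
motivation = motivation_score([5, 5, 4], [2, 3, 2])  # negative -> intrinsic
```

The single subtraction in `motivation_score` illustrates the oversimplification noted above: it collapses two distinct motivation constructs into one number, so a volunteer high on both intrinsic and extrinsic items looks identical to one low on both.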

Conclusion

Measuring participant outcomes in citizen science programs presents theoretical and practical challenges. Blue Thumb's evaluation revealed no significant changes in behaviors or interests, underscoring the limitations of generalized quantitative survey items. Qualitative responses provided additional context, indicating that volunteers possessed pro-environmental behaviors and motivations before joining, which persisted throughout their volunteerism. The literature supports the notion that environmentally focused citizen science volunteers often bring pre-existing high levels of pro-environmental behaviors and interests (Crall et al. 2013; West et al. 2021). Survey tools did not capture significant differences between new and experienced volunteers; however, they identified prevalent pro-environmental behaviors, interests, and motivations among Blue Thumb volunteers. Both statistical and thematic analyses of responses indicate that motivations differ between new and experienced volunteers, aligning with SDT, which suggests intrinsic motivations are more important to long-term volunteer engagement.

Future evaluations should consider longitudinal studies to track changes in motivation and to link these motivations to volunteer retention and engagement. Additionally, qualitative data can enhance quantitative findings and provide further context for citizen science programs. In summary, although citizen science has the potential to influence learning outcomes, evaluating those outcomes remains a challenge. Common frameworks and shared tools offer valuable insights, but more in-depth mixed-methods studies may provide a deeper understanding of participant outcomes over time.

Data Accessibility Statement

The data used in this study contain sensitive information about the study participants, who did not provide consent for public data sharing. The current approval by the Oklahoma State University Institutional Review Board (reference IRB-21-35-OFF) does not include data sharing.

Supplemental File

The supplemental file for this article can be found as follows:

Supplemental File 1:

Appendix A: Blue Thumb Volunteer Survey. DOI: https://doi.org/10.5334/cstp.683.s1

Ethics and Consent

This research was approved by the Oklahoma State University Institutional Review Board (IRB-21-35-OFF). Informed consent was obtained from all participants before they were surveyed.

Acknowledgements

We are thankful to the Blue Thumb citizen science practitioners who contributed to this work and whose stories and passion allowed for great insight into the program. We appreciate the support from Blue Thumb staff: Rebecca Bond, Cheryl Cheadle, Candice Miller, Becky Zawalski, and Kim Shaw. Additionally, thanks to the dissertation committee that reviewed and edited the original document: Dr. Toni Ivey, Dr. Dan Shoup, and Dr. Jim Long.

Funding Information

This research was supported by an NSF Advancing Informal STEM Learning project (DRL-1811506) awarded to Oklahoma State University. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF.

Competing Interests

CO is a board member for the Association for Advancing Participatory Sciences (AAPS) and volunteered for Blue Thumb before, during, and after the study was conducted. All other authors have no competing interests.

Author Contributions

CO: Conceptualization, Methodology, Analysis, Investigation, Original draft writing, Review and editing; NC: Conceptualization, Methodology, Project administration, Review and editing, Supervision, Funding acquisition.

DOI: https://doi.org/10.5334/cstp.683 | Journal eISSN: 2057-4991
Language: English
Submitted on: Oct 5, 2023 | Accepted on: Apr 26, 2024 | Published on: Jul 2, 2024
Published by: Ubiquity Press

© 2024 Cheyanne Olson, Nicole Colston, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.