
JOURNAL OF RESEARCH IN SCIENCE TEACHING

VOL. 50, NO. 7, PP. 773–801 (2013)

Research Article
Changes in Participants' Scientific Attitudes and Epistemological Beliefs
During an Astronomical Citizen Science Project
C. Aaron Price (1) and Hee-Sun Lee (2)
(1) American Association of Variable Star Observers, Museum of Science and Industry, Chicago, Illinois
(2) University of California, Santa Cruz, California
Received 21 March 2013; Accepted 11 May 2013

Abstract: Citizen science projects provide non-scientists with opportunities to take part in scientific
research. While their contribution to scientific data collection has been well documented, there is limited
research on how participation in citizen science projects may affect participants' scientific literacy. In this study, we
investigated (1) how volunteers' attitudes towards science and epistemological beliefs about the nature of
science changed after six months of participation in an astronomy-themed citizen science project and (2) how
the level of project participation related to these changes. Two main instruments were used to measure
participants' scientific attitudes and epistemological beliefs and were administered before participants registered for
the program and six months after their registration. For analysis, we used pre- and post-test data collected
from 333 participants who responded to both tests. Among them, nine participants were randomly chosen for
interviews. Participants' responses were analyzed using the Rasch Rating Scale Model. Results show that
overall scientific attitudes changed positively, p < 0.01. The change was strongest in attitudes towards
science news and citizen science projects. The scientific attitudinal change was related to participants' social
activity in the project. There was a negative change in their evaluation of their own knowledge. The interviews
suggest that this is due to a greater appreciation for what they have yet to learn. Epistemological beliefs about
the nature of science significantly improved from the pre- to the post-tests, p < 0.05. Overall, we found that
volunteers' participation in the social components of the program was significantly related to their improvement
in scientific literacy while other project participation variables (such as the amount of data contributed to the
project) were not. © 2013 Wiley Periodicals, Inc. J Res Sci Teach 50: 773–801, 2013
Keywords: citizen science; informal science; nature of science (NOS); science literacy

Most people spend the majority of their lives outside of school, yet science learning in non-school settings is often overlooked (National Research Council [NRC], 2009). Informal learning
opportunities can support lifelong learning (Dierking, Falk, Rennie, Anderson, & Ellenbogen,
2003; Falk, Storksdieck, & Dierking, 2007), engage underrepresented populations in science
(Center for Informal Learning and Schools [CILS], 2005) and create personal relationships with
science (NRC, 2009) in ways different from formal science education opportunities. Informal
science education is a rapidly growing field of research. Recent calls for more research on informal
science learning have been made by the National Science Board (NSB, 2008) and the National
Research Council (NRC, 2009).
Contract grant sponsor: National Science Foundation DRL award; Contract grant number: 0840188.
Correspondence to: C. Aaron Price; E-mail: aaron.price@msichicago.org
DOI 10.1002/tea.21090
Published online 11 June 2013 in Wiley Online Library (wileyonlinelibrary.com).
© 2013 Wiley Periodicals, Inc.


Citizen science, defined as research collaborations between scientists and volunteers (Cornell
Ornithology Lab, 2009), is an increasingly popular venue for informal science education
(Cohn, 2008). While the scientific contributions of citizen science participants have been
documented, such as contributing to the large-scale database of migrating bird populations
(Hand, 2010) and discovering new galaxies (Christian, Lintott, Smith, Fortson, &
Bamford, 2012), there is limited research addressing how citizen science projects impact
volunteers' science literacy (Conrad & Hilchey, 2011; Mueller, Tippins, & Bryan, 2012;
Silvertown, 2009).
In this study, we investigate volunteers involved in an astronomical citizen science project
called Citizen Sky where participants collaborate with scientists to monitor and analyze data about
a bright variable star. Aligned with Miller (1983, 1998, 2004), we used a civic-oriented scientific
literacy framework and measured two scientific literacy elements: attitudes towards
science-related activities and epistemological beliefs about the nature of science. The research
questions of this study are:
How did volunteers' attitudes towards science-related activities and epistemological
beliefs about the nature of science change over six months of participation in an
astronomy-themed citizen science project?
How do patterns of project participation account for these changes?

We first introduce the role citizen science plays in informal science education and categorize
citizen science projects in terms of their participants' level of involvement. We then describe
Citizen Sky as our research context and provide rationale for measuring scientific literacy as an
important outcome for citizen science projects. Next, we characterize our research methods
including subjects, test designs, data collection and analysis. In the results section, we compare
pre-/post-test differences in volunteers' attitudes towards science-related activities and epistemological beliefs about the nature of science and use interview data to explain some of those
differences. Finally, we discuss findings and implications for citizen science participants, program
designers, and researchers, along with limitations of the study.
Citizen Science and Science Education
Citizen science projects, at their core, provide an organized venue for non-scientists to take
part in scientific endeavors ranging from passive (such as running a computer screen saver to
process scientific data) to active engagement (such as amateur astronomers searching for and
discovering new planets). The spectrum of citizen science projects has been categorized into many
different and often overlapping categories (Brandt, Shirk, Jordan, Ballard, & Tomasek, 2010;
Ely, 2008; Wiggins & Crowston, 2011). Citizen science can also be seen as a type of
crowdsourcing: the process of using an extremely large number of participants to accomplish a
focused task (Howe, 2006). Some citizen science projects, such as the large-scale GLOBE project
(GLOBE, 2011; Penuel & Means, 2004), take place in classrooms. However, this study is focused
on out-of-classroom projects as their implementations are considerably different due to the lack of
environmental controls and limitations on instructional strategies.
Most citizen science projects are driven by their scientific goals with little attention to their
educational impact on participants (Mueller et al., 2012). As a result, there are a limited number of
educational research studies available in the literature. The vast majority of published articles are
descriptions of projects and/or their targeted scientific accomplishments. This need for empirical
research drives the study reported in this paper. With limited research about participants'
learning through citizen science projects, there is a lack of evidence to guide the design of future
citizen science projects (Jordan, Ballard, & Phillips, 2012).
Citizen science plays an important role in the expanding field of informal science
education (NRC, 2009; Ucko, 2010) and is seen as a leading trend in new projects funded under
the informal science education program at the National Science Foundation (NSF, 2010). It
allows participants to get involved in many different scientific practices while remaining
focused on a single goal or outcome (Roth & Barton, 2004). It can often promote sustained
engagement over decades of a person's life (Price & Paxson, 2011). These characteristics
resonate well with free-choice learning, a critical element in development of lifelong scientific
literacy (Falk & Dierking, 2010; Falk et al., 2007). At the same time, this flexibility creates
unique challenges in sustaining participant interest (Nov, Arazy, & Anderson, 2011a),
balancing the reality of data collection in uncontrolled environments with the fidelity to
established scientific standards (Fore, Paulsen, & O'Laughlin, 2001; Ottinger, 2010) and, for
education researchers, developing rigorous research methodologies that are not intrusive into
the participant experience.
One of the unique characteristics of citizen science is its focus on individual agency. Unlike
traditional laboratory-style science education programs, citizen science projects allow participants to build their own view of the project rather than seeing it exclusively through the
eyes of science (Roth & Barton, 2004). It is also an example of "personal-curiosity science,"
a term coined by Aikenhead (2005) to describe scientific learning that is of personal interest to
the student. Citizen science, with no ties to any classroom and no direct profit for the participant,
relies on this personal interest to drive participation and sustain activity. Aikenhead (2005)
believes that this type of personal connection is central to the success of science education writ
large.
This individual empowerment increases the value of social opportunities in citizen
science projects. Since there is no formal classroom or physically shared space, it is difficult to
build a community of practice among fellow participants. Communication between project
scientists and participants can be challenging (Greaves, 2012). The scientists often have to
serve as the teacher or mentor, a role for which they may or may not be trained, based on very
little interaction with participants. This sometimes leads to a top-down flow of instructions
from scientists to participants without requiring much independent thought and contribution
by the participants (Ely, 2008; Mueller et al., 2012). The Internet has helped to overcome this
barrier by making communication easier, at least for projects that are based online. An
increasing number of online citizen science projects have turned to asynchronous discussion
forums to create such virtual communities of practice among scientists and participants
(Khare, Zevit, & Shirk, 2012a; Raddick et al., 2010; Roy et al., 2012). Other projects have
begun partnering with museums and science centers to allow participants to interact directly
with project scientists (Khare, Zevit, & Shirk, 2012b; Time Out Chicago, 2012). As citizen
science increases in popularity, both in terms of diversity of projects and number of
participants, finding creative ways to build supportive communities becomes more and more
important.
The role of participants differs substantially between citizen science projects, which
may determine the type and quality of participants' experience and the learning outcomes after the
projects are completed. The Center for the Advancement of Informal Science Education's Public
Participation in Scientific Research inquiry group proposed three models of citizen science
projects based on the participation level of the volunteers (Brandt et al., 2010). The models are
idealized in that not all citizen science projects perfectly fit into a specific category, but they are
useful as benchmarks along the continuum of citizen science projects.

The Contributory Model


Contributory citizen science projects use participants mostly as data collectors in distributed
networks. This is the most common type of citizen science project (Karrow & Fazio, 2010). We
further divide this category into two types of contributory models because the participant
experience can be significantly different based on a single factor: whether they were actively or
passively contributing data to the project.
Passive Contributory Model: After the initial recruitment phase, participants are asked to
monitor equipment that automatically collects data and transmits them to a central
repository. One of the most successful projects of this type is the Berkeley Open
Infrastructure for Network Computing (BOINC; Anderson, 2003), where the typical
participant runs a computer screen saver to process project data.
Active Contributory Model: Projects in this category actively engage participants in the
process of data collection and/or data processing. Participants are required to make
decisions such as how often to collect data and when to deviate from suggested protocol.
The most popular projects in this category involve monitoring wildlife, including birds
(Brossard, Lewenstein, & Bonney, 2005; Evans et al., 2005; Wee & Subaraj, 2009),
insects (Howard & Davis, 2004; Phillips, 2008) and turtles (Somers, Matthews, &
Carlone, 2009). But it is also common in astronomy (Ferris, 2002; Percy, 1999) and
meteorology (Cifelli, 2005).

To locate empirical studies on how the contributory model impacts citizen science
participants, we used Google Scholar to look for a variety of specific and vague terms. Examples
are "scientific literacy citizen science," "scientific literacy informal science education," "citizen
science education research," "citizen science studies," and, finally, simply "citizen science" by
itself. However, our search did not locate published empirical educational studies of projects
within this model.
The Collaborative Model
In collaborative citizen science projects, participants are involved in developing descriptions
and explanations, or performing basic forms of initial analysis. The most popular example of this
type of model is the Galaxy Zoo project. The participants categorize photographic data to classify
galaxies into groups based on their appearance, and later their results are further analyzed by
professional astronomers. In addition, volunteers are encouraged to come up with explanations for
anomalous galaxy shapes. One volunteer's comments on a picture led to the famous discovery of a
unique galaxy structure known as Hanny's Voorwerp (Cardamone et al., 2009).
Most of the citizen science empirical research literature involves projects that belong to the
Collaborative Model. In a study of The Birdhouse Network, researchers measured change in
scientific attitudes, understanding of the scientific process and the knowledge of birds by project
participants and found no change in attitudes or understanding of the scientific process (Brossard
et al., 2005). However, Brossard et al. (2005) did detect an increase in knowledge of ornithology.
In another ornithology project called The Seed Preference Test, researchers analyzed communication between participants and project organizers to look for evidence of scientific thinking
(Trumbull, Bonney, Bascom, & Cabral, 2000). While the researchers at the project did find
evidence of scientific thinking in the communication, they could not isolate the project impact
from other influences due to their study design. In addition, the researchers did not find any
relationship between scientific thinking and the amount of data contributed by the participant.
Another study analyzed e-mail communications between participants and project staff (Evans
et al., 2005) and found that changes in participants' knowledge of birds were not apparent in e-mail
communications but were indicated through interviews with a sample of the participants.
The Co-Created Model
The co-created model is sometimes referred to as "participatory action research" (Cornwall
& Jewkes, 1995). In this model, volunteers do everything from defining research questions to
publishing results. These more demanding citizen science projects are often used as examples of
distributed thinking (Hand, 2010). This is a relatively new category pioneered by groups such as
the Bossa project (BOINC, 2009). There are relatively few active projects in this field (Bonney
et al., 2009; Curren, 2013). This category is likely to grow (Roy et al., 2012) and there have been
calls for involving citizens in these types of more authentic research roles (Cooper, Dickinson,
Phillips, & Bonney, 2007; Lakshminarayanan, 2007; Wright, 2010).
We found some empirical studies about the scientific output of co-created model projects on
participants but none about their educational impact. One group of volunteers was trained to both
take samples and do their own analysis (alongside professionals who did their own independent
analysis) of water quality in a river shed over 3 years, with published results of their analysis
(Wilderman, 2004). Recall also that in the Galaxy Zoo project one participant discovered their own
galaxy type to great fanfare (Wright, 2010). That particular participant's initiative elevated their
experience to one similar to the co-created model in that they discovered anomalous data,
investigated it, and published the results (with professional guidance). However, there have been no
published studies about how these projects, or any other Co-Created project, impacted learning.
The focus has been on the scientific results.
Research Context: The Citizen Sky Project and Epsilon Aurigae
Citizen Sky (www.citizensky.org) was a 3-year, astronomical citizen science project launched
in June 2009. The hosting organization was the American Association of Variable Star Observers
(AAVSO), a citizen science organization that has been collecting astronomical data from amateur
astronomers since 1911. The project was organized according to a set of design principles that
were codified based on literature and the collective experience of AAVSO staff in running citizen
science projects.
Design Principle 1: Use a Context Where Volunteers' Contribution Is Necessary and
Meaningful for Their Scientific Inquiry
The scientific goal of the project was to engage the public in monitoring a rare eclipse of a
very bright star, epsilon Aurigae. The star was so bright that the majority of professional
observatories could not observe it with their sensitive instruments, but it is ideal for public
observation, even from bright cities. The educational goal of the project was to increase general
scientific literacy by involving participants in every stage of the scientific process. That is,
participants were asked to do more than simply collect data. They made their own hypotheses,
performed data analysis, tested theories and models, and eventually even published papers in a
peer-reviewed astronomical research journal (Percy, 2012).
Design Principle 2: Provide Internet Resources to Help Volunteers Interact With Peers
and Scientists
The Citizen Sky project was mostly coordinated via a central web site. Anyone could read the
site's content, but participants who wished to actively contribute to the project had to register for
an account. There were a number of ways participants could interact with the project. They could
supply data by making brightness estimates of epsilon Aurigae and submitting them to a central
database. They could explore the data using a variety of analysis and modeling tools available on
the web site. For social interactivity, the site had nine asynchronous online forums, which were
moderated and facilitated by project staff. Also, synchronous live chats were held roughly every
month. These free-for-all style chat sessions were scheduled around special topics and/or guests
(usually professional astronomers). Ultimately, the participants were given the tools and other
support to form collaborative teams focused around a mini-research project. Sometimes project
topics were suggested by staff and sometimes the teams came up with their own. Their mission
was to identify a topic, do their own research, and develop a paper for submission to a peer-reviewed astronomical publication. A professional liaison was provided to each team to answer
questions and give advice. Example team topics included developing a software package to
statistically analyze submitted data, testing the use of consumer DSLR cameras to acquire data
and designing visualizations based on the acquired data for use in press releases and other outreach
venues. In the citizen science classification regime described in this paper, we place Citizen Sky in
the co-created category.
Design Principle 3: Actively Involve Scientists in a Role of Teaching and Communication
One of the most important motivating factors in citizen science is direct and frequent
interaction with professional researchers (Doering, Miller, & Veletsianos, 2008). Citizen Sky
applied this principle by creating an environment with many channels of communication with the
professional astronomical world. A project scientist (Ph.D. astronomer) maintained an interactive
blog to keep participants informed of the latest research of epsilon Aurigae from professional
journals and conferences. Also, a graduate student spent roughly 20 hours a week interacting with
participants via the project web site, providing them with feedback, advice, and general support.
Additional external scientists were invited to contribute blog posts about their own research into
the star. Finally, the online chats provided a chance to interact directly, in a synchronous manner,
with scientists (and each other).
Design Principle 4: Support Participants for Analyzing and Presenting Their Own Data
This principle was based on the concept of experiential learning (Kolb, 1984), which
postulates that effective learning is based on a transformative experience. The experience here is
the collection and analysis of data by participants working on a real and pressing research
problem. Using simulated or modeled data in an educational context can introduce misleading
ideas about the scientific experience (Hollow, 2010). With their own data, participants had the
same connection to the data as a researcher; in fact, they became researchers in every sense of the
term. In Citizen Sky, after they submitted data, participants were shown real-time graphs of their
data superimposed over the data of other participants, so they could see how their data compared.
They were also provided with tutorials and GUI tools to analyze the data, either separately or
combined with data from others. Ultimately, many published their results at amateur and
professional astronomy meetings and in journals.
Design Principle 5: Encourage Participants to Become an Active Member of a Research
Community
The most advanced and successful citizen science projects in recent AAVSO history were
ones in which the participants worked as a team. Constructivist approaches to science education
also have shown the benefits of group work in classroom laboratory projects (Cummings &
Kiesler, 2005), but remote projects need extra attention to foster this type of collaboration (Corter
et al., 2007). As a result, the Citizen Sky web site had a section dedicated to the formation of
different teams working on their own research project. These were private areas where team
members could communicate, share documents and chat. The formation of teams was fostered by
staff during the second year of the project. But team participation was not mandatory. About 23
teams eventually formed.
These design principles are not unique to the Citizen Sky project. The first two principles are
now considered common sense for the construction of new, online citizen science projects. The
third design principle, working closely with scientists, is becoming more common with many
online citizen science projects. The Galaxy Zoo project, which has spawned a collection of other
projects beneath the umbrella ZooUniverse organization, has maintained a very active section of
their web site where scientists interact with participants. The fourth design principle is common in
active citizen science projects where participants generate their own data, such as the Christmas
Bird Count. The fifth principle, working as a community, is perhaps the most unique to the Citizen
Sky project. Many projects involve teams to collect data, such as the Mount Rainier Citizen
Science Team (Bacher, 2011), but the teams tend to be highly localized and focused (Roy
et al., 2012). We found none that supported teams involved in the broader process of creating their
own research questions, analyzing data, and ultimately publishing their results. While none of these ideas
is individually unique to Citizen Sky, collectively they are unique because all of the design
principles were applied to a single project.
Theoretical Framework: Scientific Literacy
One of the most heralded promises of citizen science projects is to support the application of
scientific thinking to everyday life. This is mainly because participants are engaged in a project
that has become a part of their life outside of school or profession (Bonney et al., 2009; NRC,
2009). This may help overcome some of the significant transfer issues that exist in science
education (Gilbert, Bulte, & Pilot, 2011) by establishing the home as part of the participant's
learning environment extended beyond formal science education settings. That home is also both
its own community and a part of a larger community. Roth and Barton (2004) propose that
scientific literacy and citizen science are linked through these communities, which are better able to
address complex scientific issues than relying on individuals who were taught merely to follow
directions. The complexity also makes learning more authentic: ". . . expecting one set of relations
(institutional school) to prepare students for a world of many relations does not make sense" (p.
2237). The participant's role in the project is influenced not just by personal interest but also by
their community (e.g., the amount of time they can spend on the project is related to family and work
demands). In this way, citizen science is unique in that it is an authentic scientific learning
experience that takes advantage of the support structure tailored for the participant in their home,
and thus can overcome the limitations of classrooms or laboratories.
Scientific literacy in a citizen science context is a collective concept (Roth & Lee, 2005) that
requires the integration of non-scientific considerations (Sadler & Zeidler, 2009) within a
personally meaningful context (Holbrook & Rannikmae, 2007). Citizen science projects are
generally created by organizations outside of the classroom, adapted to fit within non-scientific
limitations provided by everyday life and ultimately fated to the personal interest of the
participants. Thus, we have chosen to focus on a definition of scientific literacy closely aligned
with the concept of civic engagement (Shen, 1975) and with an interest in its collective
development by participants in the project. This differs from more traditional definitions that
describe scientific literacy as a combination of scientific knowledge and awareness of scientific
practices (NRC, 1996) or broad definitions such as that in the Project 2061 Benchmarks for
Scientific Literacy that include civic elements, but as one component of a much larger picture that
includes habits of mind and the ability to observe and reflect on science (American Association for
the Advancement of Science, 1993).

Miller (1983, 1998, 2004) described civic-based scientific literacy through three elements:
(1) Vocabulary of science (science content): The vocabulary of basic scientific constructs
needed to read and understand competing views from a popular science news source
(e.g., The New York Times Tuesday Science Section).
(2) Understanding of scientific inquiry (nature of science): The process or nature of
scientific inquiry.
(3) Attitudes towards organized science and knowledge (attitudes towards science): The
social impact of science on the individual and society.

A 2004 meta-analysis of the international scientific literacy studies shows that about 17% of
the US population was scientifically literate as described by this definition (Miller, 2004). Civic-based scientific literacy has been shown to be especially influenced by education in informal
settings (Falk & Dierking, 2010).
We assume that most participants of this project already had a command of scientific
vocabulary at the level of fluency used by Miller's (1998, 2004) studies (e.g., being able to correctly
categorize astrology as scientific or not, or knowing whether lasers focus sound or light waves). In
addition, the participants were able to critically read the science section of a newspaper, or else it
would not have been possible to understand any of the recruitment materials or sign up for the
project. Thus, our measures were focused on the second and third elements of Miller's definition: the
understanding of scientific inquiry (epistemological beliefs about science) and attitudes towards
organized science and knowledge (science-related activities).
According to Bonney et al. (2009), in order to measure scientific literacy, citizen science
projects can utilize project participation data (e.g., data submission logs), pre- and post-surveys,
analysis of e-mail and listserv messages, self-report surveys, focus groups and interviews. This
study collected participation data via the web site and collected data from pre- and post-surveys
and interviews. We treated interviews as a secondary data source to explain some of the findings
from the pre-/post-test analysis from the participants' point of view. In another study, we analyzed
online discourse of participants to identify emergent patterns of inquiry in their participation
(Price, Borland, & Lee, 2012).
Methods
Instrument Design
Scientific Attitude Instrument. An attitude instrument was assembled to match the unique
characteristics of an older, informal science audience. It needed to be constrained in length, focus
on the use of science in everyday life, and include questions that would measure behavior unique
to a citizen science audience, such as the pursuit of science news and attending scientific talks:
two hobbies not often found in the general population. There are a total of nine items answered
with a 5-point Likert scale consisting of Strongly Disagree, Disagree, Neutral, Agree, and
Strongly Agree categories. See Table 1 for item details. The reliability of the instrument was high,
Cronbach's α = 0.95.
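As a point of reference for readers who want to check such a reliability figure, here is a minimal sketch of the standard Cronbach's alpha calculation for a set of items scored 1-5; the data frame, column names, and simulated responses are hypothetical, not the project's data.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert items scored 1-5 (rows = respondents)."""
    items = items.dropna()                      # listwise deletion of incomplete responses
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: one column per attitude item, scored 1-5.
rng = np.random.default_rng(0)
demo = pd.DataFrame(rng.integers(1, 6, size=(100, 9)),
                    columns=["SEEK", "ATTEND", "OTHER", "KNOWLEDGE", "EVALUATE",
                             "ALREADY", "EVERYDAY", "NEWS", "INTEREST"])
print(f"alpha = {cronbach_alpha(demo):.2f}")
```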
NSKS Instrument. Items for measuring epistemological beliefs about the nature of science
(Table 2) were based on the Nature of Scientific Knowledge Scale (NSKS) established by Rubba
and Andersen (1978), hereafter referred to as the NSKS instrument. We use the term
"epistemological belief" because we feel it is flexible enough to reflect that attitudes, feelings, and
understanding change and are somewhat subjective.

Table 1
Scientific attitude individual item difficulties, based on Rasch analysis

                                                   Item Difficulty (Logits)   SE_rasch        Outfit
Item (Abbreviation)                                Pre        Post            Pre     Post    Pre     Post
I actively seek out stories about astronomy
  in the news. (SEEK)                              1.05       1.23            0.09    0.09    1.03    0.90
I am likely to attend a science seminar,
  class or talk. (ATTEND)                          0.81       1.17            0.10    0.09    1.45    1.21
I plan to participate in other citizen science
  projects in the future. (OTHER)                  0.69       1.54            0.10    0.09    0.86    0.87
I am knowledgeable about science. (KNOWLEDGE)      0.55       0.29            0.12    0.09    1.26    0.83
I use knowledge of science to evaluate claims
  made about science. (EVALUATE)                   0.04       0.29            0.11    0.11    0.94    0.82
I will pay attention if an astronomy news item
  crops up in a media source I am already
  following. (ALREADY)                             0.46       0.49            0.11    0.11    0.75    0.82
I use knowledge of science in everyday life.
  (EVERYDAY)                                       0.67       0.28            0.11    0.09    1.25    1.23
I am interested in news about astronomy. (NEWS)    0.84       0.17            0.11    0.11    0.67    0.68
I am interested in science. (INTEREST)             1.07       0.85            0.12    0.11    0.5     0.48

Other words, such as "knowledge" or "awareness," imply a hard reality the participant is being judged
against and oversimplify what constitutes the nature of science, a term that stirs strong emotions in many
academics. While it is not a perfect term, we feel it best represents what we are attempting to measure in
this study. There is no agreed-upon definition of the nature of science, but there is consensus that ". . . it is
related to epistemology and values and beliefs for scientific knowledge" (p. 409) and how that knowledge
is developed, refuted, and changed (Ozgelen, 2012). The original NSKS instrument was validated
by scientists and teachers and is commonly used in science education research (Bloom, 2008). It
was chosen over other NOS instruments because of its extensive pedigree and also because it is a
survey instrument. We piloted an open-ended instrument, which was strongly resisted by the
participants due to its length. In fact, they rebelled on our public discussion forums at the length of
the instrument. This is a common problem with informal science settings when participants are
expected to be somewhat entertained in addition to educated. The original NSKS
included 48 items grouped into six categories of the nature of science (amoral, creative,
developmental, parsimonious, testable, and unified). Each category included four positively stated
items and four negatively stated items. To constrain the length of the overall test (in response to the
pilot study), we omitted all negative items leaving four items per category for a total of 24 items in
this study. See Table 2. The NSKS used a 5-point Likert scale consisting of Strongly Disagree,
Disagree, Neutral, Agree, and Strongly Agree categories. The overall reliability for the NSKS test
for this study was high, Cronbach's α = 0.94, and was in general agreement with previous
validation work on the original NSKS instrument, despite its shortened length (Rubba &
Andersen, 1978; Meichtry, 1994).
Table 2
NSKS individual and sub-category item difficulties, based on Rasch analysis

Item Categories and Items
(a) Amoral
  The applications of scientific knowledge can be judged good or bad; but the knowledge itself cannot.
  Even if the applications of a scientific theory are judged to be good, we should not judge the theory itself.
  A piece of scientific knowledge should not be judged good or bad.
  It is incorrect to judge a piece of scientific knowledge as being good or bad.
(b) Creative
  Scientific knowledge is a product of human imagination.
  A scientific theory is similar to a work of art in that they both express creativity.
  Scientific laws, theories, and concepts express creativity.
  Scientific knowledge expresses the creativity of scientists.
(c) Developmental
  We accept scientific knowledge even though it may contain error.
  Those scientific beliefs which were accepted in the past and since have been discarded should be judged in their historical context.
  Scientific knowledge is subject to review and change.
  Today's scientific laws, theories, and concepts may have to be changed in the face of new evidence.
(d) Parsimonious
  There is an effort in science to keep the number of laws, theories, and concepts to a minimum.
  Scientific knowledge is comprehensive as opposed to specific.
  Scientific knowledge is stated as simply as possible.
  If two scientific theories explain a scientist's observations equally well, the simpler theory is chosen.
(e) Testable
  Consistency among test results is a requirement for the acceptance of scientific knowledge.
  A piece of scientific knowledge will be accepted if the evidence can be obtained by other investigators working under similar conditions.
  Scientific laws, theories, and concepts are tested against reliable observations.
  The evidence for scientific knowledge must be repeatable.
(f) Unified
  Biology, chemistry, and physics are similar kinds of knowledge.
  The various sciences contribute to a single organized body of knowledge.
  The laws, theories, and concepts of biology, chemistry, and physics are interwoven.
  The laws, theories, and concepts of biology, chemistry, and physics are related.

Note: Sub-category standard errors are the standard errors of all items in the group added in quadrature.

Interview Protocol. The interview protocol included questions about past participation in
other citizen science projects ("Have you participated in any other astronomical project similar to
Citizen Sky?"), level of participation in this project ("What do you feel your role is in the Citizen
Sky project?"), views on the various categories of the NSKS (e.g., "Do you see the creative aspect
of the nature of science on display in the Citizen Sky project?"), and a few questions
tailored to any anomalous responses each participant had to specific items. There were a total of 11
planned questions, plus the tailored follow-up questions. Interview transcripts were analyzed by
listing verbatim transcripts of all answers to each common question separately and looking for
emergent trends and differences. For the individually tailored questions, the transcripts were
analyzed separately by comparing each participant's interview responses to their survey responses to see if there
was any explanation for the anomalous responses.
Subjects and Data Collection
As of February 1, 2012, the project's web site had 6,491 registered users (participants). Of
them, 3,180 had taken the pre-test as part of the project registration process. They self-identified as
78% male, 19% female, and 3% did not choose a gender. The mean age was 41 years old
(SD = 16). The gender ratio and age skew towards an older, more male audience, which is typical
for the amateur astronomy community. Sky & Telescope magazine, the magazine of record for the
community, reports that 95% of its subscriber readership is male with a mean age of 51 (New Track
Media, 2010). About a quarter of participants in this study reported no prior experience in
astronomy. About 61% of the participants had a bachelor's or higher degree, which was below that
of subscribers reported by Sky & Telescope magazine (77%).

When participants in the project first registered via the web site, they were given the
opportunity to take the pre-test that included the Scientific Attitude instrument and the shortened
NSKS instrument along with many other instruments. They were compensated with placement in
an annual drawing for a gift card. Most participants knew very little about the project at this point
and had not formally participated in it at any level. The pre-test was offered to 6,491 participants,
of whom 3,180 opted to complete at least a portion of it (49%). They were invited to take the post-test during their first login to the web site after 6 months had passed since registration. Ultimately,
365 participants were offered the post-test and 333 opted to complete at least a portion of it (91%).
The difference between the number of participants offered the pre-test and post-test is likely due to
two factors. First, there was a high dropout rate and many people did not return to the project
6 months later. Second, participants were not required to log into the site unless they needed to
submit data or post to a forum. And web site login was a requirement to take the post-test (so we
could match pre- and post-tests). So most participants who did return later did not need to log in.
The gender distribution between the post-test group and the pre-test group is similar, but the post-test group's mean age was about 6 years older. Six months of project activity was chosen as a
measurement point because, based on project staff experience, most participants would have already
reached their peak involvement at that point. In order to reach participants who were no longer
active in the project, we also sent private e-mail invitations to those who had registered for the web
site 6 months prior but had not logged in during the previous 3 months.
Fourteen participants who took both the pre- and post-tests online were randomly invited to
participate in an interview session, of whom nine accepted. They ranged in age from 18 to 64.
Eight were male and one female. Their education experience ranged from high school graduates to
one with a Ph.D. Their astronomy experience ranged from novice to professional; however, most
were in the intermediate category. The interviews were conducted via the telephone or Internet
communication software such as Skype, Google Voice, and Yahoo Messenger. They were each
compensated with a gift card. The interview durations ranged from 25 minutes to 1 hour and
15 minutes with an average duration of 40 minutes.
Data Analysis
Coding. For each item on the two instruments, we scored responses by assigning a 1 for
Strongly Disagree, 2 for Disagree, 3 for Neutral, 4 for Agree, and 5 for Strongly
Agree. Unanswered questions were treated as missing data.
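A minimal sketch of this scoring step, assuming the raw responses are stored as text labels; the item columns and example responses below are hypothetical.

```python
import pandas as pd

# Scoring scheme described above; anything outside the scale (blank, skipped) stays missing.
LIKERT_SCORES = {"Strongly Disagree": 1, "Disagree": 2, "Neutral": 3,
                 "Agree": 4, "Strongly Agree": 5}

def score_responses(raw: pd.DataFrame) -> pd.DataFrame:
    """Convert text Likert responses to 1-5 scores; unanswered items become NaN (missing)."""
    return raw.apply(lambda col: col.map(LIKERT_SCORES)).astype(float)

# Hypothetical example with one unanswered item.
raw = pd.DataFrame({"SEEK": ["Agree", "Strongly Agree"], "NEWS": ["Neutral", None]})
print(score_responses(raw))
```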
Rasch Analysis. Likert scores were set on an ordinal, non-interval scale. This non-interval
nature presents many complications for parametric analysis (Carifio & Perla, 2008;
Jamieson, 2004; Knapp, 1990) which assumes equal intervals between two adjacent scores. To
address this complication, the responses to the Likert scale were converted into a single interval
scale through Rasch analysis (Rasch, 1960) based on the Rating Scale Model (RSM;
Andrich, 1978; Muraki, 1990; Wright & Masters, 1982), which is often used by science education
researchers (Boone & Scantlebury, 2005). The RSM can be described through the following
equation (Andrich, 1978; Linacre, 2002):
log(P_nik / P_ni(k-1)) = B_n - D_i - F_k

where P_nik is the probability that participant n, on encountering item i, would be observed (or would
respond) in category k (item response), while P_ni(k-1) is the probability that the response would be
in category k - 1. B_n is the amount of the trait that participant n has on a single numerical
scale. In the measurement community, the amount of a trait is often referred to as an ability
estimate even though some constructs, such as attitudes towards science, are not directly associated

with the concept of ability. We use "ability" or "ability estimates" hereafter in this paper to mean the
amount of the trait an individual has for the scientific attitude and nature of science constructs. D_i is
the difficulty of item i, on the same single numerical scale. And F_k is the difficulty of producing a
category k response relative to a category k - 1 response on the same scale. Participant ability estimates
(ability) and item difficulty estimates are placed on the same scale with normalized values
typically ranging from -4.0 to 4.0 logits (log-odds units).
The application of the RSM Rasch analysis generates an ability estimate for each person and
an item difficulty estimate for each item. The higher the person ability estimate the more the
person has the ability to endorse an item. That is, the person with an ability estimate of 2.0 on the
scientific attitude scale has higher scientific attitude than a person with 1.0. Similarly, higher
item difficulties are associated with increasing difficulty for an item to be endorsed. That is, an
item with an item difficulty estimate of 2.0 is more difficult and thus requires a greater amount of
ability to endorse than an item with 1.0. Therefore, whether an item, i, will be endorsed by a
person, n, (endorsement likelihood) with a response category of k depends on the difference
between the person's ability and the difficulties associated with the item and the response
category.
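To make the model concrete, the sketch below turns a person ability, an item difficulty, and a set of category thresholds into Rating Scale Model category probabilities. It only illustrates the equation above; the ability, difficulty, and threshold values are invented for the example, and the study's actual estimates came from the Rasch analysis described next.

```python
import numpy as np

def rsm_category_probs(b_n: float, d_i: float, thresholds: np.ndarray) -> np.ndarray:
    """Rating Scale Model category probabilities for one person-item encounter.

    b_n: person ability (logits); d_i: item difficulty (logits);
    thresholds: F_k for k = 1..K (K = 4 steps for a 5-category Likert item).
    Implements the adjacent-category form log(P_nik / P_ni(k-1)) = b_n - d_i - F_k.
    """
    steps = b_n - d_i - thresholds                            # adjacent-category log-odds
    cumulative = np.concatenate(([0.0], np.cumsum(steps)))    # exponents for categories 1..K+1
    probs = np.exp(cumulative)
    return probs / probs.sum()

# Hypothetical values: ability 1.0 logit above the item difficulty, illustrative thresholds.
probs = rsm_category_probs(b_n=1.0, d_i=0.0, thresholds=np.array([-2.0, -0.5, 0.5, 2.0]))
for score, p in zip(range(1, 6), probs):
    print(f"P(response = {score}) = {p:.3f}")
print(f"expected rating = {np.sum(np.arange(1, 6) * probs):.2f}")
```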
Rasch analysis was conducted using the Winsteps software program (Linacre, 2010). We only
included responses from those who took both the pre- and post-tests. We first applied the RSM to
pre-test responses. Then, we applied RSM to the post-test responses using the item and the person
scales as anchors to equate both tests on the same scale, which is the traditional method to compare
pre- and post-tests using Rasch analysis (Bond & Fox, 2007).
To check for psychometric validity and reliability on the two instruments, we used fit statistics
to evaluate how well the data fit the Rasch RSM model. For this analysis, the outfit statistic was
used. A proposed range for acceptable fit statistics for polytomous data is 0.6–1.4 (Wright &
Linacre, 1994) for rating scale items. No items were omitted from either test according to this
criterion. For person ability estimates, 17 participants on the attitude test and seven participants on
the NSKS test were omitted because their person outfit statistics were above 1.4. We kept participants
with outfit statistics <0.6 since the model is overfitting (too good a fit). More detailed Rasch
analysis results will be discussed in the results section.
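As a small illustration of that screening rule (not the authors' actual code), the sketch below drops persons whose outfit mean-square exceeds 1.4 while keeping overfitting persons below 0.6; the table of person statistics is hypothetical.

```python
import pandas as pd

# Hypothetical person-level Rasch output: one row per participant with an ability estimate
# and an outfit mean-square statistic.
persons = pd.DataFrame({"participant_id": [1, 2, 3, 4],
                        "ability": [0.8, 1.6, -0.2, 2.4],
                        "outfit": [0.95, 1.72, 0.41, 1.10]})

# Omit misfitting persons (outfit > 1.4); overfitting persons (outfit < 0.6) are retained.
kept = persons[persons["outfit"] <= 1.4]
print(kept)
```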
Demographic Variables. Age was computed as the difference between the participant's
supplied birth date and 2009 (the start of the project). Based on a mean split of the
computed age, we created a dichotomous age variable: everyone below the mean (41 years) was
assigned a 0 and everyone at and above the mean was assigned a 1. Gender was assigned as a 1
for a male and a 0 for a female. Astronomy Experience was created as a categorical variable, coded as
low if the participant chose none or novice, medium if they chose intermediate, and
high if they chose advanced or professional.
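A sketch of this coding, assuming the registration records are available in a table; the input column names (birth_year, gender, self_rating) are hypothetical.

```python
import pandas as pd

def code_demographics(df: pd.DataFrame) -> pd.DataFrame:
    """Derive the Age (dichotomous), Gender, and Astronomy Experience variables."""
    out = df.copy()
    out["age"] = 2009 - out["birth_year"]                    # project start year minus birth year
    out["age_dichotomous"] = (out["age"] >= out["age"].mean()).astype(int)  # mean split (41 years in the study)
    out["gender_code"] = out["gender"].map({"male": 1, "female": 0})
    experience_map = {"none": "low", "novice": "low", "intermediate": "medium",
                      "advanced": "high", "professional": "high"}
    out["astronomy_experience"] = out["self_rating"].map(experience_map)
    return out

# Hypothetical registration records.
coded = code_demographics(pd.DataFrame({"birth_year": [1970, 1985, 1955],
                                        "gender": ["male", "female", "male"],
                                        "self_rating": ["novice", "advanced", "intermediate"]}))
print(coded[["age_dichotomous", "gender_code", "astronomy_experience"]])
```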
Project Participation Variables. Web server and database logs recorded a variety of
participant activity which we turned into variables in an attempt to look for relationships with the
instrument data. First, we created the following variables.
Team Participant was assigned a 0 if the participant was not formally a part of an official
Citizen Sky team and was assigned a 1 if they were part of a team. Joining a team
consists of signing up on the Citizen Sky web site to have access to a specific teams
private area and requires approval from the team leader.
Active Observer was assigned a 0 if the participant had never submitted a variable star
brightness estimate to the central database and a 1 if they had submitted at least one
estimate.

Chat Join was assigned a 0 if they never visited a live, synchronous online chat session
or a 1 if they had visited at least one live chat.

Post Count was assigned a 0 if the participant had never posted a message to a Citizen
Sky online, asynchronous discussion forum and a 1 if they had posted at least once to a
forum.

Since participating in chat sessions or posting messages on forums are considered social
activities, we combined these two variables into a variable called Participant Communication to
reflect how active participants were in communicating with other participants. This variable had
three categories: low, medium, and high. The low category consisted of participants who had never
joined a chat or posted a message in a forum. The medium category consisted of participants who
had either joined a chat or posted at least one message to a forum. The high category consisted of
participants who had both joined a chat and posted to a forum. For the analysis, we used three
variables: Team, Active Observer, and Participant Communication to represent various levels of
project participation. All project participation variables were categorical.
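A sketch of how these categorical variables could be derived from the web site's activity logs; the log column names are hypothetical, and the rules simply implement the 0/1 and low/medium/high definitions given above.

```python
import pandas as pd

def code_participation(df: pd.DataFrame) -> pd.DataFrame:
    """Derive the categorical project-participation variables from per-participant activity counts."""
    out = df.copy()
    out["team"] = (out["teams_joined"] > 0).astype(int)             # Team Participant: 0/1
    out["active_observer"] = (out["observations"] > 0).astype(int)  # at least one brightness estimate
    chat = (out["chats_joined"] > 0).astype(int)                    # Chat Join: 0/1
    post = (out["forum_posts"] > 0).astype(int)                     # Post Count: 0/1
    # Participant Communication combines the two social activities into low/medium/high.
    out["participant_communication"] = (chat + post).map({0: "low", 1: "medium", 2: "high"})
    return out

# Hypothetical web-server log summary, one row per participant.
logs = pd.DataFrame({"teams_joined": [0, 1, 0], "observations": [12, 0, 3],
                     "chats_joined": [0, 2, 1], "forum_posts": [0, 5, 0]})
print(code_participation(logs))
```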
Repeated Measures ANCOVA Analysis. Repeated measures ANCOVAs were used to
investigate the main and interaction effects of project participation variables on differences in
participants' scientific attitudes and NSKS ability estimates between the pre- and post-tests. The
independent variables in the analysis were related to the Participant Communication, Team, and
Active Observer variables. Covariates were related to individual characteristics such as the
Astronomy Experience, Gender, and Age variables.
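The analysis itself was run in a statistical package; as a rough, hedged illustration of the design, the sketch below fits an analogous mixed model (a random intercept per participant standing in for the repeated-measures structure) on simulated long-format data, with the time-by-factor interactions corresponding to the M × P, M × AO, and M × T terms reported in Tables 3 and 4. All variable names and values are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated long-format data: two rows per participant (time = "pre" or "post"), a Rasch
# ability estimate, the participation factors, and the covariates.
rng = np.random.default_rng(1)
n = 120
base = pd.DataFrame({
    "participant_id": np.arange(n),
    "communication": rng.choice(["low", "medium", "high"], size=n),
    "observer": rng.integers(0, 2, size=n),
    "team": rng.integers(0, 2, size=n),
    "age_dichotomous": rng.integers(0, 2, size=n),
    "astronomy_experience": rng.choice(["low", "medium", "high"], size=n),
    "gender": rng.integers(0, 2, size=n),
})
long_df = pd.concat([base.assign(time="pre"), base.assign(time="post")], ignore_index=True)
long_df["ability"] = rng.normal(1.0, 0.8, size=len(long_df)) + 0.3 * (long_df["time"] == "post")

# Random intercept per participant; time-by-factor interactions test whether pre-to-post change
# differs across participation groups, with the demographic variables entered as covariates.
model = smf.mixedlm(
    "ability ~ time * communication + time * observer + time * team"
    " + age_dichotomous + astronomy_experience + gender",
    data=long_df, groups=long_df["participant_id"],
).fit()
print(model.summary())
```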
Interviews. Post-test interviews were transcribed into a spreadsheet so patterns could be easily
identified among responses to the same question(s). The patterns were analyzed to characterize
participants' perspectives on results from the pre- and post-tests. For example, we looked for
comments that might explain why some items showed significant change between pre- and post-tests
while other items did not. We also asked questions about any anomalous answers in the
participants' survey results as compared to the rest of their responses. This was to examine if the
differences were due to personally held epistemological beliefs or misunderstanding of the items
on the tests. Every interviewed participant was asked about each category of the nature of science,
but we only present analysis of the responses in which we detected a clear pattern or link to the
survey data.
Results
Pre-Test Descriptive Statistics
Our descriptive analysis includes all 3,180 responses to the pre-test. Overall responses to the
survey questions were skewed. About 78% of all raw scores across all items lie between neutral
and strongly agree on both instruments. This is not surprising considering these were volunteers in
a citizen science project who are naturally motivated to participate in scientific activities and have
strong epistemological beliefs about science. According to the Rasch analysis results on the
attitude pre-test (Table 1), the easiest item, with an item difficulty value of 1.05, was "I actively
seek out stories about astronomy in the news" (hereafter referred to as the SEEK item; see
Table 1 for the code words used for subsequent items). The INTEREST item was the most difficult
item with the item difficulty of 1.07. While still quite positive, the responses to the NSKS pre-test
(Table 2) items were less negatively skewed than the attitude pre-test items. On the NSKS pre-test,
the easiest item to endorse was "There is an effort in science to keep the number of laws,
theories, and concepts to a minimum," which had an item difficulty value of 0.82. The most
difficult NSKS items were "Today's scientific laws, theories, and concepts may have to be changed

in the face of new evidence" and "The evidence for scientific knowledge must be repeatable," both
of which had item difficulty values of 0.93. Table 2 also lists an average item difficulty value for each
of the six sub-categories of NSKS epistemological beliefs. The easiest NSKS sub-category was
"creative," with an average item difficulty value of 0.50, and the most difficult was "testable," with
an average of 0.66. See detailed item difficulty values and other Rasch analysis results in Tables 1 and 2.
Changes in Participants' Scientific Attitudes and Epistemological Beliefs About the Nature of
Science
Analysis of change between the pre- and post-test is based only on responses from the 333
participants who took both tests.
Attitude Towards Science. To identify significant changes between pre- and post-tests in
scientific attitudes, we compared the difference in logit units between person ability and item
difficulty in each of the nine scientific attitudes items (Figure 1). When the difference variable
becomes zero the item will be endorsed by 50% of the participants. Positive values in the
difference variable mean more than 50% chance of participant endorsement and negative values
mean less than 50% chance of participant endorsement. Figure 1 indicates that, at the pre-test, the
SEEK item was the most likely item to be endorsed by participants and the INTEREST item was
the least likely. At the post-test, the differences significantly decreased, meaning the SEEK,
ATTEND, OTHER, EVERYDAY, NEWS, and INTEREST items became easier to endorse,
p < 0.05. There were no significant changes for ALREADY. There was a significant increase for
KNOWLEDGE and EVALUATE, meaning they became more difficult to endorse. The item with

Figure 1.

Participants endorsement likelihood is determined by the distance between person ability and item difficulty
on the scientific attitudes scale. The bars indicate the average distance in each of the nine scientific attitude items. The
smaller the bar, the more likely the participants endorsed the item.

Journal of Research in Science Teaching

788

PRICE AND LEE

the biggest overall change is the OTHER item, which is about the participants likelihood to join
other citizen science projects.
As seen in Table 3, ANCOVA results indicate that there was a significant change from pre to
post in overall person ability estimates for the scientific attitude variable. The Active Observer
variable was not significant, and neither was the Team variable. There were no significant
interaction effects for either with the Time variable. However, Participant Communication had a
significant main effect on the scientific attitudes variable. A Tukey's post hoc test indicated that the
high participant communication group was significantly different, both at the p < 0.05 level, from
low and medium communication groups. This means that, coming into the project, those who had
higher Participant Communication tendencies (i.e., became most active in the chat room and
online discussion forums) had more positive attitudes at both testing times. There is a significant
interaction effect between the Participant Communication variable and the Time variable (the
repeatedly measured scientific attitude variable), meaning that those who were actively involved
with Participant Communication changed to a greater extent from pre-test to post-test than those
who did not. There was no significant difference between the low and medium Participant
Communication groups on either test.
Of the covariates, only Astronomy Experience was significantly related to the overall
scientific attitudes variable. The Age and Gender variables were not significantly related to the
overall scientific attitudes variable. A Tukey's post-hoc analysis of the Astronomy Experience
variable on the scientific attitude variable showed significant differences between the low and
medium experience groups, p < 0.05, and between the low and high experience groups,
p < 0.01. There was no difference between the medium and high experience groups, p 0.76.
Epistemological Beliefs About the Nature of Science. Figure 2 shows which NSKS item
categories changed significantly between pre- and post-tests. There were significant positive
changes in creative, parsimonious, amoral, and testable categories. There was no significant
change in the developmental category. The overall NSKS ability estimates of those who took both
pre- and post-test showed significant improvement between the two time points, according to
Table 4, p < 0.05.

Table 3
Analysis of covariance on scientific attitudes

Source                            df     F        Partial Eta Squared    p
Fixed effects
  Time (M)                        1      18.6     0.166                  0.000
  Participant communication (P)   2      3.76     0.075                  0.027
  Active observer (AO)            1      0.353    0.004                  0.554
  Team (T)                        1      1.49     0.016                  0.225
  M × P                           2      3.38     0.076                  0.025
  M × AO                          1      2.55     0.058                  0.061
  M × T                           1      0.072    0.001                  0.789
Covariates
  Age                             1      0.311    0.003                  0.578
  Astronomy experience            1      15.0     0.138                  0.000
  Gender                          1      0.575    0.006                  0.45

*p < 0.05. **p < 0.01.


Figure 2.

Participants' endorsement likelihood is determined by the distance between person ability and item difficulty
on the scientific attitudes scale. The bars indicate the average distance in each of the six NSKS subcategories. The smaller
the bar, the more likely the participants endorsed the item.

As with the attitude test, repeated measures ANCOVA was used to look for differences in
various groups of participants (Table 4). No significant main or interaction effects of project
activity or individual characteristic variables were detected.
Table 4
Analysis of covariance on beliefs in nature of science

Source                            df     F        Partial eta squared    p
Fixed effects
  Time (M)                         1     4.02     0.042                  0.048
  Participant communication (P)    2     0.216    0.005                  0.806
  Active observer (AO)             1     0.149    0.001                  0.106
  Team (T)                         1     0.000    0.000                  0.997
  M × P                            2     0.458    0.010                  0.634
  M × AO                           1     0.760    0.008                  0.428
  M × T                            1     0.363    0.004                  0.548
Covariates
  Age dichotomous                  1     0.694    0.008                  0.407
  Astronomy experience             1     0.074    0.001                  0.786
  Gender                           1     1.95     0.021                  0.166

*p < 0.05.

Post Interviews
Social Communication. The interview data provide insight into some of the results detected in the tests. The attitude item with the greatest change is the OTHER item, which could be due to the selection effect related to participants who are already interested in citizen science. However, this increase could also be related to their positive experience in Citizen Sky. Further validation of this interpretation could be done by comparing with other citizen science projects, for which we do not have data. The item with the second-greatest change is the NEWS item. The other news-related item, SEEK, also showed an increase. Our interview data shed some light on a possible connection between the two, due to participants sharing news stories with each other via our online forums. When asked about changes in their news reading activities, three participants referred to posts to our discussion forums as new sources of news:
Participant 4: I tend to read specific [news sources] that allow me to gain as much info as
quickly as possible . . . I will tend to read Citizen Sky [forum] posts at night when things are a
bit quieter.
Participant 5 (referring to news updates from project staff): I've always eagerly read any of those [forum] posts from citizen sky.
Participant 9: To some degree CS has led me more into the blogosphere and the web with
regard to news. At the same time I've had to enhance my way of critically reading such news
and be able to try to deal with the sources it is coming from . . . the forums have had
discussions with regard to the validity of sources and methods.

In addition to the items related to news reading, the interviews suggest that working with others makes the project more interesting and also allows participants to look at things from new
perspectives. Regarding this increased interest, two participants commented on the importance of
community and the collaborative nature of the Citizen Sky project, which was one of the design
principles:
Participant 2: Just in participating in it I've learned things about interacting with other
people and assumptions about sort of the knowledge and the interpretation skills of other
people.
Participant 5: [The project has] become a pretty important part of my life (laugh). These people ... it's partly the people too, it's not just the science, it's the combination of the two. One thing I've really learned is that when you are doing science it is really helpful to have a team. It is really helpful to be able to throw ideas out there and bounce them around each other and have people with different expertises that can clarify things that you might not have understood completely or to see something in a different way than someone else did.

Recall that the social aspect of the project was related to change in attitudes on the surveys. These two interview responses suggest that it also played an important role in helping participants see themselves as part of the scientific team.
Self-Perception Regarding Their Knowledge. The KNOWLEDGE and EVALUATE
items show the only decreases in endorsement. However, six of the nine interviewees stated
that their knowledge increased or was otherwise unaffected through participation in the
project:
Participant 1: I'm learning more about variable stars as a whole.

Participant 4: I have a better appreciation of the different kinds of variable stars and some of the characteristics of their light curves. I'm learning more about data analysis in this context as a result of asking questions and experimenting with data.
Participant 7: I always felt before that variable star observing was too advanced for me. That along with the thought that [it] sounds boring. It's not so boring now.
Participant 9: I don't think my knowledge of astronomy has been affected, no. But there has been a big effect on my knowledge of how to do astronomy. (emphasis theirs)

There was no statement in any interview where a participant suggested they were not learning or were having trouble learning. An explanation for the discrepancy between what they said and what the test results show could lie in how the KNOWLEDGE item was worded ("I am knowledgeable about science."). That wording could be interpreted as a question about efficacy as opposed to knowledge. That is, participants are gaining more knowledge, but this is also opening their eyes to how much more they have to learn ("the more you know, the more you don't know"). This was suggested by one of the participants:
Participant 2: There have been many instances where I think people's awareness of limitations of past knowledge has been increased. One of the overwhelming feelings I get is how much we don't know. So, many times it's more like in what direction to move your boundaries of ignorance, as opposed to your boundaries of knowledge.

Creativity in Science. The NSKS Creative category had the greatest increase in endorsement likelihood. The interview responses show consistent evidence that the project does involve broad creativity. All nine participants interviewed gave different examples of creativity in the project. Of them, one participant described their own problem-solving skills as creative, but most of the examples of creativity involved watching other people's application of creativity:
Participant 3: In the part I participated in, I was just gathering data so other people could do the creative part of explaining the data. So I didn't do much of the creative stuff but there are definitely other people who did.
Participant 4: In general, discussions in the CS forums in which people are coming up with
alternative explanations for aspects of eps aur.

The interviewer noted in their field notes that, when asked about the creative items, participants often had to pause and think more than they did for the other categories. Also, early in the project a particularly heated discussion occurred in the project forums over the topic of creativity in science. Investigation of the individual items found that, in general, the creative category of items was the easiest to endorse in both pre-test and post-test (Table 2). Also, participant endorsement likelihood indicators changed the most within the creative category (see Figure 2). One participant in the online debate described the difficulty of reconciling knowledge and creativity:
Online Forum Participant 1: Scientists are creative when they develop theories, but is knowledge creative? No, knowledge is knowledge. It's like saying water freezes at 32 F, I'm being creative by telling you this.

This participant's focus on the specific terminology is likely an example of issues many had
with those questions. However, they also establish a difference between the presence of creativity
at the beginning of the scientific process as opposed to the end. This is supported by many of the
other participants in the discussion:
Online Forum Participant 2: While the scientic (sic.) method calls for strict reasoning and logic
when checking hypotheses and making inferences, at the very begging (sic.) of a new theory,
there is always an informed guess. A piece of intuition, imagination, some creative event.
Online Forum Participant 3: Of course, logic and mathematics are the means by which the
world can be abstracted, represented and understood as models. But before models can be
created and tested against evidence, surely imagination and creativity play a role. Einstein
imagined (emphasis theirs) what it would be like to ride on a light beam, wondering how
the world would look at close to the speed of light.
Online Forum Participant 4: No matter how mundane, devising experiments (and creating
(sic.) models) must surely often require creativity also. So, I think science is a mix of both
creativity and logic. It's a human endeavour (sic.) after all.

Parsimony in Science. The other NSKS item category that was most easily endorsed by participants was the parsimonious category of items. The interviews suggest participants did not completely understand the word's definition:
Participant 1: . . . this is more difficult question than I thought . . .(laugh)
Participant 5: What does that mean? (laugh)
Participant 7: That's a bit harder to wrap my head around.

To better describe the concept, the interviewer referred to Occam's Razor (summarized as "when confronted with two explanations of equal accuracy, choose the simpler"), which is commonly quoted in popular astronomy literature. After the definition was cleared up, everyone interviewed exhibited support for the parsimonious nature of science. Most often they cited the evolving theory of epsilon Aurigae and also the ongoing development of a Theory of Everything in the physics community.
Reinforcement. The change in the epistemological beliefs detected in the surveys is
translational, meaning the center of the distribution moved from lower to higher points in the scale
without changing the shape of the distribution or changing the locations of items on the scale. The
relative ranking of the six sub-categories of the nature of science did not change. This suggests a
reinforcement of beliefs, which is also evidenced in language used in the interviews. The language illustrates where participants built on prior knowledge in phrases such as "I'm learning more about variable stars as a whole," "I have a better appreciation [of experimentation]," "[my interest] in science is greater," and "I was reading about astronomy and physics and other science stuff even before Citizen Sky." In none of the interviews was there any discussion about fundamental changes in how participants view the nature of science. Instead, the comments were about filling in gaps of knowledge and supporting prior epistemological beliefs.

Discussion
There are two main overall findings addressing each of the two elements of scientific literacy
we studied. First, attitudes towards science increased and this increase was significantly related to
participant communication. Second, epistemological beliefs about the nature of science also
increased, but we did not detect a relationship with project participation levels. These two points together indicate that participants' attitudes towards science-related activities can be
influenced by their positive social experience with others in citizen science projects. However,
epistemological beliefs about science are less subject to the project participation experience, possibly because epistemological beliefs are personal beliefs and thus harder to change after
participating in only one citizen science project. Finally, whether the participant submitted data or
was a member of a formal team did not have a direct relationship with changes in scientific
attitudes or beliefs about the nature of science.
Attitude Towards Science
Interest in astronomy and science was very high on all of our attitude measures, as
expected for a volunteer science project. Yet we still detected a significant change in the scientific
attitude ability estimate as a whole from pre to post. Considering that other citizen science
projects have not reported any change in scientific attitude (Brossard et al., 2005), our finding is
noteworthy. The difficulty with finding changes in attitudinal constructs has been claimed to
be related to the sensitivity of the instruments used in other citizen science projects (Brossard
et al., 2005), and we agree. The change we detected came mostly through reinforcement of existing epistemological beliefs, and it was likely detectable because of the more sensitive analysis procedure we used. Thus we conclude that participation in a citizen science project alone is not likely to change overall attitudes towards science to a large degree,
probably due to the fact that those entering such projects already have very positive
attitudes. However, it can build on existing positive attitudes towards science and improve them
further.
The analysis procedure we used in this study, involving Rasch analyses of test responses
combined with ANCOVAs using project participation variables, allowed us to uncover more
significant item-related effects and could serve as an important analytical framework for future
projects. Six of our attitude items became significantly easier to endorse. The OTHER item's increase suggests participants are more likely to participate in other citizen science projects in the future. The NEWS item's increase is related to how participants share news stories and news sources with each other via the project web site and online forums. This is supported by
the relationship between the survey scores and the Participant Communication variable. Also,
the interviews provide insight into how that relationship manifested itself in the project (e.g.,
through sharing of online news sources). Modern conventional wisdom is that collaboration and
social factors are key to learning about science (Vygotsky, 1964), including online science
learning (Linn, Davis, & Bell, 2004). The KNOWLEDGE survey item became more difficult to
endorse. Interview data suggest that this is not due to participants losing (forgetting) knowledge but rather that they are becoming more aware of how much they do not know. This important result speaks to the fundamental relationship one has with science as participants' scientific horizons expand. The EVALUATE survey item also became more difficult to endorse, which we believe is related to the lowered self-perception found in the KNOWLEDGE item. That is, participants' trust in their ability to apply scientific thinking to daily life decreased because
they feel they understand a smaller fraction of the scientific field than they did when they took the
pre-test.

Epistemological Beliefs About the Nature of Science


Epistemological beliefs about the nature of science increased significantly in this project.
However, the relative rankings of the various NSKS categories did not change much between the
pre- and post-test. This suggests that previous epistemological beliefs are being reinforced, rather
than restructured. If they were being restructured, we would expect to see different levels of
change between the various categories to such a degree that their relative importance would
change; for example, creativity may suddenly become the most important element in participant
epistemological beliefs. Instead, categories with the strongest beliefs remained the strongest and
vice versa. Instead of structural difference, there was an across-the-board difference.
The creative category showed the greatest amount of change. A discussion about creativity
was one of the most active and controversial threads in our online forums. That debate was more
about where creativity exists in the scientific process (restricted to the beginning or infused
throughout) rather than whether it is important at all. The amoral category also showed significant
change and also was an active topic in the online forums, mainly through a heated discussion
thread about global warming. Because our interview responses show confusion with vocabulary,
we believe the parsimonious change is largely due to syntax and item wording rather than anything
related to the project. The change in the testable category may be associated with the ongoing
presence of a staff blog on the web site that continually discussed current models of the star system,
while comparing them with new data and slowly constraining the cloud of uncertainty around the
source of the eclipse. Through this process, participants saw their data in action in a very visible way and saw how those data were being used to solve the core scientific question of the project.
There was a drop in epistemological belief that the goal of science is to create universal laws
across various domains. This was a topic that was not addressed much at all in the project, which
was focused almost exclusively within the domain of astronomy.
Our results are the first in the literature to show a change in epistemological beliefs in the
nature of science through a citizen science project. One previous study reported high
epistemological beliefs about the scientific process by their participants, but they could not
attribute the beliefs to participation in the project (Trumbull et al., 2000), so it could be a reflection
of the same selection effect we found in our pre-test results. Another study found no difference in
understanding of the scientific process between those who participated in a citizen science
project and a control group (Brossard et al., 2005). They suggested that since the primary
motivation to join the project was an interest in the projects content (birding), participants did
not view the project in a scientific light but rather as a knowledge-building exercise (i.e., an
educational hobby). This is backed by Raddick et al. (2010), which reported that only 12% of the
Galaxy Zoo participants joined due to an interest in science or discovery. The main reason to
join was interest in the subject matter, astronomy, at 46%. On the other hand, the Citizen Sky
project specifically refers to the scientific process throughout its training procedures. In fact,
Citizen Sky recruitment materials focus on the opportunity to participate in all steps of the
scientific method as a selling point to differentiate itself from other citizen science projects. So
this level of meta-cognition may explain why this study finds more of an increase in
epistemological beliefs, since participants went into the project more sensitive to how the entire
process unfolds.
The Role of Social Activity Involvement
One of the important results of this study is the value of the social component of the project.
After testing many different variables, we found that the Participant Communication variable was
the only one related to change in scientific attitudes. Interview transcripts suggest this is due to

participants communicating with each other and using the online forums as a source of
information and news. This could be a very positive development for the emerging field of
citizen science. In the past, one of the challenges of such projects was the isolation in which
participants worked. Nov, Arazy, and Anderson (2011b) found that establishing a community
of volunteers is second only to intrinsic motivation in terms of sustaining commitment to a
project.
One explanation for this link between change in attitude and Participant Communication
could lie in the highly personalized nature of citizen science projects. The sense of agency that drives interest in citizen science also serves as an illustration of Aikenhead's personal-curiosity science, which is part of an enculturation of science education into communities that are directed by
the learners themselves. Citizen science projects have a greater opportunity to build a social
community (as evidenced in the forums) and to empower their participants more than individual or
even classroom-based science projects. This agency stems both from a closer sense of ownership
of the process and its products and, for collaborative and co-created projects, also in the influence
the participant has over the project structure. This sense of increasing control is important to
scientific literacy and leads to citizen science becoming a strand of the learning process, one based
on personal relationships (Roth & Lee, 2005). The sense of community and personal
empowerment is fostered by an active community where the participant has a role beyond that of
an anonymous data collector or processor. Also, online and interactive forums can support a more
integrated community by narrowing the barrier between professional staff and participants, which
has been recognized as an important goal by the citizen science community (Raddick et al., 2009).
The Zooniverse project, arguably the most rapidly growing citizen science project ever, has
reported 11,000 members registered for their forums (Raddick et al., 2010). It is a rich community
that supports a deeper investigation of the various Zooniverse projects than the original project
goals intended. And participants have shown the initiative to investigate ideas on their
own (Cardamore et al., 2009; Raddick et al., 2009). This has created its own challenges,
such as determining how to effectively support such a large and active group when faced with
limited resources. Our study suggests that simply providing a forum to let participants communicate
with each other is an easy and effective first step. Yet it is one that most citizen science
projects are not yet taking. This step would go a long way to help put online citizen science projects
in line with the greater science education community's emphasis on collaborative and social
learning.
Other Project Participation Measures
The Citizen Sky project was designed to give participants a chance to participate in every
stage of the scientific process, with the belief that this greater engagement would increase their
scientific literacy. Participants were not required to engage at these other stages, but many did.
Over a dozen papers have been published in a peer-reviewed scientific journal dedicated to
participant projects. However, evidence of how participation levels and type affected change in
literacy has been elusive. For example, one may expect to find a relationship between the number
of variable star observations submitted to the database (the Active Observer variable) and
epistemological belief in the testable aspect of the NSKS test, since the very nature of variable star
observing is to combine lots of data from independent sources. However, we find no such
statistical relationship in our data. Other research on citizen science projects has also found that contributing data did not increase participants' epistemological beliefs in the nature of science
(Trumbull et al., 2000). It may take more than just participation in data collection for participants
to gain new insight into science.

Implications
Citizen science is a field experiencing explosive growth. It has great potential not only to help scientists but also to help educators. This study suggests a number of ways a citizen science
project can be designed to enhance scientific literacy of its participants. But education must be
established early in a project's design as a primary goal in order to implement many of these suggestions. First, we recommend a dedicated social aspect to the project. These data show that participant attitudes were most affected by direct communication with other participants.
Online forums are simple and easy to install on almost any web site platform these days. We
recommend using them along with live chats with project scientists to create a community of
shared knowledge. This social component has the potential to help empower participants, which
others have shown to have a significant impact on overall scientific literacy. This empowerment is
somewhat unique to the personal nature of citizen science projects and additional research in this
area would be compelling. Also, we suggest a direct illustration of how the participant is involved
in the overall scientific process the researchers are conducting (perhaps using metacognitive
strategies embedded in training materials). Other citizen science projects have found that
participants sometimes have trouble understanding their role in the entire project (Evans et al.,
2005) whereas this project emphasized and defined their role very early on, even in the training
materials. Informal science has the luxury of not being restricted by classroom walls, but it cannot
completely ignore the importance of framing the issue. Finally, practitioners should not expect to
see an increase in scientific attitudes or epistemological beliefs simply by having volunteers
participate in data collection. They need to be more involved in other aspects of the project. Our
results, along with those from other projects, show that data collection alone has no established
relationship with changes in scientific attitudes or epistemological beliefs about the nature of
science. As for researchers, we showed how Rasch analysis can uncover results in coarse Likert
data, the type often found in informal science education surveys. Past citizen science projects
may need to apply a more sophisticated analytical model such as the Rasch model to their data and
re-analyze them for hidden results. These relatively easy steps should go a long way in turning a
passive citizen science project into an active one that has the opportunity to change how its
participants view science.
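To illustrate what re-analyzing coarse Likert data with a Rasch-family model can look like in practice, the sketch below estimates person and item measures for a dichotomized response matrix by joint maximum likelihood. It is an illustrative outline only, under simplifying assumptions we have chosen (items dichotomized at an arbitrary cut point, extreme scorers simply dropped); the study itself fit the polytomous Rating Scale Model in Winsteps, which this code does not reproduce.

```python
import numpy as np

def rasch_jmle(X, n_iter=50):
    """Joint maximum likelihood estimation for a dichotomous Rasch model.

    X is a persons x items matrix of 0/1 responses with no missing data and
    no perfect or zero raw scores. Returns person measures (theta) and item
    difficulties (delta) in logits.
    """
    n_persons, n_items = X.shape
    theta = np.zeros(n_persons)   # person measures
    delta = np.zeros(n_items)     # item difficulties

    for _ in range(n_iter):
        # Model-implied endorsement probabilities: P(X = 1) = logistic(theta - delta)
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - delta[None, :])))
        info = p * (1.0 - p)

        # Newton-Raphson update for each person: observed vs. expected raw score
        theta += (X.sum(axis=1) - p.sum(axis=1)) / info.sum(axis=1)
        # Newton-Raphson update for each item (sign flipped for difficulty)
        delta += (p.sum(axis=0) - X.sum(axis=0)) / info.sum(axis=0)

        # Anchor the scale by centering item difficulties at zero
        delta -= delta.mean()

    return theta, delta

# Toy usage: dichotomize 5-point Likert responses at "agree or above" (>= 4).
rng = np.random.default_rng(0)
likert = rng.integers(1, 6, size=(333, 9))      # 333 respondents, 9 attitude-style items
dich = (likert >= 4).astype(int)
scores = dich.sum(axis=1)
dich = dich[(scores > 0) & (scores < dich.shape[1])]  # drop extreme scorers JMLE cannot estimate
theta, delta = rasch_jmle(dich)
print(np.round(delta, 2))                        # item difficulties in logits
```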
Limitations
This study is limited in a number of ways. First, both the scientific attitudes instrument and the epistemological beliefs about the nature of science instrument show a certain degree of ceiling effect because
participants already had an interest in science before joining the project. It is likely that this ceiling
effect has depressed the measured range of responses and may be responsible for the relatively
minor amounts of change detected. Most scientific attitude instruments in the literature are
developed for children and adolescents in formal educational settings and are validated
accordingly. In order to advance research in informal settings, new and targeted instruments need
to be developed that take into consideration the unique aspects of informal science education
audiences. For example, they must not take very long to complete since such audiences are usually
volunteering their personal time (as opposed to formal environments which can require
completion of an instrument). Also, they need to be sensitive to a broad range of prior enthusiasm
for science. A second limitation is related to the representativeness of the sample used in this
study. While we did make a significant effort to contact those who were no longer active in the
project, the majority of the post-tests were filled out by those who would be considered more active
than the average participant. Third, the project did not have a control group to estimate the effect of
the studied project as compared to other citizen science projects. Fourth, the attitude instrument

was created for this project and has not been externally validated in other measurement research; however, the strong Rasch alpha and fit statistics indicate that the underlying latent variable is associated with the items chosen for the scientific attitudes instrument. Fifth, our conclusion that
the change in epistemological beliefs is due to reinforcement rather than restructuring of beliefs is
based on inferences in the survey data and phrases used in the interviews. A more in-depth
interview process would be helpful to provide direct evidence of this. Sixth, the post-test group
was slightly older than the pre-test group, meaning that it may reflect a slightly more mature
population compared to the general population of the project. Finally, the authors of this study
were affiliated with the Citizen Sky project.
Conclusion
This study found that overall attitudes towards science increased through participation in this
citizen science project. The change was greatest in a few specific attitudes related to scientific
news gathering and self-awareness of participants' knowledge. We also found a positive change in
overall epistemological beliefs in the nature of science, but it was slight and mainly through the
reinforcement of previously held beliefs. The change in attitudes towards scientific news gathering
was related to social participation in the project, which also proved to be important regarding how
participants felt about their role in the project and suggests that community building and personal
agency should be considerations in future citizen science projects interested in promoting
scientific literacy. Otherwise, the level and type of project activity the participant engaged in (such
as data collection) was not related to any change.
The data provide a simple and clear answer to our research questions: citizen science projects that provide access to scientists, data analysis tools, and a broad user community have the potential to change participants' attitudes towards science and their epistemological beliefs in the nature of science. This is the first time such a change has been measured in a citizen
science project. However, the amount of the impact is limited and it does not seem to be related
specifically to how they contribute to the project.
The authors would like to acknowledge the assistance of Jennifer Borland, Dr. Eric
Chaisson, Dr. Arne Henden, Dr. Larry Ludlow, Dr. Danilo Marchesini, and Dr. Tim Slater.

References
American Association for the Advancement of Science. (1993). Benchmarks for science literacy. Project 2061. New York, NY: Oxford University Press.
Aikenhead, G. S. (2005). Research into STS science education. Educación Química, 16, 384–397.
Anderson, D. (2003). Public computing: Reconnecting people to science. Presented at the Conference on Shared Knowledge and the Web, Residencia de Estudiantes, Madrid, Spain, November 17–19, 2003.
Andrich, D. (1978). Scaling attitude items constructed and scored in the Likert tradition. Educational and Psychological Measurement, 38, 665–680.
Bacher, K. (2011, March 31). Volunteer program highlight: Citizen science [Web log post]. Retrieved from: http://rainiervolunteers.blogspot.com/2011/03/volunteer-program-highlight-citizen.html
Bloom, M. A. (2008). The effect of a professional development intervention on inservice science teachers' conceptions of the nature of science. Unpublished doctoral dissertation, Texas Christian University, Ft. Worth.
Berkeley Open Infrastructure for Networked Computing. (2009). Bossa. Retrieved from: http://boinc.berkeley.edu/trac/wiki/BossaIntro (Accessed December 9, 2009).
Bond, T. G., & Fox, C. M. (2007). Applying the Rasch model: Fundamental measurement in the human sciences. 2nd Edition. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.

Bonney, R., Ballard, H., Jordan, R., McCallie, E., Phillips, T., Shirk, J., . . . Wilderman, C. C. (2009). Public Participation in Scientific Research: Defining the field and assessing its potential for informal science education. A CAISE Inquiry Group report. Washington, DC: Center for Advancement of Informal Science Education (CAISE).
Boone, W. J., & Scantlebury, K. (2005). Rasch analysis in science education research. Science Education, 2, 253–269.
Brandt, C., Shirk, J., Jordan, R., Ballard, H. L., & Tomasek, T. M. (2010, March). Beyond citizen science: Science learning and public participation in environmental research. Symposium conducted at the meeting of the National Association of Research in Science Teaching, Philadelphia.
Brossard, D., Lewenstein, B., & Bonney, R. (2005). Scientific knowledge and attitude change: The impact of a citizen science project. International Journal of Science Education, 27, 1099–1121.
Cardamore, C., Schawinski, K., Sarzi, M., Bamford, S. P., Bennert, N., Urry, C. M., . . . Vandenberg, J. (2009). Galaxy zoo green peas: Discovery of a class of compact extremely star-forming galaxies. Monthly Notices of the Royal Astronomical Society, 399, 1191–1205.
Carifio, J., & Perla, R. (2008). Resolving the 50-year debate around using and misusing Likert scales. Medical Education, 42, 1150–1152.
Christian, C., Lintott, C., Smith, A., Fortson, L., & Bamford, S. (2012). Citizen science: Contributions to astronomy research. Retrieved from: http://arxiv.org/abs/1202.2577 (Accessed April 4, 2012).
Cifelli, R. (2005). The community collaborative rain, hail, and snow network: Informal education for scientists and citizens. Bulletin of the American Meteorology Society, 86, 1069–1078.
Center for Informal Learning and Schools. (2005). ISIs and schools: A landscape study. Retrieved from: http://www.exploratorium.edu/cils/landscape (Accessed August 14, 2009).
Cohn, J. P. (2008). Citizen science: Can volunteers do real research? Bioscience, 58, 192–197.
Cornell Ornithology Lab. (2009). Defining citizen science. Retrieved from: http://www.birds.cornell.edu/citscitoolkit/about/definition (Accessed August 14, 2009).
Conrad, C. C., & Hilchey, K. G. (2011). A review of citizen science and community-based environmental monitoring: Issues and opportunities. Environmental Monitoring Assessment, 176, 273.
Cooper, C. B., Dickinson, J., Phillips, T., & Bonney, R. (2007). Citizen science as a tool for conservation in residential ecosystems. Ecology and Society, 12, 11 [online]. Retrieved from: http://www.ecologyandsociety.org/vol12/iss2/art11/
Cornwall, A., & Jewkes, R. (1995). What is participatory research? Social Science and Medicine, 41, 1667–1676.
Corter, J. E., Nickerson, J. V., Esche, S. K., Chassapis, C., Im, S., & Ma, J. (2007). Constructing reality: A study of remote, hands-on, and simulated laboratories. ACM Transactions on Computer-Human Interaction, 14, 127.
Cummings, J. N., & Kiesler, S. (2005). Collaborative research across disciplinary and organizational boundaries. Social Studies of Science, 35, 703–722.
Curren, D. (2013). The levels of citizen science involvement: Part 1. Retrieved from: http://www.openscientist.org/2013/01/the-levels-of-citizen-science.html (Accessed January 14, 2013).
Dierking, L. D., Falk, J. H., Rennie, L., Anderson, D., & Ellenbogen, K. (2003). Policy statement of the informal science education ad hoc committee. Journal of Research in Science Teaching, 20, 108–111.
Doering, A., Miller, C., & Veletsianos, G. (2008). Adventure learning: Educational, social, and technological affordances for collaborative hybrid distance education. Quarterly Review of Distance Education, 9, 249–266.
Ely, E. (2008). Volunteer monitoring & the democratization of science. The Volunteer Monitor, 19, 1.
Evans, C., Abrams, E., Reitsma, R., Roux, K., Salmonsen, L., & Marra, P. P. (2005). The neighborhood nestwatch program: Participant outcomes of a citizen-science ecological research project. Conservation Biology, 19, 589–594.
Falk, J. H., & Dierking, L. D. (2010). The 95 percent solution. American Scientist, 98, 486–493.
Falk, J. H., Storksdieck, M., & Dierking, L. D. (2007). Investigating public science interest and understanding: Evidence for the importance of free-choice learning. Public Understanding of Science, 16, 455–469.

Ferris, T. (2002). Seeing in the dark. New York, NY: Simon & Schuster.
Fore, L. S., Paulsen, K., & O'Laughlin, K. (2001). Assessing the performance of volunteers in monitoring streams. Freshwater Biology, 46, 109–123.
Gilbert, J. K., Bulte, A. M., & Pilot, A. (2011). Concept development and transfer in context-based science education. International Journal of Science Education, 33, 817–837.
GLOBE. (2011). Annual Review, 2010. Retrieved from: http://globe.gov/news/articles/2010-globeannual-review (Accessed May 16, 2012).
Greaves, S. (2012). Citizen science musings: The indispensable mentor. Retrieved from: http://citizenscientistsleague.com/2012/04/16/citizen-science-musings-the-indispensable-mentor/ (Accessed April 16, 2012).
Hand, E. (2010). Citizen science: People power. Nature, 66, 685–687.
Holbrook, J., & Rannikmae, M. (2007). The nature of science education for enhancing scientific literacy. International Journal of Science Education, 29, 1347–1362.
Hollow, R. (2010). Using authentic astronomical data in investigations and activities. Retrieved from Commonwealth Scientific and Industrial Research Organisation web site: http://outreach.atnf.csiro.au/education/teachers/resources/
Howard, E., & Davis, A. K. (2004). Documenting the spring movements of monarch butterflies with Journey North, a citizen science program. In K. S. Oberhauser & M. J. Solensky (Eds.), The Monarch butterfly: Biology & conservation. New York, NY: Cornell University Press.
Howe, J. (2006). The rise of crowdsourcing. Wired. Retrieved from: http://www.wired.com/wired/archive/14.06/crowds.html (Accessed March 17, 2010).
Jamieson, S. (2004). Likert scales: How to (ab)use them. Medical Education, 38, 1212–1218.
Jordan, R. C., Ballard, H. L., & Phillips, T. B. (2012). Key issues and new approaches for evaluating citizen-science learning outcomes. Frontiers in Ecology and the Environment, 10, 307–309.
Karrow, D., & Fazio, X. (2010). Education-within-place: Care, citizen science, and ecojustice. In D. J. Tippins, M. P. Mueller, J. D. Adams, & M. Van Eijck (Eds.), Cultural studies and environmentalism (pp. 193–214). New York, NY: Springer.
Khare, D., Zevit, P., & Shirk, J. (2012a). Citizen science community forum. Retrieved from: http://www.citizenscience.org/community/about-forum/ (Accessed May 1, 2012).
Khare, D., Zevit, P., & Shirk, J. (2012b). Citizen science and informal science education institutions. Retrieved from: http://www.citizenscience.org/community/blog/2011/10/25/informal-science-institutions/ (Accessed May 1, 2012).
Knapp, T. R. (1990). Treating ordinal scales as interval scales: An attempt to resolve the controversy. Nursing Research, 39, 121–123.
Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice Hall.
Lakshminarayanan, S. (2007). Using citizens to do science versus citizens as scientists. Ecology and Society, 12, 2. Retrieved from: http://www.ecologyandsociety.org/vol12/iss2/resp2/ (Accessed December 7, 2009).
Linacre, J. M. (2002). Optimizing rating scale category effectiveness. Journal of Applied Measurement, 3, 85–106.
Linacre, J. M. (2010). Winsteps and Facets Rasch Analysis Software. Retrieved from: http://www.winsteps.com (Accessed June 1, 2010).
Linn, M. C., Davis, E. A., & Bell, P. (Eds.). (2004). Internet environments for science education. Mahwah, NJ: Lawrence Erlbaum Associates.
Meichtry, Y. J. (1994). The impact of science curricula on student views of the nature of science. Journal of Research in Science Teaching, 30, 429–443.
Miller, J. D. (1983). Scientific literacy: A conceptual and empirical review. Daedalus, 112, 29–48.
Miller, J. D. (1998). The measurement of civic scientific literacy. Public Understanding of Science, 7, 203–223.
Miller, J. D. (2004). Public understanding of, and attitudes toward, scientific research: What we know and what we need to know. Public Understanding of Science, 13, 273–294.

Mueller, M., Tippins, D., & Bryan, L. (2012). The future of citizen science. Democracy and Education, 20, 112.
Muraki, E. (1990). Fitting a polytomous item response model to Likert-type data. Applied Psychological Measurement, 14, 59–70.
Nov, O., Arazy, O., & Anderson, D. (2011a, February). Dusting for science: Motivation and participation of digital citizen science volunteers. Paper presented at iConference 2011, Seattle, Washington.
Nov, O., Arazy, O., & Anderson, D. (2011b). Technology-mediated citizen science participation: A motivational model. In N. Nicolov & J. G. Shanahan (Eds.), Proceedings of the Fifth International AAAI Conference on Weblogs and Social Media (pp. 249–256). Menlo Park, CA: AAAI Press.
National Research Council. (1996). National science education standards. Washington, DC: National Academy Press.
National Research Council. (2009). Learning science in informal environments. Washington, DC: National Academies Press.
National Science Board. (2008). Science and engineering indicators. Arlington, VA: (NSB 08-01; NSB 08-01A). Retrieved from: http://www.nsf.gov/statistics/seind08/ (Accessed December 7, 2009).
National Science Foundation and Lifelong Learning. (2010, October 5). Retrieved from: http://caise.insci.org/uploads/docs/FINAL-NSF-ISE%20ASTC%202010%20%2810%205%2010%29v2.pdf (Accessed April 17, 2012).
New Track Media. (2010). Reader demographics, rate card #51. Retrieved from: http://www.skyandtelescope.com/about/advertiserinfo/3305436.html?page1&cy (Accessed January 25, 2010).
Ottinger, G. (2010). Buckets of resistance: Standards and the effectiveness of citizen science. Science, Technology and Human Values, 35, 244–270.
Ozgelen, S. (2012). Exploring the relationships among epistemological beliefs, metacognitive awareness and nature of science. International Journal of Environmental and Science Education, 7, 409–431.
Penuel, W. R., & Means, B. (2004). Implementation variation and fidelity in an inquiry science program: Analysis of GLOBE data reporting patterns. Journal of Research in Science Teaching, 41, 294–315.
Percy, J. (1999). Amateur-professional partnership in astronomical research and education. Publications of the Astronomical Society of the Pacific, 111, 1595–1596.
Percy, J. R. (Ed.) (2012). Highlighting Epsilon Aurigae and Citizen Sky. Journal of the American Association of Variable Star Observers, 40, 609–611.
Phillips, A. L. (2008). Of sunflowers and citizens. American Scientist, 96, 375–376.
Price, A., & Paxson, K. B. (2011). The AAVSO 2011 Demographic and Background Survey. Journal of the American Association of Variable Star Observers, 40.
Price, C. A., Borland, J., & Lee, H-S. (2012, March). Scientific competencies and learning in online discourse of a citizen science project. Unpublished paper presented at National Association of Research in Science Teaching 2012 Conference, Indianapolis, IN.
Raddick, M. J., Bracy, G., Carney, K., Gyuk, G., Borne, K., . . . Jacoby, D. (2009). Citizen science: Status and research directions for the coming decade. Retrieved from: http://www8.nationalacademies.org/astro2010/DetailFileDisplay.aspx?id454 (Accessed December 7, 2009).
Raddick, M. J., Bracey, G., Gay, P. L., Lintott, C. J., Murray, P., . . . Vandenberg, J. (2010). Galaxy zoo: Exploring the motivations of citizen science volunteers. Astronomy Education Review, 9, 118.
Rasch, G. (1960). Probabilistic models for some intelligence and achievement tests. Copenhagen: Danish Institute for Educational Research.
Roth, W.-M., & Barton, A. C. (2004). Rethinking scientific literacy. New York, NY: Routledge Falmer.
Roth, W. M., & Lee, S. (2005). Rethinking scientific literacy: From science education as propaedeutic to participation in the community. Annual Meeting of the American Educational Research Association, Seattle, WA. (ERIC Document Reproduction Service No. ED478153).
Roy, H. E., Pocock, M. J. O., Preston, C. D., Roy, D. B., & Savage, J. (2012). Understanding citizen science and environmental monitoring. Retrieved from NERC Centre for Ecology & Hydrology web site: http://www.ceh.ac.uk/products/publications/documents/CitizenScienceReview.pdf
Rubba, P. A., & Andersen, H. O. (1978). Development of an instrument to assess secondary school students' understanding of the nature of scientific knowledge. Science Education, 62, 449–458.

Sadler, T. D., & Zeidler, D. L. (2009). Scientific literacy, PISA, and socioscientific discourse: Assessment for progressive aims of science education. Journal of Research in Science Teaching, 46(8), 909–921.
Shen, B. S. (1975). Views: Science literacy: Public understanding of science is becoming vitally needed in developing and industrialized countries alike. American Scientist, 63, 265–268.
Silvertown, J. (2009). A new dawn for citizen science. Trends in Ecology and Evolution, 24, 467–471.
Somers, A. B., Matthews, C. E., & Carlone, H. (2009). The turtle connection. N. C. Partners in Amphibian & Reptile Conservation. Retrieved from: http://www.ncparc.org/TurtleConnectionSummary.pdf (Accessed December 7, 2009).
Time Out Chicago. (2012). Citizen Science Sunday. Retrieved from: http://timeoutchicago.com/thingsto-do/this-week-in-chicago/15251541/citizen-science-sunday (Accessed May 1, 2012).
Trumbull, D. J., Bonney, R., Bascom, D., & Cabral, A. (2000). Thinking scientifically during participation in a citizen-science project. Science Education, 84, 265–275.
Ucko, D. A. (2010). NSF influence on the field of informal science education. Retrieved from: http://caise.insci.org/uploads/docs/Ucko_%20NSFInfluenceonISE.pdf (Accessed April 23, 2012).
Vygotsky, L. S. (1964). Thought and language. Cambridge, MA: MIT Press.
Wee, Y. C., & Subaraj, R. (2009). Citizen science and the gathering of ornithological data in Singapore. Nature in Singapore, 2, 27–30.
Wiggins, A., & Crowston, K. (2011). From conservation to crowd sourcing: A typology of citizen science. Proceedings of the Forty-fourth Hawaii International Conference on System Science. Retrieved from: http://voss.syr.edu/sites/voss.syr.edu/files/hicss-44.pdf (Accessed March 1, 2011).
Wilderman, C. C. (2004). Portrait of a watershed: Sherman's Creek. A technical status report. 73 pp.
Wright, A. (2010). Managing scientific inquiry in a laboratory the size of the web. New York Times, December 27, 2010.
Wright, B. D., & Linacre, J. M. (1994). Reasonable mean-square fit values. Rasch Measurement Transactions, 8, 370.
Wright, B. D., & Masters, G. N. (1982). Rating scale analysis. Chicago, IL: MESA Press/University of Chicago.
