To illustrate the diversity of UFEs and give realistic examples of using the strategy to guide assessment and/or evaluation, Fig. 2 presents four sample vignettes that highlight different kinds of UFEs and show how one might apply the approach outlined in Fig. 1 to a given UFE. At the end of the paper, we present two of the vignettes in a more detailed narrative, offering examples that synthesize the ideas presented (Expanded Vignettes).
Figure 1. Strategy for Assessment of Undergraduate Field Experiences (UFEs). The strategy is meant to serve as a guide to walk practitioners through assessing their UFE. The green arrows signify that each box informs the others, and that iterative reflection and refinement are key aspects of informed evaluation and assessment. The logic model includes four key components: I) Identifying the intended student and/or programmatic outcomes for the UFE; II) Considering the context of the UFE, which may include any number of factors, such as student context, setting, duration, timing, involvement of one or multiple disciplines, and accessibility of the UFE; III) Defining an assessment approach that is appropriate for the context and in alignment with intended outcomes; IV) Utilizing the outcomes and approach to inform and refine next steps in the UFE.
Identify the Intended Outcomes From the UFE
The main focus of this work is to provide the tools and resources stakeholders need to confidently assess whether students are meeting expected learning outcomes from UFEs (e.g., students expand their knowledge of endemic amphibians; students report an increased interest in environmental sustainability efforts). However, programmatic outcomes and goals (e.g., participants are involved in community engagement and scientific knowledge-building activities) are also critical components of this type of learning environment, and thus are also represented in the example vignettes (Fig. 2).
We draw upon Bloom’s Taxonomy of Learning (Bloom and Krathwohl 1966,
Anderson et al. 2001) to aid practitioners in considering the possible
outcomes from UFEs. The taxonomy describes three fundamental domains of
learning: the cognitive, affective, and psychomotor domains. Studies
about UFEs demonstrate that students may experience outcomes across all
of these domains and more (Boyle et al. 2007, Stokes and Boyle 2009,
Scott et al. 2012, Petcovic et al. 2014, Scott et al. 2019, O’Connell et
al. 2020). Cognitive outcomes from a UFE could include an improved ability to explain plant species interactions, accurately identify geological formations, or solve a problem using an interdisciplinary lens (Fuller et al. 2006, Bauerle and Park 2012, Tripp et al. 2020).
Affective outcomes could include a newfound interest in a subject, such as conservation; motivation to continue seeking out field learning experiences; or development of a connection to place (Boyle et al. 2007, Simm and Marvell 2015, Jolley et al. 2018a, Scott et al. 2019). Outcomes in the psychomotor domain could include an improved ability to geolocate, to collect and measure lake sediment with the appropriate instrumentation and accuracy, or to use established methodology to sample stream invertebrates (Scott et al. 2012, Arthurs 2019). In
addition to considering these three fundamental learning domains, UFEs
may promote student outcomes that span domains and enter the social
realm, such as developing communication skills (Bell and Anscombe 2013),
building friendships and collaborations (Stokes and Boyle 2009, Jolley
et al. 2019), and/or developing a sense of belonging in a discipline
(Kortz et al. 2020, Malm et al. 2020, O’Brien et al. 2020). Lastly, student participation in UFEs could lead to broader, societal-level outcomes, such as students pursuing conservation efforts, contributing to citizen science projects, raising awareness of social justice issues, or supporting sustainability efforts (Grimberg et al. 2008, Bell and Anscombe 2013, Ginwright and Cammarota 2015).
In Table 1, we present a list of common intended student outcomes from UFEs. The list was generated by UFE practitioners: outcomes were first identified through a UFERN landscape study (O’Connell et al. 2020), in which practitioners were surveyed about the student outcomes they expected from their UFEs, and by participants at the 2018 UFERN meeting. We then refined the list by removing outcomes that were redundant, not measurable, or tied to very specific contexts (not universal across field settings), and grouped the remaining outcomes by what we call a ‘primary aim’, an umbrella category for grouping similar intended outcomes.
Table 1 illustrates a diversity of possible and likely outcomes from UFEs spanning multiple domains, but not every conceivable outcome is accounted for, and we encourage practitioners to consider outcomes they do not see in this table if those outcomes align with their UFE.
Interestingly, in O’Connell et al.’s (2020) survey of intended student
outcomes in extended UFEs, the majority of respondents chose outcomes in
the cognitive and/or psychomotor domains. Thus, helping students gain content knowledge and skills is a prominent goal for practitioners of UFEs, but content can also be learned in many other contexts. We and others propose that the distinctive impact of participation in a UFE may actually lie more in the affective domain (Van Der Hoeven Kraft et al. 2011, Kortz et al. 2020). We therefore encourage practitioners to focus less on content-level outcomes and more on the full spectrum of possible outcomes.
Consider the Context of the UFE
UFEs themselves can be highly variable (Lonergan and Andresen 1988,
Whitmeyer et al. 2009b, O’Connell et al. 2020). For example, some are
strictly disciplinary (Jolley et al. 2018b), others interdisciplinary
(Alagona and Simon 2010); they might occur locally (Peacock et al.
2018), in short duration (Hughes 2016), over an entire course (Thomas
and Roberts 2009), or as a summer research experience held at a
residential field station (Hodder 2009, Wilson et al. 2018).
Acknowledging that UFEs come in almost infinite shapes and sizes, we cannot list every contextual variable that may define a particular UFE. However, we do urge practitioners to consider a number of situated contextual factors, such as those highlighted throughout this paper, when using the assessment strategy to make decisions about next steps in assessment or evaluation (Fig. 1). Please note that the strategy is intended to support iterative improvement and reflective practice, not to serve as static scaffolding. We encourage consideration of contextual factors both in thinking about next steps in assessment and evaluation and in iteratively developing and reconsidering the intended outcomes for the UFE.
A forthcoming paper (O’Connell et al., submitted) comprehensively describes and organizes the evidence for how student context factors, such as student identity, prior knowledge, and prior experience, and design factors, such as setting and social interaction, influence learning in UFEs. Our focus here is to provide examples of factors that have a direct impact on assessment strategy.