Suggested Assessment Tools
To suggest assessment tools specifically relevant to UFEs, we developed Table 1, which not only describes common intended student outcomes from UFEs but also provides example assessment tools that might be appropriate for assessing those outcomes. In Table 1 we included only tools that have been peer-reviewed and published. We strongly recommend reviewing the associated peer-reviewed paper before using a tool, as well as searching the literature to see whether others have used the tool and published their findings.
What Are the Next Steps?
We encourage practitioners to treat evaluation and assessment as a reflective, cyclical, iterative process of improvement in UFE design and implementation. There are inevitably going to be aspects of any learning experience that could be improved, and this assessment strategy (Fig. 1) can help practitioners visualize the alignment between intended outcomes, programming, assessment, and evaluation, and how each informs the others. The next steps for many UFEs might be first to report to stakeholders (funders, the institution, etc.) on the outcomes of the UFE. Or, if the goal of the assessment effort was to conduct novel research, then the next steps might be to analyze, write up, and submit the results of the study for peer review, thereby contributing to the growing literature on empirical outcomes from UFEs. For example, the UFE in Fig. 2D will use the results to inform similar UFE design and assessment, the UFE in Fig. 2B will provide pilot data for ongoing publishable projects, and the UFEs in Fig. 2A and C will, in part, leverage results to apply for or validate grant funding. These types of data may be paramount to sustained funding, data-driven advocacy efforts, and/or applications for future funding for continued programming.
An important part of the presented strategy is that it might be used to engage stakeholders in a discussion about what additional questions might be appropriate to ask or what improvements need to be considered. Is there alignment between activities and learning goals? Is the current evaluation strategy accurately measuring what stakeholders expect the students to gain from the UFE? Is the programming intentionally inclusive of the participants’ diverse perspectives and experiences, or could adaptations be made to better serve the UFE population? For example, to address financial and relocation barriers identified through the program evaluation for one field-based REU, the REU leaders introduced new policies so that students could be paid at the start of their experience and identified field research projects located in students’ communities; in another case, accommodations were made for a student’s family to join them as part of the residential field experience (Ward et al. 2018). These examples illustrate how assessment data can be used to inform the design of future UFEs and highlight how the assessment process can be both informative and iterative.