Evaluation

The CEEE team conducts high-quality evaluation that delivers systematic information to enhance projects and document outcomes. Our work is guided by evaluation questions that align with program outcomes. We engage project teams in an iterative and reflective process that supports the refinement of project activities as needed to better meet the needs of project partners and audiences.

We have experience collaborating on large- and small-scale projects for a variety of groups and at all stages of a project, from needs assessment to summative evaluation. We support project teams in proposal writing, developing evaluation plans, integrating project goals and activities, crafting logic models, creating evaluation budgets, and securing approval from the Institutional Review Board, if necessary.

CEEE evaluators bring multidisciplinary backgrounds and training to our partnerships. We work with scientists, educators, and community organizations in formal and informal education settings, with scientific research centers or offices, and with career development programs such as undergraduate research experiences and early career trainings.

We have experience and expertise in survey development, interview and focus group facilitation, social network analysis, website and social media analytics, and program activity tracking. Learn more about our work in the tabs below.

Learn more about Evaluation

Explore our Evaluation 101 pages and our curated Collection of Evaluation Tools.

Our team currently evaluates several NSF-funded Research Experiences for Undergraduates (REU) programs and course-based undergraduate research experiences (CUREs). We work with program coordinators and instructors from the proposal stage through the final program impact report. Evaluation activities can include customizing formative survey questions for program improvement and refinement; using validated survey questions to assess program outcomes, such as science self-efficacy, science identity, and career plans; and applying qualitative measures, such as focus groups, for a deeper understanding of students' experiences in a program or course.

See an example evaluation report from the Research Experiences for Community College Students (RECCS) program.

  • Link to RECCS Report

Our team evaluates curriculum and its impacts on students. We typically evaluate curriculum at a project level, including iterative formative evaluation during the development phase; workshop evaluation of professional development training for teachers using the curriculum; evaluation of student data and teacher implementation data; and web analytics to assess the reach of curriculum use. We have experience with needs assessments in the early stages of a project to determine teachers' needs and help project teams deliver successful and relevant curricula to classrooms.

See an example evaluation report from the HEART Force program:

  • Link to HEART Force Report

Our team currently evaluates several informal education projects. Two examples are described below.  

At the University of Colorado Boulder's Fiske Planetarium, we evaluate video content against funding goals, examine the relationships built through collaborations with video experts, and measure audience impact through surveys. We analyze the reach of the videos by tracking download requests and web analytics. We have also surveyed planetarium managers to learn how the videos were used and received by audiences.

See an example report from the Fiske Explorations project:

  • Link to Fiske report

For the collaborative We are Water (WaW) project, we conducted needs assessment and exhibit usability studies during the development phase of the exhibition, which is currently traveling to rural and Tribal libraries across the Southwest. As the exhibition travels, we are tracking visitor engagement and using feedback to refine associated community events. We also collaborate with an external evaluation team that assesses the impact of the project on library and community partnerships via interviews. In addition, our research group is working with library staff and visitors to understand the impact of the project on the learning ecosystems in their communities.

See an example evaluation report from the We are Water project:

  • Link to WaW report

Our team provides formative and summative evaluation using collaborative approaches with project teams and partners for data centers and other large, multi-organization projects. Two examples are described here. 

The Environmental Data Science Innovation and Inclusion Lab (ESIIL): ESIIL is an NSF-funded data synthesis center led by CU Boulder, in collaboration with NSF's CyVerse (University of Arizona) and the University of Oslo. ESIIL enables a global community of environmental data scientists to leverage the wealth of environmental data and emerging analytics to develop science-based solutions to pressing challenges in biology and other environmental sciences. CIRES CEEE supports this center through collaborative evaluation with the leadership team to assist with development of the center and provide feedback on data synthesis working groups, events, and connections within the ESIIL network. The CIRES team contributes skills and expertise in collaborative evaluation approaches, needs assessment, and social network analysis.

The Navigating the New Arctic Community Office (NNA-CO): NNA-CO is an NSF community program office led by CU Boulder in collaboration with Alaska Pacific University and the University of Alaska Fairbanks. NNA-CO supports the NSF-funded NNA Arctic Research program by building awareness, partnerships, opportunities, and resources for collaboration and equitable knowledge generation through research design and implementation, and by coordinating effective knowledge dissemination, education, and outreach. In coordination with project leadership, CIRES CEEE provides a process evaluation of the team; feedback on events, including a large annual meeting; and tracking methods for project deliverables. We also conduct an ongoing needs assessment and focus group engagement with participating researchers and Arctic community members, which helped define the goals and activities of the community office and assess the extent of change resulting from those activities.

See an example evaluation report from the NNA-CO:

  • Link to NNA-CO Meeting Eval Report

Get involved and stay up-to-date with CIRES CEEE.