Evaluation Resources, especially for Broader Impacts

These materials are meant for "small" evaluations, for research projects where outreach or teaching are not the major focus of the work.  Still, if you are proposing to do an activity for a certain purpose, how will you know when the purpose has been achieved?  How will you convince others that it has been?

WHAT CAN YOU DO FOR YOURSELF?  As a start, the October 20, 2011 program on evaluation is linked from here. (You can also get there from the workshop archives.) The most important thing you'll find there is a link to LOGIC MODELS.  They are "a framework for thinking about evaluation as a relevant and useful program tool"; in other words, they help you figure out what you need to be thinking about.  The Kellogg Foundation has a good guide, and U Wisconsin Extension's is less designerly but has numerous templates that are pretty self-explanatory.

I think this article from Nature (vol. 465, 27 May 2010) puts the need for evaluation in perspective:

[Researcher Leslie-Pelecky] took three female graduate students on weekly visits to local classrooms, where they spent 45 minutes leading nine- and ten-year-old children in practical activities designed to teach them about electricity and circuits. The visitors also talked about their lab work and careers. In addition, [she] did something less typical of broader-impacts efforts: she brought along education researchers to study the effect of this interaction on the children's perception of scientists.

Those assessments were startling, she says. After three months, most of the students said that they still weren’t sure who these young ‘teachers’ were – except that they couldn’t possibly be scientists.

[Without a plan for evaluation, the researchers] “have no idea if they’re making any difference or not.”

But NSF (or any other funder) wants to know!  The idea is to find out whether you're making a difference, and then to demonstrate what kind of difference and how much!

Here is a short "hints & flaws" reminder sheet from the very extensive EvaluATE website.

I've taken another brief overview, aimed specifically at informal science activities, from the National Academy of Sciences Board on Science Education publication Surrounded by Science (2010).  Here is a PDF of all of Chapter 6 (13 pages), "Assessing Learning Outcomes," and if you only have bandwidth for 3 pages from the chapter, here is the section "Things to try" (includes references).

Here is a link to the more extensive NSF User-Friendly Handbook for Project Evaluation (2010).  By all reports, it is truly User Friendly.  

Here's another website with both tips and "perspective": Beyond Rigor (especially focused on measuring broadened participation in STEM).

A free course-evaluation tool for gathering learner-focused feedback can be found at the website for the "Student Assessment of Learning Gains" (SALG).  

And here is a very clear, straightforward evaluation of a STEM outreach program that I found on the UMass website.  I think it nicely illustrates some of the things you can learn from an evaluation, and the steps for discovering them.

Doing an REU?  There's an evaluation website, developed with Howard Hughes money at Grinnell College, that looks very useful.  (I've inserted it into a number of broader impacts components; while I haven't used it personally, it looks very much like systems I have used, and it carries more credibility.)  Check out SURE by D. Lopatto.  At the same website, there are also a couple of more specialized applications, and this 132-page document supporting the value of undergraduate research (including several pages of references).

***In fact, there is so much material to choose from, the task of our office will be to point out the most efficient and effective tools, and help you select the most appropriate ones.***


When you need an objective opinion (and an evaluator's deep expertise)...

We are keeping a running list, currently very short.  Please let us know if you know of people/groups to add.
