Student Self-Reported Learning Outcomes

What are Self-Reported Learning Outcomes?

The “gold standard” in student learning outcomes assessment is the use of “direct” measures of student performance (i.e., evidence of actual student work in papers, presentations, capstone courses, and the like). However, using direct measures is often time- and cost-intensive: collecting, scoring, and interpreting the data requires substantial faculty involvement. As a result, direct assessments generally study small, focused populations of students (e.g., students in a particular department selected for program assessment purposes) and address a few specific learning objectives (e.g., using final papers to assess students’ writing and critical thinking). Direct measures are an essential component of an institution’s approach to understanding student learning, providing deep and authentic evidence of student performance.[1] However, at a campus as large as UMass Amherst, relying only on direct measures is unlikely to yield evidence of student learning that is fully representative of the variety of student experiences, student characteristics, and learning objectives across campus.

For these reasons, many campuses also collect information on students’ own perceptions of their learning, referred to as “indirect” measures of student performance. These measures provide an opportunity for students to reflect upon and consider their own learning, and they offer institutions systematic information on student perspectives and ratings of their learning. Although student self-reporting reflects what students believe they have learned (not what their performance demonstrates they actually can do), universities often use these perspectives alongside direct measures in studies of student learning. Indeed, scholarship in higher education suggests that student self-reports of their cognitive outcomes can correlate with direct assessments in some studies.[2]

While we cannot treat self-reported learning outcomes as equivalent to direct measures of student performance, self-reported data make it possible to systematically analyze a much larger population of students and to consider their views of their learning across a wide range of important learning objectives, something that would be cost-prohibitive if the focus were only on direct measures.[3] Scholars suggest that triangulating student self-reported data with other methods can provide a “general measure of achievement” of student growth.[4]

At UMass, we collect and consider student self-reported learning through several surveys, each with its own strengths and areas of emphasis. Three surveys in particular give us perspectives on student self-reported learning at UMass: (1) the National Survey of Student Engagement (NSSE) allows us to trace UMass student responses over time and to compare UMass to national benchmarks; (2) the UMass Senior Survey lets us examine student perceptions by major and by learning context, also over time, and provides results to individual departments; since 2016 it has included a set of self-reported learning items that capture the University’s contributions to students’ learning across a range of learning objectives; and (3) the 2017 Survey of Recent UMass Alumni highlights alumni views on how their UMass experience contributed to their learning outcomes and on the outcomes they use in the workplace after graduation. In this Research Report, we describe in some detail current results from these three self-reported student learning outcomes surveys.

The NSSE Report

We have been collecting NSSE data since 2000, and in 2005 we began administering the survey to all freshmen and seniors every three years. The survey has many elements, all focused on students’ engagement in their learning. The items asking students to rate the University’s contributions to their learning and development across a host of learning objectives are an important component of the results. In these historical comparisons, we observe a general improvement over time in UMass seniors’ views of the University’s contribution to their “knowledge, skills, and personal development” (see the various shades of red bars in Figure 1).

Figure 1. NSSE Senior Self-Reported Learning Outcomes

An important benefit of the NSSE results is that they also provide us with comparisons to the responses of students at other public Research Universities. Over time, UMass seniors’ ratings have become more similar to those of students at other Research Universities. In Figure 1, we show the comparative results for the most recent survey administration (the gray bar represents the results from other Research Universities). As the Figure shows, UMass students’ most recent ratings (the darkest red bars) are equal to or higher than this external benchmark in all areas except for Analyzing Numerical and Statistical Information. The upward trajectory in students’ ratings of the University’s contribution to their development is a promising indicator of the success of current efforts to enhance the undergraduate learning experience.

UMass Senior Survey

We ask similar questions about learning objectives and development in the UMass Senior Survey. The rating for knowledge in the major is particularly high, as are the ratings for thinking critically, working effectively with others, understanding one’s strengths and weaknesses, and learning effectively on one’s own. A number of these reflect what some call the “soft skills,” which include interacting with others (people skills) and being reflective about one’s own abilities.[5]

Figure 2. 2017 Senior Survey Learning Outcomes

Both the NSSE survey and the Senior Survey focus on the experiences of seniors in the class graduating in 2017, yet students’ ratings of the University’s contribution are even higher in the Senior Survey than in NSSE. The reasons for these differences are not entirely clear. Response rates for both are quite strong, although the Senior Survey rate is higher (NSSE = 47%; Senior Survey = 71%). Both surveys were administered in Spring 2017, although NSSE was administered earlier in the semester and the Senior Survey the week before graduation. The primary differences are that NSSE was administered online, while the Senior Survey was administered on paper, handed out to seniors as they waited in line to pick up their caps and gowns for graduation. In addition, the NSSE survey asks students about their University experience overall, whereas in the Senior Survey all of the items preceding the learning outcomes items concern students’ experiences in their major specifically. While the learning outcomes items themselves are identical in both surveys, the administration context of the Senior Survey (a focus on the major, a celebratory state prior to graduation, person-to-person administration) may contribute to the more positive ratings.

In addition to the overall campus results represented in Figure 2, each academic department receives a report for the responses of their own majors and can use this information as another tool in understanding students’ perceptions of their learning. This information can help departments consider how they communicate departmental objectives to students, and how the curriculum and major-based experiences might contribute to students’ views of their learning.

UMass Alumni Survey

The survey of recent alumni provides us with the opportunity to understand how graduates view their education, and what they have gained, a few years after they graduated. In Fall 2017, we surveyed individuals who had graduated five and six years earlier. The survey asked these alums to rate how much UMass contributed to their development of particular outcomes (see Figure 3). The overall pattern of higher and lower ratings is fairly similar to the patterns in the other surveys, with the highest-rated contributions in thinking critically, knowledge in the major, and learning effectively on one’s own. “Analyzing Quantitative Problems” is, again, on the low end.

Figure 3. 2017 Alumni Survey: UMass Contributions to Skills and Knowledge

The 2017 Alumni Survey also provides us with information on the skills our alumni use most in their work environments. In Figure 4, those ratings are juxtaposed with the alums’ ratings of UMass’s contributions to those skills. Alums generally indicate that UMass made a fairly substantial contribution to their skills overall (represented in Figure 3 as means, and in Figure 4 as the percent of respondents choosing each response category).

However, in some cases there is a mismatch between alums’ skill development at UMass and the extent to which those skills are used in their work. The most dramatic example is “Knowledge specific to the major,” where, across all surveys, students and alumni alike rate the UMass contribution as quite high. When it comes to the skills they use in the workplace, however, these alums rate this content knowledge as the lowest among the skills listed. In other cases (working effectively with others, thinking critically and analytically, learning effectively on your own), the ratings of UMass’s contributions are more muted than the extent to which alums are required to use these skills at work. While only one indicator of UMass’s contributions to our students’ preparation for the workplace, these results provide some insight into areas where we might consider enhancing the curriculum and the kinds of skills students are asked to practice while enrolled.

Figure 4. 2017 Alumni Survey: Skills and Knowledge Used at Work

Next Steps

These results provide plenty of food for thought as the University continues to enhance its efforts to prepare students for life after college. At the broadest level, students’ ratings of their learning indicate that they believe their experiences at UMass have contributed strongly to their learning across a number of important objectives. Comparisons with external benchmarks indicate that the gains our students report are similar to those reported by students at comparable universities. However, there are areas we should consider in future studies. First, while students report gaining a great deal of major-related knowledge, their ratings of how much they have gained in the “softer skills” that alumni say are important are more muted. Second, there is some mismatch between what students say they gain and what they use most in their employment (see Figure 4).

Of course, UMass’s success in preparing students for the workplace is only one indicator of our educational effectiveness. The campus has goals for students that extend beyond workplace performance, such as lifelong learning, engagement in society, and ethical decision making, to name a few. These self-reported results offer us an opportunity to consider UMass’s contributions to student development in these dimensions as well. One of the strengths of these indirect measures, as suggested earlier, is that they provide a particularly expansive perspective on student learning and development, a view that helps the campus consider the effectiveness of the full range of its intentions.

While we do not explore it in detail here, the Senior Survey report that each department receives is a valuable additional component of our research into self-reported learning outcomes. Departments can review their individual reports and see how students’ ratings fit with faculty members’ own expectations for student learning. These results can contribute to departments’ understanding of their students’ experiences and raise questions about student performance that departments may want to pursue further. This process of using available evidence to inform current practices and to identify additional lines of inquiry into the quality of students’ learning and experiences is the centerpiece of the campus’s new Educational Effectiveness Planning process.

Footnotes & References

[1] The University is currently engaged in a direct assessment of student learning through its participation in the national Multi-State Collaborative assessment project.

[2] Pike, G. (1995). The Relationship between Self Reports of College Experiences and Achievement Test Scores. Research in Higher Education, 36(1), 1-21.

[3] Anaya, G. (1999). College Impact on Student Learning: Comparing the Use of Self-Reported Gains, Standardized Test Scores, and College Grades. Research in Higher Education, 40(5), 499-526.

[4] Gonyea, R. (2005). Self-Reported Data in Institutional Research: Review and Recommendations. New Directions for Institutional Research, (127), 73-89.

[5] The AAC&U lists the following as soft skills: “written and oral communication, critical thinking, and interacting with people from diverse backgrounds” (“New Reports Offer Good News for the Arts and Humanities,” AAC&U) and “teamwork, ethics, diversity, and lifelong learning preparation” (“Liberal Education for the Twenty-first Century: Business Expectations,” AAC&U).