Sunday, May 15, 2011
Is Our Students Learning?
Remarkably, one of the topics of yesterday's blog post (and another I wrote two years ago)-- the limited learning taking place on many college campuses-- is the subject of a New York Times op-ed today. Titled "Your So-Called Education," the piece argues that while 90% of graduates report being happy with their college experience, the data suggest there's little to celebrate. I urge you to read it, along with its companion op-ed "Major Delusions," which describes why college grads are delusional in their optimism about their future.
We don't regularly administer the Collegiate Learning Assessment at UW-Madison, the test that the authors of the first op-ed used to track changes in student learning over undergraduate careers. From talking with our vice provost for teaching and learning, Aaron Brower, I understand there are many good reasons for this. Among them are concerns that the test doesn't measure the learning we intend to transmit (for what it does measure, and how it measures it, see here), as well as concerns about the cost and the heroics required to administer it well. In the meantime, Aaron is working on ways to introduce more high-impact learning practices, including freshman interest groups and learning communities, and together with colleagues has developed an assessment of students' self-reports of their learning (the Essential Learning Outcomes Questionnaire). We all have good reason to wish him well. For it's clear from what we do know about undergraduate learning on campus that we have work to do.
The results of our most recent student engagement survey (the NSSE, administered in 2008) indicate the following:
1. Only 60% of seniors report that the quality of instruction in their lower-division courses was good or excellent.
This is possibly linked to class size, since only 37% say those classes are "ok" in size -- but (a) that isn't clear, since the percentages who say the classes are too large or too small are not reported, and (b) the question doesn't link class size to quality of instruction. As I've noted in prior posts, class size is a popular proxy for quality, and one promoted by institutions, since smaller classes equate with more resources (though high-quality instruction does not appear to equate with either smaller classes or more resources). There are other plausible explanations for these assessments of quality that the survey does not shed light on.
2. A substantial fraction of our students are not being asked to do the kind of challenging academic tasks associated with learning gains.
For example, 31% of seniors (and 40% of freshmen) report that they are not frequently asked to make "judgments about information, arguments, or methods, e.g., examining how others gathered/interpreted data and assessing the soundness of their conclusions." (Sidebar-- interesting to think about how this has affected the debate over the NBP.) 28% of seniors say they are not frequently asked to synthesize and organize "ideas, information, or experiences into new, more complex interpretations and relationships." On the other hand, 63% of seniors and 76% of freshmen indicate that they are frequently asked to memorize facts and repeat them. And while there are some real positives-- such as the higher-than-average percentage of students who feel the university emphasizes the need to spend time on academic work-- fully 45% of seniors surveyed did not agree that "most of the time, I have been challenged to do the very best I can."
3. As students get ready to graduate from Madison, many do not experience a rigorous senior year.
In their senior year, 55% of students did not write a paper or report of 20 pages or more, 75% read fewer than 5 books, 57% didn't make a class presentation, 51% didn't discuss their assignments or grades with their instructor, and 66% didn't discuss career plans with a faculty member or adviser. Nearly one-third admitted often coming to class unprepared. Fewer than one-third had a culminating experience such as a capstone course or thesis project.
4. The main benefit of being an undergraduate at a research university-- getting to work on a professor's research project-- is not realized by the majority of students.
While 45% of freshmen say it is something they plan to do, only 32% of seniors say they've done it.
Yet overall, just as the Times reports, 91% of UW-Madison seniors say their "entire educational experience" was good or excellent.
Well done. Now, let's do more.
Postscript: Since I've heard directly from readers seeking more resources on the topic of student learning, here are a few to get you started.
A new report just out indicates that college presidents are loath to measure learning as a metric of college quality! Instead, they prefer to focus on labor market outcomes.
Measuring College Learning Responsibly: Accountability in a New Era by Richard J. Shavelson is a great companion to Academically Adrift. Shavelson was among the designers of the CLA, and in the book he responds to critics concerned with its value.
The Voluntary System of Accountability has been embraced by public universities that hope to provide their own data rather than have a framework imposed on them. Here is Madison's report.
On the topic of students' own reports of their learning gains, Nick Bowman's research is particularly helpful. For example, in 2009 in the American Educational Research Journal, Bowman reported that in a longitudinal study of 3,000 first-year students, "across several cognitive and noncognitive outcomes, the correlations between self-reported and longitudinal gains are small or virtually zero, and regression analyses using these two forms of assessment yield divergent results." In 2011, he reported in Educational Researcher that "although some significant differences by institutional type were identified, the findings do not support the use of self-reported gains as a proxy for longitudinal growth at any institution."
As for the NSSE data, such as what I cited above from UW-Madison, Ernie Pascarella and his colleagues report that these are decent at predicting educational outcomes. Specifically, “institution-level NSSE benchmark scores had a significant overall positive association with the seven liberal arts outcomes at the end of the first year of college, independent of differences across the 19 institutions in the average score of their entering student population on each outcome. The mean value of all the partial correlations…was .34, which had a very low probability (.001) of being due to chance."
Finally, you should also check out results from the Wabash study.