Students are paying a higher price tag for college, but is the quality of their education also increasing, or at least staying stable? Many indicators suggest "no."
During the George W. Bush administration, the Spellings Commission found evidence that “the quality of student learning at U.S. colleges and universities is inadequate and, in some cases, declining.”
In 2003, the Department of Education's National Assessment of Adult Literacy found that only 31 percent of college graduates scored at "the proficient level" of reading, down 9 percentage points from 1992. Of the 2003 college graduates, 53 percent scored at the "intermediate level" and 14 percent scored at the "basic level." Three percent of college graduates scored at a "below basic" literacy level.
In 2008, 57 percent of college graduates failed a civic literacy exam put out by the Intercollegiate Studies Institute. And a 2013 Gallup and Lumina Foundation poll found that only 11 percent of business leaders believed that college graduates are prepared for the workforce.
According to a 2013 study, fewer than 5 percent of college students knew each of the following: that Thomas Jefferson's home is named "Monticello"; who wrote Brave New World; that Marie Curie discovered radium; or that Mozart wrote Don Giovanni. Additionally, compared with students in 1980, far fewer students knew that Paris is the capital of France.
Law school graduates have also shown declining outcomes. According to the Bar Examiner, between 2007 and 2016, bar exam passing rates declined in most states.
The results of a 2017 Gallup survey show that only 42 percent of college alumni strongly agreed that they were challenged academically in college.
One of the most comprehensive studies on college student learning is detailed in the 2011 book Academically Adrift by Richard Arum and Josipa Roksa. The authors surveyed 3,000 students on 29 campuses.
After analyzing transcripts, surveys, and scores from the standardized test called the Collegiate Learning Assessment (CLA), the researchers found that 45 percent of students demonstrated “no significant gains in learning” after two years of college. They also found that, compared to students from a few decades ago, today’s college students spend 50 percent less time studying.
The authors note that although students may be familiar with course-specific content and may graduate with a respectable GPA, many are nonetheless “academically adrift” because they are “failing to develop higher-order cognitive skills.”
As the findings of Academically Adrift suggest, students’ dismal learning outcomes may be connected in part to their poor study habits. Federal data show that college students don’t spend enough time studying.
In 2016, the Heritage Foundation analyzed data from the Bureau of Labor Statistics' American Time Use Survey from 2003–2014. The analysts found that the "average full-time college student spends only 2.76 hours per day on all education-related activities," which works out to about 19.3 hours per week.
The average full-time student spends only 10.7 hours a week on research and homework, a number that falls far below the commonly recommended two to three hours of study per credit hour per week. Since full-time students must take at least 12 credit hours per semester, they should be studying at least 24 to 36 hours a week.
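The gap between reported and recommended study time can be checked with a quick back-of-the-envelope calculation. This is a minimal sketch using the figures from the Heritage Foundation analysis and the standard "two to three hours per credit hour" guideline cited above; the variable names are illustrative, not from any source.

```python
# Figures reported by the Heritage Foundation / BLS American Time Use Survey analysis
reported_daily_hours = 2.76            # all education-related activities, per day
reported_weekly_total = reported_daily_hours * 7   # about 19.3 hours per week
reported_homework_weekly = 10.7        # research and homework only, per week

# Common guideline: 2-3 study hours per credit hour, at a 12-credit full-time minimum
credit_hours = 12
recommended_low = credit_hours * 2     # 24 hours per week
recommended_high = credit_hours * 3    # 36 hours per week

# Shortfall against even the low end of the recommendation
shortfall = recommended_low - reported_homework_weekly

print(f"Reported total study-related time: {reported_weekly_total:.1f} h/week")
print(f"Recommended range: {recommended_low}-{recommended_high} h/week")
print(f"Homework shortfall vs. low end: {shortfall:.1f} h/week")
```

Even measured against the most lenient reading (the 19.3-hour total rather than the 10.7 homework hours), the average student falls short of the 24-hour floor.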
Unfortunately, colleges are reluctant to track and release information regarding how much students learn in college. Former Stanford Graduate School of Education dean Richard Shavelson said that, for many schools, student learning is “less important than having a winning football team if you want to stay alive, in the scheme of things.”
Shannon Watkins is senior writer at the James G. Martin Center for Academic Renewal.