Selling Dreams, Not Reality

Colleges are marketing possible, not likely, outcomes. That needs to change.

Choosing a college major is a life-altering decision, made millions of times a year by 18-year-olds with little exposure to higher education or labor markets. Colleges frame programs in terms of possibilities, not probabilities. They pitch degrees like products, emphasizing prestige and potential while downplaying the risks of dropping out or ending up underemployed. Institutions see these decisions play out by the millions; each student makes the choice only once. And while schools hold internal data on outcomes and labor markets, they rarely share what matters most.

To the institution, the student is a sales prospect, not a learner they’re accountable for educating.

Anyone choosing a major should know his odds of finishing. He should be able to trace a degree to the jobs it qualifies him for—and to see what those jobs actually pay. Students deserve to know how often graduates land relevant roles, how much they earn over time, and whether they’ll be actively recruited or stuck searching job boards with little traction.

The Bureau of Labor Statistics catalogs nearly every job in the U.S. economy using the Standard Occupational Classification (SOC) system, tracking wages, job growth, and employment across nearly 900 distinct occupations. The Department of Education classifies more than 2,800 academic programs using the Classification of Instructional Programs (CIP) system.

A federal “crosswalk” links CIP codes to SOC codes, mapping degrees to potential careers. But that mapping shows intent, not results. Misalignment between curricula and job requirements, weak labor demand, or industry norms can make the crosswalk meaningless.

Take ballet. A degree in ballet—yes, the codes are that granular—maps directly to the occupation “Dancer.” But most professional dancers didn’t earn a college degree. They trained in studios or conservatories. The link exists on paper but rarely in practice.

A CIP-SOC match may suggest a pathway, but it doesn’t prove one. Colleges have long been free to imply connections that don’t exist. The data to check those claims have been public for decades: downloadable, machine-readable, and rarely used. They sit idle not because they’re flawed but because what they reveal is inconvenient. Designed to improve alignment between education and employment, the system is treated as an administrative exercise, not a tool for reform.
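That claim is easy to test, because the pieces really are machine-readable. The sketch below, in Python with pandas, joins the published CIP-SOC crosswalk to the BLS occupational wage table. The file names and column names (CIP2020Code, CIP2020Title, SOC2018Code, OCC_CODE, OCC_TITLE, A_MEDIAN) are assumptions based on the public NCES and BLS OEWS downloads and will vary by release.

```python
# Minimal sketch: join the CIP-SOC crosswalk to BLS median wages.
# File and column names are assumptions based on the public NCES
# crosswalk and BLS OEWS downloads; adjust them to the release you use.
import pandas as pd

# NCES crosswalk: one row per CIP-to-SOC mapping (assumed columns).
crosswalk = pd.read_csv("CIP2020_SOC2018_Crosswalk.csv", dtype=str)

# BLS OEWS national table: one row per SOC occupation (assumed columns).
wages = pd.read_csv("oews_national.csv", dtype=str)

# What the crosswalk claims a degree leads to, with national median pay.
merged = crosswalk.merge(
    wages[["OCC_CODE", "OCC_TITLE", "A_MEDIAN"]],
    left_on="SOC2018Code",
    right_on="OCC_CODE",
    how="left",
)

# Example: every occupation the crosswalk links to a ballet degree.
ballet = merged[merged["CIP2020Title"].str.contains("Ballet", case=False, na=False)]
print(ballet[["CIP2020Title", "OCC_TITLE", "A_MEDIAN"]])
```

The output is what the crosswalk says a ballet degree should lead to and what those occupations pay nationally. It says nothing about how many graduates actually land those jobs, which is exactly the gap the crosswalk cannot close.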

When a program survives on enrollment rather than outcomes, the incentive is clear: Sell the degree, not the result. Advising becomes marketing. Of course the psychology professor or counselor says the student is a great fit. The bartender says he needs a drink. The barber says he needs a haircut.

And students make high-stakes decisions based on incomplete—and often self-serving—information.

The irony is that most of these data are already public, available through tools such as College Scorecard, CareerOneStop, and state wage dashboards. But they’re fragmented, buried in technical portals, and almost never appear where students make actual decisions. Institutions could surface these outcomes in advising or program pages, but they don’t. The data are available. They’re just not accessible.
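The same kind of check works at the program level. College Scorecard’s data downloads include a field-of-study file with median earnings by institution, program, and credential; the sketch below filters it for a single hypothetical program. The file name and column names (INSTNM, CIPDESC, CREDDESC, EARN_MDN_HI_2YR) are assumptions drawn from the Scorecard data dictionary and may differ by release.

```python
# Rough sketch: program-level earnings from the College Scorecard
# field-of-study download. File and column names are assumptions from
# the data dictionary; check the release you download.
import pandas as pd

fos = pd.read_csv("Most-Recent-Cohorts-Field-of-Study.csv",
                  dtype=str, low_memory=False)

# Filter to one program at one institution -- both placeholders here.
program = fos[
    fos["INSTNM"].str.contains("Example State University", na=False)
    & fos["CIPDESC"].str.contains("Psychology", na=False)
    & fos["CREDDESC"].str.contains("Bachelor", na=False)
]

# Median earnings of completers a couple of years out (assumed column).
print(program[["INSTNM", "CIPDESC", "CREDDESC", "EARN_MDN_HI_2YR"]])
```

None of this requires new infrastructure. It requires putting the answer where a prospective student would actually see it.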

Consider California State University, Monterey Bay (CSUMB), where psychology is the most popular undergraduate major. Last year, the school awarded 290 psychology degrees—about one in every seven diplomas, according to College Navigator. The program is marketed as a launchpad into “a broad spectrum of professional fields” and, implicitly, as a starting point for careers in counseling and social services. What the school doesn’t say is that most of those jobs require additional education, licensure, or training beyond the bachelor’s degree.

The CSUMB psychology B.A. isn’t a clinical pipeline. It includes courses in statistics and research methods but little clinical training and no practicum. The school offers a master’s degree in school psychology, but it’s small and specialized, not a general path for most psychology students. For most students, the degree ends with the capstone, leaving them with a diploma but no clear, immediate route to the careers the program promotes.

According to the CSU Data Dashboard, CSUMB psychology graduates earn a median of $41,000 two years after graduation and $57,000 after five years. Those aren’t the wages of licensed therapists or counselors—they’re more consistent with jobs that don’t require a degree, such as retail supervisor, dispatcher, or customer-service lead. The school has access to these data, but they don’t appear in promotional materials or advising pages. Students are told what the program could lead to, not what it actually does.

The program is fully accredited, but in most cases accreditation doesn’t ask whether students find relevant jobs or earn a decent wage. It doesn’t evaluate marketing claims or require labor-market justification for a degree. As long as internal procedures are followed, a program is approved—even if the outcomes consistently fall short.

Psychology is the fifth most popular undergraduate major in the United States, with colleges awarding approximately 129,600 psychology bachelor’s degrees in the 2021-22 academic year, according to NCES IPEDS data. CSUMB’s psychology program reflects a broader national pattern: The degree is widely granted but poorly aligned with labor-market demand. According to CareerOneStop, psychology-related or -adjacent professions typically require more (e.g., clinical psychologist) or less (e.g., social and human services assistant) education than a psychology B.A. In many cases, the credential adds little or no value—and wages reflect that.

According to the New York Federal Reserve’s 2024 study of recent graduates, psychology majors face below-average unemployment (3.6 percent) but significant underemployment (45.4 percent). Their median early-career wage is $45,000. Mid-career wages rise to $70,000, but that figure includes many workers with advanced degrees; nationally, 53 percent of psychology majors pursue graduate study. By contrast, CSUMB psychology graduates earn just $57,000 after five years, likely because most don’t go on to an advanced degree.

This misalignment between programs and the labor market isn’t unique to CSUMB. It’s systemic, driven by incentives that reward enrollment over outcomes. Colleges focus on short-term budgets. Students absorb the long-term consequences.

Tools such as College Scorecard and College Navigator report program-level earnings, graduation rates, and more. Some states go further, linking student records to wage data to track long-term outcomes. The National Center for Education Statistics has funded task forces and usability studies to improve dissemination, but the data rarely reach students. Specific, data-driven career outcomes almost never appear in program descriptions, and labor-market demand is typically omitted. Most students—and many advisors—have never heard of the systems designed to show what a degree actually leads to.

If colleges told students that most psychology majors don’t become psychologists—or that digital-media degrees rarely lead to jobs in tech—many would choose higher-value, more selective fields such as nursing or engineering. Or, worse for the institution, they might walk away entirely. So instead, schools rely on vague promises, highlight anecdotal successes, and let students assume a pathway exists—even when it doesn’t.

In 2025, Congress took a step toward closing the gap between college marketing and labor-market reality. Section 84001 of the “One Big Beautiful Bill Act” authorized the Department of Education to cut off federal aid to programs where most graduates earn less than the median high-school graduate. It’s a low standard, but some programs will fail to meet it.

For the first time, institutions must now consider not just what they teach but where it leads. Programs that consistently leave students underemployed and underpaid now risk losing access to federal student aid—the financial lifeblood of most colleges.

Section 84001 doesn’t solve the problem. But it’s the first serious attempt to hold programs accountable for what they deliver—regardless of what they promise. A similar rule, the 2011 Gainful Employment regulation, helped force the closure of Corinthian Colleges and other for-profits that failed to meet basic outcome thresholds.

This new law marks real reform: placing accountability where it belongs—on outcomes, not intentions. That’s long overdue.

Colleges have the data. They know which programs lead to viable careers and which don’t. Transparency threatens the bottom line, so it’s avoided. And the cost doesn’t fall on the institution. It falls on the student, who leaves with debt, disappointment, and no clear path forward.

We don’t need new tools. We need to use the ones we have. Stop selling dreams. Start reporting outcomes. That’s how markets work. That’s how trust gets rebuilt.

James Andrews is a CPA and professor of business administration at Ohlone College in California, where he writes on higher education, labor markets, and artificial intelligence.