Over the last several years, UW-Madison has increased its tuition at a higher rate than its System peers, thanks to the Madison Initiative for Undergraduates. That shift has not been accompanied by a decline in the percent of students receiving Pell Grants– in fact, there has been a 5.5 percent increase in % Pell since 2000. Some are saying that this means that low-income students have been “held harmless” from the rising tuition, and that further increases would likely not lead to diminished economic diversity on campus. Furthermore, we are told, we can look to the outreach campaigns of institutions like UVA and UNC-Chapel Hill (home to Access UVA and the Carolina Covenant, respectively) for models of anti-“sticker shock” programs that “work.”
These claims are terrific examples of why it’s a bad idea to make causal claims based on correlational data. If you want to make those statements, you can look to those examples and find support for your agenda. But you shouldn’t.
In fact, the increase in the percent Pell at UW-Madison over the last few years is consistent with increases in % Pell at many colleges and universities nationwide over that time period. The cause lies not in successful outreach campaigns, or in the failure of tuition increases to inhibit student behavior, but mainly in the recession. The recession had two relevant effects. First, many people were laid off– and thus saw a temporary loss of income. Students from families that were not Pell-eligible in 2007 found themselves eligible in 2008, because the Pell is based on current rather than long-term disadvantage. So an increase in % Pell doesn’t mean you coaxed “new” low-income students into attending Madison or did a better job retaining those you already enrolled, but rather that a greater proportion of those who were already UW-bound (or already enrolled) now found themselves eligible for the additional help. Second, the recession reduced the number of jobs available to students not enrolled in college– thus lowering the opportunity costs associated with college (e.g. foregone earnings). This could have independently increased both enrollment and persistence.
Furthermore, during the same time period, as part of the legislation that increased the maximum Pell, the federal government also increased the family income (AGI) a student could have and still qualify for the Pell– from $20,000 to $30,000. Thus, a whole bunch more people became Pell-eligible during the period in which the MIU was implemented. And the maximum Pell was increased– possibly helping to offset the increase in tuition.
Thus, it should be abundantly clear that it would be incorrect to state that the increasing % Pell at UW-Madison over the last several years is evidence that tuition increases do not inhibit enrollment of low-income students and/or that additional investments in need-based financial aid hold students harmless.
Same goes for the “success” of programs like the Carolina Covenant. Don’t get me wrong– the program seems great, and feels great, and the leadership is great. And for sure, the program’s data look nice– the campus has seen an uptick in the representation of Pell recipients and increased retention over time, and its evaluations show better outcomes than prior cohorts of students. But as compelling as those numbers seem to be, they cannot be interpreted as evidence that these changes are attributable to the program itself– and that’s where the burden of proof lies. Indiana saw increases in college enrollment among the children of low-income families when its 21st Century Scholars Program was implemented, but reforms to the K-12 system were made at the same time, and the economy was booming. The program “effects” may have been little more than happy coincidence. We cannot rely on the potential for such happy coincidences when crafting new policies and making decisions about affordability.
It’s time to get honest about what data can and cannot tell us. I’ve heard too many claims around here that it can tell us whatever we want. While that’s undoubtedly partially true even under the best of circumstances, it is especially true when we take no steps to collect data systematically or to use sophisticated tools when analyzing it. If we were really committed to holding students harmless from tuition increases, we’d have commissioned an external evaluation (external = not done by institutional researchers) and made the data available for analysis. There are plenty of talented folks on campus who know how to do this work– why not ask them to take a look at what happened under the MIU?