With the creation of the first federal student loans as part of the National Defense Education Act of 1958, the US postsecondary financial aid system was set on a path from which it has not fundamentally deviated in the intervening decades. While college financing has trended almost inexorably toward greater reliance on student borrowing as costs have outpaced families’ incomes, the major components of the financing “mix” have remained unchanged. Financial aid policy is sometimes tweaked around the edges to lighten the burden of student debt, give colleges a competitive edge, or address undesirable disincentives. For the most part, however, these reforms bear more resemblance to the classic “shell game” than to authentic innovations.

What American students need are more powerful tools with which to approach their futures—tools that help them prepare for higher education, persist to completion, and then leverage returns on their degrees. What they get, however, are repackaged versions of the same blunt instruments. While everyone wants improved outcomes from our financial aid investments, the nation’s apparent inability or unwillingness to develop truly novel approaches to paying for higher education stands in the way of progress.

The goal of financial aid policy has been narrowly framed as only helping young adults pay for college, a low bar that completely ignores the role financial aid could play in influencing early education, postsecondary completion, and post-college financial health. As a result, instead of receiving support at critical junctures along the opportunity pipeline to a prosperous adulthood, students are largely left to their own devices except at the moment when the tuition bill becomes due. Seizing these missed opportunities will take more than different loan repayment schedules or loosened rules on grant disbursement.
What we need is a fundamental shift in how we think about financing higher education and what we believe about why it matters.