Of all the changes in American higher education in the twentieth century, none has had a greater impact than the rise of the two-year junior college. Yet this institution, which we now take for granted, was once a radical organizational innovation. Stepping into an educational landscape already populated by hundreds of four-year colleges, the junior college established itself as a new type of institution: a college that granted no bachelor's degree and typically offered both college preparatory and terminal vocational programs. The junior college moved rapidly from a position of marginality to one of prominence; in the twenty years between 1919 and 1939, enrollment at junior colleges rose from 8,102 students to 149,854 (U.S. Office of Education 1944, p. 6). Thus, on the eve of World War II, an institution whose very survival had been in question just three decades earlier had become a key component of America's system of higher education.

The institutionalization and growth of what was a novel organizational form could not have taken place without the support and encouragement of powerful sponsors. Prominent among them were some of the nation's greatest universities, including Chicago, Stanford, Michigan, and Berkeley, which, far from opposing the rise of the junior college as a potential competitor for students and resources, enthusiastically supported its growth. Because this support had a profound effect on the subsequent development of the junior college, we shall examine its philosophical and institutional foundations.

In the late nineteenth century, an elite reform movement swept through the leading American universities.
Beginning with Henry Tappan at the University of Michigan in the early 1850s and extending after the 1870s to Nicholas Murray Butler at Columbia, David Starr Jordan at Stanford, and William Rainey Harper at Chicago, one leading university president after another began to view the first two years of college as an unnecessary part of university-level instruction.