The Diverted Dream

Published By Oxford University Press

9780195048155, 9780197560044

Author(s): Steven Brint and Jerome Karabel

No analysis of the history of the community college movement in Massachusetts can begin without a discussion of some of the peculiar features of higher education in that state. Indeed, the development of all public colleges in Massachusetts was, for many years, inhibited by the strength of the state’s private institutions (Lustberg 1979; Murphy 1974; Stafford 1980). The Protestant establishment had strong traditional ties to elite colleges—such as Harvard, the Massachusetts Institute of Technology, Williams, and Amherst—and the Catholic middle class felt equally strong bonds to the two Jesuit institutions in the state: Boston College and Holy Cross (Jencks and Riesman 1968, p. 263). If they had gone to college at all, most of Massachusetts’s state legislators had done so in the private system. Private college loyalties were not the only reasons for opposition to public higher education. Increased state spending for any purpose was often anathema to many Republican legislators, and even most urban “machine” Democrats were unwilling to spend state dollars where the private sector appeared to work well enough (Stafford and Lustberg 1978). As late as 1950, the commonwealth’s public higher education sector served fewer than ten thousand students, just over 10 percent of total state enrollments in higher education. By 1960, public enrollment had grown to only 16 percent of the total, at a time when 59 percent of college students nationwide were enrolled in public institutions (Stafford and Lustberg 1978, p. 12). Indeed, the public sector did not reach parity with the private sector until the 1980s. Of the 15,945 students enrolled in Massachusetts public higher education in 1960, well over 95 percent were in-state students. The private schools, by contrast, cast a broader net: of the nearly 83,000 students enrolled in the private schools, more than 40 percent were from out of state (Organization for Social and Technical Innovation 1973).
The opposition to public higher education began to recede in the late 1950s. Already by mid-decade, a large number of urban liberals had become members of the state legislature, and a new governor, Foster Furcolo, had been elected in 1956 on an activist platform.



During the 1970s, the community colleges were finally able to realize the vocationalization project that visionaries in the junior college movement from Koos to Gleazer had favored for almost half a century. Since the 1920s, as we saw in Chapters 2 and 3, the advocates of junior college vocationalization had pursued their project in the face of persistent student indifference and occasional overt opposition. But in the early 1970s, a complex concatenation of forces—among them, a changed economic context and an unprecedented degree of support for vocational education from key institutions, including private foundations, the federal government, and business—tilted the balance in favor of the vocationalizers. A key factor behind the sharp increase in vocational enrollments at the community college, we shall argue, was the declining labor market for graduates of four-year institutions. But the objective change in the structure of economic opportunities for college graduates was not, as the consumer-choice model would have it, the sole factor responsible for the shift in junior college enrollments; indeed, the impact of such objective changes is, of necessity, mediated through subjective perceptions—perceptions that, we shall attempt to demonstrate below, tended to exaggerate the economic plight of college graduates. Moreover, the community college itself, driven by a powerful organizational interest in expanded enrollments and in carving out a secure niche for itself in the highly competitive higher education industry, actively shaped its economic environment by pursuing those segments of its potential market—in particular, adults and part-time students—most likely to enroll in occupational programs. By almost any standard, the rise in vocational enrollments during the 1970s was remarkable. Between 1970–1971 and 1979–1980, for example, the proportion of A.A. degrees awarded in occupational fields rose from 42.6 percent to 62.5 percent (Cohen and Brawer 1982, p. 203).
With respect to total enrollments (full-time and part-time), the picture was similar: between 1970 and 1977, the proportion of students enrolled in occupational programs rose from less than one-third to well over half (Blackstone 1978). In the midst of a long-term decline in the liberal arts, Cohen and Brawer (1982, p. 23) observed, “occupational education stands like a colossus on its own.”



Of all the changes in American higher education in the twentieth century, none has had a greater impact than the rise of the two-year junior college. Yet this institution, which we now take for granted, was once a radical organizational innovation. Stepping into an educational landscape already populated by hundreds of four-year colleges, the junior college was able to establish itself as a new type of institution—a college that granted no bachelor’s degree and typically offered both college preparatory and terminal vocational programs. The junior college moved rapidly from a position of marginality to one of prominence; in the twenty years between 1919 and 1939, enrollment at junior colleges rose from 8,102 students to 149,854 (U.S. Office of Education 1944, p. 6). Thus, on the eve of World War II, an institution whose very survival had been in question just three decades earlier had become a key component of America’s system of higher education. The institutionalization and growth of what was a novel organizational form could not have taken place without the support and encouragement of powerful sponsors. Prominent among them were some of the nation’s greatest universities—among them, Chicago, Stanford, Michigan, and Berkeley—which, far from opposing the rise of the junior college as a potential competitor for students and resources, enthusiastically supported its growth. Because this support had a profound effect on the subsequent development of the junior college, we shall examine its philosophical and institutional foundations. In the late nineteenth century, an elite reform movement swept through the leading American universities.
Beginning with Henry Tappan at the University of Michigan in the early 1850s and extending after the 1870s to Nicholas Murray Butler at Columbia, David Starr Jordan at Stanford, and William Rainey Harper at Chicago, one leading university president after another began to view the first two years of college as an unnecessary part of university-level instruction.



The focus of this chapter is on the shift toward predominantly vocational enrollments in the 1970s, brought on by the combined pressures of market decline, state fiscal crisis, and the political ascendance of conservative business leaders. Nevertheless, it would be misleading to suggest that contrary forces were not in evidence, at least in the first few years of the 1970s. The most important of these contrary pressures was the sheer growth of the community college and university systems, which, for a time, encouraged an increase in the absolute numbers of transfers. The community colleges in Massachusetts proved to be at least as attractive in a period of economic retrenchment as they had been in better times. Low-cost, close-to-home two-year colleges were a practical alternative to more expensive higher education. Between 1970 and 1973, the community colleges’ full-time enrollment increased by over one-third, and the other two tiers grew slightly less rapidly. As the system became more vocational in the late 1960s, it also grew. Because of this growth, the absolute number of community college students who transferred to four-year colleges increased, even though the transfer enrollment rates were slowly declining. The number of community college students transferring to the University of Massachusetts at Amherst, for example, increased from just 80 in 1964, when only seven community college campuses were open, to 425 in 1970 and then to 950 in 1972, when twelve campuses were operating at full capacity.
In 1973, at the peak of transfer enrollments, 1,165 public two-year college students enrolled at the University of Massachusetts; 680 enrolled in the state colleges; and 525 enrolled in four-year private colleges in Massachusetts.2 Although never more than a small fraction of total community college enrollments, transfer rates did rise dramatically, from approximately 12.5 percent of the sophomore class in 1964 (a rate congenial to the original planners) to nearly 30 percent of the sophomore class in 1973 (Beales 1974). The nationwide decline in the market for college-educated labor in the early 1970s hit Massachusetts with slightly greater force than other states, reinforced by a recession in the newly emerging high-technology belt around Boston that was related to the winding down of the war in Southeast Asia.



An atmosphere of amiable routine now surrounds North Shore Community College in the Boston suburb of Beverly. Still located on a main downtown thoroughfare, as it has been since it opened in 1965, the college serves an economically varied region, including both the affluent oceanside towns of Marblehead, Swampscott, and Gloucester to the north and the chronically depressed old mill towns of Lynn and Peabody to the southwest. By the mid-1980s, enrollments were heavily occupational, and both staff and students seemed to like it that way. “There’s more demand than there are seats in the technical programs,” said one counselor. “In allied health, there’s a very heavy demand—three or four to one. But generally in liberal arts, we can accept people until the first week of classes.” The staff tended to view the history of their college as a natural unfolding. “The original intent,” observed one dean, “was to provide something for everyone, and that’s what we’ve done.” But vocational education did not always predominate at North Shore. Indeed, in 1965, the college’s first year of operation, over 80 percent of North Shore’s students were enrolled in liberal arts-transfer programs, and many of the faculty were committed to keeping the college’s distinctively academic image. According to one long-time member of the faculty, “At first, some of the faculty . . . had the idea that we were some kind of elitist thing. For them, the important thing was having the smartest students. . . . Quite a few of them were from universities. They didn’t know anything about community colleges.” “Yes, there were some internal battles,” one dean acknowledged. “The occupational programs were a concern to some liberal arts faculty.” The faculty’s grumbling had little effect on Harold Shively, the first president of North Shore.
Shively, a long-time associate of William Dwyer in New York, shared Dwyer’s commitment to building a vocationally oriented system, and he did not wait long to press his plans for transforming North Shore in the direction suggested by this commitment.



At the end of World War II, a sense of expectancy pervaded America’s colleges and universities. Enrollments had dropped during the war years, and many institutions looked forward to the return of millions of veterans. These veterans were themselves eager to get ahead in civilian life after the hardships of war, and the nation was eager to reward them for the sacrifices that they had made. Already in 1944, as the war was coming to a close, the prestigious Educational Policies Commission of the National Education Association and the American Association of School Administrators came out with a report entitled Education for All American Youth. Though focused more on secondary than higher education, the report sounded some themes that were to shape thinking about education for veterans as well. Perhaps the most powerful of these themes was the belief that the war had called on all of the American people to make sacrifices and that efforts must be made to see that no segment of the population would be excluded from the rewards of American society. For higher education, in particular, this meant that new measures would be required to realize the traditional American dream of equality of opportunity. Alongside the idealistic impulse to extend to veterans unprecedented educational opportunities, there was also the fear that the nation’s economy would be unable to provide work for the millions of returning soldiers. The massive unemployment of the Great Depression had, after all, been relieved only by the boost that war production had given the economy. The end of the war therefore threatened—or so it was widely believed at the time—to send the economy back into a terrible slump. With so many soldiers returning home, the possibility of such a downturn frightened policy elites and the public alike, for it was almost certain to revive the bitter social and political conflicts of the 1930s.
Together with more idealistic factors, this concern with the effects of the returning veterans on domestic stability led to one of the major higher education acts in American history: the G.I. Bill of 1944.



From the earliest days of the Republic, Americans have possessed an abiding faith that theirs is a land of opportunity. For unlike the class-bound societies of Europe, America was seen as a place of limitless opportunities, a place where hard work and ability would receive their just reward. From Thomas Jefferson’s “natural aristocracy of talent” to Ronald Reagan’s “opportunity society,” the belief that America was—and should remain—a land where individuals of ambition and talent could rise as far as their capacities would take them has been central to the national identity. Abraham Lincoln expressed this deeply rooted national commitment to equality of opportunity succinctly when, in a special message to Congress shortly after the onset of the Civil War, he described as a “leading object of the government for whose existence we contend” to “afford all an unfettered start, and a fair chance in the race of life.” Throughout much of the nineteenth century, the belief that the United States was a nation blessed with unique opportunities for individual advancement was widespread among Americans and Europeans alike. The cornerstone of this belief was a relatively wide distribution of property (generally limited, to be sure, to adult white males) and apparently abundant opportunities in commerce and agriculture to accumulate more. But with the rise of mammoth corporations and the closing of the frontier in the decades after the Civil War, the fate of the “self-made man”—that heroic figure who, though of modest origins, triumphed in the competitive marketplace through sheer skill and determination—came to be questioned. In particular, the fundamental changes then occurring in the American economy—the growth of huge industrial enterprises, the concentration of propertyless workers in the nation’s cities, and the emergence of monopolies—made the image of the hardworking stockboy who rose to the top seem more and more like a relic of a vanished era.
The unprecedented spate of success books that appeared between 1880 and 1885 (books bearing such titles as The Law of Success, The Art of Money Getting, The Royal Road to Wealth, and The Secret of Success in Life) provides eloquent, if indirect, testimony to the depth of the ideological crisis then facing the nation.



Since its origins at the turn of the century, the junior college has had a complex, and at times uneasy, relationship with a public that has looked to the educational system as a vehicle for the realization of the American dream. Despite its self-portrayal as “democracy’s college” and its often heroic efforts to extend education to the masses, the two-year institution has faced widespread public skepticism. For to most Americans, college was a pathway to the bachelor’s degree, and the junior college—unlike the four-year institution—could not award it. Moreover, the early public junior colleges were often tied administratively and even physically to local secondary schools, a pattern that compounded their problems in gaining legitimacy as bona fide institutions of higher education. The two-year institution’s claim to being a genuine college rested almost exclusively on its promise to offer the first two years of a four-year college education. Yet the junior college was never intended, despite the high aspirations of its students, to provide anything more than a terminal education for most of those who entered it; indeed, at no point in its history did even half of its students transfer to a four-year institution. Nonetheless, for at least the first two decades of its existence, almost exclusive emphasis was placed on its transfer rather than its terminal function. As the early leaders of the movement saw it, the first task at hand was to establish the legitimacy of this fragile institution as an authentic college. And this task could be accomplished only by convincing the existing four-year institutions to admit junior college graduates and to offer them credit for the courses that they had completed there. 
If the pursuit of academic respectability through emphasis on transfer dominated the junior college movement during its first decades, by the mid-1920s a countermovement stressing the role of the junior college as a provider of terminal vocational education began to gather momentum. Arguing that most junior college students were, whatever their aspirations, in fact terminal, proponents of this view saw the institution’s main task not as providing a platform for transfer for a minority but, rather, as offering vocational programs leading to marketable skills for the vast majority.

