American Medical Schools and the Practice of Medicine

Published By Oxford University Press

ISBN: 9780195041866, 9780197559994

Author(s): William G. Rothstein

After shortages of physicians developed in the 1950s and 1960s, federal and state governments undertook programs to increase the number of medical students. Government funding led to the creation of many new medical schools and to substantial enrollment increases in existing schools. Medical schools admitted larger numbers of women, minority, and low-income students. The impact of medical schools on the career choices of students has been limited. Federal funding for medical research immediately after World War II was designed to avoid politically controversial issues like federal aid for medical education and health care. The 1947 Steelman report on medical research noted that it did not examine “equally important” problems, such as financial assistance for medical education, equal access to health care, continuing medical education for physicians, or “the mass application of science to the prevention of many communicable diseases.” The same restraints prevailed with regard to early federal aid for the construction of medical school research facilities. Some medical school research facilities were built with the help of federal funds during and after World War II, but the first federal legislation specifically designed to fund construction of medical school research facilities was the Health Research Facilities Act of 1956. It provided matching grants equal to 50 percent of the cost of research facilities and equipment, and benefited practically all medical schools. In 1960, medical schools received $13.8 million to construct research facilities. This may be compared to $106.4 million for research grants and $41.5 million for research training grants in the same year. Federal grants for research and research training were often used for other activities. 
As early as 1951, the Surgeon General's Committee on Medical School Grants and Finances reported that “Public Health Service grants have undoubtedly improved some aspects of undergraduate instruction in every medical school,” with most of the improvements resulting from training rather than research grants. By the early 1970s, according to Freymann, of $1.3 billion given to medical schools for research, “about $800 million was 'redeployed' into institutional and departmental support. . . . The distinction between research and education became as fluid as the imagination of the individual grantees wished it to be.”


Author(s): William G. Rothstein

The professionalization of academic medicine occurred in the clinical as well as the basic science curriculum. Full-time clinical faculty members replaced part-time faculty members in the wealthier schools. Medical specialties, many of which were rare outside the medical school, dominated the clinical courses. Clinical teaching, which was improved by more student contact with patients, occurred primarily in hospitals, whose patients were atypical of those seen in community practice. The growing importance of hospitals in medical education led to the construction of university hospitals. Early in the century, some leading basic medical scientists called for full-time faculty members in the clinical fields. They noted that full-time faculty members in the basic sciences had produced great scientific discoveries in Europe and had improved American basic science departments. In 1907, William Welch proposed that “the heads of the principal clinical departments, particularly the medical and the surgical, should devote their main energies and time to their hospital work and to teaching and investigating without the necessity of seeking their livelihood in a busy outside practice.” Few clinicians endorsed this proposal. They found the costs prohibitive and disliked the German system of medical research and education on which it was based. Medical research in Germany was carried on, not in medical schools, but in government research institutes headed by medical school professors and staffed by researchers without faculty appointments. All of the researchers were basic medical scientists who were interested in basic research, not practical problems like bacteriology. Although the institutes monopolized the available laboratory and hospital facilities, they were not affiliated with medical schools, had no educational programs, and did not formally train students, although much informal training occurred. 
For these reasons, their research findings were seldom integrated into the medical school curriculum, and German medical students were not trained to do research. German medical schools had three faculty ranks. Each discipline was headed by one professor, who was a salaried employee of the state and also earned substantial amounts from student fees. Most professors had no institute appointments and did little or no research.


Author(s): William G. Rothstein

Medical care at the end of the eighteenth century, like that in any period, was determined by the state of medical knowledge and the available types of treatment. Some useful knowledge existed, but most of medical practice was characterized by scientific ignorance and ineffective or harmful treatments based largely on tradition. The empirical nature of medical practice made apprenticeship the dominant form of medical education. Toward the end of the century medical schools were established to provide the theoretical part of the student’s education, while apprenticeship continued to provide the practical part. The scientifically valid aspects of medical science in the late eighteenth century comprised gross anatomy, physiology, pathology, and the materia medica. Gross anatomy, the study of those parts of the human organism visible to the naked eye, had benefitted from the long history of dissection to become the best developed of the medical sciences. This enabled surgeons to undertake a larger variety of operations with greater expertise. Physiology, the study of how anatomical structures function in life, had developed at a far slower pace. The greatest physiological discovery up to that time, the circulation of the blood, had been made at the beginning of the seventeenth century and was still considered novel almost two centuries later. Physiology was a popular area for theorizing, and the numerous physiologically based theories of disease were, as a physician wrote in 1836, “mere assumptions of unproved, and as time has demonstrated, unprovable facts, or downright imaginations.” Pathology at that time was concerned with pathological or morbid anatomy, the study of the changes in gross anatomical structures due to disease and their relationship to clinical symptoms. The field was in its infancy and contributed little to medicine and medical practice. Materia medica was the study of drugs and drug preparation and use. 
Late eighteenth century American physicians had available to them a substantial armamentarium of drugs. Estes studied the ledgers of one New Hampshire physician from 1751 to 1787 (3,701 patient visits), and another from 1785 to 1791 (1,161 patient visits), one Boston physician from 1782 to 1795 (1,454 patient visits), and another from 1784 to 1791 (779 patient visits).


Author(s): William G. Rothstein

After mid-century, university hospitals became more involved in research and the care of patients with very serious illnesses. This new orientation has created financial, teaching, and patient-care problems. In order to obtain access to more patients and patients with ordinary illnesses, medical schools affiliated with veterans’ and community hospitals. Many of these hospitals have become similar to university hospitals as a result. Medical schools experienced a serious shortage of facilities in their customary teaching hospitals after 1950. Many university hospitals had few beds or set aside many of their beds for the private patients of the faculty. Patients admitted for research purposes had serious or life-threatening diseases instead of the commonplace disorders needed for training medical students. The public hospitals affiliated with medical schools had heavy patient-care obligations that reduced their teaching and research activities. To obtain the use of more beds, medical schools affiliated with more community and public hospitals. The closeness of the affiliation has varied as a function of the ability of the medical school to appoint the hospital staff, the number of patients who could be used in teaching, and the type of students—residents and/or undergraduate medical students—who could be taught there. In 1962, 85 medical schools had 269 close or major affiliations and 180 limited affiliations with hospitals. Fifty-one of the hospitals with major affiliations were university hospitals and 100 others gave medical schools the exclusive right to appoint the hospital staffs. Dependence on university hospitals has continued to decline so that in 1975, only 60 of 107 medical schools owned 1 or more teaching hospitals, with an average of 600 total beds. All of the medical schools averaged 5.5 major affiliated hospitals, which provided an average of 2,800 beds per school. 
Public medical schools were more likely to own hospitals than private schools (39 of 62 public schools compared to 21 of 45 private schools), but they averaged fewer affiliated hospitals (5.1 compared to 6.0). In 1982, 419 hospitals were members of the Council of Teaching Hospitals (COTH), of which only 64 were university hospitals. Members of COTH included 84 state or municipal hospitals, 71 Veterans Administration and 3 other federal hospitals, and 261 voluntary or other nonpublic hospitals.


Author(s): William G. Rothstein

The use of hospitals for medical care became more varied after 1950. More patients were admitted for a wide variety of conditions, and a greater variety of treatments was provided. Many new technologies were adopted that have raised costs considerably. Hospitals employed more residents, foreign medical graduates, and nurses. Between 1946 and 1983, hospitals grew both in size and importance in the health care system. The number of short-term nonfederal hospitals increased by only one-third, but the number of beds and the average daily census doubled and the number of admissions increased 2.6 times, while the U.S. population grew by only two-thirds. Much of the additional use was for nonsurgical care. During the 1928–1943 period, 74 percent of all hospital admissions were surgical. This declined to 60 percent between 1956 and 1968 and to 50 percent between 1975 and 1981. Outpatient care grew even more rapidly than inpatient care, with the number of hospital outpatients doubling between 1965 and 1983. The hospital system has become dominated by large hospitals, practically all of which have affiliated with medical schools. In 1983, the 18 percent of nonfederal short-term hospitals that had 300 or more beds admitted 50 percent of the patients, carried out 59 percent of the surgery, and had 55 percent of the outpatient visits and 61 percent of the births. They employed 72 percent of all physicians and dentists employed in hospitals and 90 percent of all medical and dental residents. At least 60 percent of them had nurseries for premature infants, hemodialysis units, radiation therapy or isotope facilities, computerized tomography (CT) scanners, and cardiac catheterization facilities, and almost one-half had open-heart surgery facilities. Most also offered types of care not traditionally associated with hospitals. 
Practically all of them provided social work services and physical therapy, at least 75 percent provided occupational and speech therapy, and 40 percent provided outpatient psychiatric care. On the other hand, fewer than one-third provided family planning, home care, or hospice services, or partial hospitalization for psychiatric patients. The expanding services of nonfederal short-term general hospitals have led to the employment of larger numbers of workers.


Author(s): William G. Rothstein

During the first half of the twentieth century, American medical education underwent drastic changes. Greater costs of operation and the requirements of licensing agencies forced many medical schools to close and most of the others to affiliate with universities. The surviving medical schools were able to raise their admission and graduation requirements, which was also made possible by the rise in the general educational level of the population. The growth of the basic medical sciences led to the development of a new kind of faculty member whose career was confined to the medical school. During the first half of the twentieth century, the educational level of the population rose significantly. The proportion of the 17-year-old population with high school educations increased from 6.3 percent in 1900 to 16.3 percent in 1920, 28.8 percent in 1930, and 49.0 percent in 1940. The number of bachelors’ degrees conferred per 100 persons 23 years old increased from 1.9 in 1900 to 2.6 in 1920, 5.7 in 1930, and 8.1 in 1940. Between 1910 and 1940, the number of college undergraduates more than tripled. Because the number of medical students did not increase, medical schools were able to raise their admission standards. At the same time, many new professions competed with medicine for students. Between 1900 and 1940, dentistry, engineering, chemistry, accounting, and college teaching, among others, grew significantly faster than the traditional professions of medicine, law, and the clergy. Graduate education also became an alternative to professional training. Between 1900 and 1940, the number of masters’ and doctors’ degrees awarded, excluding medicine and other first professional degrees, increased from 1,965 to 30,021, or from 6.7 to 13.9 percent of all degrees awarded. Colleges and universities decentralized their organizational structure to deal with the increasingly technical and specialized content of academic disciplines. 
They established academic departments that consisted of faculty members who shared a common body of knowledge and taught the same or related courses. Departments were given the responsibility of supervising their faculty members, recruiting new faculty, and operating the department’s academic program. By 1950, departments existed in most of the sciences, social sciences, and humanities.


Author(s): William G. Rothstein

The expansion of the functions of medical schools since mid-century has had many unanticipated and adverse consequences for medical education. As a result, medical schools have lost some of their societal support. In the years since 1900, medical schools have made major changes in their structure in order to solve specific educational problems. University hospitals were built to provide clinical training in hospitals that emphasized education and research rather than patient care. Full-time clinical faculty members were employed in order to professionalize a role previously occupied by part-time practitioner-educators. Biomedical research was undertaken to enable faculty members to advance medical knowledge and enhance their skills as educators. Internships and residencies became restricted to hospitals affiliated with medical schools to replace the poorly supervised practical experience provided in community hospitals with a more structured education administered by professional educators. Each of these changes assumed that medical schools could be removed from the hurly-burly of professional life and made to fit the model of the liberal arts college. This assumption failed to recognize the fundamental differences between the two types of institutions. In liberal arts education, the body of knowledge taught to students need not be suitable for practical application in the community. In many fields, like most of the humanities, it has rarely been used outside of institutions of higher education. In others, like the social sciences, the knowledge has been sufficiently tentative that its direct application has been problematic. In still others, like most natural sciences, the knowledge has been so highly specialized that it could not provide a basis for viable careers. As a result, most faculty members in the liberal arts and sciences have spent their careers in teaching and research without the option of nonacademic employment in their disciplines. 
Medical schools, on the other hand, have continually influenced and been influenced by the practice of medicine in the community. The knowledge taught in medical schools has affected the way that physicians have practiced medicine, but it has also been tested by practitioners and fed back to the faculty for modification and refinement.


Author(s): William G. Rothstein

Training in primary care has received limited attention in medical schools despite state and federal funding to increase its emphasis. Departments of internal medicine, which have been responsible for most training in primary care, have shifted their interests to the medical subspecialties. Departments of family practice, which have been established by most medical schools in response to government pressure, have had a limited role in the undergraduate curriculum. Residency programs in family practice have become widespread and popular with medical students. Primary care has been defined as that type of medicine practiced by the first physician whom the patient contacts. Most primary care has involved well-patient care, the treatment of a wide variety of functional, acute, self-limited, chronic, and emotional disorders in ambulatory patients, and routine hospital care. Primary care physicians have provided continuing care and coordinated the treatment of their patients by specialists. The major specialties providing primary care have been family practice, general internal medicine, and pediatrics. General and family physicians in particular have been major providers of ambulatory care. This was shown in a study of diaries kept in 1977–1978 by office-based physicians in a number of specialties. General and family physicians treated 33 percent or more of the patients in every age group from childhood to old age. They delivered at least 50 percent of the care for 6 of the 15 most common diagnostic clusters and over 20 percent of the care for the remainder. The 15 clusters, which accounted for 50 percent of all outpatient visits to office-based physicians, included activities related to many specialties, including pre- and postnatal care, ischemic heart disease, depression/anxiety, dermatitis/eczema, and fractures and dislocations. 
According to the study, ambulatory primary care was also provided by many specialists who have not been considered providers of primary care. A substantial part of the total ambulatory workload of general surgeons involved general medical examinations, upper respiratory ailments, and hypertension. Obstetricians/gynecologists performed many general medical examinations. The work activities of these and other specialists have demonstrated that training in primary care has been essential for every physician who provides patient care, not just those who plan to become family physicians, general internists, or pediatricians.


Author(s): William G. Rothstein

Undergraduate medical education has changed markedly in the decades after mid-century. The basic medical sciences have been de-emphasized; clinical training in the specialties has replaced that in general medicine; and both types of training have been compressed to permit much of the fourth year to be used for electives. The patients used for teaching in the major teaching hospitals have become less typical of those found in community practice. Innovations in medical education have been successful only when they have been compatible with other interests of the faculty. As medicine and medical schools have changed, major differences of opinion have developed over the goals of undergraduate medical education. Practicing physicians have continued to believe that the fundamentals of clinical medicine should be emphasized. A survey in the 1970s of 903 physicians found that over 97 percent of them believed that each of the following was “a proper goal of medical school training:” “knowing enough medical facts;” “being skillful in medical diagnosis;” “making good treatment plans;” “understanding the doctor-patient relationship;” “understanding the extent to which emotional factors can affect physical illness;” “being able to keep up with new developments in medicine;” and being able to use and evaluate sources of medical information. Only 52 percent felt that “being able to carry out research” was a proper goal of medical school training. Medical students have also believed that undergraduate medical education should emphasize clinical training. Bloom asked students at one medical school in the early 1960s whether they would prefer to “work at some interesting research problem that does not involve any contact with patients,” or to “work directly with patients, even though tasks are relatively routine.” About 25 percent of the students in all four classes chose research, while 58 percent of the freshmen and 70 percent of the juniors and seniors chose patient care. 
The same study also asked students their criteria for ranking classmates “as medical students.” Clinical skills were the predominant criteria used by students, with “ability to carry out research” ranking far down on the list. Faculty members, on the other hand, have emphasized the basic and preliminary nature of undergraduate medical education.


Author(s): William G. Rothstein

Research in medical schools developed after World War I with specific projects funded by foundations, firms, and industries. After World War II, medical schools greatly expanded their research activities with funding from the federal government. Medical school researchers became the most important performers of research funded by the National Institutes of Health, which delegated most of its responsibility for setting research policy to academic medical researchers. Both basic science and clinical research in medical schools has been directed toward an understanding of biological processes rather than the prevention and treatment of disease. Medical school research has become a specialized activity separate from other medical school activities. Research in medical schools began in earnest after 1900 with the employment of full-time faculty members. The quantity of research was limited and the quality did not meet European standards. Erwin Chargaff reminisced that when he came to the United States in 1928, “I found a scientifically underdeveloped country dominated by an unhurried, good-natured, second-rateness. European scientists who visited the country at that time were attracted by the feeling of freedom generated by the wide open spaces and even more by the then very pleasant aroma of the dollar.” Research was at first funded from medical school endowments and grants from a few major foundations, such as the Rockefeller Foundation and the Carnegie Foundation. By the mid-1930s, about 20 private foundations had a major interest in health and spent a total of about $7 million annually for medical research and medical education. About this time also, the American Foundation for Mental Hygiene, the American Cancer Society, the National Foundation for Infantile Paralysis, and other health-related associations began to fund research related to their interests. Private firms also sponsored research with direct commercial applications. 
In return, they used the names of the medical schools in advertisements as providing “scientific” data to support their claims. By 1940, research had become a measurable factor in medical school budgets. In that year Deitrick and Berson found that 59 of the 77 medical schools spent $3.2 million on research: 22 public medical schools spent 8.9 percent of their combined budgets of $9.5 million on research, and 37 private medical schools spent 13.0 percent of their budgets of $17.8 million on research.

