History of Nursing

In ancient times, when medical lore was associated with good or evil spirits, the sick were usually cared for in temples and other houses of worship. In the early Christian era nursing duties were undertaken by certain women of the church, who extended their services to patients in their homes. Although these women had no formal training by today's standards, experience taught them valuable skills, especially in the use of herbs and drugs, and some won fame as the physicians of their era. In later centuries, however, nursing duties fell mostly to relatively untrained women.

In the 17th cent., St. Vincent de Paul began to encourage women to undertake some form of training for their work, but there was no real hospital training school for nurses until one was established in Kaiserswerth, Germany, in 1836. There Florence Nightingale received the training that later enabled her to establish, at St. Thomas's Hospital in London, the first school designed primarily to train nurses rather than to provide nursing service for the hospital. Similar schools were established in 1873 in New York City, New Haven, Conn., and Boston. Nursing subsequently became one of the most important professions open to women, and it remained so until the social changes wrought by the revival of the feminist movement that began in the 1960s (see feminism) opened other careers to women. The late 20th cent. saw growing nursing shortages in U.S. hospitals as stagnant salaries, increasing workloads, and widening job opportunities for women led to falling enrollments in nursing degree programs.

The Columbia Electronic Encyclopedia, 6th ed. Copyright © 2024, Columbia University Press. All rights reserved.