The US used to have more multi-generational households than it does today. Several generations under one roof made for a symbiotic relationship among family members: the elders preserved family traditions, shared their wisdom with the young, and cared for the infants.
Then, as the elders aged, the youngest child would stay home to care for them instead of searching for work upon coming of age. When the elders passed away, the youngest would inherit most of the family's land and wealth. At the beginning of the 1900s in America, the average life expectancy at birth was 41 years1. Because life expectancy was so short, it wasn't long before the youngest inherited the land and proceeded to raise a family of their own.
As real GDP per capita climbed, life expectancy increased, caregiving became more accessible, and families had fewer children, the value proposition tilted toward children seeking independence rather than caring for their parents and grandparents at home. Additionally, today's parents and grandparents have been fortunate to have access to social, retirement, and health care programs that have helped them remain independent longer. Together, these historical trends have slowly increased demand among the elderly for retirement communities.
Retirement communities in America date back to the 17th century, when English settlers brought the idea of nursing homes to America in the form of almshouses. However, almshouses were also open to orphans and the mentally ill. During the Great Depression, these houses became overwhelmed with residents, and the elderly complained2. The US responded to the complaints by passing the Social Security Act of 1935, which provided a small amount of federal and state assistance to what we now know as nursing homes3.
Nursing homes' popularity accelerated when the US later enacted two health insurance programs, Medicare and Medicaid. Modern-day health insurance started in the Depression era, when Baylor University Hospital discovered a new way to charge its patients for health care. Before Baylor's discovery, hospitals billed their patients directly, using a fee-for-service model. During the Great Depression, patients didn't want to pay large, one-time medical bills when they had little to no income. Baylor thought it could change consumers' perception of health care by changing its payment structure: it began charging patients a small amount of money each month in exchange for covering their cost of care. While that payment system is widespread today, the idea didn't catch on quickly; in 1940, only about 9% of Americans had some form of health insurance3. However, WWII changed the health insurance marketplace.
During the war, the US had a labor shortage, so companies tried to raise wages to attract workers. The US, however, didn't want wages to rise: higher wages would translate to more expensive goods, which meant a more expensive war and a greater risk of hyperinflation. In response, the US passed the 1942 Stabilization Act to limit wage increases. Employers responded to the Act by offering their employees health insurance: if they couldn't entice qualified workers with higher wages, they had to attract them with better fringe benefits.
Harry Truman saw that access to health insurance corresponded with people living longer. He tried, and failed, to pass a national health care plan in 1949. Inspired by Truman, Lyndon Johnson later signed the Social Security Amendments of 1965 at the Truman Library. The law established Medicare and Medicaid, which provided health insurance to many Americans in addition to funding for nursing homes. As nursing homes' popularity surged, assisted living communities began to sprout. Assisted living facilities were designed as a variation on nursing homes. The idea of assisted living was to reduce environmental and organizational stress while promoting residents' autonomy: the model includes private living spaces, a full array of services, and residents' right to make choices about their daily activities and health care4. The success of assisted living in promoting independence while offering supplemental care has led to the construction of over 28,000 assisted living facilities in the US.