1900s & 2000s
Old age in the 20th century was more diverse than ever before. Death in childhood, youth, or middle age became unusual and shocking; more people lived longer and remained healthy and active later in life. At the beginning of the century in Britain, for example, an average of 74 people a year reached 100; by the end of the century, 3,000 did. Life expectancy in 1901 was 51 for men and 58 for women; by 1991, it was 76 and 81.*
The contributing factors to longer lives included unprecedented improvements in living standards, especially income, diet and hygiene. Somewhat less important were leaps in medical knowledge and techniques. This achievement was nonetheless generally received with concern and pessimism about the burden that growing numbers of older, supposedly dependent people would impose upon a shrinking younger population.
The disparity between official and popular perceptions of old age continued throughout the 20th century and became, if anything, starker. The decade of the 1930s found the United States facing the worst economic crisis in its modern history. Millions of people were unemployed, two million adult men wandered aimlessly around the country, banks and businesses failed, and the majority of the elderly lived in dependency. These circumstances led to the establishment of Social Security.
One of the core principles of the Social Security program adopted in late 1935 was social insurance, a respected intellectual tradition that originated in 19th-century Europe as an expression of the European social welfare tradition.
Philosophically, social insurance emphasized government-sponsored efforts to provide for the economic security of a nation's citizens. It was first adopted in Germany in 1889 at the urging of Chancellor Otto von Bismarck. By the time the U.S. adopted social insurance, 34 nations were already operating some form of social insurance program.
The U.S. Social Security program was designed to pay retired workers age 65 or older a continuing income after retirement. From 1937 until 1940, Social Security paid benefits in the form of a single, lump-sum payment. The purpose of these one-time payments was to provide some "payback" to those people who contributed to the program.
Under the 1935 law, monthly benefits were to begin in 1942, with the period 1937-1942 used both to build up the Trust Funds and to provide a minimum period of participation in order to qualify for monthly benefits. The average lump-sum payment during this period was $58.06. The smallest payment ever made was 5 cents.
The decade of the 1960s brought major changes to the Social Security program. Under the Amendments of 1961, the age at which men were first eligible for old-age insurance was lowered to 62, with reduced benefits. Women had been given this option in 1956.
The most significant administrative change involved the signing of the Medicare bill by President Lyndon Johnson on July 30, 1965, at which time the Social Security Administration became responsible for administering a new social insurance program that extended health coverage to almost all Americans aged 65 or older. Nearly 20 million beneficiaries enrolled in Medicare in the first three years of the program.
In 1972, automatic Cost-of-Living-Adjustments (COLAs) were introduced to maintain the purchasing power of benefits already awarded. Soon after, it became apparent that Social Security faced a funding shortfall, both in the short-term and in the long-term. The short-term problem was caused by the bad economy, and the long-term problem by the demographics associated with the baby boom.
Aside from Social Security, by the 1960s company-sponsored pensions of some kind were the normal expectation of the older worker. Pensions were payable from age 60, 65, or even 70. The pension age became the normal age of voluntary or involuntary retirement, although in Communist countries, especially the Soviet Union, strenuous efforts were made to avoid linking pensions and retirement.
Regardless, for many the sudden experience of limitless leisure late in life was not always easy. British car-workers and shipbuilders retiring in the 1970s, for example, seemed worn-out and despairingly lost, cut off from former workmates. They often wept on their last day at work.
By the late 1990s, almost one-third of West European workers had retired permanently from paid work by the age of 60. Some left willingly on comfortable pensions. Others gave up work more reluctantly. It was sometimes argued that early retirement was the unavoidable consequence of changing technologies, as skills and knowledge became obsolete and older people could not keep pace.
But all evidence pointed in the opposite direction. Older workers, when given the opportunity, could cope better in the high-tech labor market and proved they were highly capable of learning new skills. They were also more reliable when compared with younger workers, but less malleable.
The 20th century was also the first where medicine achieved the capacity to diagnose and cure extensively, and only since the mid-20th century have medical services been easily and cheaply available to people of all ages in most developed countries. The term geriatrics was coined in 1909 by Ignatz Nascher, a physician who was born in Austria and brought up in New York City.
Nascher believed that doctors paid insufficient attention to the ill-health of older people because, with patients not long to live, it was not thought worthwhile trying to cure them. The desire to prolong active life and reveal the secret of rejuvenation received stronger support in the Soviet Union, driven by the conviction that Bolshevism could revolutionize even the human lifespan. Soviet gerontologists argued that humankind had the capacity to live to 120 and beyond.
Geriatric medicine became increasingly attractive in non-Communist countries as the numbers of older people, and the costs of their medical care increased. In 1948, Britain introduced a National Health Service which provided free healthcare for all citizens. Establishment of similar systems throughout Europe became more widespread. Greater access to healthcare for poorer aging people revealed that attention to mundane but disabling conditions affecting hearing, eyesight, teeth and feet could greatly improve their lives.
The net effect: those who recovered from acute conditions that would have killed them in earlier times began to succumb to chronic disorders such as arthritis, diabetes and Alzheimer's. Still, the majority of people surviving to their 80s and 90s did not suffer serious illness and regarded themselves as in good health and capable of independent activity. For most people, even at very late ages, death was not preceded by a long period of serious dependency.
Consequently, older people in the later part of the 20th century tended to look younger than people of similar age in the past. Medical specialists suggested that 75-year-olds were physiologically similar to 60- or 65-year-olds at the beginning of the century. Also, the age-specific dress codes of earlier centuries slowly disappeared. Some argued that a cult of youth forced older people to disguise their ages, denying them the opportunity to grow old gracefully and naturally.
Regardless, old people by the end of the 20th century had more freedoms than ever before. Instead of the number of years lived, the inability to sustain independent living emerged as the most important perceived marker of the onset of old age.
* Part of 1900s section was derived from the chapter written by Pat Thane.
In the 21st century, the unprecedented development and diffusion of technologies will offer the potential to improve the independence and quality of life of older people. Meanwhile, the world's population of seniors is projected to surge, increasing from 530.5 million in 2010 to 1.5 billion in 2050.
About one in six people will be 65 or older by 2050, creating a situation where health care, insurance, and retirement systems will face the daunting task of meeting the needs of a rising number of customers with fewer working-age people (ages 15 to 64) to help pay for them.
At a more personal level, longer life spans may strain household finances, cause people to extend their working lives, or rearrange family structures. Determining how long to keep working before letting the next generation take over could prove to be the ultimate challenge for older people.
A recent Pew Research survey found a wide divergence in people’s confidence that they will have an adequate standard of living in their old age. Confidence in one’s standard of living in old age appears to be related to the rate at which a country is aging and its economic vitality.
Confidence was lowest in Japan, Italy and Russia, countries that are aging and where economic growth has been anemic in recent years. In these three countries, less than one-third of people are confident about their old-age standard of living. Meanwhile, there was considerable optimism about the old-age standard of living in countries whose populations are projected to be relatively young in the future or that have done well economically in recent years, such as Nigeria, Kenya, South Africa and China.
Americans were less likely than most of the global public to view the growing number of older people as a major problem. They were more confident than Europeans that they would have an adequate standard of living in their old age. And the U.S. was one of very few countries where a large plurality of the public believed that individuals are primarily responsible for their own well-being.
Some Things to Think About:
Universal Basic Income (UBI)
Exponential advances in robotics and artificial intelligence could prove formidable for the workforce. In a recent report to Congress, the White House put the probability at 83 percent that a worker making less than $20 an hour in 2010 will eventually lose their job to a machine, while workers making as much as $40 an hour face a risk of 31 percent. An often-cited Oxford University research paper estimated the potential automation of about half of all existing jobs by 2033. A report by the World Economic Forum estimated that despite the creation of millions of new jobs over the next four years, there will likely be a net loss of 5 million jobs because of automation.
Consequently, over the next few years expect to hear more talk about decoupling income from work through universal basic income programs. Essentially, UBI programs will try to offset the negative effects of automation and, much like Social Security, ensure that all citizens have the financial means to meet their basic needs. Switzerland, Finland and the Netherlands are at the forefront in this area. Of course, there are pros and cons to such programs.
Some researchers believe that future breakthroughs in technology will eventually enable humans to have indefinite lifespans through complete rejuvenation to a healthy youthful condition, and that it may be conceivable for human beings to transform themselves into beings with abilities so greatly expanded from the natural condition as to be no longer human by our current standards.
These "post or trans-humans" could be completely synthetic artificial intelligences, or a symbiosis of human and artificial intelligence, or uploaded consciousnesses, or the result of making many smaller but cumulatively profound technological augmentations to a biological human, such as a cyborg.
A variation on the posthuman theme is the notion that once human beings are no longer confined to the traditional parameters of human nature, they might grow so powerful physically and mentally as to appear god-like, ascending to a plane of existence so elevated that their behavior would be incomprehensible to modern humans, purely by reason of our limited intelligence and imagination.
Apocalyptic scenarios forecast the end of the human race, while others argue for the likelihood that both humans and posthumans will continue to exist, but the latter will predominate in society over the former because of their abilities.
Living Apart Together
Since 1990, the divorce rate among adults 50 years and older has doubled. This trend, along with longer life expectancy, has resulted in many adults forming new partnerships later in life. A new phenomenon called "Living Apart Together" (LAT)—an intimate relationship without a shared residence—is gaining popularity as an alternative form of commitment. Researchers at the University of Missouri say that while the trend is well understood in Europe, it is less known in the U.S., leaving open questions about how LAT partners can engage in family caregiving and decision-making, and how the arrangement affects family needs.
The Fourth Industrial Revolution
Growth in the first industrial revolution was driven by engineering, in the second by electricity and production lines, and in the third by information technology. The modern economies that will thrive in a fourth industrial revolution will not be those that worship machines, but those that support human creativity. When we understand how people think and work best, we will be compelled to put workers' well-being first in the name of both health and economic productivity.
The Dark Secret of Artificial Intelligence
No one really knows how the most advanced Artificial Intelligence algorithms do what they do. That could be a problem as deep-learning technologies transform how we make decisions.