Wealth and generations: by focusing on the growing riches of the "1 percent," we miss another form of inequality that is bigger, and arguably even more dangerous.
That's because today's talk about inequality generally isn't about actual people. It's about disparities between different, abstractly drawn, arithmetically defined, statistical categories. And so we hear about how, say, the "top 1 percent" compares in income to the "bottom decile."
Of course, important truths are revealed by such comparisons. Mostly these have to do with the strong trend of super-rich people getting even richer--a trend that leads many people to worry, with reason, about rule by plutocrats and the coming of a new Gilded Age.
But, frankly, that kind of inequality is not the half of it. For one, most of us are much more concerned with how well we are doing compared to five or ten years ago, or compared to the life we remember our parents having at our age, or about whether our children will ever do as well. Those kinds of questions have a lot more emotional and practical personal import than whether the "1 percent" gobbled up another percentage point of the nation's income.
Nor, at least arguably, is the kind of inequality measured by such statistics the kind that has been growing the most over the last several decades. Worse, because of the habits of thought that build up from constant use of such statistical artifices in political discourse, our political conversations tend to minimize the full extent of inequality as it's experienced by most Americans.
Here's an example. How often have you heard it said that the average middle-class family is suffering from "stagnant" income? That's the kind of formulation that emerges if you look at, say, the middle three-fifths of the income distribution today and compare it to the middle three-fifths of the income distribution in, say, 1979. After adjusting for inflation, the picture shows middle-class households having basically the same income as they had thirty-six years ago.
Yet there is a big problem with that conclusion: these are not the same people! The heads of today's young households weren't even born in 1979. And those who were middle-aged in 1979 are now deep into their retirement years.
What happens when we use statistics that treat these very different people as if they were the same people? We come up with averages or medians that smooth over and obscure the vast differences in the economic trajectories that Americans of different generations have been on during the last several decades.
Specifically, we miss two huge trends. The first is positive.
We miss that those Americans who were middle-aged in 1979 have, as a whole, seen their standard of living rise sharply compared both to their own previous experience and to that of their counterparts in the previous generation. So, for example, when people who were fortysomething in the late 1970s became fiftysomething in the late 1980s, their income and net wealth were not only higher than they had been ten years before; they were also far better off than fiftysomethings had been in the 1970s. And as retired seventysomethings today, not only have most seen their personal income and net worth hold even or continue to rise, they are also far better off financially than were seventysomethings in the 1990s. For this birth cohort of Americans, dramatic upward mobility, not stagnation, has been the norm.
The other big trend is what has been happening to each subsequent younger cohort of Americans, which is basically the opposite. Start, for example, with the twentysomethings of 1979. They had a lower real income in 1979 than twentysomethings did in 1969. And as fiftysomethings now, they not only make less money than they did when they were fortysomething, they are also far worse off as a whole than were the fiftysomethings of 2005. This generalization applies to white members of this cohort and even more so to those who are African American or Hispanic.
Today's fiftysomethings may be part of the first generation in American history to experience this kind of lifetime downward mobility, in which at every stage of adult life, they have had less income and less net wealth than did people who were their age ten years before. Yet these mid-wave Baby Boomers shouldn't feel too sorry for themselves. That's because, as we shall see, they were far better off as twentysomethings than were subsequent cohorts of Generation X twentysomethings, and especially better off than today's Millennials.
These vastly different economic trajectories experienced by today's living generations are basically unprecedented. Throughout most of our history, inequality between generations was large and usually increasing, to be sure, but for the happy reason that most members of each new generation far surpassed their parents' material standard of living. Today, inequality between generations is increasing for the opposite reason. Though much more productive and generally better educated, most of today's workers are falling farther and farther behind their parents' generation in most measures of economic well-being.
If it were just a matter of the old getting richer while the young get poorer, it would not necessarily be so bad. Under that scenario, most of us might struggle financially until we grew old, but we could at least look forward to realizing a variant of the American Dream in retirement. But that's not how these trends are playing out. The downward mobility of today's younger Americans leads to the downward mobility of tomorrow's older Americans, making the problem of growing generational inequality truly dire. It's time to get clear about just what's been going on and what we can do about it.
Let's first get some predictable objections out of the way, starting with the problem of defining and measuring what we mean by a standard of living. Leave aside those philosophical questions we could all debate endlessly about whether and how, say, smartphones, gay marriage, and climate change make younger generations better or worse off. Other more straightforward challenges apply even when comparing the strictly material standard of living of different birth cohorts at different times in their lives.
Among the complicating factors are changes in family structure and size (more single-parent homes, fewer children overall), the rise in the number of working women, and the increasing proportion of the population drawn from historically disadvantaged groups. Other considerations include the true measure of inflation, the amount of financial and unemployment risk borne by individuals in different eras, and changes in educational attainment. Yet while no single metric is perfect, in combination they tell a dramatic and, by and large, depressing story.
The most straightforward apples-to-apples comparison is between the amount of income working-age men with a specific level of education make today compared to what their counterparts in the previous generation made. According to work done by the economists Michael Greenstone of the Massachusetts Institute of Technology and Adam Looney of the Brookings Institution, the steepest downward mobility has been among male high school dropouts, who in 2009 earned 66 percent less (adjusted for inflation) than their counterparts did in 1969, due to a combination of falling real wages and declining labor force participation rates.
The slide for men with only a high school degree, who constitute the majority of men in the United States, was almost as bad: a staggering 47 percent. College-educated men did better, of course, but only by not falling as far. For prime-age male college graduates, real earnings in 2009 were 12 percent below those enjoyed by their counterparts forty years before. Even among those who worked full-time, real earnings were 2 percent below those of their counterparts in 1969.
These trends were well in place before the coming of the Great Recession from 2007 to 2010. According to work by Jeff Madrick and Nikolaos Papanikolaou, between 1969 and 2005 real earnings for full-time male workers, ages twenty-five to thirty-four with only a high school degree, declined from $34,681 to $30,000 (in 2005 dollars). Meanwhile, full-time college-educated male workers of the same age eked out hardly any gains compared to their counterparts in the previous generation, as real wage and salary income for this group increased at an annual growth rate of just 0.1 percent between 1969 and 2005.
Making a similar comparison between today's working women and their counterparts a generation ago reveals an only slightly less dramatic story. For example, according to a recent Brookings study, among full-time working women, ages thirty to forty-five, who lack a high school degree, real wages were 12 percent lower in 2013 than they were for their counterparts in 1990. For the typical working woman in this age group who has a high school degree but never graduated from college, wage and salary increases have been hardly measurable from one generation to the next, rising by just 3 percent between 1990 and 2013.
Only college-educated women in this age cohort who worked full-time saw any substantial gains over their counterparts of 1990. This was primarily a compositional effect, due mostly to modest increases in the number of women in managerial jobs rather than to any general increase in wages for women doing the same jobs.
These trends for men and women converge in the statistics on family income, which, especially for the young, show dramatic downward mobility. The median income among families headed by someone under thirty-five was just $35,500 in 2013. Adjusted for changes in the consumer price index, this was down nearly 20 percent from what young families earned in 2001.
But this hardly tells the whole story. Another measure of generational downward mobility is the ever-earlier age at which workers in successive cohorts have typically seen their earnings top out and then fall into permanent decline. In nearly all previous eras, workers normally saw their income rise in their twenties, thirties, forties, and fifties as they gained education and experience and as wage rates in general grew. A worker's earnings (as well as savings) would then typically peak and level off in late middle age before declining around age sixty-five. Though their earnings might have been interrupted by illness or temporary unemployment, most workers generally earned more than they had in the past until they retired. This historical pattern still held strongly through about 2000, after which successive birth cohorts of Americans started seeing their earnings peak and then decline at younger and younger ages.
The tipping point came with the cohort born between 1946 and 1950. The median household income of these early-wave Baby Boomers rose steadily during their early working-age years, in accord with the historical pattern. Adding to these gains in household income was a sharp increase in the number of working women, as the two-paycheck family gradually became the middle-class norm. Yet despite this increase in female paid labor, median income for these households started declining while their prime wage earners were still in their early fifties--a time of life when members of previous generations were typically gaining in real income from year to year. According to census data compiled by Robert J. Shapiro of Brookings, for members of this cohort median household income peaked in 2000 at $78,458 and fell each year thereafter, winding up at an inflation-adjusted $50,834 in 2013.
This pattern has grown progressively worse among Americans born in subsequent years. For example, mid-wave Boomers born between 1953 and 1957 saw their median household income peak at $77,543 in 2002, when they were between forty-five and forty-nine. For this cohort, household income subsequently fell by half a percent annually during the so-called economic recovery years of 2002 to 2007 and then fell much more during and after the Great Recession, to $60,100 by 2013. Financially speaking, fifty turned out to be the new sixty-five for these cohorts, even as they were expected to live longer.
For late-wave Boomers and early-wave Gen Xers, the story has been worse. For example, among persons born between 1962 and 1966, median household income peaked in 2007, when they were still between the ages of forty-one and forty-five, and has not yet recovered.
Late-wave Gen Xers and all Millennials are still young enough that most have probably not yet reached (let's hope!) their personal lifetime peak of annual earnings. Yet the trend in median household income among these younger birth cohorts shows that most members have already missed out on the rapid increase in earnings that members of previous generations typically experienced in their twenties and thirties. This early-career earnings deficit has left them with fewer dollars to save while young, putting them even further behind older cohorts in their ability to build long-term assets, such as adequate saving for retirement.
Contributing to this trend are large numbers of Americans who were raised in middle-class homes but who have fallen down the economic ladder as adults. According to a study by the Pew Charitable Trusts of children born in the late 1970s, a third of those raised in middle-class families--defined as families between the 30th and 70th percentiles of the income distribution--have fallen out of the middle class in adulthood. This phenomenon is particularly pronounced among members of minority groups. Among African Americans who were raised in middle-class families, for example, 37 percent fell out of the middle class by the time they reached middle age. The corresponding number for their white counterparts was 25 percent.
How do these rates compare with the numbers of Americans who move up from poverty? Recent research by Raj Chetty and others shows that over the last two generations, fewer than one out of ten children born to parents in the bottom fifth of the income distribution managed to rise to the top fifth as adults. This ratio has apparently not changed since the 1970s. Yet because overall income inequality has increased substantially since then, the consequences of failing to rise up from the bottom of the income ladder have become more extreme, as have the consequences of falling down the ladder.
Income alone does not define a standard of living, of course. Getting ahead in life also requires accumulating assets, such as home equity and financial savings, that exceed one's debts and other liabilities. Without at least some net wealth, it is impossible to finance a first home, pay for a child's college education, enjoy financial security in old age, or leave behind an inheritance. The opposite of net wealth is insolvency.
Until the present era, despite vast disparities and inequalities across different racial, ethnic, and other demographic groups, most American families realized a rising net worth, not just within the life course of each generation but from one generation to the next. Today's older Americans still exemplify this historical pattern. For example, according to work done by the Urban Institute, Americans who were seventy-four or older in 2010 had an average net worth that was 149 percent higher than that enjoyed by Americans who were the same age in 1983 (after adjusting for inflation).
This pattern has disappeared, however, among all subsequent birth cohorts. The tipping point came with people born in 1952, who by 2010 had become perhaps the first birth cohort in American history to have less real net worth on the threshold of retirement than people born ten years earlier had at the same age. From there, the real net worth of subsequent birth cohorts has generally remained stagnant or has declined compared to the life-cycle experience of birth cohorts ten to twenty years older. For example, after adjusting for inflation, the median net worth of families headed by a person thirty-five to forty years old was 30 percent less in 2010 than it was for their counterparts in 1983.
Because of the vast upward mobility of the cohorts born before the 1950s, and the general downward mobility of Americans born during and after that decade, the economic status of the next generation of elders will, on current course, be lower than that of today's retirees--and their children are even less likely to be able to make up any shortfall. One study by the Pew Charitable Trusts has found that the typical retiree couple born between 1936 and 1945 had enough net wealth to replace 100 percent of their pre-retirement income when combined with annuitized assets, such as private pensions and Social Security. For younger Americans, however, that replacement ratio drops steadily. Because of the comparatively meager net wealth of most Gen Xers, for example, the typical Gen X couple is on course to see their income drop by half in retirement.
This is assuming that both members of such a couple are able to continue working until the previously normal retirement age, which may well not happen. Labor force participation rates for men now sixty-five and older have increased compared to those of their counterparts in the early 1990s. But for today's prime-age men, corporate downsizing, low wages, obsolete job skills, rising rates of chronic illness such as diabetes, long-term unemployment, and other factors have been driving down labor force participation rates sharply. The share of prime-age men--those twenty-five to fifty-four years old--who are in the workforce declined by 5 percent between 1992 and 2012. Since the 1960s, the share of prime-age men no longer in the workforce has roughly tripled. Taken together, these trends paint a picture of a new America in which most members of each successive generation typically have lower real household income and net wealth than did their counterparts in the previous generation, while men, at least, also have a shorter and less secure attachment to the workforce.
Adding to the difficulties facing future elderly Americans is the disappearance of windfall Social Security benefits. In the late 1970s, Social Security paid out benefits to retirees that exceeded the value of their contributions by between $250,000 and $300,000 in today's money. Subsequent birth cohorts have paid a far higher share of their income into the system, but under current law, most members are promised little more in benefits than they paid in taxes. Social Security payroll taxes remained below 2.5 percent through the 1950s and below 4 percent until the end of the 1960s. But workers born in the 1960s have paid 6.2 percent of their income into the system throughout most of their working lives--and, in truth, it's double that, since most economists agree that the employer contribution in payroll taxes is ultimately borne by employees.
Having effectively paid about one out of every eight dollars they earned into Social Security, Americans born during and since the 1960s have had correspondingly less ability to save for their own retirement, even as the system's rate of return has become progressively lower for each new generation. The same diminishing rate of return is found in many private pension plans as well, even as pension coverage itself has fallen precipitously among today's young and middle-aged workers.
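The arithmetic behind that "one of every eight dollars" figure is easy to check. A minimal sketch, using the statutory 6.2 percent rate mentioned above and the economists' consensus that the employer's matching contribution is ultimately borne by the worker:

```python
# Statutory Social Security (OASDI) payroll tax rate paid by the employee,
# as cited in the article for workers born in the 1960s:
employee_rate = 0.062

# Most economists hold that the matching employer contribution is
# ultimately borne by workers in the form of lower wages:
employer_rate = 0.062

effective_rate = employee_rate + employer_rate  # 0.124

print(f"Effective rate: {effective_rate:.1%}")
print(f"Dollars paid per $8 earned: {effective_rate * 8:.2f}")  # ~$0.99, i.e. about $1 in $8
```

Since 0.124 multiplied by 8 is roughly one dollar, the combined 12.4 percent rate is what justifies the "one in eight" shorthand.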
The declining cost and increasing quality of digital technologies, as manifested by smartphones and their apps, give many of today's Americans access to goods and services that were beyond the reach of even the richest people on the planet a generation ago. The price of food, cars, and many other consumer items is also lower, relative to wages, than was the case thirty or forty years ago. Yet the cost of the particular goods and services Americans most need to help themselves and their children rise up the economic ladder has at the same time grown much faster than family income or general inflation. This is another large factor behind the stark increase in wealth inequality among the generations.
One major example is the inflation in higher education costs. Over the last generation, graduating from college has become a near prerequisite to obtaining middle-class status, or to avoiding the loss of it. Yet even as the cost of paying for higher education became, for that reason, harder for families and individuals to avoid, the cost of attending a public or private college escalated 40 percentage points more than the consumer price index between 2005 and 2015.
Compounding the burden, the share of the higher education sector's revenue paid by families and students rose from one-third in 1980 to one-half in 2012, reflecting not just rising tuition but also a sharp decline in need-based financial aid over the last generation. Closing the gap has been a mountain of debt on household balance sheets. The share of young adults with student loans rose from 26 percent in 2001 to 40 percent in 2010. Sadly, much of this debt is held by people who never finished college, and who have often been victimized by predatory lending practices.
Meanwhile, the dramatic rise of health care costs relative to family incomes has been, and will continue to be, particularly burdensome on younger generations. As recently as the 1960s, health care costs were an incidental expense for most young American families. In 1964, health care spending was just $197 per person per year. This low cost meant that with a mere seventy-eight hours of labor (or by the end of the second workweek in January, for those working full-time), the average nonsupervisory worker earned enough to cover the per capita cost of health care, including that of all children and retirees.
By contrast, such a worker had to put in 452 hours in 2012 before earning enough to cover the average per capita burden of medical expenses, which by then had risen to over $8,915. Put another way, in that year it was nearly March before the typical American working a forty-hour week earned enough to pay the health care sector's growing claim on his or her personal output.
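The hours-of-labor comparison above can be sketched out explicitly. The per capita costs ($197 and $8,915) are the article's figures; the hourly wages below are back-calculated from them, not independently sourced, and stand in for the average nonsupervisory wage in each year:

```python
def hours_to_cover(per_capita_cost, hourly_wage):
    """Hours a worker must labor to earn the per capita health care cost."""
    return per_capita_cost / hourly_wage

# Per capita health spending figures are from the article; the wages are
# implied by its hours-of-labor claims (cost divided by hours), used here
# only to illustrate the calculation.
wage_1964 = 197 / 78      # roughly $2.53 per hour
wage_2012 = 8915 / 452    # roughly $19.72 per hour

print(round(hours_to_cover(197, wage_1964)))    # 78 hours in 1964
print(round(hours_to_cover(8915, wage_2012)))   # 452 hours in 2012
print(round(8915 / 197))                        # per capita cost grew ~45-fold in nominal terms
```

The point of the exercise: even though the hourly wage rose almost eightfold in nominal terms over those decades, the per capita health care bill rose about forty-five-fold, so the labor time needed to cover it grew nearly sixfold.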
The total annual cost of health care for a typical family of four covered by a typical employer-sponsored plan reached $23,215 in 2014, or roughly the equivalent cost of buying a brand-new Honda Accord LX every year. The continually growing burden of health care costs is a major reason why employers are so reluctant to hire and wages remain stagnant.
While some of the increase in health care costs reflects genuine advances in the practice of medicine, most simply reflects rising prices for existing medical services combined with an increasing volume of redundant tests, unnecessary surgeries, and other forms of overtreatment that don't improve health. Peer countries achieve better population health and life expectancy while expending as little as half as much per person in health care services. As such, most of the increasing cost of health care is pure inflation and does not reflect improvement to the average American's standard of living.
Another factor behind the aggregate downward mobility of Americans born since roughly 1950 has been their exposure to the massive growth of payday loans, subprime mortgage lending, and other wealth-destroying consumer finance products. Americans who came of age before the 1970s were largely protected from predatory lending by usury laws, for example, which capped fees and interest costs on loans. But starting in the 1980s, these consumer finance protections largely disappeared. At the same time, financial engineering, including securitization, led to the growth of financial institutions with business models that allowed them to prosper--at least in the short term--by lending money to people who could not afford to repay.
These trends, combined with generally lagging or falling individual and household incomes and rapidly expanding access to credit, often on predatory terms, led to an explosion of borrowing. When this was followed, in turn, by a collapse in home prices, the result was devastation to the balance sheets of most Americans under fifty. By 2010, the average family headed by a person twenty-five to forty-nine had a net worth that was 32 percent below that of their counterparts in 1989.
This sequence of events particularly damaged members of Generation X, many of whom took out mortgages on predatory terms at, or near, the top of the housing bubble. Largely as a result, from 2007 to 2010, Gen Xers as a whole lost nearly half (45 percent) of their wealth, or an average of about $33,000 subtracted from already low levels. Many were pushed into negative net worth, as their houses became worth less than their debts. By contrast, those born during the Great Depression era (between 1926 and 1935) experienced zero loss of net wealth as a group during the Great Recession. Indeed, in 2010, those ages sixty-five to seventy-four had a net worth 53 percent higher than that of their same-age counterparts in 1989.
Most Millennials were too young to be in the market for real estate during the housing bubble in the mid-2000s and therefore did not directly experience the evaporation of real estate wealth caused by the Great Recession. But while this might be counted as a blessing, the longer-term trend of declining asset ownership among today's younger Americans has potentially very negative implications for their future net wealth.
For example, rates of homeownership among households headed by a person under thirty-five have fallen from 43 percent in 2005 to 35 percent in 2014. To be sure, not every Millennial wants or needs to own a house. But homeownership has been the major means by which most ordinary Americans in previous generations built their net wealth and financed their retirements. Moreover, home prices have been recovering since the bottom of the Great Recession, and in many places have been escalating sharply. Thus, the continuing decline of homeownership rates among young households has probably subtracted from what the Millennials' aggregate net wealth would have otherwise been. And if the typical Millennial winds up a renter for much, if not all, of his or her life, this will certainly require that the generation acquire some other major means for building assets over the life course.
A sharper decline in stock ownership among young adults does not bode well for that possibility. In 2001, 48 percent of persons eighteen to thirty-one years of age owned stock; by 2013, this share had dropped to 37 percent. This long-term decline in stock ownership among the young occurred in a period in which stocks, despite volatility, appreciated in value by severalfold. It also occurred at a time when traditional, defined-benefit, employer-provided pensions were becoming vanishingly rare among younger workers.
Younger cohorts of Americans are also increasingly less likely to own their businesses, as this magazine was among the first to point out. (See "The Slow-Motion Collapse of American Entrepreneurship," July/August 2012.) On a per capita basis, the rate of new business formation declined by 50 percent between 1977 and 2009, a trend that leaves more businesses failing each year than are started. As Federal Reserve Chair Janet Yellen recently observed, the declining share of Americans who are business owners diminishes what historically has been "a vital source of opportunity for many households to improve their economic circumstances and position in the wealth distribution."
The trend now seems to be compounding among Millennials, who, despite high aspirations to entrepreneurship, are having a difficult time starting new successful businesses. As a recent report by the Kauffman Foundation concludes, even though Millennials have higher levels of education than previous cohorts and lifelong exposure to information technology, their shaky finances mean that most "can't afford to become entrepreneurs." (See also Matt Connolly, "The Lost Entrepreneurial Generation?," page 60.)
Millennials are also less likely than young adults in the past to own other forms of assets, including cars and many durable consumer items. In some instances, this can be positive. If, for example, the growth of services like Zipcar makes owning a rapidly depreciating asset like an automobile unnecessary, this is at least potentially a gain to one's net worth. Being able to "monetize" previously underused assets, such as by renting a spare bedroom through Airbnb, can also have the same positive effects on personal balance sheets.
Yet the flip side of the "sharing" economy is the "gig" economy, in which more and more of us, and particularly the young, are no longer employees but, rather, contingent workers who become responsible for buying and maintaining tools, equipment, and places of business, as well as securing health and retirement benefits, that, in previous eras, were furnished by employers. The Uber driver, for example, has responsibility for purchasing and maintaining the car he uses to work for the "ride-sharing" company, just as the contract white-collar worker must often finance and maintain her own office space, IT systems, career training, and other hard and soft assets necessary for her work. Both are also on their own when it comes to traditional employee benefits, and because they cannot count on a regular paycheck, they have an extra need for building savings to cover the increased volatility in their earnings. Though difficult to measure, the increasing uncertainty and contingency that surrounds today's employment has to be counted as a net negative for most workers' standard of living.
To be sure, some of the factors behind generational downward mobility are difficult to address through public policy. For example, over the last several generations there has been a huge increase within each successive birth cohort of Americans in the share of children being raised by single parents. As recently as the 1980s, among children born to mothers with only a high school degree, only 13 percent were born outside of marriage. By the late 2000s, that figure had risen to 44 percent. Abundant social science research documents that this is both a cause and a consequence of diminishing economic opportunity, yet there is no single policy lever that will reverse the trend.
But many of the major causes of downward mobility do rest squarely within the realm of political economy and public control. One example is the woeful inefficiency of the U.S. health care system, which costs far more and produces the same, or worse, outcomes relative to other industrialized nations. A large body of research now pegs the amount of waste in this burgeoning sector at between 30 and 50 percent of all health care spending. According to the Institute of Medicine, eliminating this waste would be enough to provide every young person in America (ages eighteen to twenty-four) with the average annual tuition and fees of a four-year institution of higher learning for two years--to take but one example of its tremendous opportunity cost.
The higher education sector is also badly in need of systematic rethinking and overhaul. Individuals need to be cognizant of both the mounting cost of not acquiring an education and the lifelong damage that can result from excessive student debt. At the same time, government and society at large need to attack inflating college costs, which seem to result primarily from growth in administrative spending, and a lack of transparency about educational outcomes.
Another priority should be redirecting the vast subsidies the federal government has long expended to help households accumulate financial and tangible assets. These subsidies currently total over $350 billion a year, with the lion's share going to already wealthy households and individuals. For example, American taxpayers annually spend roughly $70 billion to cover the cost of the home mortgage deduction. Yet 70 percent of this money goes to households in the top 20 percent of the income distribution, while just 8 percent goes to middle-income households, and almost nothing to the bottom 40 percent. Similar tax breaks nominally meant to encourage saving for college and retirement have similar "Robin Hood in reverse" qualities. Much more can and should be done to target resources for asset building for those in, or struggling to reach, the middle class.
Let's not forget another possible policy lever: the money supply. Moderate levels of price and wage inflation have always tended to benefit younger adults disproportionately, because younger households tend to have more debts and fewer assets than older households. Conversely, hard money tends to help older generations, who have fewer debts, less need to worry about unemployment, and more assets to protect from inflation. A big part of the reason that today's seventysomethings did so comparatively well financially over their life course was that while they were young, the general wage and price inflation of the 1960s and '70s eroded the value of their mortgages even as it inflated the value of their homes. Today's young people, being particularly encumbered by debt, would particularly benefit from modest levels of general inflation so long as wages kept pace.
More generally, we need policies that will allow today's workers to retain more of the value of their increased productivity. In many sectors of the economy, workers produce as much in one day as their counterparts in the 1960s did in a forty-hour workweek. Yet the benefits of this increased efficiency have gone overwhelmingly to already-established owners of assets rather than to each rising new generation of workers.
The reasons behind this shift are varied, but hardly inevitable or unalterable. Since the 1980s, for example, the U.S. has radically reduced enforcement of anti-trust and fair trade policies. The resulting trend toward concentration in many industries largely explains the diminishing opportunities for upward mobility available through entrepreneurship. Consolidation also reduces the number of employers competing for wage employees, thereby tending to reduce wages and upward mobility for that reason as well. (See, for example, "Who Broke America's Jobs Machine?," Washington Monthly, March/April 2010.) Meanwhile, thanks largely to changes made in tax law and enforcement policy since the early 1980s, major U.S. corporations have used almost all their profits in recent decades to reward their shareholders with dividends and stock-buyback schemes, leaving little for investment in productive enterprise or for raising the wages of rank-and-file workers.
Certainly the potential exists for our children to inherit a far more productive and broadly prosperous society than exists today. Yet for this to occur, it is not enough to dwell solely on the phenomenon of the "1 percent" growing richer. The bigger problem is how plutocrats and their political and intellectual enablers (including many who have no idea what they are doing) continue to use their increasing power and influence over our political economy to cause mass inequality and downward mobility across generations.
Phillip Longman is a senior editor at the Washington Monthly, policy director with New America's Open Markets Program, and a lecturer at Johns Hopkins University. This article is adapted from a chapter in the book What It's Worth: Strengthening the Financial Future of Families, Communities and the Nation, which will be published by Federal Reserve Bank of San Francisco in December 2015.
Date: June 1, 2015