Writers in general focus on headlines rather than content. For example, over the past two weeks the media has jumped on a recent study published in the Journal of Economic Perspectives that accuses the actuaries of the Social Security Administration (“OCACT”) of systematically overstating the projections for the solvency of the trust funds.
While any question about the integrity of these forecasts deserves coverage, even the best coverage of this story failed to explain how the study fits into the broader questions about the stability of Social Security. Most of the stories inflated the breadth of the research and applied its findings far beyond the scope of the study.
The study isn’t about the future. It is about the past. It deals with the inputs to the forecast, not the output of the forecast. It deals with three inputs, not all inputs. It tells you almost nothing about the long-term decline in the projected solvency of the Trust Funds. In total, the study suggests that OCACT is getting worse at fortune telling, and we don’t know why.
Oddly enough, the answer is actually pretty simple: OCACT did not foresee the Great Recession five years out. The irony here is that most of the reporters covering this story didn’t see the financial crisis coming when it was months away.
The study expresses this revelation in highly inflammatory language: “In recent years, especially after about 2000, the Social Security Administration began issuing systematically biased forecasts with overconfident assessments of uncertainty.” It is the language, rather than the content, that has created the coverage.
It really can’t surprise anyone that forecasts made during a steady economic expansion, from 1982 to 2000, were more accurate than ones made during a period of economic uncertainty that began with the end of the Internet Bubble and ended in the Great Recession. The lesson of the study is that even the best forecasts are at the mercy of future events.
News coverage went in a different direction:
“[Since 2000], the forecasters proved overly optimistic, overestimating revenue and underestimating costs, with the total error reached nearly $1 trillion.” ~ Barron’s

A great deal of coverage prominently cited a figure of $1 trillion. This figure does not come from the study or its authors. According to Gary King, an author of the study, the figure was the media writer’s own calculation.
The figure deals with issues that are well outside the realm of the study. The $1 trillion of total forecasting error is the sum of ALL variance in forecasting inputs and modeling errors. The study, on the other hand, examines only three of the ingredients that go into baking the pie that we call the forecast. Moreover, the study provided a cost assessment for only a sliver of one of those variables.
That sliver happens to be the sliver that makes the forecast appear worse. The study estimated the cost of people 65 and older outliving the mortality assumptions. This is the number of people who lived longer than the actuaries expected. The calculated cost to the program was equal to the number of unexpected beneficiaries multiplied by average benefits.
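To make that arithmetic concrete, here is a minimal sketch of the calculation as described above. The input figures are hypothetical placeholders for illustration, not numbers taken from the study.

```python
# Back-of-envelope version of the cost calculation described above.
# Both inputs are hypothetical placeholders, not figures from the study.

unexpected_beneficiaries = 100_000   # hypothetical: people 65+ who outlived the mortality assumptions
average_annual_benefit = 15_000.0    # hypothetical: average yearly benefit, in dollars

# Cost to the program = unexpected beneficiaries x average benefit
estimated_cost = unexpected_beneficiaries * average_annual_benefit
print(f"Estimated cost to the program: ${estimated_cost:,.0f}")
```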
If you are going to calculate the impact of underestimating mortality, the estimate needs to include all ages, not just the ones at which people are collecting benefits. The estimate in the study is only meaningful if the only age group outliving expectations is people 65 and older. You have to know at what point in our lives we are living longer.
The answer to that question may surprise you. OCACT recognizes that we are living longer. In 1940, somewhere between 50% and 60% of the population could expect to survive from age 21 to age 65. By 1990, that figure had risen to between 72% and 83%. A big increase, yes. That increase in life expectancy, however, is occurring at a point in our lives where we are generally contributing to Social Security rather than drawing benefits.
Overall, the report doesn’t change my view. I use the information from OCACT almost exclusively. Over years of writing about Social Security reform, I have come to trust the forecasts from the Social Security Administration as the best effort available. They may not always be right, but I am confident that no one is paying them to be wrong.
The study largely represents a missed opportunity to ask more serious questions. As much as I use the data from OCACT, I recommend that you follow the trend. Since 1987, the system has lost about 1.5 years of solvency for every calendar year that has passed. At that rate, the system reaches insolvency in 2027, as the sketch below illustrates. This study tells you nothing about that longer-term decline, and in fact seems to ignore it.
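The extrapolation behind that 2027 figure can be reproduced in a few lines. The baseline depletion date of roughly 2033 used here is my assumption for illustration, broadly in line with the Trustees’ projections at the time; the 1.5-year rate is the trend cited above.

```python
# Linear extrapolation of the solvency trend described above.
# Assumption (not from the article): a projected trust fund depletion date of
# roughly 2033 as of 2015. The 1.5-year loss rate is the trend cited in the text.

current_year = 2015
projected_depletion = 2033           # assumed baseline projection
loss_per_calendar_year = 1.5         # years of remaining solvency lost each calendar year

remaining_solvency = projected_depletion - current_year         # 18 years on paper
years_until_gone = remaining_solvency / loss_per_calendar_year  # 12 years at the cited rate
print(f"Extrapolated insolvency year: {current_year + years_until_gone:.0f}")  # ~2027
```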
The projections of the Congressional Budget Office are even more troubling. It projects that Social Security will turn cash-flow negative in 2017, rather than the more optimistic 2020 provided by the SSA. The gap between the two forecasts is larger than the time remaining before the earlier date arrives. No one is asking about that gap.
The coverage of the study drives home a larger issue. How can we expect to have an informed debate about Social Security when the media puts headlines over content?