Book chapter by Daniele Alimonti ’16 (Economics of Public Policy)
While working as a research assistant at the Barcelona GSE, Daniele Alimonti ’16 (Economics of Public Policy) co-authored this chapter of the book “Green Public Procurement Strategies for Environmental Sustainability,” edited by Rajesh Kumar Shakya (The World Bank, USA) and published by IGI Global. His co-authors are professors and researchers from Tor Vergata University in Rome, Italy.
Daniele shares this summary of the article:
The article highlights the advantages of Green Public Procurement (GPP) practices for addressing environmental and economic problems at the different stages of the tendering procedure. Drawing on the experiences of European countries, the research aims to reconstruct the state of the art of green public procurement through the lens of a cross-country comparative analysis. After a systematic review of the literature and the core regulations of GPP practice, the article presents the results of a multidimensional analysis of a cluster of 80 practices, identified by the European Union and implemented by governments in 25 countries at the central, regional, or local level. The framework of the analysis builds on several dimensions, mapping the main results along the following lines: geographic origin, government level, implementation period, main criteria used for implementation, and the environmental and economic impact of such practices.
Master’s in Economics of Public Policy alum George Bangham ’17 currently works as a policy analyst at the Resolution Foundation, an influential London-based think tank focused on living standards. In February George published a new report on subjective well-being in the UK, which marked the Foundation’s first detailed analysis of subjective well-being data and its lessons for economic policymakers.
The report received widespread media coverage in the UK, including in the Guardian and the Times, as well as international coverage in France, India, and other countries.
It was launched at an event in Westminster where speakers included the LSE’s Professor Paul Dolan, UK Member of Parliament Kate Green and former head of the UK Civil Service Lord Gus O’Donnell.
Speaking to the Barcelona GSE Voice, George said that while researching and writing the paper he had drawn closely on the material he covered while studying for the Master’s in Economics of Public Policy, particularly the courses on panel data econometrics, on the analysis of social survey microdata, and on the use of subjective well-being data for policy analysis.
Although only 20% of the world’s elderly population receives pension coverage (Pallares-Miralles, Romero and Whitehouse, 2012), and that coverage is not always adequate according to the ILO, non-contributory pensions exist in only a handful of developing countries. Moreover, the elderly population is growing as individuals live longer, which further underscores the need for programs of this type. Costa Rica implemented a non-contributory pension policy in 1975 to ensure the livelihood of those in economic need who were unable to save provisionary funds against the risks of old age; elders aged 65 and above living in extreme poverty are eligible for coverage. Additionally, in 2007 Costa Rica increased the pension amount by 186% in order to mitigate poverty. This study aims to provide further empirical evidence on the indirect effects of non-contributory pensions in Latin America through a case study of Costa Rica, exploring the pension’s impact on employment and schooling, household composition, and changes in well-being over the period from 2001 to 2009.
The methodology includes, first, a difference-in-differences (DD) specification as a general model, which compares the group of recipients before and after 2007 with a control group aged above 65. Second, we exploit the discontinuity in treatment assignment at the age of the oldest household member to define a fuzzy regression discontinuity design (RD). This local analysis identifies only the effect of receiving the pension, so we move to a third, difference-in-discontinuity design (diff-in-disc) that combines the previous models and also quantifies the impact of the pension increase. The RD and diff-in-disc settings include an alternative sample in which treated households have a member aged between 65 and 69, while the comparison group is aged between 61 and 64.
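As an illustration, the first of these specifications can be sketched in a few lines. The variable names and toy numbers below are hypothetical, not the thesis’s actual data; in a two-group, two-period setting, the coefficient on the interaction term is exactly the difference-in-differences of group means.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Toy panel: outcome y, treated = household receives the pension,
# post = observed after the 2007 benefit increase (all values illustrative).
df = pd.DataFrame({
    "y":       [10, 11, 9, 12, 10, 14, 9, 11],
    "treated": [1, 1, 0, 0, 1, 1, 0, 0],
    "post":    [0, 0, 0, 0, 1, 1, 1, 1],
})

# The DD effect is the coefficient on the interaction treated:post.
model = smf.ols("y ~ treated + post + treated:post", data=df).fit()
print(model.params["treated:post"])  # 2.0 in this toy example
```

Here the treated group moves from a mean of 10.5 to 12 while the control group moves from 10.5 to 10, so the DD estimate is (12 − 10.5) − (10 − 10.5) = 2.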
Conclusions and key results:
Our results paint a generally positive picture of the Costa Rican non-contributory pension, considering that the policy was designed to provide an allowance to elders who never contributed to the formal system, allowing them to retire at age 65. However, conditional income transfers sometimes involve unintended consequences that would characterize the policy as defective.
In the case of the DD sample, where the family structures are characterized by households with senior members and households where the recipient is the father or mother of the household head, the results show major spillover effects on the remaining members, especially in terms of labor-related reactions. Indeed, the estimations show that households benefiting from the non-contributory pension significantly reduce the number of individuals in the labor force by 0.179, compared to non-beneficiaries. Individuals in treated households work 1.747 fewer hours than their counterparts and receive a labor income 61.9 USD lower than households that do not receive the pension. Given that the Costa Rican non-contributory pension requires leaving the labor market as a necessary condition for receiving the grant, we might relate the reduction in labor participation to perverse incentives, as the remaining household members might take advantage of the transfer to change their time allocation between work and leisure.
Nonetheless, the results obtained in the RD and diff-in-disc models rule out our preliminary interpretation. Both estimates reveal no significant reactions at the household level for any of the outcomes analyzed, meaning that households do not change their employment-related decisions in the short run, even when the recipient must leave the labor market. In this case, households with senior members predominate over other types of family structures, so we would have expected a significant decrease in labor force participation. Probably this is because unemployment and job instability hit the most vulnerable population groups hardest, so that individuals with uncertain job prospects see in the non-contributory pension an opportunity to receive a steady income. Moreover, we find no evidence either that other young family members have an incentive to move in with the elderly participant, or that the recipient moves out to live on her own.
Brian Albrecht ’14 (Economics of Public Policy) offers both a normative and a positive view
Brian Albrecht is a PhD candidate at the University of Minnesota and a graduate of the Barcelona GSE Master’s Program in Economics of Public Policy, as well as a past editor of the Barcelona GSE Voice. He is also a contributor to the Sound Money Project, a blog from the American Institute for Economic Research (AIER).
In two recent articles, he talks about money as a social contract, both from a normative and a positive perspective:
“Both monetary theory and social contract theory consider a hypothetical situation (a model) in which people in a society come together and collectively agree on some social institution. I have argued that both social contract theorists and monetary theorists use these hypotheticals to draw normative conclusions about what types of institutions are preferable. However, part of monetary theory is also concerned with the positive (i.e., not normative) question “Where does money come from?” In a similar way, part of social contract theory is concerned with the positive question “Where does the state come from?”
Read both of Brian’s articles over on the AIER website:
Our paper analyzes the impact of a cash transfer program targeting households in extreme poverty in Uruguay, called the Tarjeta Uruguay Social (henceforth referred to as TUS). In the past decades, cash transfers have become one of the main social assistance policies used to address poverty and inequality in developing countries. Their objective is to reduce vulnerability by increasing and smoothing household income, although additional objectives are usually defined depending on the program and country, such as increasing access to health and education, and reducing food insecurity (DFID 2011; Honorati et al. 2015).
The impact of these programs on different life outcomes has been widely studied. Overall, positive impacts on poverty, food insecurity, child school enrollment, labor outcomes, health and social cohesion have been found (DFID 2011; ODI 2016). Nevertheless, more research is still needed to understand the channels and particular aspects that determine their success, since countries differ widely in the details of program design. In our research, by taking advantage of considerable design modifications since the implementation of TUS, we evaluate the impact of the amount of the transfer and the benefit duration on relevant outcomes.
The Tarjeta Uruguay Social (TUS) is a conditional cash transfer program implemented in 2009 that aims to assist those in situations of extreme poverty in Uruguay. It targets the 60,000 worst-off households by providing them with a monthly cash transfer on a prepaid magnetic card. The card can be used to purchase food items, cleaning supplies, and hygiene products, excluding cigarettes and alcohol. Eligibility is based on the Critical Needs Index (CNI), a proxy means test that evaluates household poverty using variables associated with education, dwelling, access to durable goods, and household composition. The program has undergone many modifications since its inception, including increases in the number of participants, changes to the eligibility criteria, and a doubling of the benefit for half of the recipients. Our analysis begins in 2013, by which point the program had 60,000 participants and the poorest 30,000 according to the CNI had received a doubling of their benefit, creating two benefit categories: Simple TUS and Double TUS. In our research, we exploit the doubling of the benefit based on the CNI, using a fuzzy regression discontinuity design to evaluate the impact of the amount of the benefit on life outcomes.
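The intuition behind the fuzzy design can be sketched as a local Wald estimator: the jump in the outcome at the CNI cutoff, scaled by the jump in the probability of receiving the doubled benefit. Everything below, from the cutoff value to the effect size and take-up rates, is illustrative rather than taken from the paper:

```python
import numpy as np

def fuzzy_rd_wald(running, treated, outcome, cutoff, bandwidth):
    """Local Wald estimate: outcome jump at the cutoff / take-up jump."""
    left  = (running >= cutoff - bandwidth) & (running < cutoff)
    right = (running >= cutoff) & (running <= cutoff + bandwidth)
    jump_y = outcome[right].mean() - outcome[left].mean()
    jump_d = treated[right].mean() - treated[left].mean()
    return jump_y / jump_d

rng = np.random.default_rng(0)
cni = rng.uniform(0, 100, 5000)          # hypothetical running variable
# Take-up of the doubled benefit jumps from 10% to 80% at a cutoff of 50;
# the true effect of receiving it is set to 2 in this simulation.
take_up = rng.random(5000) < np.where(cni >= 50, 0.8, 0.1)
y = 2.0 * take_up + rng.normal(0, 1, 5000)

est = fuzzy_rd_wald(cni, take_up, y, cutoff=50, bandwidth=10)
print(est)  # close to the true effect of 2
```

Scaling by the take-up jump is what distinguishes the fuzzy design from a sharp one, where crossing the cutoff would determine treatment exactly.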
The availability of an extensive set of administrative data allowed us to evaluate the impact of the doubling on an array of outcomes. There are many different channels through which this cash transfer program could have positive effects, since the resources freed up by the relaxation of the household budget constraint could be used differently according to household preferences. Therefore, by taking advantage of a rich set of administrative data, we analyzed 65 outcomes: housing and living conditions, food insecurity, formal labor market work, education enrollment of children and adolescents, prenatal and birth health conditions, and family composition. Additionally, we analyze how the duration of the benefit affects the impact of the program by comparing the effects for beneficiaries who receive the transfer for different time periods. We analyze short-term outcomes for those who receive the transfer for less than a year; medium-term outcomes for those who receive the transfer for two to three years; and long-term outcomes for those who receive the transfer consistently for three years.
Our results show that an increase in the amount of a cash transfer can indeed have important impacts on recipients’ life outcomes. Positive effects were found for living conditions, with increased investment in durable goods and improved housing, such as purchasing water heaters or washing machines, adding a bathroom to the home, or upgrading from a trash roof to a concrete one. Results also show positive impacts on individual outcomes, with improvements observed in prenatal care and months of formal work. Nevertheless, some negative results were found in the short term, which could potentially be explained by beneficiaries attempting manipulation in order to ensure continued benefit provision under uncertainty. Results also show that the duration of the benefit has a considerable impact on how the transfer is spent. More significant positive household-level results are found in the medium term, while individual-level results become stronger in the long term. The growing effects of more persistent benefits could potentially be explained by short-term uncertainty about whether the benefit will continue to be provided, which decreases over time.
This study contributes to the literature of poverty alleviation policies by providing evidence which can be used to improve the design of cash transfer programs. The positive effects found in this paper from comparing different amounts of the transfer within the same program indicate that the monetary amount of the benefit is a relevant policy parameter with consequences for the effectiveness of the program. Additionally, the results for heterogeneous effects by benefit duration indicate that the persistence of the transfer is another relevant aspect of program design. The evidence provided in this paper indicates that a predefined duration upon entering the program together with a minimum duration of one year could constitute a good practice. This may mitigate negative effects regarding household manipulation attempts and potentiate positive effects by reducing income volatility and increasing housing investments. Our results suggest that further research on benefit size and timing is imperative for policy design of cash transfers, one of the main tools to reach universal social protection.
George Bangham (Economics of Public Policy ’17) is an economic researcher at the Resolution Foundation, a London-based think-tank that carries out research and policy analysis to improve the living standards of people in the UK on low and middle incomes. In recent years the Foundation has been influential in advocating for a living wage and for policymakers to consider the intergenerational impact of public policy. George’s own work focuses on labour markets and social security policy, with his recent publications covering issues from working hours to tax reform.
One of his recent papers, “The new wealth of our nation: the case for a citizen’s inheritance,” has received international attention in the media and was featured in an article in La Vanguardia newspaper this May.
The Intergenerational Commission has identified two major trends affecting young adults today, besides the weak performance of their incomes and earnings, that barely featured in political debate for much of the 20th century. The first is that risk is being transferred from firms and government to families and individuals, in their jobs, their pensions and the houses they live in. The second is that assets are growing in importance as a determinant of people’s living standards, and asset ownership is becoming concentrated within older generations – on average, only those born before 1960 have benefited from Britain’s wealth boom to the extent that they have been able to improve on the asset accumulation of their predecessors. Both trends risk weakening the social contract between the generations that the state has a duty to uphold, as well as undermining the notion that individuals have a fair opportunity to acquire wealth by their own efforts during their working lives.
This paper, the 22nd report for the Intergenerational Commission, makes the case for the UK to adopt a citizen’s inheritance – a universal sum of money made available to every young person when they reach the age of 25 to address some of the key risks they face – as a central component of a policy programme to renew the intergenerational contract that underpins society.
Policy recommendations from the report:
From 2030, citizen’s inheritances of £10,000 should be available from the age of 25 to all British nationals or people born in Britain as restricted-use cash grants, at a cost of £7 billion per year.
To reflect the experiences of those who entered the labour market during and since the financial crisis, and to minimise cliff edges between recipients and non-recipients, the introduction of citizen’s inheritances should be phased in, starting with 34 and 35 year olds receiving £1,000 in 2020. Each subsequent year, citizen’s inheritance amounts should then rise and be paid to younger groups, until the policy reaches a steady-state in 2030 when it is paid to 25 year olds only from then on.
The citizen’s inheritance should have four permitted uses: funding education and training or paying off tuition fee debt; deposits for rental or home purchase; investment in pensions; and start-up costs for new businesses that are also being supported through recognised entrepreneurship schemes.
The citizen’s inheritance should be funded principally by the new lifetime receipts tax, with additional revenues from terminating existing matched savings schemes – the Help to Buy and Lifetime ISAs.
We want to know what the BGSE community is thinking and reading about Brexit.
We invite all Barcelona GSE students and alumni to share their early reflections on the potential economic consequences of the UK’s recent vote to leave the EU. Did you focus on a related topic in your master project? Are you working at a think tank, central bank, or consulting firm where your projects will be impacted by this decision? Have you seen any articles or links that you found useful for understanding what lies ahead?
Here are a couple of pieces we’ve found to get the discussion going:
The BGSE participates in A Dynamic Economic and Monetary Union (ADEMU), a project of the EU Horizon 2020 Program. Last week, ADEMU researchers held a webinar to discuss Brexit.
Europe has grown out of its crises when reason and solidarity have prevailed, but it has also been devastated by its crises when fear and nationalism have taken the lead. Brexit, in the aftermath of the euro crisis, brings this dichotomy back to the foreground. Since 2010 there have been important advances in the development of the Economic and Monetary Union (EMU) and flexible forms of participation have allowed other EU countries, reluctant to join the euro, to share the basic principles that define the EU and have a common presence in the interdependent global world.
According to the panelists, Brexit raises three crucial questions:
Should the EMU be accelerated to become a centre of gravity within the EU, or slowed down to avoid a centrifugal diaspora? If accelerated, how?
Should an ‘exit’ country be allowed free entry to the single market and other EU public goods without accepting freedom of movement?
Should the EU remain as it is, or increase its capacity to offer common public services (Banking Union, border security, research funding, environment, etc.), or limit its scope of activity to the EU single and integrated market?
– Joaquín Almunia (Former Vice-President of the European Commission, honorary president of the Barcelona GSE)
– Ramon Marimon (European University Institute and UPF – Barcelona GSE; ADEMU)
– Giorgio Monti (European University Institute; ADEMU)
– Morten Ravn (University College London; ADEMU)
– Annika Zorn (European University Institute; Florence School of Banking & Finance)
Nobel Laureate and Barcelona GSE Scientific Council member Joseph Stiglitz shares some reflections in the wake of the Brexit decision
What are your thoughts on Brexit?
Please share your ideas, favorite sources for analysis, or observations from economists you respect in the comments below.
With over 700,000 users, data from the app aquienvoto.org suggests how VAAs could represent a whole new way of surveying the general public before an election and collecting data on the political position of the population.
The creator of the app is BGSE alum Hugo Ferradáns ’15, a graduate of the Economics of Public Policy Program. Follow him on Twitter @Hferradans.
The rise of the internet era opened the door to innovative ways of helping voters inform themselves about their political choices before casting their ballot. During the 2015 Spanish General Election, new tools such as aquienvoto.org (whodoivote.org in English), an app that matches users’ policy preferences with parties’ proposed policies, became an easy and straightforward way for users to explore their political position and compare it to that of the biggest parties. Its success, with over 800,000 users and more than 30 million responses, shows how technology and the social sciences can work together to create a more informed and accountable electorate, especially in a multiparty political system such as Spain’s.
But encouraging a more informed electorate is not the only benefit of Voting Advice Applications. In fact, the large amount of data generated by online applications such as aquienvoto.org can serve as a source of analysis and study of why people make their choices1, as well as a way to estimate, in real time, what users care most about before an election. This article, then, will try to shed light on the usefulness of Voting Advice Applications for gathering data on the political positioning of users. I will show some of the results obtained from aquienvoto.org, both on the policy preferences of users and on their most politically aligned parties.
But first things first: what exactly is aquienvoto.org?
Aquienvoto.org is what political economy researchers call a “Voting Advice Application” (VAA). VAAs are essentially online tests that match users to parties based on their responses to policy-related statements. Users can agree or disagree with each statement, and can also indicate whether that specific policy is important to them. After replying to several questions, the VAA gives the user a summary of which parties they agree and disagree with most, usually in the form of a ranking or a political map.
Even though some VAAs are more sophisticated than others2, all VAAs acquire essentially the same data:
the position of the user on a specific question (on a scale from completely agree to completely disagree with the statement in question),
whether the user gives importance to that question, and
after answering all questions, the ranking of the user’s most preferred parties.
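The matching step these three pieces of data feed into can be sketched roughly as follows. The scoring rule, party names, and positions below are hypothetical, since the article does not spell out aquienvoto.org’s actual algorithm; answers are coded from -2 (completely disagree) to +2 (completely agree), and statements the user marks as important count double:

```python
# Hypothetical party positions on four statements, coded -2..+2.
parties = {
    "Party A": [2, -1, 1, 0],
    "Party B": [-2, 2, -1, 1],
    "Party C": [1, 1, 2, -2],
}

def rank_parties(user_answers, importance, parties):
    """Return party names ranked from closest to furthest from the user."""
    scores = {}
    for name, positions in parties.items():
        # Weighted absolute distance; smaller means a closer match.
        scores[name] = sum(
            (2 if imp else 1) * abs(u - p)
            for u, p, imp in zip(user_answers, positions, importance)
        )
    return sorted(scores, key=scores.get)

# A user who marks only the first statement as important:
ranking = rank_parties([2, -2, 1, 0], [True, False, False, False], parties)
print(ranking)  # ['Party A', 'Party C', 'Party B']
```

Real VAAs differ mainly in the distance metric and weighting scheme, but the ranking they return to the user is built from exactly these three inputs.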
Aquienvoto.org was able to gather information on 756,908 people, after dropping all users that did not complete at least level 1 (that is, replied to 31 questions).
What advice did users get from aquienvoto.org?
If we look at which party was most often ranked first among users, we see that the centre-right Ciudadanos was the most preferred party throughout the whole period, for roughly 33% of users. Interestingly, however, the overall share of users whose first choice leaned left (Podemos, PSOE, United Left and Nós, representing 62.8% of first choices) is much higher than the share aligned with liberal and conservative policies (Ciudadanos, PP, PNV and DiL, at 37.2% of users’ first choices), indicating that users of aquienvoto.org are consistently left-wing.
The contrast between these results and those of the General Election is particularly striking. For example, the conservative Partido Popular, which came first in the elections with roughly 25% of votes, was ranked last on aquienvoto.org for almost the whole period. This surely reflects the fact that VAA users are consistently younger and more left-wing than the average citizen, but it also poses a question that would be interesting to explore: do people vote in line with their policy preferences, or are other factors influencing voters’ decisions in electoral politics?
How do people position themselves on certain issues, and which do they think are most important?
Unsurprisingly, users gave the most importance to topics related to corruption: 10.67% of respondents (that is, 80,410 individuals) gave importance to the question “Politicians accused of corruption should resign and be barred from running for office,” and almost 93% of them agreed or completely agreed with the statement.
The second and third most-important questions concern the presence of religion in the political sphere and in the education curriculum, respectively; both show a strong rejection of religion. Social policy is also an area of great importance to individuals, surely closely related to Spain’s current economic woes. Indeed, the Spanish law on mass evictions over the past years3 takes fourth place among the questions given most importance (8.06% of total responses), followed by a statement on the education budget (7.46%), where most people agree that increasing the budget should be a top priority within government policy. These results are roughly constant over time, although the number of users giving importance to questions declined (graph 2).
As for the most controversial topics (those with large numbers of people both agreeing and disagreeing with the statement), we find the prohibition of bullfighting, the abolition of escuelas concertadas4 and the law on underage abortion5, all of which also have a rather high rate of importance responses.
As for what users are not interested in, that is, the questions given the least importance, the four least important topics (starting from the least important) are the deficit and the government spending ceiling, the legalization of prostitution, the regulation of the financial sector, and the financing of the Autonomous Communities (the regions of Spain).
What is the political position of the average user?
To give users the most interactive experience when analyzing their results, we created a map of their political position using eight different axes, as the Swiss VAA smartvote6 does. Using an algorithm, each response a user gives contributes to building his or her “political map,” which can then be compared to the political maps of the parties. Thus, using the responses from each user, we computed the political map of the average user, shown in the image below.
As can be seen, the average user is very much in favor of strong democratic institutions that condemn corruption at all levels, showing a rather high value on the axis related to democratic regeneration. The average user also scores high on the welfare state and liberal society axes, and quite low on the questions supporting a liberal economy and restrictive fiscal policy, in line with the results above showing that users are more prone to identify with left-wing policies.
The average user also rejects all statements related to regional nationalism and favors those regarding state centralization. This changes, however, when comparing average users across regions, as people from Autonomous Communities such as Catalonia and the Basque Country strongly reject state centralization and favor regional nationalist policies.
What is left to be done for VAAs like aquienvoto.org?
Although VAAs can give academics a rich database, a number of methodological challenges need to be overcome7, mainly regarding the representativeness of the sample. Indeed, if we want to make inferences about the positioning of the whole Spanish population, it is crucial to acquire good-quality data on the characteristics of users, something that has proven difficult for online surveys. At aquienvoto.org, we are working to improve the data collection process by giving users the option to sign into an account where they can store their information and reply to surveys at any time. Nevertheless, we believe that universities and governments should pay more attention to these tools, so that institutions and VAA organizations can work together to make VAAs better both for users and for academia. Hopefully, that is what will happen in the years to come.
Evan Seyfried ’16 (Economics of Public Policy) summarizes the lecture by Princeton’s Atif Mian.
Evan Seyfried ’16 shares the following summary of a talk given by Princeton’s Atif Mian this May to the UPF Department of Economics and Business. Follow Evan on Twitter @evanseyf
In 2006, house prices in the U.S. reached their all-time peak. The S&P/Case-Shiller Housing Price Index had doubled in just eight years (not accounting for inflation).1 The year before, Robert Shiller (whose work on historical housing prices led to the creation of the Case-Shiller Index) had published an update to his book Irrational Exuberance warning that recent growth in housing prices was historically unprecedented—he argued that houses were wildly overpriced and would likely revert back to a relatively constant historical value in the long run.2 His research showed that if you looked at real prices (inflation-adjusted) in the U.S. housing market prior to the early 2000s bubble, you would find that prices have not changed much since 1890!
The frenzy of the early 2000s finally caught up with lenders, homeowners, and investors, who began to doubt the continued rise of house prices. In late 2005, with interest rates rising, a growing number of homeowners with Adjustable-Rate Mortgages (ARM) began to default on their mortgages. Finally, by the end of 2006 the housing bubble began to collapse under its own weight, and the shockwaves ripped through the financial sector—which had bet heavily on the U.S. housing market through mortgage-backed securities and newer exotic financial instruments. French bank BNP Paribas, on August 7, 2007, famously suspended withdrawals from its investment funds associated with subprime mortgages, a move that triggered a shadow banking run, and is often considered the official start of the financial crisis—when the housing market instability truly began to upend the financial sector. What followed was the most severe financial crisis since the Great Depression and a long recession for the rest of the U.S. economy.
But there is still much to be learned about the interaction of the housing bust (leaving many homeowners with very high debt compared to their assets), the crisis in the financial sector (wherein banks have been generally unwilling to either extend new credit or restructure existing loans), and the continuing economic malaise in the U.S. and other economies around the world.
From the housing bubble to household debt
A great deal of Princeton economist Atif Mian’s research—much of it in collaboration with University of Chicago economist Amir Sufi—has studied these interactions, exploring the fallout from the housing bubble in the U.S. and the subsequent “debt overhang.”
What is household debt overhang?
Imagine a family owes $200,000 on their mortgage. If the market crashes and the house value suddenly declines to $180,000, then the family now owes $20,000 more than the value of their house. Thus, even if the family chooses to sell the house, they will not be able to pay back the mortgage in full. This is also called being “underwater” on a mortgage. In the context of all household finances, debt overhang is a similar concept to being underwater, and refers to the amount of indebtedness of a family beyond the value of their assets, taking into account their anticipated income. Debt overhang makes a household unattractive to lenders (both for new loans and for refinancing old loans), because they do not have any collateral that is not already used to cover existing debt.
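The arithmetic of being underwater can be sketched in a few lines. This is a minimal illustration using the hypothetical dollar figures from the example above:

```python
# Illustrative figures from the hypothetical example above
mortgage_balance = 200_000   # amount still owed on the mortgage
home_value = 180_000         # market value of the house after the crash

# Home equity: what the household would keep after selling and repaying the loan
equity = home_value - mortgage_balance

if equity < 0:
    # Negative equity means the sale price cannot cover the debt:
    # the household is "underwater" by the shortfall amount.
    print(f"Underwater by ${-equity:,}")
```

Running this prints a $20,000 shortfall: selling the house would still leave the family $20,000 in debt, which is precisely why such a household has no free collateral to offer a lender.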
Note that household debt is treated separately from other private sector debt (mainly non-financial firm debt), and shows notably different dynamics. All of Atif Mian’s research mentioned here focuses specifically on household debt.
In 2013, Mian published evidence that poorer families who were highly leveraged in the housing market reacted very sharply to the loss of wealth when their homes depreciated following the housing bust. Because their marginal propensity to consume out of housing wealth (how much families spend knowing that they have a certain amount of wealth in their house to fall back on) is higher than for middle- or upper-income families, their consumption dropped disproportionately in the years after the bubble.4 Of course, at the individual level this behavior is rational, but at the national level low consumption growth in a demand-constrained economy has created a negative feedback loop of lower job growth, lower income growth, and a further drop in consumption growth.
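As a back-of-the-envelope illustration of why this matters in aggregate, consider two households facing the same housing wealth loss but with different marginal propensities to consume. The MPC values below are invented for illustration, not taken from Mian's estimates:

```python
# Hypothetical MPCs out of housing wealth (illustrative numbers only,
# not estimates from Mian and Sufi's research)
mpc_out_of_housing_wealth = {
    "lower-income, highly leveraged": 0.08,  # spends 8 cents per $1 of housing wealth
    "higher-income": 0.02,                   # spends 2 cents per $1 of housing wealth
}
wealth_loss = 50_000  # hypothetical fall in home value for both households

for household, mpc in mpc_out_of_housing_wealth.items():
    consumption_cut = mpc * wealth_loss
    print(f"{household}: consumption falls by ${consumption_cut:,.0f}")
```

With these made-up numbers, the same $50,000 wealth loss cuts the lower-income household's spending four times as much, which is the mechanism behind the disproportionate post-bubble consumption drop described above.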
One of the takeaways from this body of research is that governments and international finance organizations need to do a better job of properly accounting for how private sector debt affects consumption. Optimistic forecasts for recovery from the 2008-2010 Great Recession did not sufficiently account for depressed demand as homeowners and those with credit card and student debt eschewed consumption to deleverage themselves. In a comment on Karen Dynan’s research on household debt overhang and consumption, Mian wrote: “… macroeconomic policy in a world where consumption is driven by debt overhang needs to be seen through its implications for the net worth of the borrowing households.”5
But Mian also wanted to take these insights from the Great Recession and ask more fundamental questions about private debt and predictions of economic growth: Was consumption similarly affected during other periods of high household debt? Do we see similar household debt effects in other countries? If so, how does this extra drag on consumption affect how economists forecast economic growth?
Mian recently gave a lecture at the Universitat Pompeu Fabra in Barcelona, presenting the findings from his attempt to answer those questions. (The working paper, coauthored with Amir Sufi and Emil Verner, is available from the National Bureau of Economic Research.6) They took a sample of 30 countries (mostly advanced economies) and compiled private debt data back to 1960. Then they identified shocks to household credit and looked at the relationship between those shocks and subsequent GDP growth. (In this context, shocks should be thought of as sudden increases in the availability of credit.)
Initially they found that high growth in household credit was predictive of subsequent low GDP growth. But they needed to identify the nature of those credit shocks to find possible causal channels. According to Mian they wanted to “rule out demand-driven shocks.” Demand-driven shocks come from the consumer side and could be an increase in the use of credit to smooth lifetime consumption, or as an “insurance effect” to get liquidity today due to uncertainty or an expectation of economic shocks tomorrow. On the other hand, a supply-driven credit shock would be banks extending more and more credit due to government policy changes or financial innovation.
The first demand-driven possibility is relatively simple to rule out. The Permanent Income Hypothesis suggests that households borrow today in the expectation of higher future income, so rising household debt should predict stronger economic growth. As mentioned before, Mian, Sufi, and Verner find exactly the opposite relationship. The second demand-driven possibility is unlikely because much of the growth in household debt across the countries in the sample is in mortgage debt, which is generally not taken on to provide liquidity.
Next, they looked into the supply-driven credit shock mechanism and tried to find a way to overcome the presumably endogenous relationship between credit supply shocks and subsequent lower GDP growth. The mechanism must explain why people borrow in the first place, especially what causes them to over-borrow (what Mian calls an “aggregate demand externality”—an effect that spills over to other borrowers), and explain why excessive borrowing actually leads to a decline in real output (what Mian calls “macro frictions” that generate the slowdown, such as monetary policy and “wage rigidity”). As the authors write in the paper: “The key ingredient in this model is an aggregate demand externality that is not properly internalized by borrowing households at the time they make their borrowing decision.”
Two problems remained. First, the authors had to come up with a measure of “credit supply shocks” that could apply to dozens of different countries. Second, they had to choose a measure that could help identify the causal relationship, not just the correlation. Their solution was to use one measure for the U.S. (share of debt issuance by risky firms) and a simpler one for non-U.S. economies (the spread of sovereign debt yields compared to equivalent U.S. Treasury notes). According to Mian, these are “not instruments in the usual sense of the word” (which must satisfy the requirements of independence from the outcome variable and relevance to the explanatory variable). Rather, they are “imperfect instruments” (see Nevo and Rosen, 2012,7 for more information) and, per Mian, “as long as we can sign the covariance of the instrument, we can partially identify the range in which the coefficient lies.” In other words, because these proxies for credit supply shocks typically signify expectations of good times, then if we see that they actually predict bad times, we can at least identify a range of values for how strong the link is between an increase in household debt and subsequent low growth.
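The partial-identification logic can be shown with a toy simulation. This is my own stylized sketch, not the paper's specification, and every number in it is invented: if an “imperfect instrument” z is known to be positively correlated with the error term, the simple IV estimate is biased upward by a known sign, so it bounds the true coefficient from above rather than pinning it down exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Toy model (illustrative only): y = beta * x + u, with an "imperfect
# instrument" z that is positively correlated with the error u,
# violating the usual exogeneity requirement.
beta = 2.0
u = rng.normal(size=n)
z = 0.8 * rng.normal(size=n) + 0.3 * u   # cov(z, u) > 0 by construction
x = z + rng.normal(size=n)               # z is relevant for x
y = beta * x + u

# Simple IV (Wald) estimate: cov(z, y) / cov(z, x)
beta_iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]

# Algebraically, beta_iv = beta + cov(z, u) / cov(z, x). Since both
# covariances are positive here, beta_iv overstates beta, so the true
# coefficient lies BELOW the IV estimate: a one-sided bound, i.e.
# partial identification.
print(f"true beta = {beta:.2f}, IV estimate = {beta_iv:.2f} (an upper bound)")
```

Knowing only the sign of cov(z, u) is enough to say on which side of the estimate the true coefficient lies, which is the sense in which Mian, Sufi, and Verner can bracket the effect of credit supply shocks without a perfectly exogenous instrument.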
The methodology is admittedly complex, and audience members had some reservations about how the authors had dealt with household debt (particularly since household debt is mostly mortgage debt). One audience member suggested that housing bubbles could be the main driver of subsequent low growth, with the extension of credit simply a side effect. Mian acknowledged that he cannot outright reject this concern, but added that the results are robust to controls for house prices, so the bubbles should be controlled for. Another audience member suggested that this could be tested if the data set included any countries that had seen a credit boom with no attendant housing bubble. There are, in fact, some such countries in the data set, but, as Mian stressed, the subsample was not large enough for a strong statistical test of this hypothesis.
Onward to global growth!
After presenting the “within country” results—showing that household credit supply shocks tended to lead to lower growth in the five or so years following—Mian pivoted to the global portion of the paper. The goal here was to establish the spillover effects of these credit supply shocks among different countries. Sure enough, Mian stated that “the global cycle is more destructive” due to financial spillovers between countries. Because the growth slowdown in a given country after the credit shock leads to a reduction in imports, the problem is transferred to that country’s trading partners. Furthermore, the effects are exacerbated by “macro frictions,” especially in countries that employ fixed exchange rate regimes, borrow primarily in foreign currency, and are near the zero interest rate lower bound (although recently the zero interest rate bound has been proving not to be much of a hard bound after all). Figure 2 shows these global aggregate effects.
Mian stressed that these dynamics between debt and growth, especially the global ones, should be seen as relatively recent (“last-forty-years effects”) side effects of globalization and the financialization of household debt. He concluded that governments must respond to these powerful forces with targeted macroprudential policies, and forecasters at organizations like the IMF and OECD must learn to better account for household debt in their growth projections.
In the past few weeks, there has been a barrage of media reports about educational achievement and, more generally, life outcomes for the youth of Durham.
The positive news is that these issues are receiving attention, but the downside is that the reports may be more harmful than helpful. At its best, data optimizes decision-making, but at its worst it can be deceptive and divisive.
Specialized knowledge is required to leverage data for decision-making, whereas selectively reporting figures requires some effort but no expertise. In the latter scenario, the ambiguity of statistical assumptions predisposes the audience to personal as well as framing bias. Those who go to the effort of producing data often have an agenda, and therefore have incentives to make claims that imply causes and solutions. Data is dangerous when misused. It can create tension, undermine trust and unity, and result in costly adverse decision-making.
One key characteristic of amateur statistics, aside from lacking an experimental design, is that they do not account for the fact that outcomes are a function of many different variables. For example, schools clearly play a crucial role in influencing academic attainment, but a report drawing relative comparisons between attainment outcomes within or across cities usually implicates unidentified failures of one school district versus another while all but ignoring the effects of transportation, affordable housing, food, healthcare, and social support accessibility, as well as people’s different lived experiences, including traumatic exposure of various kinds.
Reactivity to outcomes is strongly linked to bias and emotion. Making decisions about problems and solutions based exclusively on outcomes is the logical equivalent of going with your gut.
Descriptive statistics alone have a tendency to reinforce what we already think we know rather than helping us to gain an objective understanding of the issues because we often overestimate our understanding of the context. Shards of truth may be buried in our presumptions or between the different storylines, but other times the truth isn’t within sight.
If one wanted to know what public schools are doing right and what positive changes could be made, the reported outcomes would not meaningfully increase understanding. This would be like a college basketball coach using the Ratings Percentage Index (RPI) to make game plans. The RPI is simply a function of outcome variables, which are themselves driven by factors with far more influence over a team’s success, such as shot selection, rebounding, ball control and many others.
Similarly, objective inference about the determinants of academic achievement is impossible when we simply have some measure of the output, like grade-level proficiency, graduation rates or achievement gaps. Summarized outcomes do not even begin to untangle the multifaceted causal factors of student achievement, or even point to which factors are within the schools’ control and which are shaped by other institutions that govern infrastructure, real estate development, credit markets and criminal justice.
Good intentions often lead to unintended consequences. Calculating outcomes or deriving slightly new definitions of them does not enhance the cultural or intellectual competence of our community, its citizens or the institutions within it.
This is troubling because the extent of harm done with every report that subjectively frames and selectively reports data will never be known. A symptomatic obsession can enable data to have a negative social impact, leading to the proliferation of economic and racial segregation, adverse selection of people and funds from public schools, victim blaming and the marginalization of objectivity. The focus needs to shift from symptoms to solutions.
Data should be collected and analyzed in a way that enables us to separately identify effects on outcomes, including those determinants within the schools’ control and those outside, so that all can be addressed in order of impact and feasibility. Robust evaluations should yield insight, pointing out specific causal factors affecting outcomes that schools, nonprofits, policymakers and citizens can address.
Applying a scientific lens to social issues transforms data from punitive to instructive. Careful investigation using valid quantitative methods can help us gain an understanding of the inferences that the data will and will not permit. Through empirical analysis, we have the opportunity to disentangle the effects that different factors have on certain outcomes. This is powerful because it enables us to create informed strategies.
Subsequently, when we know how our potential actions will affect an outcome, a cost-benefit analysis can help decide which evidence should be brought to action. Operating in the public and nonprofit sectors, the cost-benefit analysis goes beyond fiscal considerations to examine social returns. Combining these empirical tools puts us in a position to optimize social welfare. Data or analysis vacant of these characteristics will result in suboptimal decision-making.
An empirical basis for decision-making that respects the complexity of determinants on outcomes and the tradeoffs between various actions or lack of action should be utilized at all levels – from the systemic to the programmatic. A symptomatic focus and a preoccupation with a single area will not result in systemic improvement. As institutions, organizations and programs, our goal should be to improve, which can only be achieved through learning.
Durham has great potential to grow while enhancing the well-being of all, including the most marginalized. Continuous improvement requires the commitment of people in the public, private, and social sectors to work together.
Part of analytical integrity is the acknowledgement that sometimes our data tells us nothing at all. If we truly care about addressing systemic issues, lack of information is a strong argument for why we should build more robust datasets that incorporate variables across institutions and the socio-economic environment. This requires a willingness to coordinate and to learn. Importantly, these actions imply the willingness to change.
The Made in Durham partnership exists to address issues of the highest importance. The job of data is to increase the role of evidence in the partnership’s decision-making, and because of the gravity of these decisions, I also feel an ethical accountability to this work.
If we aren’t asking the right questions, data can lead to costly decisions that undermine improvement. As members of the community, we should all be able to ask the right questions to hold decision-makers accountable to analytical standards that drive improvement.
Regardless of what the outcomes show now, or at any time in the future, what we should be asking is: what are the causes of these outcomes, what are their magnitudes, and thus, what can we do to improve?