Mini-Bot: Ingenuity or Ignorance

Alessandro Franconi ’17 (Macroeconomic Policy and Financial Markets)

“Mini-Bot: Ingenuity or Ignorance” is my first policy brief for the Luiss School of European Political Economy.

“The concept of using mini-BOTs to pay off trade payables may seem like a good idea, but if we analyze it in detail we can intuitively conclude that such a tool is futile and limited… It is clear that the mini-BOTs are a completely sterile, if not harmful, device for public finances, as their implementation (or just the information that the government is officially studying their implementation) would once again put Italy on the road to leaving the euro.”

How do firms adjust to rises in the minimum wage? Survey evidence from Central and Eastern Europe

Publication by Nataša T. Jemec ’09 (Economics) and Ludmila Fadejeva ’11 (Macro)

Nataša Todorović Jemec ’09 (Economics) and Ludmila Fadejeva ’11 (Macroeconomic Policy and Financial Markets) have published a paper in the IZA Journal of Labor Policy, together with colleagues from the central banks of other new EU member states. The paper, “How do firms adjust to rises in the minimum wage? Survey evidence from Central and Eastern Europe,” studies the transmission channels for rises in the minimum wage using a unique firm-level dataset from eight Central and Eastern European countries.

They wrote the publication within the ECB Wage Dynamics Network (WDN). At the time, Nataša and Ludmila were working at the Central Bank of Slovenia and the Central Bank of Latvia respectively, and they were their banks’ representatives in the WDN. Increases in the minimum wage were a common topic across many new EU member states, so they decided to write a paper on the subject based on the data they collected through a WDN survey in their countries.

Researchers can use this form to request access to the data of the WDN, which covers many EU countries.

Paper abstract

We study the transmission channels for rises in the minimum wage using a unique firm-level dataset from eight Central and Eastern European countries. Representative samples of firms in each country were asked to evaluate the relevance of a wide range of adjustment channels following specific instances of rises in the minimum wage during the recent post-crisis period. The paper adds to the existing literature by presenting the reactions of firms as a combination of strategies and evaluating the relative importance of those strategies. Our findings suggest that the most popular adjustment channels are cuts in non-labour costs, rises in product prices, and improvements in productivity. Cuts in employment are less popular and occur mostly through reduced hiring rather than direct layoffs. Our study also provides evidence of potential spillover effects that rises in the minimum wage can have on firms without minimum wage workers.

About the authors

Nataša T. Jemec ’09 is a Senior Economist at IMAD. She is an alum of the Barcelona GSE Master’s in Economics.

Ludmila Fadejeva ’11 is a Senior Econometrician at the Bank of Latvia. She is an alum of the Barcelona GSE Master’s in Macroeconomic Policy and Financial Markets.

Forecasting Currency Crises

Macroeconomic master project by Ivana Ganeva and Rana Mohie ’19

Editor’s note: This post is part of a series showcasing Barcelona School of Economics master projects. The project is a required component of all BSE Master’s programs.

Introduction

The question of whether a currency crisis can be predicted beforehand has been discussed in the literature for decades. Economists and econometricians have been trying to develop prediction models that can work as an Early Warning System (EWS) for a currency crisis. The significance of such systems is that they provide policy makers with a valuable tool that helps them tackle economic issues or speculative pressure and take decisions that prevent these from turning into a crisis. This topic is especially relevant to emerging market economies, since their exchange rates fluctuate more, which translates into a higher risk of currency crisis.

In this paper, we propose an Early Warning System for predicting currency crises that is based on an Artificial Neural Network (ANN) algorithm. The performance of this EWS is then evaluated both in-sample and out-of-sample using a data set of 17 developed and developing countries over the period 1980-2019. The performance of this neural-network-based EWS is then compared to two other models that are widely used in the literature. The first is the Probit model, which is considered the standard model for predicting currency crises and is based on Berg and Pattillo (1999). The second is a regime-switching prediction model based on the one proposed by Abiad (2006).
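For readers who want a concrete picture of the probit benchmark, the sketch below shows one way such a model could be set up in Python with statsmodels. The file name, explanatory variables, 24-month crisis window, and 25% signalling cut-off are illustrative assumptions, not the authors’ exact specification.

```python
# Hedged sketch of a Berg-Pattillo-style probit EWS (illustrative variable names).
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("ews_panel.csv")  # hypothetical panel: one row per country-month
# Lagged fundamentals as predictors; the crisis dummy marks months followed by a crisis.
X = sm.add_constant(df[["reer_deviation", "reserve_growth", "export_growth"]].shift(1).dropna())
y = df.loc[X.index, "crisis"]      # assumption: 1 if a crisis starts within the next 24 months

probit = sm.Probit(y, X).fit(disp=False)
signals = (probit.predict(X) > 0.25).astype(int)  # signal a crisis when the fitted probability exceeds 25%
print(probit.summary())
```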

Artificial Neural Networks

Artificial Neural Networks (ANNs) are a Machine Learning technique that derives its inspiration from biological nervous systems and the (human) brain structure. With recent advances in computing technology, computer scientists have been able to mimic this brain functionality with artificial algorithms, which has motivated researchers to design algorithms that can solve complex and non-linear problems. As a result, ANNs have become a source of inspiration for a large number of techniques across a vast variety of fields. The main financial areas where ANNs are utilised include credit authorisation and screening, financial and economic forecasting, fixed income investments, and prediction of default, bankruptcy, and credit card fraud (Öztemel, 2003).

Main Contributions

1. Machine Learning Techniques:

(a) Using an Artificial Neural Network predictive model based on the multi-layered feed-forward neural network (MLFN), also known as the “back-propagation network”, which is one of the most widely used architectures in the financial time series neural network literature (Aydin and Savdar, 2015). To the best of our knowledge, this is the first study to use a purely neural network model to forecast currency crises.

(b) Improving the forecast performance of the neural network model by allowing it to be trained on (learn from) the data of other countries in the same cluster, i.e. countries with similar traits and nominal exchange rate depreciation properties. The idea behind this extension is adapted from the “transfer learning” technique used in image recognition applications (a minimal sketch of this clustered training follows the list below).

2. The Data Set: Comparing models across a large data set of 17 countries in 5 continents, and including both developing and developed economies.

3. Crisis Definition: Adding an extra step to the Early Warning System design by grouping the countries into 6 clusters based on their economies’ traits and the behaviour of their nominal exchange rate depreciation fluctuations. This allows for a crisis definition that is based on each group of countries’ own properties – we call it the ’middle-ground’ definition. It also allowed us to test whether the forecasting performance of the neural network improves when the model is trained on the data sets of other countries within the same cluster.

4. Reproducible Research: Downloading and cleaning the data has been automated, so that the results can easily be updated or extended.
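As a rough illustration of contributions 1(b) and 3, the scikit-learn sketch below trains a small feed-forward (back-propagation) network on the pooled pre-2010 data of one hypothetical cluster and evaluates it out-of-sample on a single member country. The file, features, cluster labels, country code, and dates are assumptions, not the authors’ code.

```python
# Hedged sketch: MLFN trained on pooled cluster data, evaluated on one member country.
import pandas as pd
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

panel = pd.read_csv("ews_panel.csv")       # hypothetical country-month panel with a 'cluster_id' label
cluster = panel[panel["cluster_id"] == 3]  # countries with similar exchange-rate depreciation behaviour

features = ["reer_deviation", "reserve_growth", "export_growth", "credit_growth"]
train = cluster[cluster["date"] < "2010-01-01"]                                    # pooled cluster data
test = cluster[(cluster["date"] >= "2010-01-01") & (cluster["country"] == "TUR")]  # one member country

model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
model.fit(train[features], train["crisis"])
crisis_prob = model.predict_proba(test[features])[:, 1]  # out-of-sample crisis probabilities
```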

Conclusions

We compare models based on two main measures. The Good Signals measure captures the percentage of currency crises predicted out of the total crises that actually occurred in the data set. The second measure used for comparing across models is the False Alarms measure: the percentage of false signals that the EWS gives out of the total number of crises it predicts. In other words, this is the percentage of times the EWS predicts a crisis that never happens.
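In code, the two measures reduce to simple helpers (our reading of the definitions above, not the authors’ implementation):

```python
import numpy as np

def good_signals(actual, predicted):
    """Share of actual crises that were signalled by the EWS."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return (actual & predicted).sum() / actual.sum()

def false_alarms(actual, predicted):
    """Share of issued signals that were not followed by a crisis."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return (predicted & ~actual.astype(bool)).sum() / predicted.sum()

# Toy example: 3 of 4 crises are caught, and 2 of the 5 signals are false alarms.
actual = np.array([1, 0, 1, 0, 1, 0, 1, 0])
predicted = np.array([1, 1, 1, 0, 0, 1, 1, 0])
print(good_signals(actual, predicted), false_alarms(actual, predicted))  # 0.75 0.4
```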

The tables presented below show our findings and how the models perform against each other on our data set of 17 countries. We also provide the relevant findings from the literature as a benchmark for our research.

The results in Table 1 show that Berg & Pattillo’s clustering of all countries together generally performs worse than our way of clustering the data. Therefore, we can confirm that the choice of a ’middle-ground’ crisis definition has indeed helped us preserve potentially important country- or cluster-specific traits. In brief, we obtain results comparable to the ones found in the literature when using conventional methods, as highlighted in the table below.

After introducing the ANN model and its extension, we examine their out-of-sample performance and obtain some of the key results of our research.

Summary of the key results

  • The crisis predictability of the proposed Artificial Neural Network model is shown to be comparable to that of the standard currency crisis forecasting model on both measures, Good Signals and False Alarms. However, the modified neural network model trained on the clustered data set shows superior performance to the standard forecasting model.
  • The performance of the Artificial Neural Network model improved tangibly once we introduced our method of clustering the data. That is, data from similar countries included in the network’s training set can indeed be an advantage rather than a distortion. By contrast, using the standard Probit model on the clustered panels resulted in lower performance compared with the respective country-by-country estimates.

Authors: Ivana Ganeva and Rana Mohie

About the BSE Master’s Program in Macroeconomic Policy and Financial Markets

Economics by Barcelona GSE alumni at CaixaBank Research (Vol. 2)

Recent work by alumni at CaixaBank Research

It’s our second roundup of articles by Barcelona GSE Alumni who are now working as research assistants and economists at CaixaBank Research in Barcelona (see Vol. 1).

This roundup includes posts and videos from the second half of 2018 and early 2019, listed in reverse chronological order. Click each author’s name to view all of his or her articles from CaixaBank Research in English, Catalan, and Spanish.

Education as a lever for inclusive growth

Ricard Murillo ’17 (International Trade, Finance, and Development)

The importance of education for people’s well-being throughout all stages of their lives is beyond any doubt. At the economic level, individuals with higher levels of education tend to enjoy higher employment rates and income levels. What is more, all the indicators suggest that in the years to come, the role of education will be even more important. The challenges posed by technological change and globalisation have a profound effect on the educational model.


Social cohesion and inclusive growth: inseparable

Javier Ibáñez de Aldecoa ’18 (Economics)

Faced with the major transformation of the productive system brought about by technological change and globalisation, as well as the challenges posed by an ageing population, it is important to take action to strengthen social cohesion – an indispensable element if we are to carry out reforms that foster an inclusive and sustained form of growth.


The central banks, at the helm of a more volatile environment

Adrià Morron ’12 (Economics) and Ricard Murillo ’17 (ITFD)

The US and the euro area are at different stages of their financial cycles: while the Fed’s monetary policy is close to becoming neutral or even restrictive, the ECB remains in clearly accommodative territory. However, to some extent, both are facing a common risk: the decoupling between their monetary policy and the financial conditions. The two institutions will try to manage their tools carefully, in order to facilitate a gradual adjustment of the financial conditions in the US and, in the case of the euro area, to keep them in accommodative territory.


Regulation more appropriate to the nature of the banking sector

Gerard Arqué ’09 (Macroeconomic Policy and Financial Markets)

Thanks to the implementation of the measures introduced following the financial crisis, today the financial sector is more robust than before. This will help to minimise the impact on the economy and on financial stability in periods of upheaval, since countries with better-capitalised banking systems tend to experience shorter recessions and less contraction in the supply of credit. However, the outstanding tasks we have mentioned should be properly addressed sooner rather than later.

Bonus video! An unconventional monetary policy cycle

Adrià Morron ’12 (Economics)

Central banks are facing the challenge of removing the extraordinary measures imposed during the financial crisis of 2007-2008 and the subsequent economic recession. In normal times, central banks would simply raise interest rates up to the desired level. However, monetary policy is currently in a rather unconventional cycle.


Source: CaixaBank Research

If you’re an alum and you’re also writing about Economics, let us know where we can find your stuff!

Labor Markets, Search Frictions and International Trade: Assessing the China Shock

Master project by Marcos Mac Mullen ’18

Editor’s note: This post is part of a series showcasing Barcelona GSE master projects by students in the Class of 2018. The project is a required component of every master program.


Authors:

Marcos Mac Mullen

Master’s Program:

Macroeconomic Policy and Financial Markets

Paper Abstract:

The goal of this paper is to assess quantitatively the impact that the emergence of China in international markets during the 1990s had on the U.S. economy (i.e. the so-called China Shock). To do so, I build a model with two sectors producing two final goods, each of them using as the only input of production an intermediate good specific to that sector. Final goods are produced in a perfectly competitive environment. The intermediate goods are produced in a frictional environment with labor as the only input. First, I calibrate the closed-economy model to match salient stylized facts for the U.S. in the 1980s. Then, to assess the China Shock, I introduce a new country (China) onto the international scene. I proceed with two calibration strategies: (i) calibrate China such that it matches the variation in the price of imports relative to the price of exports for the U.S. between the average of the 1980s and the average of 2005-2007; (ii) calibrate China such that the variation in allocations is close to the one observed in the data over the same window of time. I find that under calibration (i) the China Shock in the model explains 26.38% of the variation in the share of employment in the manufacturing sector, 16.28% of the variation in the share of manufacturing production and 27.40% of the variation in the share of wages of the manufacturing sector. Finally, under calibration (ii), I find that the change in relative prices needed to match between 80 and 90 percent of the variation in allocations is around 3.47 times the one observed in the data.
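The abstract does not spell out the frictional block, but a standard Diamond-Mortensen-Pissarides-style setup consistent with the description (and with the sector-specific bargaining power mentioned in the conclusions below) could look as follows; the notation is ours and purely illustrative, not necessarily the paper’s exact formulation:

```latex
% Illustrative DMP-style matching block for sector s (our notation, not the paper's)
m(u_s, v_s) = \chi\, u_s^{\eta} v_s^{1-\eta}, \qquad
\theta_s = \frac{v_s}{u_s}, \qquad
q(\theta_s) = \chi\, \theta_s^{-\eta}, \qquad
w_s = \beta_s \,(p_s + \kappa\, \theta_s) + (1-\beta_s)\, b
```

Here m(u_s, v_s) is the number of matches given unemployment u_s and vacancies v_s, θ_s is market tightness, q(θ_s) the vacancy-filling rate, and the wage w_s comes from Nash bargaining with worker bargaining power β_s, intermediate-good price p_s, vacancy posting cost κ, and unemployment benefit b. In such a setup, cross-sector differences in β_s are what would generate the non-traditional source of comparative advantage mentioned in the conclusions.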

Conclusions and key results:

According to the model, the China Shock explains 26.35% of the variation in the share of manufacturing employment, 16.28% of the variation in the share of manufacturing production and 27.44% of the variation in the share of wages of the manufacturing sector. The first of these results is consistent with findings in Autor et al. (2013). On the other hand, the variation in the unemployment rate of the economy is not matched, neither under the first nor under the second calibration of the open economy. I also find that, as a consequence of the China Shock, real wages increase when measured in terms of the price of the import good and decrease when measured in terms of the price of the export good. This result is not in line with findings in Autor et al. (2013). The optimal unemployment insurance in the open economy is 6.13% of average wages higher than in the closed economy, because the unemployment rate of the open economy is higher than in the closed economy (a 0.9% difference). Finally, the model generates a non-traditional source of comparative advantage, arising from differences in the relative bargaining power of workers.

Download the full paper [pdf]


More about the Macro Program at the Barcelona Graduate School of Economics

The dynamic relationship between long term interest rates and fiscal stances in the EMU

Editor’s note: This post is part of a series showcasing Barcelona GSE master projects by students in the Class of 2018. The project is a required component of every master program.


Authors:

Sybrand Brekelmans, Guillermo Sanz Marin, and Luca Tomassetti

Master’s Program:

Macroeconomic Policy and Financial Markets

Paper Abstract:

In this paper we study the dynamics and drivers of 10-year sovereign bond yields using a panel of the original 11 Eurozone countries (excluding Luxembourg). The interest of this study lies in the fact that, despite very different macroeconomic policy stances in the variables we believe determine interest rates, 10-year Eurozone bond yields converged almost perfectly during the 2000s, before suddenly disconnecting in the aftermath of the Great Financial Crisis.

To this end, we apply two different methodologies: a panel data approach (which we end up discarding) and a time-varying coefficients model estimated with the Kalman filter, which allows us to capture changes in the pricing mechanism of bond yields over time. Initially, using the latter methodology without controlling for the volatility of interest rates (which increased dramatically after 2008), we obtain very noisy results that are hard to interpret, since the coefficients seem to be capturing these changes in volatility. Once we introduce into the filter a GARCH process for the variance-covariance matrix of the interest rates used in the time-varying coefficients approach, we obtain much more meaningful and informative results.
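To make the time-varying-coefficients idea concrete, here is a minimal NumPy sketch of a Kalman filter with random-walk coefficients. For clarity it keeps the observation variance constant rather than attaching the GARCH process described above, and all names are illustrative rather than the authors’ code.

```python
# Minimal TVP regression via the Kalman filter: y_t = x_t' beta_t + e_t, beta_t = beta_{t-1} + eta_t.
import numpy as np

def kalman_tvp(y, X, H=1.0, q=0.01):
    """y: (T,) yields; X: (T, k) drivers; H: observation variance; q: state innovation variance."""
    T, k = X.shape
    beta = np.zeros(k)            # filtered coefficients
    P = np.eye(k) * 1e2           # diffuse-ish initial uncertainty
    Q = np.eye(k) * q
    betas = np.zeros((T, k))
    for t in range(T):
        P = P + Q                              # prediction step (random-walk state)
        x = X[t]
        F = x @ P @ x + H                      # forecast-error variance
        K = P @ x / F                          # Kalman gain
        beta = beta + K * (y[t] - x @ beta)    # update with the yield surprise
        P = P - np.outer(K, x) @ P
        betas[t] = beta
    return betas

# Usage sketch: betas = kalman_tvp(yields_italy, np.column_stack([np.ones(len(debt_gdp)), debt_gdp, deficit_gdp]))
```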

One of our key contributions is the inclusion of new fiscal and macroeconomic variables as determinants of yields in the different Eurozone countries, variables that were discarded by other studies in the field. We also contribute by controlling for determinants common to all the Eurozone countries, which we obtain by applying a common component approach. Furthermore, our findings confirm that after the period of divergence in interest rates that began in the aftermath of the Great Financial Crisis, caused by a refocus on fundamentals, Eurozone interest rates have converged again as bond yield drivers normalised towards their pre-crisis levels, although not to the same extent. Another implication we find is that in times of economic uncertainty and financial hysteresis, when default risk becomes an issue, the effect of government policy on interest rates can lead to significantly accentuated crowding-out effects.
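One simple way to implement the common-component idea (again a sketch under our own assumptions, not the authors’ code) is to take the first principal component of the standardised panel of yields and use it as a common control:

```python
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

yields = pd.read_csv("ez_10y_yields.csv", index_col=0, parse_dates=True)  # hypothetical country-by-date panel
z = StandardScaler().fit_transform(yields.values)
common_factor = PCA(n_components=1).fit_transform(z)[:, 0]  # first principal component of Eurozone yields
```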

Conclusions and key results:

Our work indicates a significant break in the way sovereign debt was priced after the Great Financial Crisis of 2008, pointing to a return to fundamentals as the main drivers of sovereign yields. We find that several factors reflecting fiscal and macroeconomic stances became increasingly important during the crisis, after having been ignored in previous years. As such, debt to GDP, deficit to GDP, GDP growth and the current account balance to GDP, among others, started to play important roles in the determination of long-term interest rates on Eurozone government bonds. In line with previous research, our findings confirm the existence of three distinct phases in the euro bond market: a period of high integration, a period of disintegration, and a phase of partial reintegration (Adam and Lo Duca, 2017).

Our findings suggest that during periods of economic uncertainty characterised by high volatility in financial markets, investors tend to focus on fundamentals, while in boom times they do not discriminate much between different stances on these macroeconomic determinants. This finding has important policy implications, since it suggests that during economic crises interest rates react much more to unsustainable fiscal policies and macroeconomic imbalances than in calmer times, causing a large crowding-out effect on the private sector (Laubach, 2011).

Therefore, our results suggest that governments should pay closer attention to their fiscal stances during times of economic turbulence in order to avoid the detrimental effects of high interest rates on activity at a time when economic agents lack confidence. As argued before by De Grauwe and Ji (2013), this effect is exacerbated by the fact that Eurozone governments have no control over monetary policy, making it impossible for them to reduce interest rates by any means other than sound fiscal policies. In line with this result, we find that the ECB’s unconventional monetary policy helped to bring down European bond yields after 2014: the impact of short-term interest rates (one of our common determinants, obtained by principal components) on long yields has diminished over time. This contributed to bringing the fiscal stances of these countries, and other essential macroeconomic variables, back to sustainable levels. Together with the structural reforms carried out, which also helped restore economic confidence and dynamism, this has had a further loosening effect on the interest rates these countries face in every debt issuance.

Regarding the methodologies used to address our research question, we were able to obtain robust results and determine which method was most appropriate for investigating the drivers of 10-year sovereign bond yields. We found that panel data approaches, which are widely used in the literature, lead to unstable and unsatisfactory results, causing us to attach limited credibility to the outcomes of such analyses. The time-varying coefficients approach, by contrast, seems more reliable and yields more robust and plausible results once we model the changes in volatility appropriately. We believe that a larger sample (we use the forecasts released twice a year by the IMF in its World Economic Outlook and by the OECD in its Economic Outlook in order to control for markets’ forward-looking behaviour in current interest rate levels, as well as for reverse causality) would have allowed us to obtain more reliable results with this approach as well.

A suggestion for further research would be to apply Bayesian techniques to estimate our model. Given the limited amount of data available and the complexity of our models, these methods seem better suited to this kind of estimation, where the large number of parameters, as well as the possible presence of non-linearities, can make the optimisation process very costly. This methodology would also have allowed us to model the variance of the time-varying parameters, and not only that of the interest rates (our observables), with another GARCH or stochastic volatility process, since we expect these variances could also follow a conditional process, which might affect our estimation results.


Download the full paper [pdf]


More about the Macro Program at the Barcelona Graduate School of Economics

Economics articles by BGSE alumni at CaixaBank Research

Ricard Murillo, Marta Guasch, and Mar Domènech in front of CaixaBank. Photo by Marta Guasch.

We’ve just come across some articles written by several Barcelona GSE Alumni who are now Research Assistants and Economists at CaixaBank Research in Barcelona. New articles are published each month on a range of topics.

Below is a list of all the alumni we found listed as article contributors, as well as their most recent publications in English (click each author to view his or her full list of articles in English, Catalan, and Spanish).

If you’re an alum and you’re also writing about Economics, let us know where we can find your stuff!

Gerard Arqué (Master’s in Macroeconomic Policy and Financial Markets ’09)

The (r)evolution in the regulatory and supervisory framework resulting from the crisis

Mar Domènech (Master’s in International Trade, Finance, and Development ’17)

Registered workers affiliated to Social Security: situation and outlook across sectors

Active labour market policies: a results-based evaluation

Equal opportunities: levelling the playing field for everyone

Cristina Farràs (Master’s in Macroeconomic Policy and Financial Markets ’17)

The financial situation of Millennial households in the US and Spain: will they catch up with previous generations?

Measures to improve equality of opportunities

Marta Guasch (Master’s in International Trade, Finance, and Development ’17)
and Adrià Morron (Master’s in Economics ’12)

Jay Gatsby’s American Dream: between inequality and social mobility

Ricard Murillo (Master’s in International Trade, Finance, and Development ’17)

Inflation will gradually recover in the euro area

Millennials and politics: mind the gap!

The sensitivity of inflation to the euro’s appreciation

Ariadna Vidal Martínez (Master’s in Finance ’12)

Situation and outlook for consumer financing


Source: CaixaBank Research

Could post-Brexit uncertainty have been predicted?

By Cox Bogaards, Marceline Noumoe Feze, Swasti Gupta, Mia Kim Veloso

Almost a year since the UK voted to leave the EU, uncertainty remains elevated, with the UK’s Economic Policy Uncertainty Index at historical highs. With Theresa May’s snap general election in just under two weeks, the Labour party has narrowed the Conservatives’ lead to five percentage points, which, combined with the weak GDP data of only 0.2 per cent growth in Q1 2017 released yesterday, has driven the pound sterling to a three-week low against the dollar. Given the potentially large repercussions of market sentiment and financial market volatility on the economy as a whole, this series of events has further emphasised the need for policymakers to implement effective forecasting models.

In this analysis, we contribute to ongoing research by assessing whether the uncertainty in the aftermath of the UK’s vote to leave the EU could have been predicted. Using the volatility of the Pound-Euro exchange rate as a measure of risk and uncertainty, we test the performance of one-step ahead forecast models including ARCH, GARCH and rolling variance in explaining the uncertainty that ensued in the aftermath of the Brexit vote.

Introduction

The UK’s referendum on EU membership is a prime example of an event which perpetuated financial market volatility and wider uncertainty.  On 20th February 2016, UK Prime Minister David Cameron announced the official referendum date on whether Britain should remain in the EU, and it was largely seen as one of the biggest political decisions made by the British government in decades.

Assessment by HM Treasury (2016) of the immediate impacts suggested “a vote to leave would cause an immediate and profound economic shock creating instability and uncertainty”, and that in a severe shock scenario the sterling effective exchange rate index could depreciate by as much as 15 percent. This was echoed in responses to the Centre for Macroeconomics’ (CFM) survey (25th February 2016), where 93 percent of respondents agreed that the possibility of the UK leaving the EU would lead to increased volatility in financial markets and the broader economy, expressing uncertainty about the post-Brexit world.

Echoing these views, the UK’s vote to leave the EU on 23rd June 2016 did indeed lead to significant currency impacts, including GBP devaluation and greater volatility. On 27th June 2016, the Pound Sterling fell to $1.315, a 31-year low against the dollar and below the Pound’s “Black Wednesday” value of 1992, when the UK left the ERM.

In this analysis, we assess whether the uncertainty in the aftermath of the UK’s vote to leave the EU could have been predicted. Using the volatility of Pound-Euro exchange rate as a measure of risk and uncertainty, we test the performance of one-step ahead forecast models including ARCH, GARCH and rolling variance. We conduct an out-of-sample forecast based on models using daily data pre-announcement (from 1st January 2010 until 19th February 2016) and test performance against the actual data from 22nd February 2016 to 28th February 2017.

Descriptive Statistics and Dynamic Properties

As can be seen in Figure 1, the value of the Pound exhibits a general upward trend against the Euro over the majority of our sample. The series peaks at the start of 2016, and begins a sharp downtrend afterwards.  There are several noticeable movements in the exchange rate, which can be traced back to key events, and we can also comment on the volatility of exchange rate returns surrounding these events, as a proxy for the level of uncertainty, shown in Figure 2.

Figure 1: GBP/EUR Exchange Rate


Source: Sveriges Riksbank and authors’ calculations

Notably, over our sample, the pound reached its lowest level against the Euro at €1.10 in March 2010, amid pressure from the European Commission on the UK government to cut spending, along with a bearish housing market in England and Wales. The Pound was still recovering from the financial crisis, during which it was severely affected and almost reached parity with the Euro at €1.02 in December 2008 – its lowest recorded value since the Euro’s inception (Kollewe, 2008).

However, from the second half of 2011 the Pound began rising against the Euro as the Eurozone debt crisis unfolded. After some fears of a new recession due to consistently weak industrial output, by July 2015 the pound hit a seven-and-a-half-year high against the Euro at €1.44. Volatility over this period remained relatively low, except in the run-up to the UK general election in early 2015.

However, Britain’s vote to leave the EU on 23rd June 2016 raised investors’ concerns about the economic prospects of the UK. In the next 24 hours, the Pound depreciated by 1.5 per cent on the immediate news of the exit vote and by a further 5.5 per cent over the weekend that followed, causing volatility to spike to new record levels as can be seen in Figure 2.

Figure 2: Volatility of GBP/EUR Exchange Rate


Source: Sveriges Riksbank and authors’ calculations

As seen in Figure 1, the GBP-EUR exchange rate series is trending for the majority of the sample, which may reflect non-stationarity; in that case standard asymptotic theory would be violated and shocks would be infinitely persistent. We conduct an Augmented Dickey-Fuller test on the exchange rate, find evidence of non-stationarity, and proceed by constructing daily log returns in order to de-trend the series. Table 1 summarises the first four moments of the daily log returns series, which is stationary.
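A minimal sketch of this step, assuming a daily GBP/EUR series in a CSV file with illustrative names, could look like this:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

fx = pd.read_csv("gbp_eur.csv", index_col=0, parse_dates=True)["gbpeur"]  # hypothetical file and column
adf_stat, pvalue, *_ = adfuller(fx, autolag="AIC")
print(f"ADF on levels: stat={adf_stat:.2f}, p={pvalue:.2f}")  # a high p-value means a unit root is not rejected

returns = 100 * np.log(fx).diff().dropna()  # daily log returns, in percent
print(returns.describe())
```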

Table 1: Summary Statistics


Source: Sveriges Riksbank and authors’ calculations

The series has a mean close to zero, suggesting that on average the Pound neither appreciates nor depreciates against the Euro on a daily basis. There is a slight negative skew and significant kurtosis – almost five times the normal distribution’s value of three – as depicted in the kernel density plot below. This suggests that the distribution of daily GBP-EUR returns, like many financial time series, exhibits fat tails, i.e. a higher probability of extreme changes than under the normal distribution, as would be expected.

To determine whether there is any dependence in our series, we assess the autocorrelation in the returns. Carrying out a Ljung-Box test using 22 lags, as this corresponds to a month of daily data, we cannot reject the null of no autocorrelation in the returns series, which is confirmed by an inspection of the autocorrelograms. While we find no evidence of dependence in the returns series, we find strong autocorrelations in the absolute and squared returns.

The non-significant ACF and PACF of returns, together with the significant ACFs of absolute and squared returns, indicate that the series exhibits ARCH effects. This suggests that the variance of returns changes over time and that there may be volatility clustering. To test this, we conduct an ARCH-LM test using four lags of returns and find that the F-statistic is significant at the 0.05 level.
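The same checks can be reproduced with statsmodels (a sketch continuing from the returns series constructed above; argument names follow recent statsmodels releases):

```python
from statsmodels.stats.diagnostic import acorr_ljungbox, het_arch

lb_returns = acorr_ljungbox(returns, lags=[22])      # returns: no autocorrelation expected
lb_squared = acorr_ljungbox(returns**2, lags=[22])   # squared returns: strong autocorrelation expected
arch_lm = het_arch(returns, nlags=4)                 # (LM stat, LM p-value, F stat, F p-value)
print(lb_returns, lb_squared, arch_lm, sep="\n")
```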

Estimation

For the in-sample analysis we proceed using the Box-Jenkins methodology. Given the evidence of ARCH effects and volatility clustering from the ARCH-LM test, but the lack of any leverage effects, in line with economic theory, we estimate models that can capture this: ARCH(1), ARCH(2), and GARCH(1,1). Estimation of the ARCH(1) suggests low persistence, as captured by α1, and relatively fast mean reversion. The ARCH(2) model generates greater persistence, measured by the sum of α1 and α2, but still not as large as that of the GARCH(1,1) model, measured by the sum of α1 and β, as shown in Table 2.
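For illustration, the three specifications can be estimated with the arch package, a natural (though not necessarily the authors’) choice for this step; returns is the daily log-return series from above:

```python
from arch import arch_model

arch1 = arch_model(returns, mean="Constant", vol="ARCH", p=1).fit(disp="off")
arch2 = arch_model(returns, mean="Constant", vol="ARCH", p=2).fit(disp="off")
garch11 = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1).fit(disp="off")

for name, res in [("ARCH(1)", arch1), ("ARCH(2)", arch2), ("GARCH(1,1)", garch11)]:
    print(name, res.aic, res.bic, res.loglikelihood)  # in-sample comparison criteria
```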

Table 2: Parameter Estimates


We proceed to forecast using the ARCH(1), as it has the lowest AIC and BIC in-sample, and the GARCH(1,1), which has the most normally distributed residuals, no dependence in absolute levels, and the largest log-likelihood. We compare performance against a baseline 5-day rolling variance model.

Figure 3 plots the out-of-sample forecasts of the three models (from 22nd February 2016 to 28th February 2017). The ARCH model is able to capture the spike in volatility surrounding the referendum; however, the shock does not persist. In contrast, the effect of this shock in the GARCH model fades more slowly, suggesting that uncertainty persists for a longer time. Neither of the models, however, fully captures the magnitude of the spike in volatility. This is in line with Dukich et al.’s (2010) and Miletić’s (2014) findings that GARCH models are not able to adequately capture the sudden shifts in volatility associated with shocks.

Figure 3: Volatility forecasts and Squared Returns (5-day Rolling window)


We use two losses traditionally used in the volatility forecasting literature, namely the quasi-likelihood (QL) loss and the mean-squared error (MSE) loss. QL depends only on the multiplicative forecast error, whereas MSE depends only on the additive forecast error. Of the two losses, QL is often preferred, as MSE has a bias that is proportional to the square of the true variance, while the bias of QL is independent of the volatility level. As shown in Table 3, GARCH(1,1) has the lowest QL, while the ARCH(1) and rolling variance perform better on the MSE measure.
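For reference, one common parameterisation of the two losses, with h_t denoting the volatility forecast and σ̂²_t the squared-return proxy (our notation, not taken from the paper), is:

```latex
\mathrm{MSE}_t = \left(\hat{\sigma}^2_t - h_t\right)^2, \qquad
\mathrm{QL}_t = \frac{\hat{\sigma}^2_t}{h_t} - \ln\frac{\hat{\sigma}^2_t}{h_t} - 1
```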

Table 3: QL & MSE


Table 4: Diebold-Mariano Test (vs. 5-day Rolling window)


Employing the Diebold-Mariano (DM) test, we find that the DM statistics are not significant for either the QL or the MSE loss. Neither the GARCH nor the ARCH model is found to perform significantly better than the 5-day rolling variance.
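A bare-bones version of the DM comparison on two series of per-period losses might look as follows; for simplicity it uses the plain variance of the loss differential, whereas a HAC (Newey-West) long-run variance would be the more careful choice:

```python
import numpy as np
from scipy import stats

def diebold_mariano(loss_a, loss_b):
    """DM statistic and two-sided p-value for equal predictive accuracy (one-step-ahead case)."""
    d = np.asarray(loss_a) - np.asarray(loss_b)        # loss differential
    dm = d.mean() / np.sqrt(d.var(ddof=1) / len(d))
    return dm, 2 * (1 - stats.norm.cdf(abs(dm)))

# Usage sketch: dm_stat, pval = diebold_mariano(ql_garch, ql_rolling)
```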

Conclusion

In this analysis, we tested various models to forecast the volatility of the Pound exchange rate against the Euro in light of the Brexit referendum. In line with Miletić (2014), we find that despite accounting for volatility clustering through ARCH effects, our models do not fully capture volatility during periods of extremely high uncertainty.

We find that the shock to the exchange rate resulted in a large but temporary swing in volatility, which did not persist as long as predicted by the GARCH model. In contrast, the ARCH model has very low persistence, and while it captures the temporary spike in volatility well, it quickly reverts to the unconditional mean. To the extent that we can consider exchange rate volatility a measure of risk and uncertainty, we might have expected the outcome of Brexit to have a long-term effect on uncertainty. However, we observe that exchange rate volatility after Brexit does not seem significantly higher than before. This may suggest either that uncertainty does not persist (unlikely) or that Pound-Euro exchange rate volatility does not fully capture the uncertainty surrounding the future of the UK outside the EU.

References

Abdalla S.Z.S (2012), “Modelling Exchange Rate Volatility using GARCH Models: Empirical Evidence from Arab Countries”, International Journal of Economics and Finance, 4(3), 216-229

Allen, K. and Monaghan, A. (2016), “Brexit Fallout – the Economic Impact in Six Key Charts”, www.theguardian.com, Guardian News and Media Limited, 8 July 2016. Web. Accessed: March 11, 2017

Brownlees C., Engle R., and Kelly B. (2011), “A Practical Guide to Volatility Forecasting Through Calm and Storm”, The Journal of Risk, 14(2), 3-22.

Centre for Macroeconomics (2016), “Brexit and Financial Market Volatility”. Accessed: March 9, 2017.

Cox, J. (2017) “Pound sterling falls after Labour slashes Tory lead in latest election poll”, independent.co.uk. Web. Accessed May 26, 2017

Diebold, F. X. (2013), “Comparing Predictive Accuracy, Twenty Years Later: A Personal Perspective on the Use and Abuse of Diebold-Mariano Tests”

Dukich, J., Kim, K.Y., and Lin, H.H. (2010), “Modeling Exchange Rates using the GARCH Model”

HM Treasury (2016), “HM Treasury analysis: the immediate economic impact of leaving the EU”, published 23rd May 2016.

Sveriges Riksbank, “Cross Rates” www.riksbank.se. Web. Accessed 16 Feb 2017

Taylor, A. and Taylor, M. (2004), “The Purchasing Power Parity Debate”, Journal of Economic Perspectives, 18(4), 135-158.

Van Dijk, D., and Franses P.H. (2003), “Selecting a Nonlinear Time Series Model Using Weighted Tests of Equal Forecast Accuracy”, Oxford Bulletin of Economics and Statistics, 65, 727–44.

Tani, S. (2017), “Asian companies muddle through Brexit uncertainty” asia.nikkei.com. Web. Accessed: May 26, 2017

TEDxGothenburg: Money talks – but where does it come from? (Sascha Buetzer ’11)

In this post, Sascha Buetzer ’11 (Macro) shares his experience of giving a talk at TEDx Gothenburg.


Barcelona GSE Macroeconomics and Financial Markets alum Sascha Buetzer ’11 is currently a PhD candidate in Economics at the University of Munich. After starting his career at the European Central Bank, he has been working as an economist in the International Monetary Affairs Division at the Deutsche Bundesbank, primarily in an advisory role for the German executive director at the International Monetary Fund.

In this post, Sascha shares his experience of giving a talk at TEDx Gothenburg in November 2016. The video is included below.



Last November I was given the unique opportunity to present my ideas to an audience that usually doesn’t get to hear much about the inner workings of the financial system and central banking. At the TEDx Conference “Spectrum” in Gothenburg, Sweden, I was one of 10 speakers who were given not more than 20 minutes to convey an “idea worth spreading.”

My talk centered around how money is created in a modern economy and what can be done to improve upon this process, in particular in the context of the euro crisis. Quite a challenge, as it turned out, given the short amount of time and the need to keep things understandable and entertaining since most of the audience did not have an economic background.

It was, however, an extremely exciting and rewarding experience.

After getting in touch with the organizers through a chance encounter at the Annual Meeting of the Asian Development Bank (ADB) last year, all speakers got an individually assigned coach who provided excellent feedback and recommendations for the talk.

The day of the conference itself was thoroughly enjoyable (at least after the initial rush of adrenaline had subsided). It was fascinating to interact with people from widely varying backgrounds who all shared a natural curiosity and a desire to learn from each other. And at the end of the day it felt great to have been able to contribute to this by providing people with insight into a topic that is so important, yet so little known.

Watch the talk here:

See also:
– Coverage of the talk by University of Gothenburg
– More coverage from University of Gothenburg
– Other talks from TEDxGothenburg “Spectrum” 2016

Monetary policy effects on inequality: A country- and state-level analysis for the United States

Editor’s note: This post is part of a series showcasing Barcelona GSE master projects by students in the Class of 2016. The project is a required component of every master program.


Authors:
Carola Ebert, Sigurdur Olafson, and Hannah Pfarr

Master’s Program:

Macroeconomic Policy and Financial Markets

Paper Abstract:

Our project focuses on assessing a potential relationship between monetary policy actions and economic inequality in the form of the Gini index. The analysis is conducted for the United States at the country as well as the state level. Lack of quality and comparability of inequality data is a major empirical challenge in research on inequality in general. Thus, we use different methodologies for the Gini index data to ensure the robustness of the main results at the country level. Moreover, the main contribution of this work is the analysis at a more disaggregated level, i.e. the state and regional level. The state- and region-level analysis provides a further line of investigation into the relationship between monetary policy and inequality. When considering monetary policy effects on inequality at an aggregated country level, one cannot be sure whether potentially heterogeneous reactions of inequality within the country are washed away by the aggregation over states. Therefore, this paper analyses the relation between monetary policy and inequality not only at the aggregate country level, but also at the state level. Furthermore, as a first step towards investigating potential transmission channels of monetary policy into inequality, further tests with regard to initial wealth levels across states are conducted.


Main Conclusion:

In this paper we analyzed the implications of a monetary policy shock for inequality at the country, regional, and state level. The results are largely consistent across different model specifications and geographic levels, implying that a contractionary monetary policy shock seems to raise inequality on impact. Although the effect is small, it is consistently positive across states and regions. The very fact that we find an effect of monetary policy on inequality is a contribution in itself.


Our benchmark OLS model leads to the conclusion that a contractionary monetary policy shock leads to an increase in inequality. This finding holds for richer as well as poorer states. The contemporaneous impact stays positive over the time horizon considered. We have run several robustness checks using different model specifications for OLS as well as a VAR approach, and these checks confirm the earlier findings.
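As an illustration of what such a benchmark regression might look like in practice (the file, variable names, shock series, and exact specification are our assumptions, not the authors’), a state-level panel OLS with state fixed effects could be run as:

```python
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("state_gini_panel.csv")              # hypothetical columns: state, year, gini, mp_shock
panel["d_gini"] = panel.groupby("state")["gini"].diff()  # change in the state Gini index
df = panel.dropna(subset=["d_gini"])

ols = smf.ols("d_gini ~ mp_shock + C(state)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": pd.factorize(df["state"])[0]})
print(ols.params["mp_shock"])  # a positive coefficient means a contractionary shock raises inequality on impact
```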



Outlook:

However, we have stressed that there is room for further investigation to shed light on this area. One potential avenue would be to expand the analysis using alternative inequality measures. We conducted our analyses using the Theil and Atkinson indices obtained from Frank’s website; however, since the results obtained with these two measures do not add anything to the analysis, we have not included them in the paper.

Furthermore, some authors stress the relevance of the top and bottom of the income distribution when analyzing effects on inequality. Therefore, one might think about controlling for the share of top incomes within the states to assess to what extent the reaction to monetary policy would change. We already tried to capture part of these potential dynamics using the simple mean-comparison tests for wealth dependence.

We find evidence that seems to suggest that monetary policy has a stronger impact on wealthier states and regions than on poorer ones. These results are obtained using simple mean-comparison tests and should be viewed as preliminary, as further research is necessary to conclude whether the initial wealth level of states and regions is relevant for the transmission mechanism from monetary policy to inequality.

Given that our results show an effect of monetary policy on inequality, the next step in this line of research could be to investigate the actual transmission mechanism, including a more elaborate analysis of the potential channels through which monetary policy affects inequality.

As a last remark, we want to point out that we were originally interested in investigating this topic for the European Monetary Union member states. This, however, is not possible due to data limitations. Firstly, across European countries there is no uniform definition and measurement of inequality. Secondly, the monetary union is fairly new, and the time series for a common monetary policy shock would be too short for our analysis. Nevertheless, we do believe this is an interesting path for future research and, as pointed out in the literature review, the finding of a robust relationship between monetary policy and inequality backs this claim.