I am a Postdoc at the Institute for Macroeconomics and Econometrics of the University of Bonn. Currently, I am directly funded by the German Research Foundation (I am the principal investigator of DFG Project 441540692). My research focuses on Dynamic Macroeconomics, Monetary Theory, and Inequality, with an emphasis on structural empirical analysis and heterogeneity. I also have a strong background in quantitative methods and IT. My research statement can be found here.
A Structural Investigation of Quantitative Easing
Review of Economics and Statistics, forthcoming
[paper, ungated, WP version, with Gavin Goy and Felix Strobel, code, replication] (read more)
We provide evidence that the Federal Reserve's large-scale asset purchases actually reduce inflation. Using nonlinear Bayesian methods that fully account for the binding zero lower bound (ZLB), we estimate a macro-finance DSGE model. Counterfactual analysis suggests that by easing financing conditions, quantitative easing facilitated an increase in aggregate investment. The resulting expansion of firms' production capacities lowered their marginal costs. These disinflationary supply-side effects dominated the inflationary effects of higher aggregate demand. At the ZLB, the concomitant rise in real interest rates in turn induced a net fall in aggregate consumption.
Monetary Policy and Speculative Asset Markets
European Economic Review, 2022
I study monetary policy in an estimated financial New Keynesian model extended by behavioral expectation formation in the asset market. Credit frictions create a feedback between asset markets and the macroeconomy, and behaviorally motivated speculation can amplify fundamental swings in asset prices, potentially causing endogenous, nonfundamental bubbles and busts. Booms in asset prices improve firms' financing conditions and are therefore deflationary. These features significantly improve the model's ability to replicate key empirical moments. A monetary policy that targets asset prices can dampen financial cycles and reduce volatility in asset markets (dampening effect). This comes at the cost of creating an additional channel through which asset price fluctuations transmit to macroeconomic fundamentals (spillover effect). I find that unless financial markets are severely overheated, the undesirable fluctuations in inflation and output caused by the spillover effect more than outweigh the benefits of the dampening effect.
Efficient Solution and Computation of Models with Occasionally Binding Constraints
Journal of Economic Dynamics and Control, 2022
[paper, ungated, WP version, code, replication] (read more)
Structural estimation of macroeconomic models and of new HANK-type models with extremely high dimensionality requires fast and robust methods to deal efficiently with occasionally binding constraints (OBCs). This paper proposes a novel algorithm that solves for the perfect foresight path of piecewise-linear dynamic models. In terms of computation speed, the method outperforms its competitors by more than three orders of magnitude. I develop a closed-form solution for the full trajectory given the expected duration of the constraint. This makes it possible to quickly iterate on and validate guesses of the expected duration until a perfect-foresight equilibrium is found. A toolbox, featuring an efficient implementation, a model parser and various econometric tools, is provided in the Python programming language. Benchmarking results show that for medium-scale models with an occasionally binding interest rate lower bound, more than 150,000 periods can be simulated per second. Even simulating large HANK-type models with almost 1000 endogenous variables requires only 0.2 milliseconds per period.
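To make the guess-and-verify idea concrete, here is a deliberately tiny Python sketch. Everything in it, from the backward-looking law of motion to the parameter values, is made up for illustration (the paper's actual algorithm handles forward-looking piecewise-linear systems and obtains the trajectory in closed form): guess how long the lower bound binds, simulate under that guess, and check the guess against the notional (shadow) rate.

```python
def simulate(k, x0, T=30, rho=0.8, sigma=0.5, phi=1.5, d=0.05, lb=0.0):
    """Simulate a toy model assuming the bound binds for the first k periods."""
    xs, rs = [], []
    x = x0
    for t in range(T):
        r = lb if t < k else phi * x   # imposed regime, to be verified
        xs.append(x)
        rs.append(r)
        x = rho * x - sigma * r + d    # made-up backward-looking law of motion
    return xs, rs

def find_spell(x0, phi=1.5, lb=0.0, max_k=50):
    """Iterate on the guessed spell length k until it is self-consistent."""
    for k in range(max_k):
        xs, rs = simulate(k, x0, phi=phi, lb=lb)
        # the bound should bind exactly when the notional rate is below it
        if all((phi * x < lb) == (t < k) for t, x in enumerate(xs)):
            return k, xs, rs
    raise RuntimeError("no consistent spell length found")

k, xs, rs = find_spell(-0.3)
print(k)  # -> 4: the bound binds for four periods, then lifts
```

In a forward-looking model each guess requires solving for a whole equilibrium path, which is where a closed-form trajectory pays off; this toy only mirrors the outer verification loop.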
Ensemble MCMC Sampling for Robust Bayesian Inference
[current version (01/2023), working paper, code (Python), code (Julia), code (matlab)] (read more)
This paper proposes a Differential-Independence Mixture Ensemble (DIME) sampler for the Bayesian estimation of structural models. It allows for the estimation of models that are computationally expensive to evaluate and that feature challenging, multimodal, high-dimensional black-box posterior distributions. DIME is a "Swiss Army knife," combining the advantages of gradient-free global multi-start optimizers with the properties of Markov chain Monte Carlo methods. The sampler runs many chains in parallel, with the number of necessary iterations scaling well with the number of chains. This permits more likelihood evaluations in less time while also requiring fewer evaluations in total. To demonstrate its potential, DIME is used to estimate a medium-scale heterogeneous agent New Keynesian ("HANK") model with two assets, thereby, for the first time, including the households' preference parameters, which determine the model's steady-state distribution. The results point towards a somewhat less accentuated role of household heterogeneity for empirical macroeconomic dynamics.
The Empirical Performance of the Financial Accelerator since 2008 (under review)
[current version (12/2022), working paper, with Felix Strobel, code] (read more)
We evaluate the empirical performance of financial frictions à la Bernanke et al. (1999) during and after the Global Financial Crisis. We document that in an ex-post analysis, these frictions do not improve the canonical medium-scale DSGE model's ability to explain macroeconomic dynamics during the Great Recession. The reason is that in the estimated model with financial frictions, the drastic post-2008 collapse of investment causes firms' leverage to decline, which in the model would trigger a counterfactual narrowing of the credit spread. This would imply financial decelerator effects for the post-2008 period. Additionally, the estimated model only attributes a minor role to the associated financial shocks. These findings are confirmed independently for US and euro area data. Our analysis is based on nonlinear Bayesian methods, which account for the binding effective lower bound on nominal interest rates.
Estimation of DSGE Models with the Effective Lower Bound (under review)
[current version (07/2022), working paper, with Felix Strobel, posterior & historic shocks, code] (read more)
We propose a set of tools for the efficient and robust Bayesian estimation of medium- and large-scale DSGE models while accounting for the effective lower bound on nominal interest rates. We combine a novel nonlinear recursive filter with a computationally efficient piece-wise linear solution method and a state-of-the-art MCMC sampler. The filter allows for fast likelihood approximations, in particular of models with large state spaces. Using artificial data, we demonstrate that our methods accurately capture the true model parameters even with very long lower bound episodes. We apply our approach to analyze post-2008 US business cycle properties.
The Hockey Stick Phillips Curve and the Zero Lower Bound (under review)
[current version (01/2023), working paper, with Philipp Lieberknecht, code] (read more)
We show that the interplay between a binding effective lower bound (ELB) and the costs of external financing weakens the disinflationary effect of financial shocks. In normal times, factor costs dominate firms' marginal costs and hence inflation; credit spreads and the nominal interest rate, which together constitute external financing costs, balance out. When nominal rates are constrained by the ELB, larger spreads can partly offset the effect of lower factor costs on firms' price setting. The Phillips curve is hence flat at the ELB, but features a positive slope in normal times and thus an overall hockey stick shape. This mechanism also weakens the effects of forward guidance on inflation, since such policy reduces spreads and thereby financing costs.
Rational vs. Irrational Beliefs in a Complex World (under review)
[current version (09/2022), working paper, with Cars Hommes, code] (read more)
Can boundedly rational agents survive competition with fully rational agents? We develop a highly nonlinear heterogeneous-agent model with rational, forward-looking versus boundedly rational, backward-looking agents, whose market shares evolve depending on their relative performance. Our novel numerical solution method detects equilibrium paths characterized by complex bubble and crash dynamics. Boundedly rational trend-extrapolators amplify small deviations from fundamentals, while rational agents anticipate market crashes after large bubbles and drive prices back close to fundamental value. Overall, rational and non-rational beliefs co-evolve over time with time-varying impact, and their interaction produces complex endogenous bubbles and crashes without any exogenous shocks.
Can Taxation Predict US-Top-Wealth Share Dynamics?
[current version, working paper, with Thomas Fischer] (read more)
We show that the degree of capital gains taxation can retrace the dynamics of wealth inequality in the US since the 1920s. Precisely matching the up- and downturns and levels of top shares, it has high overall explanatory power. This result is drawn from an estimated and micro-founded portfolio-choice model in which idiosyncratic return risk and disagreement in expectations about asset returns generate an analytically tractable fat-tailed Pareto distribution for the top-wealthy. This allows us to decompose the sample into periods of transient and stationary wealth concentration. The model generates good out-of-sample forecasts. In addition, we predict the future evolution of inequality under different tax regimes.
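The fat-tail mechanism can be sketched with the textbook random-growth argument (a deliberate simplification; the paper's model layers idiosyncratic return risk, disagreement, and taxation on top of it). If wealth at the top grows by an i.i.d. gross return and some stabilizing force keeps the distribution stationary, the stationary law has a Pareto tail whose exponent solves a simple moment condition:

```latex
% Stylized random-growth sketch, not the paper's exact model:
W_{t+1} = g_t\, W_t
\quad\Longrightarrow\quad
\Pr\left(W > w\right) \sim C\, w^{-\zeta},
\qquad \text{where } \zeta \text{ solves } \mathbb{E}\!\left[g_t^{\zeta}\right] = 1 .
```

Heuristically, higher after-tax returns shift the distribution of $g_t$ up, lowering $\zeta$ and thickening the top tail, which links the tax rate to top wealth shares.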
The Micro & Macro of (Unconventional) Monetary Policy: the Role of the Banking Sector
[current draft (07/2022), code, slides] (read more)
Macroeconomic theory assigns a central role to the risk-free savings rate, which in reality corresponds to the banks' deposit rate and is only indirectly controlled by the central bank. Backed by euro area evidence, this paper shows that the pass-through of central bank reserves and interest-on-reserves policy to equilibrium rates may be state-dependent. I develop an industrial organization model of the banking sector in which loans are financed by deposits and banks use reserves to hedge against the liquidity risk from holding deposits. Swapping assets for reserves (i.e., central bank asset purchases) compresses the liquidity premium between the lending and deposit rate, thereby stimulating lending. If the lending rate falls by less than the liquidity premium (if investment demand is elastic, e.g. in a recession), deposit rates may increase, thereby dampening consumption. Additionally, inflation may remain subdued as lower lending rates reduce firms' financing costs. Incorporated into an estimated DSGE model, the mechanism suggests that the effects of the ECB's policy amount to 0.25 percent of quarterly GDP, while the effects on inflation are negligible.
Econpizza: Solving Nonlinear Heterogeneous Agent Models Using Machine Learning Techniques
[current draft (now), code] (read more)
Econpizza is a framework to solve and simulate nonlinear perfect foresight models, with or without heterogeneous agents. A parser allows economic models to be expressed in a simple, high-level fashion as YAML files. Additionally, generic and robust routines for steady state search are provided.
(Unconventional) Fiscal Policy at the ELB
[work in progress, with Ralph Luetticke] (read more)
We document that the effects of the zero lower bound on nominal interest rates (ZLB) on inequality dynamics can be decomposed into two channels:
(i) the real rate channel compounds the effects of a recessionary shock on inequality, while (ii) the liquidity premium channel compresses the spread between liquid and illiquid assets, thereby reducing inequality. We find that the real rate channel dominates for income inequality, whereas wealth inequality is driven by the liquidity premium channel. Large-scale asset purchases (LSAPs) and helicopter money drops (HDs) are moderately effective in stimulating the economy.
Their effects are greatly amplified by the duration for which the ZLB is expected to bind.
HDs are always egalitarian. For LSAPs, short ZLB durations cause a hike in consumption inequality and a longer binding ZLB causes wealth inequality to increase.
The Quantitative Effects of Taxation on Inequality Dynamics
[work in progress] (read more)
We use a novel identification strategy of functional-coefficient structural inference to estimate the relationship between different forms of taxation and the concentration of income and wealth. We find that, overall, the degree of taxation has very high explanatory power for the dynamics of top shares, in particular the series of income taxes. This regularity holds for the US, UK, France and Sweden. We estimate that a 1 percent increase in taxation reduces the concentration of wealth in the long run by approximately 0.5 percent and the concentration of income by 0.25 percent. This effect is more pronounced in the US and less so in Sweden.
Helicopter Money and the Cross-Section of Households
[with Keith Kuester, work in progress ]
Low nominal rates and the saver: The dark side of lower(ing) interest rates
[with Pablo Guerrón-Quintana and Keith Kuester, work in progress ]
The Federal Reserve and quantitative easing: A boost for investment, a burden on inflation
[ VoxEU August 2020, with Gavin Goy and Felix Strobel]
The ETACE Virtual Appliance: An Exploratory for the Eurace@Unibi model
[with Sander van der Hoog, download paper, download software] (read more)
This paper presents the ETACE Virtual Appliance. The purpose of the software package is, among others, to enable researchers to explore the dynamics of the Eurace@Unibi agent-based macroeconomic model and to encourage the reproducibility and transparency of research. The package contains various components that allow the user to initialize, simulate and analyze the model. We also give a short overview of what can be done with the ETACE Virtual Appliance.
In brief, my current position is fully funded by the German Research Foundation (DFG) as part of my research project on nonlinear Bayesian estimations. With the OSE initiative I also successfully raised funding for teaching and research infrastructure. Prior to joining the University of Bonn I spent two years as a Postdoc at the IMFS at Goethe University Frankfurt in cooperation with the Hoover Institution at Stanford University (longer-term visits to Stanford in 2018 and 2019). I completed my PhD at the University of Amsterdam and Bielefeld University, supervised jointly by Cars Hommes and Herbert Dawid. I won the 2017 Student Prize of the Society for Computational Economics. During my PhD I was financed by scholarships from the Bielefeld Graduate School in Economics and the DFG. I obtained my MSc in economics from the University of Granada (top of class) and studied economics at Humboldt University Berlin at the undergraduate level. I have worked as a professional guitar player and as an IT consultant for several start-up companies.
My packages (the Python packages can be installed via `pip`):
emcwrap grgrlib DIMESampler.jl dime-mcmc-matlab
econpizza is a generic nonlinear solver for general equilibrium models, including heterogeneous agent models (with support for expressing and parsing models). It uses an alternative shooting algorithm that is faster and more robust than the extended path method implemented in Dynare. Example heterogeneous agent models (and representative agent models) are provided. The package is documented in this draft.
pydsge is a Python based solution and simulation toolbox, specifically targeted to provide tools for nonlinear filtering and estimation of models with occasionally binding constraints. Its back-end for nonlinear filtering is econsieve, a hybrid between the Particle filter and the Kalman filter. Both packages are explained in the respective method paper above.
emcwrap (Python), DIMESampler.jl (Julia) and dime-mcmc-matlab (matlab) provide the differential-independence mixture ensemble (DIME) MCMC sampler from my paper on ensemble MCMC sampling. DIME MCMC is a (very fast) global multi-start optimizer and, at the same time, an MCMC sampler that converges to the posterior distribution. This makes any posterior mode density maximization prior to MCMC sampling superfluous. The DIME sampler is pretty robust for odd-shaped, multimodal distributions. DIME MCMC is parallelizable: many chains can run in parallel, and the necessary number of draws decreases almost one-to-one with the number of chains.
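To illustrate only the many-chains bookkeeping, here is a minimal pure-Python ensemble of plain random-walk Metropolis chains sampling a standard normal target. This is not the DIME proposal itself (which mixes a differential-evolution move with an adaptive independence proposal) and it does not use the packages' actual APIs; all names and numbers are made up for the sketch.

```python
import math
import random

def log_prob(x):
    """Log-density of the target: a standard normal (pure illustration)."""
    return -0.5 * x * x

def ensemble_rwm(n_chains=30, n_iter=400, step=1.0, seed=0):
    """Run many random-walk Metropolis chains side by side and pool their
    post-burn-in draws. This mimics only the parallel-chains structure of
    ensemble samplers like DIME, not the DIME proposal."""
    rng = random.Random(seed)
    chains = [rng.gauss(0.0, 3.0) for _ in range(n_chains)]  # dispersed start
    draws = []
    for it in range(n_iter):
        for i, x in enumerate(chains):
            prop = x + step * rng.gauss(0.0, 1.0)
            # standard Metropolis accept/reject step
            if rng.random() < math.exp(min(0.0, log_prob(prop) - log_prob(x))):
                chains[i] = prop
        if it >= n_iter // 2:  # discard the first half as burn-in
            draws.extend(chains)
    return draws

draws = ensemble_rwm()
```

Because the chains are independent here, doubling `n_chains` simply doubles the draws per iteration; the point of DIME-style cross-chain proposals is that chains additionally learn the shape of the posterior from each other.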
- This guide by John Cochrane summarizes many useful tips on how to write a paper (and how not to).
- This post by Keith Head gives great advice on how to write an intro.
- The website of Eric Sims provides great course materials on macroeconomics.
- This is another nice collection of guides for early-career researchers.
- This course from Ken Judd and the website of Fabrice Collard have nice material on computational economics.
- If you are looking for implementations of macroeconomic models, the MMB replication archive and the repo of Johannes Pfeifer are a good place to start searching. Some models can also be found in my *.yaml archive.
- These notes by Jesús Fernández-Villaverde and Juan Rubio-Ramírez develop the microfoundations of the complete Smets-Wouters-like medium-scale DSGE model.
- This interactive online textbook (by Roger Labbe) gives an excellent and hands-on introduction into Bayesian filtering.
- This post explains very nicely how the Hamiltonian Monte Carlo (HMC) sampler works and, en passant, shows why using Metropolis-Hastings might not be a good idea for many problems in practice. Note that HMC is also behind the NUTS sampler used in Stan (a widely used sampling package), but it is not very feasible for many applications in structural (macro-)econometrics. The reason is that HMC requires the evaluation of the gradient at each draw, which is relatively costly for most of our likelihood functions. Have a look at the DIME sampler if you are looking for a powerful multi-purpose sampler.
- Python (with numba or jax) is incredibly fast
- Python is easy to debug and (normally) provides meaningful error messages
- Python is a general purpose language, and (truly) object-oriented
- Python is free and open source. There are no limiting licences or any additional costs
- Python is mature: updates don't tend to break things and bugs are very rare. Major tech firms have recently invested heavily in Python (jax by Google, PyTorch by Meta, Amazon, ...), so it will probably be around for a long time
- Python is well documented and has a huge active user base. The answers to many questions can be found on Stack Overflow
- Python enforces a high code quality and readability
- Knowing Python is also an asset outside of academia. It is the industry standard in machine learning and big data
- There are a trillion well documented and well tested packages out there for essentially any purpose
- Python is very well integrated into modern software development workflows (version control and continuous integration using github, automatic documentation using sphinx)
- To get started with Python, my recommendation is to install Anaconda and to use Spyder as the IDE (which comes with Anaconda). This is also a good starting point for people switching from matlab.
- This is an introductory course on Python programming I gave at the University of Bonn.
- Another nice introduction to Python from QuantEcon. They have great stuff on numerical methods as well.
- Python for matlab users: short vs. long.
- A good introduction to Git.
- Why matlab may be a barrier to scientific advancement.
- Paul Romer on using Python vs. matlab.
- A comparison of Julia, Python and matlab.
- numba - probably by now the first address to speed up your code.
- jax - emerging to be the new numba. Developed and endorsed by Google. Different philosophy, featuring automatic differentiation.
- chaospy - for quasi-random numbers and uncertainty quantification.
- filterpy (by Roger Labbe, see above) is a collection of linear and nonlinear Bayesian filters.
- interpolation.py (by Pablo Winant) provides fast-as-light interpolation tools.
mail [ät] gregorboehl [döt] com
gboehl [ät] uni-bonn [döt] de
Dr. Gregor Boehl
Institute for Macroeconomics and Econometrics
University of Bonn