In a recent paper, Michael Gofman, assistant professor of finance in the Wisconsin School of Business, tackles the implications of regulating large interconnected financial institutions. Assisted by his Ph.D. student, Alexander Dentler, Gofman has been using the Open Science Grid (OSG) through the University of Wisconsin-Madison’s Center for High Throughput Computing. For more than a year, Gofman has pursued a fundamental question: What is an optimal structure for the financial system?
“Especially since the financial crisis in 2008-2009, the desired structure of the financial system became a big question among academics and regulators,” said Gofman. “Policy makers began to ask about the roles of large interconnected financial institutions in the financial system. Some of them had to be bailed out by governments in the United States and elsewhere. Now that the crisis is behind us, the question is whether we should make any changes going forward.”
In order to model a very complicated financial architecture, Gofman began to do computer simulations of how banks trade. “This is hard to do on a single computer,” said Gofman. “We were very grateful to learn about distributed high-throughput computing and how it could solve our problem.”
The single biggest intellectual challenge, according to Gofman, is the too-interconnected-to-fail question. “There were various proposals during and after the crisis,” continued Gofman. “It’s easy to see the costs of too-interconnected-to-fail banks, but what are the benefits?”
Gofman set out to understand not only the benefits and costs of the current structure, but also what would happen if that structure were changed, for example, if regulators set a limit on the number of connections a given bank could have. How would regulation affect the efficiency and stability of the financial architecture? Would there be a trade-off between the two? From a societal point of view, how much weight should be put on efficiency versus stability?
“That’s what this project is trying to do,” said Gofman. “First, calculate the efficiency and stability of the current financial structure by uncovering the hidden links in the economy based on some observable traits. Second, calculate a model with changes to the structure and see what would happen. We want to answer the question of what will happen if we put restrictions on large interconnected banks.”
“There is no analytical solution, so you need to compute it,” said Dentler. “We used the university’s distributed high-throughput systems and OSG for around 400,000 hours, the equivalent of 45 years on one computer running 24/7. To give valid recommendations to policy makers, we need to be able to compute the model, vary the model, and try to predict what the outcomes will be. We can then recompute with new variations without the barriers we would get using a single computer.”
This is an ideal problem for distributed high-throughput computing because each node on the grid can be used to compute a different counterfactual financial architecture. No information transfer is required between different models, so each can run separately without talking to the others. Computing the same model under different parameterizations uncovers which parameters should be chosen to match the interbank trading patterns in the real market with thousands of banks. “If you can match the current market structure, you can compute the efficiency and stability of a financial architecture with too-interconnected-to-fail banks,” said Gofman. Finding the model’s parameters took the most CPU hours, Dentler noted.
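To see why this workload parallelizes so well, consider a toy sketch (not Gofman’s actual model; the parameter names, the grid values, and the closed-form “simulation” are all made up for illustration). Each candidate parameterization of the network model is simulated independently and scored against an observed trading moment, so a high-throughput system like the OSG can farm each run out to a separate node with no communication between them:

```python
# Toy illustration of an embarrassingly parallel parameter sweep.
# All names and values here are hypothetical; a real run would replace
# simulate_architecture with a full interbank-network simulation and
# submit each grid point as a separate OSG/HTCondor job.
from itertools import product
from multiprocessing import Pool

# Hypothetical observed moment: average number of trading partners
# per bank, measured from realized interbank trades.
OBSERVED_AVG_DEGREE = 3.2

def simulate_architecture(params):
    """Stand-in for one full network simulation.

    params = (connection_cost, bailout_prob) is a made-up
    parameterization. A real model would simulate trading and default
    cascades and return moments of the resulting network.
    """
    connection_cost, bailout_prob = params
    # Toy closed form: cheaper connections and likelier bailouts both
    # raise interconnectedness in this stand-in model.
    simulated_avg_degree = 5.0 * bailout_prob / connection_cost
    return params, abs(simulated_avg_degree - OBSERVED_AVG_DEGREE)

def fit_parameters():
    """Score every grid point independently; keep the best fit."""
    grid = list(product([0.5, 1.0, 1.5, 2.0],    # connection_cost
                        [0.2, 0.4, 0.6, 0.8]))   # bailout_prob
    # Each grid point runs on its own worker with no cross-talk,
    # mirroring one counterfactual architecture per grid node.
    with Pool() as pool:
        results = pool.map(simulate_architecture, grid)
    return min(results, key=lambda r: r[1])

if __name__ == "__main__":
    best_params, best_err = fit_parameters()
    print("best-fitting parameters:", best_params)
```

Because no run depends on another’s output, the only sequential step is the final comparison of scores, which is exactly the structure that let the project consume some 400,000 CPU hours spread across the grid rather than 45 years on one machine.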
“There are thousands of banks, and we don’t see what they do. We don’t know their bilateral contracts or their legal agreements. All we can see are some patterns of realized trades between them. The goal of the computational model is to uncover the trading relationships that we don’t see but that can be important for contagion in the future,” Gofman continued.
The big problem is how to predict the influence of regulation on the financial architecture given that experimenting with the real financial architecture by breaking down banks is not a feasible option. After fitting the model to match the real market, the model was recalculated under new parameters that reflect the potential regulation.
The standard argument by policy makers in favor of changing the current structure is improving stability. However, Gofman says that restricting the number of counterparties that banks are allowed to have has three adverse effects. “First,” he says, “the resource allocation process of the interbank market becomes less efficient. Second, the new financial architecture that emerges is more fragile. Third, more banks become systemically important, while identifying these banks becomes more difficult. We need to be careful about changing the current structure. A careful quantitative modeling of large complex financial networks is necessary for understanding the implications of financial regulation. Regulation that is based on simple heuristic rules might not result in a more stable financial architecture that would mitigate the severity of the next financial crisis.”
Gofman has presented his paper at various academic conferences, as well as at conferences organized by central banks in the United States and in Europe. He says it is hard to measure the paper’s impact so far. “I think the point is that large interconnected banks are perceived to be bad, but high trading capability has its benefits. If we make the financial structure more homogeneous, so that no bank is too big to fail, market efficiency will decline.”
Gofman’s research contributes to the pursuit of a more stable financial structure, and it provides a framework for understanding different possible models. The ability to run experiments on model financial architectures, rather than on the real one, is a critical step toward understanding which financial regulations are beneficial. “Many more experiments could be done,” he says. “This has been so much easier because of the availability of free resources like the OSG.”
~ Greg Moore and Sarah Engel