
Hyman Minsky and the Dilemmas of Contemporary Economic Method Duncan K. Foley

December, 1998

Department of Economics, Barnard College. After January 1999: Department of Economics, Graduate Faculty, New School University, 65 Fifth Avenue, New York, NY 10003.

Minsky’s influential work on financial fragility grows out of a recognition of the pivotal role financial markets and institutions play in converting liquid general claims to undifferentiated output into illiquid, risky, long-lived investment projects. This process requires savers, investors and intermediaries to value the prospective profit of particular investments in historically unique circumstances. Contemporary economic methodology, on the other hand, sees statistically repeatable observed events as the operational equivalents of the abstract variables of mathematical models of economic behavior. The attempts of Minsky and his
followers to express his vision in the language of mathematical and econometric modeling founder on the incompatibility of these two points of view. The inescapable relevance of Minsky’s ideas to the understanding of world political economic developments challenges us to expand received notions of scientific method applied to economics and economic data.


Hyman Minsky’s work on financial fragility and the political economy of instability in advanced capitalist economies has had more influence in the policy-making and financial communities than among academic economists. This fate of Minsky’s thought raises some basic questions about contemporary economic methodology, particularly the relations between economic theory, mathematical models, and statistical estimation. Minsky, in pursuing seriously the project of understanding the dynamics of contemporary capitalist society, ran into fundamental limitations of contemporary economic modeling technique. Given Minsky’s strong quantitative training and the nature of his early work in economics, his refusal, often remarked upon, to develop a rigorous mathematical model to express his ideas about financial instability is a sharp reminder of the limits of our current methods. The fertility of Minsky’s insights and the resonance they met in the practical worlds of finance and policy-making suggest that the
examination of Minsky’s work offers a valuable critical perspective on modern economic method. Minsky himself was aware of these dilemmas, and refers to them from time to time in explicating his ideas (for example, Minsky, 1989), but never, I think, systematically addressed the methodological crisis inherent in his work, nor did he put forward an explicit methodological alternative.

Contemporary economics is a distinctive branch of statistical social science. Statistical social science, in turn, is an attempt to adapt the methods of experimental and observational physical sciences to the analysis of data generated in the course of human social interactions. Both the general attempt to apply the methods of physical science to social data through the use of statistics, and the specific attempt of economists to provide a rigorous theoretical foundation for social statistical models raise subtle and unresolved philosophical questions, on which Minsky’s work on financial dynamics sheds a powerful, if harsh, light.

At their root these problems involve the limitation of statistical analysis to data generated by repeated similar events and the difficulty of representing dialectical transformations of systems mathematically. The phenomenon of financial fragility and its impact on the political economy of contemporary capitalism raise these problems in a particularly poignant form. Finance mediates the inescapable gap between an imperfectly imagined future and an inadequately equilibrated present that is inherent in the reproduction of capitalist economic life. While this mediation as a general phenomenon is a predictably repeated event, it always takes place in historically specific circumstances of changing technology, mores and beliefs, so that the underlying process is in statistical terms inherently nonstationary. Historical financial data thus carries at best a limited amount of information about the present. Furthermore, financial institutions, markets, and instruments are historically highly fluid and adaptable. The financial response to disequilibrating shocks takes the form of transformation and innovation of institutions and instruments as often as it takes the form of the re-pricing of existing securities. The rapid evolution of U.S. financial instruments and practices in the 1970s and 1980s, in response to the crises Minsky analyzed with such penetration, is a dramatic example. But contemporary mathematical models presuppose a fixed, pre-specified universe of traders, tradable commodities and assets.

Thus the difficulties Minsky and his followers encounter in formalizing the Minskian world view in mathematical and econometric models reflect not merely technical mathematical and econometric problems, but fundamental philosophical limitations. Since the problems of financial fragility and instability Minsky identified and worked on continue to grow in importance as they reappear on a global scale, and since Minsky’s work has evident scientific value in organizing our understanding of these issues, these limits pose an important historical
challenge to received economic method.

2 How It Is Out There

Economics studies the way human beings organize themselves to provide for their material needs and wants. As a human and social science, economics must inherently confront the existential conditions of human life, such as the unresolved puzzle we call the passage of time. As we experience it, the present, in which we have our consciousness and experience ourselves as acting, divides time curiously between a past about which we have limited information, but do not seem to be able to change by our present actions, and a future which our present actions seem to shape, but over which we have limited control. Broadly speaking, two philosophical attitudes underlie our thinking about the passage of time: the deterministic and the dialectical.

The determinist view, deeply entwined in physical science theory, sees the passage of time as the unfolding of a lawful progression of causally connected events, and attributes our human inability to predict and control the future to limitations in our understanding of the laws linking the past to the future and to our information about the current state of the world. In the determinist view the future and past are quite symmetrical, in that there “is” a definite future, just as there “is” a definite past; the quality of our uncertainty about the actual unfolding of the future is no different from the quality of our uncertainty about the specifics of the past, in that both are attributable to our limited information.

The dialectical view, on the other hand, sees the future as genuinely undetermined but always in the process of coming to be through the unfolding of the present, and particularly the unfolding of human action. In the dialectical perspective the past is fixed, despite our limited knowledge of it, in a way that the future is not. Our actions in the present can change our knowledge of the past, but not what actually happened, while these same actions actually constitute the process of shaping the future.

These general philosophical conundrums appear in economic life in recognizable, and even simplified, concrete guises. The past imposes on the present economically in the form of the stocks of concrete capital goods, knowledge, habits and information we inherit from the past. From an economic point of view it does not matter very much exactly what the history that led to this capital endowment is, since there is nothing we can do to change it (the doctrine
of “sunk costs”). The stubborn resistance of existing stocks to change spontaneously in accord with our present wishes reflects the determinate quality of the past. On the other hand, human action in the present clearly changes the stock of capital goods and information we pass on to the future. In market-oriented societies the actions that determine investment are mediated by valuing existing stocks of goods and information and speculating on the future valuation
of various possible investments. This process is the immediate metabolism of economic life. The economic moments of finance and investment express the deepest existential dilemmas.

To those who devote their lives professionally to this process it has something of a sacred mystery, partaking equally of faith and fear. Savers accumulate wealth in the form of money, general purchasing power, which can command all the huge variety of present pleasures the world offers. Yet savers know that their only hope of perpetuating their wealth into the future lies in converting it into some concrete investment that may turn out to be worthless. Like college sophomores horrified at the prospect of “limiting their options”, savers twist
and turn in this contradiction, and in the process evolve fantastically elaborate mediations to disguise the plain distastefulness of the act of concrete investment. So wealth, which could provide so much immediate and certain pleasure, goes instead to build the ninth or nineteenth shopping mall around Springfield, Massachusetts, or an impossibly expensive apartment building in Tokyo. Financiers take on the dangerous, if well-compensated, priestly role of consecrating this sacrificial activity, carving up the entrails of historical statistics for omens, and
seeking auguries in the flocking and flights of politicians.

Yet this existentially fragile moment of investment drives forward the whole process of production through which we secure our survival and reproduction. When finance and investment are proceeding smoothly and with controlled optimism, production and reproduction also flourish; when the worlds of finance and investment degenerate into hysterical over-optimism or disheveled panic, as they inevitably do, they disorder and disrupt the systems of valuation on which production itself rests.

From one point of view, these are repeated phenomena, which might be likened to the motion of the planets, or the fluctuations of the seasons. All market-oriented societies exhibit the development of financial and investment sectors, and experience the turbulence associated with them. There are recognizable similarities between the financial booms and busts of different economies and different ages, between the South Sea Bubble, the Baring crisis and the Penn Central bankruptcy. But there are also crucial differences. Each generation of savers and investors has to place its bets on its own array of technological and organizational options, in its own historical context. From the subjective point of view, each individual act of investment is a unique existential crisis, not to speak of the unfolding of a systemic boom or bust. Contemplation of the follies and triumphs of the past can perhaps console the investor with the feeling of company, but the South Sea Bubble has limited practical information relevant to the unraveling of world capital markets in 1997 and 1998.

Hy Minsky understood that this is how it is (as did John Maynard Keynes), and had the unusual philosophical courage to insist on the relevance of the dialectical point of view to the understanding of financial economics. His theses on financial fragility and instability stem from his vision of the financial process as an essentially human confrontation of the present with the future, mediated by all the psychological baggage humans carry to major life decisions: their self-deception, opportunism, and insecurity as well as their faith, imagination, and steadfastness. From this vision Minsky drew genuine and valuable insight about
real historical events, and gained some perspective on the deeper rhythms of financial life. This insight attracted to his work a constituency and an audience, a mixture of financial professionals and heterodox critics of market economies, who disagree on many aspects of capitalist society, but share a fascination with its financial spectacle.

3 How Economics Presents Matters

The high road of twentieth century economics has been the systematic combination of the methods of statistical inference and mathematical modeling. Economists’ knowledge of the real world takes the form of simplified mathematical models, in which the relevant phenomena are represented by an unchanging set of quantitative variables. Mathematical models have the great conceptual advantage that they can easily be manipulated to generate counterfactual predictions by changing the parameters of the model. Since one of the main things we want to know about the world is how it would have been (or would be) if we took (or take) some alternative course of action, the predictions of mathematical models satisfy a pressing demand. There are, however, many mathematical models that can plausibly represent relevant aspects of economic reality. Contemporary economics proposes to choose among them by testing their explanatory power over actual observed data. To accomplish this, it is necessary to stipulate a set of rules for connecting the abstract variables of a mathematical model to observable data. Given these rules, the counterfactual predictions of a mathematical model can be interpreted as predicted correlations among observed variables. The underlying mathematical model is validated by econometric technique when at least some observed data can be shown to exhibit at least some of the correlations predicted by the model under at least one set of rules for connecting the mathematical variables to the observed data.

The methodological weaknesses of this procedure turn up every day in econometric practice. It is easy to quarrel over the rules connecting the abstract quantities of mathematical models to observed data (as in the case of measurement of the money supply, for example). Even with agreed-upon rules for operationalizing models, the available data may be compatible with a wide range of mathematical models that have contradictory counterfactual implications (as
the debates over identification problems illustrate). As a result, econometric tests have limited power to dislodge economic models that reflect strongly and widely held prior prejudices. Models get into trouble only when they exhibit gross anomalies that show up under robust statistical procedures in a wide range of data (as in the case of the real business cycle attempts to explain fluctuations in unemployment).

Beyond these chronic methodological weaknesses in contemporary economic practice, however, lies a philosophical limitation. Both the practice of mathematical modeling and the procedures of statistical inference are strongly biased against a dialectical understanding of the unfolding of time.

In the case of mathematical models, this bias arises because they represent change only as the quantitative variation of a given set of variables. A central feature of the dialectical vision, by contrast, is the emergence of qualitatively new phenomena as a systematic response to the contradictions of existing structures. In the field of financial economics, for example, the market responded to the rise in interest rates of the late 1960s and early 1970s not just through a reallocation of portfolios over the existing spectrum of assets, but also through the invention of new assets and intermediaries. In a purely formal sense it might be possible to shoehorn this phenomenon into a mathematical model by positing the latent existence of these assets and intermediaries, and treating their emergence as a quantitative change from zero levels of trading to positive levels. But this is at best an ex post rhetorical maneuver to save the methodology, not a serious scientific strategy, since no sane modeler would venture to include all possible evolutions of the commodity or asset space in a usable model.

The philosophical bedrock of statistical inference is the assumption that the future will be like the past, so that we can use repeated observations of past events to infer at least some features of the future. This essential assumption of statistical method renders it unable to deal with qualitative change in interactions, and therefore unable to come to grips with a dialectical understanding of the world. Contemporary econometric practice is fertile in devices to minimize or evade this problem. Once one recognizes the pervasiveness of qualitative change, for example, in macroeconomic or financial interactions, it is tempting to save the statistical point of view by retreating to a higher perspective, and segregating the data points into separate “regimes”, each of which has its own statistical order. There is nothing logically wrong with these maneuvers, but the complexity of economic and financial reality sharply tests their practical usefulness. Once regimes are allowed to multiply, they rapidly deplete the degrees of freedom necessary to draw strong statistical inferences.
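The degrees-of-freedom arithmetic behind this objection is easy to make concrete. The sketch below is purely illustrative (the sample size, regime counts, and parameter counts are invented, not drawn from any study): splitting a fixed macroeconomic sample into ever more regimes, each with its own parameter set, quickly leaves each regime too few residual degrees of freedom to support strong inference.

```python
# Illustrative arithmetic only: how regime-splitting depletes degrees
# of freedom. All numbers are invented for the example.

def dof_per_regime(n_obs, n_regimes, params_per_regime):
    """Residual degrees of freedom left in each regime if the sample
    is split evenly and each regime gets its own parameter set."""
    per_regime = n_obs // n_regimes
    return per_regime - params_per_regime

n_obs = 120   # e.g. thirty years of quarterly data
params = 6    # parameters in each regime's sub-model

for k in (1, 2, 4, 8):
    print(k, "regime(s):", dof_per_regime(n_obs, k, params), "residual d.o.f. each")
# 1 regime leaves 114 residual degrees of freedom; 8 regimes leave only 9 each.
```

The same fixed sample that comfortably identifies one statistical order is stretched very thin when it must identify eight.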

Contemporary econometric practice is most comfortable with a representation of the passage of time as the unfolding of a determinate future governed by the same laws and statistical regularities as the past. This leads economists to represent the moments of finance and investment as problems in forward-looking statistical inference: investors appear in economic models as merely imperfectly informed decision makers. The results of their actions are represented as the potentially insurable consequences of statistical variation. The future in contemporary economics is knowable, though imperfectly known. In a formal sense this strategy can accommodate any degree of irregularity in the operation of financial markets, but its fidelity to the actual conditions in which financial and investment decisions take place is questionable. Financial professionals eagerly consume the information generated from econometric models, because, like all information, it is potentially valuable, but are restless with the contention that their work is to divine a determinate future rather than to create an undetermined one.

4 Minsky’s Methodological Dilemma

Hy Minsky was an able mathematical economist with a mastery of modern statistical method. Yet his most influential work largely eschews sophisticated mathematical and econometric methods. There is much to be learned from contemplating the dilemma he faced and the practical resolution of it he accomplished in his scientific career.

Minsky recognized important repeated patterns in the evolution of financial markets. For example, he calls attention to the gradual shift during expansions from extreme caution (hedge finance) through boldness (speculative finance) to abandon (Ponzi finance). This gradual loosening of financial restraint is characteristic of all capitalist booms, and plays an important role in supporting and extending boom conditions. But the exact form the progression takes and the mechanisms that support it differ from one historical episode to the next. In one
boom the speculative vehicle may be equities, in another real estate, in another speculation in high-profit margin foreign investment. Investors can work themselves into perilous positions by borrowing to buy stock on margin, or through derivatives trades, or by depending on political promises of exchange rate stability to hedge positions across different currencies. Minsky can call our attention with hindsight to the structural similarities of these episodes but cannot give us the power to recognize the next concrete form the progression will take. Awareness of these structural similarities is surely valuable knowledge, rooted in a
scientific recognition of a kind of repeatable and therefore partly predictable phenomenon. But the qualitative metamorphoses of financial fragility are at a far remove from the apparent quantitative regularities of growth rates of GDP, or inter-temporal correlations of business cycle indicators.

When Minsky’s followers try to translate his vision into mathematical models, they face a series of methodological riddles. It is not easy to formulate a single, generic range of assets to represent the multifarious vehicles for the financial maneuvers that lie behind financial fragility. The model needs to be able to represent a shift in the average riskiness of positions. This presents a challenge to conventional portfolio analysis, which largely analyzes the distribution of a given pattern of risk among asset holders, or its pricing by a single representative asset holder. It is not clear exactly where to locate a parameter to represent the financial boldness of investors. In Minsky’s discourse, the shift toward more
exposed financial positions is not simply a psychological phenomenon based in the increasing optimism or level of denial of investors (though that is surely part of the process), but involves strong competitive pressures on individual investors to conform to group norms that are themselves shifting. Surveying episodes of financial fragility from the perspective of the rubble left by the eventual meltdown, we are predictably shocked and amazed by the willingness of responsible and experienced bankers or portfolio managers to continue to pour money into
already gorged emerging markets, or negotiate even more improbably enormous loans with insolvent states. But if all the other banks or funds are generating exciting levels of fees and initial returns from these commitments, the manager who swims against the tide frequently lands in a department far away from the hot action and big bonuses long before the crash reveals her or his wisdom in retrospect.

Most attempts to model financial fragility mathematically settle on the procedure of positing a quantitative variable representing the risk tolerance of investors that appears as a shift parameter in asset demand functions. (See Taylor and O’Connell, 1985, and Sethi, 1992 for two closely argued examples and references to further literature in this vein.) This variable itself is assumed to evolve in response to the history of the modeled system. It is possible to create models of this kind in which the instability of positive feedback between rising asset prices and higher risk tolerance leads to catastrophes or cyclical or chaotic
trajectories. These models give us some general insight into the possible consequences of shifting asset preferences, but inevitably lose most of the richness of Minsky’s account in the translation to mathematical language.
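A deliberately stylized sketch can show the kind of trajectory these models generate. The system below is not Taylor and O'Connell's or Sethi's model; it is the Van der Pol normal form, chosen only because it is among the simplest dynamics that is locally unstable (small deviations are amplified, as rising prices feed risk tolerance) yet globally bounded (the feedback saturates at large deviations), so the simulated economy neither settles back to equilibrium nor explodes, but cycles persistently.

```python
# Stylized feedback between an asset-price deviation x and its pace of
# revaluation y. The parameter values are illustrative only, not taken
# from any published Minsky model.

def simulate(x0=0.1, y0=0.0, mu=0.5, h=0.005, steps=40_000):
    x, y = x0, y0
    path = []
    for _ in range(steps):
        dx = y
        dy = -x + mu * (1.0 - x * x) * y   # feedback weakens as |x| grows
        x, y = x + h * dx, y + h * dy      # simple Euler step
        path.append(x)
    return path

path = simulate()
# A small initial deviation is amplified, then settles into a bounded
# cycle rather than converging back to the equilibrium at zero.
print(min(path), max(path))
```

The point is qualitative: positive feedback plus nonlinear saturation yields endogenous, self-sustaining fluctuation, with no external shock required.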

Furthermore, it is inherently difficult to calibrate these models to statistical data because of the key role played by the unobservable variable representing the shift in asset demands. It is tempting to associate this variable empirically with some measure of financial market stress, such as the difference in yield between privately issued bonds subject to default risk and government issued bonds free of default risk. But from a conventional modeling point of view these market measures are themselves endogenous to the financial process, so that they are unsuitable representatives of exogenous shift factors in the underlying mathematical model.

Even without the complications introduced by the need to pass from a mathematical model to an operational and estimable econometric model, Minsky’s work raises significant problems concerning the quantitative measurement of financial fragility. Many financial statistics, such as the flow of funds tables, are essentially the statistical means of variables aggregated across a population of institutions or markets. These averages show rather low variability over the business cycle, even in periods where qualitative evidence for financial fragility
is quite persuasive. What seems to be happening is that the drift toward exposed financial positions the financial fragility hypothesis posits takes place through a shift in the statistical distribution of balance sheets which has a smaller impact on the center of the distribution than on its tails. When financial crisis or pressure begins to tell, it hits a few highly exposed firms or sectors first, and only gradually spreads through the whole system. Thus there may be some reason to think that a research program focused on the higher moments of the distributions of key financial variables may shed more quantitative light
on financial fragility.
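A toy numerical illustration of this point, with invented balance-sheet figures rather than real flow-of-funds data: two cross-sections of firm leverage ratios can have identical means while fragility concentrates in the upper tail, so that aggregate mean-based statistics register no change at all.

```python
# Two illustrative cross-sections of firm leverage ratios with the SAME
# mean; in the second, exposure has migrated into the upper tail.
# All figures are invented for the example.

calm    = [0.8, 0.9, 1.0, 1.0, 1.1, 1.1, 1.2, 1.2, 1.3, 1.4]
fragile = [0.6, 0.7, 0.8, 0.9, 1.0, 1.0, 1.1, 1.2, 1.7, 2.0]

def mean(xs):
    return sum(xs) / len(xs)

def upper_tail(xs, k=2):
    """Average leverage of the k most exposed units."""
    return mean(sorted(xs)[-k:])

print(mean(calm), mean(fragile))              # identical centers
print(upper_tail(calm), upper_tail(fragile))  # diverging tails
```

An aggregate series built from the means would show a perfectly flat line across the two dates, while the tail statistic registers the drift toward exposure.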

But there are limits to all balance sheet based data as reflections of Minskian fragility, because some of the central characteristics of assets are not and perhaps cannot be reported on balance sheets. Take the issue of “nonperforming loans” of banks, for example. Bank examiners require banks to set aside capital reserves to offset potentially worthless loan assets. (The failure of Asian bank examination systems to enforce this procedure consistently is one of the forms of “nontransparency” on which some commentators blame the Asian crisis of
1997–98.) These reserves are a kind of quantitative measure of the qualitative status of the bank’s loans. But the reserves and judgements of potential “performance” of loans are administrative evaluations, not market-generated data. Bank examiners may tend to fall behind the curve of the evolution of the market, either in underestimating or overestimating the actual value of assets. In a serious financial crisis and panic, for example, when the chips are down on the question of financial fragility, it is almost certain that some loans examiners
view as sound will fail to perform as a result of disruptions spreading through the financial and production system.

In view of these problems, I am not surprised that Minsky largely eschewed the project of formalizing the financial fragility hypothesis as a mathematical and econometric model. The pursuit of this project would have diluted the powerful insights of the financial fragility hypothesis in a host of ways. The general hypothesis would have become identified with a particular mathematical model and procedure for operationalizing the relevant concepts, and its scientific status would have become hostage to the performance of that particular instantiation of the ideas. Since the financial fragility hypothesis calls into question the laissez-faire dogma of the superior informational and resource-allocating performance of markets around which mainstream economics unifies itself, any particular operationalization would become a target of critical work searching out its weaknesses. Minsky may have been justifiably wary of pursuing this rocky road.

But I think the deeper reason for Minsky’s reluctance to formalize his ideas in line with econometric fashion was his recognition that the formal, statistical methods adopted by contemporary economists are inherently hostile to critical and qualitative insights into the performance of markets as human and social institutions. In this sense Minsky was loyal to what he knew from his voracious reading of financial and economic history, from his experience as a bank director, and from his lifelong astute observation of the drama of financial market instability and policy reaction to it. If mathematical or econometric models failed to confirm these insights, so much the worse for the models. Why make hard won insights hostage to methodologies that approach subtle questions with blunt instruments?


And what about those blunt instruments? Is it possible to improve the traditional methods of mathematical modeling and econometrics to embrace some or all of the dialectical point of view? While there are some constructive steps to be taken in this direction, it is doubtful that technical improvement alone can address the deeper problems involved here.

Perhaps the most pressing shortcoming of current mathematical modeling and econometric practice is its excessive reliance on assumptions of linearity in the relations it studies. The last fifteen years have seen tremendous progress in the introduction of nonlinear methods into economic and econometric models. We now understand better the interplay between local and global stability in economic models, and the mathematical modeling toolbox has been greatly
enriched by the addition of the ideas of bifurcation analysis and nonlinear dynamical systems. Nonlinear models can come closer to reflecting dialectical insights. For example, the emergence of new equilibria in bifurcations bears some resemblance to the dialectical notion of the qualitative evolution of a system. The existence of multiple equilibria in nonlinear systems and the consequent division of the state space into multiple basins of attraction allow small shocks to have large global impacts. As I remarked above, these tools have been the main
route through which economists have tried to formalize the insights of Minsky’s financial fragility hypothesis. Despite the fertility and beauty of nonlinear mathematics, it is doubtful that these techniques, which remain firmly rooted in the assumption of an unchanging underlying state space, can completely express the fluidity of human social interactions.
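The emergence of new equilibria in bifurcations mentioned above can be made concrete with the textbook pitchfork normal form (a generic mathematical example, not a financial model): as a parameter passes smoothly through a critical value, the number of equilibria jumps from one to three.

```python
# Equilibria of the pitchfork normal form  dx/dt = r*x - x**3.
# For r <= 0 the only equilibrium is x = 0; as r crosses zero, two new
# equilibria at +/- sqrt(r) appear: a qualitative change in the system
# produced by a smooth quantitative change in one parameter.

import math

def equilibria(r):
    """Real roots of r*x - x**3 = 0."""
    if r <= 0:
        return [0.0]
    s = math.sqrt(r)
    return [-s, 0.0, s]

for r in (-1.0, 0.0, 1.0):
    print(r, "->", equilibria(r))
# -1.0 -> [0.0]
#  0.0 -> [0.0]
#  1.0 -> [-1.0, 0.0, 1.0]
```

This is the closest the fixed-state-space toolbox comes to qualitative novelty: the new equilibria were latent in the equations all along, which is precisely the limitation noted below.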

Nonlinear econometrics has also made important advances in recent years, with the salubrious effect of revealing how much of what we thought we knew from empirical studies reflects researchers’ assumptions of linearity rather than the underlying structure of the data. One helpful effect of nonlinear perspectives is to make econometric investigation more cautious in its claims. The use of nonlinear and nonparametric methods has revealed important nonlinear structures in economic data. But nonlinear and nonparametric methods also underline how limited the empirical basis for identifying these structures actually is. The assumption of parametric linearity (as in linear regression models) tends to exaggerate the resolving power of data over hypotheses by averaging correlations over different domains. Thus relations which are well-identified in a small region of the domain of independent variables appear to be equally well identified by linear methods over the whole domain. Nonlinear and nonparametric methods, on the other hand, limit their inferences from each subset of the domain to that local region. The nonlinear perspective in this respect helps to reconcile econometric approaches to financial and macroeconomic problems with the qualitative insights of financial fragility analysis. Linear econometric studies appear to identify robust and pervasive relations in the data that are incompatible with the dialectical insights of financial fragility, while nonlinear studies of the same phenomena lead to more modest empirical claims which are more compatible with qualitative insights.
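The averaging effect described above can be reproduced in a few lines with synthetic data: a relation that is curved over the whole domain, fitted by a single global ordinary-least-squares line, reports a slope near zero, while separate fits on each half of the domain recover the opposing local slopes.

```python
# Synthetic data: y = x**2 sampled on [-1, 1]. A single global linear
# fit averages the negative slope on the left half against the positive
# slope on the right, reporting a slope near zero and missing the
# structure entirely; fitting each half separately recovers it.

def ols_slope(xs, ys):
    """Ordinary-least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

xs = [i / 50 for i in range(-50, 51)]   # grid on [-1, 1]
ys = [x * x for x in xs]

left  = [(x, y) for x, y in zip(xs, ys) if x < 0]
right = [(x, y) for x, y in zip(xs, ys) if x > 0]

print("global slope:", ols_slope(xs, ys))       # ~ 0
print("left slope:  ", ols_slope(*zip(*left)))  # ~ -1
print("right slope: ", ols_slope(*zip(*right))) # ~ +1
```

A researcher committed to the global linear specification would conclude, wrongly, that x has no systematic relation to y; the local fits make more modest but more accurate claims.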

Another important extension of traditional econometric techniques is the explicit study of the higher moments of distributions of financial and economic variables across populations. Traditional econometrics tends overwhelmingly to assume that relevant distributions are Gaussian, and that information about means and variances is thus sufficient to characterize the whole distribution. As I indicated above, many of the effects of financial fragility are manifested in changes in the spreads of distributions that do not affect the mean much,
or at all, and that escape methods that assume constant variance over the sample period. As in the case of nonlinear methods, the extension of econometric techniques to a wider class of parametrically defined distributions or to nonparametrically estimated distributions greatly widens the range of empirical phenomena that we can recognize. But the apparent success of Gaussian methods in identifying structure in empirical data rests heavily on the a priori
assumption of normality of the underlying distributions. With the abandonment of this assumption, it becomes much more difficult to identify structure reliably in the relatively small samples of macroeconomic time series.

There is a long minority tradition in data analysis, with which Minsky’s work has considerable affinity, which has regarded the methods of linear, Gaussian, statistical inference with skepticism. This tradition puts its money on accounting identities, which by definition need not be estimated, only measured. A great deal can be learned about the evolution of contemporary economies by a careful study of accounting data. Since accounting methods are constructed to reflect actual developments with as much fidelity as possible, they have no bias toward representing the economic system as regular or stable. Thus it is possible to go a considerable distance toward joining the dialectical and accounting approaches to economic analysis. There always remains, however, an unbridgeable gap: accounting methods are inherently limited to measuring the outcomes of the system ex post, that is, after the fact and the resolution of the contradictions implicit in its motion. The dialectical point of view, on the other hand, strives to represent the world with its contradictions and possible futures
intact, and thus to allow a discussion of such inherently ex ante issues as the
appropriateness of policy.
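The contrast between estimation and measurement can be put in a line of arithmetic. The sectoral balances below are invented round numbers, used only to illustrate that a flow-of-funds identity holds ex post by construction of the accounts, with no parameters to estimate.

```python
# Hypothetical, made-up sectoral balances (shares of GDP) illustrating the
# flow-of-funds identity: private balance + government balance + foreign
# balance = 0. Nothing here is estimated; the identity holds by construction.

private_balance    =  0.021   # private saving minus private investment
government_balance = -0.035   # government deficit
foreign_balance    =  0.014   # net foreign lending to the home economy

residual = private_balance + government_balance + foreign_balance
print(residual)  # zero up to rounding: the accounts close ex post, by definition
```

What the identity cannot supply is exactly what the dialectical view asks for: which of the possible ex ante configurations of plans and expectations produced these realized balances, and which alternatives were foreclosed along the way.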

While the development of more sophisticated and more modest mathematical and econometric methods to attack problems of the type that preoccupied Minsky is surely worthwhile, it is unlikely that this path will lead to a satisfactory synthesis of the dialectical and mathematical/statistical points of view in the foreseeable future. Lacking such a synthesis, financial and economic analysis could still benefit from two changes in methodological style.

First, we could acknowledge more directly the limits of traditional mathematical and econometric modeling techniques in attacking inherently dialectical problems in social science. The point here is not to abandon these methods altogether, since their utility in organizing masses of statistical information and clarifying our thinking about a huge range of complex policy problems is unquestionable. But the power and utility of these traditional methods are restricted to answering a particular type and range of questions, and this fact needs to
be more widely appreciated. Mathematical models are tools to clarify our understanding of simplified, imaginary systems that we hope represent coherent aspects of a complex economic reality. They are possibly the only reliable tool for exploring the response of such imaginary worlds to parametric changes. Since we depend heavily on simplified imaginary metaphors to grasp real complexity, mathematical modeling is an indispensable moment in the development of our economic knowledge. But the insights mathematical models offer should not be exaggerated into a parody of science that claims that roughly calibrated models are “true”, or recklessly proposes to extrapolate their implications far beyond the natural limits of the metaphor on which they are based. Similarly, statistical techniques can tell us whether a given data set can resolve the magnitude of an empirical effect on the basis of a particular operationalization of a theoretical concept and within the limits of a particular stipulated statistical methodology. This is also important information, especially in cases where the result is relatively robust against changes in the operationalization of the concept and the
assumptions of the statistical model, and is thus relatively uncontroversial. But the ability of statistical studies to answer this important type of question does not authorize us to elevate statistical correlations into facts of nature or society that transcend particular time periods or institutional structures.

Second, it would be well for economic and financial scholars to recognize more generously the validity and importance of the insights won through dialectical investigation, like Minsky’s financial fragility hypothesis. This is also real human knowledge, and people who understand its discoveries are in a better position to understand historical developments, to manage their personal affairs, and to make economic and financial policy. The dialectical tradition in economic analysis serves, in any case, as the (too often unacknowledged) seedbed of ideas out of which mathematical and econometric models are constructed. There are surely important methodological problems that need to be thought through on the dialectical side as well. Given the historically specific and institutionally fluid nature of the dialectical vision, what types of arguments can its practitioners offer to recommend their claims? Here the financial fragility hypothesis, and Minsky’s development of the discourse based on it, offer some promising suggestions. Minsky had a lively sense of the importance of validating his insights with historical examples and parallels, and of linking general patterns to a detailed examination of institutional reality and the unfolding of particular episodes of financial life. Without this thick texture of well-informed argument, the financial fragility hypothesis would have been less persuasive and less well-crafted as a part of economic knowledge.

The real knowledge that resides in qualitative, dialectical insights is recognized in the practice of many successful contemporary policy makers. Good central bankers, for example, supplement whatever they can glean from state-of-the-art econometrics and mathematical models with information about the details and specifics of institutional evolution and market fluctuations, and a strong sense of historical perspective, precedent, and timing. A stronger dialogue between dialectical and mathematical/statistical perspectives and methods would bring academic discussion closer to the norms of financial and policy-making practice, and provide a stronger base for informed decisions. It would be well for economic and financial scholars to take Minsky’s work and methodological dilemma seriously. The seeds of future economic knowledge are most likely to be found in the fields Minsky tilled.

Minsky, H. 1989. Comments on Friedman and Laibson. Brookings Papers on Economic Activity, 2, 173–182.
Sethi, R. 1992. Dynamics of learning and the financial instability hypothesis. Journal of Economics (Zeitschrift für Nationalökonomie), 56(1), 39–70.
Taylor, L., and S. A. O’Connell. 1985. A Minsky crisis. Quarterly Journal of Economics, 100, 871–886.
