Financializing Intelligence. On the Integration of Machines and Markets

Orit Halpern
Cite as
Halpern, Orit: "Financializing Intelligence. On the Integration of Machines and Markets". carrier-bag.net, 28 March 2025. https://doi.org/10.59350/vcwb7-hgt49.

“Noise in the sense of a large number of small events is often a causal factor much more powerful than a small number of large events can be. Noise makes trading in financial markets possible, and thus allows us to observe prices for financial assets… We are forced to act largely in the dark …” (Fischer Black 1986).#noise

In 1986, Fischer Black, one of the founders of contemporary finance, made a rather surprising announcement: bad data, incomplete information, wrong decisions, excess data, and fake news all make arbitrage possible. In his famous article "Noise," Black posited that we trade and profit from misinformation and information overload. Because it assumes that a large number of 'small' events networked together is far more powerful than large-scale planned events, the vision of the market here is not one of Cartesian mastery or fully informed decision makers. Noise is the very infrastructure for value.#noise

In an age of meme-driven speculation, NFTs, and democratized options trading, such a statement might seem common sense. Even natural. Does anyone, after all, really think a cryptocurrency named as a joke after a small dog, or an almost bankrupt mall-based game retailer, is intrinsically worth anything? Much less billions of dollars? Of course they do. For the past few years, great fortunes and major funds have collapsed and risen on just such bets. In retrospect everyone seems to have perfect clarity about 'value' investing, but apparently at the time, no one does. "Irrational exuberance," to quote Federal Reserve Board Chairman Alan Greenspan on the dot-com boom of the late 1990s, might be the term. But Greenspan might have gotten it wrong on one point. Irrational exuberance was not market failure but market success.#crypto currency, #market

Fischer Black, who not incidentally was a student of Marvin Minsky and spent a lot of time thinking about intelligence, artificial or otherwise, was one of the inventors of one of the world's preeminent trading instruments, the Black-Scholes options pricing model. For Black, "irrationality" was not an exception but a norm: the very foundation of contemporary markets. Noise, Black argued, is about a lot of small actions networked together accumulating into greater effects on price and markets than large, singular, or perhaps planned events. Noise is the result of human subjectivity in systems with too much data to really process. Not incidentally, perhaps, Black was also discussing a new technology, the derivative pricing equation, whose execution at scale demanded large infrastructures of high-speed networked digital computers. Noise is also the language of mathematical theories of communication, betraying the genealogy that links contemporary finance to computing and, even more specifically, machine learning.#Fischer Black

While such statements might seem the territory of a select few in finance, they in fact reflect an attitude that is ubiquitous today and integrated into our smartphone trading apps and social networks: namely, the idea that we are all networked together to make collective decisions within frameworks of self-organizing systems that cannot be perfectly regulated or guided. Furthermore, we have come to believe that human judgement is flawed, and that this is not a problem but a frontier for social networks and artificial intelligence.#network, #self-organizing system

The options pricing model also exemplifies a broader problem for economists of finance: namely, that theories or models, to paraphrase Milton Friedman, "are engines not cameras" (paraphrased by MacKenzie 2006, 11). One way to read that statement is that the model does not represent the world but makes it. Models make markets. Models in finance are instruments such as a derivative pricing equation or an algorithm for high-speed trading. There are assumptions built into these technologies about gathering data, comparing prices, betting, selling, and timing bets, but not about whether that information is correct or 'true,' or whether the market is mapped or shown in its entirety. These theories are tools, and they let people create markets by arbitraging differences in prices without necessarily knowing everything about the entire market or asset.#statistical model, #market

These financial models are, to use Donna Haraway's term, "god-tricks." They perform omniscience and control over uncertain, complex, and massive markets. They are also embodiments of ideology—mainly that markets can neither be regulated nor planned. These instruments naturalize and enact an imaginary in which markets make the best decisions about the allocation of value without planning by a state or other organization. In what follows I hope to trace how neoliberal theory, psychology, and artificial intelligence intersected to produce the infrastructure for our contemporary noisy trading. If today we swipe and click as a route to imagined wealth, we should ask how we have come to so unthinkingly and unconsciously accept the dictates of finance and technology.#Donna Haraway, #market, #noise

Networked Intelligence

The idea that human judgement is flawed (or corrupt) and that markets could neither be regulated nor fully predicted and planned has long been central to the automation and computerization of financial exchanges. Throughout the middle of the twentieth century, increased trading volumes forced clerks to fall behind on transaction tapes and often omit or fail to enter specific prices and transactions at particular times. Human error and slowness came to be understood as untenable and “non-transparent,” or arbitrary in assigning price (Kennedy 2017).#transaction, #automation

In the case of the New York Stock Exchange, for example, there were also labor issues. Managers needed ways to oversee and monitor labor, particularly lower-paid clerical work. As a result, computerized trading desks were introduced to the NYSE in the 1960s. These computerized systems were understood as being algorithmic and rule-bound. The more automated the market, the thinking went, the more rule-bound it would become. Officials also thought computing would save the securities industry from regulation: if computers followed the rules algorithmically, there was no need for oversight or regulation (ibid.).#division of labor, #automation

This belief in the rationality and self-regulation of algorithms derived from a longer neoliberal tradition that reimagined human intelligence as machinic and networked. According to the Austrian-born economist Friedrich Hayek, writing in 1945: #neoliberal

“The peculiar character of the problem of a rational economic order is determined precisely by the fact that the knowledge of the circumstances of which we must make use never exists in concentrated or integrated form, but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess. The economic problem of society is thus not merely a problem of how to allocate ‘given’ resources—if ‘given’ is taken to mean given to a single mind which deliberately solves the problem set by these ‘data.’ It is rather a problem of how to secure the best use of resources known to any of the members of society, for ends whose relative importance only these individuals know. Or, to put it briefly, it is a problem of the utilization of knowledge not given to anyone in its totality” (Hayek 1945, 519-529).#data, #neoliberal, #Friedrich A. Hayek

Human beings, Hayek believed, were subjective, incapable of reason, and fundamentally limited in their attention and cognitive capacities. At the heart of Hayek’s conception of a market was the idea that no single subject, mind, or central authority can fully represent and understand the world. He argued that “The ‘data’ from which the economic calculus starts are never for the whole society ‘given’ to a single mind… and can never be so given” (ibid.). Instead, only markets can learn at scale and suitably evolve to coordinate dispersed resources and information in the best way possible.#market, #data, #Friedrich A. Hayek

Responding to what he understood to be the failure of democratic populism that resulted in Fascism and the rise of Communism, Hayek disavowed centralized planning and the centralized state. Instead, he turned to another model of both human agency and markets. First, Hayek posits that markets are not about matching supply and demand, but about coordinating information.[1] Second, Hayek's model of learning and "using knowledge" is grounded in the idea of a networked intelligence embodied in the market, which can allow the creation of knowledge outside of and beyond the purview of individual humans: "The whole acts as one market, not because any of its members survey the whole field, but because their limited individual fields of vision sufficiently overlap so that through many intermediaries the relevant information is communicated to all" (Hayek 1945, 526). And third, the market therefore embodies a notion of cognition and decision that I would call "environmental intelligence," in which the data upon which such a calculating machine operates is dispersed throughout the society, and where decision making is a population-grounded activity derived from but not congruent with individual bodies and thoughts.#planning, #market, #Intelligence, #Friedrich A. Hayek

Hayek's idea of environmental intelligence was inherited directly from the work of the Canadian psychologist Donald O. Hebb, who is known as the inventor of the neural network model and of the theory that "cells [neurons] that fire together wire together." In 1949, Hebb published The Organization of Behavior, a text that popularized the idea that the brain stores knowledge about the world in complex networks or "populations" of neurons. The research is today famous for presenting a new concept of functional neuroplasticity, which was developed through working with soldiers and other individuals who had been injured, had lost limbs, or had been blinded or deafened by proximity to blasts. While these individuals suffered changes to their sensory order, Hebb noted that the loss of a limb or a sense would be compensated for through training. He thus began to suspect that neurons might rewire themselves to accommodate the trauma and create new capacities.#Donald O. Hebb, #neuron, #neuroplasticity

The rewiring of neurons was not just a matter of attention, but also of memory. Hebb theorized that brains don't store inscriptions or exact representations of objects, but instead patterns of neurons firing. For example, if a baby sees a cat, a certain group of neurons fire. The more cats the baby sees, the more a certain set of stimuli becomes related to this animal, and the more the same set of neurons will fire when a "cat" enters the field of perception. This idea is the basis for contemporary ideas of learning in neural networks. It was also an inspiration to Hayek, who in his 1952 book The Sensory Order openly cited Hebb as providing a key model for imagining human cognition. Hayek used the idea that the brain is comprised of networks to remake the very idea of the liberal subject. The subject is not one of reasoned objectivity, but is rather subjective, with limited information and a limited capacity to make objective decisions.#neuron, #perception, #Donald O. Hebb
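The cat-and-baby example can be rendered as a toy calculation. The sketch below is merely illustrative, and its pattern, number of 'sightings,' and threshold are invented for the example rather than drawn from Hebb or Hayek: repeated co-activation strengthens connections, so that a partial stimulus later re-evokes the whole stored pattern.

```python
import numpy as np

# Toy illustration of the Hebbian idea described above (not Hebb's or Hayek's
# own formalism): units that are repeatedly active together strengthen their
# connections, so a partial stimulus later re-evokes the whole pattern.

n = 8
weights = np.zeros((n, n))                              # connection strengths
cat = np.array([1, 1, 0, 1, 0, 0, 1, 0], dtype=float)   # a made-up "cat" pattern

for _ in range(20):                      # each "sighting" of the cat
    weights += np.outer(cat, cat)        # co-active units strengthen their links
np.fill_diagonal(weights, 0)             # no self-connections

partial = cat.copy()
partial[1] = 0                           # degraded input: part of the pattern missing
recalled = (weights @ partial > 0).astype(float)
print(recalled)                          # the full "cat" pattern re-emerges
```

Nothing in the sketch defines what a cat 'is'; the association exists only in the accumulated connection strengths.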

The concept of algorithmic, replicable, and computational decision making that was forwarded during the Cold War was not the model of conscious, affective, and informed decision making privileged since the democratic revolutions of the eighteenth century (Erickson et al. 2015). But if Cold War technocrats were still experts with authority and predictive capacities, the ignorant and partially informed individual that Hayek presents us with is not.#Cold War

Hayek thus reconceptualized human agency and choice neither as informed technocratic guidance nor as the freedom to exercise the reasoned decision making long linked to concepts of sovereignty. Rather, he reformulated agency as the freedom to become part of the market or network. He was very specific on this point: theories of economy or politics based on collective or social models of market making and government were flawed in privileging the reason and objectivity of the few policy makers and governing officials over the many. This privileging, he deduced, results in Communism or Fascism. The state making plans quells the ability of minorities, in his view, to take independent action. Freedom, Hayek elaborated, was therefore not the result of reasoned, objective decision making, not the technocratic elite decision maker with volumes of data objectively and emotionlessly analyzed, but rather freedom from coercion. Coercion often came to mean the effort to exclude individuals from chosen economic activities and markets. When linked to his discussions of subjectivity, ignorance, and the market as the only mechanism for making reasoned decisions as a collective, one can trace the bedrock of an argument against policy-directed forms of equity making or civil rights, and the assertion that all rights and freedoms are protections from the state, not services or support from the state. While in theory preserving the 'freedom' of an individual to participate equally in any market could be viewed as supporting the necessity of legal and humane infrastructures that allow all individuals this access, neoliberal thinking and the Republican Party did not interpret it in this direction (Hayek 1960).#Friedrich A. Hayek, #market, #freedom, #civil rights, #economics

The main point here is that neoliberal models of human agency, freedom, and markets reformulated ideas about intelligence, reason, and decision making. These reformulated ideas reflected and refracted, as we will see, ideas of networked computing, neural networks in psychology, and machine learning, ultimately infrastructuring contemporary understandings of networks, finance, and artificial intelligence. This genealogy also reveals that models have politics and are socially embedded. These models of networked decision making aided and abetted broader political movements invested in countering other ideas of human agency and freedom, including civil rights.#neoliberal, #neuron, #AI

Models and Machines

Neoliberal theory posited the possibility that markets themselves possess reason or some sort of sovereignty: a reason built from networking human actions into a larger collective without planning and, theoretically, without politics. The market can thus be understood as a sort of decision making machine, returning us to Milton Friedman's statement that economic models are engines, not cameras. But if markets and minds are engines, what type of machines would they be?#statistical model, #market

Efforts to produce digital computing and machine learning had long been related to economics and psychology. Whether in markets, machines, or human minds, particularly in the post-war period, many human, social, and natural sciences came to rely on models of communication and information related to computing. Models of the world such as those embedded in game theory reflected emerging ideas about rationality separated from human reason, and managing systems, whether political or economic, came to be understood as a question of information processing and analysis (Halpern and Mitchell 2023).#machine learning, #economics

Models of minds and machines took a dramatic turn in 1956, when a group of computer scientists, psychologists, and other researchers embarked on a project to develop machine forms of learning. In a proposal for a workshop at Dartmouth College written in 1955, John McCarthy labelled this new concept "artificial intelligence."#John McCarthy, #AI

While many of the participants, including Marvin Minsky, Nathaniel Rochester, Warren McCulloch, Ross Ashby, and Claude Shannon, focused on symbolic and linguistic processes, one model focused on the neuron. A psychologist, Frank Rosenblatt, proposed that learning, whether in non-human animals, humans, or computers, could be modeled on artificial, cognitive devices that implement the basic architecture of the human brain (Rosenblatt 1962).#Frank Rosenblatt, #perception

In his initial paper detailing the idea of a "perceptron," which emerged from the Dartmouth program, Rosenblatt distanced himself from his peers. These scientists, he claimed, had been "chiefly concerned with the question of how such functions as perception and recall might be achieved by a deterministic system of any sort, rather than how this is actually done by the brain" (ibid. 5). This approach, he argued, fundamentally ignored the question of scale and the emergent properties of biological systems. Instead, Rosenblatt based his approach on the theory of statistical separability, which he attributed to Hebb and Hayek, and on a new conception of networked perception-cognition. According to Rosenblatt, neurons are mere switches or nodes in a network that classifies cognitive input, and intelligence emerges only at the level of the population and through the patterns of interaction between neurons.#Frank Rosenblatt, #Donald O. Hebb, #Friedrich A. Hayek, #perception, #neuron

Contemporary neural networks, grounded as they are in theories of Hebbian networks, operate on the same principles. Groups of nets exposed repeatedly to the same stimuli will eventually be trained to fire together; recall the cat and the baby. Each exposure increases the statistical likelihood that the net will fire together and 'recognize' the object. In supervised 'learning,' then, nets can be corrected through the comparison of their result with the original input. The key feature is that the input does not need to be ontologically defined or represented, meaning that a series of networked machines can come to identify a cat without having to be told what a cat 'is.' Only through patterns of affiliation does sensory response emerge. The key to learning was therefore exposure to a "large sample of stimuli," which Rosenblatt stressed meant approaching the nature of learning "in terms of probability theory rather than symbolic logic" (Rosenblatt 1962). The perceptron model suggests that machine systems, like markets, might be able to perceive what individual subjects cannot (Rosenblatt 1958, 288-89). While each human individual is limited to the specific set of external stimuli to which they are exposed, a computer perceptron can, by contrast, draw on data that are the result of the judgements and experiences of not just one individual, but rather of large populations of human individuals (ibid. 19-20).#neuron, #AI, #patterns, #perceptron
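A purely illustrative sketch of the kind of procedure Rosenblatt describes follows; the data, dimensions, and number of passes are invented for the example. The machine is never told what the two classes 'are'; it merely nudges its connection weights whenever its response disagrees with the supervised signal, and the statistical regularity in the sample of stimuli does the rest.

```python
import numpy as np

# Minimal perceptron in the spirit of Rosenblatt's description (illustrative
# data and parameters): no class is ever defined for the machine, it only
# adjusts its weights when its output disagrees with the supervised label.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                    # a sample of two-feature "stimuli"
y = (X[:, 0] + X[:, 1] > 0).astype(int)          # an unstated "rule" hidden in the stimuli

w = np.zeros(2)                                  # connection weights
b = 0.0                                          # firing threshold (as a bias)
for _ in range(10):                              # repeated exposure to the stimuli
    for xi, yi in zip(X, y):
        fired = int(xi @ w + b > 0)              # does the net "fire" for this input?
        w += (yi - fired) * xi                   # correct only when response and label disagree
        b += (yi - fired)

accuracy = np.mean(((X @ w + b) > 0).astype(int) == y)
print(f"share of stimuli now classified 'correctly': {accuracy:.2f}")
```

The procedure is probabilistic in exactly the sense Rosenblatt stresses: what is learned is a statistical separation of the sample, not a symbolic definition of the classes.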

Against Thought

For Rosenblatt and Hayek, and their predecessors in psychology, notions of learning forwarded the idea that systems can change and adapt non-consciously, or automatically. The central feature of these models was that small operations performed on parts of a problem might agglomerate into a whole that is greater than the sum of its parts and solve problems not through representation but through action. Both Hayek and Rosenblatt drew on theories of communication and information, particularly on cybernetics, which posits communication in terms of thermodynamics. According to this theory, systems at different scales are only probabilistically related to their parts. Calculating individual components therefore cannot represent or predict the behavior of the entire system.[2] While never truly possible, this disavowal of 'representation' continues to fuel the desire for ever larger data sets and for unsupervised learning in neural nets, which would, at least in theory, be driven by the data.#statistical model, #cybernetics, #data

Hayek himself espoused an imaginary of this data-rich world that could be increasingly calculated without (human) consciousness. He was apparently fond of quoting Alfred North Whitehead's remark that "it is a profoundly erroneous truism… that we should cultivate the habit of thinking what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations we can perform without thinking about them" (Moore 2016, 50).[3] The perceptron is the technological manifestation of the reconfiguration and reorganization of human subjectivity, physiology, psychology, and economy that this theory implies. And as a result of the belief that technical decision making, not through governments but at the scale of populations, might ameliorate the danger of populism or the errors of human judgement, the neural net became the embodiment of an idea (and ideology) of networked decision making that could scale from within the mind to the planetary networks of electronic trading platforms and global markets. As the genealogy between psychology, computing, and economics demonstrates, it is clear that the idea of a networked intelligence, perhaps best exemplified in our present by the figure of the neural net and 'deep' learning, has been a grounding assumption and technique bringing media and finance together.#perceptron, #neuron, #economics

Derivation

This reorganization of rationality and technology has no better exemplar than derivative trading models. One of the central technologies for capitalizing on 'noise' and on the market as information processor was the Black-Scholes Option Pricing Model, which Black developed with his colleagues Myron Scholes and Robert Merton.#noise

It has traditionally been difficult for traders to determine how much the option to purchase an asset or stock should cost. Up until the 1970s, it was widely assumed that the value of an option to buy a stock would necessarily be related to the expected rate of return of the underlying stock itself, which in turn would be a function of the health and profitability of the company that issued the stock.[4] This is the old understanding related to objective measures of value. It is also an old understanding of models—that they represent or abstract from something real out there in the world.#derivative

Black and his colleague Scholes introduced the Black-Scholes option pricing model in 1973 in order to provide a new way of relating option prices to the future.[5] What made this model unique in the history of finance was that it completely detached the price of an option from any expectation about the likely value of the underlying asset at the option's maturity date. Instead, the key value for Black and Scholes was the expected volatility of the stock, meaning the movement of its price up and down over time. The estimated volatility of a stock was not a function of one's estimate of the profitability of the company that issued the stock, but was instead in part a function of the investment market as a whole.[6] The Black-Scholes option pricing model, in other words, was not interested in the "true" value of the underlying asset, but rather in the relationship of the stock to the market as a whole.#stock, #volatility, #derivative
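For orientation, the point can be seen in the formula itself, given here in its standard textbook form (the notation is the conventional one, not necessarily that of the original 1973 paper). The price of a European call depends on the current stock price S, the strike K, the risk-free rate r, the time to maturity T, and the volatility σ; the stock's expected rate of return appears nowhere:

```latex
C = S\,N(d_1) - K\,e^{-rT}\,N(d_2), \qquad
d_1 = \frac{\ln(S/K) + \left(r + \sigma^{2}/2\right)T}{\sigma\sqrt{T}}, \qquad
d_2 = d_1 - \sigma\sqrt{T}
```

Here N denotes the cumulative distribution function of the standard normal distribution, and the volatility σ is the only stock-specific quantity that must be estimated.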

Scholes and Black had begun working together in the late 1960s while consulting for investment firms. Their work involved applying computers to modern portfolio theory and automating arbitrage.[7] Scholes and Black opened "The Pricing of Options and Corporate Liabilities," the paper in which they introduced their option pricing equation, with a challenge: "If options are correctly priced in the market, it should not be possible to make sure profits by creating portfolios of long and short positions" (1973, 637). Since people do make money, options therefore cannot be correctly priced. Mispricing—that is, imperfect transmission of information—must be essential to the operation of markets. This also meant, though, that a trader could not, even in principle, simply 'be rational' in deciding on the risk assigned to an option (by, for example, attempting to determine the true value of the underlying asset).#statistical model, #derivative

Working between physics, machine learning, and cybernetics, Scholes and Black recognized that the insights of reasonable traders might matter less in pricing assets than would measuring the volatility of a stock (that is, the dynamics of the upward and downward movement of its price over time). Considering the context, and Black's close relationship to computer science, it is possible to understand their conclusion as extending the assumptions inherent within neural network theories and neoliberal economic theory to building technologies for betting on futures.#neuron, #derivative

Stocks, they reasoned, behaved more like the random motions of particles in water (thermodynamics) than like proxies or representations of some underlying economic reality. And agents (humans) behaved more like machines, or perhaps like blindfolded individuals. The market is full of noise (understood as unpredictable or not fully knowable signals), and the agents within it do not know the relationship between the price of a security and the "real" value of the underlying asset. The system is chaotic. However, if agents recognize the limits of their knowledge, they can focus on what they can know: namely, how a single stock price varies over time, and how that variation relates to the price variations of other stocks. Instead of trying to calculate the relationship between the price of a security and the real value of the asset, something that Black and Scholes assumed one generally cannot know, they operated with the assumption that all the stocks in the market moved independently, like gas particles in thermodynamics, and that measures of information like entropy and enthalpy could therefore also apply to the way stock prices 'signal' each other. Their innovation was to posit that in order to price an option, one needed only to take the current price and the price changes of the asset and figure out the complete distribution of share prices.[8]#volatility, #information theory, #statistical model

While initially no journal was ready to publish the article because of its supposedly overly technical approach, within weeks of its eventual publication numerous corporations were already offering software for such pricing equations (MacKenzie 2006, 60-67). This was in part a consequence of the fact that the model joined communications and information theories with calculation in a way that made the equation amenable to algorithmic enactment. In fact, as individuals created more complex derivative instruments tying many types of assets and markets together, computers became essential both for obtaining data about price volatility and for calculating option prices. An entire industry, and the financial markets of today, were born from this innovation and its new understanding of noise. And because derivatives are bets on the future value of an asset, the derivatives market is in fact far larger than the world's current gross domestic product, by now exceeding it roughly twentyfold. Since the 1970s, these markets have grown massively (roughly 25% per year over the last 25 years).#statistical model, #derivative, #economics
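One rough way to see why the equation was so amenable to algorithmic enactment is that, once a volatility estimate is in hand, the closed form can be computed in a few lines of code. The sketch below is a minimal illustration in standard textbook notation (the function names and sample numbers are mine, not drawn from the 1970s software packages it gestures toward).

```python
from math import erf, exp, log, sqrt

def norm_cdf(x: float) -> float:
    """Cumulative distribution function of the standard normal distribution."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(spot: float, strike: float, t: float, rate: float, vol: float) -> float:
    """Black-Scholes price of a European call; the stock's expected return never appears."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * t) / (vol * sqrt(t))
    d2 = d1 - vol * sqrt(t)
    return spot * norm_cdf(d1) - strike * exp(-rate * t) * norm_cdf(d2)

# Example: a one-year option struck at the current price of 100,
# with a 5% risk-free rate and 20% annual volatility (about 10.45).
print(round(bs_call(100.0, 100.0, 1.0, 0.05, 0.20), 2))
```

The computational burden lies not in the formula itself but in continuously estimating volatility from streams of price data, which is where the networked computing infrastructure becomes indispensable.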

There is also a deeply repressed history of geopolitics behind these innovations in finance. The derivative pricing equation emerged with the end of Bretton Woods, decolonization, post-Fordism, and the OPEC oil crisis, to name a few of the transformations of the time, as a way to tame or circumvent extreme volatility in politics, currency, and commodity markets. New financial technologies and institutions such as hedge funds were created in order to literally 'hedge' bets: to ensure that risks were reallocated, decentralized, and networked. Through derivative technologies such as short bets, credit swaps, and futures markets, dangerous bets would be combined with safer ones and dispersed across multiple territories and temporalities. Corporations, governments, and financiers flocked to these techniques of uncertainty management in the face of seemingly unknowable, unnamable, and unquantifiable risks.[9] The impossibility of prediction, the subjective nature of human decision making, and the electronic networking of global media systems all became infrastructures for new forms of betting on futures while evading the political-economic struggles of the day.#post-fordism, #economics, #derivative, #prediction

Models and Machines

Neoliberal economics often theorizes the world as a self-organizing adaptive system to counter the idea of planned and perfectly controllable political (and potentially totalitarian) orders. Within this ideology the market takes on an almost divine, or perhaps biologically determinist, capacity for chance and emergence, but never through consciousness or planning (Ramey 2015). Evolution was imagined against willed action and the reasoned decisions of individual humans. More critically, emerging against the backdrop of civil rights and calls for racial, sexual, and queer forms of justice and equity, the negation of any state intervention or planning (say, affirmative action) became naturalized in the figures of the neural net and the derivative: a model of mind and market that made human-built institutions and organizations (such as the NYSE) appear as natural necessities. Any effort to address structural injustice became a conspiracy against emergence, economy, and intelligence.[10]#market, #planning, #neuron, #injustice

We have become attuned to this model of the world where our machines and markets are syncopated with one another. These models, however, might also have the potential to remake our relations with each other and the world. As cultural theorist Randy Martin has argued, rather than separating itself from social processes of production and reproduction, algorithmic and derivative finance actually demonstrates the increased inter-relatedness, globalization, and socialization of debt and precarity. By tying together disparate actions and objects into a single assembled bundle of reallocated risks to trade, new market machines have made us more indebted to each other. The political and ethical question thus becomes how we might activate this mutual indebtedness in new ways, ones that are less amenable to the strict market logics of neoliberal economics (Martin 2014).#derivative, #economics, #neoliberal

The future lies in recognizing what our machines have finally made visible, and what has perhaps always been there: the socio-political nature of our seemingly natural thoughts and perceptions. Every market crash, every sub-prime mortgage event, reveals the social constructedness of markets and the work—aesthetic, political, economic—it takes to maintain our belief in them as forces of nature or divinity. And if these moments are not aesthetically smoothed over through media and narratives of inevitability, they also make it possible to recognize how our machines have linked so many of us together in precarity. The potential politics of these moments has not yet been realized, but there have been efforts, whether in Occupy or, more recently, in movements for civil rights, racial equity, and environmental justice such as Black Lives Matter or the Chilean anti-austerity protests of 2019 (to name a few).#economics, #politics

Because all computer systems are programmed, and therefore planned, we are also forced to contend with the intentional and therefore changeable nature of how we both think and perceive our world. The failed efforts to model markets make us recognize the historically situated and socially specific nature of both the economy and cognition.[11]#economics, #cognition

Footnotes

  1. A critical first step, as historians such as Philip Mirowski (2002a, 2002b, 2006) have noted, towards contemporary notions of information economies.
  2. For more on the influence of cybernetics and systems theories on producing notions of non-conscious growth and evolution in Hayek's thought, see Lewis 2016 and Oliva 2016.
  3. I am indebted to Moore's excellent discussion for much of the argument surrounding Hayek, democracy, and information. The quote is from Hayek 1945.
  4. For an account of earlier nineteenth- and twentieth-century models for pricing options, see MacKenzie 2006, 37-88.
  5. The three men most often credited with the formalization of the derivative pricing model are Black, an applied mathematician who had been trained by artificial intelligence pioneer Marvin Minsky; Myron Scholes, a Canadian-American economist from the University of Chicago who came to MIT after his PhD under Eugene Fama; and Robert Merton, another economist trained at MIT. Collectively they developed the Black-Scholes-Merton derivative pricing model. While these three figures are hardly singularly responsible for global financialization, their history serves as a mirror to a situation in which new computational techniques were produced to address geo-political-environmental transformation. See Szpiro 2011, 116-17.
  6. As Black noted in 1975, "[m]y initial estimates of volatility are based on 10 years of daily data on stock prices and dividends, with more weight on more recent data. Each month, I update the estimates. Roughly speaking, last month's estimate gets four-fifths weight, and the most recent month's actual volatility gets one-fifth weight. I also make some use of the changes in volatility on stocks generally, of the direction in which the stock price has been moving, and of the 'market's estimates' of volatility, as suggested by the level of option prices for the stock" (Black 1975b, 5, cited in MacKenzie 2006, 321, note 18).
  7. A 'portfolio' is a collection of multiple investments, which vary in their presumed riskiness, and which aim to maximize profit for a specific level of overall risk. 'Arbitrage' refers to purportedly risk-free investments, such as the profit that can be made when one takes advantage of slight differences between currency exchanges—or the price of the same stock—in two different locations.
  8. Robert Merton added the concept of continuous time and figured out a derivation equation to smooth the curve of prices. The final equation is essentially the merger of a normal curve with Brownian motion (Das 2006, 194-95).
  9. It is worth noting that the Black-Scholes derivative pricing equation, which inaugurated the financialization of the global economy, was introduced in 1973. For an excellent summary of these links and of the insurance and urban planning fields, see Grove 2018.
  10. https://www.researchgate.net/figure/GROWTH-OF-GLOBAL-DERIVATIVE-MARKET-SINCE-1998-Globally-the-notional-value-of-all_fig12_328411995
  11. Research for this article was supported by the Mellon Foundation, Digital Now Project, at the Canadian Centre for Architecture (CCA) and by the staff and archives at the CCA. Further funding was given by the Swiss National Science Foundation, Sinergia Project, Governing Through Design. A previous version of this article appeared in E-Flux Architecture in March 2023, in the special issue on Models.

Literature

  • Black, Fischer. 1986. "Noise." The Journal of Finance 41, no. 3 (July), 529-543, https://doi.org/10.1111/j.1540-6261.1986.tb04513.x.
  • Black, Fischer, and Myron Scholes. 1973. "The Pricing of Options and Corporate Liabilities." The Journal of Political Economy 81, no. 3 (May-June), 637-654, https://doi.org/10.1086/260062.
  • Das, Satyajit. 2006. Traders, Guns, and Money: Knowns and Unknowns in the Dazzling World of Derivatives. Edinburgh, Prentice Hall, Financial Times.
  • Erickson, Paul, Judy L. Klein, Lorraine Daston, Rebecca M. Lemov, Thomas Sturm, and Michael D. Gordin. 2015. How Reason Almost Lost Its Mind: The Strange Career of Cold War Rationality. Chicago, University of Chicago Press.
  • Grove, Kevin. 2018. Resilience. New York, Routledge.
  • Halpern, Orit, and Robert Mitchell. 2023. The Smartness Mandate. Cambridge, Mass., MIT Press.
  • Hayek, Friedrich. 1945. "The Use of Knowledge in Society." The American Economic Review 35, no. 4 (September), 519-530.
  • Hayek, Friedrich. 1960. The Constitution of Liberty. 2011 ed. Chicago, University of Chicago Press.
  • Hebb, Donald. 1949. The Organization of Behavior: A Neuropsychological Theory. New York, Wiley.
  • Kennedy, Devin. 2017. "The Machine in the Market: Computers and the Infrastructure of Price at the New York Stock Exchange, 1965–1975." Social Studies of Science 47, no. 6, 888-917, https://doi.org/10.1177/0306312717739367.
  • Lewis, Paul. 2016. "The Emergence of 'Emergence' in the Work of F.A. Hayek: A Historical Analysis." History of Political Economy 48, no. 1, 111-150, https://doi.org/10.1215/00182702-3452315.
  • MacKenzie, Donald A. 2006. An Engine, Not a Camera: How Financial Models Shape Markets. Cambridge, Mass., MIT Press.
  • Martin, Randy. 2014. "What Difference do Derivatives Make? From the Technical to the Political Conjuncture." Culture Unbound 6, 189-210, https://doi.org/10.3384/cu.2000.1525.146189.
  • Mirowski, Philip. 2002. Machine Dreams: Economics Becomes a Cyborg Science. New York, Cambridge University Press.
  • Mirowski, Philip. 2006. "Twelve Theses Concerning the History of Postwar Neoclassical Price Theory." History of Political Economy 38, 344-379.
  • Moore, Alfred. 2016. "Hayek, Conspiracy, and Democracy." Critical Review 28, no. 1, 44-62, https://doi.org/10.1080/08913811.2016.1167405.
  • Oliva, Gabriel. 2016. "The Road to Servomechanisms: The Influence of Cybernetics on Hayek from the Sensory Order to the Social Order." Research in the History of Economic Thought and Methodology, 161-198, http://dx.doi.org/10.1108/S0743-41542016000034A006.
  • Ramey, Joshua. 2015. "Neoliberalism as a Political Theology of Chance: The Politics of Divination." Palgrave Communications 1, 1-9, https://doi.org/10.1057/palcomms.2015.39.
  • Rosenblatt, Frank. 1958. "The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain." Psychological Review 65, no. 6, 386-408.
  • Rosenblatt, Frank. 1962. Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms. Washington, D.C., Spartan Books.
  • Szpiro, George G. 2011. Pricing the Future: Finance, Physics, and the 300 Year Journey to the Black-Scholes Equation. Kindle Edition, New York, Basic Books.
  • Header Image Credits: Black-Scholes Model summation from https://brilliant.org/wiki/black-scholes-merton/ and Wikipedia, Accessed June 20, 2020.