A timely photo essay has just landed on the newsstands of New York City. Portfolio magazine, the glossiest newcomer to American business publications, features a striking photograph of Kuwait’s Financial Market: men in white Arab robes lounging about a carpeted room with red bench seats, all of it surrounded by electronic price boards. A strange crossover between airport and trading pit.

The photograph is followed by a terrific photo essay. Dramatic photographs of exchanges around the world, taken by artist-photographer Robert Polidori. The piece is a very timely addition to our discussion of the rapidly shrinking NYSE. From Chicago to Sao Paulo, Tokyo to Nairobi, Polidori’s shots provide a window into the tremendous multiplicity of exchange assemblages. In Nepal, prices are still written on the board, as they were in New York eighty years ago. London’s exchange has added activity to its static computers by planting a four-story-high art installation in its lobby. Kuwait looks like an airport lounge. Nairobi and Tehran both look like Internet cafes… brokers sitting on chairs, staring at a PC screen… and separated by mysterious partitions for confidentiality. Hong Kong is organized as concentric circles of benches. Chicago’s pit traders wear colorful jackets, whereas Sao Paulo’s counterparts show up to the pit in white shirts and dark ties.

Daniel, the NY link of this little London-NY collaboration, sent me this link to a NY Times article about the imminent closure of some of the physical trading floors of the New York Stock Exchange. Some of the readers of this blog came to a tour of the New York Mercantile Exchange, where a similar message, about the rapid move from face-to-face to screen-based trading, was conveyed (see here).

This article, however, points to yet another, less obvious dimension of the move to screen-based trading:

After becoming a publicly traded company itself last year by merging with Archipelago Holdings, the exchange’s operator merged in April with Euronext, which owned stock and futures exchanges in London, Paris, Brussels, Amsterdam and Lisbon.

This gradual amalgamation of financial markets at the institutional level into a single techno-social network means, for many institutions, that traders will have to go home (either retire or trade from outside the floor). But it also means that markets will now use unified clearing and settlement systems (Euroclear); in other words, the exchanges’ risk management is gradually becoming centralised. In fact, by the end of 2007, Euroclear, which provides clearing and settlement services for Euronext, will “move into the implementation phase of our platform consolidation programme, with the launch in production of the Single Settlement Engine (SSE).” It is plausible that such consolidation will deliver better efficiency, but it also raises questions about the ability to manage financial risks in a cross-owned network of exchanges that constitutes a large share of the global trading volume. For example, the operation of a unified risk management system may create inadvertent drops in prices in entire sections of the market by generating sale orders. I am sure that Euronext is much more sophisticated than this, but, at least at the conceptual level, we can ask about the new forms of risk that are introduced to the financial markets through the creation of such exchange conglomerates.

An inspiring event just took place at Columbia. On September 7-8, David Stark and I organized a workshop on Knightian uncertainty. It brought together distinguished economists, including Nobel laureate Douglass North, as well as sociologists and management scholars.

Why uncertainty? It is not an exaggeration to argue that most post-war advances in the social sciences have focussed on risk. From Black-Scholes to Porter’s Five Forces, Nash equilibrium to the efficient market hypothesis, the social sciences have modeled with great success situations in which the future can be predicted. But in a world of innovation, hedge funds, financial bubbles, climate change and terrorism, Knightian uncertainty, not risk, sits at the center of every decision-maker’s agenda. The workshop prompted reflection on these issues. What does uncertainty imply for policy-makers? For managers? For arbitrageurs? For strategists?

Here are a few of the issues that we touched on.

First, the sheer importance of uncertainty. This was cogently articulated by Douglass North. Moving past the intellectual concern with institutions that won him general acclaim, North has turned to the problem of radical uncertainty, mental models and economic growth. North takes a very long perspective (five or ten centuries) and observes that some countries have risen from poverty to wealth, whereas others remain in misery. Why such wide differences? None of the orthodox economic explanations, he argues, accounts for it. To North, the explanation lies in the differences in mental models that guide the creation of institutions which, in turn, promote economic growth (or the lack of it). In the same session, Joe Porac expressed his disagreement with one of the main premises of the workshop: that uncertainty is behind much of the activity that we see in the economy. Porac pointed to possible psychological mechanisms that may be having a similar effect.

Uncertainty is also central to the corporate world. For example, how should managers confront uncertainty? One of the most counterintuitive aspects of Knightian uncertainty is that the recipes that work best in a predictable world of risk can create major disasters under a scenario of uncertainty. This difficulty was emphasized by Nassim Taleb with the concept of rare events, or “Black Swans.” Taleb distinguished between two realms of action: the realm of non-rare events, or “mediocristan,” and that of rare ones, “extremistan.” The former describes phenomena that adhere to a normal Gaussian distribution (size of mountains, weight of people), while the latter is described by non-normal distributions (wealth of Bill Gates, stock price movements in 1987). In extremistan, actors have to prepare for the potential occurrence of rare but disruptive events. They face extraordinary success… and failure. Extremistan is where the key problems, the bankruptcies, the defaults, the shocks arise. It is particularly dangerous for decision-makers to mistake extremistan for mediocristan, and assume they know more than they really do.
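The mediocristan/extremistan contrast can be made concrete with tail probabilities. A minimal numerical sketch (my own illustration, not Taleb’s; the Pareto tail index of 1.5 is an arbitrary assumption):

```python
import math

def normal_tail(x):
    """P(X > x) for a standard normal variable: mediocristan-style
    thin tails, where large deviations are vanishingly unlikely."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def pareto_tail(x, alpha=1.5):
    """P(X > x) for a Pareto variable with tail index alpha:
    extremistan-style fat tails, where large deviations stay plausible."""
    return x ** (-alpha) if x >= 1 else 1.0

for k in (3, 5, 10):
    print(f"size-{k} event: normal={normal_tail(k):.2e}, pareto={pareto_tail(k):.2e}")
```

At a move ten times the typical scale, the Gaussian assigns the event essentially zero probability, while the fat-tailed distribution still treats it as a live possibility; assuming the former when the latter holds is exactly the error described above.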

How, then, should companies operate in an extremistan-like world of uncertainty? As noted, it requires a very different organization. Firms that operate in this environment, according to Anna Grandori, should espouse an “epistemic rationality”. Traditional concerns with optimality and saving resources should be replaced by efforts to discover the environment. The notion of heuristic, Grandori argued, also needs to be redefined: under uncertainty, the challenge no longer lies in recognizing familiar patterns but in discovering new models, novel relationships, etc. Search should no longer stop when performance is good enough, but when the firm’s insights make accurate predictions. The entire organization, in short, needs to be engaged in active research of one kind or another.

A different answer was provided by David Stark’s concept of heterarchy. Uncertainty poses managerial problems that go far beyond profitability. In extreme cases such as transition economies, the definition of success is itself in flux. Profits may not be the key measure of success; employment, sales, production — even location — may be the central reason why a company gains resources. How do you organize for such flux? Stark called for a different internal logic of organization — one that allows companies to adhere to competing conceptions of worth. To accomplish this, companies need to avoid hierarchy and bureaucracy. Instead, firms should pursue a dense network of horizontal interactions and distributed power relations.

The managerial problem, and more specifically the problem of strategy implementation, was at the center of the presentation by Sarah Kaplan. Strategy involves the future. But consider the conundrum faced by an optical fiber company immediately after the burst of the Internet bubble. A company that had been virtually living in the future for several years was suddenly confronted with a collapse in its market, as well as in its own confidence to predict the future. Kaplan identified the mechanisms whereby the company managed to restore sense and bridge its past, present and future.

Uncertain markets

How do capital markets confront uncertainty? The question goes to the heart of the conversations in the workshop, as well as to the core of Wall Street’s present headaches. Indeed, the cleavage between risk and uncertainty is at its widest in finance. From the CAPM model to the Black-Scholes equation, orthodox finance has provided Wall Street practitioners with the tools to engage in a new financial strategy, quantitative finance. But in contexts of uncertainty, those models seem to stop working and sometimes even turn against their users. In this sense, Emanuel Derman described quantitative finance in a lucid and succinct manner. Mathematical formulae, he argued, provide a mind-broadening ability to associate the value of a stock with the value of some other, seemingly different one. Unlike physics, which works with axioms, modern finance operates by analogy. But these analogies may break down in contexts of uncertainty. Examples include the 1987 crisis, the 1998 crisis and… the current subprime debacle (which was on the mind of all participants). What to do then?

In some ways, my own presentation addressed this very problem, and pointed to financial tools as a solution. I examined how merger arbitrageurs used a specific visualization device, known as the “spread plot,” to confront their own interpretations with those of their competitors. The spread plot provides arbitrageurs with information about what their rivals think. By alternating between their own estimates of merger probability and the probability implied by the spread, arbitrageurs stay alert to their own misinterpretations and are prompted into search. As arbitrageurs adopt financial positions, their beliefs feed back into the spread. Over time, the social use of the spread plot among the arbitrage community leads to gradual convergence in probability estimates. The arbitrageur’s solution to uncertainty, I suggest, is a financial tool: the spread plot.
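For readers unfamiliar with the device, the probability “implied by the spread” can be sketched with a stylized calculation (hypothetical numbers, and a simplification of the actual practice, since it ignores time value, dividends and collars):

```python
def implied_probability(current, offer, fallback):
    """Deal-completion probability implied by the market price, under the
    stylization that the target trades at `offer` if the merger completes
    and falls back to `fallback` if it breaks.
    Solves: current = p * offer + (1 - p) * fallback."""
    return (current - fallback) / (offer - fallback)

# Hypothetical numbers: cash offer at $50, pre-announcement price of $40,
# target currently trading at $48.
p = implied_probability(current=48.0, offer=50.0, fallback=40.0)
print(f"market-implied completion probability: {p:.0%}")
```

When an arbitrageur’s own estimate of completion diverges sharply from this implied figure, that divergence is precisely the prompt for renewed search described above.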

Harrison Hong widened the discussion of financial uncertainty with a generalized model of disagreement in markets. His discussion centered on discrepancies between the beliefs of market actors and the ways in which these influence traded volume and price volatility. Behavioral models, Hong argued, provide some explanation for either of these, but no single model can explain extraordinary movements in both volume and prices. And yet, this is the central trait that distinguishes situations of uncertainty such as the dot-com bubble. In many ways, the concern echoes the problem raised by Beunza. Namely, that because arbitrageurs draw on all the social clues at their disposal, perception and action, prices and trading, fluctuate together. Hong presented several models in which differences in interpretation, constraints on shorting and differences in media coverage combine to yield the co-movement in prices and volume that is observed. In short, then, disagreement and differences in interpretation are crucial. Without them, one cannot account for bubbles or other periods of uncertainty.

The role of disagreement on Wall Street was further studied by Raghu Garud. A key group in this setting is Wall Street’s securities analysts. They play a central role in articulating, using and debating different perspectives about company value. Garud described the role of analysts in advancing different opinions about Amazon.com during the emergence of the Internet. Analysts even interpreted the same piece of news in radically different manners. Eventually, however, their differences in interpretation collapsed when the milestones and rhythms promised by the dot-com optimists were not confirmed by events. But the process took three years.

Uncertainty and strategic interaction

What does “strategy” mean when the opponent is unknown? Traditionally, strategic interaction has been studied in game theory. But according to Adam Brandenburger, game theory can sometimes be unhelpful; specifically, orthodox game theory typically assumes that both players in the game know how the other person thinks, her interests and her payoffs (the so-called assumption of “common knowledge”). Bringing uncertainty into our understanding of games radically shifts the problem from an abstract exercise of calculative anticipation into something much different. In the so-called school of “epistemic game theory,” strategic interaction is seen instead as the way in which the players discover each other. What does the other person think? What does he or she stand to gain and lose?

In understanding “the other,” it is helpful to keep in mind the logic of justification that defines him or her. According to Laurent Thevenot, it is not the same to operate in a domestic regime (one rooted in tradition) as in a market setting (determined by competition) or a civic setting (shaped by the general interest). These different regimes not only bring with them different geographical locations, goals and metrics for success, but even differences in values. Such multiplicity of worth has been studied in detail by the French school of “economics of convention.” Conventions, according to them, are the solution to the problem of coordination in games.

But how does a convention arise? Such was the question addressed by Olivier Favereau. The heterodox economist explored the process whereby a collective of people who do not share a language are able to espouse a convention, underscoring the role of intersubjectivity in bringing about common knowledge.

All in all, an extraordinary event. The description above only begins to scratch the surface. Very soon, I’ll also be reporting on some very provocative commentaries by Bruce Kogut, Harrison White, Yuval Millo, and Fiona Murray.

Stay tuned.

The myth of virtuality

July 12, 2007

Increasingly, I hear and read in academic forums about the ‘virtuality’ of derivative markets. The notion that these markets are ‘not real’, disjointed or otherwise exist ‘outside the economy’ is raised frequently in sociological papers. This notion is also accompanied, more often than not, by the would-be implications of that ‘virtuality’. That is: what can derivative markets do because they are ‘virtual’? For example, derivatives and liquidity. A few months ago I listened to a paper in which the author claimed that because derivatives can be written on the basis of anything, and are written on a very large variety of underpinning assets that can be exchanged, derivative markets provide, in effect, almost infinite liquidity!

Another area affected by the ‘virtuality’ syndrome is the concept of leverage. Many derivatives allow high leverage and, consequently, there is a high ratio between the volume of derivatives contracts traded (nominal value) and contracts that get exercised (exchanged for the underpinning asset). The common argument following this characteristic of derivatives goes along these lines: “the annual nominal value of exchange-traded derivatives exceeds the global GDP [so far so good] and therefore it is obvious that these markets are completely divorced from any normal economic activity and are responsible for the hyper-capitalism we currently witness, the widening gap between rich and poor …” You get the drift.

True, the nominal value of derivatives traded is indeed astronomical, and it is true that only a fraction of that volume is exercised. Additionally, it would be fair to say that the operation of many derivatives markets hinges upon maintaining such a condition. However, does this fact make derivatives markets virtual? Let me put it differently. If the high ratio between held contracts (or even assets) and exercised assets is a mark of a virtual market, then one does not need to delve into the exotic world of swaps, binary options and other financial beasts to find ‘virtual’ markets. In fact, to find them, it is enough to look at the boring, old stock exchange. On an extremely busy day in the New York Stock Exchange, only between 1.5 and 2.0 per cent of the Dow Jones Industrial Index stocks held by the public change hands. Yet, this minute amount determines a price change for the rest of the un-traded bulk of stocks. Moreover, the validity of the price quoted for stocks is conditioned on the very fact that only a relatively small minority of owners would want to trade them at any given moment. If, for example, one morning 90% of the owners of IBM shares decided to sell them, it is fairly obvious that the large majority among them would not receive a price that is even close to the price quoted for IBM the previous day… So, we see that the stock market, very much like the dreaded derivatives market, is based on a high ratio between held and exercised assets. Well, then, is the stock market also ‘virtual’?
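The arithmetic behind this point is worth making explicit. A back-of-the-envelope sketch, with hypothetical round numbers chosen to match the 1.5-2.0 per cent turnover figure:

```python
# Hypothetical figures: roughly 1.8% of the public float changes hands
# on a very busy day, yet the closing price revalues the entire
# un-traded remainder.
float_shares = 1_000_000_000      # shares held by the public (hypothetical)
traded_shares = 18_000_000        # 1.8% of the float trades
price_change = -1.00              # say the price falls by $1

turnover = traded_shares / float_shares
revalued = (float_shares - traded_shares) * price_change

print(f"turnover: {turnover:.1%}")
print(f"mark-to-market change on shares that never traded: ${revalued:,.0f}")
```

Over 98% of the float is revalued by trades it never participated in; on this measure, the ‘boring, old’ stock market rests on marginal trading just as much as the dreaded derivatives market does.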

An even more ‘basic’ example of the same phenomenon is commercial banks’ reserve ratio. All sociology textbooks refer to the self-fulfilling nature of a ‘run on the bank’: a bank’s customers, believing that there may be a liquidity problem with the bank, withdraw their savings and thereby cause a liquidity problem. But let us now examine the common situation in which the cash-to-loans ratio is kept. It is obvious that not all the deposits are readily available in the bank, the same way that not all stocks can be exchanged simultaneously and not all derivatives can be exercised. So, does this make all these institutions virtual? Following the logic of the hypothetical (but sadly, very realistic) quote presented above, the answer must be: yes. All financial markets include such an inherent ‘virtual’ element. So, does this mean that there are no distinguishing factors that make derivatives markets worthy of a focused sociological analysis? Certainly not; it only means that if we want to talk sociologically about derivatives markets, we may have to discard the false concept of ‘virtuality’.

One of the more interesting developments in the Social Studies of Finance is the extension of its methods and approaches to other business phenomena. Just as SSF examines the practice, technology and content of financial value claims, a growing literature addresses similar questions in marketing, strategy, etc.

An interesting panel session, organized by Catelijne Coopmans and Elena Simakova at the 2007 Academy of Management meeting in Philadelphia, is concerned with this question. The meeting, part of the Professional Development Workshop program, is titled “Does STS Mean Business? Interdisciplinary engagement as a source of theoretical innovation” and will feature Raghu Garud, Peter Groenewegen, Peter Karnoe, Christian Licoppe, Eamonn Molloy, Wanda Orlikowski, Marc Ventresca and Ragna Zeiss (as well as myself, Daniel Beunza).

According to the organizers:

This workshop addresses the opportunities and challenges arising from the interaction between organization and management studies on the one hand, and science and technology studies (STS) on the other. Taking up the theme of this year’s conference, we observe that STS concepts and scholars are arguably ‘doing well’ in the management arena, but what does it mean for them to do ‘good’? The contributions to this workshop offer various perspectives on (1) what a valuable engagement between organization and management studies and STS looks like, and (2) what such an engagement contributes to existing knowledge and ways of doing research. Potential tensions between being useful and being critical are thereby high on the agenda. The workshop also addresses possible implications of how the engagement between management studies and STS is being framed.

It will take place on Saturday August 4 2007, 9am-12pm at Philadelphia Marriott in Franklin 3. No registration required. For a detailed program, click on: Does STS Mean Business? Program

I would like to draw your attention to an interesting short magazine piece by Nassim Taleb and Avital Pilpel. Taleb is a well-known name among derivative traders and risk managers, and more recently he began to publish texts of a more philosophical nature about risk. This piece relates to Taleb’s more recent intellectual trend and, I would assume, is related to his most recent book.

Some of the concepts discussed in the piece would not be new to ‘sociologists of risk’ (if we can put such a short label on this broad church). However, placing these concepts alongside a discussion about the limitations of the historical VaR concept (a very brief one, as the piece is very short) and with a reference to the self-referential nature of statistical distributions creates a nice opening for a sociological discussion about contemporary risk management. A couple of points:

First, in addition to the critical analysis regarding the nature of statistical distributions as a central tool in risk management, it is worth noting that such distributions do not take into account the cause-and-effect nature of risk management. That is, once an organization has encountered a risk event, it is very likely that it will change its practices in a way that alters the probability of a similar event happening again.
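For readers outside risk management, the historical VaR in question is, at bottom, just a percentile of past returns. A simplified sketch (real implementations differ in windowing and interpolation between order statistics):

```python
def historical_var(returns, confidence=0.99):
    """One-day historical VaR: the loss threshold exceeded on roughly
    (1 - confidence) of past days, read off the empirical distribution
    of returns. Simplified: no interpolation between order statistics."""
    ordered = sorted(returns)                      # worst day first
    idx = int((1 - confidence) * len(ordered))
    return -ordered[idx]

# Two crashes in an otherwise quiet hypothetical 100-day history:
history = [-0.10, -0.04] + [0.001] * 98
var_99 = historical_var(history)
print(f"99% one-day VaR: {var_99:.2%}")
```

The estimate is only as good as the sample: a risk event that changes the organization’s practices, as argued above, changes the very distribution from which the next VaR will be read.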

The second point is a development of the first one and refers more specifically to the constitutive power of risk events. Taleb and Pilpel suggest that the less frequent an event is, the more severe it will be. They imply that there is something about the nature of risk events that causes and maintains this inverse relation between frequency and severity:

[T]he severity of the event, will be in almost all cases inversely
proportional to its frequency: the ten-year flood will be more
frequent than the 100 year flood – and the 100 year flood will be more
devastating.

I would like to suggest that instead of referring to some implicit external causality that would explain this inverse relation, it could be a good idea to look at risks through the social dynamics in which they unfold. In particular, maybe the severity of risks should be measured in relation to the organizational environment in which they occur. Events of various degrees of severity take place at various frequencies. However, when events happen frequently enough, organizations become ‘used to them’. That is, institutional reactions to risk events become more routine-based, procedural even, and as a result, a lower degree of severity is assigned to the events. This is not an ‘optical illusion’: the more prepared an organization is for a specific risk event, the less impact that event will have on the organization. Hence, rare events are more severe not because of some mysterious ‘long-term’ cycle that makes them more devastating but because they are rare and organizations have fewer opportunities to prepare for them.
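The argument can be expressed as a toy model (entirely my own stylization, not anything in Taleb and Pilpel): let preparedness grow with the frequency at which an organization faces an event, and let felt severity be the raw impact discounted by preparedness.

```python
import math

def felt_severity(raw_impact, events_per_year, learning_rate=2.0):
    """Toy model: preparedness grows with how often the organization
    faces the event; the severity it actually experiences is the raw
    impact discounted by that preparedness. The exponential form and
    the learning rate are illustrative assumptions."""
    preparedness = 1 - math.exp(-learning_rate * events_per_year)
    return raw_impact * (1 - preparedness)

# Same raw impact, different frequencies: a "ten-year flood"
# versus a "hundred-year flood".
frequent = felt_severity(raw_impact=1.0, events_per_year=0.1)
rare = felt_severity(raw_impact=1.0, events_per_year=0.01)
print(frequent, rare)
```

Even with identical raw impacts, the rare event lands on a less prepared organization and is therefore felt as more severe, with no ‘long-term’ cycle needed.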

I recently got a draft of a review paper, Theories of Markets and Theories of Society by Marion Fourcade, and now that the paper is out (American Behavioral Scientist, 50:8, 1015-1034), I would like to say a few words about the piece. A brief disclaimer first: my work is discussed favorably in the review, which is great. However, this is not the main reason I am writing about the piece. I refer to it here because I think that it presents a very useful map of ideas and concepts in contemporary economic sociology.

Fourcade begins by presenting a general dichotomy between ‘field analysts’ and ‘network analysts’. Then, perpendicularly to this dichotomy, Fourcade presents the work of ‘the performativists‘:

Finally is a group I will designate as the performativists, a much more recent stream of research, by and large coming out of Europe and out of science studies, who emphasize the way technologies (that is, men-machine complexes produced by –for instance– accountants, economists, or operations researchers) intervene in the construction of markets and economies.

This is a very nice definition of the work that falls under the broad label of performativity and can serve as a good starting point, in my mind, for a discussion about the new developments in economic sociology. Having said that, I think that the truly radical ideas that the performativity stream brings (or rather, re-introduces) to economic and organisational sociology have been subdued somewhat in the review, and I would like to refer to one of these here briefly.

Let us look again at the connection between technologies and market behaviour and evolution that is mentioned in the definition above. Market technologies are products of expert knowledge that has been put through a ‘social mangle’, and performativity analyses the ways in which expert knowledge (e.g. risk management, management accounting) turns, in effect, into an economic technology and becomes embedded in economic activity.

Apart from the neat idea of ‘unexpected consequences’ that performativity applies to various cases, what this expert knowledge – technology – markets connection implies is that we should re-configure economics itself. That is, if the radical message of performative economic sociology is adhered to, then the study of economic action would refer not only to how humans interact (this leads us back to the famous story about baboon society), but equally to how humans and machines interact and bring about economic realities.

In other words, performativity calls for a change in the fundamental unit of analysis of economic action. The ‘economic atom’ of neoclassical economics – the single, utility maximising agent – is replaced by an ‘economic molecule’ – a cluster of humans and devices. Performativity offers this cluster as the basic unit of economic action.

It has to be mentioned that very similar things were suggested by Philip Mirowski in his seminal Machine Dreams. What the ‘performativity people’ add here is a more compact set of theoretical statements that can be applied to a wide variety of analysis topics in economic sociology.

The journal Accounting, Organizations and Society is planning a special section on the social studies of finance and accounting. The deadline for submitting contributions is August 15, 2007.

The title of the section is: Tracking the numbers: Across accounting and finance, organizations, markets and cultures, models and realities. Here’s the editor’s description:

Studies of finance and those of accounting as social and institutional practice are united by an interest in analyzing social settings, organizations, cultures, markets and institutions characterized by a high frequency of circulating numbers. Across these fields and the associated scientific disciplines, a remarkable number of researchers has by now become engaged in investigating how the use of numbers and various social settings co-develop, change or persist. While interdisciplinary research in accounting has largely focussed on organized settings of calculative practice, the employment of numbers in programmes and technologies of government, and the respective roles of accounting professionals, social studies of finance have been more concerned with exploring the construction of markets and market cultures, and the role of numbers and calculations in bringing about these constructions has mostly been a more implicit analytical issue. Perhaps such differences are part of the reason why correspondence across social studies of finance and interdisciplinary research in accounting has remained regrettably limited.

AOS, as the journal is known, is published in England and read internationally. It has led the growth in the literature on “critical accounting,” as well as published research at the intersection of accounting and organization theory.

Papers can be submitted either by e-mail
(hendrik.vollmer@uni-bielefeld.de) or by post to:

Hendrik Vollmer
Universitaet Bielefeld
Fakultaet fuer Soziologie
Postfach 100131
33501 Bielefeld
Germany

Independence at a price

February 9, 2007

What does the demand to appoint independent non-executive directors mean in practice?

In public companies, non-executive directors are intended to prevent a concentration of power with the chief executive officer and/or senior executive directors of the firm. To act as an effective counterweight to the executive membership of a board, non-executive directors are presumed to be independent from the firm. But, how is this independence achieved in practice?

Following a string of financial scandals, including those of Enron and WorldCom, Derek Higgs was commissioned by the UK government to review the role and effectiveness of non-executive directors. Higgs’ report, which served as the basis for the Combined Code on Corporate Governance, proposed that nomination committees should ‘consider candidates from a wide range of backgrounds and look beyond the “usual suspects”’.

The implicit assumption here is that if non-execs come from outside the social networks of the existing directors, it is more likely that they will be independent. Unsurprisingly, numerous services sprang up, offering to match potential NEDs with companies looking to satisfy the regulatory demands (for example).

But what does the demand to ‘diversify the boards’ really do to nomination practices? The Combined Code asks firms to appoint non-executive directors from backgrounds that are different from those from which non-execs usually came. Diversity, in effect, is understood and practiced as social distance: the more remote a candidate is from the firm, the more independent she or he is deemed to be.

Of course, social distance comes at a price. The more distant one is from a social realm, the less one knows about it. This introduces an inevitable trade-off for firms. The more knowledgeable the NED they appoint, the more likely it is that she or he will be seen as ‘too close’ and will not be recognized by the Code as independent. Conversely, appointing a person who is remote from the firm’s realm of activity may result in the appointment of someone who is virtually clueless about the nature of the business.

The result is that the current regulatory demands turn the recruitment of non-executive directors into a utility-maximization (or damage-minimization) exercise. Firms are asked, in practice, to find the ideal candidate: one who is enough of a stranger to the firm to be regarded as independent, but not so ignorant of its commercial activity as to stifle the operation of the board.
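The trade-off can be caricatured as a tiny optimization (a toy model of my own, not anything in the Code): score a candidate by perceived independence, which rises with social distance from the firm, plus business knowledge, which falls with it.

```python
import math

def board_fit(distance):
    """Toy score for a NED candidate at a given social distance in [0, 1].
    Perceived independence rises with distance, knowledge of the business
    falls with it; the concave square-root terms (an illustrative
    assumption) give an interior optimum rather than a corner solution."""
    independence = math.sqrt(distance)
    knowledge = math.sqrt(1.0 - distance)
    return independence + knowledge

# Grid-search the trade-off over candidate distances in [0, 1].
best_score, best_distance = max(
    (board_fit(i / 100), i / 100) for i in range(101)
)
print(best_distance)
```

With equal weights on the two concave terms, the ‘ideal candidate’ sits exactly in the middle: neither an insider nor a total stranger, which is the damage-minimization exercise the Code imposes in practice.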