The book’s conclusion makes an intriguing statement by suggesting a direct line of inheritance between the alliance of “data-driven research into the human condition” and structuralism, on the one hand, and present-day data analytics and algorithmic platforms, on the other (59, 177). But, as historians and STS scholars have recently demonstrated, one prominent marker of the data-driven algorithmic systems of the twenty-first century is their fixation on prediction.[1] From Netflix’s predictive models for film recommendation to Google’s content-targeted advertising, present-day data-driven algorithmic systems are designed to anticipate and manipulate user behavior.[2] Their negative effects on society have been significant, from reinforcing racism to steering political opinions.[3]

The historian Matthew Jones and the psychologist Shoshana Zuboff both argue for the instrumentalist imperative at the heart of predictive algorithmic systems. Analyzing their mathematical foundations, Jones argues that their developers are unconcerned with establishing any knowledge about the statistical laws that govern the predicted event.[4] Zuboff makes a parallel argument as she examines how predictive algorithms manipulate user behavior: understanding the user is beyond the scope of their design, as they reduce “human experience to measurable observable behavior.”[5] It is this emphasis on prediction over understanding that makes these systems so ill-equipped to solve hard social problems. For instance, as the political scientist Virginia Eubanks has shown, when such predictive systems are introduced into state government, particularly welfare offices, they make decisions that ultimately punish whoever turns to the state for support, failing to see the human behind the data inputs.[6]

It would be interesting to consider how the cybernetic approach of Margaret Mead, Gregory Bateson, and Claude Lévi-Strauss compares to the instrumentalist imperative of contemporary predictive data analytics. Were the empiricism and abstractions of the human sciences described by Geoghegan ultimately driven by the ambition to better understand the human condition? History reveals the paths not taken, helping us imagine alternative world orders and knowledge systems. It is unclear, however, how the mid-century human scientists discussed in this book engaged with the binary of understanding versus prediction. The unique historical moment unearthed by Geoghegan may be as much an example of an alternative, perhaps more humane, approach to data collection and deployment as a vital step toward the instrumentarian data analytics to which we are exposed at present.

Critiques of cybernetics have focused on its reduction of people, social relations, and communications to disembodied flows of information that might be commodified, manipulated, recombined, and sold. So many of the challenges we face—from alienation and deskilling, to misinformation and contagion—extend from this. The same calculus enables a host of false equivalencies and slippery metaphors that, for example, allow corporations and policy makers to substitute algorithmic systems for human expertise.

However, in the decades in and around the Macy Conferences, alternative visions of an information society were evoked. For example, Hayles looks to the work of Donald MacKay and his notion of a more “structural” information.[7] Still other alternatives arise in what has been called second-order cybernetics, which looks to reflexivity, the role of the observer, and, ultimately, the possibility of autopoietic systems defining their own reality.[8] For Geoghegan, however, there’s another possibility. If we locate the origins of informatic domination in anthropology’s colonial project, then what would an information society look like in light of an anticolonial anthropology? An abolitionist anthropology? As the recent podcast “The Santiago Boys” makes clear, “code” can be linked to radical democratization and local empowerment; the answer to the informatics of domination lies in the emancipatory movements happening all around us.[9]

And there are still other alternatives to our “prison houses of information.” Gregory Bateson illustrates one in a talk he gave at the Naropa Institute in 1974. During the Q&A, Bateson told a story about working with a schizophrenic patient in Palo Alto (as he would often do in the 1960s and 1970s). The patient visits Bateson in his office and Bateson offers him a cigarette. The man lights the cigarette, takes three puffs, and then drops it on the carpet. After the patient does this on the second day, he and Bateson start to take a walk on the hospital grounds, but then Bateson, worried about the smoldering cigarette butt, runs back to his office to check. On the third day, the patient again drops his cigarette, and heads out for his walk:

“And as I’m following I grab up the cigarette. I palm it. And we walk out onto the hospital grounds. And we’ve gone about 100 yards, and I say, ‘Ed, I think this is your cigarette, isn’t it?’ And then he has the grace to laugh.”[10] Bateson would generally follow his anecdotes with an interpretation: “He was challenging me, you see.” This time, there’s no explanation; Bateson, perhaps, meant something more transcendental. I prefer to think of this as the triumph of interaction over interpretation.

If much of Bateson’s contribution to family therapy was the examination of the “codes” evident in linguistic, paralinguistic, and nonlinguistic interaction (82), it is telling that Bateson did not “de-code” this encounter. Here, what was the message? What was sent and received? Or, is it simply that interaction is more important than understanding? In other words, here’s an encounter that both Bateson and patient found meaningful, without the two necessarily arriving at a common understanding.

That is, against the “new lingua franca” of cybernetics, where “not only scientific problems but even questions of art and human freedom could be treated in computational terms,” there’s another approach that privileges interaction for its own sake, without the promise of fungible, commodifiable information (23). This becomes clearer with the second generation of cyberneticians. An intellectual heir to Ross Ashby and (to a lesser extent) Gregory Bateson, Gordon Pask developed many cybernetic machines, among them Musicolour, a machine that people could interact with by experimenting with sound frequencies, different (changing) thresholds of which would produce lights. For this device (and others of Pask’s), defined informatic output was not the goal.[11]

To be honest, the cybernetic vision of the postwar order frightens me. As I see it, in the cybernetic vision the social world is flattened into a two-dimensional landscape of 1s and 0s, eviscerating all that is fluid and profound. Recent work in the (pre)history of applied mathematics and computing is raising new questions about the logic of inquiry and the institutional commitments that gave birth (or acted as midwives) to cybernetics (see Theodora Dryer’s work on input-output models and the prehistory of computing). We need more work like Theo’s that examines the interwar period in our studies of the intellectual and political foundations of the postwar social order. I must say that although Geoghegan states explicitly that “ideas about cybernetics, information, and computing on the rise in the 1940s belong to a program of techno-political reform that took shape across the 1920s and 1930s” (9), his discussion of this important period is limited to issues related to colonial anthropology and eugenics. Missing from this account is the enormous influence at this time of scientific management, the rationalization movement, and applied psychology, fields that epitomized techno-political reform. By narrowing the discussion to code, Geoghegan overlooks the ways interwar experiments in organizational efficiency and innovative communication strategies contributed to the development of cybernetics.

I would never have thought of Saussure and Jakobson as related to cybernetics, but if we include them as masters in the practice of coding, then I might have a more positive take on the cybernetic vision. For me, Saussure’s emphasis on the arbitrariness of symbols is absolutely crucial as a starting point for any sort of cultural analysis. It opens the door to a relativistic vision, a form of empathy that is required if one is to be intellectually honest when attempting to understand societies other than one’s own. But I hasten to add that it is foundational only insofar as one recognizes that symbols are necessarily historical artifacts. In other words, I consider the langue/parole dyad to be as vacuous as the cybernetic twins of 1s and 0s, and it is only in moving beyond the dyad to a world full of pragmatic missteps and productive neologisms that we actually may begin to comprehend complex histories of practice.[12]

Irrespective of their direct association with cybernetic projects, certain guiding principles of this approach remain relevant for modern anthropology. The development of analytical models to explore the connections between vastly different categories of facts within a systemic and process-oriented framework still stands as one of the key objectives of anthropology. A “metabolic” approach, as articulated by Hannah Landecker, serves as a fitting example, as does Valérie Olson’s description of a spaceship as a “system” wherein technical and vital processes are intricately intertwined.[13] For anthropology, particularly in the context of the anthropology of life, cybernetics continues to be a domain for examining the continuities and analogies between living and technical systems. It’s worth noting that my interest lies less in the specific outcomes of cybernetic projects, many of which, as the book illustrates, are now dated, than in an approach that underscores the human endeavor to comprehend the relationship between life and technology.[14]

It’s possible to interpret the initial chapters of Code as distinct phases in a broader effort to objectify the movements of life, a multifaceted phenomenon addressed in its biological, linguistic, social, cultural, and other dimensions. For anthropology and history, it is pertinent to trace the heuristic loops that connect the observation of living entities with the construction of technical systems that seek to emulate their characteristics. Every attempt to model life leads to new questions, generating processes for data production, processing, and visualization that, in turn, alter the way we observe living entities.

Beyond the dynamic parallels between life and technology, cybernetics provides a platform for investigating the interfaces between biological and socio-technical systems. From a perspective of continuity, the issue of control and governance, which lies at the core of the cybernetic project (a term derived from kubernêtikê, rooted in kubernân, meaning “to govern,” originally used in reference to steering a ship), remains pivotal in comprehending the contemporary world. Similar to living systems engaged in random evolutionary dynamics, technical and biotechnical systems are characterized by a certain level of uncertainty regarding their development, despite programming and anticipation efforts. Questions persist surrounding the governance and control of human socio-technical systems, where interactions and interferences among different subsystems are abundant. How can humans effectively intervene in these systems without introducing new layers of complexity that hinder or obstruct action? Can political action and value systems be encoded as variables dependent on the system, or do these sorts of human institutions possess an element of external influence, if not transcendence, in relation to the system? Code offers insights into each of these issues.

What I appreciate in the cybernetic framework is how it situates humans within a communicative relationship with non-human actors in understanding social systems and changes. This perspective challenges human-centrism in anthropology by looking at humankind as one subsystem among others constituting a bigger global system.[15] Humans are, in the words of Donna Haraway, one “companion species” among others.[16] The key concepts of feedback and circularity in cybernetics allow us to understand that humans do not only change the subsystems external to them but are also changed by them. This perspective resonates strongly with the nonhuman turn in twenty-first-century anthropology, which highlights the agency of the non-human world in shaping human cultures and behaviors. Anna Tsing, a prominent proponent of the approach, argues that more-than-human actors—in her case, fungi—have sociality in the sense that they react to, get transformed by, and entangle with both human and non-human others.[17]

Another aspect of cybernetics that I think is important is that it has a goal: the creation of stability. In the context of anthropology, I can see how this perspective can be used to address problems related to phenomena characterized by social instability, such as inequality. Although their work is not explicitly cybernetic, many anthropologists of infrastructure use a similar network-centered approach in explaining how technologies—made by humans—in turn create infrastructural and environmental racism. Understanding infrastructure as a network of people and things that facilitate circulation,[18] they study how various parts of the infrastructural system interact with each other and what outcomes emerge from that particular relationality. Nikhil Anand’s work is an important example. Acknowledging that pressure is both physical and social, he explains how unequal access to water emerges as a result of “a complex matrix of sociocultural relations” consisting of water pressure, pipes, pumps, and human bureaucracy.[19] Understanding infrastructure as a network therefore makes it possible for scholars to point out which parts of the infrastructural system need fixing to reduce inequalities.

Lastly, since cybernetics regards humans as systemically bound with technology, infrastructure, and the environment, the approach also challenges our understanding of what is considered internal and external when it comes to human bodies. The anthropologist Cassandra Hartblay writes that the framework “trouble[s] boundaries like human/tool, human/animal, animal/machine, body/mind, and physical/nonphysical.”[20] Scholars like Bateson and Haraway have also asked: What constitutes the body, and where does the body start? This critical discussion about what constitutes the “natural” can encourage anthropologists to expand their understanding of the body, especially in design and disability studies.

However, despite being an important model for answering large-scale questions, I do think that cybernetics, and its direct or indirect descendants, can benefit tremendously from reflexivity—both of observers toward themselves and toward their research subjects. As *Code* recounts, this computer-like model of society has historically been used to reduce the human dimension to a uniform subsystem devoid of cultural, political, and historical backgrounds. Geoghegan astutely argues that the global history of cybernetics and computing has included “the suffering, strife, and participation of persons deemed less than full citizens or subjects by the state” (8).

The lack of political analysis in Mead and Bateson’s thesis on schizophrenia exemplifies the absence of native subjectivity in their study. Both anthropologists were very much aware of the presence of the Dutch colonial power on the island. However, their ethnography did not regard politics as an important aspect in their cybernetic model while, in fact, it constituted a huge part of Balinese life. This does not mean that we should throw the baby out with the bathwater. Going forward, I think scholars explicitly and implicitly influenced by cybernetics should reflect critically on questions such as: Who is considered “human” in the cybernetic model? Whose bodies and whose experiences are sampled to formulate the model? And—as asked by second-order cyberneticians—how does the subjectivity of the observer affect the data collected? This reflection allows us to acknowledge that diverse life experiences—heavily shaped by culture, politics, and history—affect the outcome of any cybernetic model.

Notes

1 Virginia Eubanks, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (St. Martin’s Publishing Group, 2018).
2 Nick Seaver, Computing Taste: Algorithms and the Makers of Music Recommendation (University of Chicago Press, 2022), 55–58; Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (Public Affairs, 2019), 82.
3 Rebecca Lemov, “Into the Whirlpool: How Predictive Data Put Brainwashing on the Spin Cycle,” The Hedgehog Review 22, no. 2 (June 22, 2020): 74–86; Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (New York University Press, 2018); Cathy O’Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (New York: Crown, 2016).
4 Matthew L. Jones, “How We Became Instrumentalists (Again): Data Positivism since World War II,” Historical Studies in the Natural Sciences 48, no. 5 (2018): 673–84.
5 Shoshana Zuboff, “Surveillance Capitalism and the Challenge of Collective Action,” New Labor Forum 28, no. 1 (January 1, 2019): 30.
6 Virginia Eubanks, “Allegheny Algorithm,” in Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (St. Martin’s Publishing Group, 2018).
7 N. Katharine Hayles, How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics (Chicago: University of Chicago Press, 1999).
8 Francisco Varela, Evan Thompson, and Eleanor Rosch, The Embodied Mind: Cognitive Science and Human Experience (Cambridge, MA: The MIT Press, 1991).
9 John Duda, “Cybernetics, Anarchism and Self-Organization,” Anarchist Studies 21, no. 1 (2013): 52–72; Evgeny Morozov, “The Santiago Boys,” podcast (2023).
10 Gregory Bateson, “Lecture on Consciousness and Psychopathology, Part 2” (Naropa Institute, 1974).
11 Andrew Pickering, The Cybernetic Brain: Sketches of Another Future (Chicago: University of Chicago Press, 2010), 316.
12 For a somewhat different take on this point, see Martha Lampland, “Pigs, Party Secretaries, and Private Lives in Hungary,” American Ethnologist 18, no. 3 (1991): 459–79.
13 Hannah Landecker, “A Metabolic History of Manufacturing Waste: Food Commodities and Their Outsides,” Food, Culture & Society 22, no. 5 (October 20, 2019): 530–47; Valerie Olson, Into the Extreme: US Environmental Systems and Politics beyond Earth (University of Minnesota Press, 2018).
14 Perig Pitrou, “Life as a Making,” NatureCulture 4 (2017): 1–37.
15 Mihajlo D. Mesarović, David L. McGinnis, and Dalton A. West, Cybernetics of Global Change: Human Dimension and Managing of Complexity, MOST Policy Papers (Paris: United Nations Educational, Scientific and Cultural Organization, 1996).
16 Donna Haraway, “Encounters with Companion Species: Entangling Dogs, Baboons, Philosophers, and Biologists,” Configurations 14, no. 1 (2006): 97–114.
17 Anna Tsing, “More-than-Human Sociality: A Call for Critical Description,” in Anthropology and Nature, ed. Kirsten Hastrup (Routledge, 2013).
18 Brian Larkin, “The Politics and Poetics of Infrastructure,” Annual Review of Anthropology 42 (2013): 327–43.
19 Nikhil Anand, “PRESSURE: The PoliTechnics of Water Supply in Mumbai,” Cultural Anthropology 26, no. 4 (2011): 542–64.
20 Cassandra Hartblay, “Cyborg,” Theorizing the Contemporary, Fieldsights (March 29, 2018).