

The Social Dialectics of AI

Seminar on Text and Data Mining and Artificial Intelligence, 1/18/2024. Illustration by Andrzej Olas/Svensk biblioteksförening, CC BY-SA 4.0.

Pietro Daniel Omodeo is a professor of historical epistemology at Ca’ Foscari University of Venice, Italy, and holder of the UNESCO Chair on Water, Heritage and Sustainable Development in Venice. He is the author of Political Epistemology: The Problem of Ideology in Science Studies (2019).

Artificial intelligence (AI) is the omnipresent, often reified as omnipotent, innovation of our time. Hence, Matteo Pasquinelli’s new book, The Eye of the Master: A Social History of Artificial Intelligence (Verso, 2023), which provides a historical and dialectical conception of AI, is itself of crucial significance. Pasquinelli’s work is the culmination of years of research on the material history of science, the algorithmization of society, and the antagonism between capital and labor in the Anthropocene. He critically assesses all of this through the lens of Karl Marx and of subsequent theorists concerned with the historical epistemology of science and the labor process, supplemented to some extent with Italian operaismo.1

In our time, dominated by technocentric dreams of green and digital transitions, the neoliberal emphasis on entrepreneurial “opportunities” for the expansion of capital markets neglects an important transition aimed at social and ecological justice.2 This alienated political-economic imperative requires a critical perspective from below. Pasquinelli moves in the right direction, as the question of socialism is inscribed in the logic of the book. He looks at AI as a technology that is the expression of conflictual social relations and affects power, as all machines (or the means of production) do under capitalistic rule. As Marx wrote in Capital: “Instruments of labour not only supply a standard of the degree and development which human labour has attained, but they also indicate the social relations within which men work.”3

Pasquinelli’s approach to the history of science and technology follows in the footsteps of “social externalists” of science and historical epistemologists such as Boris Hessen, Henryk Grossman, Peter Damerow, and Jürgen Renn.4 As Pasquinelli explicitly states, The Eye of the Master aims to “study and evaluate these [multiple social] AI lineages from the (externalist) perspective of labour automation, rather than as (internalist) problems of computational logic, task performance, and human likeness.”5 The main aim of the book is to develop a de-ideologized and labor-centered analysis of the socioeconomic roots of the digital age by relying on a critical methodology, defined thus: “Historical epistemology is concerned with the dialectical unfolding of social praxis, instruments of labour, and scientific abstractions within a global economic dynamics.”6

This is a praxeological reinterpretation of Hessen’s program for a social history of science that explains science as the result of three main factors: political economy, technology, and ideology.7 In line with this approach, one can further assume that all technologies, including those of the digital age, are located at the intersection of social practices (a question of economics), science (tied to the knowledge component), and material culture and class struggles (the political axis). From a Marxist viewpoint, productive technologies should be understood as “fixed capital,” that is, as means of production, labor organization, and labor’s alienation. Pasquinelli applies these critical concepts to AI, among the most famous historical instances of which are Charles Babbage’s nineteenth-century calculating machines and Frank Rosenblatt’s mid-twentieth-century automation of statistical analysis.

Indeed, a technological epiphany constitutes the acme of the book: Frank Rosenblatt’s invention of the Perceptron in 1957. Pasquinelli devotes the final chapter to this self-organizing artificial system, comparable (for Rosenblatt) to a brain, which constitutes the material beginning of AI. The construction of the first technology for pattern recognition based on neural networks—and thus capable of learning or, to be precise, of machine learning—marked the beginning of a new technosocial course. The first prototype, the Mark I Perceptron, automated statistical analysis and used a trial-and-error method to learn how to recognize patterns. It developed a system of pattern recognition from its sensor, a twenty-by-twenty-pixel camera with four hundred photoreceptors, through a three-step process moving from sensory units to associative units, and on to response units, which followed a binary classificatory logic.
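To make the Perceptron’s trial-and-error procedure concrete, the following is a minimal sketch, in Python, of the classic perceptron error-correction rule on a flattened twenty-by-twenty binary input. It illustrates only the learning principle, not Rosenblatt’s Mark I hardware; the number of associative units, the random sensory-to-associative wiring, and the toy classification task are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SENSORY = 20 * 20       # a flattened 20x20 "retina" of 400 photoreceptors
N_ASSOCIATIVE = 64        # number of associative units (an assumption for this sketch)

# Fixed, untrained wiring from sensory to associative units. In the Mark I these
# connections were random and left unchanged; only the response weights learned.
projection = rng.choice([-1.0, 0.0, 1.0], size=(N_SENSORY, N_ASSOCIATIVE))

weights = np.zeros(N_ASSOCIATIVE)   # trainable weights of the single response unit
bias = 0.0


def associative(x):
    """Map a 400-pixel binary image to binary activations of the associative units."""
    return (x @ projection > 0).astype(float)


def respond(x):
    """Binary response unit: fires (1) or stays silent (0) depending on a threshold."""
    return 1 if associative(x) @ weights + bias > 0 else 0


# Toy training set (an arbitrary, linearly separable task invented for the demo):
# an image belongs to class 1 if its first 200 pixels are brighter than the rest.
images = rng.integers(0, 2, size=(200, N_SENSORY)).astype(float)
labels = (images[:, :N_SENSORY // 2].sum(axis=1) >
          images[:, N_SENSORY // 2:].sum(axis=1)).astype(int)

# Trial-and-error learning: weights change only when the response is wrong.
for epoch in range(20):
    for x, y in zip(images, labels):
        error = y - respond(x)
        if error != 0:
            weights += error * associative(x)
            bias += error

accuracy = np.mean([respond(x) == y for x, y in zip(images, labels)])
print(f"training accuracy: {accuracy:.2f}")
```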

According to Pasquinelli, the computerization of statistical tools incorporated psychometric techniques for measuring intelligence and cognitive skills, a field in which Rosenblatt was a passionate researcher. These metrologies implied a reductionist understanding of the mind, reducing it to a set of quantifiable skills—thus introducing a prejudice that can be seen as the “original sin” of AI. Furthermore, the program to quantify cognition was part of a science, psychometrics, specifically devised for social normalization. Thus, Pasquinelli’s political-epistemological critique points to the fact that classificatory biases, far from constituting an arbitrary factor in the development of apparently neutral technologies, are structurally embedded in AI. Indeed, machines learn how to classify according to reified cultural categories (class ideologies concerning social relations, race, gender, and so on): “Since the Turing test, machines have been judged as ‘intelligent’ by comparing their behaviour with social conventions.”8

Starting from the end of Pasquinelli’s story—that is, Rosenblatt’s technological achievement—I will examine The Eye of the Master in reverse (à rebours) in order to reorganize its account as an archeology of AI. The narrative unfolds following the developmental logic of AI, from very general considerations on the emergence of algorithms from calculating abstractions in antiquity and on the emergence of computers in modern times, to the organization of labor under capitalism since the Industrial Revolution. A reverse reading of the three sections allows for a better illustration of how Pasquinelli traces the genesis of AI in the cultural-scientific settings of the twentieth century (covered in the final part, “The Information Age”) and, moving further back, in the golden age of English industrialization and class struggles (found in the first part, “The Industrial Age”), thus connecting the history of AI to a more general history of labor, technology, and knowledge extraction (described in the introduction). Reading in this order highlights the force of Pasquinelli’s proposal of a labor theory of knowledge that reverses widespread myths about the knowledge economy on the basis of a historical-materialist inquiry.

The digital age inaugurated by the Perceptron is the focus of the second and final section of the book. Pasquinelli here explores the ideas and the technological practices on which AI hinges. Three ideas are shown to be crucial: first, the fixation with the biological metaphor of the neural network; second, the recurring problem of pattern recognition as the test case for intelligence; and third, “connectionism” and “autonomy” as the two interconnected pillars of the epistemological paradigm (and the ideology) of AI.

Regarding AI’s fixation with neural networks, Pasquinelli presents this metaphor as a legacy of neuropsychiatrist Kurt Goldstein’s and psychobiologist Donald Hebb’s idea of neuroplasticity, which could be transferred from brain physiology to machines.9 In an often-quoted paper from 1943, “A Logical Calculus of the Ideas Immanent in Nervous Activity,” a pathbreaking text that appeared before the construction of modern computing engines, cyberneticians Warren McCulloch and Walter Pitts (a neurophysiologist and a mathematician, respectively) advanced the idea that neurons could be imitated by technological means. This was the original impetus behind developing an AI that reproduces brain functions. But, as Pasquinelli remarks, the paper’s authors did not imitate nature, as they claimed. Rather, they reinterpreted the neurons in technological terms, more specifically by analogy with the electrical circuits that engineer Claude Shannon had devised to reproduce Boolean binary logic operations technically.10
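The point can be illustrated with a small sketch (the weights and thresholds below are chosen by hand for the example): a McCulloch-Pitts unit is nothing more than a threshold over weighted binary inputs, which is precisely why it can reproduce the Boolean operations of Shannon’s switching circuits rather than any biological detail of the neuron.

```python
def mcculloch_pitts(inputs, weights, threshold):
    """A McCulloch-Pitts unit: fires (1) if and only if the weighted sum of its
    binary inputs reaches the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0


# Boolean operations realized as threshold units (weights and thresholds chosen by hand).
AND = lambda a, b: mcculloch_pitts([a, b], weights=[1, 1], threshold=2)
OR = lambda a, b: mcculloch_pitts([a, b], weights=[1, 1], threshold=1)
NOT = lambda a: mcculloch_pitts([a], weights=[-1], threshold=0)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))
print("NOT 0:", NOT(0), "NOT 1:", NOT(1))
```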

Furthermore, while the idea of neural networks stems from a technological reinterpretation of physiology, pattern recognition—another pillar of AI—originates from the psychology of perception, more precisely from Gestalt psychology. Pasquinelli calls this a “cognitive fossil” of Gestalt theories that was translated into a statistical topography technology.11 The original insistence on pattern recognition derives from a challenge that Gestalt scholars leveled against early cyberneticians’ programs of machine intelligence. The Gestalt psychologists defended the irreducibility of human intelligence and its “complex synthetic faculty.”12 The cybernetic answer (from Norbert Wiener, Jerome Lettvin, Humberto Maturana, and others) transferred the debate onto the computational terrain by arguing that a logical representation need not be isomorphic with respect to the represented object of cognition. That is, representation need not mirror the perceived shape, but can simply translate it into bits of information. Cyberneticians focused on the physiology of the eye because this constituted an instance of perceptive synthesis that does not require an initial intervention of the human mind. Rather, the organ of vision receives and transmits information in a synthesized manner to the brain, independently of the latter’s capacity to interpret the signal. In other words, the synthetic function is not accomplished by the brain alone, since it is already anticipated by the eye. Therefore, there is no compelling reason why the codification of information should bear any similarity to the referent.13
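A toy sketch of that argument, in which the five-by-five grid and the letter shapes are assumed purely for illustration: once a perceived shape is flattened into a string of bits, recognition can proceed by comparing codes (for instance, by counting mismatched bits), even though the code bears no visual resemblance to the shape it stands for.

```python
import numpy as np

# Two 5x5 "shapes" as binary grids: a letter T and a letter L.
T = np.array([[1, 1, 1, 1, 1],
              [0, 0, 1, 0, 0],
              [0, 0, 1, 0, 0],
              [0, 0, 1, 0, 0],
              [0, 0, 1, 0, 0]])
L = np.array([[1, 0, 0, 0, 0],
              [1, 0, 0, 0, 0],
              [1, 0, 0, 0, 0],
              [1, 0, 0, 0, 0],
              [1, 1, 1, 1, 1]])


def encode(shape):
    """Translate a 2D shape into a flat string of bits: the code has no visual
    resemblance to the figure; it is simply information."""
    return shape.flatten()


def classify(unknown, prototypes):
    """Recognize a pattern by the smallest number of mismatched bits (Hamming distance)."""
    return min(prototypes,
               key=lambda name: int(np.sum(encode(prototypes[name]) != encode(unknown))))


# A noisy T (one flipped pixel) is still recognized as a T.
noisy_T = T.copy()
noisy_T[4, 4] = 1
print(classify(noisy_T, {"T": T, "L": L}))  # -> T
```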

Moreover, a component of the AI discourse is the ideology of autonomy, seen as a self-regulatory capacity of the brain that can be reproduced by artificial neurons. Liberal thinkers saw this physiological capacity to establish bottom-up connections as a more general principle of nature and society, one that also accounts for the alleged self-organization of the economy. A champion of free-market autonomy such as Friedrich Hayek argued that the market cannot be regulated. In order to support his argument, he developed a full-fledged theory of connectionism, an epistemological apology for the “spontaneous” order of markets.14 For Pasquinelli, this theory strongly impacted the ideology of AI, as it still constitutes “the paradigm of artificial neural networks.” As he explains, “Hayek stole pattern recognition and transformed it into a neoliberal principle of market regulation.”15 To be sure, naturalization is the most accomplished form of ideology, as it reifies social relations. Yet, Hayek’s vision seems even to transcend nature in favor of a quasi-theological idea of spontaneous providentialism that is reminiscent of the Smithian invisible hand. Whether the unity of a complex system—the brain, the economy, or the market—can be grasped and directed is an issue that connects epistemology and politics, as is evident from Hayek’s work on “connectionism.” For Hayek, the market is an epistemological space, as it depends on knowledge in the form of exchanges of information (for example, for the determination of prices). Accordingly, the tacit knowledge that regulates it is supraconscious. Therefore, it is not accessible to the actors, and nobody can possibly direct it. This position presupposes the heteronomy of social developments.16 It clearly promotes alienation. The market itself appears as the sole driver of societal processes. Yet, an alternative analysis and critique of heteronomy and alienation exists, one that centers not on consumption but on production. This alternative is the conception underlying Marx’s attention to the goal-oriented collective praxis of workers’ activities in the factory.

Technological heteronomy and antagonisms in the factory are discussed in the first part of The Eye of the Master, with special attention to the nineteenth century. At that time, especially in Great Britain, an understanding of machines emerged that saw them as material abstractions of labor activities and as a technological modeling of the division of labor. Although (in Marxian terms) living labor has a genetic priority with respect to dead labor, the former is subordinated to the latter as an effect of an asymmetric power relation. In line with this conception, Pasquinelli observes that “the social relations of production (the division of labour within the wage system) drive the development of the means of production (tooling machines, steam engines, etc.) and not the other way around, as technodeterministic readings have been claiming then and now by centering the Industrial Revolution around technological innovation only.”17

Hessen’s sociology of science looms large over the analyses in the first part of The Eye of the Master, which centers on labor, technology, and knowledge extraction. Drawing on Hessen’s exemplary work on the socioeconomic, technological, and ideological conditions of Isaac Newton’s mechanics, rooted in the economic settings of early modern capitalist society, Pasquinelli addresses a question that can be reformulated as follows: What are the socioeconomic roots of AI? He seeks an answer by first inserting the history of computational machines into the longer history of mechanics and, at a more fundamental level, into the history of the labor that machines remodel (as dead labor), organize, and direct. More specifically, for the purpose of The Eye of the Master, Pasquinelli undertakes a “reformulation of nineteenth-century labour theory of automation for the age of AI.”18

The connection between mechanical work and management is at the heart of the theories and inventions of Babbage, who dreamed of mechanizing mental labor in a manner similar to the mechanization of physical labor in the factories of his time. Babbage took the first steps in a direction that anticipated the cognitive machines of AI. His Difference Engine for the calculation of logarithmic tables can be considered the prototype of the modern computer, but he also envisioned the possibility of a universal computer, an Analytical Engine, which inspired the first computer program, written by mathematician Ada Lovelace.19 In Babbage’s eyes, the task of his engines was to reproduce and speed up computation in the general framework of industrial production and the division of labor. As Pasquinelli explains, mechanization efforts rested on two guiding principles: (1) the mechanical imitation and replacement of already-established labor practices; and (2) the quantification and purchase of labor by means of the mechanized division of labor.20 In his industrial vision of mechanization, “the division of labour provides not only the design of machinery but also of the business plan.”21
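As a brief sketch of the principle the Difference Engine mechanized, the method of finite differences, the following example tabulates a polynomial by additions alone, the kind of repetitive mental labor Babbage sought to transfer to machinery; the particular polynomial is an arbitrary choice for the example, not one of Babbage’s tables.

```python
def difference_engine(initial_values, n_steps):
    """Tabulate a polynomial by repeated addition of finite differences, the
    principle mechanized by Babbage's Difference Engine.

    initial_values: the first value of the function followed by its successive
    finite differences (constant for a polynomial)."""
    registers = list(initial_values)
    table = [registers[0]]
    for _ in range(n_steps):
        # Each register adds the one to its right: additions only, no multiplication.
        for i in range(len(registers) - 1):
            registers[i] += registers[i + 1]
        table.append(registers[0])
    return table


# Example: f(x) = x**2 + x + 1 for x = 0, 1, 2, ...
# f(0) = 1, first difference f(1) - f(0) = 2, second difference = 2 (constant).
print(difference_engine([1, 2, 2], n_steps=8))
# -> [1, 3, 7, 13, 21, 31, 43, 57, 73]
```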

Visions of the mechanical organization of physical and mental labor clashed with workers’ resistance to the debasement of their activity by machines that augmented production and increased the owners’ profits while making the condition of the working class vulnerable and many of their skills dispensable. Pasquinelli focuses on the social problem of mechanization in his third chapter, on “The Machinery Question.” In connection with mechanization, the problem of technoscience in an industrial economy coincides with the objectification of labor that ends up dominating the workers.22 These considerations deepen our awareness of the non-neutrality of science, denounced since the 1960s by militant scientists of the left in works such as Science and Society (1970) by Hilary Rose and Steven Rose or The Bee and the Architect (1976) by Marcello Cini and other Marxist physicists, who argued that science and technology reinforce inequality if they emerge from the asymmetric power relations of capitalist society.23 In line with these analyses, The Eye of the Master reminds us that there can be no room for utopias of techno-emancipation if social justice is not first achieved.

In order to assess the social function of AI, Pasquinelli promotes a labor-centered view of the knowledge economy of the Anthropocene. He resorts to a classic reference: Marx’s Grundrisse, in particular the so-called fragment on machines. The Eye of the Master explicitly addresses the problem of the General Intellect as presented in the Grundrisse and interprets it as a contribution to the study of the question of the knowledge element of society in the industrial age.24 It is interesting to note that Marx derived from Babbage the idea that labor is the basis for technology, which, in turn, models it. However, reversing Babbage’s perspective, that of the master, Marx considered labor to be the real collective inventor of machines, going against myths of individual invention and claims of capitalists’ ownership.25 Yet, under unequal social conditions, once the machine is created and the knowledge it incorporates is codified, the workers become the machine’s object and lose their dignity as subjects of knowledge and action. The cumulative history of knowledge is paired with the cumulative history of machines. The political task, as Marx indicated to the expropriated workers, is to reappropriate both knowledge and the means of production, that is, to de-alienate the Gesamtarbeiter of Capital—the “super-organism” or “collective working organism”—which connects workers and machines in the factory and, today, in society at large.26

Pasquinelli further analyzes the technological codification of labor, that is, the epistemic factor of production, which, according to the main thesis of the fifth chapter, is created through the functional separation of energy (directly related to the physical side of labor) and information in the quasi-cyborg reality of the industrial age.27 The mechanical modeling and organization of labor, which can be called “abstract labor,” makes quantification and control (the pillars of cybernetics) possible and creates the illusion of a technological solution to the social antagonism between workers and capital. Indeed, Pasquinelli conceives of technological modeling, from Babbage’s engines to post-Rosenblatt AI, as a form of intelligence extractivism. The mechanization of labor (physical and cognitive alike) makes the production process ungraspable (or “supraconscious,” in Hayek’s expression), and fosters alienation by excluding the workers from the possibility of planning and directing production. Therefore, for a political analysis of AI, it is important to keep in mind that “what information comes ultimately to measure and mediate is the antagonism between workers and capital.”28 Such antagonism, far from being segregated in the factory, concerns the entire society, which has been transformed, according to a thesis by operaist Mario Tronti, into the expanded theater of production: society as the expanded factory.29 Hence, the Gesamtarbeiter—Marx’s quasi-cyborg result of the connection between workers and machines—is the alienated humanity of capitalist control societies, integrated through AI infrastructures. These are the components of a “carbon-silicon automaton.”30 AI embodies the knowledge element of the societal cyborg; more specifically, AI is the automation of the master’s supervision: the eye of the master.

Pasquinelli’s “genealogy of labour automation, social control and knowledge extractivism” discloses the longue-durée premises of intelligence and labor—the very ancient roots of AI, as it were—in the first chapter.31 He discusses the most essential concept of computer science, the algorithm.32 An algorithm, “a finite procedure of step-by-step instructions to turn an input into an output making the best use of the given resources,” is, at its core, labor.33 Indeed, all labor, from antiquity up through the digital age and AI, has an intellectual component. Drawing on insights by the Hegelian-Marxist scholar of education Damerow, Pasquinelli sees the emergence of all forms of knowledge as a dialectic of abstraction and representation stemming from individual and collective practices.34 Such abstraction is always the expression of praxis, that is, of societal antagonisms and conjunctural balances of forces. Against fashionable technocentrisms and opportunistic ideologies of technological determinism (in the neoliberal discourses on the digital and ecological transitions), one can respond, as Pasquinelli does in the conclusion of The Eye of the Master, as follows: “to affirm…that labour is a logical activity is not a way of abdicating to the mentality of industrial machines and corporate algorithms, but rather of recognizing that human praxis expresses its own logic…a power of speculation and invention, before technoscience captures and alienates it.”35 AI, the most advanced technological expression of the intelligence inscribed in human activity, sheds light on the intellectual component of all labor in all ages, including manual and physical activities that one would hardly have conceived of as intellectual until recent debates. The difficulty of imitating the skills of a truck driver through the application of AI to self-driving vehicles, one of the frontiers of intelligence today, exemplifies the mental complexity of work in general, and confirms the validity of Antonio Gramsci’s assertion: “all human beings are intellectuals…although not all human beings have in society the function of intellectuals.”36
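As a minimal illustration of the definition quoted above (a finite, step-by-step procedure turning an input into an output with the given resources), here is Euclid’s ancient procedure for the greatest common divisor, sketched in Python.

```python
def euclid_gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite sequence of step-by-step instructions that turns
    two integers (the input) into their greatest common divisor (the output), using
    nothing but division with remainder on the given quantities."""
    while b != 0:
        a, b = b, a % b
    return a


print(euclid_gcd(1071, 462))  # -> 21
```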

Notes

  1. Matteo Pasquinelli, The Eye of the Master: A Social History of Artificial Intelligence (London: Verso, 2023) takes upon itself the task delineated by Marxist operaist Romano Alquati, namely, to show that “any technological innovation, including cybernetics, always embodies the power relations and class antagonism of a given historical moment and that for this reason it should be the focus of study.” Pasquinelli’s most important contributions to these topics are the edited volume Gli algoritmi del capitale: accelerazionismo, macchine della conoscenza e autonomia del comune (Verona: Ombre corte, 2014); “Italian Operaismo and the Information Machine,” Theory, Culture and Society 32, no. 3 (2015): 49–68; “The Automaton of the Anthropocene: On Carbosilicon Machines and Cyberfossil Capital,” South Atlantic Quarterly 116, no. 2 (2017): 311–26; and “On the Origins of Marx’s General Intellect,” Radical Philosophy 2, no. 6 (2019): 43–56. Pasquinelli currently leads the project AI MODELS: Advancing the Historical Epistemology of Artificial Intelligence at Ca’ Foscari University of Venice, Italy.
  2. On the transition to socialism as an essential component of ecological action, see Naomi Klein, This Changes Everything (London: Penguin Books, 2015) and John Bellamy Foster, Capitalism in the Anthropocene (New York: Monthly Review Press, 2022). Technocratic dreams of ecodigital growth have occupied center stage at the most recent World Economic Forum in Davos, where European Central Bank President Christine Lagarde presented the green and digital transition as an investment “opportunity” requiring €620 billion per year for the green transition and €120 billion per year for “the digitalization that we need,” adding: “I believe that artificial intelligence can help” (“ECB President Christine Lagarde on Uniting Europe Markets at WEF,” Associated Press video, 46:14, January 18, 2024).
  3. Karl Marx, Capital, vol. 1 (London: Penguin, 1976), 286, quoted in Pasquinelli, The Eye of the Master, 238.
  4. Also see Pietro Daniel Omodeo, Political Epistemology: The Problem of Ideology in Science Studies (Cham: Springer, 2019), especially chapter 5.
  5. Pasquinelli, The Eye of the Master, 232. Compare Steven Shapin, “Discipline and Bounding: The History and Sociology of Science as Seen through the Externalism-Internalism Debate,” History of Science 30, no. 4 (1992): 333–69.
  6. Pasquinelli, The Eye of the Master, 13.
  7. Gideon Freudenthal and Peter McLaughlin, eds., The Social and Economic Roots of the Scientific Revolution (Dordrecht: Springer, 2009). Also see Boris Hessen, Manuscripts and Documents on the History of Physics: A Historical Materialist Textbook, Pietro Daniel Omodeo and Sean Winkler, eds. (Venice: Verum Factum, 2022).
  8. Pasquinelli, The Eye of the Master, 227.
  9. Pasquinelli, The Eye of the Master, chapter 6.
  10. Pasquinelli, The Eye of the Master, 136.
  11. Pasquinelli, The Eye of the Master, chapter 7. See, by way of comparison, Pasquinelli, The Eye of the Master, 165: “machine vision ‘sees’ nothing: what an algorithm ‘sees’—that is, calculates—are topological relations among numerical values of a two-dimensional matrix.”
  12. Pasquinelli, The Eye of the Master, 162.
  13. Pasquinelli, The Eye of the Master, 173, 174–75.
  14. Pasquinelli, The Eye of the Master, chapter 8.
  15. Pasquinelli, The Eye of the Master, 183.
  16. Pasquinelli, The Eye of the Master, 187, 190.
  17. Pasquinelli, The Eye of the Master, 82.
  18. Pasquinelli, The Eye of the Master, 238.
  19. Pasquinelli, The Eye of the Master, 56.
  20. Pasquinelli, The Eye of the Master, chapter 2.
  21. Pasquinelli, The Eye of the Master, 63.
  22. Pasquinelli, The Eye of the Master, 85–86.
  23. Giovanni Ciccotti, Marcello Cini, Michelangelo De Maria, and Giovanni Jona-Lasinio, The Bee and the Architect: Scientific Paradigms and Historical Materialism, Gerardo Ienna and Pietro Daniel Omodeo, eds. (Venice: Verum Factum, 2024). For a recent contribution to the critical analysis of the political economy of scientific abstractions in a non-Eurocentric perspective, see Senthil Babu D., Mathematics and Society: Numbers and Measures in Early Modern South India (New Delhi: Oxford University Press India, 2022).
  24. Pasquinelli, The Eye of the Master, chapter 4.
  25. Pasquinelli, The Eye of the Master, 108.
  26. Pasquinelli, The Eye of the Master, 114, 116.
  27. Pasquinelli, The Eye of the Master, 121.
  28. Pasquinelli, The Eye of the Master, 130.
  29. Pasquinelli, The Eye of the Master, 128.
  30. Pasquinelli, “The Automaton of the Anthropocene”; Pasquinelli, The Eye of the Master, 117.
  31. Pasquinelli, The Eye of the Master, 233.
  32. Pasquinelli also remarks that, in several European languages, information rather than computation (or algorithms) has been central in the understanding of computer science as IT, or information technology.
  33. Pasquinelli, The Eye of the Master, 16.
  34. Pasquinelli, The Eye of the Master, 38.
  35. Pasquinelli, The Eye of the Master, 238.
  36. Pasquinelli, The Eye of the Master, 29, author’s translation. For the original, see Antonio Gramsci, Quaderni del carcere, vol. 3 (Torino: Einaudi, 2007), notebook 12 (XXIX) §1, 1516: “Tutti gli uomini sono intellettuali…ma non tutti gli uomini hanno nella società la funzione di intellettuali.” For an alternative translation, see Antonio Gramsci, Selections from the Prison Notebooks (New York: International Publishers, 1971), 9.
2024, Volume 76, Number 06 (November 2024)
