Critical Philosophy of the Postdigital.
The postdigital does not describe a situation, condition or event after the digital. It is not a chronological term but rather a critical attitude (or philosophy) that inquires into the digital world, examining and critiquing its constitution, its theoretical orientation and its consequences. In particular, it addresses the conditions of digitality and the ideology of digitalism, the idea that everything can be understood without loss of meaning in digital terms (see Jandric et al., 2018). We call this the "critique of digital reason", which has application not only in social theory and the theory of hyper-control but also in music, art and aesthetics, where it is concerned to humanize digital technologies. The critique of digital reason has two elements: first, the mathematico-technical control systems that are part of the emerging global digital infrastructure within which we now exist, and second, the political economy of these systems--their ownership, acquisition and structure. It also refers to the convergence and marriage of the two dominant world-historical forces of digital and biological systems and the ways in which together they constitute the unsurpassable horizon for existence and becoming--the species evolution of homo sapiens and life in general, and the colonization of space.
Postdigital aesthetics is a term that has gained a certain currency after the collection of the same title, Postdigital Aesthetics: Art, Computation and Design, edited by David M. Berry and Michael Dieter (2015), on a new aesthetic of resistance against the digital and the return to modernism and old media. Christian Ulrik Andersen, Geoff Cox, and Georgios Papadopoulos (2014), in their joint editorial to a Special Issue on postdigital research in A Peer-reviewed Journal About--which is "an open-access research journal that addresses the ever-shifting thematic frameworks of digital culture" (APRJA, 2018)--provide a common working definition of the postdigital:
Post-digital, once understood as a critical reflection of 'digital' aesthetic immaterialism, now describes the messy and paradoxical condition of art and media after digital technology revolutions. 'Post-digital' neither recognizes the distinction between 'old' and 'new' media, nor ideological affirmation of the one or the other. It merges 'old' and 'new,' often applying network cultural experimentation to analog technologies which it reinvestigates and re-uses. It tends to focus on the experiential rather than the conceptual. It looks for DIY agency outside totalitarian innovation ideology, and for networking off big data capitalism. At the same time, it already has become commercialized. (Andersen, Cox, & Papadopoulos, 2014)
As Florian Cramer puts it in his article "What Is 'Post-Digital'?", included in the same issue (and later in the book) (Cramer, 2015), post-digital is "a term that sucks but is useful", and he goes on to provide a list of characteristics:
1. disenchantment with 'digital,' 2. revival of 'old' media, followed by a number of headings (numbered here but not in the original) that are just as revealing: 3. post-digital = postcolonial; post-digital [not equal to] post-histoire, 4. 'digital' = sterile high-tech?, 'digital' = low-quality trash?, 5. post-digital = against the universal machine, 6. post-digital = post-digitisation, 7. post-digital = anti-'new media,' 8. post-digital = hybrids of 'old' and 'new' media, 9. post-digital = retro?, 10. DIY vs. corporate media, rather than 'new' vs. 'old' media. (Cramer, 2015)
Cramer is outlining a new aesthetics in terms of "semiotic shift to the indexical" (although technically, he notes, "there is no such thing as 'digital media'" or "digital aesthetics") and, most importantly, "the desire for agency." This certainly is retro and modernist, and represents a critical rejection of the anonymous digital systems driven by the logic of big data and AI that can easily eclipse the agency of the individual artist.
We espouse a postdigital philosophy built on the radical interaction of the "new biology" and informationalism, which we refer to as bio-informational capitalism (Peters, 2012a; see also Peters and Jandric, 2018), and which incorporates quantum computing together with three configurations: complexity theory, deep learning, and algorithmic capitalism. This paper is an amalgam of an evolving agenda: it draws on work from recent papers on the postdigital (Jandric et al., 2018), cybernetics (Peters, 2014), deep learning (Peters, 2018) and algorithmic capitalism (Peters, 2017), and brings the ideas together here with new material at the beginning and end of the essay. In reality these aspects are part of a broader and interconnected perspective.
Quantum computing uses the laws and processes of quantum mechanics to process information. While traditional computers operate through instructions that use a binary system represented by the numbers 0 and 1 (representing the "off" or "on" of a logic gate on an integrated circuit), quantum computers use quantum bits, or qubits, to encode information as 0s, 1s, or both at the same time, representing the control of the flow of energy through the circuit. The superposition of states in quantum computing, together with both entanglement and tunnelling, allows quantum computers to manipulate enormous combinations of states at any moment. Quantum theory is the attempt to describe the behaviour of matter and energy at this subatomic scale. Experiments in the early twentieth century suggested that very small particles like photons and electrons can behave either like a wave or like a particle under different circumstances, and that there are precise limits on the accuracy with which pairs of quantities can be known (the Uncertainty Principle). Quantum theory has no entirely satisfactory interpretation. The Copenhagen interpretation, first proposed by Niels Bohr and Werner Heisenberg in 1925-27, holds that the nature of quantum mechanics is probabilistic and will never be replaced by a deterministic theory, thus threatening the classical idea of causality of physical systems and the notion of scientific realism.
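The difference between bits and qubits can be made concrete with a toy simulation. The sketch below, in plain Python, is a simplified state-vector model rather than anything resembling real quantum hardware: it represents one qubit as two complex amplitudes, places it into superposition with a Hadamard gate, and reads out measurement probabilities via the Born rule.

```python
import random

# A single qubit as a 2-entry complex state vector |psi> = a|0> + b|1>.
# A classical bit is 0 or 1; a qubit carries amplitudes for both at once.

def hadamard(state):
    """Apply the Hadamard gate, which puts a basis state into equal superposition."""
    a, b = state
    s = 2 ** -0.5
    return [s * (a + b), s * (a - b)]

def probabilities(state):
    """Born rule: measurement yields |0> or |1> with probability |amplitude|^2."""
    return [abs(amp) ** 2 for amp in state]

def measure(state, rng=random.random):
    """Collapse the superposition into a single classical outcome (0 or 1)."""
    p0, _ = probabilities(state)
    return 0 if rng() < p0 else 1

qubit = [1 + 0j, 0 + 0j]      # start in the definite state |0>
qubit = hadamard(qubit)       # now an equal superposition of |0> and |1>
print(probabilities(qubit))   # both outcomes equally likely
```

Simulating n qubits this way requires tracking 2^n amplitudes, which is precisely why classical simulation of quantum computation becomes intractable as systems grow, and why the "enormous combinations of states" noted above matter.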
As the U.S. House of Representatives Committee on Energy and Commerce hearing entitled "Disrupter Series: Quantum Computing" (2018) puts it:
The wave-particle duality described in the HUP [Heisenberg Uncertainty Principle] lies at the heart of quantum mechanics. A consequence of the theory is that at a fundamental level matter and light can only be described probabilistically; it is impossible to know both the position and momentum of a quantum object because the object exists in all possible states simultaneously until it is measured (or observed)--a concept known as superposition.
The background report also briefly mentions the core concept of quantum entanglement, which posits that a change of state in one particle will necessarily involve a change in its twin or related particle, an understanding that has led to string theory.
Applying quantum theory, quantum computing performs certain computational tasks exponentially faster than classical computing. The report cites Joseph Altepeter:
Quantum computers are fundamentally different from classical computers because the physics of quantum information is also the physics of possibility. Classical computer memories are constrained to exist at any given time as a simple list of zeros and ones. In contrast, in a single quantum memory many such combinations--even all possible lists of zeros and ones--can all exist simultaneously. During a quantum algorithm, this symphony of possibilities split and merge, eventually coalescing around a single solution. (Altepeter, 2010)
The upshot is that both quantum mechanics and quantum computing differ fundamentally from classical mechanics and classical computing around the values of indeterminacy/determinacy and anti-realism/realism that are highlighted by a probabilistic universe.
Globally, some $2.2 billion has been invested in quantum computing by IBM, QxBranch LLC, IonQ, Google, MagiQ Technologies, Rigetti Computing, and Station Q-Microsoft. The EU recently sponsored research funding of over $1 billion for quantum computing, and China has committed $20 billion to a national laboratory for quantum sciences. The report continues:
The Chinese companies Baidu, Alibaba Group Holdings, and Tencent Group Holdings are also working on quantum computers, with Alibaba announcing in February 2018 the opening of a quantum computing cloud platform for researchers that operates on an 11-qubit processor. (U.S. House of Representatives Committee on Energy and Commerce, 2018)
The potential of quantum computing "to process multiple calculations simultaneously makes it particularly well suited to some of the most complex problems faced by programmers", including the "optimization" problem and machine learning that detects patterns in large datasets.
We are at the edge of postdigitality in quantum computing, which differs from classical computing not only in its multiple new uses but in a fundamentally different perception of the world. This is how Michael Brett, Chief Executive Officer of QxBranch, Inc., expresses the point in his testimony to the Committee on Energy and Commerce:
Quantum computers are not just a faster computer. They enable an entirely different approach to performing calculations--an approach that asks the question, what if we go beyond the limits of 'classical' computers and into the subatomic, or quantum realm, to perform computational work? It turns out that this is possible, and there are some incredible and surprising phenomena like superposition and entanglement that allow us to solve some interesting--and practically unsolvable--problems like simulating the interactions among molecules as they grow in size, since they exhibit exponential growth in complexity. (U.S. House of Representatives Committee on Energy and Commerce, 2018)
Brett indicates that "there are broadly three classes of application that become possible in the near-term":
1. Optimization problems--like transport and logistics routing, production streamlining, and financial portfolio optimization; 2. Machine learning--accelerating the most computationally expensive part of training artificial intelligence systems to detect patterns in large and complex data; and 3. Chemical simulation--using a quantum computer to simulate the behavior of molecules and materials, a quantum process that is extremely challenging to simulate using classical computers. (U.S. House of Representatives Committee on Energy and Commerce, 2018)
There is little doubt about the promise quantum information science holds for next-generation computing and processing. Most effort has gone into harnessing development at the level of apps for business in the globally competitive economy. Little if any thought has gone into the broader philosophy and the ways in which quantum information science will fundamentally alter the conditions of society.
Complexity Theory (1)
Cybernetics is also broadly related to systems philosophy and theory and, as Charles Francois (1999: 203) notes, both function as "a metalanguage of concepts and models for transdisciplinarian use, still now evolving and far from being stabilized." Francois (1999) provides a detailed history of systemics and cybernetics in terms of a series of historical stages. First, Precursors (before 1948)--the "Prehistory of Systemic-Cybernetic Language"--going back to the Greeks and to Descartes in the modern world and ranging across the disciplines, with important work in philosophy, mathematics, biology, psychology, linguistics, physiology, chemistry and so on (Hartmann, Leibnitz, Bernard, Ampere, Poincare, Konig, Whitehead, Saussure, Christaller, Losch, Xenopol, Bertalanffy, Prigogine).
Second, "From Precursors to Pioneers (1948-1960)," beginning with Wiener, who aimed to address the problem of prediction and control and the importance of feedback for corrective steering, and mentioning Shannon and Weaver's (1949) Mathematical Theory of Communication, Von Bertalanffy's 1950 paper "An Outline of General System Theory," Kenneth Boulding's (1953) "Spaceship Earth," von Neumann's theory of automata, von Foerster's biological computer and his collaborators like Ashby (1956), Pask (1975) and Maturana, who pursued questions in human learning, autopoiesis and cognition (Maturana and Varela, 1980). Francois (1999) rightly devotes space to Prigogine (1955) on systemics and his escape from the assumptions of thermodynamic models towards understanding dissipative structures in complex systems. Prigogine had an interest in time derived from the philosopher Bergson, and later from the physicists Boltzmann and Planck; he developed a theorem on examples of systems which were highly organized and irreversible and applied it to the energetics of embryological evolution. His work in irreversible phenomena theory also led him to reconsider their insertion into classical and quantum dynamics and to the problem of the foundations of statistical mechanics (Prigogine, 1977).
Third, "Innovators (After 1960)," beginning with Simon's (1962) discussion of complexity, Miller's (1978) work on living systems, Maturana's work on autopoiesis, i.e. self-production (Maturana and Varela, 1980), Mandelbrot's (1977) work on fractal forms, Zadeh's (1965) work on fuzzy sets and fuzzy logic, Thom's work on the theory of catastrophes, and the development of chaos theory. As Francois (1999: 214) writes:
Chaos theory as the study of the irregular, unpredictable behaviour of deterministic non-linear systems is one of the most recent and important innovations in systemics. Complex systems are by nature non-linear, and accordingly they cannot be perfectly reduced to linear simplifications.
Francois also significantly details important work in ecology and economics, mentioning Odum (1971), Daly (1973) on the steady-state economy, and Pimentel (1977) on the energy balance in agricultural production, among others working in the field. Fourth and finally, Francois (1999) examines "Some Significant Recent Contributions (After 1985)," mentioning the Hungarian Csanyi's (1989) work on the "replicative model of self-organization," Langton (1989) on artificial life, Sabelli's (1991) theory of processes, and McNeil (1993) on the possibility of a better synthesis between the physical sciences and living systems. He ends by referencing Prat's (1964) work on the "aura" (traces that remain after the demise of the system), Grasse on "stigmergy" (indirect communication taking place among individuals in social insect societies; see more on stigmergy and massive online collaboration in Susi & Ziemke (2001), Gregorio (2002) and Robles, Merelo & Gonzalez-Barahona (2005)), and Gerard de Zeeuw (2000) on "invisibility."
If modern cybernetics was a child of the 1950s, catastrophe theory developed as a branch of bifurcation theory in the study of dynamical systems, originating with the work of the French mathematician Rene Thom in the 1960s and developed by Christopher Zeeman in the 1970s. Catastrophes are bifurcations between different equilibria, or fixed-point attractors, and have been applied to capsizing boats at sea and bridge collapse. Chaos theory also describes certain aspects of dynamical systems, i.e., systems whose states evolve over time, such as the "butterfly effect": systems that exhibit high sensitivity to initial conditions even though they are deterministic (e.g., the weather). Chaos theory goes back to Poincare's work and was taken up mainly by mathematicians who tried to characterize reiterations in natural systems in terms of simple mathematical formulae. Both Edward Lorenz and Benoit Mandelbrot studied recurring patterns in nature--Lorenz on weather simulation and Mandelbrot (1975, 1977) on fractals in nature (objects whose irregularity is constant over different scales). Chaos theory, which deals with nonlinear deterministic systems, has been applied in many disciplines but has been particularly successful in ecology for explaining chaotic dynamics. Victor MacGill provides a nontechnical account of complexity theory: "Complexity Theory and Chaos Theory studies systems that are too complex to accurately predict their future, but nevertheless exhibit underlying patterns that can help us cope in an increasingly complex world" (in Peters, 2014). In the theoretical foundations of computer science, complexity is concerned with the study of the intrinsic complexity of computational tasks and rests on understanding the central role of randomness.
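The "butterfly effect" can be demonstrated in a few lines with the logistic map, a standard textbook example of deterministic chaos. In the sketch below (the seed values and step count are arbitrary choices for illustration), the initial condition is perturbed by one part in a billion, and the two perfectly deterministic trajectories nonetheless diverge to a macroscopic gap.

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n): a deterministic rule that,
# for r = 4, behaves chaotically -- nearby starting points soon diverge.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from seed x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)   # perturb the seed by one part in a billion
gap = [abs(x - y) for x, y in zip(a, b)]
print(gap[1], max(gap))   # tiny at first, order-one later in the run
```

No randomness is involved: the unpredictability arises entirely from the exponential amplification of the initial difference, which is the sense in which chaotic systems are deterministic yet practically unpredictable.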
AI and Deep Learning (2)
Goodfellow et al. (2016) identify three waves of development of deep learning: deep learning known as cybernetics in the 1940s-1960s, which appeared with biological theories of learning; deep learning known as connectionism in the 1980s-1990s, which used "back-propagation" to train neural networks with multiple layers; and the current resurgence under the name deep learning, beginning in 2006 and only appearing in book form in 2016. They argue that the current deep learning approach to AI goes beyond the neuroscientific perspective, applying "machine learning frameworks that are not necessarily neurally inspired." Deep learning, then, is "a type of machine learning, a technique that allows computer systems to improve with experience and data". Morris, Schlenoff and Srinivasan (2017), writing a guest editorial for IEEE Transactions on Automation Science and Engineering, report on the remarkable "take-off" of artificial intelligence and, with the resurgence, also the return of the machinery question posed almost 200 years ago in the context of the Industrial Revolution. They note the upbeat analysis of the mainstream press in 2016 and document the publication of several US and UK reports that suggest not only that "AI has arrived" but also that it offers "huge potential for more efficient and effective business and government." Economists, they note, welcome AI for its productivity gains. They ask "What triggered this remarkable resurgence of AI?" and they answer:
All evidence points to an interesting convergence of recent advances in machine learning (ML), big data, and graphics processing units (GPUs). A particular aspect of ML--called deep learning using artificial neural networks--received a hardware boost a few years ago from GPUs, which made the supervised learning from large amounts of visual data practical. (Morris, Schlenoff and Srinivasan 2017: 407)
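The "improvement with experience and data" at the heart of this resurgence can be illustrated in miniature. The sketch below is not a deep network but a single sigmoid neuron trained by gradient descent (the same learning rule that back-propagation applies layer by layer in deep networks) to learn logical AND; the learning rate, epoch count and random seed are arbitrary illustrative choices.

```python
import math, random

# One sigmoid neuron learning logical AND by gradient descent.
# A deep network stacks many such units and propagates errors backwards;
# this one-unit version keeps the learning rule itself visible.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

random.seed(0)
w = [random.uniform(-1, 1) for _ in range(2)]  # two input weights
b = 0.0                                        # bias term
lr = 1.0                                       # learning rate

for epoch in range(3000):
    for (x1, x2), target in data:
        y = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = y - target   # for sigmoid + cross-entropy, d(loss)/d(pre-activation)
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

predictions = [round(sigmoid(w[0] * x1 + w[1] * x2 + b)) for (x1, x2), _ in data]
print(predictions)  # -> [0, 0, 0, 1], i.e. the AND function has been learned
```

Nothing in the loop encodes the AND rule explicitly; the system improves purely by repeated exposure to examples, which is the minimal sense of "learning from experience and data" the editorial describes.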
The popularity of ML, they note, has been enhanced by machines out-performing humans in areas taken to be prime examples of human intelligence: "In 1997, IBM's Deep Blue beat Garry Kasparov in chess, and in 2011, IBM's Watson won against two of Jeopardy's greatest champions. More recently, in March 2016, Google's AlphaGo defeated Lee Sedol, one of the best players in the game of Go" (ibid: 407). Following this popular success, as Morris et al. (2017) note, the private sector took up the challenge. They note, in particular, that IBM developed its cognitive computing in the form of its system called Watson, a DeepQA system capable of answering questions in natural language. The Watson website makes the following claim: "Watson can understand all forms of data, interact naturally with people, and learn and reason, at scale" (NDB, 2018). It also speaks of "Transforming learning experience with Watson taking personalised learning to a new level."
The autonomous learning systems of AI, increasingly referred to as deep learning, theoretically have the capacity to introduce autonomy into machine learning with the same dramatic impact that mechanisation first had in agriculture, with the creation of an industrial labour force and the massive rural-urban migration that built the mega-cities of today. Fordist automation, which utilised technologies of numerical control (NC), continuous process production and production processes using modern information technology (IT), introduced the system of mass production and later the "flexible system of production" based on Japanese management principles. When Fordism came to a crisis in the 1960s, with declining productivity levels as Taylorist organisational forms of labour reached their limits, the search for greater flexibility diversified into new forms of automation, especially as financialization took hold in the 2000s and high-frequency trading ensued on the basis of platforms of mathematical modelling and algorithmic engines.
A working hypothesis, and a dark scenario, is that in an age of deep learning--the final stage of automation--the welfare state based on full employment might seem a figment of a quaint and romantic past, when labour, together with the right to withdraw one's labour, and labour politics all naturally went together and had some force in the industrial age (Peters, Jandric, and Hayes, 2018). In retrospect, and from the perspective of an "algorithmic capitalism" in full swing, the welfare state and full employment may seem like a mere historical aberration. Without giving in to technological determinism, given current trends and evidence it seems that deep learning as a form of AI will continue apace the process of automation and that, while it will create some new jobs, it will do so much more slowly than it disestablishes old ones.
Algorithmic Capitalism (3)
Increasingly, cybernetics and its associated theories have become central to understanding the nature of networks and distributed systems in energy, politics and knowledge, and are also significant in conceptualizing the knowledge-based economy. Economics itself as a discipline has come to recognize the importance of understanding systems rather than rational agents acting alone, and pure rationality models of economic behaviour are being supplemented by economic theories that use complexity theory to predict and model transactions. More critical accounts of globalization emphasize a new form of global capitalism. The "financialization of capitalism" is a process that seems to have accompanied neoliberalism and globalization, representing a shift from production to financial services, the proliferation of monopolistic multinational corporations and the financialization of the capital accumulation process (Foster, 2007). Nassim Taleb (2018) and Benoit Mandelbrot (Mandelbrot and Hudson, 2004) joined forces to criticize the state of financial markets and the global economy, highlighting some of the key fallacies that have prevented the financial industry from correctly appreciating risk and anticipating the current crisis, including: large and unexpected changes in dynamical systems that are difficult to predict; the difficulty of predicting risk based on historical experience of defaults and losses; and the idea that consolidation and mergers of banks into larger entities makes them safer when in reality it imperils the whole financial system (Taleb and Mandelbrot, 2008).
Cybernetic capitalism is a system that has been shaped by the forces of formalization, mathematization and aestheticization beginning in the early twentieth century and associated with developments in mathematical theory, logic, physics, biology and information theory. It now exhibits itself in the forms of finance capitalism, informationalism, knowledge capitalism and the learning economy, with incipient nodal developments associated with the creative and open knowledge (and science) economies (Peters, Besley, and Jandric, 2018). The critical question in the wake of the collapse of the global finance system and the impending eco-crisis concerns whether capitalism can promote forms of social, ecological and economic sustainability.
"Cognitive capitalism" (CC) is a theoretical term that has become significant in the critical literature analysing a new form of capitalism sometimes called the "third phase of capitalism," after the earlier phases of mercantile and industrial capitalism (Boutang, 2012). CC purportedly is a new set of productive forces and an ideology that focuses on an accumulation process centred on immaterial assets, utilizing immaterial or digital labor processes and the co-creation and co-production of symbolic goods and experiences in order to capture the gains from knowledge and innovation, which are considered central to the knowledge economy. It is a term that focuses on the fundamental economic and media shift ushered in with the Internet as platform and post-Web 2.0 technologies, which have impacted the mode of production and the emergence of digital labor. The theory of cognitive capitalism has its origins in the work of French and Italian thinkers, particularly Gilles Deleuze and Felix Guattari's Capitalism and Schizophrenia (2009), Michel Foucault's biopolitics (1997), Hardt and Negri's trilogy Empire (2001), Multitude (2005) and Commonwealth (2009), as well as the "Autonomist" Marxist movement that has its origins in the Italian Operaismo ("workerism") of the 1960s.
More recently CC emanates from a group of scholars centred around the journal Multitudes (after Hardt and Negri), established by Boutang in 2000. "Multitude" is a political concept at the limits of sovereign power dating from Machiavelli and Spinoza, naming a population that has not entered into a social contract and has retained its capacity for political self-determination and, after Hardt and Negri, resistance against global systems of power. The journal offers the following description: "The concept of 'multitudes' refers to the immanence of subjectivities (rather than 'identities') acting in opposition to established power structures and mapping the way for new futures" (Multitudes, 2018).
As an epistemology related to systems and systems philosophy, "cybernetics" functioned as an approach for investigating a wide range of phenomena in information and communication theory, computer science and computer-based design environments, artificial intelligence, management, education, child psychology, human systems and consciousness studies. It was also used to characterise cognitive engineering and knowledge-based systems, "sociocybernetics," human development, emergence and self-regulation, ecosystems, sustainable development, database and expert systems, as well as hypermedia and hypertext, collaborative decision-support systems, and World Wide Web studies. It has also been used to talk about neural nets, software engineering, vision systems, global community, and individual freedom and responsibility.
In a paper entitled "Algorithmic Capitalism and Educational Futures: Informationalism and the Googlization of Knowledge," Peters (2012b) commented upon the rise of a new kind of capitalism that Agger had been one of the first to name and to begin to scrutinize its social consequences:
Algorithmic capitalism and its dominance of the market increasingly across all asset classes has truly arrived. It is an aspect of informationalism (informational capitalism) or 'cybernetic capitalism,' a term that recognizes more precisely the cybernetic systems similarities among various sectors of the post-industrial capitalist economy in its third phase of development--from mercantilism, industrialism to cybernetics--linking the growth of the multinational info-utilities (e.g., Google, Microsoft, Amazon) and their spectacular growth in the last twenty years, with developments in biocapitalism and the informatization of biology, and fundamental changes taking place with algorithmic trading and the development of so-called financialization. (Peters, 2012b)
Speed and velocity are the main aspects of a new finance capitalism that operates at the speed of light based on sophisticated "buy" and "sell" algorithms. Already researchers have demonstrated that data transfer using a single laser can send 26 terabits per second down an optical fibre and there are comparable reports that lasers will make financial "high-frequency" trading even faster.
Western modernity (and developing global systems) exhibits long-term tendencies towards increasing abstraction, described in terms of the formalization, mathematization, aestheticization and biologization of life. These are characteristic of otherwise seemingly disparate pursuits in the arts and humanities as much as science and technology, and are driven in large measure by the development of logic and mathematics, especially in digital systems. Much of this rapid transformation of the properties of systems can be captured in the notion of "bio-informational capitalism," which builds on the literatures on "biocapitalism" and "informationalism" (or "informational capitalism") to articulate an emergent form of capitalism that is self-renewing in the sense that it can change and renew the material basis for life and capital as well as program itself. Bio-informational capitalism applies and develops aspects of the new biology to informatics to create new organic forms of computing and self-reproducing memory that in turn have become the basis of bioinformatics (Peters and Jandric, 2019).
The third phase of "cybernetic capitalism" has itself undergone further development, from a first to a fifth generation. The first four generations, up to the point of complexity theory, were described above. The fifth is what Peters calls "bioinformationalism," representative of bio-informational capitalism (Peters, 2012a; see also Peters and Jandric, 2019), which articulates an emergent form of capitalism that is self-renewing in the sense that it can change and renew the material basis for life and capital as well as program itself. This represents a massive change to the notion of digital reason as also a biological notion--a biologizing of digital reason. Bio-informational capitalism applies and develops aspects of the "new biology" to informatics to create new organic forms of computing and self-reproducing memory that in turn have become the basis of bioinformatics.
Our speculation is that the "biologization of digital reason" is a distinct phenomenon, in an early emergent form, that springs from the application of digital reason to biology and the biologization of digital processes. In this space, we might also talk of digital evolution, evolutionary computation, and genetic algorithms.
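A genetic algorithm, the most familiar example of evolutionary computation, can itself be sketched in a few lines. The toy example below is the standard "OneMax" exercise (all parameters are arbitrary illustrative choices): random bitstrings evolve toward the all-ones string purely through selection, crossover and mutation, with no explicit instruction about the goal beyond a fitness score.

```python
import random

# Minimal genetic algorithm: evolve bitstrings toward all-ones ("OneMax").

random.seed(1)
LENGTH, POP, GENS = 20, 30, 60

def fitness(bits):
    """Fitness is simply the number of 1s in the string."""
    return sum(bits)

def crossover(a, b):
    """Single-point crossover: splice a prefix of one parent onto the other."""
    cut = random.randrange(1, LENGTH)
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.02):
    """Flip each bit with a small probability."""
    return [1 - g if random.random() < rate else g for g in bits]

population = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP // 2]           # truncation selection: keep the fittest half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print(fitness(best))   # best fitness after evolution, approaching LENGTH
```

Because the fittest individuals survive unchanged into the next generation, the best fitness never decreases; the "design" of the solution is an emergent product of variation and selection, which is the sense in which digital processes here mimic biological evolution.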
Notes toward the Postdigital as Process Philosophy
A critical philosophy starts from the rejection of a purely mechanistic universe--the universe of classical mechanics that is echoed by traditional deterministic computing. The move to something new--to a philosophy that emphasizes nondeterministic states--might also find solace in the concept of reality and philosophy as process, or as a web of interrelated processes, as developed by Whitehead in his Harvard Lectures (1927-28), published as Process and Reality (1929). Whitehead's event-based process ontology offers an "ecological" and relational approach to a wide range of studies as well as serving as common ground for Eastern and Western religious and cultural traditions. Indeed, theology was one of the first disciplines to develop Whitehead's process thought, with the Claremont School of Theology and the establishment of the Center for Process Studies, founded in 1973 by John B. Cobb and David Ray Griffin.
Whitehead's metaphysics was interpreted as a fundamental challenge to scientific materialism and as a view of reality not fully determined by classical mechanics or causal determinism: creativity is the principle of existence, and there is a degree of originality in the way in which entities respond to other entities. At least at first glance Whitehead's philosophy seems consonant with the postdigital and with quantum physics as we have described it. For Whitehead (1920/2009: 166), "nature is a structure of events and each event has its position in this structure and its own peculiar character or quality." In Chapter 2 of The Concept of Nature (1920/2009: 173), he clearly rejects what he calls "the bifurcation of nature" (minds and matter) and criticises "the concept of matter as the substance whose attributes we perceive", arguing that "The character of the spatio-temporal structure of events can be fully expressed in terms of relations between these more abstract event-particles." He goes on to explain that "Each event-particle lies in one and only one moment of a given time-system [...] and it is characterised by its extrinsic character, its intrinsic character and its position" (Whitehead, 1920/2009: 191). Thus, all entities for Whitehead are temporal--they are occasions of experience--and nothing exists in isolation but only in its relations.
Nicholas Rescher, the great American pragmatist much influenced by Whitehead's philosophy of the organism, writes:
What is characteristically definitive of process philosophizing as a distinctive sector of philosophical tradition is not simply the commonplace recognition of natural process as the active initiator of what exists in nature, but an insistence on seeing process as constituting an essential aspect of everything that exists--a commitment to the fundamentally processual nature of the real. For the process philosopher is, effectively by definition, one who holds that what exists in nature is not just originated and sustained by processes but is in fact ongoingly and inexorably characterized by them. On such a view, process is both pervasive in nature and fundamental for its understanding. (Rescher, 2006: 3)
The resuscitation of his work has much to do with Deleuze (see Robinson, 2009) and with Isabelle Stengers' book Thinking with Whitehead: A Free and Wild Creation of Concepts (Stengers, 2011). Deleuze, like Whitehead, opposes substance metaphysics, the dominant paradigm since Aristotle, recasting being not as a simple unchangeable substance but as a becoming that is always occurring, undergoing a dynamic process of self-differentiation. "Substances" might be thought a grammatical feature of Indo-European languages that prioritizes static entities. Johanna Seibt (2018), in her entry "Process Philosophy" in the Stanford Encyclopedia of Philosophy, concludes:
contemporary process philosophy holds out the promise of offering superior support for the three most pressing tasks of philosophy at the beginning of the 21st century. First, it provides the category-theoretic tools for an integrated metaphysics that can join our common sense and scientific images of the world. Second, it can serve as a theoretical platform upon which to build an intercultural philosophy and to facilitate interdisciplinary research on global knowledge representation by means of an ontological framework that is no longer parochially Western. Third, it supplies concepts that facilitate interdisciplinary collaboration on reflected technology development, and enable the cultural and ethical imagination needed to shape the expectable deep socio-cultural changes engendered by the increased use of technology, especially automation.
Process philosophy provides us with what Whitehead called "a philosophy of the organism"--a form of speculative metaphysics that privileges events and processes over substance, with the consequence that we are released from the mechanistic, deterministic universe that is a product of classical physics. It is also a clear rejection of scientific realism, substituting a relational process ontology that points towards an indeterministic universe at the sub-atomic level and a form of quantum philosophy based on quantum mechanics and computing, characterising an era we are just entering. It will be a transformative, dynamic, systems-built ontology very different from our understanding of the digital, which has itself only just got underway. A critical philosophy of the postdigital must be able to understand the processes of quantum computing, complexity science, and deep learning as they come to constitute the emerging techno-scientific global system and its place within a capitalist system that is itself transformed by these developments. (4)
All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.
Conflict of Interest Statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
1. This section draws on Peters, M. A. (2014). "The University in the Epoch of Digital Reason: Fast Knowledge in the Circuits of Cybernetic Capitalism," in P. Gibbs, O.-H. Ylijoki, C. Guzman-Valenzuela, & R. Barnett (eds.), Universities in the Time of Flux: An Exploration of Time and Temporality in University Life. London: Routledge.
2. This section draws on Peters, M. A. (2018). "Deep Learning, Education and the Final Stage of Automation," Educational Philosophy and Theory 50(6/7): 549-553. doi:10.1080/00131857.2017.1348928
3. This section draws on Peters, M. A. (2017). "Algorithmic Capitalism in the Epoch of Digital Reason," Fast Capitalism 14(1).
4. This paper was originally published in Peters, M. A., & Besley, T. (2018), Postdigital Science and Education 1(1): 1-14. doi:10.1007/s42438-018-0004-9
Altepeter, J. B. (2010). "A Tale of Two Qubits: How Quantum Computers Work," Ars Technica, 18 January. https://arstechnica.com/science/2010/01/a-tale-of-two-qubits-how-quantum-computers-work/. Accessed 12 May 2018.
Andersen, C. U., Cox, G., & Papadopoulos, G. (2014). "Postdigital Research--Editorial," A Peer-Reviewed Journal About 3(1).
APRJA (2018). "About." http://www.aprja.net/about/. Accessed 12 May 2018.
Ashby, W. R. (1956). An Introduction to Cybernetics. London: Chapman and Hall.
Berry, D. M., & Dieter, M. (eds.) (2015). Postdigital Aesthetics: Art, Computation and Design. New York, NY: Palgrave Macmillan.
Boulding, K. (1953). "Toward a General Theory of Growth," Canadian Journal of Economics and Political Science 19: 326-340.
Boutang, Y. M. (2012). Cognitive Capitalism. Cambridge: Polity.
Cramer, F. (2015). "What Is 'Post-Digital'?," in D. M. Berry & M. Dieter (eds.), Postdigital Aesthetics: Art, Computation and Design. New York, NY: Palgrave Macmillan, 12-26. doi:10.1057/9781137437204
Csanyi, V. (1989). "The Replicative Model of Self-Organization," in G. J. Dalenoort (ed.), The Paradigm of Self-Organization. New York: Gordon & Breach.
Daly, H. (1973). Towards a Steady-State Economy. San Francisco, CA: Freeman.
Deleuze, G., & Guattari, F. (2009). Anti-Oedipus: Capitalism and Schizophrenia. London: Penguin.
Foster, J. B. (2007). "The Financialization of Capitalism," Monthly Review, 1 April. https://monthlyreview.org/2007/04/01/the-financialization-of-capitalism/. Accessed 12 May 2018.
Foucault, M. (1997). "Technologies of the Self," in P. Rabinow (ed.), Ethics. London: Penguin Books, 223-325.
Francois, C. (1999). "Systemics and Cybernetics in a Historical Perspective," Systems Research and Behavioral Science 16: 203-219.
Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep Learning. Cambridge, MA: MIT Press. http://www.deeplearningbook.org. Accessed 12 May 2018.
Gregorio, J. (2002). "Stigmergy and the World-Wide Web," http://bitworking.org/news/Stigmergy. Accessed 12 May 2018.
Hardt, M., & Negri, A. (2001). Empire. Cambridge, MA, and London: Harvard University Press.
Hardt, M., & Negri, A. (2005). Multitude: War and Democracy in the Age of Empire. London: Penguin.
Hardt, M., & Negri, A. (2009). Commonwealth. Cambridge, MA: Harvard University Press.
Jandric, P., Knox, J., Besley, T., Ryberg, T., Suoranta, J., & Hayes, S. (2018). "Postdigital Science and Education," Educational Philosophy and Theory 50(10): 893-899. doi:10.1080/00131857.2018.1454000
Langton, C. (ed.) (1989). Artificial Life. Redwood City, CA: Addison-Wesley.
Mandelbrot, B., & Hudson, R. L. (2004). The (Mis)behavior of Markets: A Fractal View of Risk, Ruin, and Reward. New York: Basic Books.
Mandelbrot, B. (1975). The Fractal Geometry of Nature. New York, NY: Freeman.
Mandelbrot, B. (1977). Fractals: Form, Chance and Dimension. San Francisco, CA: Freeman.
Maturana, H., & Varela, F. (1980). Autopoiesis and Cognition. Boston, MA: Reidel.
McNeil, D. H. (1993). "Architectural Criteria for a General Theory of Systems," Proceedings of the 37th ISSS Conference. Hawkesbury: University of Western Sydney.
Miller, J. G. (1978). Living Systems. New York, NY: McGraw-Hill.
Morris, K., Schenloff, C., & Srinivasan, V. (2017). "Guest Editorial. A Remarkable Resurgence of Artificial Intelligence and Its Impact on Automation and Autonomy," IEEE Transactions on Automation Science and Engineering 14: 407-409.
Multitudes (2018). "About." http://www.multitudes.net/. Accessed 12 May 2018.
NDB (2018). "Welcome to the Cognitive Era." http://www.ndb.bg/index.php/watson/. Accessed 12 May 2018.
Odum, H. (1971). Environment, Power and Society. New York, NY: Wiley.
Pask, G. (1975). The Cybernetics of Human Learning and Performance. London: Hutchinson.
Peters, M. A., & Jandric, P. (2018). The Digital University: A Dialogue and Manifesto. New York: Peter Lang.
Peters, M. A. & Jandric, P. (2019). "Posthumanism, Open Ontologies and Bio-digital Becoming," in K. Otrel-Cass (ed.), Utopia of the Digital Cornucopia. Singapore: Springer.
Peters, M. A. (2012a). "Bio-informational Capitalism," Thesis Eleven 110(1): 98-111.
Peters, M. A. (2012b). "Algorithmic Capitalism and Educational Futures: Informationalism and the Googlization of Knowledge," Truthout, 4 May. https://truthout.org/articles/algorithmic-capitalism-and-educational-futures-informationalism-and-the-googlization-of-knowledge/. Accessed 12 May 2018.
Peters, M. A. (2014). "The University in the Epoch of Digital Reason: Fast Knowledge in the Circuits of Cybernetic Capitalism," in P. Gibbs, O.-H. Ylijoki, C. Guzman-Valenzuela, & R. Barnett (eds.), Universities in the Time of Flux: An Exploration of Time and Temporality in University Life. London: Routledge.
Peters, M. A. (2017). "Algorithmic Capitalism in the Epoch of Digital Reason," Fast Capitalism, 14(1).
Peters, M. A. (2018). "Deep Learning, Education and the Final Stage of Automation," Educational Philosophy and Theory 50(6/7): 549-553. doi:10.1080/00131857.2017.1348928
Peters, M. A., Besley, T., & Jandric, P. (2018). "Postdigital Knowledge Cultures and Their Politics," ECNU Review of Education. Forthcoming.
Peters, M. A., Jandric, P., & Hayes, S. (2018). "The Curious Promise of Educationalising Technological Unemployment: What Can Places of Learning Really Do about the Future of Work?," Educational Philosophy and Theory. OnlineFirst.
Pimentel, D. (1977). "America's Agricultural Future," The Economist, 8 September.
Prat, H. (1964). Le champ unitaire en biologie. Paris: Presses Universitaires de France.
Prigogine, I. (1955). Thermodynamics of Irreversible Processes. Springfield, IL: Thomas Press.
Prigogine, I. (1977). "Ilya Prigogine--Biographical," https://www.nobelprize.org/nobel_prizes/chemistry/laureates/1977/prigogine-bio.html. Accessed 12 May 2018.
Rescher, N. (2006). Process Philosophical Deliberations. Heusenstamm: Ontos Verlag.
Robinson, K. (2009). Deleuze, Whitehead, Bergson: Rhizomatic Connections. London: Palgrave Macmillan.
Robles, G., Merelo, J. J., & Gonzalez-Barahona, J. M. (2005). "Self-organized Development in Libre Software: A Model Based on the Stigmergy Concept," in D. Pfahl, D. M. Raffo, I. Rus, & P. Wernick (eds.), Proceedings of 6th International Workshop on Software Process Simulation and Modeling. Stuttgart: Fraunhofer IRB Verlag.
Sabelli, H. (1991). "Process Theory: A Biological Model of Open Systems," Proceedings of the 35th ISSS Meeting. Ostersund, 219-225.
Seibt, J. (2018). "Process Philosophy," in E. N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Winter 2014 Edition). https://plato.stanford.edu/archives/sum2012/entries/process-philosophy/. Accessed 12 May 2018.
Shannon, C., & Weaver, W. (1949). The Mathematical Theory of Communication. Urbana, IL: University of Illinois Press.
Simon, H. A. (1962). "The Architecture of Complexity," Proceedings of the American Philosophical Society 106(6): 467-482.
Stengers, I. (2011). Thinking with Whitehead: A Free and Wild Creation of Concepts. Cambridge, MA: Harvard University Press.
Susi, T., & Ziemke, T. (2001). "Social Cognition, Artefacts, and Stigmergy: A Comparative Analysis of Theoretical Frameworks for the Understanding of Artefact-mediated Collaborative Activity," Cognitive Systems Research, 2(4): 273-290.
Taleb, N. N., & Mandelbrot, B. (2008). "Nassim Taleb & Benoit Mandelbrot on 2008 Financial Crisis" [Video recording]. http://nassimtaleb.org/tag/benoit-mandelbrot/. Accessed 12 May 2018.
Taleb, N. N. (2018). Nassim Nicholas Taleb's Home Page. http://www.fooledbyrandomness.com/. Accessed 12 May 2018.
U.S. House of Representatives Committee on Energy and Commerce (2018). Hearing entitled "Disrupter Series: Quantum Computing." https://docs.house.gov/meetings/IF/IF17/20180518/108313/HHRG-115-IF17-20180518-SD002.pdf. Accessed 12 May 2018.
Von Bertalanffy, L. (1950). "An Outline of General System Theory," British Journal for the Philosophy of Science 1(2): 134-165.
Whitehead, A. N. (1929). Process and Reality: An Essay in Cosmology. New York, NY: Macmillan; Cambridge: Cambridge University Press.
Whitehead, A. N. (1920/2009). The Concept of Nature. Ithaca, NY: Cornell University Press.
Zadeh, L. (1965). "Fuzzy Sets," Information and Control 8: 338-353.
Zeeuw, G. de (2000). "Some Problems in the Observation of Performance," in F. Parra-Luna (ed.), The Performance of Social Systems: Perspectives and Problems. New York: Springer Science+Business Media, 61-70.
Michael A. Peters
Beijing Normal University, China
Tina Besley
Beijing Normal University, China
Received 10 September 2018 * Received in revised form 24 September 2018
Accepted 24 September 2018 * Available online 1 October 2018
Author: Peters, Michael A.; Besley, Tina
Publication: Review of Contemporary Philosophy
Date: Jan 1, 2019