The aim of this article is not to analyze in detail what Gilbert Simondon wrote about Quantum Mechanics (QM), but rather to present an interpretation of QM grounded in one of the main hypotheses present within Simondon’s theory of individuation:
Finally, one can make a hypothesis, analogous to that of quanta in physics, analogous also to that of the relativity of potential energy levels: it may be supposed that individuation does not exhaust the whole preindividual reality, and that a regime of metastability is not only maintained by the individual, but carried by him, so that the individual carries with him a certain associated load of preindividual reality, animated by all the potentials that characterize it; an individuation is relative as a change of structure in a physical system; a certain level of potential remains, and individuations are still possible. This preindividual nature remaining associated with the individual is a source of future metastable states from which new individuations can emerge.
In a previous paper, we have shown how the substantialist notion of “entity” constitutes an epistemological obstacle, in the specific sense of Gaston Bachelard, to the understanding of QM. Classical physics presupposes a substantialist ontology according to which the physical world is understood as an actual state of affairs; a picture which amounts to the constitution of physical reality in terms of systems (or entities) composed of actual (definite, valued) properties. These existents are constrained by classical Aristotelian logic, which is grounded on the principles of existence, non-contradiction, and identity. And that is the reason why Ferdinand Gonseth asserted that logic is “the physics of any object whatsoever.”
However, as Bachelard himself pointed out, there exist enormous difficulties when attempting to interpret the orthodox mathematical formalism of QM in terms of such (classical) physical objects. In this respect, we reached three provisional conclusions:
1. A proper interpretation of QM will require the application of different principles from those presented by classical logic (existence, non-contradiction, and identity). Although it is commonly accepted that logic does not presuppose anything about its object, the notion of a logical object presupposes—at least—that reality is decomposable into entities (individuals with permanent identity). Explicitly referring to Gonseth’s works, Ernst Specker showed that QM cannot agree with this presupposition.
2. One possible resolution of this difficulty is to introduce the concept of “potentiality” in order to characterize the mode of existence corresponding to the principle of indeterminacy, the principle of superposition, and the absence of the identity of systems. This idea was suggested almost simultaneously by Simondon and Werner Heisenberg, who even conjectured that one should consider quantum probabilities as a new kind of objective physical reality in some way analogous to the Aristotelian notion of “potentia.”
3. Finally, these quantum potentialities cannot be interpreted in terms of “possibilities” conforming to classical logic. While classical possibilities refer to the evolution of individual entities, quantum possibilities seem to correspond instead to a “non-individual” reality. As Erwin Schrödinger pointed out: “these inconsistencies will be avoided by returning to a wave theory that is not continually abrogated by dice-miracles; not of course to the naïve wave theory of yore, but to a more sophisticated one, based on ... the non-individuality of ‘particles.’”
In this article, we will examine whether the epistemological obstacle produced by the notion of entity can be bypassed by specifying the mode of existence of quantum potentialities. We attempt to do so by considering the orthodox mathematical formalism of QM in the light of the Simondonian ontology, and more specifically, its description of the preindividual realm: “one can suppose that reality is primitively, in itself, like the supersaturated solution and even more completely in the preindividual regime, more than unity and more than identity, capable of manifesting as wave or corpuscle.” In order to establish the validity of this logical, epistemological, and ontological revision, we will begin by providing an analysis of the evolution of the notion of physical possibility.
1. Being and Becoming: The Evolution of Physical Possibilities.
1.1. Hylemorphic Powers.
As Simondon points out, “the ancients knew only instability and stability, movement and rest; they did not know clearly and objectively metastability,” which made the relation of being and becoming problematic. The notion of “power” (dynamis) was first introduced by Plato in the Sophist (247 d–e) in order to solve this difficulty. Later on, it was also developed by Aristotle, who distinguished between “being in act” and “being in power” in order to think about “movement” (i.e., any form of change). Accordingly, power represents the condition of possibility of generation and production: from a human seed can be born a human being and not an olive tree, just as in wool there is the power of a garment but not of a saw. Thus, power does not only provide the condition of possibility of becoming; it also imposes its limit. The coupling of power and act circulates between all the domains since, for Aristotle, power means logical possibility as well as natural or technical capacity. There are two types of powers distinguished within the Aristotelian hylemorphic metaphysical scheme: on the one hand, an active power which imposes a form on the material; and on the other, a passive power which authorizes matter to receive, more or less perfectly, the form. This coupling marks the teleological orientation of hylemorphism: power is the power of one act, of a single form. It is the form that determines matter, even if the imperfection of materialization indicates a “margin of indeterminacy.” Against the Megarics, who denied the existence of powers by affirming that a thing has the power to act only when it is acting—so that one who is not building cannot build—Aristotle recognized the existence of powers as possibilities that do not necessarily become actual. As a consequence, the corresponding propositions do not necessarily become true.
And hence his solution to the problem of “future contingents”: when one says that “a battle will or will not take place tomorrow,” the alternative is true and necessary, while the affirmation that the battle will take place and its contradictory are not yet to be held true or false, inasmuch as the battle is still only “in power” (in the potential realm) and its actualization will (or won’t) result from contingent causes. The notion of power associated with the classical principles of logic implies therefore that there are undecidable propositions about becoming. However, within the Aristotelian scheme, this undecidable set of potential propositions can only apply to contingent futures and will be eventually decided by their future actualization (or non-actualization).
1.2. Classical Possibilities.
Two millennia after Aristotle’s metaphysics, Newtonian mechanics created a new physics in which the realm of potentiality was completely eradicated. Every physical system was described by Newton as a set of actual (definite, valued) properties. Earlier, Cartesian philosophy had reduced Nature to the prime qualities of extension, i.e., the quantitative properties of entities (order, form, and movement). The postulate of Cartesian analysis is that any physical system is reducible to simple, perfectly localizable elements in a manner that suits perfectly well the structure of reasoning—which is still based on the logic of Aristotle. As mentioned before, classical mechanics defines physical objects as grounded in the principles of existence, identity, and non-contradiction. By eliminating the indeterminacy of power while preserving the logical principles of Aristotle, classical physics reduced the notion of potentiality to mere logical possibility. In this way, any undecidable proposition about the past, the present, or the future was eliminated. As in Megaric thought, contingency then became inconceivable within classical physics. Becoming was then only that which is already true in the future, like an inevitable development of logical premises that never brings anything that is not already contained within the initial state of the system. One cannot then confer reality on other possibilities of evolution which are not contained within the already actual state of affairs. Sub specie aeternitatis, the total universe is static:
Dropping Aristotelian metaphysics, while at the same time continuing to use Aristotelian logic as an empty ‘reasoning apparatus’ implies therefore losing the possibility to account for change and motion in whatever description of the world that is based on it. The fact that Aristotelian logic transformed during the twentieth century into different formal, axiomatic logical systems used in today’s philosophy and science doesn’t really matter, because the fundamental principle, and therefore the fundamental ontology, remained the same... This ‘emptied’ logic actually contains an Eleatic ontology, which allows only for static descriptions of the world.
This generates a metaphysics of pure actuality and absolute determinism. Hence the Laplacian conception of the world: a Cosmotheoros embracing all the physical data at a given instant would, if endowed with sufficient computational capacities, know the past and future of the universe. This absolute determinism derives from the existence of a unique solution satisfying the initial conditions of a differential equation:
In classical physics the most fundamental description of a physical system (a point in phase space) reflects only the actual, and nothing that is merely possible. It is true that sometimes states involving probabilities occur in classical physics: think of the probability distributions ρ in statistical mechanics. But the occurrence of possibilities in such cases merely reflects our ignorance about what is actual. The statistical states do not correspond to features of the actual system (unlike the case of the quantum mechanical superpositions), but quantify our lack of knowledge of those actual features.
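This contrast between classical ignorance and quantum superposition can be made concrete in the smallest possible case. The following numpy sketch, which is our illustration and not part of the original article, compares a classical fifty-fifty ignorance mixture of two qubit states with their coherent superposition: the two agree on one measurement basis yet differ on another, so the superposition cannot be read as a mere statement of ignorance.

```python
import numpy as np

# Classical ignorance mixture vs. quantum superposition of |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)   # coherent superposition |+>
minus = (ket0 - ket1) / np.sqrt(2)  # orthogonal superposition |->

# Density matrices: a 50/50 ignorance mixture, and the pure superposition.
rho_mix = 0.5 * np.outer(ket0, ket0.conj()) + 0.5 * np.outer(ket1, ket1.conj())
rho_sup = np.outer(plus, plus.conj())

def prob(rho, v):
    """Born probability of the outcome projecting onto vector v."""
    P = np.outer(v, v.conj())
    return np.trace(rho @ P).real

# In the {|0>, |1>} basis the two states are statistically indistinguishable...
assert np.isclose(prob(rho_mix, ket0), prob(rho_sup, ket0))  # both 0.5
# ...but in the {|+>, |->} basis they differ sharply:
assert np.isclose(prob(rho_sup, minus), 0.0)  # superposition never yields |->
assert np.isclose(prob(rho_mix, minus), 0.5)  # mixture yields |-> half the time
```

The mixture quantifies a lack of knowledge about an actual but unknown state; the superposition behaves differently under a change of basis, which is precisely the feature the quoted passage says ignorance probabilities lack.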
1.3. Quantum Potentialities.
In 1926, Max Born interpreted Schrödinger’s quantum wave function Ψ as a “probability wave,” making explicit the way in which the formalism of QM introduced possibilities as a core element of the theory itself. However, these new quantum possibilities differed radically from those already present in classical physics and their probabilistic counterpart within Andrei Kolmogorov’s axiomatic theory:
[The] concept of the probability wave [in quantum mechanics] was something entirely new in theoretical physics since Newton. Probability in mathematics or in statistical mechanics means a statement about our degree of knowledge of the actual situation. In throwing dice we do not know the fine details of the motion of our hands which determine the fall of the dice and therefore we say that the probability for throwing a special number is just one in six. The probability wave function, however, meant more than that; it meant a tendency for something.
Heisenberg was aware that the quantum formalism was not compatible with classical possibilities; instead some kind of ontological potentialities seemed to be required. According to him, the concept of the probability wave “was a quantitative version of the old potentia concept of Aristotle’s philosophy. It introduced something between the idea of a phenomenon and this phenomenon itself, a strange kind of physical reality equidistant between possibility and reality.” This potentiality designated the mode of existence to which quantum physicists refer, implicitly or explicitly:
I believe that the language actually used by physicists when they speak about atomic events produces in their minds similar notions as the concept of ‘potentia.’ So, physicists have gradually become accustomed to considering the electronic orbits, etc., not as reality but rather as a kind of ‘potentia.’
This intuition was also investigated by Karl Popper in his propensity interpretation, by Henry Margenau in terms of “latencies,” and later on by Constantin Piron, who followed Heisenberg’s interpretation in terms of “Aristotelian potentialities.” It became clear that quantum potentiality was not identical to the Aristotelian power, which is defined as the possibility of one single act. As Simondon pointed out, the error of hylemorphism “consists mainly in allowing only one entelechy for the individuated being, whereas the being must be conceived as having several phases.” Since the quantum potentiality possesses many different possible actualizations, not only is the set of propositions about contingent futures undecidable, but so is any set of propositions made about the actualization of quantum possibilities, whether they occurred in the past, present, or future.
But not only was the notion of possibility drastically changed within the new theory of quanta; QM also strictly limited the realm of actuality itself. According to its orthodox formulation, the representation of the state of a system is given by a ray in a Hilbert space, H. In contrast to classical physics, physical quantities, represented by operators on H, in general do not commute. In turn, this key feature of the mathematical formalism precludes the assertion that all quantities can be simultaneously determined as preexistent—that is to say, existent independently of any measurement—actual properties of an entity. This result was known to the founding fathers but only formalized explicitly in 1967, when Simon Kochen and Ernst Specker developed a theorem showing the limits of the formal representation of an actual state of affairs within orthodox QM.
In 1927, Heisenberg derived from the matrix formalism and the existence of Planck’s quantum of action the following inequality: Δx · Δp > h. This inequality implies the impossibility of simultaneously assigning exact and determined values to complementary properties such as position and momentum. Heisenberg, following Einstein, had developed an ontological interpretation in terms of “indeterminacy,” which stressed the impossibility of the absolute determination of all properties of a system, pointing to the necessity of profoundly modifying the notion of physical existence. However, mainly due to Bohr’s influence, it was an interpretation in terms of “uncertainty,” which understood the relations in terms of the epistemic knowledge that an agent might possess of the state of the system, that became orthodox.
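As a numerical aside not found in the original article, the general (Robertson) form of this inequality, ΔA · ΔB ≥ |⟨[A, B]⟩| / 2, can be checked directly in the smallest quantum system. The sketch below uses spin-1/2 observables, where the bound is saturated with equality; the choice of observables and state is purely illustrative.

```python
import numpy as np

# Robertson's generalization of Heisenberg's relation:
#   ΔA · ΔB >= |<[A, B]>| / 2
# illustrated with the non-commuting spin observables sigma_x and sigma_y.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

psi = np.array([1, 0], dtype=complex)  # eigenstate of sigma_z

def ev(op):
    """Expectation value <psi|op|psi>."""
    return psi.conj() @ op @ psi

def spread(op):
    """Standard deviation of the observable op in the state psi."""
    return np.sqrt((ev(op @ op) - ev(op) ** 2).real)

lhs = spread(sx) * spread(sy)                 # product of the two spreads
rhs = 0.5 * abs(ev(sx @ sy - sy @ sx))        # half the commutator expectation

# The observables do not commute, and the uncertainty bound holds
# (here with equality: both sides equal 1).
assert not np.allclose(sx @ sy, sy @ sx)
assert lhs >= rhs - 1e-12
```

Because the commutator of position and momentum is a constant (iħ), the same computation in the continuous case yields the position-momentum inequality quoted above.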
2. A Simondonian Interpretation of QM
2.1. Ψ as a Representation of the Preindividual Realm
The preindividual hypothesis was elaborated by Simondon taking as a standpoint an analogy between phase shifts in classical physics and QM. This conceptual analogy did not attempt to restore the coherence of QM with the classical substantialist ontology but, on the contrary, to derive adequate concepts in order for QM to explain physical individuation. The procedure consists, basically, in overcoming the substantialist epistemological obstacle by abandoning the postulate of Cartesian analysis according to which physical reality is reducible to entities. This approach therefore denies the applicability of the principles of classical logic to QM:
If substance ceases to be the model of being, it is possible to conceive of the relation as non-identity of being in relation to itself, inclusion in being of a reality which is not only identical to it, so that being as being, prior to individuation, can be grasped as more than unity and more than identity. Such a method supposes an ontological postulate: at the level of the being grasped before any individuation, the principle of the excluded third and the principle of identity do not apply; these principles only apply to being already individuated, and they define an impoverished being, split between the milieu and the individual; they do not then apply to the whole being, that is to say, to the whole subsequently formed by the individual and the milieu, but only to what, from the preindividual being, has become an individual.
Therefore, it is conceivable that no mathematical tool would be able to describe the preindividual itself but only the individuation process: “the necessity of correcting and coupling basic concepts in physics may reflect the fact that concepts are adequate to individual reality only, and not to preindividual reality.” However, as it stands, it is the Schrödinger equation which might be considered to be the best available formalization when attempting to understand the preindividual realm Simondon talks about. In this respect, special attention has to be paid to the wave function, Ψ, the solution of Schrödinger’s equation which—according to Born—represents a strange “wave of real quantum possibilities” interacting in a configuration space—a space, let us remark, that cannot be interpreted in terms of classical space-time. For Simondon, the operation that supports individuation in space-time, namely, transduction, “does not presuppose the existence of time as a framework in which the genesis unfolds, time itself being a solution, a dimension of the discovered systematic: time emerges from the preindividual like the other dimensions according to which individuation takes place.” It is at this point that we can introduce the quantum wave function as describing a “pure ubiquitous potential”; allowing us to formalize “what is most positive in the state of the preindividual being, namely the existence of potentials, which is also the cause of the incompatibility and non-stability of this state.” Accordingly, the preindividual is prior to the distinction between being and becoming, and also prior to the resolution of being into several phases:
The preindividual being is the being in which there is no phase; the being in which an individuation is accomplished is that in which a resolution appears through the division of being into phases, which is becoming; becoming is not a framework in which being exists; it is a dimension of being, a mode of resolution of an initial incompatibility rich in potentials.
The “phases” of being mentioned by Simondon do not refer to the “phase of a wave” in a classical sense but, by analogy with the physics of matter, to the concept of “phase shift.”
2.2. Mathematical Bases and Phase Shifts.
The preindividual defines a state of being whose relation to itself is more than unity and more than identity. As its name suggests, it is a state of being prior to any individuation, but it is also prior to the “phase shift,” which makes possible any individuation by determining what sort of entity (particle or wave) may be actualized in space-time. The preindividual is before any phase; it becomes the first phase only from the individuation that splits the being, its phase shift in relation to itself. It is the individuation that creates the phases, because the phases are only this development of the being on both sides of itself, this double deframing starting from a first consistency crossed by tensions and potentials, which made it incompatible with itself.
Since the state of the being prior to the phase shift is crossed by tensions and potentials, it is incompatible with itself, which is of crucial importance to understanding the ontological status of the wave function Ψ. In this respect, the Kochen-Specker (KS) theorem establishes that if we consider three physical quantities represented by the operators A, B, and C, with A commuting with B and C, but B not commuting with C, the value of A will depend on whether it is considered with B or with C. It is this constraint which presents serious difficulties for a substantialist account of the theory of quanta. In more general terms, projection operators in QM cannot be subject to a global binary valuation to the values 0 and 1. It is this result which precludes an interpretation of projection operators as actual (definite, valued) properties. This constraint of the orthodox Hilbert formalism is called, within the literature, “contextual dependence” or “contextuality.” It is crucial to understand that the incompatibility of the values of A with B and with C is not simply the result of a variation occurring between two successive measurements, but a simultaneous preexistent incompatibility of the observable A with itself according to whether it is considered with B or C. This can be interpreted in the following terms: the undecidability of a set of propositions is constrained not only by contingent futures, but also by past and present. Undecidability is part of the preindividual, which is prior to every distinction between being and becoming. Since the preindividual is more than unity and more than identity, there is no contradiction between the different actualizations, which remain truly independent.
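A concrete finite-dimensional witness of this contextuality, not discussed in the article itself but standard in the literature, is the Mermin-Peres “magic square” of nine two-qubit observables. The numpy sketch below verifies its algebraic core: every row and column is a set of mutually commuting (jointly measurable) observables, yet the sign constraints on their products make any context-independent assignment of values impossible.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# The Mermin-Peres magic square: nine two-qubit observables arranged so that
# the three observables in each row and in each column pairwise commute.
square = [
    [np.kron(sx, I2), np.kron(I2, sx), np.kron(sx, sx)],
    [np.kron(I2, sy), np.kron(sy, I2), np.kron(sy, sy)],
    [np.kron(sx, sy), np.kron(sy, sx), np.kron(sz, sz)],
]
columns = [list(c) for c in zip(*square)]
I4 = np.eye(4, dtype=complex)

# Each row and each column is a jointly measurable "context."
for triple in square + columns:
    for A in triple:
        for B in triple:
            assert np.allclose(A @ B, B @ A)

row_products = [t[0] @ t[1] @ t[2] for t in square]
col_products = [t[0] @ t[1] @ t[2] for t in columns]

# Rows multiply to +I; columns multiply to +I, +I, -I. Any global assignment
# of definite values +/-1 to the nine observables would have to reproduce
# these signs, yet the product over all rows (+1) contradicts the product
# over all columns (-1). No context-independent valuation exists.
assert all(np.allclose(P, I4) for P in row_products)
assert np.allclose(col_products[0], I4) and np.allclose(col_products[1], I4)
assert np.allclose(col_products[2], -I4)
```

The value of each observable thus depends on the context (row or column) in which it is considered, which is exactly the KS constraint described above in the A, B, C formulation.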
As a matter of fact, the actualization of one particular potentiality present within the preindividual does not eliminate the reality of the others:
Being as being is given entirely in each of its phases, but with a reserve of becoming; one could say that being has several forms and consequently several entelechies. [B]eing is not only what it is as manifested, because this manifestation is the entelechy of only one phase; while this phase is actualizing itself, other latent and real phases, actual even as energetically present potential, exist, and the being consists in them as much as in its phase by which it reaches the entelechy.
The contextual dependence of quantum observables breaks with the operational assumption that the physical system can be defined through the measurement of actual properties. Being manifests itself through several phases of becoming, whether they are actualized or not. Therefore, if we want to be realists about QM, following Simondon, we must take into account not only the actualization process, but also the realm of potentialities from which actualizations arise.
2.3. Probability Density as Potentialities.
From a formal point of view, the set of propositions QM talks about is structured in an orthomodular lattice. The KS theorem can then be seen as a consequence of the failure of the law of distributivity on this lattice. If we consider an operator of possibility on the orthomodular structure of propositions, ◊, defined adequately, the possible propositions defined in this way are in the center of the extended structure and, consequently, escape the constraint provided by binary valuations. One may think one has restored coherence with classical logic, but this solution is illusory because such “possibilities” are certainly not classical. As mentioned above, the space of possibilities in classical logic can only be regarded as the space of future actualizations—they cannot correspond to incompatible actualizations as in the case of quantum potentialities. In fact, a theorem analogous to the KS theorem in the framework of modal logic (the modal KS theorem, or MKS) proves that quantum potentialities are not classical possibilities. Regarding the actualization of quantum possibilities, one can accept the projection postulate without interpreting it as a real “collapse” or physical interaction taking place in space-time. In fact, in order to maintain its realistic claim, the modal interpretation admits that the non-actualized possible values do not vanish when a measurement is made. As remarked by Pieter Vermaas:
In modal interpretations the state is not changed if a certain state of affairs becomes actualized. The undiscounted possibilities are not removed from the description of a system and this state therefore codifies not only what is present but also what is currently possible. These undiscounted possibilities may, as a consequence, in principle still affect the course of subsequent events.
In this sense, Simondon affirms that the individual who is actualized must be held for “a relative reality, a certain phase of being which presupposes a preindividual reality before it, and which, even after individuation, does not exist alone, because individuation does not exhaust the potentials of preindividual reality all at once.” The paradoxes of QM only arise from the application of substantialist concepts. The principles of classical logic cannot be applied to an individuation process that has not yet reabsorbed the divergent potentialities of the preindividual:
If it were true that logic relates to the propositions relating to being only after individuation, a theory of being prior to all logic should be instituted; this theory could serve as a foundation for logic, for nothing proves in advance that being is individuated in one possible way; if several types of individuation existed, several logics should also exist, each one corresponding to a defined type of individuation.
The algebraic structure of the space of possibilities that corresponds to QM cannot be put in correspondence with the truth judgments formulated in Boolean logic. The algebraic structure of the orthomodular lattice determines a different meaning of what is “possible,” breaking with classical physics and the epistemic account of possibility (i.e., the anticipation of future actualized properties on the basis of which truth judgments can be stated). Insofar as the potentialities encompass both actualized and undiscounted possibilities, and since there is no trivial relation between the structure of the orthomodular lattice and the determination of truth in a Boolean logic, one must admit that the ontology that intends to account for quantum possibilities must radically change. This philosophical elaboration, induced by the constraints of the formalism of QM, constitutes a way of resolving conceptual difficulties that has hardly been explored by the existing literature—probably because of the epistemological obstacles introduced by the notion of entity. Simondon may be the only philosopher who has understood not only that the potential is as real as the actual but also that “the potentials of a system constitute its power to become without degrading; they are not the mere potentialities of future states, but a reality that drives them to be.” In this respect, we might say that Simondon has produced—thanks to his preindividual hypothesis—one of the best characterizations of the potential mode of existence and of its logical and ontological implications.
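The failure of distributivity in the lattice of subspaces, on which the discussion of the orthomodular structure rests, can be verified in the smallest possible case. In this illustrative numpy sketch (ours, not the authors’), three distinct lines in C² already violate the distributive law that holds in any Boolean algebra; dimensions of meets are computed via the modular identity for vector subspaces.

```python
import numpy as np

def dim(M):
    """Dimension of the subspace spanned by the columns of M."""
    return int(np.linalg.matrix_rank(M))

def join(A, B):
    """Lattice join: the span of the union of two subspaces."""
    return np.hstack([A, B])

def meet_dim(A, B):
    """dim(A ∧ B) via the modular identity dim(A∩B) = dim A + dim B - dim(A+B),
    valid for subspaces of a finite-dimensional vector space."""
    return dim(A) + dim(B) - dim(join(A, B))

a = np.array([[1.0], [0.0]])  # line spanned by (1, 0)
b = np.array([[0.0], [1.0]])  # line spanned by (0, 1)
c = np.array([[1.0], [1.0]])  # line spanned by (1, 1)

# c ∧ (a ∨ b): a ∨ b is all of C^2, so the meet is the line c itself (dim 1).
lhs = meet_dim(c, join(a, b))
# (c ∧ a) ∨ (c ∧ b): both meets are the zero subspace, so their join is {0}
# (dim 0); summing the two zero dimensions gives the dimension of the join here.
rhs = meet_dim(c, a) + meet_dim(c, b)

assert (lhs, rhs) == (1, 0)  # c ∧ (a ∨ b) ≠ (c ∧ a) ∨ (c ∧ b)
```

Since propositions about a quantum system correspond to such subspaces, the distributive law of classical (Boolean) logic fails for them, which is the formal root of the KS-type constraints discussed above.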
We have shown how the conceptual framework of Simondon’s theory of individuation, based on the preindividual hypothesis, provides a guiding thread for producing a realistic but non-substantialist interpretation of the orthodox quantum formalism. While non-realistic interpretations consider the quantum wave function Ψ only as a prediction tool, we regard it as the objective description of the preindividual realm. The decomposition of Ψ on its different bases is then understood as the phase shift of the preindividual. In this case, quantum possibilities are interpreted in terms of potentialities—that is to say, according to a mode of existence distinct and independent from the actual. Even after the measurement, after the actualization of one of these potentialities, only the remanence of the preindividual gives meaning to quantum superpositions. Directly induced by the formalism, our non-substantialist interpretation of QM provides a formal account of Simondon’s theory of individuation based on the preindividual hypothesis.
We insist that the originality of Simondon’s notion of the preindividual breaks with the notion of entity. It defines itself ontologically upstream of the existence of individuals, so it does not aim to restore the consistency of QM with classical logic (as the physics of any object whatsoever), but rather induces a theoretical perspective on the individuation process that embraces the constraints of QM without any paradox.
The relation of being to itself is infinitely richer than identity: identity, poor relation, is the only relation of being to itself that can be conceived according to a doctrine that considers being as possessing a single phase; identity, in the theory of polyphased being, is replaced by internal resonance.
The intention that guided Simondon as a philosopher of nature was “to know the individual through individuation rather than individuation from the individual.” Here, we have tried to reverse the usual philosophical analysis of QM (which aims to understand the potentialities in terms of actual entities) and replace it with an approach in which the understanding of the actual derives from the analysis of potentiality itself. This ambition overlaps with what constitutes the conceptual originality of QM understood as a description of the process of individuation:
Individuation must then be considered as a partial and relative resolution manifested in a system containing potentials and containing a certain incompatibility with respect to itself, an incompatibility made of forces of tension as well as an impossibility of an interaction between the extreme terms of the dimensions.

Considering the realistic non-substantialist interpretation of the orthodox quantum formalism that we have elaborated thanks to Simondon’s concepts, one may wonder why he did not ground his own interpretation of QM on his preindividual hypothesis instead of discussing Louis de Broglie’s interpretation in terms of pilot-wave and Niels Bohr’s principle of complementarity. Our guess is that it was quite impossible to prove the relevance of the preindividual hypothesis without knowing the KS theorem. However, this makes the accuracy of Simondon’s intuition all the more impressive.
-Vincent Bontems & Christian De Ronde, "Simondon and quantum mechanics (or, on how the ‘preindividual’ hypothesis leads to a realistic but non-substantialist interpretation of the orthodox quantum formalism)", Philosophy Today, 2019, 63(3), pp.611–624.