Sunday, December 22, 2024

Some topics in the philosophy of nature

The relationship between the concepts of determinism, predetermination, computability, cardinality, causality and the foundations of the calculus. To study this we need a mathematical general systems theory, hopefully general enough for this investigation. 

It is clear that 'determinism' is a very complex and ambiguous term and that it has only been given a rigorous sense in the case of systems equivalent to Turing machines, which are a special case of finite or countably infinite systems. Note that there are finite or countably infinite systems which are not computable and hence not deterministic in the ordinary sense of this term (for instance, a discrete dynamical system on $\mathbb{N}$ whose transition function encodes the halting problem). Thus this sense of determinism implies computability, which in turn implies that to determine the evolution of the system we need consider only a finite amount of information involving present or past states. And we should ask how the even more complex concept of 'causality' comes in here. What are we to make of the concept of causality defined in terms of such computable determinism? Note that a system can be considered deterministic in a metaphysical sense without being in fact computable.

A fundamental problem is understanding the role of differential (and integral) equations in natural science and the philosophy of nature. The key aspects here are: being an uncountable model, and expressing causality in a way distinct from the computational deterministic model above. Note the paradox: on one hand, 'numerical methods' are discrete, computable, deterministic approximations of differential models. On the other hand, the differential models used in science are themselves clearly obtained as approximations and idealizations of nature, for instance in the use of the Navier-Stokes equations, which discard the molecular structure of fluids.

One problem is to understand the causality and determinism expressed in differential models in terms of non-standard paradigms of computation beyond the Turing limit. One kind of hypercomputational system can be defined as carrying out a countably infinite number of computational steps in a finite time (for instance, performing the $n$-th step in time $2^{-n}$, as in so-called Zeno machines).

For a mathematical general systems theory we have considered two fundamental kinds of systems: transpositions to generalized cellular automata/neural networks of the Eulerian and Lagrangian approaches to fluid mechanics. It is clearly of interest to consider non-countable and hypercomputational versions of such general cellular automata: to be able to express differential models in a different way and to generalize them by discarding the condition of topological locality (a non-locality already found in integro-differential equations, the convolution operation, Green's function, etc.).
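
As a toy illustration of the two pictures (everything here, from the names to the update rules, is our own illustrative choice and not a fixed formalism): in the 'Eulerian' automaton the state lives on fixed cells and is updated from a neighbourhood, while in the 'Lagrangian' one the state is carried by moving agents and updated through encounters.

```python
import random

# 'Eulerian' picture: states attached to fixed cells, each updated from its neighbourhood.
def eulerian_step(cells, rule):
    n = len(cells)
    return [rule(cells[(i - 1) % n], cells[i], cells[(i + 1) % n]) for i in range(n)]

# 'Lagrangian' picture: states carried by moving agents, updated through co-located agents.
def lagrangian_step(agents, move, interact):
    moved = [(move(pos, state), state) for pos, state in agents]
    by_cell = {}
    for pos, state in moved:
        by_cell.setdefault(pos, []).append(state)
    return [(pos, interact(state, by_cell[pos])) for pos, state in moved]

# toy run: majority rule on bits (Eulerian); random walk with local averaging (Lagrangian)
cells = [random.randint(0, 1) for _ in range(20)]
cells = eulerian_step(cells, lambda a, b, c: 1 if a + b + c >= 2 else 0)

agents = [(random.randrange(20), random.random()) for _ in range(10)]
agents = lagrangian_step(agents,
                         move=lambda p, s: (p + random.choice([-1, 1])) % 20,
                         interact=lambda s, group: sum(group) / len(group))
```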

The deep unsolved problems regarding the continuum are involved here as well as their intimate connection to the concepts of determinism, causality, computability and the possibility of applying differential models to nature. 

A special case of this problem involves a deeper understanding of all the categories of functions deployed in modern analysis: continuous, smooth, with compact support, of bounded variation, analytic, semi- and sub-analytic, measurable, $L^p$, tempered distributions, etc. How can 'determinism' and even computability be envisioned in models based on these categories?

What if nature were ultimately merely measurable rather than continuous? That is, what if the temporal evolution of the states of systems, modeled as a function $\phi: T \rightarrow S$, must involve some kind of merely measurable map $\phi$? Our only 'causality' or 'determinism' would then have to involve generalized derivatives in the sense of distributions. And yet the system can still be deterministic in the metaphysical sense and even hypercomputational in some relevant sense. Or maybe such maps are generated by sections of underlying deterministic continuous processes?

General determinism and weak causality involve postulating properties of the evolution of the system which may not be logically or computationally sufficient to predict that evolution in practice. This is similar to the situation in which, given a recursive axiomatic-deductive system, we cannot know in practice whether a given sentence can be derived or not. Also, constructions like the generalized derivative of locally integrable functions involve discarding much information.
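
To make the last remark concrete: the generalized derivative of a locally integrable $f$ on $T$ is defined by duality against test functions,

\[ \langle f', \varphi \rangle \;=\; - \int_T f(t)\, \varphi'(t)\, dt, \qquad \varphi \in C^{\infty}_c(T), \]

so that two functions agreeing almost everywhere have the same generalized derivative: whatever $f$ does on a set of measure zero is simply discarded. For instance the Heaviside step $H$ has $H' = \delta$ regardless of how (or whether) $H$ is defined at the jump point itself.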

For quantum theory: actual position and momentum are given by non-continuous measurable functions over space-time (we leave open the question of particle or wave representations). The non-continuity implies non-locality, which perhaps renders the so-called 'uncertainty principle' more intelligible. The wave-function $\psi$ is already a kind of distribution or approximation containing probabilistic information. Quantum theory is flawed because the actual system contains more information than is embodied in the typical wave-function model - a situation analogous to the way in which the generalized derivative involves discarding information about the function.

Uncertainty, indeterminism, non-computability are a reflection thus not of nature itself but of our tools and model-theoretic assumptions. In the same way it may well be that it is not logic or mathematics that are 'incomplete' or 'undecidable' but only a certain paradigm or tool-set that we happen to choose to employ.

Another topic: the study of nature involves hierarchies of models which express different degrees and modes of approximation or ontological idealization - but which must be ordered in a coherent way. Clearly the indeterminism or problems of a given model at a given level arise precisely from this situation: small discrepancies at a lower level which have been swept under the rug can in the long run have drastic repercussions on higher-level models, even if most of the time they can be considered negligible. And we must be prepared to envision the possibility that such hierarchies are imposed by the nature of our rationality itself as well as by experimental conditions - and that the levels may be infinite.

Computation, proof, determinism, causality - these are all connected to temporality, to the topology and linear order of time; and a major problem involves the uncountable nature of this continuum.

In mathematical physics we generally have an at least continuous function from an interval of time into Euclidean space, configuration space or a space-time manifold, describing a particle or a system of particles. More generally we have fields (sections of finite-dimensional bundles) defined on such spaces which are in general at least continuous, often locally smooth or analytic. This can be generalized to distributions, to fields of operators or even operator-valued distributions. But what if we considered, at a fundamental level, movements and fields which were merely measurable and not continuous (or only section-wise continuous)? Measurable and yet still deterministic. Does this even make sense? At first glance 'physics' would no longer make sense, as there would no longer be any locality or differential laws. But there could still be a distributional version of physics and a version of physics based on integrals. Suppose the motion of a particle is now a merely measurable (or locally integrable) function $\phi: T \rightarrow \mathbf{R}^3$. Consider a free particle. In classical physics, if we know the position and momentum at a given time then we know the position (and momentum) at any other time (uniform linear motion). But there is no canonical choice for a non-continuous function. Given a measurable function $f: T \rightarrow \mathbf{R}^3$ we can integrate and define a probability density $\rho: \mathbf{R}^3 \rightarrow P$ which determines how frequently the graph of $f$ intersects a small neighbourhood of a point $x$. But what are we to make of a temporally evolving $\rho$ (we could consider a rapid time, at the Planck scale, and a slow time)?

Tentative definition of the density function:

\[  \rho_f (x) =   \lim_{U \downarrow \{x\}}\frac{\mu(f^{-1}(U))}{m(U)} \]

where the limit is taken over open neighbourhoods $U$ of $x$ shrinking to $x$, $\mu$ is a Borel measure on $T$ and $m$ is the Lebesgue measure on $\mathbf{R}^3$. Question: given a continuous function $g : \mathbf{R}^n \rightarrow \mathbb{K}$, where $\mathbb{K}$ is either the real or complex numbers, and a (signed) Borel measure $\mu$ on $T\subset \mathbf{R}$, is there a canonical measurable non-continuous function $f :T \rightarrow \mathbf{R}^n$ such that $\rho_f = g$? It would seem not. Any choice among possible 'random' candidates implies extra information. And we need to make sense of this question for continuous families $g_t$ of continuous functions, for example $g_t = e^{i\pi t}$. The differential laws of $g_t$ might need to be seen as finite approximations.
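
Numerically (this sketch, including the sampling conventions and the choice of normalized time measure, is purely our own illustration), $\rho_f(x)$ can be approximated by occupation time: the fraction of sampled instants landing in a small ball around $x$, divided by the ball's volume.

```python
import numpy as np

def density_estimate(f_samples, x, eps):
    """Occupation-time approximation of rho_f(x): fraction of sampled instants whose
    image lies in the ball of radius eps around x, divided by the ball's volume in R^3.
    Here mu is taken to be the normalized counting measure on the sample instants."""
    f_samples = np.asarray(f_samples)                      # shape (N, 3): f(t_i) at equally spaced t_i
    inside = np.linalg.norm(f_samples - x, axis=1) < eps
    ball_volume = 4.0 / 3.0 * np.pi * eps**3
    return inside.mean() / ball_volume

# toy 'nowhere continuous' trajectory: independent draws from a Gaussian cloud
rng = np.random.default_rng(0)
traj = rng.normal(size=(100_000, 3))
print(density_estimate(traj, x=np.zeros(3), eps=0.3))      # roughly (2*pi)**(-3/2) ~ 0.063
```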

Define real computable process. 

Another approach: we have a measurable map $f: T \rightarrow A \times B$. Suppose that we know only $f_A(t_0)$ and not $f_B(t_0)$, while knowledge of both would be theoretically enough to compute $f(t)$ for $t > t_0$. Then given a $U \subset A \times B$ we can take the measure of the set $V \subset B$ such that if $f_B(t_0) \in V$ then $f(t) \in U$, as sketched below.
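
A schematic of this idea (the evolution map, the target set and the uniform sampling of the unknown $B$-component are all hypothetical stand-ins):

```python
import numpy as np

def measure_of_good_B(evolve, a0, t, in_U, b_samples):
    """Fraction of candidate hidden values b for which knowing (a0, b) at t0 would
    put the evolved state f(t) inside the target set U (tested by the predicate in_U)."""
    hits = [in_U(evolve(a0, b, t)) for b in b_samples]
    return np.mean(hits)

# hypothetical toy evolution on A x B = R x R and a target set U
evolve = lambda a, b, t: (a + t * b, b)            # a drifts at a rate given by the hidden b
in_U = lambda state: abs(state[0]) < 1.0           # U: first coordinate stays near 0
print(measure_of_good_B(evolve, a0=0.0, t=2.0, in_U=in_U,
                        b_samples=np.linspace(-1, 1, 1001)))   # ~0.5
```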

If a trajectory is measurable and not continuous, does velocity or momentum even make sense ? 

For $f : T \rightarrow \mathbf{R}^3$ measurable (representing the free movement of a single particle) we can define, for each interval $I \subset T$, $\rho_I (x) =   \lim_{U \downarrow \{x\}}\frac{\mu(f^{-1}(U) \cap I)}{m(U)}$, which can be thought of as a generalized momentum, but one in which causality and temporal order are left behind. Thus we could assign to each open interval $I \subset T$ a density function $\rho_I: \mathbf{R}^3 \rightarrow \mathbb{K}$. We can then postulate that the variation of the $\rho_I$ with $I$ is continuous in the sense that given an $\epsilon$ we can find a $\delta$ such that for any partition $I_i$ of $T$ with $d(I_i) < \delta$ we have that $|| \rho_{I_{i+1}} - \rho_{I_i}|| < \epsilon$ for some suitable norm.
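
Continuing the occupation-time sketch above (reusing its density_estimate function and traj array, with the same hypothetical conventions), the per-interval densities $\rho_I$ and the postulated continuity of their variation can be probed by estimating the density on consecutive time windows and comparing neighbours in a sup norm over a spatial grid.

```python
import numpy as np

def windowed_densities(f_samples, window, grid, eps):
    """Estimate rho_I on consecutive windows I of the sampled trajectory
    (density_estimate and traj come from the occupation-time sketch above)."""
    return [np.array([density_estimate(f_samples[i:i + window], x, eps) for x in grid])
            for i in range(0, len(f_samples) - window + 1, window)]

grid = [np.array([x, 0.0, 0.0]) for x in np.linspace(-2.0, 2.0, 9)]
rhos = windowed_densities(traj, window=10_000, grid=grid, eps=0.5)
sup_diffs = [np.max(np.abs(b - a)) for a, b in zip(rhos, rhos[1:])]   # ||rho_{I_{i+1}} - rho_{I_i}||
```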

This construction can be repeated if we consider hidden state variables for the particle, that is, $f : T \rightarrow \mathbf{R}^3 \times H$ for some state-space $H$. Of course we cannot in practice measure $H$ at a given instant of time for a given particle. Note also that if we have two measurable maps then indiscernibility follows immediately - individuation is tied to continuity of trajectories.

Space-time is like a fluctuating ether which induces a Brownian-like motion of the particle - except not continuous at all, only measurable. Maybe it is $H$ that is responsible for directing the particle (a vortex in the ether) and making it behave classically in the sense of densities.

A density function (for a small time interval) moving like a wave makes little physical sense. Why would the particle jump about in its merely measurable trajectory and yet have such a smooth deterministic density function ? It is tempting to interpret the density function as manifesting some kind of potential - like a pilot wave.

The heat equation $\partial_t f = k\,\partial^2_x f$ represents a kind of evening out of a function $f$: valleys ($f''>0$) are raised and hills ($f''< 0$) are lowered. But heat is a stochastic process. Maybe this provides a clue to understanding the above - except that in this case there is only one very rapid particle playing the role of all the molecules.
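
A minimal finite-difference sketch of that evening-out (explicit Euler in time, second difference in space, with arbitrary illustrative step sizes):

```python
import numpy as np

def heat_step(f, k=1.0, dt=0.1):
    """One explicit step of f_t = k f_xx on a periodic unit-spaced grid:
    valleys (f'' > 0) are raised, hills (f'' < 0) are lowered.
    Stable as long as k*dt <= 1/2."""
    f_xx = np.roll(f, -1) - 2 * f + np.roll(f, 1)
    return f + dt * k * f_xx

x = np.arange(100)
f = np.sin(2 * np.pi * 3 * x / 100)          # three hills and three valleys
for _ in range(1000):
    f = heat_step(f)
print(f.max())                                # amplitude almost completely evened out
```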

Another approach: given a continuous function on a region, $\phi: U \rightarrow \mathbf{R}^+$, construct a nowhere continuous function $\tau :T \rightarrow U$ such that $\phi$ is the density of $\tau$. This is the atomized field. The Schrödinger equation is then an approximation, just as the Navier-Stokes equations are an approximation ignoring the molecular structure of fluids.

Newton's first law of motion expresses minimality and simplicity in the behaviour of a free particle. We can say likewise that a free atomized field is completely random, spread out uniformly in a region of space - as yet without momentum. Momentum corresponds to a potential which directs and influences the previously truly free atomized field. Our view is that a genuinely free particle or atomized field is one in which the particle has equal probability of being anywhere (i.e. it does not have cohesion; any cohesion must be the effect of a cause). Thus Newton's free particle is not really free but a particle under the influence of a directed momentum field. There are rapid processes which create both the atomized field (the particle) and the directing field.

Why should we consider a gravitational field as being induced by a single mass when in reality it only manifests when there are at least two ? 

In physics there are PDEs which are derived from ODEs at a more fundamental level, and there are PDEs which are already irreducibly fundamental.

A fundamental problem in philosophy: the existence of non-well-posed problems (in PDEs), even with smooth initial conditions. This manifests not so much the collapse of the differential model of determinism as the essentially approximative nature of PDE modelling. Philosophically, the numerical approximation methods and the PDEs might be placed on an equal footing: they are both approximations of reality. Even the simplest potential (the gravitational field of a point particle) must have a discontinuity.

Weak solutions of PDEs are in general not unique - goodbye determinism. The problem with mathematical physics is that it lacks an ontology beyond the simplest kind. It is applied to locally homogeneous settings - or to systems which can be merged together in a straightforward way. It lacks a serious theory of individuality and interaction - something seen in the phenomenon of shock waves.
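
A standard textbook illustration of this non-uniqueness (the inviscid Burgers equation with Riemann initial data, not anything specific to the discussion above):

\[ u_t + \big(\tfrac{1}{2}u^2\big)_x = 0, \qquad u(x,0) = \begin{cases} 0, & x < 0,\\ 1, & x > 0, \end{cases} \]

is solved in the weak (distributional) sense both by the rarefaction wave $u(x,t) = \min(\max(x/t,\,0),\,1)$ and by the entropy-violating shock $u(x,t)=0$ for $x< t/2$, $u(x,t)=1$ for $x>t/2$; only an additional entropy condition singles out the physical solution.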

The above considerations on quantum fields are of course very simple and we should rather address the interpretation of quantum fields as (linear) operator-valued distributions (over space-time) (see David Tong's lectures on QFT). This involves studying the meaning of distributions and the meaning of (linear) operator fields - and of course demanding perfect conceptual and mathematical rigour, with no brushing of infinities under the carpet. And consider how a Lagrangian is defined for these fields, involving their derivatives and possibly higher powers (coupling, self-interaction, "particle" creation and annihilation). What does it even mean to assign to each point an operator on a Hilbert space (Fock space)? How can this be interpreted according to the approach above?

But we have not even touched upon some of the fundamental elements of physics: Lagrangian densities (why the restrictions on their dependencies?), the locality of the Lagrangian, the principle of least action, Noether's theorem, Lorentz invariance, the energy-momentum tensor. But we consider the distinction between scalar and vector fields to be of the utmost mathematical and philosophical significance.

And what are some points regarding classical quantum field theory? The interplay between the Heisenberg and Schrödinger pictures in perturbation theory. That our 'fields' are now our canonical coordinates seen as fields of operators. That we now have a whole calculus of operator-valued functions (for example $a_p e^{ip\cdot x}$ where $p$ is a momentum vector and $a_p$ is the corresponding creation operator): PDEs, integral solutions, Green's functions, propagators, amplitudes via scattering matrices, etc. That the field itself is now beyond space and time - it is not a function of space and time, but recalls rather the Alayavijnana in Yogâcâra philosophy; physics is an excitation, via such operator fields, of this primordial field (and we will not enter here into a discussion of zero energy and the Casimir effect).

How then do we deploy our approach to elementary QFT? Perhaps consider a merely measurable field (not necessarily continuous) $M \rightarrow A$ in which $M$ is space-time and $A$ is some structure over which it is possible to define topologies and $\sigma$-algebras and to do calculus.

The structure that replaces the real or complex field in the operator calculus might most naturally be seen as a C*-algebra. But operators act on a Hilbert space, so we also need our C*-algebras to come with representations on a given inner product space $F$. Thus a field takes space-time points to representations of C*-algebras on a space $F$. Amplitudes $\langle 0 | \phi(x)\phi(y) | 0 \rangle$ (here $0$ represents the ground state, not the null element of a vector space) are obtained by applying the representation to $|0\rangle$ and then taking the inner product again with $|0\rangle$. More generally this gives amplitudes for certain transitions. The actual state space for a quantum field (with its excitations) is completely non-localized. But could this be given a topological interpretation (without having to go global), for instance as the (co)homology of a bundle?
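
For reference, in the standard case of the free real scalar field (as in Tong's lectures) this amplitude can be computed explicitly, and any reinterpretation along the lines above would have to recover it:

\[ \langle 0 | \phi(x) \phi(y) | 0 \rangle \;=\; \int \frac{d^3 p}{(2\pi)^3}\, \frac{1}{2E_{\mathbf{p}}}\, e^{-i p\cdot(x-y)}, \qquad E_{\mathbf{p}} = \sqrt{\mathbf{p}^2 + m^2}, \]

with $p^0 = E_{\mathbf{p}}$ in the exponent.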

Addendum to our paper Hegel and modern topology: for physics and systems theory the most developed and relevant sections in the Logic are found in the section on the Object (mechanism, chemism and teleology) and the first parts of the section on the Idea (with regard to process and life). For relevance to topology and geometry this is a good point of entry, with special focus on self-similarity (maybe in quantum field theory?) and goal-oriented systems. The final parts of the Idea are clearly about self-reference, self-knowledge, self-reproduction and self-production, and clearly culminate in a form of abstract theology. Concept-Idea is both essentially self-referential (it is knowledge knowing itself, knowledge that is being and being that is knowledge, and also process and goal) and self-productive as well as generative, in the sense that it generates or emanates nature.

What might be the deeper philosophical significance of the delta function in physics, the fact that its Fourier transform (in the sense of Fourier transforms of distributions) is a constant function ? It seems to have something to do with the correspondence between space and time.

The following is certainly very relevant to our discussion on determinism, causality and differential vs. measurable models.

1. Can thermodynamics be deduced mathematically from a more fundamental physical theory ?

2. Could we consistently posit a fifth force in nature which manifests in a decrease in entropy ?

The problem here is that many ordered states can be conceived as evolving, as usual, into the same more highly disordered state. This can even be approached by attempting to give an underlying deterministic account (as in the kinetic theory of gases). Thus thermodynamics just gives general dynamical-systems results which apply to the specific systems of nature.

But if a new force manifested itself in the sense of decreasing entropy, then a reconciliation with determinism would be more problematic: from a single chaotic state there is a multitude of ordered states to which it could (apparently, at least) consistently evolve (emerge). Thus there seems to be some kind of choice or freedom, a process like the collapse of the wave function.

Perhaps in nature there is a conservation law involving entropy and anti-entropy.  Life is a manifestation of the anti-entropy which balances out the usual entropy increase in physics.

Like consciousness, causality, determinism, proof, number and computation - entropy is intimately connected to the direction and linear flow of time. 

The big question is: are there finitary computational or differential deterministic processes which have entropy-decreasing behaviour (do they evolve complex behaviour, self-reference, etc.)? We would say that this seems indeed not to be the case. Thus we need to move on to infinitary (hyper)computational systems and to deterministic but not necessarily continuous systems. There is indeed a connection between the problems of the origin and nature of life, Gödel's incompleteness theorems and the problems of quantum field theory.

Differential models and their numerical approximation methods are some of the most successful and audacious applications and apparent extensions of our finitary computable paradigm. But they cannot explain or encompass everything. 

If we postulated that space and time were discrete, then there could be no uniform linear motion, for instance going two space units in three time units. At the second time unit the particle could claim equally to be in the 1st or the 2nd space cell - hence a situation of uncertainty. For more complex movements the uncertainty distribution can be greater.
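
A toy computation of the example (the rounding convention is of course arbitrary, which is exactly the point):

```python
import math
from fractions import Fraction

# uniform motion of 2 space cells every 3 time ticks on a discrete spatial lattice
for t in range(4):
    exact = Fraction(2, 3) * t                              # ideal continuous position
    cells = sorted({math.floor(exact), math.ceil(exact)})   # lattice cells it could be assigned to
    print(f"t={t}: exact position {exact}, candidate cells {cells}")
# t=1 (position 2/3) and t=2 (position 4/3) are genuinely ambiguous ticks
```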

The hydrogen atom: examine carefully the methods of obtaining the solutions of the Schrödinger equation in this instance and see if the solutions (involving spherical harmonics) can be given other physical interpretations (of the electron 'clouds') along the lines of our proposal.
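
For reference, the textbook result being referred to is the separation of variables for the Coulomb potential,

\[ \psi_{n\ell m}(r,\theta,\varphi) = R_{n\ell}(r)\, Y_{\ell}^{m}(\theta,\varphi), \qquad E_n = -\frac{m_e e^4}{2(4\pi\varepsilon_0)^2\hbar^2}\,\frac{1}{n^2} \approx -\frac{13.6\ \text{eV}}{n^2}, \]

where the spherical harmonics $Y_{\ell}^{m}$ carry the angular structure of the electron 'clouds' $|\psi_{n\ell m}|^2$; it is these densities that would need a reinterpretation along the lines of our proposal.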

What we need to do: elaborate the mathematical theory of how a free "quantum" particle (i.e. a particle with a completely discontinuous random trajectory in space) comes under the influence of a classical potential.  

Since we cannot write a differential equation for the non-continuous trajectory, our determinism must be defined by a differential equation on the probability density (as explained above). Take a potential and the Laplace equation. Physically, if the 'particle' is influenced by the potential then the totally dispersed 'cloud' will be attracted and reshaped according to the solutions of the equation.
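
One hedged way to write such a determinism directly on the density (an assumption on our part: we borrow the standard Smoluchowski/Fokker-Planck form $\partial_t \rho = \nabla\cdot(\rho\,\nabla V) + D\,\Delta\rho$, rather than anything derived above) drifts the dispersed cloud toward the minima of a potential $V$ while diffusion keeps it spread out. A 1D finite-difference sketch:

```python
import numpy as np

def density_step(rho, V, D=0.05, dx=0.1, dt=0.001):
    """One explicit step of rho_t = (rho V')' + D rho'' in 1D (one-sided differences at the
    boundary; a crude sketch). This evolution law is our assumption, not derived in the text."""
    grad_V = np.gradient(V, dx)
    rhs = np.gradient(rho * grad_V + D * np.gradient(rho, dx), dx)   # (rho V')' + D rho''
    return rho + dt * rhs

x = np.arange(-5.0, 5.0, 0.1)
V = 0.5 * x**2                                    # harmonic potential as the attracting influence
rho = np.ones_like(x) / (len(x) * 0.1)            # totally dispersed initial 'cloud'
for _ in range(20_000):
    rho = density_step(rho, V)
# rho now concentrates around x = 0, roughly proportional to exp(-V/D)
```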

6 comments:

  1. "In the same way it may well be that it is not logic or mathematics that are 'incomplete' or 'undecidable' but only a certain paradigm or tool-set that we happen to choose to employ."

    https://www.youtube.com/watch?v=tbU3zdAgiX8

    Dream of mathematics - of course.
    In mathematics there is no special way for kings
    But in the world there is no special way for mathematics as a queen
    So jokingly - please forgive me - greetings :)

  2. The exploration of hypercomputational systems, the reinterpretation of differential equations, and the intriguing idea of merely measurable trajectories push the boundaries of conventional understanding, particularly by challenging the implicit assumptions underpinning physical determinism and computational paradigms.

    One of the most compelling aspects is the connection drawn between the foundational issues in the philosophy of nature and cutting-edge challenges in mathematical physics, such as the limitations of weak solutions in PDEs and the ontology of operator-valued distributions in quantum field theory. The text’s suggestion that uncertainty and indeterminism might stem more from our epistemic tools than from the fabric of nature itself resonates with deeper philosophical critiques of scientific modeling.

    However, a critical question arises: how far can such abstract, non-continuous, or hypercomputational models retain empirical applicability while maintaining conceptual coherence? For instance, the discussion of measurable fields and densities raises intriguing possibilities but also challenges the role of locality and the physical interpretation of such models. Could this abstraction risk losing contact with the physical reality it seeks to describe, or does it hint at a paradigm shift where the very nature of "physicality" is redefined?

    I think these issues would also be interesting to analyse in the light of Penrose's three worlds model.

    Replies
    1. @Sveness Semeloduria
      Discrete circuits are the basics of electronics.
      Each waveform - a continuous function - is equivalent to a series of discrete samples - values, and vice versa.
      https://en.wikipedia.org/wiki/Nyquist%E2%80%93Shannon_sampling_theorem

      In addition, the discrete logistic equation is one of the simplest models of chaos. So the practical application is obvious.

      Our brain works discretely - alpha, beta, ... rhythms. That is, consciousness
      is like TV news broadcast periodically at 8 pm. And as there are no special editions,
      the reality between editions is the same and changes discretely with a new edition :))))

  3. It is important to point out that experimentation involves an exchange of essentially finite amounts of information (in the form of postulated approximations) and that a key feature of models is that they are computable and allow predictions within related finitary degrees of approximation. Thus both the differential and the measurable models may be argued to be coherent as theories of nature and still to express metaphysical determinism; yet in terms of our computable predicative approximations, the measurable models will in general translate only into differential models relating to probability densities.

  4. It would be interesting to think of functions from the rationals to the rationals which are computable in an adequate sense and yet which cannot be extended to a continuous function from the reals to the reals.

    Replies
    1. https://en.wikipedia.org/wiki/Dirichlet_function


A central problem of philosophy

To us a central problem of philosophy is to elucidate the relationship between the following three domains of (apparent) reality /experience...