Our topic is the relationship between the concepts of determinism, predetermination, computability, cardinality, causality, and the foundations of the calculus. To study this we need a mathematical general systems theory, one hopefully general enough for this investigation.
It is clear that 'determinism' is a very complex and ambiguous term and that it has only been given a rigorous sense in the case of systems equivalent to Turing machines, which are a special case of finite or countably infinite systems. Note that there are finite or countably infinite systems which are not computable and hence not deterministic in the ordinary sense of this term. Thus this sense of determinism implies computability, which in turn implies that to determine the evolution of the system we need consider only a finite amount of information involving present or past states. And we should ask how the even more complex concept of 'causality' comes in here. What are we to make of the concept of causality defined in terms of such computable determinism? Note that a system can be considered deterministic in a metaphysical sense without being in fact computable.
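A minimal illustration of this last point (our own toy construction): consider a discrete system whose state evolves by

\[ x_{n+1} = x_n + h(n), \qquad h(n) = \begin{cases} 1 & \text{if the $n$-th Turing machine halts on empty input,} \\ 0 & \text{otherwise.} \end{cases} \]

Every state is completely fixed by the initial state and the rule, so the system is deterministic in the metaphysical sense; yet no algorithm can compute its trajectory, since doing so would solve the halting problem.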
A fundamental problem is understanding the role of differential (and integral) equations in natural science and the philosophy of nature. The key aspects here are: the uncountability of the underlying model, and the expression of causality in a way distinct from the computational deterministic model above. Note the paradox: on the one hand, 'numerical methods' are discrete, computable, deterministic approximations of differential models. On the other hand, the differential models used in science are themselves obtained as approximations and idealizations of nature, as in the use of the Navier-Stokes equations, which discards the molecular structure of fluids.
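As a concrete illustration of the first horn of the paradox, here is a minimal sketch (Python, with illustrative parameter choices of our own) of how a differential model is replaced by a discrete, computable, deterministic approximation via the explicit Euler method.

```python
# Minimal sketch: explicit Euler discretization of dx/dt = F(x).
# The continuous (uncountable) model is replaced by a finite,
# computable rule acting on finitely many sampled states.

def euler(F, x0, t0, t1, n_steps):
    """Approximate the solution of dx/dt = F(x) on [t0, t1]."""
    dt = (t1 - t0) / n_steps
    xs = [x0]
    for _ in range(n_steps):
        xs.append(xs[-1] + dt * F(xs[-1]))  # finite information per step
    return xs

# Example: exponential decay dx/dt = -x, exact solution x(t) = exp(-t).
approx = euler(lambda x: -x, x0=1.0, t0=0.0, t1=1.0, n_steps=1000)
print(approx[-1])  # close to exp(-1) ~ 0.3679
```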
One problem is to understand the causality and determinism expressed in differential models in terms of non-standard paradigms of computation beyond the Turing limit. One kind of hypercomputational system can be defined as carrying out a countably infinite number of computational steps in a finite time.
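A standard way to make this precise (a sketch, not tied to any particular formalism) is an 'accelerating' or Zeno schedule: the $n$-th computational step is executed in time $2^{-n}$, so that

\[ \sum_{n=1}^{\infty} 2^{-n} = 1 \]

and countably many steps are completed within one unit of physical time; the state 'after' the supremum must then be specified by an additional limit rule.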
For a mathematical general systems theory we have considered two fundamental kinds of systems: transpositions to generalized cellular automata/neural networks of the Eulerian and Lagrangian approaches to fluid mechanics. It is clearly of interest to consider non-countable and hypercomputational versions of such general cellular automata: to be able to express differential models in a different way and to generalize them by discarding the condition of topological locality (a move already found in integro-differential equations, the convolution operation, Green's functions, etc.).
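A toy sketch (our own construction, Python, Eulerian in spirit: states attached to fixed cells) of a 'generalized cellular automaton' whose update rule is non-local: each cell is updated from a weighted sum over all cells, in the manner of a discrete convolution with a Green's-function-like kernel.

```python
import numpy as np

def nonlocal_step(state, kernel):
    """One synchronous update: every cell sees every other cell,
    weighted by a kernel depending only on the distance between cells."""
    n = len(state)
    idx = np.arange(n)
    # weights[i, j] = kernel(distance between cell i and cell j)
    weights = kernel(np.abs(idx[:, None] - idx[None, :]))
    return np.tanh(weights @ state)   # nonlinearity chosen arbitrarily

# Example: slowly decaying kernel, so topological locality is genuinely discarded.
state = np.random.rand(50)
for _ in range(10):
    state = nonlocal_step(state, kernel=lambda d: 1.0 / (1.0 + d**2))
print(state[:5])
```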
The deep unsolved problems regarding the continuum are involved here as well as their intimate connection to the concepts of determinism, causality, computability and the possibility of applying differential models to nature.
A special case of this problem involves a deeper understanding of all the categories of functions deployed in modern analysis: continuous, smooth, with compact support, bounded variation, analytic, semi- and sub-analytic, measurable, $L^p$, tempered distributions, etc. How can 'determinism' and even computability be envisioned in models based on these categories?
What if nature were ultimately merely measurable rather than continuous? That is, what if the temporal evolution of the state of a system, modeled as a function $\phi: T \rightarrow S$, involved some kind of merely measurable map $\phi$? Our only 'causality' or 'determinism' must then involve generalized derivatives in the sense of distributions. And yet the system can still be deterministic in the metaphysical sense and even hypercomputational in some relevant sense. Or maybe such maps are generated by sections of underlying deterministic continuous processes?
General determinism and weak causality involve postulating properties of the evolution of the system which may not be logically or computationally sufficient to predict that evolution in practice. This is similar to the situation in which, given a recursive axiomatic-deductive system, we cannot know in practice whether a given sentence can be derived or not. Also, constructions like the generalized derivative of locally integrable functions involve discarding much information.
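To recall the standard construction alluded to here (in one variable, say): a locally integrable $f$ defines a distribution $T_f$ by pairing with test functions, and its generalized derivative is defined by transferring the derivative onto the test function,

\[ \langle T_f, \varphi \rangle = \int f\,\varphi\, dx, \qquad \langle T_f', \varphi \rangle = -\langle T_f, \varphi' \rangle . \]

Since two functions equal almost everywhere define the same distribution, all pointwise information on null sets is discarded at the very first step.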
For quantum theory: actual position and momentum are given by non-continuous measurable functions over space-time (we leave open the question of particle or wave representations). The non-continuity implies non-locality which renders, perhaps, the so-called 'uncertainty principle' more intelligible. The wave-function $\psi$ is already a kind of distribution or approximation containing probabilistic information. Quantum theory is flawed because the actual system contains more information than is embodied in the typical wave-function model - a situation analogous to the way in which the generalized derivative involves discarding information about the function.
Uncertainty, indeterminism, and non-computability are thus a reflection not of nature itself but of our tools and model-theoretic assumptions. In the same way it may well be that it is not logic or mathematics that are 'incomplete' or 'undecidable' but only a certain paradigm or tool-set that we happen to employ.
Another topic: the study of nature involves hierarchies of models which express different degrees and modes of approximation or ontological idealization - but which are necessarily ordered in a coherent way. Clearly the indeterminism or problems of a given model at a given level arise precisely from this situation; small discrepancies at a lower level which have been swept under the rug can in the long run have drastic repercussions on higher-level models, even if most of the time they can be considered negligible. And we must be prepared to envision the possibility that such hierarchies are imposed by the nature of our rationality itself as well as by experimental conditions - and that the levels may be infinite.
Computation, proof, determinism, causality - these are all connected to temporality, to the topology and linear order of time; a major problem involves the uncountable nature of this continuum.
In mathematical physics we generally have an at least continuous function from an interval of time into Euclidean space, configuration space or a space-time manifold, describing a particle or system of particles. More generally we have fields (sections of finite-dimensional bundles) defined on such spaces which are in general at least continuous, often locally smooth or analytic. This can be generalized to distributions, to fields of operators or even to operator-valued distributions. But what if we considered, at a fundamental level, movements and fields which were merely measurable and not continuous (or only section-wise continuous)? Measurable and yet still deterministic. Does this even make sense? At first glance 'physics' would no longer make sense, as there would no longer be any locality or differential laws. But there could still be a distribution version of physics, a version of physics expressed through integrals. Suppose the motion of a particle is now an only measurable (or locally integrable) function $f: T \rightarrow \mathbf{R}^3$, and consider a free particle. In classical physics, if we know the position and momentum at a given time then we know the position (and momentum) at any other time (uniform linear motion). But there is no canonical choice for a non-continuous function. Given a measurable function $f: T \rightarrow \mathbf{R}^3$ we can integrate and define a probability density $\rho: \mathbf{R}^3 \rightarrow P$ which determines how frequently the graph of $f$ intersects a small neighbourhood of a point $x$. But what are we to make of a temporally evolving $\rho$ (we could consider a rapid time, at the Planck scale, and a slow time)?
Tentative definition of the density function:
\[ \rho_f (x) = \lim_{U \downarrow \{x\}}\frac{\mu(f^{-1} (U))}{m(U)} \]
where $\mu$ is a Borel measure on $T$, $m$ is the Lebesgue measure on $\mathbf{R}^3$, and the limit is taken over neighbourhoods $U$ of $x$ shrinking to $x$. Question: given a continuous function $g : \mathbf{R}^n \rightarrow \mathbb{K}$, where $\mathbb{K}$ is the field of real or complex numbers, and a (signed) Borel measure $\mu$ on $T\subset \mathbf{R}$, is there a canonical measurable non-continuous function $f :T \rightarrow \mathbf{R}^n $ such that $\rho_f = g$? It would seem not. Any choice among possible 'random' candidates implies extra information. And we need to make sense of this question for continuous families $g_t$ of continuous functions, for example $g_t = e^{i\pi t}$. The differential laws of $g_t$ might need to be seen as finite approximations.
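A crude numerical sketch (Python, one spatial dimension instead of $\mathbf{R}^3$, uniform $\mu$, all choices ours) of how an empirical $\rho_f$ can be estimated for a 'nowhere continuous' trajectory, by sampling it on a fine time grid and histogramming the values.

```python
import numpy as np

rng = np.random.default_rng(0)

# A stand-in for a merely measurable trajectory f: [0, 1] -> R:
# at each sampled instant the position is drawn independently,
# here from a standard normal, so nothing like continuity holds.
t = np.linspace(0.0, 1.0, 1_000_000)
f = rng.standard_normal(t.size)

# Empirical density: fraction of time spent near x, per unit volume,
# i.e. a histogram approximation of rho_f(x) = lim mu(f^{-1} U) / m(U).
counts, edges = np.histogram(f, bins=200, range=(-4, 4), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(centers[100], counts[100])  # close to the normal density near 0
```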
Define the notion of a real computable process.
Another approach: we have a measurable map $f: T \rightarrow A \times B$. Suppose that we know only $f_A(t_0)$ and not $f_B(t_0)$, while knowledge of both would be theoretically enough to compute $f(t)$ for $t > t_0$. Then, given a $U \subset A \times B$, we can take the measure of the set $V \subset B$ such that if $f_B(t_0) \in V$ then $f(t) \in U$.
If a trajectory is measurable and not continuous, does velocity or momentum even make sense?
For $f : T \rightarrow \mathbf{R}^3$ measurable (representing the free movement of a single particle) we can define, for each open interval $I \subset T$, $\rho_I (x) = \lim_{U \downarrow \{x\}}\frac{\mu(f^{-1} (U) \cap I)}{m(U)}$, which can be thought of as a generalized momentum, but one in which causality and temporal order are left behind. Thus we assign to each open interval $I \subset T$ a density function $\rho_I: \mathbf{R}^3 \rightarrow \mathbb{K}$. We can then postulate that the variation of $\rho_I$ with $I$ is continuous in the sense that, given an $\epsilon$, we can find a $\delta$ such that for any partition $I_i$ of $T$ into intervals of length $d(I_i) < \delta$ we have $\| \rho_{I_{i+1}} - \rho_{I_i}\| < \epsilon$ for some suitable norm.
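Extending the density sketch above (Python, same illustrative assumptions), the interval-indexed densities $\rho_I$ can be estimated on a partition into windows and their variation compared in, say, the $L^1$ norm.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 1_000_000)
# Trajectory whose *density* drifts slowly while each instant is still
# an independent, discontinuous jump: the mean moves with the slow time.
f = np.sin(2 * np.pi * t) + rng.standard_normal(t.size)

def rho(window):
    """Histogram estimate of rho_I for the sub-trajectory on window I."""
    counts, _ = np.histogram(window, bins=100, range=(-5, 5), density=True)
    return counts

# Partition T into intervals I_i and compare consecutive densities.
pieces = np.array_split(f, 100)
dens = [rho(p) for p in pieces]
gaps = [np.abs(a - b).sum() * 0.1 for a, b in zip(dens, dens[1:])]  # L1 norm, bin width 0.1
print(max(gaps))  # small: rho_I varies continuously with I
```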
This construction can be repeated if we consider hidden state variables for the particle, that is, $f : T \rightarrow \mathbf{R}^3 \times H$ for some state-space $H$. Of course we cannot in practice measure $H$ at a given instant of time for a given particle. Note also that if we have two such measurable maps then indiscernibility follows immediately - individuation is tied to continuity of trajectories.
Space-time is like a fluctuating ether which induces a Brownian-like motion of the particle - except not continuous at all, only measurable. Maybe it is $H$ that is responsible for directing the particle (a vortex in the ether) and making it behave classically in the sense of densities.
A density function (for a small time interval) moving like a wave makes little physical sense. Why would the particle jump about in its merely measurable trajectory and yet have such a smooth deterministic density function? It is tempting to interpret the density function as manifesting some kind of potential - like a pilot wave.
The heat equation $\partial_t f = k\,\partial^2_x f$ represents a kind of evening out of a function $f$: valleys ($f''>0$) are raised and hills ($f''< 0$) are lowered. But heat is a stochastic process. Maybe this provides a clue to understanding the above - except in this case there is only one very rapid particle playing the role of all the molecules.
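A minimal finite-difference sketch (Python, explicit scheme, parameters chosen only for stability) of the evening out described here: where $f'' > 0$ the value is raised, where $f'' < 0$ it is lowered.

```python
import numpy as np

# Explicit scheme for  df/dt = k d^2f/dx^2  on a periodic grid.
n, k = 200, 1.0
dx = 1.0 / n
dt = 0.5 * dx**2 / k          # stability bound for the explicit scheme
x = np.linspace(0.0, 1.0, n, endpoint=False)
f = np.where((x > 0.4) & (x < 0.6), 1.0, 0.0)   # a 'hill'

for _ in range(2000):
    lap = (np.roll(f, -1) - 2 * f + np.roll(f, 1)) / dx**2  # discrete f''
    f = f + dt * k * lap      # valleys (f'' > 0) rise, hills (f'' < 0) sink

print(f.min(), f.max())       # the profile has evened out toward its mean 0.2
```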
Another approach: given a continuous function on a region, $\phi: U \rightarrow \mathbf{R}^+$, construct a nowhere continuous function $\tau :T \rightarrow U$ such that $\phi$ is the density of $\tau$ over $T$. This is the atomized field. The Schrödinger equation is then an approximation, just as the Navier-Stokes equations are an approximation that ignores the molecular structure of fluids.
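A sketch of this 'atomization' step (Python, one dimension, inverse-transform sampling, all details ours): given a continuous density $\phi$ on a region, produce a nowhere-continuous trajectory $\tau$ whose empirical density recovers $\phi$.

```python
import numpy as np

rng = np.random.default_rng(2)

# Target density phi on U = [0, 1] (a normalized continuous profile).
xs = np.linspace(0.0, 1.0, 1000)
phi = 1.0 + 0.5 * np.cos(2 * np.pi * xs)          # integrates to 1 on [0, 1]

# Inverse-transform sampling: each instant of a fine time grid is sent
# independently to a point of U, so tau is nowhere continuous, yet the
# long-run fraction of time spent near x is proportional to phi(x).
cdf = np.cumsum(phi)
cdf /= cdf[-1]
tau = np.interp(rng.random(1_000_000), cdf, xs)

counts, _ = np.histogram(tau, bins=50, range=(0, 1), density=True)
print(counts[:3])   # tracks phi near x = 0 (values close to 1.5)
```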
Newton's first law of motion expresses minimality and simplicity in the behaviour of a free particle. We can say likewise that an atomized field, if free, is completely random, spread out uniformly in a region of space - as yet without momentum. Momentum corresponds to a potential which directs and influences the previously truly free atomized field. Our view is that a genuinely free particle or atomized field is one in which the particle has equal probability of being anywhere (i.e. it has no cohesion; any cohesion must be the effect of a cause). Thus Newton's free particle is not really free but a particle under the influence of a directed momentum field. There are rapid processes which create both the atomized field (the particle) and the directing field.
Why should we consider a gravitational field as being induced by a single mass when in reality it only manifests when there are at least two?
In physics there are PDEs which are derived from ODEs at a more fundamental level (as the heat equation can be derived from the dynamics of many individual molecules) and there are PDEs that are taken as already irreducibly fundamental.
A fundamental problem in philosophy: the existence of ill-posed problems (in PDEs), even with smooth initial conditions. This manifests not so much the collapse of the differential model of determinism as the essentially approximative nature of PDE modelling. Philosophically, numerical approximation methods and the PDEs themselves might be placed on an equal footing: both are approximations of reality.
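The classical illustration of ill-posedness even with smooth data is Hadamard's example: the Cauchy problem for the Laplace equation

\[ u_{xx} + u_{yy} = 0, \qquad u(x,0) = 0, \qquad u_y(x,0) = \tfrac{1}{n}\sin(nx) \]

has the solution $u(x,y) = \tfrac{1}{n^{2}}\sin(nx)\sinh(ny)$; as $n \to \infty$ the data tend uniformly to zero while the solution blows up for any fixed $y > 0$, so the solution does not depend continuously on the data.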
Weak solutions of PDEs are in general not unique - goodbye determinism - unless further conditions, such as entropy conditions, are imposed.
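The standard example behind these remarks: the inviscid Burgers equation with Riemann data

\[ u_t + \left(\tfrac{u^2}{2}\right)_x = 0, \qquad u(x,0) = \begin{cases} 0 & x < 0, \\ 1 & x > 0, \end{cases} \]

admits at least two weak solutions: a shock travelling at speed $\tfrac12$ and a rarefaction wave with $u(x,t) = x/t$ for $0 < x < t$; only the entropy condition singles out the rarefaction as the admissible one.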