Quantum physics is a very imperfect, incomplete and highly unsatisfactory theory, and its interpretations and approaches are multi-faceted and complex. Quantum physics took a drastically wrong turn with the Hilbert-space operator based von Neumann axiomatization (and the subsequent more sophisticated, but likewise inadequate, $C^\ast$-algebra approach) - abandoning the much more interesting initial historical connections to the 'dualistic theory of radiation', statistical mechanics, the photo-electric effect, geometric optics, the Hamilton-Jacobi equations, 'wave mechanics', 'matrix mechanics' - and above all the insights of Paul Dirac, one of the giants of 20th-century physics (Penrose's Twistor Theory is in some sense a continuation of Dirac).

The investigations carried out during the initial development of quantum physics gave rise to meaningful and interesting mathematics and physics which did not depend in any way on probabilistic interpretations (or the collapse) of the wave function - for example the study of the Lorentz-invariant Klein-Gordon and Dirac equations (recalled below). If the Hilbert-space and $C^\ast$-algebra based quantum theory is at least mathematically rigorous and interesting in its own right (it ultimately gave rise to Alain Connes' Non-Commutative Geometry), the same cannot be said of Feynman's and Schwinger's approach to (perturbative) quantum field theory. Cf. G.B. Folland's Quantum Field Theory: A Tourist Guide for Mathematicians (2008), where it is stated that once we leave the free field (itself requiring a staggering amount of functional analysis and distribution theory) we have left the realm of a correct mathematical formulation of physics. The problem is that QFT is not only bad mathematics, it is also bad experimental science.
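For reference, the two Lorentz-invariant equations alluded to above - which stand on their own as deterministic PDEs, independently of any probabilistic interpretation - are, in units with $\hbar = c = 1$,

$$(\Box + m^2)\phi = 0, \qquad (i\gamma^\mu \partial_\mu - m)\psi = 0,$$

where $\Box = \partial_t^2 - \nabla^2$ is the d'Alembertian and the $\gamma^\mu$ are the Dirac matrices satisfying $\{\gamma^\mu, \gamma^\nu\} = 2\eta^{\mu\nu}$.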
Something is rotten in the state of QED (Oliver Consa)
Consa says the much-touted precision of QED is based on measurements of the electron g-factor, but that “this value was obtained using illegitimate mathematical traps, manipulations and tricks”. Theoreticians come up with a calculation that exactly matches an experiment. Then a later experiment shows that the earlier experiment wasn’t quite correct. Then the theoreticians change their calculation to match the new experiment. And so on (...) Consa quotes Dyson from 2006: “As one of the inventors of QED, I remember that we thought of QED in 1949 as a temporary and jerry-built structure, with mathematical inconsistencies and renormalized infinities swept under the rug. We did not expect it to last more than 10 years before some more solidly built theory would replace it. Now, 57 years have gone by and that ramshackle structure still stands”. It still stands because it’s been propped up by scientific fraud. Here we are fourteen years later, and it’s still the same, and physics is still going nowhere. How much longer can this carry on? Not much longer, because now we have the internet.
https://physicsdetective.com/something-is-rotten-in-the-state-of-qed/
Perhaps a clue to improving this situation lies in a critique and reform of distribution theory - for example along the lines of Sato's theory of hyperfunctions. This has of course already been suggested in the context of the divergent infinite sums of $\delta$-functions appearing in QFT.
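To recall the flavour of that theory: a hyperfunction on $\mathbb{R}$ is the formal difference of boundary values of a function holomorphic just above and just below the real axis, $f(x) = F(x + i0) - F(x - i0)$. The $\delta$-function, for instance, has the defining function $F(z) = -\frac{1}{2\pi i z}$, so that

$$\delta(x) = \frac{1}{2\pi i}\left(\frac{1}{x - i0} - \frac{1}{x + i0}\right),$$

consistent with the Sokhotski-Plemelj formula $\frac{1}{x \mp i0} = \mathrm{P}\frac{1}{x} \pm i\pi\delta(x)$; singular objects become differences of perfectly well-behaved holomorphic data.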
We postulate axiomatically that the 'position', 'momentum', even 'energy' of a 'particle' are given by distributions over space-time (we still have our standard PDEs for distributions). Thus a 'particle' does not necessarily have a definite position at a given moment of time. Nor is a 'particle' a wave or field defined over space-time. It is a completely different kind of entity which subsumes as particular cases or approximations the aspects of wave and particle (for the wave-like aspect we have regular distributions, for the particle aspect $\delta$-functions, or something similar). There must be an "interactive" or alternative way of explaining the collapse of the wave function and the probabilistic aspect based on this perspective (we must carefully re-evaluate the controversies surrounding the interactive interpretation of the collapse of the wave-function as well as the hidden-variable approaches; cf. the work of A. Hobson (2012)).

Since experimentally we can only prepare 'test functions' with a limited degree of precision, it is not surprising that the output of the distribution should also exhibit a corresponding degree of uncertainty. But of course we need to ask what the physical nature of the test functions is. Do they not have to be (regular) distributions as well? In practice the test functions will not be exactly regular distributions but only approximately so (determined by some boundary conditions). Thus observations - which correspond to evaluating the test functions, or to interactions of localized distributions along a boundary - will have an uncertainty corresponding to the non-regular components of the approximate test function (a simple estimate along these lines is sketched below). Note that a distribution is essentially non-local (cf. Hobson's analogy to a bursting balloon), although it can be restricted.

It would be interesting to explore what this approach looks like from the point of view of Sato's hyperfunction theory - and sheaf cohomology (Penrose would endorse this!). And maybe Penrose's Twistor theory has an even greater significance in a completely different philosophical context than the one adopted by Penrose himself (who still adheres to the 'collapse of the wave-function' dogma).

Consider the double-slit experiment. We need a concept of boundary and interaction for distributions, and the ability to deduce probabilistic information from distributions, boundaries and completely deterministic equations. But we must not forget that the plate used in the double-slit experiment is only approximately a plane - in reality it has a highly irregular surface and there will always be one local region which is the first to "touch" the wave-front proceeding from the slits.
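As a minimal sketch of the uncertainty point above: write $T_f(\varphi) = \int f\varphi\,dx$ for a regular distribution with $f \in L^1_{loc}$ and $\delta_{x_0}(\varphi) = \varphi(x_0)$ for a point-like one. If the experimentally prepared test function $\tilde\varphi$ only approximates the intended $\varphi$ (both supported in a compact set $K$), then

$$|T_f(\varphi) - T_f(\tilde\varphi)| \le \|\varphi - \tilde\varphi\|_\infty \int_K |f|\,dx, \qquad |\delta_{x_0}(\varphi) - \delta_{x_0}(\tilde\varphi)| \le \|\varphi - \tilde\varphi\|_\infty,$$

so the output of the evaluation is uncertain exactly to the extent that the preparation of the test function is.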
Note that a regular distribution may be localized according to the support of its associated function $f$ in $L^1_{loc}(\Omega)$. That is, its value on a test function depends only on the values of the test function on an open set containing the support of $f$. Or the support can be disconnected, so that we have two disjoint localized centers. A distribution may be regular or localized on its restriction to a certain time interval but evolve into a different situation - this could be used to resolve paradoxes similar to the EPR paradox. The photon ceases to be a localized wave-packet (a regular distribution) and becomes a non-regular one (i.e. one with a non-local, holistic character; we have a continuous path in the space of Radon measures, for instance), thus explaining why an observation (i.e. an interaction) at location A can determine the outcome of an observation at a distant location B.
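For concreteness, the standard notions used here are: the restriction of a distribution $T$ on $\Omega$ to an open set $U \subseteq \Omega$ is $T|_U(\varphi) = T(\varphi)$ for $\varphi \in \mathcal{D}(U)$, and the support of $T$ is the complement of the largest open set on which $T$ restricts to zero,

$$\operatorname{supp} T = \Omega \setminus \bigcup \{\, U \subseteq \Omega \text{ open} : T|_U = 0 \,\}.$$

For example, $f = \chi_{[0,1]} + \chi_{[2,3]}$ defines a regular distribution $T_f$ with $\operatorname{supp} T_f = [0,1] \cup [2,3]$: two disjoint localized centers, with $T_f(\varphi)$ depending only on the values of $\varphi$ near those two intervals.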
The whole proposal above is obviously highly sketchy and unsatisfactory. We need not only the non-locality (which transcends both the field and the particle approach) furnished by distribution theory but also the fundamental postulate that the linearity of the theories and equations is only an approximation of the fundamentally non-linear or even chaotic (but deterministic) physics at a finer scale (cf. the Casimir effect, which QFT interprets as 'fluctuations of the vacuum'). It is this framework that could explain why, in reality, observations are interactions with a non-linear component - in general expressing what happens when a non-localized (non-regular) distribution interacts with a regular, localized one (recall that there is no satisfactory definition of the product of two arbitrary distributions; see Schwartz's classical argument below).

Maybe we must extend physics to account for an equivalence between energy and information (in observations, the measurement process), perhaps embodied in Psi-phenomena. The wave-function is like the continuous, holistic, coordinated movement of juggling (or swimming). If the mind stops and focuses on a localized part (i.e. local interaction energy is exchanged) the system implodes and its non-linear dynamics leads to only apparently random final outcomes or crashes.
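To recall why the product of distributions is problematic (Schwartz's classical observation): in any associative product extending multiplication by smooth functions and respecting the classical identities $x \cdot \delta = 0$ and $\mathrm{vp}\tfrac{1}{x} \cdot x = 1$, one would have

$$\delta = 1 \cdot \delta = \left(\mathrm{vp}\tfrac{1}{x} \cdot x\right) \cdot \delta \neq \mathrm{vp}\tfrac{1}{x} \cdot \left(x \cdot \delta\right) = \mathrm{vp}\tfrac{1}{x} \cdot 0 = 0,$$

a contradiction; any non-linear interaction term involving singular distributions therefore has to be handled by some extended framework (Sato's hyperfunctions, or a genuinely non-linear theory at a finer scale).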
A real 'theory of everything' would be a theory which allows one to solve all (differential) equations.


