Quantum physics is a deeply imperfect, incomplete and unsatisfactory theory, and its interpretations and approaches are multi-faceted and complex. Quantum physics took a drastically wrong turn with the Hilbert-space, operator-based von Neumann axiomatization (and the subsequent more sophisticated $C^\ast$-algebra approach) - abandoning the much more interesting initial historical connections to statistical mechanics, electromagnetism, geometric optics, the Hamilton-Jacobi equation, 'wave mechanics' - and above all the insights of Paul Dirac, one of the giants of 20th-century physics (Penrose's Twistor Theory is in some sense a continuation of Dirac's work). The investigations carried out during the initial development of quantum physics gave rise to meaningful and interesting mathematics and physics which did not depend in any way on probabilistic interpretations (or the collapse) of the wave function - for example the study of the Lorentz-invariant Klein-Gordon and Dirac equations.

If the Hilbert-space and $C^\ast$-algebra based quantum theory was at least mathematically rigorous and interesting in its own right (it ultimately gave rise to Alain Connes' Non-Commutative Geometry), this is not the case for Feynman's and Schwinger's approach to (perturbative) quantum field theory. Cf. G.B. Folland's Quantum Field Theory: A Tourist Guide for Mathematicians (2008), where it is stated that once we leave the free field (itself requiring a staggering amount of functional analysis and distribution theory) we have left the realm of mathematics proper. The problem is that QFT is not only bad mathematics; it is also bad experimental science.
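For reference, the two Lorentz-invariant equations mentioned above take their standard textbook form (written here in natural units $\hbar = c = 1$; this is a reminder of the classical field equations, not a derivation):

```latex
% Klein-Gordon equation for a scalar field \phi of mass m:
\left( \Box + m^2 \right) \phi = 0,
\qquad \Box = \partial_t^2 - \nabla^2
% Dirac equation for a spinor field \psi,
% with the gamma matrices \gamma^\mu satisfying
% \{\gamma^\mu, \gamma^\nu\} = 2\eta^{\mu\nu}:
\left( i \gamma^\mu \partial_\mu - m \right) \psi = 0
```

Both equations make perfectly good mathematical sense as hyperbolic partial differential equations, independently of any probabilistic interpretation of their solutions.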
Something is rotten in the state of QED (Oliver Consa)
Consa says the much-touted precision of QED is based on measurements of the electron g-factor, but that “this value was obtained using illegitimate mathematical traps, manipulations and tricks”. Theoreticians come up with a calculation that exactly matches an experiment. Then a later experiment shows that the earlier experiment wasn’t quite correct. Then the theoreticians change their calculation to match the new experiment. And so on (...) Consa quotes Dyson from 2006: “As one of the inventors of QED, I remember that we thought of QED in 1949 as a temporary and jerry-built structure, with mathematical inconsistencies and renormalized infinities swept under the rug. We did not expect it to last more than 10 years before some more solidly built theory would replace it. Now, 57 years have gone by and that ramshackle structure still stands”. It still stands because it’s been propped up by scientific fraud. Here we are fourteen years later, and it’s still the same, and physics is still going nowhere. How much longer can this carry on? Not much longer, because now we have the internet.
https://physicsdetective.com/something-is-rotten-in-the-state-of-qed/
Perhaps a clue to improving this situation lies in a critique and reform of distribution theory - for example along the lines of Sato's theory of hyperfunctions.
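To sketch the flavour of Sato's alternative: a hyperfunction is not a functional on test functions, as in Schwartz's theory, but an equivalence class of boundary values of holomorphic functions (the formulas below are the standard one-variable definitions):

```latex
% Hyperfunctions on an open set U \subset \mathbb{R}: holomorphic
% functions on a complex neighbourhood V with U removed, modulo
% those that extend holomorphically across U:
\mathcal{B}(U) \;=\; \mathcal{O}(V \setminus U) \,/\, \mathcal{O}(V)
% The Dirac delta, for instance, is the jump across the real axis
% of the Cauchy kernel -1/(2\pi i z):
\delta(x) \;=\; -\frac{1}{2\pi i}
  \left( \frac{1}{x + i0} - \frac{1}{x - i0} \right)
```

Hyperfunctions strictly contain the Schwartz distributions and replace functional-analytic machinery with complex analysis and sheaf theory, which is one reason they are a natural candidate for the kind of reform suggested here.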
A real 'theory of everything' would be a theory which allows one to solve all (differential) equations.


