Friday, March 20, 2026

Anatta

Anatta was indeed the central philosophy of early, authentic Buddhism. But it was not fully understood, and it was rapidly compromised and corrupted. Its recovery, preservation, and further exposition and development are nevertheless found in various schools of the East and West, both ancient (and even pre-Buddhist) and modern.

We hold that Anatta is the essence and culmination of TPC and TPP. Although in this blog we have discussed and highlighted the importance of Pyrrho, Hume, Nagarjuna - and a certain approach to Hegel - we shall make no attempt to articulate it further at the moment.

We say only that just as consciousness (and ego) transcends what we perceive to be non-conscious in the world and yet is somehow connected to it, so too there is something higher than consciousness which nevertheless has a link to consciousness.

Rather we shall more modestly focus on developing certain Kantian themes (see our new papers Analyticity, Computability and the A Priori and Computability and Differential Models) which, even in the light of the above, can be considered as having a practical, humanly wholesome role.

Friday, March 13, 2026

Kantian notes

Schematism involves the concepts of 'rule' and 'time', the realization of a pure concept of the understanding. How can we understand this but as computation?

The difference between thinking something and knowing something. How can we explain this?

The ideas of reason as ideal, completed totalities (the unconditioned). There is a largest set, because we can think of the set of all things, and thus everything will belong to it. On the other hand, if there is a set containing all things, then this set itself must belong to it, and such self-membership leads to contradiction; thus there is no largest set. This is Kant's anticipation of Russell's paradox. Kant is a finitist and an intuitionist, somewhat like Brouwer with his theory of choice-sequences.
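The antinomy can be made precise in modern set-theoretic terms (our gloss, not Kant's own formulation). Suppose there were a universal set $V$ containing everything. Then Russell's set

$$R = \{x \in V : x \notin x\}$$

yields $R \in R \iff R \notin R$, a contradiction. Alternatively, Cantor's theorem gives $|\mathcal{P}(V)| > |V|$, while $\mathcal{P}(V) \subseteq V$ would force $|\mathcal{P}(V)| \leq |V|$. Either way, the "completed totality" cannot exist as an object.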

The ideas of reason organize and give a direction to philosophy and science but they are also the foundation of practical morality. As we wrote previously: intelligence and morality are one.

In the transcendental dialectic, Kant in some cases states that both opposing propositions are false, and in other cases that both are true - this recalls the Buddhist tetralemma.

Reason as the intelligible character, the noumenic source of freedom. 

Kant seems to be saying that the phenomenal self and the phenomenal world are mutually dependent and relative. 

Grete Hermann:  quantum mechanics does not disprove causality, but rather clarifies it by separating it from deterministic predictability. She proposed that while quantum predictions are statistical, causal chains can be reconstructed retrospectively after measurement. 

A very important aspect of quantum mechanics is the relationship between the Schrödinger equation and the Hamilton-Jacobi equation (which originally expressed the analogy between mechanics and optics). 

$\frac{\partial S}{\partial t} + H\left(q, \frac{\partial S}{\partial q}, t\right) = 0$

This relationship was present at the very beginning of the multifaceted history of quantum theory. The Hamilton-Jacobi approach (beyond its use in the quasi-classical approximation) is the key to developing a correct pilot-wave type theory (which need not have anything to do with Bohm's variant or with path-integral approaches). Research on droplets bouncing on vibrating fluids (where the "droplet" both causes and is affected by the associated "wave") is of the utmost interest and importance.
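The connection can be made explicit by a standard computation (sketched here for a single particle in a potential $V$). Writing the wave function in polar form $\psi = R\, e^{iS/\hbar}$ and substituting into the Schrödinger equation

$$i\hbar\,\frac{\partial \psi}{\partial t} = -\frac{\hbar^2}{2m}\nabla^2\psi + V\psi,$$

the real part yields

$$\frac{\partial S}{\partial t} + \frac{|\nabla S|^2}{2m} + V - \frac{\hbar^2}{2m}\frac{\nabla^2 R}{R} = 0,$$

which is the classical Hamilton-Jacobi equation for $H = \frac{p^2}{2m} + V$ (with $p = \nabla S$) plus the extra "quantum potential" term $Q = -\frac{\hbar^2}{2m}\frac{\nabla^2 R}{R}$; as $\hbar \to 0$ the classical equation is recovered.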

We have discussed the mutually implied trinity : logic, computation and arithmetic. But we should also add therein combinatorics and graph-theory...

The Curry-Howard isomorphism expresses, to a certain extent, the correspondence between logic and the $\lambda$-calculus presentation of computation. Different type-theoretic-logical systems (Gödel's system T, Girard's and Reynolds's system F) each capture only a fragment of the class of computable functions. Curiously enough, there is also a direct correspondence with forms of Peano Arithmetic, wherein provable totality is used to characterize such classes.
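A fragment of the correspondence can be made concrete (a minimal sketch in TypeScript, our own illustration rather than any canonical presentation): under Curry-Howard, implication is the function type, conjunction the pair type, and disjunction the sum type, so each well-typed term below is a "proof" of the proposition its type expresses.

```typescript
// Propositions as types, proofs as programs (illustrative sketch).

// Implication A -> B is the function type; modus ponens is application.
const modusPonens = <A, B>(f: (a: A) => B, a: A): B => f(a);

// Conjunction A ∧ B as a pair; a proof of (A ∧ B) -> (B ∧ A).
const andComm = <A, B>([a, b]: [A, B]): [B, A] => [b, a];

// Disjunction A ∨ B as a tagged union; a proof of (A ∨ A) -> A.
type Either<A, B> =
  | { tag: "left"; value: A }
  | { tag: "right"; value: B };
const orIdem = <A>(e: Either<A, A>): A => e.value;
```

Systems such as T and F then arise by enriching this bare core (with natural-number recursion and type quantification respectively), which is where the correspondence with provable totality in arithmetic enters.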

And what is a computational object but one which can be reduced, one in which a computational process can be carried out?
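In the $\lambda$-calculus, for instance, the computational process is $\beta$-reduction; a two-step example:

$$(\lambda f.\,\lambda x.\, f\,(f\,x))\; g\; a \;\to_\beta\; (\lambda x.\, g\,(g\,x))\; a \;\to_\beta\; g\,(g\,a).$$

A term is "computed" when it reaches a normal form, one to which no further reduction applies.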

Given a (partial) formal model of computation, does it always have some kind of "logical" correspondence? And vice versa? Note how both proof and computation involve temporality in an essential way...

This is what we call Girard's problem: in the above, what is a priori and what is a posteriori? What is analytic and what is synthetic?

Frege simply defined 'analytic' as that which is derivable in his system of second-order logic (and does not rely on any form of intuition). Girard seems to view untyped computational objects as analytic and typed ones as synthetic. One notion of a proposition's being analytic is: being true in virtue of its form alone.

Girard's linear logic and proof-nets (and geometry of interaction, transcendental syntax) seek, so it seems, to delve deeper into the above correspondence, even going beyond the distinction between a proposition and its proof. Girard offers a computational model distinct from the $\lambda$-calculus and yet still connected to the essence of proof.

Tuesday, March 10, 2026

Genius, Intelligence and Sainthood

Schopenhauer expounds a theory of 'genius' in the World as Will and Representation. The analysis and critique of varied historical and present meanings of the terms 'genius', 'intelligence' and 'sainthood' in the west is of utmost cultural and philosophical importance. Part of our philosophical project involves the radical deconstruction of these concepts and unmasking their harmful influences and consequent aberrations. 'Sainthood'  is currently employed in a socio-historical sense or related to organized religions, its true universal ethical and spiritual meaning being lost.  'Intelligence' is a vague and fluid pseudo-concept that has wrought immeasurable harm to human culture and society as well as to the human sciences. The term 'genius' is as a rule completely misapplied.

Our theory of intelligence is that true intelligence is founded on two intimately connected principles. That of knowledge of the universal moral law (perceived as such) and that of attaining at least some elements of transcendental philosophical consciousness (TPC) and transcendental philosophical praxis (TPP).

Extraordinary achievements in TPC and TPP are the mark of genius par excellence. 

The genius of true intelligence in the ethical sphere and TPP is called 'sainthood'. 

A special kind of genius - one intimately, if implicitly, connected to morality, TPP and TPC (as brilliantly expounded by Schopenhauer) - is artistic genius, which reveals fundamentally important aspects of human nature and the world.

The quality of intelligence is related directly to the importance of its object and domain. 

Intelligence can be defined as an empathy for reality and moral intelligence has as its main source universal empathy for the suffering of all living beings. 

Intelligence and morality are one. 

The extreme opposite of true intelligence is all that involves a kind of cunning, a skill-set whose only purpose and application lies in achieving personal egotistic wealth and power, everything that involves dominating, exploiting and harming other living beings. There is no 'intelligence' here, no 'genius' and no 'sainthood'. Intelligence is radically incompatible with the will to power or competitive mechanisms of survival (i.e. overpowering others or merely adapting to a contingent environment).

The greatest deception and idol of western society has been attributing 'intelligence' or 'genius' to what amounts to nothing more than gaming, gambling, cunning and calculation. We have to be extremely cautious and insightful about attributing intelligence or genius to work in logic, mathematics, engineering or science.

Genius and intelligence in mathematics are never about predictable success (theorem proving) in random searches through formal possibility spaces, or about conceptual engineering (scientific mass production); they rest solely on the relevance of the work to philosophy (more specifically to TPC) and to the philosophical unification and clarification of science.

Playing the game, being lost in the illusion of the game, is very different from the TPC-informed insight into and consciousness of the game qua game (which can be compared to the attitude of Alice at the end of Alice in Wonderland... and Schopenhauer uses the metaphor of the chessboard after the game is finished).

Thus we have such unsurpassed luminaries as Gauss, Grassmann, Riemann, Frege (who wrote that every true mathematician is half a philosopher), Peirce, Hilbert, Turing, Whitehead, Gödel, Brouwer, Russell, Poincaré, Lawvere, Thom, Martin-Löf, Girard, etc. Their writings are never a mere tortuous game with symbols and ad hoc concepts or non-rigorous obfuscation and plagiarism. The light of TPC implicitly shines through. Their work is also a spiritualization of language.

But scientific discoveries with practical applications to the quality and preservation of human life (or of any living being) are certainly meritorious - for indeed they thereby partake in the sainthood, intelligence and genius of ethics. Science aimed at mechanisms for harm and destruction is the ultimate immorality and stupidity. There is nothing brilliant or intelligent about faulty, simplistic models of human society such as 'game theory'.

Saturday, March 7, 2026

Transcendental Syntax I : deterministic case - Jean-Yves Girard

https://girard.perso.math.cnrs.fr/trsy1.pdf

We study logic in the light of the Kantian distinction between analytic (untyped, meaningless, locative) answers and synthetic (typed, meaningful, spiritual) questions. This is especially relevant to proof-theory: in a proof-net, the upper part is locative, whereas the lower part is spiritual: a posteriori (explicit) as far as correctness is concerned, a priori (implicit) for questions dealing with consequence, typically cut-elimination.


Tuesday, March 3, 2026

Elementary Topos Theory

Topos theory is a messy area with various rival factions and a problematic community. However we believe that Lawvere's original theory of elementary toposes and its subsequent development is important and interesting. The most important object in an elementary topos is the subobject classifier $\Omega$. If we think of the topos as a cell then $\Omega$ is a kind of nucleus. An elementary topos is simply a model of mathematics which as a foundation is vastly superior and more cogent than ZFC. To us the most important concept in topos theory is that of the Lawvere-Tierney "topology", which is just a morphism $j : \Omega \rightarrow \Omega$ satisfying three simple "modal" or "closure-operator" type axioms. A very important instance of $j$ is given by double-negation $\neg\neg : \Omega \rightarrow \Omega$, in which we consider the (internal) Heyting algebra structure on $\Omega$. The morphism $j$ then determines a localization of the topos, a new subtopos of the original topos called the topos of $j$-sheaves. 
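For reference, the three axioms on $j : \Omega \rightarrow \Omega$ are (in the standard formulation):

$$j \circ \mathrm{true} = \mathrm{true}, \qquad j \circ j = j, \qquad j \circ \wedge = \wedge \circ (j \times j),$$

that is, $j$ preserves truth, is idempotent, and commutes with conjunction - exactly the algebraic shape of a closure operator (or a necessity-like modality) on truth values.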

A central philosophical problem of topos theory is understanding the meaning of the Lawvere-Tierney "topology" and its associated topos of $j$-sheaves, as well as the special role of the $\neg\neg$-topology (why is it not an abuse to call $j$ in this case a "topology"?). A key to this is to see how the theory above abstracts the concrete case of presheaves and sheaves. A "sieve" is a curious concept. Think of a set $S$ of open sets (conceived as a "cover") in some space $X$. Then take the minimal extension $S'$ of $S$ under the condition that $U \in S'$ and $V \subseteq U$ implies $V \in S'$. Then we have the sieve generated by $S$. We could rephrase the condition as: $W \cap U \in S'$ for any $U \in S'$ and open set $W$. That is, $S'$ is the $\wedge$-ideal generated by $S$. Then we have the obvious notion of the principal ideal generated by an open set $O$ (called the principal sieve on $O$).
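A concrete example may help (our illustration): take $X = \mathbb{R}$, $U = (0,3)$ and $S = \{(0,2), (1,3)\}$. The generated sieve is

$$S' = \{V \text{ open} : V \subseteq (0,2) \;\text{ or }\; V \subseteq (1,3)\},$$

which does not contain $U$ itself, and yet $\bigcup S' = (0,3) = U$. So $S'$ covers $U$ without containing it - precisely the situation in which the "what gets covered" operation sends $S'$ to the maximal (principal) sieve on $U$.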

For the presheaf topos on a topological space $X$ we have that the presheaf $\Omega$ associates to each open set $U$ the set of all sieves on $U$. So $\Omega$ is a kind of parametric version of local truth values. On the presheaf topos an important example of $j$ is the functor that associates to each sieve $S$ (on a $U$) the principal sieve determined by "what $S$ covers". In the words of Mac Lane and Moerdijk, "what counts is what gets covered". Thus in this case $j$ is nothing more than a kind of parametric generalized union. Logically it expresses: "if something is locally true then it is globally true". That is, the subobject of $\Omega$ determined by $j$ consists in those sieves which are invariant under generalized union, situations in which if something is locally true then it is globally true. This fails, for instance, for the presheaf of constant functions. This $j$ (which we might call the union topology) seems intuitively clear, but we still need to understand better why, in the topos of sheaves on a topological space $X$, the internal logic of the topos proves that all functions are continuous.

Thus for the topos of presheaves on a topological space, the subobject classifier on an open set $U$ yields all the sieves on $U$, while for the topos of sheaves it yields all the open subsets of $U$ (or equivalently the set of principal sieves on $U$).

But a central problem of topos theory is understanding other $j$s such as the double-negation topology (and its "$j$-sheaves"), in particular as an abstraction of the topological sheaf case. We have written something about this in our "Hegel and Modern Topology". The double negation topology is all about "density" while the union topology is about local-global coherence. Could we associate the union topology with exponentials in linear logic ?

What does the double-negation topology look like for the presheaf topos on a category $C$? And for the topos of sheaves over a topological space? In the latter case it appears to be simple: an open set $V$ in $\Omega(U)$ is taken by $j = \neg\neg$ to $int(\overline{V})$. Thus the topology subobject $J$ is just the set of regular open sets in $\Omega(U)$. What are the $\neg\neg$-sheaves in this case? The idea is like the passage from holomorphic to meromorphic functions (or the identification of measurable functions differing only on sets of measure zero). The resulting sheaves can be considered as defined up to closed subsets with empty interior. In a way, the $\neg\neg$-topology introduces closed boundaries, qualitative differences between different regions.
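Concretely, in the Heyting algebra of open sets the pseudo-complement of an open $V$ is $\neg V = int(X \setminus V)$, whence

$$\neg\neg V = int(\overline{V}).$$

For example, with $X = \mathbb{R}$ and $V = (0,1) \cup (1,2)$ we get $\overline{V} = [0,2]$ and $\neg\neg V = (0,2)$: the deleted point is filled in, and the fixed points of $\neg\neg$ are exactly the regular open sets.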

The fundamental definition of a $j$-sheaf - an object $A$ satisfying $hom(B,A) \cong hom(E,A)$ via the restriction morphism, for every $j$-dense subobject $E$ of an object $B$ - can be understood as follows. The associated closure operator is a kind of transcendence, expansion beyond itself, saturation, perfection. The condition then picks out objects for which their effect on a subobject already contains their effect on that subobject's transcendent expansion (closure). This means that an object satisfying the condition already contains within itself its own self-transcendence.

Put another way, sheaves for the dense topology are ideal cohesive structures which on an object (open set) are determined by coverings whose union is not equal to, but only dense in, that object. These sheaves allow a form of completion, limit, coherent transcendence. They allow one to construct an object from coherent data which is still incomplete.

Sunday, March 1, 2026

Claire Ortiz Hill on Hilbert's philosophy (excerpts)

Upon several occasions, using almost exactly the same words, Hilbert described what he called the basic philosophical position that he considered necessary for all scientific thinking, understanding, and communicating, and without which no intellectual activity is possible. He stressed that this basic philosophical position was the very least thing that had to be assumed, that it was something that no scientific thinker could dispense with, and that everyone had to adopt, whether consciously or not. According to this approach, as a precondition for the use of logical inferences and the performance of logical operations, certain extra-logical, concrete objects had to be already given in our faculty of presentation and be intuitively present as immediate experience prior to all thought. For logical inferences to be reliable, those objects had to be completely surveyable in all their parts, and the fact that they occurred, differed from one another, followed one another, or were concatenated also had to be immediately given intuitively with them as something neither reducible to anything else, nor requiring reduction. Hilbert labelled this concern for concrete content the finite approach.

 In recognizing that such conditions necessary for the use of “contentual” (inhaltlich) logical inference existed and had to be respected, Hilbert saw himself as being in agreement with philosophers. Specifically, he considered this basic philosophical position to be part and parcel of the teachings of Immanuel Kant who had maintained that extra-logical concrete objects intuitively present as immediate experience prior to all thought had to be given, that, in particular, mathematics could never be provided with a foundation by means of logic alone, that it has at its disposal a content secured independently of all logic. In contrast, he repeatedly stressed that Frege’s and Dedekind’s attempts to provide arithmetic with foundations independent of all intuition and experience and to derive arithmetic by means of logic alone were bound to fail, because logic alone could not suffice, and certain intuitive conceptions and insights were indispensable for scientific knowledge to be possible.

 In the case of mathematics, Hilbert explained, the extra-logical, concrete objects intuitively present prior to all thought were the concrete signs themselves, whose shape was immediately clear and recognizable. As perceptually recognizable, objective and displayable numerals, the numbers of concrete-intuitive number theory met Hilbert’s requirements. They and the proofs of theorems about numbers fell into the domain of the thinkable. Formalized proofs were concrete, surveyable objects communicable from beginning to end. He defined proofs as arrays, that is, objects composed of primitive signs given as such to perceptual intuition and consisting of inferences where each premise is an axiom, directly results from an axiom by substitution, or coincides with the end formula of an inference occurring earlier in the proof or results from it by substitution. The axioms and provable propositions resulting from this procedure were, Hilbert contended, “copies of the thoughts constituting customary mathematics as it has developed till now”.

In his proof theory, only real propositions can be directly verified. “The formula game enables us to express the entire thought-content of the science of mathematics in a uniform manner and develop it in such a way that at the same time the interconnections between the individual propositions and facts become clear. To make it a universal requirement that each individual formula be interpretable by itself is by no means reasonable; on the contrary, a theory by its very nature is such that we do not need to fall back upon intuition or meaning in the midst of some argument”.
