Friday, September 19, 2025

Circle of philosophy

We have included in this site several essays exploring (among other things) a certain 'school' or approach to philosophy which might be briefly described as one based primarily on the direct (unmediated) reflection of consciousness upon itself. Our major thesis was that the core elements for such a philosophy are found in the ancient Pali suttas as well as in many places in the history of both western and eastern philosophy. Almost everything that is of value in Husserl's 'transcendental phenomenology' is to be found in previous philosophical works, and the ideal of a 'reduction' and 'epokhê' is found in its most correct and thorough form in the Pali suttas (which show a strange agreement with several elements of Aristotle's De Anima and even the first Ennead). A complex and interesting issue is whether Hume or Kant came closest to the kind of transcendental awareness required for this approach to philosophy.

Another major approach to philosophy is skepticism, as exemplified by certain Socratic dialogues, by what accounts we can collect of the 'skeptical' phase of the Academy and of Pyrrho, and most importantly by the extant works of Sextus Empiricus together with the works of Nagârjuna. We argue for a substantial affinity between Sextus' Pyrrhonism and the Madhyamaka, just as the Yogacara (and especially some later analytical works of this school) is an important example of a consciousness-based philosophy. But paradoxically there is also an intriguing correspondence with late neoplatonism, with the apophaticism of Damaskios, Pseudo-Dionysius and others. We can discern in varying degrees the same kind of equipollent amphibolous 'dialectic' in many important figures of modern philosophy such as Kant and Hegel.

Finally we have an approach to philosophy based on analytical atomism, and this 'atomism' may be (perhaps even simultaneously) physical, psychological and logical-conceptual. Obvious examples are Leucippus, Democritus, the Vaisheshika and the various schools of pre- or non-Mahâyâna Abhidharma, as well as the strikingly important case of Hume. There are also many examples of 'logical atomism' from ancient times to the present. In a way any doctrine of categories - especially those based on an analysis of language (cf. the Stoic lekta or the Proclean logoi) - tends towards a theory of logical atomism. One of our approaches involves a theory of categories based on finitism and computability (Turing completeness and Church's thesis) allied to a formal combinatory analysis of logic and language. Any theory of deduction or inference is at least implicitly involved with computability.

A project: a history and archaeology of recursion theory and Turing completeness: search for and study ancient documents and artifacts exhibiting Turing completeness or weaker computability properties. An important example is Panini's grammar, which amounts to a term-rewriting system. A curiosity: Babbage's Analytical Engine was Turing complete.
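As a toy illustration of what is meant by a term-rewriting (semi-Thue) system, here is a minimal string-rewriting engine in Haskell; the two rules in it are invented purely for the example and are not taken from Panini or Babbage.

    -- A minimal string-rewriting (semi-Thue) engine; the rules are illustrative only.
    import Data.List (isPrefixOf)

    type Rule = (String, String)

    -- Apply the first applicable rule at the leftmost possible position.
    step :: [Rule] -> String -> Maybe String
    step _ [] = Nothing
    step rules s@(c:cs) =
      case [rhs ++ drop (length lhs) s | (lhs, rhs) <- rules, lhs `isPrefixOf` s] of
        (s':_) -> Just s'
        []     -> (c :) <$> step rules cs

    -- Rewrite until no rule applies (may diverge for non-terminating systems).
    normalize :: [Rule] -> String -> String
    normalize rules s = maybe s (normalize rules) (step rules s)

    main :: IO ()
    main = putStrLn (normalize [("ab", "ba"), ("b", "")] "abab")  -- prints "aa"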

Now a remarkable fact is that not only are these three approaches in no wise plainly 'contradictory' or 'incompatible', but they appear to be deeply entangled and dependent on each other (even if only in a negative sense, such as when atomism is taken as a starting point for refutation). This is patent for instance in the literature on the relationship between the Madhyamaka and Yogacara, or in the text of Hume's Treatise itself, which is remarkable for containing at once elements of all three approaches. A major issue is the relationship between these schools and ethics, their consistency with and indeed implication of a moral, compassionate realism along the lines of Schopenhauer and the ancient Buddhist precepts. Notice that if we look at the best ancient representatives of all three approaches we see that the ethical spirit in fact imbues everything! Cf. the last section of Sextus' Outlines (III.280): the skeptic is a philanthropist (dia to philanthrôpos einai) who wishes to cure the dogmatist of conceit and rashness. Now this does have a huge social and cultural implication...

It is very important to study carefully that monumental and unique work of genius which is Hume's Treatise, and not only later abridgments, commentaries or 'refutations', in order to grasp all the aspects, subtleties and nuances present therein. Our view is that the key to a correct understanding is to study the deep connection to Sextus Empiricus (properly understood) as well as the (historically paradoxical, perhaps) connection to Buddhist philosophy: certain Abhidhamma schools, the Yogacara and the Madhyamaka. Alison Gopnik has proposed a historical scenario for a direct acquaintance of Hume with the Abhidhamma literature during his stay in France. Our view is that in addition the Yogacara and Madhyamaka literature played a profound role. There is a certain similarity between many of Hume's arguments and those of the later Yogacara school (against external objects), and a similarity between Hume's treatment of substance, personal identity and monism in general and Nagarjuna's treatment of svabhava. However some of these similarities may be explained by a direct influence of Sextus Empiricus (the Pyrrhonism - Madhyamaka connection has not been cleared up historically).

Breaking the circle: we have previously discussed the complexities involved in the 'paths of ascent', the 'anagogic processes' which should accompany and which are the ultimate ground of the circle of philosophy. These also have a multiplicity and form their own 'circle'. In the west we have the theory of dialectics expounded in Plato (especially the mathematically based one in the Republic, which we have discussed) and Plotinus (and other Socratic, Megarian, Eleatic, Pyrrhonian and even late neoplatonic forms). Then we have the theory of eros and beauty in Plato and also in Plotinus (and the whole tradition of platonic and courtly love in Europe, for instance Dante and the Fedeli d'Amore). And there is the theory of catharsis, detachment and self-awareness (watchfulness) in the Phaedo and the Enneads (and also in Sextus and the Stoics) - which probably was meant to accompany other methods. And of course there is the central and disturbing role theurgy played for neoplatonism (but possibly also for earlier platonism and pythagoreanism and its relation to orphism). In the east the methods of yoga form a circle, and the situation for Buddhism is likewise complex, even for early Buddhism (cf. Masefield's book on divine revelation in Pali Buddhism, complemented by Dhammika's text Broken Buddha: all this suggests that the tradition represented by Theravada is the result of a certain school of Buddhism ceasing long ago to be 'operative' and becoming merely 'speculative' and 'ritualistic'). In Buddhism anagogic dialectics played (as we saw) a hugely important role (and we should also mention the Nalanda tradition in Tibetan Buddhism). In this complex circle there seems to be involved the crucial presence of an anagogic energy, power and illumination, a kind of transmission. The 'virtues' and habits in Bhagavad-Gita 13.8-12 have a profound anagogic significance.

There are certainly grave errors involved in certain approaches to anagogic processes (it is at least questionable whether popular meditation manuals aiming at leading one to the jhanas are adequate or sufficient), as well as in certain attitudes and practices (especially those involving a kind of collectivism or passive surrender to the authority of a group). However we must give a central place to philosophical awareness and its liberating transcendental insight, and also to knowledge of the inner workings of the mind through its accumulation of habits and associations (especially those ingrained in earlier years or through traumatic experiences): the mind must be a clearly conscious creator, never passive and deluded before its own subconscious conditioning, delusions and projections. And look at the deep meaning of the 'five hindrances' and the 'seven limbs of awakening'. These all form a complex interwoven feedback system, like a higher-dimensional Rubik's cube or an analogous puzzle. Often to address an undesirable state A we apply B, but this in turn gives rise to another problem C, and so forth. There is a certain similarity to the situation with the five types of brain-wave frequencies.

Friday, September 5, 2025

Natural Term Logic

Our aim is to develop a formal logic which reflects the logical mechanisms of the grammar of natural language - in particular a formal logic which is not tied in any way to extensionalism or to the standard interpretations of quantifiers and multiple generality. It is our conviction that the standard use of variables (in quantifier logic, the lambda calculus and the constructivist version used in dependent type theory), while useful and mathematically interesting, does not directly reflect the actual logical structure of natural language and its mechanisms for expressing 'quantitative' determiners and multiple generality (cf. the study of suppositio and syncategoremata in the Middle Ages). Our philosophical premise is that conceiving, applying, checking and reasoning about Turing-complete formal systems presupposes a priori a system of cognitive categories that are independent from and prior to standard quantifiers. Rather, the most adequate type of formal system for logic would be in the style of Quine's paper 'Variables explained away' (this can be compared to our considerations on the philosophical significance of term-rewriting systems in our paper on Analyticity and the A Priori). The earlier work of Schönfinkel and Curry (combinatory logic), while also involving the elimination of bound variables, is structurally quite different. Quine himself also mentions the earlier work of Tarski on cylindric algebras. It is necessary to extend the combinatory approach to intensional logic, and a main source of inspiration for our work is George Bealer's book 'Quality and Concept'.
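To make the idea of 'eliminating bound variables' concrete, here is a small sketch of bracket abstraction in the Schönfinkel-Curry combinatory style (which, as noted above, is structurally quite different from Quine's predicate-functor approach); the data type and names below are our own.

    -- Bracket abstraction: compile a bound variable away into the combinators S, K, I.
    data Expr = V String | Ap Expr Expr | S | K | I deriving Show

    occurs :: String -> Expr -> Bool
    occurs x (V y)    = x == y
    occurs x (Ap f a) = occurs x f || occurs x a
    occurs _ _        = False

    -- abstract x e yields a variable-free f such that f applied to x reduces to e.
    abstract :: String -> Expr -> Expr
    abstract x (V y) | x == y       = I
    abstract x e | not (occurs x e) = Ap K e
    abstract x (Ap f a)             = Ap (Ap S (abstract x f)) (abstract x a)
    abstract _ e                    = e  -- unreachable for well-formed input

    main :: IO ()
    main = print (abstract "x" (Ap (V "f") (Ap (V "x") (V "x"))))
    -- S (K f) (S I I): the term 'f (x x)' with the bound variable x eliminated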

Our work on Aristotle's Topics attempts not only a formalization of this work but also argues that there is present in Aristotle a variant of a natural deduction calculus for an extension of second-order logic. However it does not deal directly with variable-free logic, being focused on inference rather than logical expression.

Our system of variable-free logic is called Natural Term Logic. The basic building blocks are called primitive terms and are denoted by capital roman letters $M, N, P$. Each term is assigned a unique (predication) sort, which is an integer $\geq 0$. The sort $0$ should be understood as indicating that the term cannot be used as a predicate (i.e. it is saturated), and this will include both abstract objects (like propositions) and concrete objects. We sometimes denote the sort of a term by $M^{(s)}$. Finally we have constructors, denoted by lower-case Greek letters $\alpha, \beta, \gamma$, to which are assigned a unique signature $(s_1, s_2, \ldots, s_n) \rightarrow s$, where $s$ and the $s_i$ are sorts. The classification into sorts can be compared to the saturated and unsaturated lekta of Stoic logic, and there are also important comparisons that should be made with medieval and classical grammar (our terms are slightly more general than definite grammatical categories or 'flections'). Constructors take as arguments terms of the specified sorts and yield a complex term of sort $s$.

Roughly speaking, sort 0 terms correspond to proper nouns or nominalized sentences (in particular the objects of propositional attitudes), sort 1 terms are those that can be used as monadic predicates, sort 2 terms correspond to relations, etc. At this stage the same term can be used for different kinds of supposition. Thus we have the sort 1 term 'man' which also doubles as a predicate 'being a man' when conjoined with the appropriate constructor.

Basic constructors include the family $\delta$ (for simplicity we denote these basic constructor families by a single letter) which diagonalizes terms (this corresponds in particular to reflexive constructions such as 'to love oneself') and $\sigma$ which corresponds to permutations (this includes the reciprocal of relations: the reciprocal of the relation of 'somebody to know something' corresponds to 'something to be known by somebody').

But the fundamental family of constructors is $\pi$, which corresponds to predication (both simple and embedded - this will be explained further ahead). We sometimes omit the constructor and simply use the concatenation of terms, such as $SM$ for the nominalized sentence 'Socrates is a man'. If $L$ is of sort 2 (a relation) then predication can create a term of sort 1, $LJ$ (to love John), and then a term of sort 0, $MLJ$, 'that Mary loves John' (where we use a form of concatenation mimicking the SVO construction). As we shall see, we have a certain freedom in how we define these simple forms of predication (when combined with the basic constructors above).
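As a rough, purely illustrative sketch (the Haskell names below are ours, and the encoding makes several choices the text above leaves open), the sort discipline and the predication constructors can be pictured as follows:

    -- Speculative sketch of NTL terms: primitive terms carry a sort,
    -- constructors carry a signature, and ill-sorted applications are rejected.
    data Term = Prim String Int            -- primitive term with its sort
              | Apply Constructor [Term]   -- complex term
              deriving Show

    data Constructor = Constructor
      { cname    :: String
      , argSorts :: [Int]
      , resSort  :: Int
      } deriving Show

    sortOf :: Term -> Int
    sortOf (Prim _ s)  = s
    sortOf (Apply c _) = resSort c

    -- Enforce the signature (s1, ..., sn) -> s.
    apply :: Constructor -> [Term] -> Maybe Term
    apply c ts
      | map sortOf ts == argSorts c = Just (Apply c ts)
      | otherwise                   = Nothing

    -- Two predication constructors mimicking the SVO example 'Mary loves John':
    -- a sort 2 term plus a sort 0 term gives a sort 1 term ('to love John'),
    -- and a sort 1 term plus a sort 0 term gives a sort 0 term (a proposition).
    pi2, pi1 :: Constructor
    pi2 = Constructor "pi2" [2, 0] 1
    pi1 = Constructor "pi1" [1, 0] 0

    maryLovesJohn :: Maybe Term
    maryLovesJohn = do
      lovesJohn <- apply pi2 [Prim "Love" 2, Prim "John" 0]
      apply pi1 [lovesJohn, Prim "Mary" 0]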

But what is embedded predication? Consider the sort 1 term Happy and the sort 1 complex term 'known by John'. Then we can form the sort 1 complex term 'known to be happy by John'. The grammatical aspect here is doubtless complex and varied (related to dependency and verbal complements). What is clear is that this construction cannot be reduced to simple predication involving the terms 'know' and 'happy'. Let us explain embedded predication in more detail.

Consider the following related complex terms (we use variables for convenience):


Sort 1: 'X knows (about the) love (relation)' (knowing about the love relation)

Sort 2: 'X knows (about property) loving W' (somebody knowing the property of loving somebody)

Sort 2: 'X knows (about property) being loved by Z'

Sort 1: 'X knows (about property) to love Mary'

Sort 1: 'X knows (about property) being loved by John'

Sort 3: 'X knows that Z loves W'

Sort 2: 'X knows that Z loves Mary'

Sort 2: 'X knows that John loves W'

Sort 1: 'X knows that John loves Mary'


Only the sort 1 complex terms are examples of non-embedded predication of the terms Knowledge and Love. In all other cases the predicative aspect of the predicate term is pulled up into the complex predication term itself.

We have two versions of the analogues of binary connectives for terms. For sort 2 terms $L$ (love) and $R$ (respect) we have $L \& R$, the relation of somebody loving and respecting somebody. But we can also combine terms into a juxtaposed conjunction relation $L \otimes R$, resulting in a degenerate complex term of sort 4. Negation applies to terms of all sorts. Thus we have 'not being the case that Mary loves John', 'not loving Mary', 'not loving', etc. The juxtaposed conjunction together with $\delta$ allows us to define the product of relations, for example 'someone being the father of someone who is the father of someone'. This can be given an alternative (perhaps preferable) presentation using the Natural Term Logic versions of the Peano (definite description) operator and its indefinite version (the selector). For instance for a sort 1 term $M$ we form the sort 0 terms 'the M' and 'an M'. But there are also 'functional' versions for sort 2 terms yielding sort 1 terms: for instance 'the father of somebody' and 'a father of somebody'. We can also venture into the quicksand of extensions of terms as well as mereology (which was already developed in the Middle Ages in the analysis of the syncategoremata totus and pars).

Finally let us come to quantifiers. Though this is not strictly necessary for a formal study of our system, we make the philosophical assumption that all quantification is bounded in some form, even if only implicitly.

Let us consider sort 1 terms only. Then 'all M is N' will be expressed by a term $\kappa MN$, where $\kappa$ is a constructor with the obvious signature $(1, 1) \rightarrow 0$, and analogously for 'some'. We can wonder whether (on grammatical grounds) we could also conceive of $\kappa$ as a constructor of signature $(1) \rightarrow 1$, so that $(\kappa M)N$ could be read in the same way. Or what about Bobzien's proposal for a Stoic implicational reading?

Of great importance is how $\delta$ combines with quantifier operators by entangling together different subterms, thus allowing us to express multiple generality in a way formally similar to the mechanisms of natural language, in particular the use of anaphoric pronouns. An example can be found in Boethius' discussion of conditional propositions in \emph{De topicis differentiis} (1176B), in which what in modern notation would be written $\forall x\, (S(x) \rightarrow R(x))$ is expressed as 'If it is spherical, it is revolvable', the second occurrence of the pronoun 'it' being anaphoric. In Natural Term Logic this could correspond to first applying $\delta$ to the term corresponding to $\lambda x y.\, S(x) \rightarrow R(y)$ (the relation of one thing being spherical implying that the other is revolvable) and then applying an unbounded quantification operator.
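Schematically (the binary constructor $\Rightarrow$ and the unary quantification constructor $\upsilon$ below are ad hoc notation introduced only for this illustration, not part of the system as defined above):

\[
(S \Rightarrow R)^{(2)} \;\xrightarrow{\;\delta\;}\; \big(\delta(S \Rightarrow R)\big)^{(1)} \;\xrightarrow{\;\upsilon\;}\; \big(\upsilon\,\delta(S \Rightarrow R)\big)^{(0)},
\]

where $(S \Rightarrow R)^{(2)}$ is the relation of one thing being spherical implying another being revolvable, $\delta$ identifies its two places (the work done by the anaphoric 'it'), and $\upsilon$, of signature $(1) \rightarrow 0$, quantifies over the resulting sort 1 term, yielding the proposition corresponding to $\forall x\,(S(x) \rightarrow R(x))$.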

We cannot discuss here 'judgement' vs. proposition, nor the interesting possibilities of formalizing directly much of the medieval theory of obligations (much richer than the proposition vs. judgment distinction), nor an even closer interpretation of the Topics and the medieval developments inspired by this work.

Consider the sentence 'every man has a father and has somebody who gave that man a name'. Suppose 'John is a man' is accepted. Then we must be able to infer that 'John has a father and there is somebody who gave John a name': can we devise a syntactical algorithm for tracking the 'free variable' 'that man'? And things get even more complicated when we are in the presence of embedded predication (verbal complements, etc.).

And what about sort 1 terms built from $\kappa$, such as 'having every brother liking Winnie-the-Pooh'? We will see further ahead how this is done.

And surely we need a determiner-constructor corresponding, for instance, to adjectival constructions and other determiners in noun phrases. How is the complex term 'rational animal' formed (this kind of juxtaposition of terms can, it seems, be given a conjunctive reading)? It would be interesting to study the subtle problem of semantic differences in the order of longer strings of determiners, as well as apposition. The Topics permit us to formulate a much more fine-grained theory of noun-phrase determiners.

Our Natural Term Logic is typed but only in a very broad sense. It turns out that the above approach has some very interesting applications to standard type theories such as the simply typed lambda calculus and Girard's system F.

In this paper we shall see that we can give a variable-free equivalent presentation of the simply typed lambda calculus and furthermore that this approach can be internalized in system F.

Monday, August 25, 2025

Finitism and resource consciousness

In formal logic and theoretical computer science not enough attention has been paid to the explicit study of resource limitations and their implications (cf. the study of the length of proofs). We should reform these disciplines by explicitly postulating finite bounds on every aspect of the object of study and attach importance to the quantitative study of the interdependence of these bounds. For example: study the length of proofs as a function of the formula to be proved in a system based on a finite language. Study resource limitation in the grammar of natural language. Study the computational capacities of automata seen as finite approximations of Turing machines. This may be a way of developing abstract theories with a higher degree of fundamenta in re. Study the bounds and limitations of encodings of information and attempt to understand the question: can there be provable mathematical statements whose proofs are too big to be written down or processed by a computer, given our physical limitations? Are there numbers too big to be represented by any means in the physical universe? Finitism means not only finitely many basic or 'atomic' elements but finitely many higher-order relations between them. All this is related to the mysterious problem of qualitative changes in scale (and also the relativity of scale as illustrated in Gulliver's Travels). Remaining in the finite domain, if we increase the quantity of a certain parameter (size, speed, etc.) then suddenly, beyond a certain limit, the behavior of the system can change radically. This is usually studied using the infinitary models of mathematical analysis and geometry, but surely it occurs at a basic level in a finitary context. The question of finitism is related to some of the most fundamental questions both of phenomenology and of the very concepts of 'analysis' and 'scientific consciousness'. An authentic phenomenology must give a central place to the concepts of 'illusion' and 'delusion' as well as to the two-pronged nature of the 'social' and the 'cultural'.
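As a minimal numerical illustration of the last two questions (the figures are rough; the observable universe is usually estimated to contain on the order of $10^{80}$ atoms), a short power tower of 2s already yields numbers whose decimal expansion could not be physically written down:

    -- tower n is a power tower of n twos: tower 4 = 65536, tower 5 = 2^65536, ...
    tower :: Int -> Integer
    tower 0 = 1
    tower n = 2 ^ tower (n - 1)

    -- Approximate number of decimal digits of 2^k (using log10 2 ~ 0.30103),
    -- computed without ever constructing 2^k itself.
    digitsOfTwoToThe :: Integer -> Integer
    digitsOfTwoToThe k = (k * 30103) `div` 100000 + 1

    main :: IO ()
    main = do
      print (tower 4)                                     -- 65536
      print (digitsOfTwoToThe (tower 4))                  -- 2^65536 has 19729 digits
      print (length (show (digitsOfTwoToThe (tower 5))))  -- the digit count of 2^(2^65536) is itself a ~19729-digit number

So a tower of height 6 is a perfectly well-defined finite number, yet it already exceeds by an enormous margin anything that could be written out digit by digit in the physical universe.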

To understand the synthesis of phenomenology, logic and analytic scientific consciousness (in a playful symbolic way we could say: the synthesis between Hume, Sextus and Democritus - and we would add Epictetus for moral theory) it is good to look at the structure and dynamics of the theories of biology and biochemistry and their surprising correspondence with certain aspects of contemporary mathematics. Consider the analogy between a tissue with its cells and the definition of an algebraic variety (or better, sub-analytic objects or general stratified objects), which has both a 'smooth', 'extensive' aspect and a local, discrete, algebraic 'intensive' aspect. The algebraic locality is much like the discrete biochemistry of a cell, and homology theory is a kind of genome or centrifuge of the algebraic structure (so too are graded ring constructions). What is lacking is of course the dynamic, transformative aspect. The connection between biochemistry and logic (or between processes and inferences) is profound, and Girard explicitly acknowledges this in his foundational papers on linear logic. We propose that the basic local-global (sheaf-theoretic) concepts of geometry be applied to logic as well. We have an organism made up of local logics considered as individual cells. For each cell, hypotheses are what is given from without, and what is deduced (asserted) is what exits the cell. The difference from standard sheaf (and topos) theory is that we have a dynamic global aspect related to (rapid) transport and distant interactions (cf. the $\pi$-calculus). There is also a TeX package which is interesting because it bridges the gap between logical (and linguistic) syntax and chemical syntax by developing a linear system to represent two- and three-dimensional chemical diagrams (just as logical expressions codify syntactic trees).

If reality is represented by $U$ then a 'perspective' or 'approximation' or 'abstraction' or 'construction' is a system of relational logical atomism $A \rightarrow U$. These perspectives can be organized by a partial order, $A < A'$ signifying that $A'$ extends $A$ or that $A$ abstracts from details of $A'$. A very interesting aspect of this abstraction is the convenient fictions of mathematical analysis, the 'passage to the infinite', seen also in statistical mechanics and the kinetic theory of gases. Some comparisons might be made with Jain logic. The theory of scale is fundamental here.

All formal systems involve the characteristics of proliferation and generativity (cf. our previous discussion of semi-Thue systems). This is also omnipresent in algebra and geometry. Hence the philosophical and scientific ideal of a 'system'. This reproduces fundamental characteristics of mental processes (note how Chatterjee's book on the Yogacara makes a connection to the broadly Hegelian subjective idealism of Gentile). But this generativity is limited by resources and is also error prone. Pyrrho and Sextus have a radically distinct approach and goal.

Open systems of logics (of whatever order) - essentially the relational syllogism in Galen, which we argue to be present much earlier - seem good candidates for pure a priori logic and for expressing the cognitive nature of human Turing completeness. This (combined with Stoic propositional logic) is the natural logic in which to express semi-Thue and term-rewriting systems, and indeed an adequate concept of the laws and rules of a game (cf. Panini's grammar). If existential quantifiers are interpreted constructively then we can interpret their presence in a theorem as an abbreviation for a more complex expression which is part of a purely open formula (let us say, second-order).

We must not look for linear progress in the history of logic, but rather cycles, periods of decadence and progress. There appear to have been huge steps backwards in many epochs in the history of logic. For instance the Kathâvatthu is (possibly) much more sophisticated than the later standard Nyâya theory of inference, before the latter subsequently becomes very sophisticated and interesting in the Nava-Nyâya school - which Ganeri argues to represent a kind of graph-theoretic ontology and dialetheic logic which also represents polyadic predicates (numbers are interpreted as such).

The syllogism $Fa$ and $\forall x\,(Fx \rightarrow Gx)$, hence $Ga$, is epistemically faulty under a substitutional reading of the universal quantifier, because the universal premise then includes as a particular case $Fa \rightarrow Ga$, which means that, given $Fa$, we already know that $Ga$. Also $a$ may be the only $x$ such that $Fx$ - hence the requirement that $F$ and $G$ be non-empty with a non-$a$ instance.

Addendum: Boethius and the medieval tradition support our interpretation of a topic as a 'maximal proposition', i.e. as an axiom. There is an interesting notion of definition as an 'unfolding'. We should mention Boethius' 'hypothetical syllogisms' in our note on Kant's logic. Can we find where Boethius is more explicit about universally quantified conditionals? Note that his indefinite propositions correspond to our indefinite-article selector and that singular propositions are introduced. Disjunction seems to have been studied from a profoundly mereological perspective (see also Kant and Hegel), which echoes our treatment.

Addendum: to our post on Hume we should add that there is much epistemic humility and agnosticism in Hume, direct reference to a mysterious, magical, unknown and unknowable 'human nature' which recalls very much Kant's similar remarks on the schematism of the pure concepts of the understanding.

Wednesday, August 6, 2025

Another view of philosophy

Philosophy is divided into pure grammar, ethics and liberation practices (eleutherology). Pure grammar holds that the main legitimate domain and goal of philosophy is the description and direct awareness of the deep structures and processes of natural language (and this includes exploration and classification of the total semantic universe). It is called 'pure' because this is to be done without any extraneous reductionism or ideological and theoretical presuppositions or interpretation, i.e. without anything beyond the domain of the analytic a priori computationalist 'core logic' discussed in our previous work. Pure grammar criticizes standard philosophy (from its historical roots onwards) for its shaky or questionable linguistic confusions and simplifications (rather than taking the formalist logical-dialectical approach we have discussed in the past). There is also the problem of ideological distortions and arbitrary dogmas both in their views and applications of language, especially the most recent physicalist and behaviorist ones (and this applies equally to many contemporary linguistic theories with physicalist, behaviorist and social-pragmatic premises, or the dubious theories which reject meaning altogether). Though of course historically many philosophers in some of their works have approached the realm of pure grammar (Aristotle's Topics, the Stoics, Buridan, Leibniz, Peirce, Frege, Montague, etc.). Some of the great linguists are revealed to have been great philosophers (Pânini, the (nava)Nyâya school, Saussure, Anna Wierzbicka). It is also 'pure' because it claims to be (or shows itself to be) cognitively and epistemically a priori and more fundamental than all the sciences: pure grammar is 'first psychology' and 'first cognitive science' (of course we distinguish between surface and depth grammar and bring to the foreground the plurality of natural languages and the study of universals). Pure grammar shows the limits of the world (without making claims about the world) and it shows what must and can be transcended in order to go beyond the world. Pure grammar challenges not so much the idea of formal logic in itself but the faulty paradigms which dominated the extensionalist logic of the 20th century (in particular its inability to distinguish between distributive and merely intensional quantification). Pure grammar challenges both dialectics (Madhyamaka, Hegel) and idealism (Yogâcâra, Plotinus) - while at the same time ascribing a certain value to both of these (and more can be said in relation to eleutherology). Perhaps Ch'an (Zen) - which we might call linguistic Buddhism (the practice of the koan) - can be seen as closely related to pure grammar in its eleutheric aspect. Pure grammar should not be confused with the ideas of Mauthner, Wittgenstein and 'ordinary language philosophy' - certain superficial similarities mask a profound, radical and irreconcilable difference and opposition (there is no question of pure grammar attempting to reduce language and meaning to any alleged social act, game or toolset; also, a language need not be in perfect working order): a task of pure grammar is the critical analysis and refutation of such schools. The same goes for structuralism (including the Prague School), 'grammatology' and postmodernist linguistic philosophy. Comparison with Husserl's Logical Investigations (and with Bolzano, Lotze, Brentano, Meinong, Twardowski, Mally and especially Marty) is an interesting and involved subject that is well worth exploring.
An important work for pure grammar is Bernard Pottier's Linguistique Générale (1974). Also Jean-Louis Gardies. Of interest is also medieval scholastic logic and grammar.

Wilhelm Wundt - Psychologismus und Logizismus (1910)

Some of our preliminary studies for our project of a 'pure grammar': Bealer's Intensional Logic, Aristotle's Second Order Logic,  On Analyticity and the A Priori.  It is important to compare the theory of definition in the Topics with modern 'analytic semantics' and its sememes. Furthermore, our papers  Differential Models, Computability and Beyond and Hegel and Modern Topology can be considered essays on regional ontologies and the pure categories of the understanding.

Pure grammar: an extended universal philosophical semantics and grammar as transcendental knowledge in the Kantian sense. Pure grammar is about seeing, about coming to awareness. It has no method. And it has nothing to do with social acts, behaviors or games. Learning language is a linguistic activity and is not a game or an action (for games presuppose language).  We cannot act without language (or logic). Sneezing is not an act.  Philosophy does not result from linguistic confusions and misapplication: rather the vice of philosophy is simply the ignorance of the importance and complexity of the pure linguistic a priori.

Pure grammar puts forward linguistic (arti)facts which, being surveyed, automatically lead to pure knowledge. That is, artifacts are presented which immediately disclose cognitive-linguistic-semantic structural symmetries, dualities and triads - as well as proto-spatial and proto-temporal a priori categories. Thus pure grammar contains a kind of pure a priori, immediately evident geometry. Pure grammar does not have as its primary aim the foundation of science, but rather eleutherology. Kant anticipated much of contemporary linguistics.

The recursion of subordinate clauses is limited in natural language (perhaps to a maximum of three levels before intelligibility is lost?). We should investigate how subordinate clauses can be interpreted or transformed away. This is of course intimately connected to intensional logic. Consider: John believes that Mary believes that John believes something about her. Can we transform this multiply embedded subordinate-clause sentence into a sentence without subordinate clauses? John believes something about Mary. That is something Mary could be considered to believe. And John believes that Mary believes that. This is of course just a way of disguising the nominalization of sentences through backwards-pointing demonstrative pronouns.

So, there is the triad of psychological (phenomenological) a priorism, logical a priorism and grammatical-semantic a priorism. A particular kind of psychological a priorism is psycho-somatic a priorism, with profound eleutheric consequences. As Wundt discusses in the text linked above, there is a deeply confusing entanglement, dance and even contest between all the members of this triad. Which one, we may wonder, holds the key to a synthetic illumination of human existence? To pañña and the overcoming of dukkha?

The correct 'phenomenism': we note that the naive, default world-view is actually 'idealist' in the sense that it is the result of an unconscious projection and constitution by an ego, a mine-making force and determination. Correct phenomenism looks at things as they really are, looks at what really is in fact there and discerns the fundamental properties shared by the things* that are there, not what is made from, by, in, or with them by the ego and the mine-making force and its constitutive conceptual proliferation. False idealism attempts to logically and causally derive the world from the ordinary ego or subject (or will). True phenomenism lets what really is shine forth as it really is and liberates from the world and from the world-tending ego and its superimposition, distortion and positing. It is seeing psycho-physicality from a neutral third-person perspective. It is the coming to intimate yet detached awareness of the process of one's own thoughts qua such and merely as such, as well as the perception of the world (and the mundane ego) as immanent and constituted in this thought-current (into which is woven sense-imagination). Yogaś citta-vṛtti-nirodhaḥ. Correct phenomenism unveils the right reference point which synthesizes and makes a compendium of the sphere of core semantic categories of human existence (and hence of thought and language and life). This is the critical 'phenomenology' we have discussed extensively in this blog.

*we do not mean 'things' in the ontological sense, but in the sense of mind-stuff manifestations.

Sunday, July 20, 2025

Philosophy of quantifiers

Are quantifiers convenient fictions with fundamenta in re? What do constructivism and dependent type theory have to say about this? And functional interpretations?

Universal quantification is either an abbreviation for an expression of finite conjunctive knowledge or else something about concepts and not about pluralities or extensions.

Universal quantification is determined by a computable function.

A universal Turing machine or equivalent machine (we will not discuss finiteness arguments here) is enough to check any proof or run proof-searches. And all machines imply human intentionality.

The first principle of synthetic a priori knowledge: that a specific computation of one machine can be taken as showing a certain (non-finitarily verifiable) property of the computations of another machine (for instance, output X cannot be produced from any input).  But this property itself may on the surface involve quantification in its expression. Again quantification needs to be seen as determined by a computable function or functional.

What is a typing judgment $t : T$? It is a statement that a given machine produces a certain output for input $t$ (inference) or for inputs $t$ and $T$ (checking).
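A hedged concrete rendering of this remark (the toy language and names below are ours, not a claim about any particular proof assistant): type inference is a machine computing an output type from the input term, and type checking is the same machine run on the pair.

    -- Simply typed lambda calculus terms and types.
    data Ty   = Base String | Arrow Ty Ty deriving (Eq, Show)
    data Term = Var String | Lam String Ty Term | App Term Term deriving Show

    type Ctx = [(String, Ty)]

    -- Inference: input t, output its type (if any).
    infer :: Ctx -> Term -> Maybe Ty
    infer ctx (Var x)     = lookup x ctx
    infer ctx (Lam x a b) = Arrow a <$> infer ((x, a) : ctx) b
    infer ctx (App f e)   = do
      Arrow a b <- infer ctx f
      a'        <- infer ctx e
      if a == a' then Just b else Nothing

    -- Checking: inputs t and T, output a yes/no verdict.
    check :: Ctx -> Term -> Ty -> Bool
    check ctx t ty = infer ctx t == Just ty

    main :: IO ()
    main = do
      let idNat = Lam "x" (Base "nat") (Var "x")
      print (infer [] idNat)                                    -- Just (Arrow (Base "nat") (Base "nat"))
      print (check [] idNat (Arrow (Base "nat") (Base "nat")))  -- True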

Is there a particular philosophical interest in considering Boolean circuits or cellular automata (or neural network) models of universal Turing machines ?

It might be possible to elaborate an alternative version of formal logic, closer to the syntax and mechanisms of natural language and without any implicit extensionalism. We call this system natural term logic (NTL). The building blocks consist of primitive terms (denoted by capital letters) and primitive constructors (denoted by lower-case letters). To each term we associate a tag which can be $\iota$, $\omega$ or a number $n \geq 0$, and to each constructor we associate an ordered sequence of tags $t_1, \ldots, t_n, t$ denoted by $t_1 \times \ldots \times t_n \rightarrow t$. We form complex terms and associate to each of them a tag in the expected way. We indicate the tag of a term as $M^{(n)}$. The most basic constructor corresponds to predication and is denoted simply by concatenation in some cases. There is also partial predication. The tag $\iota$ is for individual substances, and $\omega$ indicates that a place in a constructor can accept any term.

NTL rejects unbounded quantifiers and does not presuppose extensionalism. Universal quantification in its most basic form is given by a constructor $a$ of tag $1 \times 1 \rightarrow 0$. Thus for $M^{(1)}$ and $N^{(1)}$ we have the $0$ term (i.e. propositional term) $aMN$, which can be read 'all $M$ is $N$'. Likewise we have a similar constructor for 'some'. Let $F^{(2)}$ be the primitive relation term of fatherhood. Then how do we form the $1$ term corresponding to having a father? Do we have implicit boundedness here too? Before discussing this we can mention that we posit a class of constructors for relational terms which permute or diagonalize certain arguments. But back to 'having a father'. We need a constructor $s^{1 \times 2 \rightarrow 1}$ such that $sMF$ would be read as 'being an M having an F'. We can also deal with a unique existential quantifier.

Can we conceive of a logic in which syntax and logic are intertwined (a way of dealing with non-denoting terms)?

Notice the difference between relations and (choice) functions. We can take a binary relation $R$ and obtain a (partial) function $cR$ which assigns to each $M$ a possible $F$ (if any). This suggests that we should consider an alternative approach to NTL, closer to Church's intensional logic (but without variables and with bounded quantifier constructors) - or indeed to Stoic logic! Each term will have a type. The genitive and other natural language constructions are clearly functional. Thus for $M, N : \iota \rightarrow p$ we obtain $\Pi_1 MN : p$, and for $R : \iota \rightarrow \iota \rightarrow p$ we obtain $\Pi_2 MR : \iota \rightarrow p$, 'being the R of all M'. We have a family of choice functions. For instance $cR : \iota \rightarrow \iota$ corresponds to 'an R of something', which is not the same as 'being the R of something'. To define paternal grandfather we still need a combinatoric version of lambda terms (see our paper on Bealer's logic).

Being the father of someone's father, that is, $\lambda x y.\, F x\, (\iota z.\, F z y)$, which is then to be converted according to our formulation of Bealer's logic. The mechanisms of natural language for expressing multiple generality and logical constructions in general are wondrous and intricate.

Project: find a variable-free combinatoric version of our system ASOL. The metasyntactic operators can be expressed directly in the logic through the 'second-order' functions.

Did the Stoics anticipate dependent type theory (DTT)? If one argues that 'all men are mortal' would be expressed implicationally by the Stoics as 'if something is a man then it is the case that that man is mortal', this corresponds to there being a term of type $\Pi (x : \text{man})\, \text{mortal}(x)$. This term is a 'function' which takes every man $m$ to a proof of the proposition '$m$ is mortal' (but the codomain can also be another kind of type, such as what is called a 'set', and even one that does not depend on $x$). Of course there are other readings in dependent type theory. In any case, much remains to be understood about quantification and the philosophical implications of homotopy type theory.

We can take 'all men are mortal' as a function which takes each man $m$ to a witness or justification or proof or evidence or element of the set $mortal(m)$, which is a kind of space for the mortality of each man. Note that for adjectives such as 'big' this is necessary: a big flea does not use the same standard as a big house. The fibers have only an analogical correspondence. If we want to read a type for individual substances in an extensional way, the types for adjectives or qualities have to be considered otherwise. Notice how complex 'some A are B' is in DTT.
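A minimal sketch in Lean 4 of the two readings just mentioned (the names Man, Mortal, A and B are assumptions introduced only for the example, not anything the Stoics wrote):

    -- 'All men are mortal' as a dependent function type: an inhabitant sends
    -- each man m to a witness of Mortal m.
    axiom Man : Type
    axiom Mortal : Man → Prop

    def allMenAreMortal : Prop := ∀ m : Man, Mortal m

    -- 'Some A are B' as a dependent pair: a witness is an A together with
    -- evidence that it is B (one may instead use a Type-valued Sigma, which is
    -- where much of the extra complexity in DTT comes from).
    axiom A : Type
    axiom B : A → Prop

    def someAAreB : Prop := ∃ a : A, B a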

See also our latest note on 'Natural Term Logic' (available on researchgate and academia.edu)

Saturday, July 19, 2025

Commentary on the first book of Aristotle's Topics

In this note we assume the reader is familiar with the system of logic expounded in our paper 'Aristotle's Second-Order Logic'.  We refer to this system now as second-and-a-half order logic as it sits between second and third-order logic. In the above paper we argue that this is the natural logic to formalize Aristotle's philosophy. 

Chapter 1. Aristotle's definition of 'syllogism' here is quite general (and should not be confused with the 'syllogisms' of the Analytics); a good translation would be 'inference', the kind of inference represented by a sequent in the sequent calculus (with the cut rule) with one formula on the right: $A_1, A_2, \ldots, A_n \vdash B$. Aristotle's 'true' or 'primary' things are sequents of the form $\vdash A$.

Chapter 2. But what is dialectic in the Topics? Does it investigate the axioms of the particular sciences themselves (which cannot be investigated within those sciences)? The passage 101b1-3 is mysterious. Dialectic is a critical path having the 'beginning of all methods'.

Chapter 4. The protasis. Each protasis indicates (is made up from) either property, genus or accident. Difference is strangely classed as 'pertaining to genus (generic)'. And 'problems' can be constructed from every protasis by changing the 'mode'. This is a very subtle and interesting point regarding intensional logic and, so it seems, a term-formation corresponding to interrogation.

Chapter 5. This is a very important section. We have a 'logos' which 'semainein' (signifies). It is not clear if the logos is meant as a mere signifier or as the sign (signifier + signified). Here logos is contrasted with onoma. Apparently this corresponds to the difference between simple and complex terms. We see that there are definitions of complex terms. An important question is: can we accept a definition consisting of a simple term? Here Aristotle hesitates but admits that such protaseis are at least useful for definition. A fundamental concept is that of 'antikategorein' (to be convertible with). In 'Aristotle's Second-Order Logic' we argue that the Fregean distinctions between Sinn and Bedeutung as well as between concept, object (and extension) are all present in the Topics. A is convertible with B for Aristotle if A and B are predicates with the same extension (but not necessarily with the same meaning). Aristotle's formula in 102b20-23 is: if (it) is A then (it) is B, and if (it) is B then (it) is A. Clearly definition and property have the same extension but different meaning (they do not both signify the essence). The rest of the discussion is valuable for the elucidation of 'accident' and how it overlaps with relative (and temporary) property. The example of the 'only man sitting' (in a group) suggests a connection to definite descriptions of individuals.

In second-and-a-half order logic all quantifiers should be bounded - but how are we to interpret quantification, for instance in chapter 1 of the second book? How did Aristotle distinguish between 'all men are animals', 'man is an animal' and 'animal is the genus of man', that is, between extensional and intensional predication? In that chapter Aristotle seems to give the rule: from $\Gamma \vdash \forall_{x:Ax} B$ we can derive $\Gamma \vdash \exists_{x:Ax} B$. In the beginning of chapter 6 of the second book we find the Stoic exclusive disjunction. It is interesting that 'connectives' for Aristotle are just as much term operators as operators on propositions.

We are concerned with finding evidence for the natural deduction rule of existential quantifier introduction - for instance, Aristotle explicitly stating that we can infer from 'Socrates is mortal' that 'some man is mortal'. Unfortunately direct evidence is lacking. Rather, it would seem that a dependent, alternative version of existential quantifier introduction is required. We could reserve universal quantification for explicitly distributional, extensional quantification (such as in the expression of the topics themselves) and in other cases use $\gamma$, etc. And $\exists^\gamma_A B$ would be a third-order predicate meaning $\exists C \prec B\, \gamma AC$. Thus our new version of the existential quantifier rule (which is a topic discussed in book 2, 109b) would look like this: from $\gamma AC$ and $C \prec B$ deduce $\exists^\gamma_A B$.

Chapter 7. Peri tautou (sameness, identity). There is a kind of homotopy or qualitative sameness considered here, and the example of the water drawn from a given spring is noteworthy. The question of identity is examined in our paper and is crucial with regard to extensionalism. What is Aristotle's 'arithmetical identity': the names being many, the thing being one (103a9-10)? Here Aristotle may be interpreted as postulating that identity is not a primitive notion but is polysemic and is to be defined in terms of either homonymity, definition or property or even accident! This rules out any extensionality (Frege's Law V).

Chapter 8.  Definition consists of genus and differences and these are said to be 'in'  the definition. There is still the question, however, of the precise relationship between difference and accident. 

Chapter 9. For Aristotle 'kategoria' means 'predicate'. What is the relationship between onoma, pros, protasis, logos and kategoria? Category in the ordinary sense is actually 'genus of predicate'. This chapter is very important and very subtle. The ten classes seem to be genera both of predicates and of things in general - an ontology. In our paper (and in Modern Definition and Ancient Definition) we raise the question of the definition of objects that do not belong to the category of substance. Elsewhere we have inquired about Aristotle's view on statements of the form 'A is A'. Aristotle states here that if the thing and the predicate belong to the same class then we have an essential predication, otherwise we do not. But how can we accept Aristotle's example of predicating 'man' of a given man as a predication according to essence? How can 'white is white' signify essence?

Chapter 10. This chapter offers us the rudiments of a new kind of intensional logic: a doxastic logic, and is of considerable interest.  We can think of a modal operator Dox(P) satisfying certain logical rules. 

Chapter 13. Differences of meaning of a term, and a term qua term, can themselves be objects of propositions. To formalize the Topics we thus may need a third-order semantic identity relation.

Chapter 15.  It would be interesting to investigate formal systems in which each term is assumed to be interpreted as having possibly a set of references (and meanings) rather than one (or none).  This is the kind of polysemic logic that looms large in the Topics. A kind of semantic set theory, perhaps.  The task is to construct expressions which are singletons and to detect them within the formal logical and grammatical rules of the system. Aristotle must accept that there is a notion of semantic identity (which is not the same as that of 'antikategorein' or extensional equivalence).  We tried to formalize this notion in our second-and-a-half order logic. See the previous remark. 

Thursday, July 17, 2025

The Legacy of Abel in Algebraic Geometry

https://publications.ias.edu/sites/default/files/legacy.pdf

We hope to share similar papers on Galois, Euler, Cauchy, Lagrange, Legendre, Bolzano, Hamilton, Gauss, Sophus Lie, Dirichlet, Dedekind, Grassmann,  Kummer,  Sylow, Liouville, Wronski, Riemann, Weierstrass,  Schröder,  Hecke,  Sofya Kovalevskaya, Hilbert, Poincaré, Krull,  Couturat, Hermite, Picard, Camille Jordan,  Poussin,  Felix Klein, Ramanujan, Hermann Weyl, Teichmüller,  Élie Cartan, Henri Cartan, Oka, Ehresmann, Pontryagin, Siegel, etc.  And also more recent mathematicians such as Smale, Thurston, Hamilton, Milnor, etc.

A philosophy of mathematics:

i) the importance of the rigorous logicist ideal of Leibniz and Frege (not to be confused with formalism)

ii) but equal importance to the training of (higher-dimensional) geometrico-dynamic intuition

iii) and preserving a connection either to philosophy or to applied science (and the unity of mathematics itself)

iv) and the dangers of wrong or faulty abstraction (not to be confused with good, natural or intelligible abstraction)

Thus the wrong directions or trends in mathematics have three aspects: the deviation into unintelligible and exaggerated abstraction, the loss of logical rigor and clarity in concepts and proofs, and the loss of the philosophical vision of the unity of mathematics as a whole and of its connection both to philosophy and to science. To this we add the lack of criteria to evaluate progress and quality in mathematical work (and to separate it from mere programmed automatic productivity without a unifying synthesis, transparency and purpose).

Much harm has been done to mathematics (including the teaching of mathematics) through the distortion of core disciplines in number theory, geometry and analysis via wrong and deviant abstractions which obstruct both the logical and the intuitive clarity and essence (and dare we say beauty) of the fundamental objects of study. There is also a bad habit of naming theorems after people who merely stated them (including when they produced erroneous proofs) rather than after the person who actually proved them.

Consider also the arbitrariness in a certain selection of 'great problems'. And so often the proof of a famous conjecture resembles the building of a cathedral, a collective collaborative endeavor spanning several generations. How unfair it is to give the prize merely to the person who happens to put the cherry on top when the real hard work and the deepest insights and ideas belong to other people - especially if the said 'cherry' is incomplete or has some dubious elements or 'holes'.

Pedagogically, for undergraduates and basic graduate courses, the focus should be on finite extensions $\mathbb{Q} \subset F \subset \mathbb{Q}^{ab} \subset \mathbb{C}$ and on $\mathbb{F}_q[x]$, wherein all the richness and essential content of basic algebraic number theory (including Galois theory) appears naturally and in a more illuminating way than with standard abstract approaches. Algebraic geometry should be concretely focused on complex algebraic curves (and surfaces) using, among other approaches, the excellent philosophy of the book on the same subject by Kirwan - therein the concrete essential algebra of polynomial rings over the complex numbers (or the core ideas about polynomial and power series rings of Hilbert and Krull) and complex analysis (the Riemann surface approach - via elliptic functions and the Weierstrass elliptic function; also modular groups) are brought into play. Elliptic functions and elliptic curves (including over the rationals) should also be fundamental. In algebraic topology the foundation should be concrete topological and combinatorially based homology theory and natural cohomology theories (de Rham and, to a certain extent, Čech), avoiding abstract cohomology and relative (co)homology - and we need to focus on the right kind of 'space', more concrete and adequate than topological manifolds: spaces like complex varieties and the 'stratified' spaces of René Thom. Sheaf theory should return to its concrete and complex-analytic roots in Leray and Oka, and the traditional topological formulation should be revived (based on local homeomorphisms and covering spaces). Real algebraic geometry must be revived. Discard derived categories and go back to spectral sequences.

How mathematics leads to reversion to the logoi and to the nous

[Figure: a layered cube whose lowermost layer is tiled into cells (small cubes), with a column of cubes rising above each cell.]

The following note sketches some ideas that attempt to make sense of Proclus' theory of mathematics and dialectic in the Commentary on the First Book of Euclid's Elements and the Commentary on the Parmenides. How does the study or doing of mathematics lead to the unveiling of the system of the essential logoi in the soul and consequently to the soul's reversion (according to its mode) to the nous? What mathematics should be studied or done, and how should it be approached? Is there an essential philosophical difference between ancient and modern mathematics?

To attempt to answer some of these questions we propose the following theory of mathematics. The structure of mathematics (be it ancient or modern) resembles the structure of living tissues: it is composed of a grid, a tiling, of 'cells' which are also evidently (logically and conceptually) interconnected. But each cell (even if incomplete and fragmentary from a purely formal mathematical point of view, from the point of view of the concepts employed and results derived) exhibits a certain essential unity and sufficiency from a higher perspective.

In the figure above the lowermost layer of the cube represents mathematics with its natural division into cells (small cubes), each representing an autonomous intelligible unit of mathematical theory. It is important to be able to carve out mathematics according to its natural cells or units. Now mathematics is constantly growing (both in scope and in detail) and self-revising. But this growth should be represented as a horizontal growth, the expansion of the lower layer of the cube (adding new cubes). Over each cube in the lower layer is a column of cubes progressing in the upward direction. These represent the progressive unveiling of the logoic and noetic content of that particular mathematical cell: for each mathematical cell is like a microcosm of self-sufficient intellectual and noetic content and potential. It would be more accurate to represent the cube as converging like a cone in the upward direction, for the ultimate goal of the vertical process of every cell is the same. It is this upwards interpretation which is also a source of synthesis and progress in mathematics. It is clear that Proclus' anagogic process cannot depend in any way on the further horizontal progress of mathematical theory (or on the difference between ancient and modern mathematics). Rather it must be sufficient to consider one (or a few) genuine mathematical cells and use it as a starting point for the anagogic process.

Common mathematical practice is concerned almost exclusively with horizontal expansion and the birth of more cells, a frenzy for finding proofs, defining concepts and producing new results - which justifies in a certain sense some of the censure addressed at mathematics in Hegel's Science of Logic (the proofs are left behind like a ladder). There is not so much of a return-to-self via dwelling on a given cell, or a gradual development and deepening of the philosophical and spiritual intuition of a given organic unit of mathematical theory. All genuine units of mathematical theory have at first sight something 'difficult', 'mysterious', 'non-evident' or 'surprising' about them (and this is the source of the addictive nature of mathematics), even if this be regarded as proceeding from a mere fortuitous combination of clever tricks.

Thus for the Proclean anagogic and reversion process based on mathematics our first, vitally important, task is to identify natural intelligible cells, noetically self-sufficient units, in the great body of mathematical literature and knowledge.

And yet there are so many factors and qualities involved in a portion of mathematical theory that it seems difficult to assign perfection, completeness and sufficiency to any given theoretical portion (either ancient or modern). So the corresponding anagogic process will, it seems, always be only approximative, if we consider merely its dependency with regard to its purely mathematical basis. Something else will be required to supplement the defect.

Sunday, July 13, 2025

Problems of philosophy

What is the nature of 'dialectics' according to Plato, Plotinus and others (in particular in the context of the Parmenides)? How did it relate to other forms of ancient logic such as Stoic logic and Aristotle's Topics? There is also the following interpretation of some aspects of dialectics. Given a logical system L we can study different axiomatic theories in L and how they relate to each other (for instance, whether they are mutually consistent) and whether they are in themselves consistent or incomplete. A major paradigm is starting from a hypothesis H and arriving at a contradiction, or starting from the negation of H and arriving at a contradiction and taking this to be (as in classical logic) a proof of H.

And we can study different logical systems and their relationships, as well as the relationships of their theories. However all such logical systems use and epistemically presuppose recursion theory and arithmetic - and along with deduction they exhibit some of the order-characteristics of temporality, and, it seems, of cyclic temporality. Also the different theories can be projected outwards in the form of concrete models, specially geometric models. Such models in turn can lead to other discoveries. And models can be reflected in other formal systems (see our theory of reflection) and concepts such as categoricity come into play, which must not be misunderstood in some kind of absolutist sense. And from a Fregean point of view we can consider the theory of the informal elucidation of the primitive terms and axioms. It is not at all clear how considering different theories (hypotheses) in logical systems can lead to the disclosure of the primitive terms - but we must consider first of all the problem of the meaning of meaning, proposition, truth, of the logical connectives and quantifiers, as well as of the concepts of deduction and inference (this is already a self-reflection of logic), etc. The quest for primitive terms must involve the theory of definition. All these primitive terms, definitions and axioms in logical systems and theories concern the foundations of all possible knowledge and thought.

Mathematical logic and in particular formal theories of arithmetic and recursion must be seen as a reflection-into-self of logic, recursion theory and arithmetic themselves. Gödel's incompleteness theorems are a unique example of reflection-into-self followed by reversion. Arithmetic projects itself outwards, reflects on the insufficiency of this projection and, at the same time, mediated by the projection, cognizes a truth that leads it back to itself: the fact that the undecidable sentence is in fact true. Gödel's famous result gives us noetic knowledge.
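In the usual notation this reflective step can be compressed as follows (this is simply the standard formulation of Gödel's first incompleteness theorem via the diagonal lemma, recalled here for reference):

```latex
% Diagonal lemma: there is a sentence G such that
\mathrm{PA} \vdash G \leftrightarrow \neg\,\mathrm{Prov}_{\mathrm{PA}}(\ulcorner G \urcorner),
% and, assuming PA is consistent,
\mathrm{PA} \nvdash G \qquad\text{yet}\qquad \mathbb{N} \models G .
```

The projection is the arithmetization of syntax, by which arithmetic speaks about its own provability; the recognition that $G$ is nonetheless true is the moment of reversion spoken of above.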

Thus the projection into formal systems, bound up absolutely with recursion theory and arithmetic (and hence combinatorics, graph theory and finitary set theory), is part of the cyclic process of investigation of the primitive concepts of thought (which appear to be known and clear but actually are not), a process which unfolds through formal projection and through clash and comparison with other projections, and which hopefully leads to self-reversion.

What is the relationship between the more purely 'logical' primitive terms and others which seem to relate more to ontology, metaphysics, philosophy of mind, physics, etc. ? In what sense is the logical more fundamental (the old question of psychologism, etc.) ? These concepts must be treated in the same way as logical and mathematical ones (see the quote from Leibniz under the blog header).

We can suspect a term is primitive if it does not seem to be easily definable. Can we define the logical connective 'and' ? We could group it together with the other connectives and specify it by its truth-value properties in inferences, but in doing so we are already making use of it: for example, in saying that 'A and B' is true iff A is true and B is true.
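A minimal sketch of this circularity (the function name is ours): any tabulation of the truth conditions of 'and' already uses conjunction in the metalanguage - here, the host language's own `and`.

```python
def conj(a: bool, b: bool) -> bool:
    # "'A and B' is true iff A is true and B is true" -- the definiens
    # reuses the very connective being defined (Python's `and`).
    return a and b

# tabulating the connective presupposes it at every step
for a in (False, True):
    for b in (False, True):
        print(a, b, conj(a, b))
```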

The biggest error of western philosophy was abandoning the neoplatonic (and augustinian) concept of the soul as an autonomous immaterial substance with potentially unlimited epistemic and ontological capabilities, and taking the 'self' to be merely the peripheral and mixed aspects of ordinary somatically and sensually conditioned psychological experience (this is the target of the original buddhist theory of anatta), or having a 'depth psychology' and elaborating a theory of the 'unconscious' or 'subconscious' based merely on inferior aspects of the soul while totally ignoring the true spiritual depth which is both 'within and above' oneself.

The neoplatonic philosophy of mind and consciousness (through its theory of analogy and of projection and reversion) allows us to reconcile logicism, realism and 'psychologism', both species-relativism and absolutism, and both subjective idealism and natural science. Note that category theory, besides being a rather specialized theory of relations, is at the same time an interesting example of a theory of analogy - and this is how it arose in the first place.

With regard to mathematics: how are we to understand why and how complex analysis and complex analytic geometry (developed for instance by the great geniuses Abel and Riemann) became so central in 19th century mathematics and beyond ? How are they connected to problems in number theory and physics (and the significance of the work of Grassmann and Clifford is yet to be fully explored) ? Hyperbolic geometry seems to be of immense philosophical interest; it perhaps represents the geometry of the soul or nous as opposed to the geometry of nature. The Beltrami surface gives an image of the 'inverted sphere', the return-to-self which is also the projection to infinity of the soul. Hyperbolic geometry expresses the consistency of infinitely many different possibilities (through a point outside a given line there pass infinitely many lines which do not intersect the given line).

The cause of the descent of the soul must be some kind of internal disorder and forgetfulness which, by means of the descent, is projected and given external manifestation intimately correlated with the soul's own inner activity, the goal being that the soul will recognize, through the world and through this correlation, the very internal disorder and forgetfulness it started with - but now known clearly as such - and by this insight be led to a spontaneous and total 'reversion'. So the descent of the soul is a fundamental 'mistake' and a 'fall' which at the same time is necessary to cure the internal 'mistake' that the soul was carrying within herself before the descent.

Friday, July 11, 2025

Prop. 1 of Proclus' Elements of Theology and Brouwer's intuitionism

The proof of the first proposition of Proclus' Elements of Theology is among the most difficult to understand from a formal point of view. Here is our attempt to make some sense of it using concepts which are also employed in Brouwer's intuitionism (or certain forms of finitism) - the proof then assumes a structure somewhat like the standard proof of König's lemma.

The proposition reads: every multitude partakes in some way or another of the One.  We take 'multitude' to be represented by a mereological relational system in the form of a tree.

Consider the following interpretation. Proclus assumes that no tree  can have more than a countably infinite number of nodes (and hence branches) because, for him,  there is no infinity greater than countable infinity (the cardinality of the natural numbers). 

Hence there cannot exist a tree in which every node has more than one successor, and a fortiori one in which each node has infinitely many successors - because then the set of branches would be of the cardinality of the continuum.
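For reference, the standard counting fact lying behind this step (stated in modern terms, not Proclus' own): the complete binary tree has only countably many nodes but continuum-many infinite branches,

```latex
|2^{<\omega}| = \aleph_0 , \qquad |2^{\omega}| = 2^{\aleph_0} .
```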

Proclus' proposition attempts to characterize trees with countably many branches.

Here are at least three types. Type 1 may have finitely many infinitely-branching nodes but all branches of finite length. Thus it participates in unity in a Type 1 way (we may think of each terminal node as a 'unity').

Type 2 may have infinite branches but only finitely many nodes with more than one branch passing through them. Would Proclus accept this ? What are we to make of such chains (perhaps they express return-to-self) ? 

Type 3 has finitely branching nodes and finite-length branches (what in intuitionism is called a 'barred spread'). Note that in a finitely branching tree, if the length of the finite branches is not bounded then - in classical mathematics - König's lemma implies that the tree has an infinite branch. The contrapositive of this lemma - called the fan theorem - is in fact intuitionistically valid. Thus Type 3 trees must have bounds for the length of their branches. This is certainly a participation in 'unity' and 'limit' !
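Here is a small computational illustration of this last point (entirely our own; the particular bar is an arbitrary example): for a decidable bar on the binary fan, a breadth-first search of the unbarred finite sequences terminates exactly when the bar is uniform, and the depth reached is the bound demanded of a Type 3 tree.

```python
def barred(s):
    # example bar (an assumption chosen for illustration): a finite 0-1
    # sequence is barred once it contains a 1 or reaches length 5
    return 1 in s or len(s) >= 5

def uniform_bound(barred, max_depth=20):
    """Breadth-first search of the unbarred nodes of the binary fan.
    Returns a uniform bound on branch length if one is found within max_depth."""
    frontier, depth = [()], 0
    while frontier and depth < max_depth:
        frontier = [s + (b,) for s in frontier for b in (0, 1)
                    if not barred(s + (b,))]
        depth += 1
    return depth if not frontier else None   # None: no bound found up to max_depth

print(uniform_bound(barred))   # 5 -- every branch is barred by length 5
```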

The following considerations may also have a connection to Proclus' proposition, but we leave this for future study: a basic fact about logic is the need to work with (complete) Boolean algebras or (complete) Heyting algebras as truth values, and that predicates assign such values to each element of a set (Boolean-valued models, tripos, etc.). A 'set' is a hierarchically structured tree in which at each level 'membership' is assigned an algebraic value. Consider also the authentic meaning of dense and generic subsets of a partially ordered set $P$. If $P$ represents an infinite 'tree' in which branches can join, we can think of a chain in $P$ as representing possible states of (finite, imperfect) information regarding an object along time. A dense subset is a kind of sequence of bars which guarantees, for a given set, that as time progresses that set will also progress. A generic set represents the infinite continuous information trajectory of a possible cohesive object. But a generic set is more than that: it is a kind of amalgamation and transcendent diagonalization of the cohesive objects which belong to the transitive model M in question. Since it touches all it can be entirely none (when $f = \cup G$ it is like a synthesis, integration and transcending of all the limited and possibly contradictory points of view in the ground model M).

Cohen's forcing proof (ignoring for the moment the highly questionable philosophical value of transitive models of ZF, and indeed of ZF(C) as a foundation for mathematics) seems to be a rather ad hoc combination of many different ideas and tricks (stemming from Skolem's fundamental insight on countable models) which definitely demands further clarification, simplification and refinement. What about when sheaf semantics comes in ? Predicates with generalized truth values correspond to sheaves over the truth values. In the sheaf semantics approach to forcing the generic object is not required (or is implicitly constructed). Consider the forcing poset $P$ used to falsify the continuum hypothesis. An alternative would be to take $P$ as a pure poset and to construct a presheaf explicitly which associates to each $p \in P$ a finite function in the form of a finite subset of $B \times \mathbb{N}$ satisfying the expected compatibility conditions (i.e. functorial conditions), where $B$ is some sufficiently large cardinal (and any such finite subset is represented uniquely by some $p$). Is the presheaf topos on $P$ then already an intuitionistic model which falsifies the continuum hypothesis, since we can construct the expected mono in a coherent way ? Is the double negation topology (which corresponds to the dense topology on $P$) used only to obtain a Boolean topos ? It appears that this is not so: the double negation (or dense) topology is required for the local mono condition which makes the total sheaf morphism mono.

There is a very deep philosophy behind this sheaf-theoretic construction. An object that cannot exist at once can yet exist spread out through time (or a space of situations) if it does so in a coherent way (the globality of a sheaf is determined essentially by its local qualities: thus if something is locally a mono in a coherent way, the sheaf morphism is mono). A fundamental insight is missing to understand the logical and model-theoretic properties of the category of sheaves. Note that we are considering presheaves over $P$ which, unlike a topological space, has no maximal element.
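For reference, the Cohen poset alluded to here can be presented as follows (this is only the textbook formulation, recalled to fix notation; $B$ is a set of size $\aleph_2$ in the ground model $M$):

```latex
P \;=\; \{\, p \;:\; p \text{ a finite partial function } B \times \mathbb{N} \rightharpoonup 2 \,\},
\qquad p \le q \;\iff\; p \supseteq q .
% For a generic filter G, f = \bigcup G codes the family f_\alpha(n) = f(\alpha, n);
% the dense sets D_{\alpha\beta} = \{ p : \exists n \; p(\alpha, n) \neq p(\beta, n) \}
% force f_\alpha \neq f_\beta, so that M[G] contains |B| distinct reals.
```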
Could we construct a topos in which there is a bounded non-constant holomorphic function ? The construction above is like a completion, like a limit, and like the construction of infinitesimals (an internalization of flux into an object). The construction of a topos of sheaves in which every function is continuous is just as interesting and important as (if not substantially more so than) the forcing methods for transitive ZF models. The same goes for the discovery of the topos-theoretic version of forcing. Indeed, consider the sheaf model in which all functions are continuous. Classically, continuous functions have the cardinality of the continuum, while general real functions have that of $P\mathbb{R}$.

Friday, July 4, 2025

Spiritual development

There is the following ancient and widespread theory regarding consciousness (which is found presented in mythological, philosophical and highly detailed practical forms). Human consciousness normally finds itself in a state which is very different from its original state, or from the states of which it is ultimately capable. This situation has a cause. Consciousness is mapped out according to certain domains and powers (without implying that they are not all closely interconnected) and it is found that for each of these domains and powers (which we can come into conscious contact with) there is a certain obstacle or counter-energy, in particular in the form of deeply-ingrained habits and tendencies. All these obstacles work together to ensure consciousness stays in its current state. If to each of these obstacles and 'illnesses' we apply the right remedy and 'virtue' (in the form of a deeply-ingrained counter-habit and counter-tendency) then these will function like so many keys which will remove the shackles binding consciousness to its unhappy condition. Be it noted that the transformation involved is total and radical and all-encompassing. This freed and purified consciousness becomes apt to receive higher influences and powers and to be ultimately transformed and transfigured into its original state. Since there are billions of different consciousnesses it is natural that there are many different kinds of corrupt and fallen conditions which require subtle differences and balances of medicinal virtues and counter-energies. And as regards religions and spiritual and esoteric traditions and philosophies, besides the pure universal moral law this is the only important and valid core we should look for - and they should be purified again and again (including through the restoration of symbolic and esoteric hermeneutics) until only the pure gold of the core shines forth. Anything beyond morality and yoga, or which does not contribute directly to them, is to be utterly rejected.

The synthesis between ancient (neo)platonic philosophy (ultimately deriving from Orphism) and Buddhist philosophy (together with the traditional darshanas and daoism) offers us a solution to all the problems of modern philosophy (and morality and culture) as well as a reconciliation between ancient and modern philosophy. The connection between Hegel and neoplatonism is very deep, but it is neoplatonism that should be taken as our guide and authority. Also Buddhism appears to have exerted a huge influence on the ancient world. Thus we have the two 'good angels' of the west (who often had to go 'undercover').

We must not get lost and drown in the tempest and torrent of our minds, trying to do introspective psychology and platonic dialectics without preparation. Rather we must first return to the root, uncovering the vast unknown aspects of our bodies, feelings and general aspects of consciousness which have great practical consequences. We need to know what this 'we' is, where it needs to dwell and focus, and what it should do. It is not easy to understand the authentic original meaning of satipatthâna and thus its perfect agreement with and complementarity to platonism.

There are certain difficulties involved in reconstructing buddhism in its most original authentic form as well as in extracting the most relevant philosophical exposition thereof (and we highly recommend Bhikkhu Ñânananda's book Concept and Reality). The Nikayas are a vast and complex collection of texts which demand careful historical-critical analysis. The collection of texts in the abhidhamma division of the Pali Tipitaka is likewise a complex and heterogeneous collection clearly reflecting later sectarian dogma but also containing older material of the highest philosophical (and logical) value and interest (and we must not forget the importance of studying the parallel Chinese versions of many sections of the Pali canon).

The complete mutual consistency, complementarity and even essential identity between platonism and original buddhism may appear to be quite a controversial claim, even if the connection to Pyrrhonism has gained some scholarly acceptance (cf. C.I. Beckwith's 2015 book Greek Buddha). Some important points are the following:

i) The meaning of the Buddha's employment of the term anattâ became lost and confused with a doctrine of the denial of the existence of a 'soul'. In reality this term is used as part of a practice of dis-identification (cf. the Atthakavagga) entirely consistent with Plotinian anthropology (for instance Enn. I,1) and purificatory practices. We have also written about the uncanny correspondence with Aristotle's De Anima.

ii) The correct methodological and epistemic role of dialectics and of cognitive abstention involving undecidable or equipollent pairs of propositions was also ultimately lost, leading to a confusion with logical and conceptual nihilism and relativism. Thus neither madhyamaka nor Pyrrhonism is consistent with original buddhist dialectics. Rather such dialectics (see Ñânananda's excellent discussion in Concept and Reality) most closely resembles the anagogic and gradual process of Platonic dialectics (see Enn. I,3).

Also (as Jayatilleke holds in his famous book) original buddhism was based on direct evidence (which in modern terms could be described as 'positivism', 'phenomenology' and 'the return to the things themselves') simultaneously with the cultivation of the 'eye'  which is necessary to see things as they really are - and in this again there is perfect agreement with platonism.

iii) The sophisticated formal logic and ontology of Stoicism certainly was known to have played a role in neoplatonism and even middle platonism (specially in the context of the controversy between the stoics and later academy) - and we can inquire into the relationship between the Stoic lekta and Proclus' theory of the logoi (in a proto-Fregean way Platonic ideas at a discursive level can be seen as incomplete lekta). Likewise buried within the Pali abhidhamma literature we find (as already acknowledged in the literature on the Katthâvatthu)  a fairly elaborate deployment of formal logic and a sophisticated theory of types of cause.

We note that in neoplatonism the logoi of the soul and the 'ideas' of the nous are to be understood as living beings in communion with each other in a kind of eternal process of cyclic generation and unification...

iv) Both buddhism and platonism have cultural-political dangers and problems. But note that the passages on race and caste found in the Pali texts are some of the most important in the history of mankind. The philosophical content of original buddhism allows us to reject mythological interpolations regarding kamma, previous existences and the afterlife - without rejecting such concepts in themselves or an alignment between the ethical and cosmic law. A problem in the subsequent development of buddhism is the order of bhikkhus itself becoming somewhat like the traditional brahmin caste in all except the requirement of birth: for instance the claim that a layman cannot attain full enlightenment, or the greater emphasis placed on accumulating merit by supporting the monks than on personal spiritual development or doing good to others. A problem with original Platonism is the militarism and totalitarianism (among other troubling aspects) of the Republic as well as many aspects of the Laws. Militaristic values were deeply ingrained in the fabric of Athenian society (and there were of course natural historical causes for this) and it is noteworthy that the iconoclasm of the famous passage of the Theaetetus which rejects many key values of contemporary Athenian culture does not touch the adulation and idealization of the soldier and warrior (or indeed of the athlete). The concept of a 'noble lie' is one of the lowest points of the surviving Platonic texts. We hope to show that we can reject all these problematic elements on the basis of the Platonic philosophy itself.

Wednesday, July 2, 2025

Parapsychology and the philosophy of science

It is far from clear what exactly the so-called 'scientific method' is, but it is clear that it is actually a complex and fluid combination of various different methodologies and attitudes, all of which are inextricably connected - genealogically and logically - to theoretical assumptions and hermeneutic decisions.

The scientific method conceived as the 'experimental method' pertains principally to a certain limited and partial domain of reality - that of 'matter' or 'physicality', or the strictly physical-chemical dimension and aspects of living beings - and thus the kind of theory associated exclusively with it must be essentially an abstraction of reality (rather than a negation of other aspects of reality).

The experimental method is not logically or theoretically self-contained or self-justifying or self-sufficient (for instance it depends on previous theory, hermeneutics and mathematical theory). It has no claim to supremacy and exclusivity as a source of knowledge even in its particular associated domain, nor, a fortiori, any claims regarding other domains of reality, for which it may well be totally inadequate.

Also if the ultimate aim of physical science is the construction of machines that serve mankind and the good of the world or the development of treatments in medicine, then  the kind of deep intuition which guides the engineer or medical doctor is just as important as any experimental protocol: for there is no greater proof or validation than the machine actually working or the treatment being actually effective.

Experimental science is neither the only, nor the best, nor the most certain or most important source of knowledge (for instance there are the more certain, more important and more fundamental epistemic domains of logic, mathematics and ethics, all of which have nothing to do with physical experimentation). Nor does its particular limited domain of application exhaust the totality of reality. Nor can experimental science justify any kind of reduction or alleged correlation (supervenience) between its domain and other, different domains. In fact the actual experimental results and evidence contradict such reductionist claims. Experimental science cannot a priori impose its epistemic methodology on other domains of reality - much less claim that a physicalist philosophy is somehow justified by the experimental method itself or by its results (which is factually false).

When natural science and the 'scientific method' violate basic ethical principles, as when causing harm, suffering and death to human beings or animals in the course of their methodology and 'experiments', they show themselves to be profoundly mistaken and driven by the same kind of blind superstition, dogmatism and fanaticism they often project onto and decry in others.

Spirit, soul, mind, consciousness - this is an entirely distinct domain of reality which cannot be reduced to, and does not necessarily supervene on, physical matter (the physical brain and body). There is no reason why the experimental method should be the best method (as opposed, for instance, to an axiomatic-deductive or a first-person phenomenological and introspective method - both of which were developed to a high degree in Ancient Greece and India) for exploring and obtaining knowledge regarding this domain of reality.

And yet since spirit, soul, mind and consciousness are in a way connected to or associated with the physical brain and body, they leave their footprint indirectly in the legitimate domain of physical science. Thus it should be possible additionally to 'beat physicalism in its own domain', to exhibit tangible, measurable phenomena which even the most convinced physicalist could not deny.

This brings us to the subject called 'parapsychology'. On the surface this subject consists in certain experimental protocols which as a rule tend to lead to plausible conclusions or to bring to light evidence which is radically at variance with a physicalist worldview, or to exhibit a class of phenomena which, while involving the physical world, suggests that there are forces at play which transcend it. So parapsychology, while wearing the cloak of experimental science, patently has philosophical concerns.

There is the following major problem with parapsychological research. A massive amount of scientific activity has been funded with the goal of proving or finding evidence for physicalism (neural reductionism) or for various other theories which assume neuro-reductionist premises. A substantial and important part of parapsychological research should be devoted to a critical analysis of such experiments and their methodology and protocols, showing how they completely fail to establish physicalist claims but rather strongly suggest opposite conclusions. Also parapsychology should point out that there is a massive amount of direct evidence (which was not obtained in a parapsychological context) suggesting the untenability of neuro-reductionist physicalism. There are also powerful theoretical deductions that can be made based on known neuroscientific facts (for instance regarding the impossibility of dendritic spines being involved in memory) which again refute physicalism. None of this involves 'spooky' phenomena and is perhaps not as 'fun' and 'exciting' as the usual concerns of parapsychology, and yet its importance and value are immense and fundamental.

While we hold that many of the experimental protocols and results in parapsychology are both valuable and interesting (specially the work of Rupert Sheldrake), it is a mistake to make such experiments and results the sole foundation for the rejection of physicalism (for there are much more powerful, extensive and conclusive arguments and evidence to be found elsewhere, as discussed briefly above). Indeed it seems that as a rule the researchers in this field still at least half-consciously profess a kind of confused semi-neuro-reductionism in which mind, consciousness and brain are too easily confused and conflated. This opens the door to a kind of theoretical neuroscience in which these phenomena could be explained by speculations pertaining to theoretical physics (for instance telepathy is compared to quantum entanglement). It becomes not about refuting neuro-reductionism but about exploring the quantum superpowers of the brain (or the interconnectivity of brains rather than primarily of consciousnesses).

Some other flaws we find in parapsychology are arguments from authority, which also suggest a kind of implicit western supremacy and exceptionalism. For instance: person A was a great scientist and he or she thought parapsychology was a legitimate field of study, therefore this counts as evidence that it is so. We have also seen it implied that a non-western person who undergoes a western academic education (or is involved in business) is somehow bound to be more intellectually honest (or less liable to deception) about paranormal phenomena than his counterpart who has not undergone such an education or training.

It should also be mentioned that in the past, both in the east and the west, there was already a systematic science (first-person or axiomatic-deductive) involving the kind of phenomena (or powers) studied in parapsychology, but with the caveat that no great spiritual importance was attached to them and they were rather seen as dangerous distractions and potential obstacles.

Finally we find it quite disturbing that the interest in and use of parapsychology by government, military and intelligence agencies is brought up - the military was interested in it and funded research in it, therefore this counts as evidence that there must be something to its claims (regarding, for instance, remote viewing) - all the while completely omitting to mention the terrible crimes and violations of human rights documented among such projects. This topic should first of all be mentioned as a cautionary tale: parapsychology can also be perverted and misused in criminal activities, and the aspiring parapsychologist must be wary of government and military funding and involvement.

Studies regarding the ability of directed thought to influence other minds and living bodies can have dangerous implications. For instance, if in a given community things are not going well or there seems to be consistent 'bad luck', would not the popularization of such studies encourage finding a culprit (somebody who is allegedly a source of negative directed thought-energy) and even engaging in 'witch-burning' ? Also what about government and military applications of these facts ? Or the massive activity of social media automatically generating a kind of powerful psychic field influencing public opinion, a kind of spontaneous 'brain-washing' at a distance ?

Addendum to our note 'Differentiability, Computability and Beyond'. We wish to add some considerations to this note which also have some connection to experiments with random number generators and microPK. Recall that we postulated that a truly free particle must have a completely random, completely discontinuous trajectory in space. This begets the following problems: i) define this rigorously; ii) such a trajectory is not unique - there are uncountably many such trajectories - and so there is no well-defined free state of a particle. For i) we can draw inspiration from random number generators and the mathematical definitions used there (this corresponds to the discrete case). Now in our note we considered that a field would act on this random particle in a certain way, introducing a geometric form into its associated density or distribution. The similarity to the results in experiments involving random number generators is patent.
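A rough numerical sketch of this idea (entirely our own illustration, with an arbitrary toy potential): a 'free' particle is modelled by positions drawn uniformly at random, while a field with potential V imposes a geometric form on the density of positions - here via rejection sampling against exp(-V).

```python
import math, random

def free_positions(n):
    # 'free' particle: positions completely random (uniform) on an interval
    return [random.uniform(-3, 3) for _ in range(n)]

def field_positions(n, V=lambda x: x * x):
    # V is an assumed toy potential; the field shapes the density ~ exp(-V)
    out = []
    while len(out) < n:
        x = random.uniform(-3, 3)
        if random.random() < math.exp(-V(x)):
            out.append(x)
    return out

def histogram(xs, bins=12, lo=-3.0, hi=3.0):
    counts = [0] * bins
    for x in xs:
        counts[min(bins - 1, int((x - lo) / (hi - lo) * bins))] += 1
    return counts

print(histogram(free_positions(10000)))    # roughly flat density
print(histogram(field_positions(10000)))   # density concentrated around the minimum of V
```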

Just as physical bodies appear separated in space, we can ask if the multiplicity of consciousnesses is 'situated' in some kind of analogous medium (which may have a very different 'metric' or concept of separation, one which need not coincide with the spatial aspect of their corporeal counterparts). In neoplatonism this might be the 'soul of the universe', and physical space would be its emanation. So the 'geometry' of the soul of the universe must be distinct from ordinary geometry and yet this last must be derivable from it (as a special case or projection). Also (certain levels of) soul might occupy a 'body' in such a space which is more extensive and complex than the physical body in ordinary space.

Addendum: there is also the following very important point about NDEs. It is important to distinguish the hypothesis of survival - that is, of the independence of conscious experience and personality from the physical brain - from the interpretation of the content of such experiences, which may in most cases be of no more objective significance (or intrinsic value) than lucid dreaming or vivid imagination or recollection, their varied content being mostly drawn from memory and experience. There is very little agreement or correspondence between such 'lucid dreams' beyond certain generic emotions and perceptions (light, warmth). The dubious hypotheses regarding mediums and channeling apply equally to NDEs. NDEs are of less value and generally do not have anything near the transcendental cognitive content of higher states of meditation (save in their emotional content).

Tuesday, July 1, 2025

Logical notes III (Mathesis Universalis)

Many of the problems which have concerned western philosophy are just consequences of an a priori rejection (a rejection which also reflects a spiritual, cultural and intellectual regression) of the platonic philosophy (and of its sophisticated form found in Plotinus and Proclus and in the Proclean philosophy of mathematics). And indeed it seems that we can do justice all at once to the geniuses of Frege, Gödel, Hilbert, Russell, Church, Turing, Brouwer, Skolem, Gentzen, Girard, Lawvere, Martin-Löf, to Meinong and his school, and to Hegelian phenomenology and dialectics (which has a striking correspondence with Proclus' theory of eternity, time, dianoia, dialectics, the logoi and their projection, and the process of reversion to the nous).

There is nothing wrong with thinking of consciousness as a spiritual substance and as a place wherein are 'located' a system of pure concepts which are independent of and not derived from sensation or imagination.  Our access to these pure concepts is purely objective and yet they are 'subjective' in the sense that they are part of the substance of consciousness and not (directly) outside it.

They are also involved in the morphogenesis and activity of the body.

This system of pure concepts is one and the same in all human consciousnesses because it has one and the same cause beyond ordinary human consciousness, and this cause is also involved in the explanation of how the system of pure concepts adequately relates to the knowledge of nature (thus the universe is permeated by reflection and analogy). In our ordinary knowledge these pure concepts come into play; there is also a lower rank of concepts involved, derived by abstraction from sensation.

Hegel's Science of Logic gives us an illustration of the Proclean account of dialectics. Furthermore Hegel's Science of Logic has some deep connections to modern mathematics, mathematical logic and the foundations of mathematics (in particular category theory). Hegel allows one to reconcile Frege and Brouwer within a larger and more thorough framework (which is to be a typed, intensional, computationally and algorithmically oriented logic and foundation of mathematics - one which rejects completed infinite cardinalities in the extensional sense).

Modern mathematics (as well as modern physics) very much needs a clarification, improvement and radical reformation of its foundations. Voevodsky opened up a promising approach. Category theory is not to be seen as a universal theory but rather as a specialized and partial one suited to the particular turn mathematics took in the 20th century. It is to be replaced by a structure related to dependent type theory or by a more universal theory of higher-order relations.

Plotinus and Proclus offer an integral solution to the problems of the theory of knowledge (which in antiquity are associated with the Academics, Pyrrhonism and the debates with the middle Platonists and Stoics - but are also found in Augustine).

Neoplatonism also offered a consistent and insightful theory of spiritual yoga within a coherent philosophical and scientific context. And indeed the theory of dialectics gives the genuine clarification and possible higher meaning of madhyamaka and Pyrrhonism. Also, the apparent discrepancy between Proclus and Plotinus can be explained by a better understanding of procession and emanation as a kind of instantaneous continuous current between levels: thus there is no difference between the attainment of nous or henosis by the soul and the metaphor of a drop of water merging into the ocean without losing its individuality. Or rather, reversion and return is not to be seen as a lower-level reflection but as a direct 'plugging in' to a higher current connecting the levels in eternal continuous simultaneity.

Thursday, June 26, 2025

Mally : Gegenstandstheoretische Grundlagen der Logik und Logistik (1912)

This work by Meinong's disciple E. Mally (tr. Object-theoretic foundations of logic and symbolic logic) can be found (in the original German) here:

https://mally.stanford.edu/mally.html

See also this interesting paper by E. Zalta about the relationship to Husserl's Ideen:

https://philpapers.org/rec/ZALMDA

Wednesday, June 25, 2025

Logical Notes II

Saussure's Course in General Linguistics (1916) expresses some fundamental ideas of category theory: that an object's 'value' only makes sense in the context of a system of other objects to which it is both similar and dissimilar. His concept of the 'sign' is furthermore much like a functor from the category of phonetic material to that of 'concepts'. Also in Carnap's Aufbau we find an interesting graph-theoretically oriented discussion of the calculus of relations which is category-theoretic in flavor. In fact Carnap's discussion suggests a more general structure than that of a category (which can be seen as appropriate mainly to the 'regional ontology' of mathematics). This structure simply consists in a collection of objects $x,y,z,...$ which are themselves collections of certain elements and, for each pair (possibly identical) of objects $x,y$, a (possibly empty) class of relations, subject to the condition that if we have a relation $R$ on $x,y$ and a relation $S$ on $y,z$ then the composite relation $SR$ belongs to the class of relations associated to $x,z$. A special case is that in which we consider for each pair of objects all possible relations. An apparent difference from the concept of category involves the lack of 'arrows', or the polarity of the relations between two objects. It is curious how relations become important in tripos theory and the effective topos. But here when we have a relation $R(x,y)$ there is always an orientation which we can take as being defined 'from the first argument object to the second argument object' (like the arrows in Sowa's conceptual graphs).
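A minimal sketch of this Carnap-style structure (all names are ours, chosen only for concreteness): objects are finite sets, each ordered pair of objects carries a class of relations (subsets of the cartesian product), and the only requirement is closure under relational composition.

```python
def compose(R, S):
    """Composite SR: first R (out of the first object), then S (into the third)."""
    return {(x, z) for (x, y1) in R for (y2, z) in S if y1 == y2}

class RelationalSystem:
    def __init__(self, objects):
        self.objects = objects      # dict: name -> set of elements
        self.rels = {}              # dict: (name, name) -> list of relations

    def add(self, a, b, R):
        self.rels.setdefault((a, b), []).append(R)

    def closed_under_composition(self):
        # check that every composite SR is among the relations assigned to (a, c)
        for (a, b), Rs in self.rels.items():
            for (b2, c), Ss in self.rels.items():
                if b != b2:
                    continue
                for R in Rs:
                    for S in Ss:
                        if compose(R, S) not in self.rels.get((a, c), []):
                            return False
        return True

rs = RelationalSystem({'x': {1, 2}, 'y': {'a'}, 'z': {True}})
rs.add('x', 'y', {(1, 'a')})
rs.add('y', 'z', {('a', True)})
rs.add('x', 'z', {(1, True)})            # the required composite
print(rs.closed_under_composition())     # True
```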

Thus the correct prototype of the concept of category is a collection of objects each pair of which may be subject to a plurality of different relations. This prototype concept is superior because the relations are implicitly classified - and this is what is often done in practice in category theory. Morphisms (as usually considered) are rather redundant as relations, and only special classes of morphisms become in fact relevant (and relations worthy of that name). Another problem is that 'naive' category theory assumes we have a notion of equality between 'morphisms' (for instance in universal constructions). But when are two relations 'equal' ? Clearly mere extensional equivalence is not sufficient, or is at least problematic.

But is not a category a collection of objects that are like different species of the same genus ? And according to our discussion above, species which can have a multiplicity of different relations between them ? But this is precisely the theory in Aristotle's Topics. The immediate species of a given genus are subject to many possible relations (perhaps involving a third factor): opposition, more-or-less, more desirable,  better known, etc.

What are some of the most significant results in philosophy ? The clarification of the pure concept of computability (both in terms of machines and of term rewriting: Turing, Church) and of the pure general concept of an axiomatic-deductive system, and in particular the clarification of axiomatic-deductive systems which to some extent mirror actual human reasoning processes (natural deduction, dependent type theory). The complex interaction between psychological experience and objectivism (if not already thoroughly explored by Hegel) made definite progress with Brouwer's intuitionism and subsequent advances in constructivism (or intuitionism) - specially dependent type theory (Martin-Löf type theory). A major philosophical error: Zermelo-Fraenkel foundations and the standard proliferation of the concepts of 'topological space' and of the abstract theory of rings and fields. Superior to the concept of topological space is J.R. Isbell's concept of 'locale' (1972), which recaptures aspects of Aristotle's, Leibniz's and Kant's (and Hegel's) concept of space - or better still, the notion of Heyting algebra.
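As a small concrete reminder of the term-rewriting side of this clarification (our own toy example, offered only as an illustration): the S, K, I combinators under leftmost-outermost reduction already form a Turing-complete rewriting system.

```python
def app(f, x):
    return ('app', f, x)

def step(t):
    """One leftmost-outermost rewrite step; returns (new_term, changed)."""
    if isinstance(t, tuple):
        if t[1] == 'I':                                    # I x -> x
            return t[2], True
        if isinstance(t[1], tuple) and t[1][1] == 'K':     # K x y -> x
            return t[1][2], True
        if (isinstance(t[1], tuple) and isinstance(t[1][1], tuple)
                and t[1][1][1] == 'S'):                    # S x y z -> x z (y z)
            x, y, z = t[1][1][2], t[1][2], t[2]
            return app(app(x, z), app(y, z)), True
        f, changed = step(t[1])
        if changed:
            return ('app', f, t[2]), True
        a, changed = step(t[2])
        return ('app', t[1], a), changed
    return t, False

def normalize(t, limit=100):
    for _ in range(limit):
        t, changed = step(t)
        if not changed:
            break
    return t

# S K K behaves as the identity combinator: (S K K) a reduces to a
print(normalize(app(app(app('S', 'K'), 'K'), 'a')))   # prints: a
```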

Originally there was a profound unity between algebraic geometry, analytic geometry, mathematical analysis and differential equations - between algebra, geometry and the method of infinitesimals and indivisible points. This 'good' mathematics was neglected or became marginal during the disaster of 20th century mathematics. It is represented by the lesser known disciplines of 'real algebraic geometry' (which we can also say is the 'real' algebraic geometry, having roots in the work of the Italian algebraic geometers), the study of analytic, semi-analytic and sub-analytic sets, singularity theory and Thom's theory of stratified morphisms: some of this mathematics seems to have been adumbrated by Hegel's long notes to the section on Quantum in the Science of Logic. The bad mathematics consisted in a wrongly abstracted algebraic geometry based on Noetherian rings and fields (A. Weil, Grothendieck) and a wrong abstraction of analysis based on general topology and 'infinite-dimensional' vector spaces (all ultimately based on the Bourbaki framework and Zermelo-Fraenkel set theory). The theory of finitely determined germs and universal unfoldings is 'true algebraic geometry', which is also the approach to the calculus on the basis of powers found in Hegel (also we should prefer étale spaces to the abstract definition of sheaf, and covering spaces to locally constant sheaves). The opposition between Thom and Grothendieck (we mean here scheme theory, not his later work on topoi, homotopy theory, dessins d'enfants, etc.) might be seen broadly as the opposition between authentic and false mathematics. A nice project would be to continue Lawvere's work on synthetic differential geometry and Grassmann, and also to go back to the great geometers of the past (such as Bonaventura Cavalieri) and give a rigorous foundation to their infinitesimal and 'indivisible' techniques.

It is interesting to read Husserl's Philosophy of Arithmetic in light of Hegel's treatment of Quantity in his Science of Logic. Indeed Hegel's treatment of number, magnitude, infinite progression, ratio, measure, etc. can be given interesting interpretations in terms of category theory (specially the treatment of the natural number object and computability in a topos) and modern singularity and bifurcation theory.
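For reference, the universal property of the natural number object mentioned here (Lawvere's standard definition, recalled only to fix ideas): in a topos, an object $N$ with maps $0 : 1 \to N$ and $s : N \to N$ is a natural number object when for any $a : 1 \to A$ and $f : A \to A$ there is a unique $h : N \to A$ with

```latex
h \circ 0 = a , \qquad h \circ s = f \circ h ,
```

which is the scheme of iteration (simple recursion) stated arrow-theoretically.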
The theory of knowledge involves the analysis of the essence of reason. But it has assumed that human consciousness cannot completely abrogate and go beyond reason (and thus ultimately see reason) - not in the direction of something 'inferior' to reason (presumably the realm of dangerous lower instincts, or of a kind of 'irrationalism' which is a real problem of our times), but of something superior to reason, the super-rational. The same goes for the 'self' and 'volition'. Indeed Hume's theory of the self does not deny that there is a systematic 'energy' at work causing the impression of 'self' (even if allegedly this 'idea' is unfounded). Modern western theory of knowledge has the fatal error of ignoring the fundamental principles of ancient philosophy (both western and eastern) relating to the necessity of engaging in systematic practices of purification of consciousness in order to be able to gain access to knowledge (and in particular to self-clarity regarding consciousness itself). But note that these practices themselves already require (partial) philosophical insight. See MacIsaac's thesis on 'The Soul and Discursive Reasoning in the Philosophy of Proclus' for some interesting and overlooked ideas regarding the theory of knowledge. Saussure's theory of syntagmas and association is an example of truly insightful phenomenology (or cognitive depth-psychology). Interesting also is Peirce's phenomenology.

If, to be at home in the world of objectivity, much practice and preliminary methodology is required - a cyclic return and refinement - why should not the same hold for what Peirce called 'phaneroscopy', direct internal spiritual cognition ? And the 'subjective' and 'objective' being subtly connected, when ease with the objective world is attained some subjective intuitive clarity is automatically attained as well. And indeed introspection involves an object: consciousness itself, or aspects thereof, becomes an object.
We do not understand what consciousness is, nor how consciousness can directly perceive itself, nor what in that case would be perceived, how it would perceive, and how this perception relates to objectivity. What is clear is that we never perceive atomic sensations. The concepts of 'subject' and 'object' are liable to criticism. Maybe consciousness can at first only look at itself indirectly, when engaged in some cognitive activity.

Speak not of subject or object but of ceasing to look without and starting to look within (Plotinus). And look within in a way in which you are looking at what is really there, without distortions and projections. But what is this 'looking without' and 'looking within', and first of all 'looking' ? (Husserl himself quotes Augustine's noli foras ire, in te ipsum redi, in interiore homine habitat veritas - do not go outside, return into yourself; in the inner man dwells truth.)

If we need logic to grasp and operate an axiomatic-deductive system, what sense is there in trying to capture logic itself as an axiomatic-deductive system ? (It is a kind of reflection-into-self.) Cf. also our previous criticism of Sextus. The net of logic is very vast, yet there is no easy escape from it.

The situation of ordinary consciousness: it cannot just immediately gaze inward upon itself, see itself through inner perception and perceive the truth (it will just lose itself in a pool of shadows and mirages and fleeting dreams). It can only manifest itself to itself gradually, in and through its activity, and specially through clear logical cognition. However there is also the possibility of making a constant progressive effort at 'conversion' or at 'turning inward' its aim and target (its intention).

Stoic logic was not a kind of 'propositional logic'; rather it was closer to a quantifier-free many-sorted first-order logic.
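One possible way of rendering the first Stoic indemonstrable in this spirit (our gloss, offered only to illustrate the contrast with schematic propositional letters):

```latex
\mathrm{Day}(\mathit{now}) \rightarrow \mathrm{Light}(\mathit{now}),\quad
\mathrm{Day}(\mathit{now}) \;\vdash\; \mathrm{Light}(\mathit{now})
```

where $\mathit{now}$ is an individual term of a temporal sort, so that the premisses concern a definite case rather than bare propositional atoms.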

Just as different physical phenomena can be described and studied by the same mathematical structure, so too different mathematical theories can contain the same unity and energy of reversion arising from the same source.

A reconstruction of Boethius' logic in Topicis Differentiis

https://www.academia.edu/144302123/A_reconstruction_of_Boethius_logic_in_De_Topicis_Differentiis