Tuesday, October 31, 2023

First steps in Higher Topos Theory 2

To understand Higher Topos Theory it is important to master (among other things) the basics of the following subjects in category theory:

1. Monoidal categories

2. Enriched categories (including ends and coends and enriched Yoneda lemma)

3. Model categories (including combinatorial model categories and homotopy (co)limits)  

For 1 and 2 a good reference is Part I of Birgit Richter's From Categories to Homotopy Theory.

For 3 a good reference is the book chapter Homotopy theories and model categories by W. G. Dwyer and J. Spaliński.

But there is more. One must feel very much at home with simplicial sets (and their connection to homotopy types), which play a central role in Higher Topos Theory. Finally, one must master the basics of Grothendieck toposes.

Instead of ordinary (Set-enriched) model categories we work with model categories enriched over the category of simplicial sets sSet (itself a category enriched over simplicial sets; when endowed with its classical Quillen model structure its fibrant-cofibrant objects, the Kan complexes, model $\infty$-groupoids). Instead of presheaves of sets we work with sSet-enriched functors from the opposite of an sSet-enriched category to sSet. A key result is Dugger's theorem, the simplicial-presheaf analogue for combinatorial model categories of the characterization of sheaf toposes as left exact localizations of presheaf categories (reflective full subcategories whose localization functor is left exact) - itself a generalization of the presentation of an object by generators and relations.
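For the record (stated here from memory, so the reader should check Dugger's paper for the precise hypotheses), the presentation theorem says that every combinatorial model category $\mathcal{M}$ is Quillen equivalent to a left Bousfield localization of a projective model structure on simplicial presheaves,

\[ \mathcal{M} \;\simeq_{Quillen}\; L_S\, [C^{op}, sSet]_{proj} \]

for some small category $C$ and some set of maps $S$; the parallel with 'sheaf topos = left exact localization of a presheaf topos' is then plain.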

So Higher Topos Theory is Topos Theory done (homotopically) over sSet rather than Set.  

It seems we can give a more geometric interpretation of the nerve of a category $C$ (a canonical way of extracting a simplicial set from $C$), usually given in terms of composable sequences of arrows. Take three composable arrows $f,g,h$. Think of $f$ and $g$ as lying in the plane but $h$ as directed perpendicularly into space. Then we get a 3-simplex in $N(C)_3$ whose four faces are $(f,g, g\circ f)$, $(g,h, h\circ g)$, $(f, h\circ g, (h\circ g)\circ f)$ and $(g\circ f, h, h\circ (g \circ f))$, where we view a pair of composable arrows together with their composition as a triangle, i.e. a 2-simplex. Our intuition is that the nerve of a category keeps track of all commutative diagrams and each such diagram is a geometric object.
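Explicitly, writing the 3-simplex as a string $w \xrightarrow{f} x \xrightarrow{g} y \xrightarrow{h} z$, the four face maps of $N(C)$ each delete one vertex, composing the two arrows that met at an inner vertex:

\[ d_0(f,g,h) = (g,h), \quad d_1(f,g,h) = (g\circ f,\, h), \quad d_2(f,g,h) = (f,\, h\circ g), \quad d_3(f,g,h) = (f,g), \]

and these are exactly the four triangles of the tetrahedron described above.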

Of course there are competing definitions of (models of) $\infty$-groupoids besides the sSet-based one (Kan complexes), which is itself only one possible choice of shape (other choices include cubical and cellular sets).

All this suggests philosophically that our different concepts and models of what a 'space' is are special embodiments of a single 'pure' concept which is yet to be determined. Note that homotopy type theory connects $\infty$-groupoids to  (identity) types/ propositions or spaces of proofs/functions/computations.
 

Sunday, October 29, 2023

Conceptual engineering and inferential delegation

In society there is a class of people called 'experts' who are endowed with both cognitive and social preeminence. The most common examples are doctors and lawyers, who are contrasted with non-doctors and non-lawyers, whom we refer to as 'laymen'. Experts generally possess more 'refined' versions of the layman's concepts. Furthermore a public statement of a layman in matters of health or law is delegated to the experts in a kind of inferential epokhe. The layman relieves himself of any obligation to provide reasons or draw inferences. He also (in general) acknowledges the imperfection and revisability of his own concepts.

We ask: what is the nature and social origin of this inferential delegation and this conscious conceptual engineering ?  What binds the lay and expert concepts together ? What is it that makes them aspects of the 'same' concept ?

This can be a difficult question for concepts which carry strong historical and cultural baggage. For example, 'marriage'. Clearly the legal definition of 'marriage' can be changed more easily than the socio-cultural concept. But there is a seemingly circular relationship between the two: the legal concept must have its 'origin' in the popular concept (i.e. in a majority consensus) just as the popular concept 'must' accept the legal concept as binding. But what if under certain circumstances the legal concept became widely divergent from the popular one ? Would they still express the same concept ? For instance, if people could legally marry frogs, would the legal definition of marriage still correspond to the ordinary concept of marriage ? One could of course answer that marriage has always been an essentially legal concept, so the question boils down to whether the union of A and B should or should not enjoy certain special economic and legal benefits and obligations, a question that is obviously historically and socially contingent as well as subject to moral constraints.

One approach to studying delegation is through the nature of disputes and the necessity of instituting an 'arbiter'. This is especially important when disputes take a metalinguistic turn or at least have a metalinguistic component. Most disputes of the form 'Is X Y ?' involve not only inference according to agreed-upon criteria for Y but disputes about these very criteria. This can easily be checked on social media by surveying multiple answers to questions such as 'is X overrated ?' or whether something should or should not be included in 'top ten lists'. The necessity of ending endless disputes may also be one motivation for devising various statistical metrics as a means of comparison for a given concept. But this turns out to be problematic and creates new problems and disputes.

Another position is that the very asking of 'Is X Y?' already implies that there is some agreed-upon criterion Z for determining Y. But there could be a multiplicity of accepted criteria Z, Z', Z'' for Y, and X may fail one and satisfy another. So the dispute will take the form of weighing and comparing criteria for Y, which may involve invoking historical precedents. A counter-argument is given by the question 'Will it rain today?'. I do not think that this implies that there is any agreed-upon criterion for weather-forecasting. But, one could rejoin, if there are no criteria, what is the point of the question ?

Disputes over the correctness of a proof are quite rare in mathematics. If they do happen it is due to the sheer complexity and size of the proof or, at the other extreme, to the elliptical, ambiguous and incomplete nature of the proof. This provides good motivation for the development of proof assistants and proof checkers and the 'formal mathematics' project.

Saturday, October 28, 2023

Philosophical miscellany

1. Arguments for logical pluralism are at times not very impressive. It is the idea that logic is an arbitrary collection of rules which for some reason were static for millennia until the revolutionary discovery that these are arbitrary and conventional and that now everybody can have their own preferred logic...and at the same time it is claimed (rather incongruously) that one's choice of logic is not metaphysically neutral. The pet example is the 'law of the excluded middle'. But the fact of the matter is that this particular example completely fails as an argument for logical pluralism/conventionalism. The relationship between classical and intuitionistic logic exhibits abundantly many characteristics of surprising pre-established harmony, mutual interpretability, conceptual refinement (or subsumption if you prefer) and the sharing of common structural-conceptual spaces (i.e. topos semantics), which would not be in the least expected if classical logic were just an arbitrary collection of rules in which one rule was arbitrarily changed. And the same goes for minimal logic, which jettisons the negation rule(s) completely. To put it in other terms: classical and intuitionistic logic are brother and sister, not complete strangers or a pair consisting of a human and an alien. Intuitionistic logic is of immense mathematical interest and is used in a wide range of applications in computer science. What other tinkerings with or changes of classical logic - or any deviant logic for that matter - can even remotely compare to this situation ?

2. What are meaning-as-use theories but attempts to ground logic and language in sociology ? That logic and language factor importantly in any cogent sociological theory has always been patent. That the study of the social and functional aspects of logic and language is legitimate and important nobody would doubt. However all this is a far cry from setting up a sociological and behavioristic reductionism as a standpoint from which to level arguments against logical realism or other theories of logic. This stance is circular from the start because in order for us to be clear about sociological matters we necessarily need to deploy a wide range of sophisticated formal concepts and axioms (for instance those pertaining to general systems theory) which can only be couched in formal logic and mathematics. And if we stick to ordinary vague natural-language terms to attempt to describe complex sociological behaviors, changes, interactions and emergent structure, then what do we have but a caricature of medieval or Aristotelian science ? It is no use saying that word A is explained by language-game B if we lack the formal conceptual and axiomatic apparatus to analyze and classify the language-game. If we lack a rigorous way of formulating descriptions of language-games then how can we ever hope to be able to test our theories or confront them with the empirical facts of human behavior ?

3. Suppose we put forward the theory that the 'meaning' of a proposition P in a system T is the set M of proofs of P in T. So is the T-'meaning' the whole set M itself, or is each proof in M a possible T-meaning of P among others ? And given a proof p, what criteria do we have that it is in fact a T-proof of P, a member of M ? What is the meaning of the statement 'p is a proof of P in system T' ? This is generally accepted in virtue of some kind of intuitive-conceptual process rather than by producing an elaborate meta-logical proof in some meta-system Z. So if the proof-as-meaning view fails at the meta-level, why accept it at all at the basic level ? The meaning-as-proof theory does not seem prima facie more desirable or economical than rival views.

4. What is mathematical logic ? It is the mathematical treatment of formal systems, but mostly of those formal systems of interest to the foundations of mathematics itself. Mathematics was done with a high degree of logical accuracy before formalism. The full force of logic, encompassing multiple generality, higher-orderness and even set theory, was deployed in mathematical proof long before the advent of modern formalism, for example in Gauss' Disquisitiones Arithmeticae. So is mathematical logic the mathematical study of formal systems serving as a foundation of mathematics, conducted in a pre-formal or semi-formal mathematical way ? That is, a reflection-into-self of mathematics ? The mathematical logician assumes various structural induction principles and sneaks arithmetic, combinatorics and even ordinal arithmetic under the table. The use of the term 'finitary' is questionable. Key theorems in first-order logic such as $\forall x. (A(x) ~\&~ B(x)) \rightarrow (\forall x. A(x)) ~\&~ (\forall x. B(x))$ are actually theorem-schemes which could only be stated in second-order logic, and this argument might be repeated. Or can mathematical logic itself be conducted within a formal metasystem while discarding knowledge-claims to any significant properties of this metasystem ? A mathematician when thinking of a proof often skips many logical steps relative to any formal system which could formalize such a proof. In fact whatever formal system we choose we will find the mathematician skipping steps. And yet the mathematician does not invent new rules as he goes along. Does he process the skipped rules very quickly at an 'unconscious level' ? There is no evidence for this. Is he referring by memory and analogy to cases where a similar step was in fact gone through in detail ? Metamathematical theorems about intuitionism or nonstandard models depend for their conception and proof on classical arithmetic, combinatorics and computation.

5. One dogma reads as follows: for every natural language $L$ there exists a set $\Sigma$ of symbols and a subset $M \subset \Sigma^*$ together with a pair $(T, S)$, where $T$ takes expressions of $L$ into elements of $M$, such that any meaning that can be expressed by an expression $E$ in $L$ can be given as the assignment under $S$ of a unique expression $E' \in M$. In other words: natural language can be disambiguated. Furthermore whenever natural language is used such a disambiguation is actually somehow effected internally at the level of expressions rather than meaning, even if it is not registered phonetically or graphically. We propose that each speaker may have their own particular disambiguated language. While meaning is objective and extra-linguistic, whenever a sequence of signs is presented as part of a language we must relativize to a given speaker, perhaps through a proper name or some sort of description or indexical device. We have degrees of variability of languages from individual to individual and we can hope for close or rough correspondences between members of the same socio-linguistic groups. The fact that we learn other languages not by syntactic transformation but by direct attribution of meaning (or transference of meaning) to a new system of signs is of great philosophical importance. We propose the question: how can two different speakers determine whether they are using a given collection of signs in the same way in terms of meaning-attribution ? The question of identical reference (for instance for proper names) is even more difficult. What exactly is it that Peter means when he states that Pierre means the same thing by 'poisson' as I mean by 'fish' ? In general we could expect definite descriptions to still be liable to being conditioned by the different culturally determined semantic webs of the two speakers. Is modern formal-axiomatic mathematics a rare example of a 'universal language' ? How could the logical pluralist account for the clarity and universality of mathematical language if one's mathematical concepts and understanding rested on one's particular logic ?

6. 

Gustavo Augusto Fonseca Silva has written a very interesting monograph on Wittgenstein in the tradition of the following previous works.
 
Ernest Gellner, Words and Things: A Critical Account of Linguistic Philosophy and A Study in Ideology, with an introduction by Bertrand Russell, Beacon Press, 1960.
C. W. K. Mundle, A Critique of Linguistic Philosophy: with Second Thoughts - An Epilogue after Ten Years, foreword by P. L. Heath, 2nd edition, Glover and Blair, 1979.
Aaron Preston, Analytic Philosophy: The History of an Illusion, Continuum International, 2007.
Uwe Meixner, Defending Husserl: A Plea in the Case of Wittgenstein and Company Versus Phenomenology, De Gruyter, 2014.
Mark Steiner, Mathematical Knowledge, Cornell University Press, 1975.
J.N. Findlay, Wittgenstein, A Critique, Routledge and K. Paul, 1984.
Alain Badiou, Wittgenstein’s Antiphilosophy.
Laurence Goldstein, How Original a Work is the Tractatus Logico-Philosophicus ?
J.W. Cook, Wittgenstein’s Metaphysics, Cambridge, 1994.
J.W. Cook, Wittgenstein, Empiricism, and Language, Oxford, 1999.

Thursday, October 26, 2023

From Pavel Tichý's Foundations of Frege's Logic (1988)

Fate has not been kind to Gottlob Frege and his work. His logical achievement, which dwarfed anything done by logicians over the preceding two thousand years, remained all but ignored by his contemporaries. He liberated logic from the straight-jacket of psychologism only to see others claim credit for it. He expounded his theory in a monumental two-volume work, only to find an insidious error in the very foundations of the system. He successfully challenged the rise of Hilbert-style formalism in logic only to see everybody follow in the footsteps of those who had lost the argument. Ideas can live with lack of recognition. Even ignored and rejected, they are still there ready to engage the minds of those who find their own way to them. They are in danger of obliteration, however, if they are enlisted to serve conceptions and purposes incompatible with them. This is what has been happening to Frege's theoretical bequest in recent decades. Frege has become, belatedly, something of a philosophical hero. But those who have elevated him to this status are the intellectual heirs of Frege's Hilbertian adversaries, hostile to all the main principles underlying Frege's philosophy. They are hostile to Frege's platonism, the view that over and above material objects, there are also functions, concepts, truth-values, and thoughts. They are hostile to Frege's realism, the idea that thoughts are independent of their expression in any language and that each of them is true or false in its own right. They are hostile to the view that logic, just like arithmetic and geometry, treats of a specific range of extra-linguistic entities given prior to any axiomatization, and that of two alternative logics—as of two alternative geometries—only one can be correct. And they are no less hostile to Frege's view that the purpose of inference is to enhance our knowledge and that it therefore makes little sense to infer conclusions from premises which are not known to be true. We thus see Frege lionized by exponents of a directly opposing theoretical outlook. 

The following is a recent, very interesting and sophisticated development of Tichý's logic (Transparent Intensional Logic):

Raclavský, Jiří (2020). Belief Attitudes, Fine-Grained Hyperintensionality and Type-Theoretic Logic. Studies in Logic 88, London: College Publications.

R.D. Bradley, A Refutation of Quine's Holism (2001)

 http://www.sfu.ca/content/dam/sfu/philosophy/docs/bradley/refutation_of_quine.pdf

For more than four decades, many Anglo-American philosophers have been held in thrall by a captivating metaphor, Quine's holistic image of the man-made fabric (or web) of knowledge and belief within which no statement is absolutely immune to revision. And many have been led to think that the following three distinctions are indefensible:

       (i) that between sentences and the propositions that they express;
       (ii) that between necessary and contingent propositions;
and
       (iii) that between a priori and empirical knowledge.

First I will argue that Quine's holistic metaphor is incoherent since, by its own lights, some statements turn out to be wholly immune to revision. Then I will argue for the rehabilitation of distinctions (ii), (i), and (iii), in that order.

Wednesday, October 25, 2023

H. P. Grice and P. F. Strawson - In Defense of a Dogma (1956)

In his article "Two Dogmas of Empiricism," Professor Quine advances a number of criticisms of the supposed distinction between analytic and synthetic statements, and of other associated notions. It is, he says, a distinction which he rejects. We wish to show that his criticisms of the distinction do not justify his rejection of it.

Note on Tabak's Plato's Parmenides Reconsidered

M. Tabak's book Plato's Parmenides Reconsidered (Palgrave Macmillan, 2015) is a breath of fresh air in the ocean of Platonic and specifically Parmenidean literature, which is characterized not only by the insoluble difficulty of the subject matter, but by a certain propensity to a heavy philosophical hermeneutic bias which is ultimately more informative about the philosophical ideas of the author than about Plato's precise intention and method in writing this puzzling dialogue. The original nature of Tabak's thesis - the ironic, parody-like, even light-hearted content of the second half of the dialogue, and the equation of Zeno's and Parmenides' argumentation with that of the sophists criticized in the earlier Platonic dialogues - as well as his according due importance to the briefer treatment of the same questions in the Sophist, certainly invites a more neutral and lucid approach.

One of our interests in the second half of the dialogue involves the following question: is it possible to formalize in detail the arguments therein in a system of modern logic (including mereology) ? In particular, is this possible for the first part of the second of the eight arguments ? 

Tabak makes some very interesting observations on this last matter. Basically he says that if A is a part of B then, since A is a part, A participates of unity, and since A is something, it participates of being. We can state this in the general case as

\[\phi x \rightarrow (Ux \& Bx) \tag{1}\]

It is also clear that Plato is assuming that if $\phi x$ then $x$ participates of something (the form corresponding to $\phi$) and that if $x$ participates of something then it has a (proper ?) part ($Pyx$). Let us just state this very weakly as

\[ \phi x \rightarrow \exists y. PPyx  \tag{2}\]

where $PP$ denotes proper parthood, which excludes the cases $PPxx$.

Now the hypothesis of the second argument is : if the one is.  Let $\odot$ denote the one and let us state this hypothesis as

\[ B\odot \tag{H2}\]

It seems we could use 1, 2 and a convenient supplementation principle to show that

\[ \exists u\, v.\; u\neq v \;\&\; PPu\odot \;\&\; PPv\odot \;\&\; \neg Au \;\&\; \neg Av \tag{T1} \]

and in particular that $\neg A\odot$. But the problem here is that in 2 we have no guarantee that the parts of $x$ corresponding to two different predicates (differing either logically, syntactically or intensionally) will be in turn different. Since $U$ and $B$ are both different and apparently co-extensional, this might be difficult to formulate. We note that passages in the Parmenides as well as others in Aristotle's Topics already anticipate some of Frege's later distinctions. One solution is to introduce a form-forming operation $[\phi]$ and an axiom scheme guaranteeing fine-grained distinctions

\[ [\phi]\neq [\psi] \tag{Fg} \]

where $\phi$ and $\psi$ range over syntactically distinct formulas with one free variable. We replace 2 by

\[ \phi x \rightarrow  PP[\phi]x  \tag{2'}\]

Using H2, 1 and 2' we can now derive T1 directly, as well as stronger results closer to Plato's conclusion. The problem with this solution is that Fg already postulates an infinite number of distinct entities, so the Platonic conclusion of $\odot$ having an infinite branching tree of proper parts is no longer too surprising. An even more serious problem is that the same form will be a part of completely distinct entities, so that there will be universal overlap, $\forall xy. Oxy$. One solution would be to introduce a function symbol $pxy$ which yields the participated mode of form $y$ for $x$. In other words, we replace 2' by

\[ \phi x \rightarrow  PP(px[\phi])x  \tag{2''}\]

and replace Fg by

 \[ px[\phi]\neq py[\psi] \tag{Fg'} \]

 where $\phi$ and $\psi$ range over syntactically distinct formulas with one free variable.
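As a sketch of what a machine-checked version of this might look like, here is a minimal Lean stub (the names Entity, U, B, PP, form, p, one are our own choices; we read $A$ as atomicity, i.e. having no proper parts; and Fg' is asserted only for the two instances actually used, since the general scheme ranges over syntactically distinct formulas):

```lean
-- A minimal Lean 4 stub of axioms (1), (2''), (H2), (Fg') and the goal (T1).
-- All names are our own; A is read as atomicity (having no proper parts).
axiom Entity : Type
axiom U    : Entity → Prop               -- participates of unity
axiom B    : Entity → Prop               -- participates of being
axiom PP   : Entity → Entity → Prop      -- proper parthood
axiom form : (Entity → Prop) → Entity    -- the form [φ]
axiom p    : Entity → Entity → Entity    -- participated mode of form y for x
axiom one  : Entity                      -- the One, written ⊙ in the text

def A (x : Entity) : Prop := ¬ ∃ y, PP y x   -- atom: has no proper parts

-- (1): whatever satisfies φ participates of unity and of being
axiom ax1  : ∀ (φ : Entity → Prop) (x : Entity), φ x → U x ∧ B x
-- (2''): whatever satisfies φ has the participated mode of [φ] as a proper part
axiom ax2  : ∀ (φ : Entity → Prop) (x : Entity), φ x → PP (p x (form φ)) x
-- (H2): the One is
axiom axH2 : B one
-- (Fg'), instance: participated modes of the distinct forms [U] and [B] differ
axiom axFg : ∀ x y : Entity, p x (form U) ≠ p y (form B)

-- (T1): the One has two distinct, non-atomic proper parts
theorem T1 : ∃ u v, u ≠ v ∧ PP u one ∧ PP v one ∧ ¬ A u ∧ ¬ A v := by
  sorry
```

The derivation indicated above (take $u = p\,\odot\,[U]$ and $v = p\,\odot\,[B]$, then apply 1 and 2'' again to show each has a further proper part) can be replayed against these axioms; we leave the formal proof as sorry.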

Wednesday, October 18, 2023

Note on the philosophy of mind

I defend the following theses:

1. Only I can take direct cognizance of the states and processes of my own mind (self-luminosity) save in exceptional cases. Thus I defend qualified privileged access without ruling out the possibility of direct cognizance of the existence of other minds.

2. Mind-conduct words essentially involve episodes of private experience and modifications in the stream of consciousness.

3. The mind is a field of causes and effects.

First of all, these theses are not open to attack via an argument from linguistic usage as examples supporting these theses can be easily adduced.  How can this be squared with the belief that everyday language is in 'perfect logical order' ?

We note that thesis 1 does not necessarily commit one to Cartesian dualism. Hence arguments against Cartesian dualism cannot be used against it.  Indeed there are even physicalist interpretations of thesis 1.

We must be careful with the term 'behaviorism' (associated with J. B. Watson) and define in what sense we are using it; for instance, there is mere methodological behaviorism, which rules out introspection as a scientific method. But Watsonian behaviorism rapidly becomes inconsistent once we inquire a little more carefully into what constitutes 'publicly observable behavior' in terms of states of the physical body (should subtle internal physiological changes count ?), or once it dubiously interprets linguistic accounts of internal experience as mere external 'linguistic behavior'.

We define metaphysical behaviorism as a position that states that there are no conscious states or processes or private objects and  that these last are fictions.  This position is incompatible with thesis 1.

We define analytical behaviorism as a position that states that statements about mind and consciousness can be analyzed in terms of the behavior of material things. In other words, statements involving first-person experience can be analyzed into third-person behavioral observation. This is likewise incompatible with thesis 1.

A metaphysical behaviorist could not consistently use expressions such as 'when an intelligent agent is active he is au fait both with what he has completed and what remains to be done', or 'we eavesdrop on our own unvoiced utterances', or speak of a 'tune in my head' or 'image in the mind's eye', or use 'paying heed to' or 'object of retrospection'. And what about the sentence 'he became conscious when he woke up' ?

The structure of natural language harmonizes well with thesis 1, as an analysis of our usage of 'to take notice', 'take heed of', 'observe', 'introspect' and of their adequacy for different forms of sensations shows.

A person's actions can be caused by private occurrences.  Here cause does not need to be interpreted as sufficient reason so there is no problem with resolutions or deliberations. And to ask about someone's goal or motivation for performing an act can indeed be entirely distinct from asking about behavior patterns for similar sets of circumstances.

We note the interesting fact that the verb 'imagine' can be used with a noun ('imagine an A') or with a that-clause ('imagine that p'). But this last can also be expressed in a grammatically concealed way in 'fancy p', which reads 'isn't it surprising that p'. This should lead us to question using solely grammatical criteria for discerning deeper-level categories. And many predicates that apply to externally perceived images apply equally to images perceived internally in the mind's eye, suggesting the two pertain to the same 'category'.

Intelligent actions can be both mediated and immediate, as even the most rudimentary analysis of mathematical practice shows. Thus appealing to an infinite regress argument - clearly patterned after Bradley's famous regress argument - to refute Cartesian dualism fails. The same goes for criticism of self-luminosity, for the very definition of self-luminosity involves knowing the act of knowing without this knowing becoming itself an object of knowledge.

The project we propose involves exploring the anti-solipsist thesis: self-luminosity implies heteroluminosity.  This can be traced back to Kant's arguments in his 'refutation of idealism'.  The knowledge of our process of knowing, and thus the knowledge of our own mind,  is bound up with the knowledge of the existence of other minds (and not only 'objects' as for Kant).

Sunday, October 15, 2023

Brief note on emergence and aufhebung in physics

In Newtonian mechanics we can start with simple postulates about invariance under Galilean transformations, and the conservation of angular momentum 'emerges' as a consequence of the invariance of inertial frames under rotation. Starting from the invariance of the speed of light in special relativity we get that energy and momentum are united in an elegant way as a single 4-vector (emergence) and that special relativity subsumes classical mechanics as a limit case when $c \rightarrow + \infty$ (aufhebung).
There is also a further process of 'abrogation' in which concepts first considered static and absolute are then revealed to be relative. Being there (Dasein) is only being there through negation of the 'other'. This is exactly what relativity does by making time, length and mass no longer in-themselves but for-the-other (the observer). In non-relativistic quantum mechanics, starting from basic postulates about the wave function, observables and the evolution of the system, we obtain, by passing to the classical limit, an interpretation of the operators and the wave function in terms of classical Hamiltonian mechanics (including the wave function expressed in terms of the action S). This is another case of 'aufhebung'. The uncertainty principle (a surprising result) is a case of emergence. The uncertainty principle interacts with the classical concept of angular momentum to give rise to an entirely novel concept / category: that of spin. Quantum mechanics (and especially quantum field theory) often abrogates previous concepts. Even in Bohmian mechanics I think that the concept of an individual particle is no longer tenable once we consider relativistic versions of quantum theory.
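To make the classical-limit half of this concrete (a standard textbook expansion, included only as illustration): the time component of the energy-momentum 4-vector $p^\mu = (E/c, \mathbf{p})$ expands for $v \ll c$ as

\[ E \;=\; \gamma m c^2 \;=\; \frac{mc^2}{\sqrt{1 - v^2/c^2}} \;=\; mc^2 + \tfrac{1}{2} m v^2 + \tfrac{3}{8}\,\frac{m v^4}{c^2} + \cdots \]

so the Newtonian kinetic energy reappears as the leading correction to the rest energy, which is the aufhebung of classical mechanics referred to above.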

Formalizing Aristotle's Topics

 After this preliminary study on the practice of definition in antiquity and in modern mathematics

Modern Definition and Ancient Definition

 we present a proposal for an axiomatic system to capture the topoi in Aristotle's Topics:

Towards a Formalization of Aristotle's Topics

Monday, October 9, 2023

Pyrrhonian strategy in Rorty's Mirror of Nature

Regarding Rorty let us quote from J.N. Mohanty's The possibility of transcendental philosophy (1985) p.59 :

Impressive as he is in his scholarship, he has given very few arguments of his own. He uses Sellars' arguments against the given and Quine's against meaning, as though they cannot be answered, but he has done little to show they cannot be. He plays one philosopher against the other, and would have one or both dismissed, according as it suits his predelineated moral. These are rhetorically effective but argumentatively poor techniques. What does it matter if Sellars rejects the concept of the given - one may equally rhetorically ask - if there are other good philosophers who accept the viability of that concept? There is also an implied historicist argument that has little cutting edge. If the Cartesian concept of the mental had a historical genesis (who in fact ever wanted to say that any philosophical concept or philosophy itself did not have one?) whatever and however that origin may be, that fact is taken to imply that there is something wrong about the concept.

On Susanne Bobzien’s groundbreaking discovery in Frege and Prantl

https://handlingideas.blog/2021/02/05/the-stoic-foundations-of-analytic-philosophy-on-susanne-bobziens-groundbreaking-discovery-in-frege-and-prantl/

Sunday, October 8, 2023

Aufhebung in type theory

Consider the Barendregt cube, or more specifically the system of logics studied in Jacobs' book 'Categorical Logic and Type Theory', which starts with simple type theory and culminates in higher-order dependent type theory, which subsumes all the rest. There is a process in which each level subsumes the previous one, in particular through reflection-into-self, the prototype being the subobject classifier wherein Prop becomes a Type. The culmination of the Barendregt cube is $\lambda C$. There is a parallel with Hegel's development of the Concept (Begriff). In the simply typed lambda calculus we have the classical division between subject and predicate, $t : T$, $t$ is of type $T$, $t$ is a $T$, which corresponds to the division between term and type. The development along the directions of the cube expresses the increased dependency of terms on types, types on terms (and types on types) until the distinction is all but subsumed and abolished in $\lambda C$. There is only one pure 'type' $\square$ which expresses the Concept. Jacobs' book allows us to trace this in terms of fibered categories.
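For reference (in the standard pure type systems presentation, which is not specific to Jacobs' fibrational treatment), $\lambda C$ is the PTS with sorts, axiom and rules

\[ \mathcal{S} = \{\ast, \square\}, \qquad \mathcal{A} = \{\ast : \square\}, \qquad \mathcal{R} = \{(\ast,\ast),\ (\ast,\square),\ (\square,\ast),\ (\square,\square)\}, \]

the four rules giving respectively terms depending on terms, types depending on terms, terms depending on types and types depending on types; the simply typed corner $\lambda{\rightarrow}$ keeps only $(\ast,\ast)$.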

We propose extending the Barendregt cube by incorporating substructural logics via monoidal categories and generalisations of linear logic. This involves replacing the cartesian product with the monoidal product and losing weakening and contraction. Note that closed symmetric monoidal categories are a primary example of the internalization of the hom-sets of a category within that category.

Consider how predicate logics are extended via propositions-as-types. The poset (or preorder) for Prop becomes a full-blown category of proof-terms. This is like the Understanding in the Hegelian dialectic that wishes to erase the past, the temporal, dynamic, constitutive and genetic process of a concept. The Understanding says what matters is what is proven, not the process of the proof (repressed history). Against this, dialectical reasoning forces the proof itself to be remembered, and speculative reasoning incorporates the proof-process into a concept, thus turning propositions into types and proofs into terms.

Friday, October 6, 2023

Why we need formal ontologies

Consider the sentence 'I wish I had been born earlier'. This implies that we are considering possibilities in which the same individual $A$ had a different life-history in a different possible world. By life-history we mean the total sequence of states, actions and events involving $A$ (all a part of a world $W$) during a certain time interval $L$ from the time of birth to the time of death.

For a world $W$ can the life-history of an individual $A$ be deduced or defined ? Or is the identity of an individual fixed a priori ? Why do two life-histories corresponding to two different worlds correspond to the same individual ? And even for the actual world how could we interpret preexistence or reincarnation or survival in some different form, all in the same time-line of the actual world ? What is it that guarantees the identity and continuity of such different embodiments ?

Let each world $W$ be endowed with a partition according to world-time $T$ (of which the $L$ of an individual is a subinterval). Thus we speak of the world $W_t$ at time $t$. The relationship between possible worlds will reflect a branching structure for $T$. Given a possible world $W$ and a time $t\in T$, we can consider a set of possible worlds $W^t_1,\ldots,W^t_n$ which coincide with $W$ up to $t$ and then begin to differ.

This same construction carries over to the life-histories of individuals. It gives reasonable sense to sentences of the type 'I wish I had not made that decision', or even to statements of an individual $I_1$ regarding an individual $I_2$: 'if $I_2$ hadn't died young he might have become famous'. This is expressed by the branching structure on possible life-histories in possible worlds which coincide up to a time $t$.
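As a toy illustration of this branching structure (our own sketch, making no claim about how individuation should actually be formalized), one can model a life-history as a finite map from world-times to states and compute the point up to which two histories coincide:

```python
# A toy sketch: a life-history is a finite map from world-time to state,
# and two histories "coincide up to t" when they agree on all shared times <= t.

from typing import Dict, Hashable

History = Dict[int, Hashable]   # world-time -> state

def branch_point(h1: History, h2: History) -> int | None:
    """Return the last shared time at which h1 and h2 still agree,
    or None if they already differ at the earliest shared time."""
    shared = sorted(set(h1) & set(h2))
    last = None
    for t in shared:
        if h1[t] != h2[t]:
            return last
        last = t
    return last

# Example: two possible life-histories that coincide up to time 2 and then diverge.
actual   = {0: "born", 1: "studies law", 2: "meets X", 3: "decision A"}
possible = {0: "born", 1: "studies law", 2: "meets X", 3: "decision B"}
print(branch_point(actual, possible))  # -> 2
```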

But there is a great asymmetry involved in 'If he had been born earlier he would have been able to meet Bertrand Russell'. Without a previous theory of individuation and identity we are forced to conclude that such an expression is meaningless. But a theory of individuation and identity means some kind of more or less sophisticated formal ontology and even a 'general systems theory'. We consider the idea of a 'rigid designator' and thought experiments such as the 'twin earth' problematic.

Mereology is also connected to such a formal ontology. Here are some older notes and sketches illustrating what a formal ontology or general systems theory might look like.

A general theory of systems 

 Some problems of relevance to mathematical general systems theory 

 A formal ontology for general systems theory

The problem of Eastern philosophy and its influence on the West

We are aware of the highly problematic nature of so-called 'Eastern philosophy' as presented, for example, by Eastern authors writing in Western languages like D.T. Suzuki. The anthropological, cultural and hermeneutic obstacles are many, especially the openness to charges of cultural appropriation, idealization, selectivity, denaturalization and decontextualization of Eastern culture and religion to fit a Western ideological framework. We will not attempt to address the question of whether philosophy is a specifically Western phenomenon - only that much of the 'philosophy' we read into Eastern texts is read in at the expense of ignoring (filtering) their cultural-religious function and context, as well as reading into them specifically modern Western concerns and aspirations. Should we treat these aspects exactly the same way as we do ancient Greek philosophical texts, as marginal and dispensable ? Who really cares if Socrates believed in Asclepius in the Phaedo ? We must be content to say: we can give a Western philosophical interpretation of certain elements found in ancient Pali or Sanskrit texts.

Can we find (or make up) interpretative correspondences of Stoicism and Pyrrhonism in the East ? And what about modern philosophical movements with strong roots in these ancient schools ? For example Stoicism in light of the work of S. Bobzien on its connection to Frege has a close relationship to  early analytic philosophy. 

An example of a logical positivist reading of the Pali suttas is given by K.N. Jayatilleke's magnum opus Early Buddhist Theory of Knowledge (1963).  Such readings can be extended using  the framework which J.W. Cook in his books about Wittgenstein calls neutral monism (which Cook traces to William James, Mach, Pearson and Russell). A more  Husserlian approach to neutral monism can be elaborated from Guillermo E. Rosado Haddock's book: The Young Carnap's Unknown Master (2008) . Sue Hamilton's Identity and Experience (1996) is a meticulously detailed attempt to extract a philosophy of mind (or psychology) from the Pali Canon.

J.N. Mohanty[moh4, p.8] writes:  what is distinctive about the Husserlian path (...)[is] an openness to phenomena, to the given qua given, to the intended meanings precisely as they are intended(...)

Pali Buddhism contains the injunction to develop a special type of neutral awareness and analytic attention to experience. We encounter many expressions which echo Husserl's epokhê, such as diṭṭhe diṭṭhamattaṃ bhavissati, sute sutamattaṃ bhavissati, mute mutamattaṃ bhavissati, viññāte viññātamattaṃ bhavissati (Samyutta Nikâya 35.95), 'in the seen there will be merely what is seen, in the heard merely what is heard, in the sensed there will be merely what is sensed, in the cognized there will be only what is cognized'. This same sutta also contains passages involving past, present and future modes of presentation of the contents of consciousness. The connection to Pyrrhonism is developed by C.I. Beckwith in Greek Buddha: Pyrrho's Encounter with Early Buddhism in Central Asia, Princeton University Press (2015). It is interesting to read Nânananda's Concept and Reality (1971) in conjunction with John V. Canfield's rather tentative article Wittgenstein and Zen. Chad Hansen's chapter 'Language in the Heart-Mind' in Understanding the Chinese Mind, Oxford 1989, is certainly interesting although it already seems to presuppose, rather than argue for, a Wittgensteinian perspective.

We wonder what role the dissemination of the interpretations (correct or not) of  traditional Chinese and Japanese philosophy of language (in particular the perhaps rather artificial universalist Zen philosophy of D.T. Suzuki, published in the early 20th century)  played in the thought of the later Wittgenstein. We ask if the perspective of Canfield's paper (arguing for the Oriental anticipation of the meaning-as-use or meaning-as-social-function theory) is not capable of being harmonized with Husserl's subjective transcendental idealism. The 'acts' of the transcendental ego are not (private) 'thoughts' or  'concepts' nor is there any distinction between inner and outer.

It is important to point out that aspects of Pali Buddhism related to meditation or so-called 'mysticism' are by no means exclusively Eastern. Similar material is found in Platonic, Aristotelian and Neoplatonic texts involving self-cultivation (a tentative translation of the Pali term bhâvanâ). And also in Stoicism (see these extracts from Epictetus and also, for instance, Marcus Aurelius, Meditations, Book XI, $\iota\beta'$).

Thursday, October 5, 2023

First steps in higher topos theory

Note that we are not implying that category theory is preferable to set theory as a foundation for mathematics or even for science in general.  We are inclined to hold that dependent type theory is a good candidate for such a role.

From a categorical point of view what is the simplest object we can conceive ? The singleton category with only one object $\star$ and only one arrow, the identity morphism $id_{\star}$ on this object. All such singleton categories are equivalent, and a singleton category is in fact the terminal object in the $2$-category $Cat$ of small categories. The initial and terminal objects of a category $\mathcal{C}$ (when they exist) are complementary: they arise as the left and right adjoints respectively (which are necessarily fully faithful) of the same unique functor $\mathbb{T}: \mathcal{C}\rightarrow \{\star\}$.

This gives rise to the adjoint modalities $\emptyset \dashv \star : \mathcal{C}\rightarrow \mathcal{C}$. We can see the singleton category as the most rudimentary being, the Etwas, something.
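Spelled out, if $L \dashv \mathbb{T} \dashv R$ then for every object $X$ of $\mathcal{C}$

\[ \mathrm{Hom}_{\mathcal{C}}(L(\star), X) \;\cong\; \mathrm{Hom}_{\{\star\}}(\star, \mathbb{T}X) = \{\ast\}, \qquad \mathrm{Hom}_{\mathcal{C}}(X, R(\star)) \;\cong\; \mathrm{Hom}_{\{\star\}}(\mathbb{T}X, \star) = \{\ast\}, \]

so $L(\star)$ is an initial object $\emptyset$ and $R(\star)$ a terminal object $\star$, and the adjoint modality $\emptyset \dashv \star$ on $\mathcal{C}$ is obtained by composing these adjoints with $\mathbb{T}$.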

The first section of Hegel's logic, the logic of being, involves an abstract exploration of the concept of 'space' and of the differential geometry used in mathematical physics. Hegel, who taught differential calculus both at high school and at university, dedicates a long section of the Logic to the infinitesimal calculus, the notions of which illuminate many other passages of the Logic. We start with what can be considered the archetype of the concept of space, the adjoint quadruple associated to the category of presheaves over a category $C$ with a terminal object. We call this space proto-space.

\[ \Pi \;\dashv\; Disc \;\dashv\; \Gamma \;\dashv\; CoDisc, \qquad \Pi, \Gamma : PrShv(C) \rightarrow Set, \qquad Disc, CoDisc : Set \rightarrow PrShv(C) \]
Here $\Gamma(P) = P(1)$. It is very instructive to work out these adjoints $\Pi\dashv Disc \dashv\Gamma\dashv CoDisc$ explicitly. We have that $Disc (X)(A) = X$ for all objects $A$ in $C$ and $Disc(X)(f)$ is $id_X$ for any $ f : A \rightarrow B$ in $C$. We have that $\Pi (\mathcal{A}) := \coprod_{U \in Obj\, C} \mathcal{A}(U) / \sim$ where $\sim$ is the equivalence relation generated by identifying $s$ and $s'$ over $U$ and $U'$ respectively whenever there are $f: V \rightarrow U$, $g : V \rightarrow U'$ with $s_V = s'_V$. If $\Omega$ is the subobject classifier for presheaves on a topological space $X$ then $\Pi(\Omega)$ gives the connected components of $X$ in the usual sense. We have finally that $CoDisc(X)(U) = \mathcal{P}X$. If we have a set map $f : \mathcal{A}(1) \rightarrow X$ then we can define a morphism of presheaves $f^\flat : \mathcal{A} \rightarrow CoDisc(X)$ given by $f^\flat (U)(s)= \{ x \in X : \exists w \in \mathcal{A}(1),\ f(w) = x ~\&~ w_U = s \}$ for $s \in \mathcal{A}(U)$.
This quadruple is closely related to the above adjunctions for terminal and initial objects. The functors in the quadruple arise by taking the left and right Kan extensions along $\mathbb{T}$ and $1$. Thus we have the development of the concept of proto-space from that of being.
The terminal object is the limit of the empty diagram. (Co)limits and adjunctions are all special cases of Kan extensions. Thus starting from the 2-category of all categories we can derive the concept of proto-space employing only Kan extensions. Kan extensions express dialectical reason's process of passing to the other while preserving an essential mediating connection. Proto-space then assumes various forms through various localisations, giving rise to the sheaf toposes.
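To make the description of $\Pi$ concrete, here is a small self-contained sketch (our own illustration, for a presheaf on a finite category presented by explicit dictionaries) that computes $\Pi(\mathcal{A})$ as the set of classes of sections under the equivalence relation generated by restriction:

```python
# Toy computation of Pi(A) for a finite presheaf A, presented explicitly:
# A(U) is a list of section labels, and for each arrow f : V -> U of C the
# restriction map A(f) : A(U) -> A(V) is a dict. Pi(A) is the disjoint union
# of the A(U) modulo the equivalence relation generated by s ~ A(f)(s).

def pi(sections, restrictions):
    """sections: dict {object U: list of sections of A(U)}.
    restrictions: list of triples (U, V, r) for arrows f : V -> U,
    where r maps each s in A(U) to A(f)(s) in A(V).
    Returns the set of equivalence classes (frozensets of (object, section) pairs)."""
    # union-find over the disjoint union of all sections
    parent = {(U, s): (U, s) for U, secs in sections.items() for s in secs}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(x, y):
        parent[find(x)] = find(y)

    for U, V, r in restrictions:
        for s, s_restricted in r.items():
            union((U, s), (V, s_restricted))

    classes = {}
    for x in parent:
        classes.setdefault(find(x), set()).add(x)
    return {frozenset(c) for c in classes.values()}

# Example: on the poset of opens {X, U, V} with U, V disjoint and X = U ∪ V,
# a presheaf with one section over U, one over V and no global section has
# Pi with two classes -- mirroring the two connected components.
sections = {"X": [], "U": ["u"], "V": ["v"]}
restrictions = [("X", "U", {}), ("X", "V", {})]
print(pi(sections, restrictions))   # -> two singleton classes
```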

In general the adjoint quadruple does not carry over to a topos obtained by localisation. The condition of being cohesive is what guarantees this. One example of a cohesive topos is the topos of sheaves on a cohesive site. Thus we are led spontaneously to the concept of cohesive topos as the right categorical notion of space.

But it should be stated that $Disc$ represents the movement towards discrete quantity or repulsion of units and $CoDisc$ represents the movement towards continuous quantity and coalescence of units.

The Yoneda embedding $C \rightarrow PrShv(C)$ expresses that each category $C$ unfolds into a proto-space. This unfolding of categories in the category of Being proceeds to something extrinsic, as a passage to the other. Dialectical reasoning asks how something is constructed, asks for the repressed history. A paradigm is that we can have diagrams in the category without the corresponding (co)limit existing.
But to think of a diagram and the concept of limit is already to posit the limit as something other and lacking in the category but nevertheless proceeding from it. Thus the complete and cocomplete category of presheaves given by the Yoneda embedding is a genuine Hegelian progression.

The key to understanding higher category theory is passing the above considerations into the correct generality of enriched category theory. A cosmos is what categories are enriched in. The above is the special case for the cosmos Set. Thus we should think of enriched presheaves, the enriched functor category between an enriched category and the cosmos itself seen as an enriched category.

Homotopy is the passage of quantity into quality. It is a changing of shape and size which preserves and thus defines a certain quality. Model categories are simply categorical abstractions where all constructions in classical homotopy theory can be carried out. The quality associated to a variation in quantity is expressed as the localisation yielding the homotopy category.

A simplicial set is an abstraction of a topological space; it is a categorical abstraction of a geometric form (i.e. a polytope). But it is a geometric form which contains within itself the process of its own genesis or assemblage, analogous to G-code (cf. the geometric realisation functor associating a topological space to each simplicial set). Category theory enriched over the cosmos of simplicial sets is currently seen as the correct choice for doing homotopy theory and differential geometry at the highest level of abstraction.
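The geometric realisation functor just mentioned has a compact description as a coend (coends being among the prerequisites listed at the start of these notes):

\[ |X| \;=\; \int^{[n]\in\Delta} X_n \times |\Delta^n|, \]

where $|\Delta^n|$ is the standard topological $n$-simplex: one copy of $|\Delta^n|$ is glued in for each $n$-simplex of $X$ along the face and degeneracy maps, which is precisely the 'process of assemblage' alluded to above.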

We must investigate the deeper significance of the simplex category $\Delta$ and its associated augmented simplex category $\Delta_a$ used to define simplicial sets. $\Delta_a$ has the natural structure of a strict monoidal category and $[0,1]$ has a natural monoid structure. This situation is universal in that monoidal categories $B$ with monoid objects $M$ are classified by functors $\Delta_a \rightarrow B$ sending $[0,1]$ to $M$. Similarly we can obtain a classification of monads in a 2-category (curiously enough, as lax functors from the terminal category). Perhaps $\Delta_a$ can be viewed as a higher qualitative categorical determination of the natural numbers expressing the exteriorisation and unfolding of the unit, the one, the monas (Hegel includes a short digression on Pythagoreanism in the section on Quantity).

$Set$ is an exterior, abstract, discrete concept (Hegel's concept of number seems to be very set-theoretic), but the category of simplicial sets $sSet$ represents a greater cohesion between parts and a qualitative structure and determination. The morphisms between objects in a simplicially enriched category, instead of forming a mere set, form a space.

For a good introduction to higher topos theory see

https://ncatlab.org/nlab/show/geometry+of+physics+--+categories+and+toposes

Genetic logic and epistemology of theories

By 'theory' we mean a body of knowledge capable of a formal or semi-formal presentation. But here we focus on the kind of theory which can be formalized in the standard way using for instance first- or second-order logic or dependent type theory. A theory is not merely the set of sentences which can be deduced from a given axiomatic-deductive system. Rather some sentences are marked off as being more meaningful and relevant than others (theorems, lemmas and corollaries and also examples and counterexamples). We call these 'relevant' sentences. Mere tautologies are a priori excluded. And most importantly a theory is built up from a dependently ordered (non-circular) hierarchy of definitions. Relevant sentences and the definition system can be organized in a dependency tree which is usually flattened or made linear when the theory is presented (for simplicity we do not dwell here on the case in which a definition depends on a previous relevant sentence, for instance a uniqueness result). This intrinsically linear or tree-like structure is analogous to the process of organic (ontogenetic) or cultural development. We call it the genetic logic of the theory. This leads to the question of justifying a genetic logic. Is it somehow implicit in the axiomatic-deductive system, built into its 'essence' ? Or do the genetic logic and the axiomatic-deductive system both derive from some third principle, something that well-known incompleteness results, as well as the fact that the 'same' theory can be presented in terms of different axiomatic-deductive systems, would naturally suggest ? In what sense are the successive stages or branchings of the genetic logic tree dependent on each other, or the branching elements contained implicitly in, or an unfolding of, the previous elements ? We call this the 'unfolding impulse'. There is the remarkable fact that many previous definitions turn out to be particular cases of a single more universal definition, for instance the case of Kan extensions in category theory.

Light is shed on this question by considering the genetic epistemology of a theory. That is, the (optimal) process by which human beings come to learn and understand a theory. This can be either on an individual level or on a historical-cultural level. On an individual level, at present, the genetic logic largely coincides with the genetic epistemology of a theory. But there are important nuances involved in the distinction between research papers, introductions, fundamental treatises and reference works. The very term 'relevant' already suggests a subjective human dimension. It seems interesting to investigate to what extent the mirroring of the genetic dimension in the epistemological dimension can explain the 'unfolding impulse' mentioned above (we do bear in mind the non-linearity of the human learning process and the necessity of 'cyclic return' in the form of revision and refinement). How does our knowledge of a concept or theorem (or a particular kind of philosophical reflection on this knowledge) already in itself lead to the thrust or impulse to find the subsequent concepts or theorems ? For instance, reflecting on the concept of 'initial object' (or reflecting on our own understanding of such a concept) we could easily be led to propose the dual concept of 'terminal object' (and vice-versa). Aristotle however distinguished between a method starting from things more clear and fundamental 'in themselves' and a method starting from things more clear and fundamental 'to us'. Thus maybe the concept of Kan extension is more fundamental 'in itself' (relative to category theory) but for cognitive-epistemic reasons 'for us' it is better to first go through a series of concrete cases of this concept. We must investigate the relationship between historical genetic epistemology and individual genetic epistemology (both for adults and for the process of child development). For instance, in the last century children were taught Euclid's Elements at school, thus manifesting a kind of cultural law of ontogenetic recapitulation. We do not think it an abuse of terminology (pace Piaget) to call the process by which an adult comes to learn a given theory 'genetic epistemology'. Also we must answer the objection that genetic logic itself can be historically and culturally dependent. For instance we can perhaps show that in outline the genetic logic of the calculus has remained fairly constant since the 17th century. And we must study the relationship, both synchronic and diachronic, between theories, and the justification of taking a body of knowledge as a single theory (i.e. study subtheories and branches of theories on one hand and interdisciplinarity and theories of theories on the other). This includes the study of how the 'same' concept can appear in entirely different theories. This is what gave birth to category theory and the notion of functor in the early developments of algebraic topology.

  We propose the definition of 'general logic' as the study of the genetic logic and epistemology of theories.  And we define 'pure logic' as the study of the genetic logic and epistemology of the system of universal concepts present in all theories.  If we restrict ourselves to mathematical theories then in what sense does category theory contribute to this goal and surpass model theory ? 

Consider the category-theoretic concepts of 'product' and 'co-product'. The co-product of two objects $A$ and $B$ is an object $A + B$ (actually an isomorphism class) together with two morphisms $i_1 : A \rightarrow A + B$, $i_2 : B \rightarrow A + B$, and it corresponds to the simplest way of collecting $A$ and $B$ together into a whole, much as in addition and counting. $A$ and $B$ are treated as a unity and there is no relation involved. In natural language this corresponds to some uses of 'and'. For instance the meaning of 'Alice and Bob' is the meaning of both proper names gathered side by side, collected into a whole in the simplest, freest way. Type-theoretically and in algebraic logic the coproduct corresponds to intuitionistic disjunction. The coproduct (and pushout) is of great importance in homotopy theory, and also in topos theory for defining the generalized concept of 'connectivity'.

Now the product $A\times B$ with its associated morphisms $\pi_1 : A\times B \rightarrow A$ and $\pi_2 : A\times B \rightarrow B$ is a subtly different concept. In fact we believe that it is only dependent type theory that can give us the clearest expression of its meaning, as a degenerate case of the more general type $\Pi (x : A) B(x)$. The product expresses the idea of the simplest type of mixture and distributive combination of the objects $A$ and $B$, a template for all the relations an $A$ can have with a $B$. We can also consider the notions of 'interaction' and 'entanglement'.
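One way to make this precise (our reading, using a two-element index type $\mathbf{2}$) is

\[ A \times B \;\cong\; \prod_{x : \mathbf{2}} C(x), \qquad C(0_{\mathbf{2}}) := A, \quad C(1_{\mathbf{2}}) := B, \]

so the binary product is the dependent product over the simplest non-trivial index type, just as the function type $A \rightarrow B$ is the degenerate case of $\Pi(x : A) B(x)$ with $B$ constant.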

Or there is a mereological analysis:  a part of $A+ B$ may be a part only of  $A$ or  only  of $B$ but a part of $A\times B$ must always involve both $A$ and $B$. Thus the mereological analysis overlaps with the standard Curry-Howard interpretation.

But these considerations can be refined by considering linear logic and the further duality present in each of the basic logic connectives.

  Older note: In formalizing concepts one must ask: scientific (including logical and mathematical) concepts or the semantics of ordinary human life-world concepts ? How do these two relate ? What is their dependency and priority ?  We can investigate  'Semantic Primes and Universals' to use the terminology of A. Wierzbicka. Each folk-concept is given a 'definition', a 'story', employing a fixed set of 'semantic primes'. Piaget's genetic epistemology is also worth exploring from this perspective.

Sunday, October 1, 2023

Addenda on Ancient Natural Deduction

The following notes can serve as a complement to our paper Ancient Natural Deduction. First we have a formalisation of Euclid I.1 and Euclid VII.1 in linearised natural deduction to illustrate the constructivist use of the existential quantifier. The following note on Physics 231b18-232a18 contains an additional example of how Aristotle is led to reason with multiple and embedded quantifiers.

A striking revolution in the history of logic has been the rediscovery of the richness and sophistication of ancient logic, especially Stoic logic and Indian logic (cf. papers of G. Priest and John N. Martin as well as [jaya, gan]), and of its often close anticipation of modern formal and philosophical logic and philosophy of language. A substructural sequent calculus and many key concepts of Frege's philosophy of logic and language have been shown to have been in the possession of the Stoics. This has rendered the traditional narrative surrounding Frege untenable (we also need to investigate connections to Lotze and Bolzano). We need to accumulate evidence that the ancients had a logic embodying axioms and rules capable of dealing with multiple generality and nested quantifiers (see the article by Bobzien and Shogry on multiple generality in the Stoics and T. Parsons' book Articulating Medieval Logic), and that many concepts of modern logic and category theory were anticipated in ancient logic. We must study the Stoic categories and theory of definition and their affinity to Leibniz's approach as well as to the methodology employed in modern mathematics and science.

Quodlibet

 1. René Thom called quantum mechanics 'the greatest intellectual scandal of the 20th century'. Maybe this was too harsh, but quantu...