Thursday, December 4, 2025

Notes

The embedding matrices used, for instance, in GPT-3 are a vector-space representation of co-occurrence frequency matrices for a given context-window size. Such a matrix can also be seen as a complete graph whose edges are labelled by probability values (we can assign values also to polyhedra). It is important to study the properties of these matrices.
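A minimal sketch of the idea (the toy corpus and window size are illustrative assumptions, not GPT-3's actual pipeline): build a co-occurrence count matrix for a sliding context window, then row-normalise it into probabilities, which can be read as edge weights on a complete graph over the vocabulary.

```python
def cooccurrence(tokens, window=2):
    """Row-normalised co-occurrence matrix for a sliding context window."""
    vocab = sorted(set(tokens))
    index = {w: i for i, w in enumerate(vocab)}
    n = len(vocab)
    counts = [[0.0] * n for _ in range(n)]
    for i, w in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if i != j:
                counts[index[w]][index[tokens[j]]] += 1
    # row-normalise: entry (i, j) is the empirical probability that word j
    # occurs in the context window of word i -- an edge weight on the graph
    for row in counts:
        total = sum(row)
        if total:
            for k in range(n):
                row[k] /= total
    return vocab, counts

vocab, M = cooccurrence("the cat sat on the mat".split())
```

Each row of `M` is then a probability distribution over the vocabulary, so the matrix is simultaneously a point-cloud in a vector space and a weighted complete graph.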

Locally integrable functions (fundamental to distributions and to weak solutions of PDEs) are a kind of sheaf-theoretic completion of the $L^p$ spaces. On the other hand, from a finitist point of view, simple functions are the basic kind of function, and these are 'dense' in many fundamental spaces in an appropriate sense. What is the meaning of a PDE in a scientific context? A positing of certain recursive numerical algorithms and approximation ideals. How curious that numerical conditions for stability involve the pairing of space and time. And it is fascinating that smooth initial conditions for PDEs as simple as $u_t - g(u)u_x = 0$ may generate shock waves. And what, from a computational finitist point of view, is a weak solution? A good philosophical goal: to gain a deeper understanding of distributions (cf. Sato's hyperfunctions). Distributions arise from the practical situation of measurement: we do not measure a field at a point but only an integral average weighted by the peculiarities of the instrument employed.
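Both remarks can be illustrated numerically. The following sketch uses the inviscid Burgers' equation $u_t + u u_x = 0$ as a stand-in for the quasilinear equation above (the grid size, time step, and sine initial profile are illustrative assumptions). The CFL restriction $\Delta t \le \Delta x / \max|u|$ is exactly the "pairing of space and time" that stability demands, and the smooth initial data visibly steepens toward a shock.

```python
import math

n = 200
dx = 2 * math.pi / n
u = [math.sin(i * dx) for i in range(n)]  # smooth periodic initial data

def max_gradient(u, dx):
    """Largest finite-difference slope on the periodic grid."""
    n = len(u)
    return max(abs(u[(i + 1) % n] - u[i]) / dx for i in range(n))

def step(u, dx, cfl=0.5):
    """One upwind time step for u_t + u u_x = 0, CFL-limited."""
    dt = cfl * dx / max(abs(v) for v in u)  # dt tied to dx: stability
    n = len(u)
    out = []
    for i in range(n):
        if u[i] >= 0:
            dudx = (u[i] - u[(i - 1) % n]) / dx  # wind from the left
        else:
            dudx = (u[(i + 1) % n] - u[i]) / dx  # wind from the right
        out.append(u[i] - dt * u[i] * dudx)
    return out

g0 = max_gradient(u, dx)
for _ in range(80):            # integrate past the shock-formation time
    u = step(u, dx)
```

After these steps the maximum gradient has grown well beyond its initial value: the smooth profile has compressed into a near-discontinuity that only a weak (integral) formulation can accommodate.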

Some of the most interesting concepts in mathematics and applied mathematics: absolutely continuous functions and functions of bounded variation. What does a finitist and computationalist perspective say? Both the Weierstrass function and the Cantor function (the 'devil's staircase') are constructed step by step. There must be a property expressible in terms of re-scaling.
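The step-by-step construction and the re-scaling property are both visible in a short recursive evaluation of the Cantor function (the truncation depth is an assumption of the sketch):

```python
def cantor(x, depth=40):
    """Stage-`depth` approximation of the Cantor function on [0, 1]."""
    if depth == 0:
        return x
    if x < 1 / 3:
        return cantor(3 * x, depth - 1) / 2       # left third: rescale
    if x <= 2 / 3:
        return 1 / 2                              # constant on the gap
    return 1 / 2 + cantor(3 * x - 2, depth - 1) / 2  # right third
```

The function is continuous, monotone, and of bounded variation (total variation 1), yet not absolutely continuous: all of its increase happens on the measure-zero Cantor set. The re-scaling property is the identity $c(x/3) = c(x)/2$, which the recursion implements directly.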

How could we prove that a physical system is computing beyond the Turing limit? We could of course produce experimental evidence to the contrary by exhibiting an algorithm that agrees with known observations. There are irrational numbers whose sequences of digits are not computable. How are we to view scientific theories about such numbers (which could represent measurements of some fundamental physical constant), calculations and approximations of such numbers, and confrontation with experimental evidence? Obviously such a question is only interesting from a non-finitist perspective.

The interest in solving the P = NP problem hinges on the complexity of the algorithm for transforming an NP-machine into a P-machine.

Uncountable infinities (must check Bolzano regarding this subject) are highly questionable (they express a false objectivism). They imply indiscernibles, which should be rejected in the light of subjectivism (in terms of computationalism or, more generally, definability). There is also the problem of the identity and determination of mathematical objects (Benacerraf, etc.). A foundation for calculus like Lawvere's is called for (in which intervals are primitive). The standard definition is that a set A has cardinality greater than a set B if there is no injective function from A to B. But once we restrict the possible functions, on philosophical grounds, to being recursive or of some more general type, this situation can happen without any increase in 'Cantorian' cardinality. Indeed 'infinity' is always present in the concept of recursively enumerable but not recursive, and the same goes for the rest of the arithmetical hierarchy (the true hierarchy of 'infinities'?). This makes perfect sense also for ordering: what is the point of ordering something if it is not done in a computable way? In sheaf models (a partial improvement and clarification of the forcing techniques) there is an acknowledgment of the essential relativity of the whole Cantorian framework of monos and cardinalities (already patent in the Löwenheim-Skolem theorems).
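The concept of 'recursively enumerable but not recursive' is worth making concrete. The sketch below uses hypothetical toy 'programs' as a stand-in for a universal machine (a genuinely undecidable set cannot, of course, be decided by any such demo): an r.e. set is one whose members can be listed by dovetailing bounded simulations, interleaving all programs with ever-larger step budgets so that non-halting programs never block the enumeration.

```python
def toy_programs(n=10):
    """Toy stand-in for a machine enumeration: program i 'halts' after i
    steps iff i is even; odd programs loop forever, so running programs
    one after another to completion would block the listing."""
    def make(i):
        return lambda steps: i % 2 == 0 and steps >= i
    return [make(i) for i in range(n)]

def enumerate_halters(progs, rounds=100):
    """Dovetailing: each round re-runs every program with a larger budget,
    yielding an index as soon as its program is observed to halt."""
    halted = set()
    for budget in range(rounds):
        for i, run in enumerate(progs):
            if i not in halted and run(budget):
                halted.add(i)
                yield i

halters = sorted(enumerate_halters(toy_programs()))
```

Membership is semi-decidable, since every halting program is eventually listed; but no round of the procedure ever certifies that an unlisted program will *never* halt, and that asymmetry is exactly where the 'infinity' of the r.e.-but-not-recursive sets resides.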

A problem: given an inconsistent set of sentences in some language L, is there a canonical disambiguating language L' (which associates to certain symbols s of L a set of possible clarifications s1, s2, ...) such that, with a choice of assignment of occurrences of symbols of L to symbols of L', the set becomes consistent?
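For the propositional case the problem can be posed as a finite search (the clause representation and the example are assumptions of this sketch): an ambiguous symbol 's' may be read, occurrence by occurrence, as one of its clarifications s1, s2 in L', and we search over readings and valuations for one that makes the whole set satisfiable.

```python
from itertools import product

AMBIGUOUS = {'s': ('s1', 's2')}  # hypothetical clarification map L -> L'

def satisfiable(clauses):
    """Brute-force SAT check; a clause is a list of (atom, polarity)."""
    atoms = sorted({a for c in clauses for a, _ in c})
    for vals in product([False, True], repeat=len(atoms)):
        v = dict(zip(atoms, vals))
        if all(any(v[a] == pol for a, pol in c) for c in clauses):
            return True
    return False

def disambiguate(clauses):
    """Search over per-occurrence readings of ambiguous symbols for a
    reading under which the clause set becomes consistent."""
    spots = [(i, j) for i, c in enumerate(clauses)
             for j, (a, _) in enumerate(c) if a in AMBIGUOUS]
    for choice in product(*[AMBIGUOUS[clauses[i][j][0]] for i, j in spots]):
        new = [list(c) for c in clauses]
        for (i, j), name in zip(spots, choice):
            new[i][j] = (name, clauses[i][j][1])
        if satisfiable(new):
            return new
    return None

# {s, not-s} is inconsistent in L, but consistent once the two
# occurrences are read as distinct clarifications s1 and not-s2
reading = disambiguate([[('s', True)], [('s', False)]])
```

Whether a *canonical* such L' exists is of course the interesting part of the question; the search above only shows that, occurrence-splitting being allowed, consistency can always be restored in this toy setting.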

Non-classical logic vs. classical logic. What is important is the question: what is the logic required to be able to understand, carry out, and check any system of rules? Kant's rule-based philosophy is a precursor of our computational a priorism.

