Wednesday, January 8, 2025

Systems theory

To construct a model of reality we must first decide what are to be taken as the basic elements. Postulating such elements is necessary even if they are seen as provisional or only approximate, to be analysed later in terms of a more refined set of basic elements. A very general scheme for models involves distinguishing between time $T$ and the possible states of reality $S$ at a given time $t$, where $T$ is the set of possible moments of time. Thus our model is concerned with the Cartesian product $S\times T$. In modern physics we would require a more complex scheme in which $T$ would be associated with a particular observer. It is our task to decompose or express elements of $S$ in terms of a set of basic elements $E$ and to use such a decomposition to study their temporal evolution.

The most general aspect of $T$ is that it is endowed with an order of temporal precedence $\prec$ which is transitive. We may leave open the question whether $T$ with this order is linear (as in the usual model of the real numbers) or branching. The most fundamental question regarding $T$ concerns the density properties of $\prec$. Is time ultimately discrete (as might be suggested by quantum theory), is it dense (between any two instants we can always find a third), or does it satisfy some other property (such as the standard ordering of ordinals in set theory)? The way we answer this question has profound consequences for our concept of determinism.

For a discrete time $T$ we have a computational concept of determinism which we call strong determinism. Let $t$ be a given instant of time and $t'$ be the moment with $t\prec t'$ immediately after $t$. Then, given the state $s$ of the universe at time $t$, we should be able to compute the state $s'$ at time $t'$. If this transition function (called the state transition function) is not computable, we may still have determinism regarding certain properties of $s$; we call this weak determinism. Stochastic models also offer a weak form of determinism, although a rigorous formalisation of this may be quite involved. A very weak statement of determinism would be simply postulating the non-branching nature of $T$.
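For discrete $T$, strong determinism amounts to iterating a computable state transition function. A minimal sketch in Python, where the state space (a tuple of bits) and the cyclic-shift rule are purely illustrative assumptions:

```python
# Strong determinism over discrete time: the whole trajectory of the
# universe is obtained by iterating a computable state transition function.
# The state space (a tuple of bits) and the rule are illustrative choices.

def transition(state):
    """An illustrative computable transition: cyclically shift the tuple."""
    return state[1:] + state[:1]

def trajectory(initial_state, steps):
    """The states at t = 0, 1, ..., steps, fully determined by the
    initial state: strong determinism in the discrete setting."""
    history = [initial_state]
    for _ in range(steps):
        history.append(transition(history[-1]))
    return history

states = trajectory((1, 0, 0, 0), 4)
```

Weak determinism would correspond to being able to compute only some property of the next state (say, the number of ones, which this particular rule conserves) even when the full transition is not computable.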

We can also consider a determinism which involves not just the state at the previous time but the entire past history of states, with an algorithm which determines not only the next state but the states for a fixed number of subsequent moments. For instance the procedure could analyse the past history, determine which short patterns occurred most frequently, and then yield as output one of these, which the system would then repeat as if by "habit".
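A toy version of this "habit" procedure can be sketched in Python, under the assumptions that states are symbols and that the relevant patterns have a fixed short length:

```python
from collections import Counter

def habit_predict(history, pattern_length=2):
    """Scan the past history, count all patterns of the given length,
    and replay the most frequent one as the predicted next states."""
    patterns = [tuple(history[i:i + pattern_length])
                for i in range(len(history) - pattern_length + 1)]
    most_common_pattern, _ = Counter(patterns).most_common(1)[0]
    return list(most_common_pattern)

# An illustrative past history of abstract states: the pattern a, b
# occurs most often, so the system continues "by habit" with a, b.
prediction = habit_predict(['a', 'b', 'a', 'b', 'a', 'b', 'c'])
```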

The postulate of memory says that all the necessary information about the past history is somehow codified in the state of the system at the previous time. For a dense time $T$ it is more difficult to elaborate a formal concept of determinism. In this case strong determinism is formulated as follows: given a $t$, a state $s$ of the universe at $t$, and a $t'$ with $t\prec t'$ which is in some sense sufficiently close to $t$, we can compute the state $s'$ at $t'$. Models based on the real numbers, such as the various types of differential equations, are problematic in two ways. First, obtaining strong determinism, even locally, is problematic and will depend on having solutions given by convergent power series expansions with computable coefficients or on numerical approximation methods. Secondly, differential models are clearly only continuum-based approximations (idealisations) of more complex real systems having many aspects which are actually discrete. The determinism of differential models can thus be seen as based on an approximation of an approximation.
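This "approximation of an approximation" can be illustrated by the simplest numerical method: the continuous dynamics is replaced by small discrete computable steps, which only converge to the ideal continuum model in the limit. The equation $\dot{x} = -x$ below is an illustrative choice:

```python
import math

def euler_step(x, dt):
    """One Euler step for the illustrative equation dx/dt = -x."""
    return x + dt * (-x)

def approximate_state(x0, t, steps):
    """Approximate the state at time t by iterating a discrete,
    computable surrogate of the continuous dynamics."""
    dt = t / steps
    x = x0
    for _ in range(steps):
        x = euler_step(x, dt)
    return x

# The exact solution is x(t) = x0 * e^{-t}; the discrete scheme
# approaches it only as the number of steps grows.
coarse = approximate_state(1.0, 1.0, 10)
fine = approximate_state(1.0, 1.0, 10000)
```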

We now consider the states of the universe $S$. The most basic distinction that can be made is that between a substrate $E$ and a space of qualities $Q$. There is also an alternative approach, such as that of Takahara et al., based on the black box model, in which for each system we consider the Cartesian product $X\times Y$ of inputs $X$ and outputs $Y$. In this model we are led to derive the concept of internal state as well as that of the combination of various different systems. We can easily represent this scenario in our model by simulating the input and output signalling mechanism associated with a certain subset of $E$. States of the universe are given by functions $\phi: E\times T \rightarrow Q$. We will see later that it is in fact quite natural to replace such a function by the more general mathematical structure of a "functor". To understand $\phi$ we must consider the two fundamental alternatives for $E$: the Lagrangian and the Eulerian approaches (these terms are borrowed from fluid mechanics).

In the Lagrangian approach the elements of $E$ represent different entities and beings, whilst in the Eulerian approach they represent different regions of space or of some medium - such as mental or semantic space. These can be, for instance, points or small regions in standard Euclidean space. The difficulty with the Lagrangian approach is that our choice of the individual entities depends on the context and scale, and in any case we have to deal with the problem of beings merging or becoming connected, coming to be or disappearing, or the indiscernibility problem in quantum field theory. The Eulerian approach, besides being more natural for physics, is also very convenient in biochemistry and cellular biology where we wish to keep track of individual biomolecules or cells or nuclei of the brain. In computer science the Lagrangian approach could be seen in taking as basic elements the objects in an object-oriented programming language, while the Eulerian approach would consider the variation in time of the content of a specific memory array.

We call the elements of $E$ cells and $\phi: E \times T \rightarrow Q$ the state function. For now we say nothing about the nature of $Q$. In the Eulerian approach $E$ is endowed with a fundamental bordering or adjacency relation $\oplus$ which is not reflexive; that is, a cell is not adjacent to itself. The only axioms we postulate are that $\oplus$ is symmetric and that each cell has at least one adjacent cell. Thus $\oplus$ induces a graph structure on $E$. This graph may or may not be planar, spatial or embeddable in $n$-dimensional space for some $n$.
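A minimal sketch of the substrate and its axioms, representing $E$ and $\oplus$ as a finite undirected graph (the particular cells and edges are illustrative assumptions):

```python
def make_substrate(edges):
    """Build the neighbourhood structure induced by the adjacency
    relation from a list of pairs, enforcing symmetry."""
    adjacency = {}
    for e, f in edges:
        adjacency.setdefault(e, set()).add(f)
        adjacency.setdefault(f, set()).add(e)
    return adjacency

def satisfies_axioms(adjacency):
    """Check the postulated axioms: the relation is irreflexive and
    symmetric, and every cell has at least one adjacent cell."""
    return all(
        e not in neighbours                             # no self-adjacency
        and len(neighbours) >= 1                        # at least one neighbour
        and all(e in adjacency[f] for f in neighbours)  # symmetry
        for e, neighbours in adjacency.items()
    )

E = make_substrate([(0, 1), (1, 2), (2, 3), (3, 0)])  # a 4-cycle
```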

We can impose a condition making $E$ locally homogeneous in such a way that each $e\in E$ has the same number of uniquely identified neighbours. For the case of discrete $T$, the condition of local causality states that if we are in a deterministic scenario and at time $t$ we have a cell $e$ with $\phi(e,t) = q$, then the procedure for determining $\phi(e,t')$ at the next instant $t'$ will only need the information regarding the value of $\phi$ for $e$ and its adjacent cells at the previous instant. Many variations of this definition are possible, in which adjacent cells of adjacent cells may also be included. This axiom is seen clearly in the methods of numerical integration of partial differential equations.

Now suppose that $T$ is discrete, that $E$ is locally homogeneous, and that we indicate the neighbours of a cell $e$ by $e\oplus_1 e_1, e\oplus_2 e_2, \ldots, e\oplus_n e_n$. Then the condition of homogeneous local causality can be expressed as follows. For any time $t$ and cells $e$ and $e'$ such that $\phi(e,t) = \phi(e',t)$ and $\phi(f_i,t) = \phi(f'_i,t)$, where $f_i$ and $f'_i$ are the corresponding neighbours of $e$ and $e'$, we have that $\phi(e,t') = \phi(e',t')$, where $t'$ is the instant after $t$.

An example satisfying the conditions of the above definition is that of a symbol propagating along a direction $j$. If a cell $e$ is in state "on" and the cell $e'$ such that $e\oplus_j e'$ is in state "off", then in the next instant $e$ is in state "off" and $e'$ is in state "on". Stochastic processes such as diffusion can also easily be expressed in our model.
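The propagating symbol is a homogeneous local rule, and can be sketched on a one-dimensional cyclic substrate (the ring topology and the 0/1 qualities are illustrative assumptions):

```python
def step(states):
    """One tick of the propagation rule on a ring of cells: a cell is
    'on' (1) at the next instant exactly when its neighbour in direction
    j (here: its left neighbour) was 'on' at the previous instant. Each
    new value depends only on the local neighbourhood, as local
    causality requires."""
    n = len(states)
    return [states[(i - 1) % n] for i in range(n)]

ring = [1, 0, 0, 0]          # a single 'on' symbol
after_two = step(step(ring)) # the symbol has moved two cells onwards
```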

A major problem in the Eulerian approach is to define the notion of identity of a complex being: for instance, how biological structures persist in their identity despite the constant flux and exchange of matter, energy and information with their environment.

We clearly must have a nested hierarchy of levels of abstraction and levels of approximation, and this calls for a theory of approximation. Some kind of metric and topology on $E$, $T$ and the space of state functions $\phi$ is necessary. Note that all the previous concepts carry over directly to the Lagrangian approach as well. In this approach a major problem involves formalising the way in which cells can combine with each other to form more complex beings. If we consider the example of biochemistry then we see that complex beings made up of many cells have to be treated as units as well, and that they will have their own quality space $Q'$ which will contain elements that cannot be realised by a single $e\in E$. This suggests that we need to add a new relation on $E$ to account for the joining and combination of cells and to generalise the definition of $\phi:E\times T \rightarrow Q$.

We take the Lagrangian approach and add a junction relation $J$ on $E$. When $e J e'$ holds, $e$ and $e'$ are to be seen as forming an irreducible being whose state cannot be decomposed in terms of the states of $e$ and $e'$. The state transition function must take into account not only all the neighbours of a cell $e$ but also all the cells that are joined to any of these neighbours.

Let $J'$ be the transitive closure of $J$. Let $\mathcal{E}_J$ denote the set of subsets of $E$ such that for each $S\in \mathcal{E}_J$ we have that if $e,e' \in S$ then $e J' e'$. Inclusion induces a partial order on $\mathcal{E}_J$. Instead of $Q$ we consider a set $\mathcal{Q}$ of different quality spaces $Q$, $Q'$, $Q''$, ..., which represent the states of the different possible combinations of cells. Let us assume that $Q$ represents, as previously, the states of single cells. For instance a combination of three cells will have states which will not be found in a combination of two cells or in a single cell. Suppose $e$ is joined to $e'$ and the conglomerate has state $q \in Q'$. Then we can consider $e$ and $e'$ individually, and there is a function which restricts $q$ to states $q_1$ and $q_2$ of $e$ and $e'$. In category theory there is an elegant way to combine all this information: the notion of presheaf. To define the state functions for a given time $t$ we must consider a presheaf:
\[ \Phi_J: \mathcal{E}_J^{op} \rightarrow \mathcal{Q}\]
The state of the universe at a given instant will be given by the compatible sections of this presheaf. To define these we need to consider the category of elements $El(\mathcal{Q})$ associated to $\mathcal{Q}$, whose objects consist of pairs $(Q, a)$ where $a\in Q$, and whose morphisms $f:(Q,a) \rightarrow (Q',a')$ are maps $f:Q \rightarrow Q'$ which preserve the second components: $f(a) = a'$. Thus a state function at a given time is given by a functor:
\[ \phi_J: \mathcal{E}_J \rightarrow El(\mathcal{Q}) \]
But $J$ can vary in time and we need a state transition function for $J$ itself which will clearly also depend on $\phi_J$ for the previous moment. Thus the transition function will involve a functor:
\[ \mathcal{J}_J: hom(\mathcal{E}_J , El(\mathcal{Q})) \rightarrow Rel(E) \]
and will yield a functor
\[ \phi_{\mathcal{J}_J(\phi_J)}: \mathcal{E}_{ \mathcal{J}_J(\phi_J)} \rightarrow El(\mathcal{Q}) \]
Note that we could also consider a functor
\[ \mathcal{E}: Rel(E) \rightarrow Pos \]
which associates $\mathcal{E}_J$ to each $J$.

The relation $J$ is the basic form of junction. We can use it to define higher-level complex concepts of connectivity, such as that which connects various regions of biological systems. We might define living systems as those systems that are essentially connected: systems in which the removal of any part necessarily results in the loss of some connection between two other parts. This can be given an abstract graph-theoretic formulation which poses interesting non-trivial questions. Finally, we believe this model can be an adequate framework to study self-replicating systems.
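One possible graph-theoretic reading of "essentially connected" (an assumption on our part, taking "part" to mean a single cell) is that every cell is a cut vertex: removing any single cell disconnects two of the remaining ones. A brute-force check:

```python
def reachable(adjacency, start, removed):
    """The set of cells reachable from start once one cell is removed."""
    seen, stack = {start}, [start]
    while stack:
        u = stack.pop()
        for v in adjacency[u]:
            if v != removed and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def essentially_connected(adjacency):
    """True iff removing any single cell disconnects some pair of the
    remaining cells, i.e. every cell is a cut vertex."""
    for e in adjacency:
        rest = [v for v in adjacency if v != e]
        if len(rest) > 1 and set(rest) <= reachable(adjacency, rest[0], e):
            return False  # removing e left all remaining cells connected
    return True

# A path 0-1-2-3: its endpoints are not cut vertices.
path = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
```

This particular reading is in fact too strong for finite systems: a finite connected graph with at least two vertices always has at least two non-cut vertices, so a system essentially connected in this sense must be infinite or must use a coarser notion of part than a single cell. This is one instance of the non-trivial questions mentioned above.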
