Invitation to Scholars - Part II
Two Chilean biologists, Maturana and Varela, founded the theory of autopoiesis. It defines the conditions on process by which a living organism can both create and maintain itself as an autonomous entity within a constantly changing ecological niche. Autopoiesis also teaches us that the role of the central nervous system is not to create models of the external world, as is believed in so much cognitive and computer science; its role is to synthesize acts. Perception therefore arises from within (“informare”), not from external sources. Perception begins with acts, followed by the senses being directed to detect what changed. The physical senses are not passive receivers but active inquirers.
Axiology is the theory of value. Here we choose the work of Robert Hartman, who worked toward transforming axiology from value philosophy into value science. His formal definitions of Intrinsic, Extrinsic and Systemic value, and the laws relating them, explain many puzzles concerning value conflicts and paradoxes. The value harmonization process is a process of finding the most effective action for any situation or condition using the value hierarchy. Living beings, by necessity, must use all three dimensions to find such effective action. Different value dimensions come into play for different functions. For example, a person can be valued for her unique combination of identity (intrinsic, unconditional love), or for her blue eyes or ability as a mother (extrinsic, conditional measurement), or for how she is the right type for a part in a play (systemic, right or wrong). A paradox, then, would be that we should love one another, but justice requires execution. Justice is a systemic value, and carrying out the required execution places the systemic over the intrinsic.
Hartman’s value classes are based on intensions (meanings). The classes are:
- Intrinsic: based on the intension of a unicept, a unique concept applicable to one, and only one, object. Intrinsic values are aesthetic and have cardinality (size) ℵ₁.
- Extrinsic: based on the intension of an analytic concept applicable to many objects and learned from experience by abstraction. Extrinsic values can be judged good or bad. Extrinsic values have cardinality ℵ₀.
- Systemic: based on the intension of synthetic concepts produced by postulates. Systemic values can be judged right or wrong. Systemic values have finite cardinalities.
Note: “Cardinality” is a mathematical term for the number of elements in a set. The alephs (ℵ) refer to orders of infinity.
The value calculus is based on the cardinality of the intensions. Intrinsic values take precedence over extrinsic and extrinsic values take precedence over systemic.
When interaction takes place between living beings, intrinsic valuation is necessary to realize the full potential of the interaction. The reason follows from the fact, shown by Rosen, that living organisms are not computable.
A living organism is internally a society and externally is part of a society. To fully interact with an organism requires a potential infinity of approaches and perspectives. Extrinsic valuation reduces interaction to a single criterion and judgement. For example, “he is a good worker” or, remembering the movie, “she is a 10”. But at least with extrinsic valuation there is some room for variety in how one might become a good worker. Systemic valuation is still more constraining. Systemic deals with rules. There is no gray here. Systemic is black and white.
It is no wonder religious leaders have tried to teach us the importance of love (intrinsic value).
Facts and Values:
We all live in a world of process. What is manifest, what we call reality, “becomes.” There is nothing that has “become” without arising out of prior existents. As philosophers might say, there is no creation ex nihilo, i.e., creation out of nothing. To understand the relation between facts and values we need to understand the processes of “becoming.”
In the process of “becoming” there will be a succession of facts. Each moment of facts establishes a ground for what can become in the next moment of facts. What will become is decided by values.
Let us be clear. At each moment of process what can become in the next moment will be dependent on the facts of the current moment while what will be chosen for the next moment, what will become, is chosen by values. This means that what develops in our lives depends on what we value. However, “God’s mills grind exceedingly slow but exceedingly fine”. It can take a long time for the consequences of a value choice to work themselves out. Thus we may not associate the results with the value choices leading to them. Also, the relations between the value choices and the results may be quite subtle. We may not ever realize that we held the values that led to disaster.
Semiotics is the theory of sign processes. Signs are like symbols in that they stand for something (a semiotic object). However, signs also have interpretants. Thus a sign is a triad consisting of the sign vehicle, a semiotic object and an interpretant. The theory of sign processes was put forward by Charles Peirce a century ago. Until now it has been largely ignored. Now however Peirce’s genius is being recognized. There are many centers for the study of Peircian semiotics. There are scholars around the world developing semiotics, including new versions such as biosemiotics.
I believe a most important next step is to combine semiotics with Rosen’s work. My reason is that semiotic processes take on a life of their own in what may be a primordial matrix.
The Basis for the Synthesis of these aforementioned disciplines:
There are two disciplines primarily concerned with systems of starting assumptions. They are philosophical metaphysics and logic. These are not popularly known disciplines since in normal times one can simply accept tradition. However, in times of radical and fundamental change these two disciplines should come to the fore. That is where we begin.
We know from both semiotics and autopoiesis that our experiences of reality, our perceptions, are not simply dictated through our senses by an external reality. Our experiences and perceptions are co-created by ongoing interactions or, in the words of Floyd Merrell, interdependent interrelations, between that external reality and ourselves. Thus we should ask: what do we bring to this co-creation, and how does what we bring affect our experience and perceptions?
We all are born into a language using community from which we all learn a whole system of starting assumptions. What we can ultimately experience and perceive depends on what that system of starting assumptions permits; unless, of course, we consciously revise it. Fortunately, language is intrinsically meaningless. We have to work with it to shape it and render meaning. In the course of shaping language, the role of philosophy, we may find flaws in our starting assumptions leading to conscious revision.
Conscious revision is a most difficult process. Let us say we begin with the assumption that the world is composed of things. We learn to recognize things, to act with things, and to survive amongst things. But then we find things which don’t quite act right for things. (For example, the wave/particle duality in quantum physics.) Then begins a glimmering insight that there is something besides things. Initially this may be a frightening insight since we have no experience surviving in a world of more than just things. But ultimately the insight grows until it can not be denied. And philosophers and scientists talk about it. It becomes part of the language, and new generations are born into a different world.
Right now we, the people of the world, are in a transition phase of conscious revision. The system of starting assumptions that has served for twenty-five centuries has begun to fail. Two indicators of the failure are the peculiarities of quantum physics and the crises in the foundations of mathematics that occurred around the beginning of the 20th century. More spectacularly, we find social institutions beginning to fail.
Our Western tradition is based on substance philosophy. The revision, the new metaphysics, is based on process philosophy.
Process Metaphysics: About process metaphysics Rescher says:
“… a process metaphysics propounds certain characteristic stresses of emphasis in contrast to those of a substance metaphysics, as follows:

[stresses of a substance metaphysics]
- discrete individuality
- condition (fixity of nature)
- uniformity of nature
- unity of being
- descriptive fixity
- classificatory stability
- passivity (being acted upon)

[stresses of a process metaphysics]
- interactive relatedness
- wholeness (totality)
- activity (self-development)
- unity of law
- productive energy, drive, etc.
- fluidity and evanescence
- activity (agency)”
These metaphysical categories reflect what we look for in experience. I found it to be an interesting exercise to try changing my experience by looking for interactive relatedness rather than discrete individuality, etc.
Process philosophy has become a very active branch of inquiry in which there are many competing positions. Without becoming involved in any one such position Rescher characterized process philosophy as a doctrine committed to certain basic teachings or contentions as follows:
- that time and change are among the principal categories of metaphysical understanding.
- that process is a principal category of ontological description.
- that processes – and the force, energy, and power that they make manifest – are more fundamental, or at any rate not less fundamental, than things for the purposes of ontological theory.
- that several, if not all, of the major elements of the ontological repertoire (God, nature-as-a-whole, persons, material substances) are best understood in process terms.
- that contingency, emergence, novelty, and creativity are among the fundamental categories of metaphysical understanding.
Taking the above as a minimum statement of our requisite metaphysics, we immediately see problems for logic as we have known it. Time and change are principal categories, yet time is not mentioned in logic. Aristotle recognized this problem, the omission of time. Also among the fundamental categories are contingency, emergence, novelty and creativity. Yet a fundamental property of logic is that it should be “truth preserving”.
Note that nouns dominate our language. But not all languages are so dominated. There are verb-dominated languages. Further, David Bohm, the famous physicist who proposed implicate/explicate order, attempted to create verbs out of nouns.
Similarly, we find logic dominated by things. Propositions are typically in subject-predicate form. Numbers are defined extensionally in terms of sets. And even in relational logic there are only relations between things. There are no relations between relations.
On Reforming Logic:
A major step in building the desired science is to reform our notions of logic. Logic is now undergoing a revolution. Due to crises in the foundations of mathematics, crises which make the Y2K computer bug seem trivial by comparison, more research has been done in logic during the 20th century than in all previous human history. And even that is an understatement. It is not just more! It is incomparable! Once upon a time logic was concerned with propositions and truth-preserving rules of inference. Logic today is as different from the logic of the past as quantum physics is from Newtonian physics. Perhaps even more so!
Initially I propose some relaxed conditions. As a first step, logic can be used simply to say what cannot be said in natural language. As expression becomes stable, other criteria, coherence for example, can be examined.
We want to form a logic of process. Central to the notion of process is the idea of what is being done, or acts, rather than what is. The importance of “what is” will arise later as a condition on what can be done. Can logic be expressed strictly in terms of acts? Yes, it can be. To illustrate we turn to what is now known as combinatory logic. I prefer to call it combinator logic so as to not get it confused with combinatorics. Also, combinators might better be considered an epi-logic, i.e., logic of logic.
Back in the 1920s, rigorous procedures for the substitution of variables were a serious problem in mathematics. The logician Moses Schönfinkel proposed a method of variable-free mathematics. The American logician Haskell Curry, who encountered Schönfinkel’s work while studying in Germany, went on to develop combinators and might well be considered the father of combinatory logic.
Combinators might be viewed as simple acts. The acts are expressed as reduction rules. In the following expressions the bold capital letter is a combinator. The expression says to replace the left side with the right side. Or, we might envision this as the combinator rewriting what follows it and then going away having done its job. For example:
- Ix > x (elementary identificator)
- Cfxy > fyx (elementary permutator)
- Wfx > fxx (elementary duplicator)
- Bfgx > f(gx) (elementary compositor)
- Kcx > c (elementary cancellator)
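For readers who want to experiment, the five reduction rules can be sketched as curried functions. This encoding is my own illustration, not part of the original presentation; the helper operators add and mul are hypothetical names introduced only to exercise the rules.

```python
# Hypothetical encoding of the five elementary combinators as curried lambdas.
# Each definition mirrors its reduction rule.
I = lambda x: x                            # Ix > x        (identificator)
C = lambda f: lambda x: lambda y: f(y)(x)  # Cfxy > fyx    (permutator)
W = lambda f: lambda x: f(x)(x)            # Wfx > fxx     (duplicator)
B = lambda f: lambda g: lambda x: f(g(x))  # Bfgx > f(gx)  (compositor)
K = lambda c: lambda x: c                  # Kcx > c       (cancellator)

# Sample curried arithmetic operators to exercise the rules.
add = lambda a: lambda b: a + b
mul = lambda a: lambda b: a * b

print(I(42))            # 42
print(C(add)(1)(2))     # add(2)(1) = 3
print(W(mul)(3))        # mul(3)(3) = 9
print(B(I)(add(1))(4))  # I(add(1)(4)) = 5
print(K(7)("ignored"))  # 7
```

Each printed line is one reduction rule in action; the comments trace the rewriting step the combinator performs.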
Now one might wonder how this leads to variable-free mathematics. To take a simple example, consider the expression (x+1)². Normally we would substitute some number for the x, say 2, add the one, and square. This can be expressed in combinators as B(WM)(CA1), where the bold capital letters are combinators and the non-bold capital letters are ordinary arithmetic operators (A for addition, M for multiplication).
Instead of substituting 2 for x we apply the combinator expression to 2 as follows: B(WM)(CA1)2
Notice that (WM) corresponds to the f in the reduction rule for B, (CA1) corresponds to g, and 2 corresponds to x.
Now we can apply the B to get WM((CA1)2)
Now we can apply the W to get M((CA1)2)((CA1)2)
It is a convention in combinator logic that parentheses cluster left and, when in normal order, parentheses need not be shown. Thus ((CA1)2) can be written as (CA12). Applying the C rule we get M(A21)(A21)
Now there is nothing left but to do the indicated arithmetic: A21 = 2 + 1 = 3, and then M33 = 3 × 3 = 9.
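The whole reduction can also be checked mechanically. Below is a sketch of my own (A and M are the curried addition and multiplication operators assumed in the example); applying B(WM)(CA1) to 2 does indeed yield (2+1)² = 9.

```python
# Curried combinators used in the reduction.
B = lambda f: lambda g: lambda x: f(g(x))  # Bfgx > f(gx)
W = lambda f: lambda x: f(x)(x)            # Wfx > fxx
C = lambda f: lambda x: lambda y: f(y)(x)  # Cfxy > fyx

# Curried arithmetic: A for addition, M for multiplication.
A = lambda a: lambda b: a + b
M = lambda a: lambda b: a * b

# B(WM)(CA1) is the variable-free form of "(x + 1) squared".
square_of_successor = B(W(M))(C(A)(1))
print(square_of_successor(2))  # 9, i.e. (2 + 1) squared
print(square_of_successor(4))  # 25
```

Evaluating square_of_successor(2) retraces the text's reduction: B hands 2 to CA1 (giving A21 = 3), then WM squares the result.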
It is amazing to think that these five basic acts can represent all computable functions. Perhaps it is even more amazing to think that we really only need two rules. They are:
- Sfgx > fx(gx)
- Kcx > c
Also note that S = B(B(BW)C)(BB)
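The identity S = B(B(BW)C)(BB) can likewise be spot-checked by evaluation. The sketch below is my own illustration: it compares a direct definition of S against the composed form on sample arguments (the sample functions f and g are hypothetical).

```python
# Elementary combinators from the text.
B = lambda f: lambda g: lambda x: f(g(x))  # Bfgx > f(gx)
W = lambda f: lambda x: f(x)(x)            # Wfx > fxx
C = lambda f: lambda x: lambda y: f(y)(x)  # Cfxy > fyx

# Direct definition of S: Sfgx > fx(gx).
S = lambda f: lambda g: lambda x: f(x)(g(x))

# The composed form: S = B(B(BW)C)(BB).
S_composed = B(B(B(W))(C))(B(B))

f = lambda a: lambda b: a + b  # sample curried binary function
g = lambda a: a * 10           # sample unary function
print(S(f)(g)(5))              # f(5)(g(5)) = 5 + 50 = 55
print(S_composed(f)(g)(5))     # 55 as well
```

Agreement on sample inputs is not a proof, but it makes the claimed reduction easy to believe and to explore further.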
It seems as if most work on combinators today is done in terms of SK systems. Included in the on-going work in computer science is combinator graph reduction. Functional languages are implemented in terms of SK graphs. Special hardware can be, and is being, designed for such graph reduction. Also, combinators are proving to be most useful in ordinary circuit design.
Also, getting away from the thingness of the past, natural numbers can be defined as combinator processes and so can the arithmetic operators.
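One standard way to see numbers as processes is the Church encoding, in which the numeral n is the act of applying a function n times. The sketch below is my own illustration of that idea, not part of the original text; the names succ, plus, times and to_int are my own.

```python
# Church numerals: the number n is "apply f n times to x".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# Arithmetic as operations on processes.
plus = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
times = lambda m: lambda n: lambda f: m(n(f))

# Convert back to an ordinary int by applying "+1" n times to 0.
to_int = lambda n: n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(plus(two)(three)))   # 5
print(to_int(times(two)(three)))  # 6
```

Here a number is never a thing; it is purely a pattern of repeated action, which is very much in the spirit of process philosophy.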
Some Consequences: Combinator approaches introduce us to some profound differences. First, to study more traditional logical notions, known as illative notions, a pure combinator framework is established and then illative operators are adjoined. There are many illative systems being studied. One particular issue concerns rules of inference.
The combinators shown above can represent all computable functions. However, with those combinators we immediately face a difficulty. We can form a paradoxical combinator Y. Y = WS(BWB)
This combinator has the property that Yx = x(Yx). Thus if we let N stand for negation we get
YN > N(YN) > N(N(YN)) and so forth.
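The fixed-point property Yx = x(Yx) is what lets recursion be expressed without the function naming itself. Curry's Y = WS(BWB) loops forever under an eagerly evaluating language like Python, so the sketch below, my own illustration, uses the eta-expanded strict variant often called Z; the factorial example is a hypothetical use.

```python
# A strict-language fixed-point combinator (the "Z" variant of Y).
# Z(f) satisfies Z(f)(v) == f(Z(f))(v): the fixed-point property
# Yx = x(Yx), with a lambda wrapper to delay evaluation.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(
              lambda x: f(lambda v: x(x)(v)))

# Recursion without self-reference: factorial as a fixed point.
fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
print(fact(5))  # 120
print(fact(0))  # 1
```

The lambda that defines factorial never mentions fact; the fixed-point combinator supplies the self-application.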
Since we have been taught that we don’t want paradox in mathematics, and we probably don’t, the existence of a paradoxical combinator, and there are many more, threatened the use of combinators in developing mathematics. Much work has been done on developing normal forms and type theories that avoid paradox.
However, there is another way to look at it. The problem with paradox is that it leads to inconsistency, and an inconsistent system is useless. This may seem an odd way of speaking since most people are likely to think paradox is inconsistency. Let us examine this a bit.
In setting up a formal system the first activity is to define what can be said, i.e., to define “well-formed” expressions. The second activity is to separate those well-formed expressions we want to assert as part of the theory from those we want to exclude from the theory. If the rules of procedure collapse the separation, then the theory is considered useless, i.e., it can prove or assert anything. It is inconsistent.
Now a surprising observation is that paradox need not lead to inconsistency. Historically a rule called modus ponens was a standard rule of inference. Paradox combined with modus ponens will surely lead to inconsistency. Yet there are now known rules of inference which can be combined with paradox without leading to inconsistency.
For autognomics, which studies living processes that include values, this surprising observation is especially important, since values typically have a paradoxical character. For example, in an experiment in which people were given pairwise choices among three things of equal value, those who chose A over B and B over C would typically choose C over A.
Since Y was first named the paradoxical combinator, new discoveries have led to its being renamed the fixed-point combinator Y0. From Y0 an infinite series of fixed-point combinators can be generated as
Yn+1 = Yn(SI)
Now the study of fixed-point combinators has become quite active. The study includes both the search for fixed-point combinators and, when found, determining their usefulness. Two sites we know of where such studies are being done include Argonne National Laboratory and BRICS, Department of Computer Science, University of Aarhus in Denmark. ( BRICS stands for Basic Research in Computer Science, Centre of the Danish National Research Foundation.) A BRICS paper, “Constructing Fixed-Point Combinators Using Application Survival” was written by Mayer Goldberg while visiting BRICS from the Computer Science Department, Indiana University.
Combining Logic and Process Metaphysics: Once upon a time the subject of logic was, itself, well defined. One could point to propositional calculi, predicate calculi, type theories, truth preserving rules of inference and processes of reasoning as the subject matter of logical inquiry. One could also point to laws of logic that had held for centuries such as the laws of non-contradiction and excluded middle. At this time there are thousands of formalisms that might well be considered as the logic of something; formalisms that have very different properties from anything traditionally called logic. Such formalisms are studied in depth. Yet it is often impossible to know how to interpret them.
For example, a system of entailment was developed at Yale with the goal of capturing the English meaning of “If x then y”. Some logicians would have equated the implication relation in the propositional calculus with “If x then y”. But implication can be true even though the antecedent and consequent have nothing to do with each other. As implication is defined a false proposition can imply anything. When Bertrand Russell was challenged at a dinner party to prove from a false proposition that he was the Pope he replied, “Two equals one is false; the Pope and I are two, but two equals one, therefore the Pope and I are one”.
The Yale system, developed in part by my friend, Professor Alan Anderson, had rules requiring the relevance of the antecedent to the consequent. When they developed a model for the system, a model discovered by Professor Nuel Belnap who is now Alan Ross Anderson Distinguished Professor at Pittsburgh, it was found to require eight truth values with two of them designated as truth and six designated as falsehood. Of course, no one knew how to interpret eight truth-values. And it was clearly the end of excluded middle asserting there are only two truth-values, true and false.
While formalisms are offering new systems to be interpreted, process metaphysics offers new opportunities for interpretation. For example, Peirce’s semiotics has three main categories known as firstness, secondness and thirdness. According to Floyd Merrell, firstness violates the principle of non-contradiction while thirdness violates the principle of excluded middle. Thus traditional logic does not adequately interpret Peirce’s semiotics. We might now suggest for example, pure combinatory logic for the logic of firstness, plus a systematic study of adjoined illative systems in relation to the requirements of secondness and thirdness.
Finally, we should recognize that Peirce died in 1914 and Whitehead in 1947. Process metaphysics is today an active area of inquiry which, while profiting from the insights of both Peirce and Whitehead, has moved beyond them, and which has available, from other disciplines such as quantum physics and modern formalisms, restraints and opportunities never before available.
Thus the foundation of autognomics research is a wedding of process metaphysics and modern formalisms satisfying Bertrand Russell’s observation that a philosopher knows what he is talking about while a mathematician (formalist) knows precisely what he is saying.
The work and ideas proposed here have already caused us at Autognomics Institute to question much common wisdom. For example:
- One must not negotiate with terrorists.
- Bombing will diminish an enemy’s will to fight.
- Getting tough will solve the problems of crime, education, welfare, etc.
- Passing laws will solve problems.
- There are good guys and bad guys.
- A confrontation can be caused by one participant alone.
These ideas are destructive. They are also current beliefs behind policy. We are most anxious to move forward with this proposed project to provide grounds for policy analysis leading to constructive policies.
We want to invite scholars sympathetic to this proposal to join us in cyberspace and ultimately in planned conferences. We feel the urgency to move forward with this project of research and seek an immediate response from you and anyone you might consider right for this effort. We are seeking funding and will be putting this team together as soon as possible. Thank you for your consideration.
To learn more about the Autognomics Institute and some of our recommendations on social dilemmas, go to our website www.autognomics.org. We are also currently preparing papers addressing various issues. Please contact us.