Posts Tagged ‘Language’

About Morphology or How Alan Turing Made the Dream of Goethe Come True

Tuesday, November 17th, 2009

The Ancient Greeks believed that the images of waking life and dreams came from the same source, Morpheus (Μορφέας, Μορφεύς), “He who Shapes“.

The Science of the Shapes, Morphology, was created and named by Goethe in his botanical writings (“Zur Morphologie“, 1817).

Goethe used comparative anatomical methods to discover a primal plant form that would contain all the others: the Urpflanze. Goethe, being a Romantic Idealist, hoped that Morphology would Unify Science and Art.

The Urpflanze shows itself also in the Lungs and River Systems


“The Primal Plant is going to be the strangest creature in the world, which Nature herself shall envy me. With this model and the key to it, it will be possible to go on forever inventing plants and know that their existence is logical. Nature always plays, and from her play she produces her great variety. Had I the time in this brief span of life, I am confident I could extend it to all the realms of Nature, the whole realm.”

Goethe (wikipedia)


A hundred years later, in the 1920s, Goethe’s dream came true. Morphology moved outside Biology to other parts of Science through the works of D’Arcy Thompson (On Growth and Form), Oswald Spengler (his morphology of History, The Decline of the West), Carl O. Sauer (The Morphology of Landscape), Vladimir Propp (Morphology of the Folktale) and Alfred North Whitehead (Process and Reality).

Goethe observed nature and reflected on similar structures. He believed that there was something behind this similarity, an archetypal plant.

According to Goethe the archetypal plant was the leaf (“While walking in the Public Gardens of Palermo it came to me in a flash that in the organ of the plant which we are accustomed to call the leaf lies the true Proteus who can hide or reveal himself in all vegetal forms. From first to last the plant is nothing but leaf“).

Scientists now know why the leaf is the most important structure of the plant: it is a solar collector full of photosynthetic cells.

The energy of the Sun enables the leaves to transform water, gathered by the roots, and carbon dioxide, gathered out of the air, into sugar and oxygen. Plants are structures with many leaves, and these leaves shield one another from sunlight and water.

To solve this problem a plant has to optimize its structure to collect enough Sunlight and Water. The process of Optimization is not a Centrally Coordinated action. Every leaf tries to find the best place in the Sun on its own. This place determines the growth of the next level of branches and leaves.

Goethe observed a pattern and deduced a structure, the leaf, the Urpflanze. What Goethe really observed was not a Static Urpflanze but the Dynamic Process of the Branching of all kinds of leaves in all kinds of plants (Morpho-Genesis).

The leaves of the plant are not the main target of the morphogenesis of the plant. The visible External and the invisible Internal Forms or Organs are just some of the many solutions of an equation with many variables and constraints. The optimal solution is reached by experimenting (“Nature always plays”).

Many solutions fail but some survive (Survival of the Fittest). When a solution survives it is used as a Foundation to find new rules for more specific problems (Specialization). When the environment, the context, changes, old rules have to be replaced by new rules (a Paradigm Shift).

The Fractal Geometry of Nature


New mathematical paradigms in the field of Machines and Languages (Alan Turing, The Chemical Basis of Morphogenesis) and the Self-Referential Geometry of Nature (Benoît Mandelbrot, The Fractal Geometry of Nature) have stimulated further investigation in the Field of Morphology.

In 1931, in a monograph entitled On Formally Undecidable Propositions of Principia Mathematica and Related Systems, Gödel proved that no theory powerful enough to describe arithmetic can be both Self-Consistent and Complete. Gödel’s paper destroyed the ambition of the Mathematicians of that time to define one theory that explains everything.

In 1936 Alan Turing produced a paper entitled On Computable Numbers, with an Application to the Entscheidungsproblem. In this paper Alan Turing defined a Universal Machine, now called a Turing Machine. A Turing machine contains an infinite tape, over which it can move backwards and forwards, and a reading/writing device that changes the tape. The Turing Machine can simulate every other machine and so represents every Theory we can Imagine.
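The idea is small enough to sketch in code. Below is a toy Turing machine in Python; the machine, its states and its transition table are invented for illustration, and the tape is stored as a dictionary so it can grow without bound in both directions:

```python
# A minimal Turing machine: the transition table maps
# (state, symbol) -> (symbol to write, move direction, next state).
def run(tape, transitions, state="start", pos=0, max_steps=1000):
    tape = dict(enumerate(tape))          # sparse "infinite" tape
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(pos, "_")       # "_" is the blank symbol
        write, move, state = transitions[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# A machine that flips every bit and halts at the first blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run("1011", flip))  # 0100_
```

The transition table is the whole “Theory” here: changing the table changes what the machine computes, while the machinery of tape and head stays the same.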

Turing proved that the kinds of questions the machine can not solve are about its own Performance. The machine is Unable to Reflect about Itself. It needs another independent machine, an Observer or Monitor to do this.

With this machine Turing proved, in a very simple way, the Incompleteness and Undecidability Theorems of Gödel.


The ENIAC

During the Second World War Turing helped to Crack the Codes of the Germans. At that time the first computers were built (ENIAC, Colossus).

It was very difficult to Program a Computer. This problem was solved when Noam Chomsky defined the Theory of Formal Grammars in 1955 (The Logical Structure of Linguistic Theory).

When you want to define a Language you need two things, an Alphabet of symbols and Rules. The symbols are the End-Nodes (Terminals) of the Network of Possibilities that is produced when the Rules (Non-Terminals) are Applied. The Alphabet and the (Production- or Rewriting) rules are called a Formal Grammar.

If the Alphabet contains an “a” and a “p”, the rules S→AAP, A→“a” and P→“p” produce the result “aap”. Of course this system can be replaced by the simple rule S→“aap”. The output becomes an infinite string when one of the rules contains a Self-Reference. The rules A→a and S→AS produce an Infinite String of “a”s (“aaaaaaaaaaaaaaaaaa….”).

The system becomes more complicated when we also put terminals and multiple symbols on the Left Side of the rules. The System S→aBSc, S→abc, Ba→aB and Bb→bb produces strings like “abc”, “aabbcc” and “aaabbbccc”. In fact it produces all the strings a^n b^n c^n with n>0.
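The derivation of “aabbcc” from these four rules can be traced mechanically. The following Python sketch (a hypothetical rewriting helper, not part of any parser library) applies each rule once to the leftmost matching substring:

```python
def apply_once(s, left, right):
    """Rewrite the leftmost occurrence of `left` by `right`."""
    assert left in s, f"rule {left}->{right} is not applicable to {s!r}"
    return s.replace(left, right, 1)

# One possible derivation of "aabbcc" from the start symbol S.
s = "S"
for left, right in [("S", "aBSc"), ("S", "abc"), ("Ba", "aB"), ("Bb", "bb")]:
    s = apply_once(s, left, right)
    print(s)  # aBSc, aBabcc, aaBbcc, aabbcc
```

The rule Ba→aB is the interesting one: it lets a B “walk” rightwards past the a’s, which is exactly the kind of context-dependent move a Context Free grammar cannot express.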

The inventor of the theory of Formal Grammars, Chomsky, defined a Hierarchy of Languages. The most complex languages in his hierarchy are called Context-Sensitive (Context-Dependent) and Unrestricted. They represent complex networks of nodes.

A language where the left-hand side of each production rule consists of only a single nonterminal symbol is called a Context Free language. Context Free Languages are used to define Computer Languages. Context Free Languages are defined by a hierarchical structure of nodes. Human Languages are dependent on the context of the words that are spoken.

It is therefore impossible to describe a Human Language, Organisms, Organisations and Life Itself with a Context Free Computer Language.

Context Free Systems with very simple rule-systems produce natural and mathematical structures. The Lindenmayer System A → AB, B → A, in which the rules are applied to every symbol in parallel, models the Growth of Algae and generates the Fibonacci Numbers.
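A sketch of this system in Python (Lindenmayer’s original algae model); the lengths of the strings it produces are exactly the Fibonacci numbers:

```python
def lindenmayer(axiom, rules, steps):
    """Apply the rewriting rules to every symbol in parallel, `steps` times."""
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's algae model: A -> AB, B -> A.
rules = {"A": "AB", "B": "A"}
lengths = [len(lindenmayer("A", rules, n)) for n in range(8)]
print(lengths)  # [1, 2, 3, 5, 8, 13, 21, 34] -- the Fibonacci numbers
```

Note the parallel application: every symbol is rewritten in the same step, which is what distinguishes an L-system from the sequential derivations of an ordinary grammar.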

A Recognizer or Parser determines whether a given string is produced by a formal grammar. Parsers are used to check and translate a Program written in a Formal (Context Free) Language to the level of the Operating System of the Computer.

Regular and Context Free Grammars are easily recognized because the process of parsing is linear (causal, step by step). The structure of the language is a hierarchy.

The recognizer (called a Push-Down Machine) only needs a small memory, a stack, to do the bookkeeping.
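For the classic Context Free language a^n b^n the “small memory” can even be a single counter. A minimal recognizer in Python (an illustrative sketch, not a general parser):

```python
def recognize_anbn(s):
    """Accept strings of the form a^n b^n (n >= 1) using one counter as the stack."""
    stack = 0
    seen_b = False
    for ch in s:
        if ch == "a":
            if seen_b:           # an 'a' after a 'b' is never legal
                return False
            stack += 1           # push
        elif ch == "b":
            seen_b = True
            stack -= 1           # pop
            if stack < 0:        # more b's than a's so far
                return False
        else:
            return False
    return seen_b and stack == 0

print(recognize_anbn("aaabbb"))  # True
print(recognize_anbn("aab"))     # False
```

The recognizer reads the input once, step by step, and its memory never exceeds the nesting depth of the string, which is exactly the Push-Down behaviour described above.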

Context-Dependent Grammars (such as L-systems) and Unrestricted Grammars are difficult or, in practice, impossible to recognize because the parser needs a huge, sometimes Infinite, Memory or Infinite Time to complete its task.

To find the Context the Recognizer has to jump backwards and forwards through the infinite string to detect the pattern.

If the network loops the recognizer will Never Stop (“The Halting Problem“).

Turing proved that the Halting Problem is Undecidable. We will Never Know for Sure if an Unrestricted Grammar contains Loops.

The Rules and the Output of Unrestricted Grammars Change and never stop Changing. Our Reality is certainly Context Dependent and perhaps Unrestricted.

Parsing or Recognizing is similar to the process of Scientific Discovery. A theory, the Grammar of a Context-Free System (“aaaaaaaaaaa…”), is recognizable (testable) in Finite Time with a Finite Memory. Theories that are Context-Dependent or Unrestricted cannot be proved, although the Output of the Theory generates Our Observation of Nature. In this case we have to trust Practice and not Theory.


A 3D Cellular Automaton

In 2002 the Mathematician Stephen Wolfram wrote the book A New Kind of Science.

In this book he describes his long-term Experiments with his own Mathematical Program Mathematica. Wolfram defined a System to Generate and Experiment with Cellular Automata.
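A one-dimensional (“elementary”) cellular automaton of the kind Wolfram studied fits in a few lines. The sketch below runs Rule 30, one of his best-known rules, starting from a single live cell:

```python
def step(cells, rule=30):
    """One update of a 1-D elementary cellular automaton (wrap-around edges)."""
    n = len(cells)
    out = []
    for i in range(n):
        left, centre, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighbourhood = (left << 2) | (centre << 1) | right
        out.append((rule >> neighbourhood) & 1)  # look up the rule bit
    return out

# Start from a single live cell and print a few generations.
cells = [0] * 15
cells[7] = 1
for _ in range(6):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```

The rule number encodes the whole update table in eight bits, which is why there are exactly 256 elementary automata; Rule 30 is famous because such a tiny rule produces seemingly random patterns.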

Wolfram believes that the Science of the Future will be based on Trial and Error using Theory Generators (Genetic Algorithms). The big problem with Genetic Algorithms is that they generate patterns we are unable to understand. We cannot find Metaphors and Words to describe the Patterns in our Language System.

This problem was addressed by the famous Mathematician Leibniz, who called this the Principle of Sufficient Reason.

Leibniz believed that our Universe was based on Simple Understandable Rules that are capable of generating Highly Complex Systems.

It is now very clear that the Self-Referential Structures, the Fractals, of Mandelbrot are the solution to this problem.

The Scientific Quest at this moment is to find the most simple Fractal Structure that is capable of explaining the Complexity of our Universe. It looks like this fractal has a lot to do with the Number 3.

It is sometimes impossible to define a structured process to recognize (to prove) a Grammar. Therefore it is impossible to detect the rules of Mother Nature by a Structured process. The rules of Mother Nature are detected by Chance, just as Goethe discovered the Urpflanze. Science looks a lot like Mother Nature Herself.

When a Grammar is detected it is possible to use this grammar as a Foundation to find new solutions for more specific problems (Specialization, Add More Rules) or when the system is not able to respond to its environment it has to Change the Rules (a Paradigm Shift). All the time the result of the System has to be compared with Mother Nature herself (Recognizing, Testing, Verification).

Turing proved that if Nature is equivalent to a Turing machine then we, as parts of this machine, cannot generate a complete description of its functioning.

In other words, a Turing machine, a Scientific Theory, can be a very useful tool to help humans design another, improved Turing Machine, a new Theory, but it is not capable of doing so on its own. A Scientific Theory, a System, cannot answer Questions about Itself.

The solution to this problem is to Cooperate. Two or more (Human) Machines, a Group, are able to Reflect on Each Other. When a new solution is found, the members of the Group have to Adapt to the new solution to move on to a New Level of Understanding and drop their own Egoistic Theory.

Each of the individuals has to alter his Own Self and Adapt it to that of the Group. It has been shown that Bacteria use this Strategy, which is why they are unbeatable by our tactics to destroy them.

Turing proved that Intelligence requires Learning, which in turn requires the Human Machine to have sufficient Flexibility, including Self Alteration capabilities. It is further implied that the (Human) Machine should have the Freedom to make Mistakes.

Perfect Human Machines will never Detect the Patterns of Nature because they get Stuck in their Own Theory of Life.

The Patterns of Turing


The Only ONE who is able to Reflect on the Morphogenesis of Mother Nature is the Creator of the Creator of Mother Nature, The Void.

Gregory Chaitin used the theory of Chomsky and proved that we will never be able to understand The Void.

The Void is beyond our Limits of Reason. Therefore the first step in Creation will always be a Mystery.

At the end of his life (he committed suicide in 1954) Alan Turing started to investigate Morphology.

As you can see, the Patterns of Alan Turing are created by combining many Triangles. The Triangle is called the Trinity in the Ancient Sciences.

According to the Tao Te Ching, “The Tao produced One; One produced Two; Two produced Three; Three produced All things”, which means that the Trinity is the Basic Fractal Pattern of the Universe.

In modern Science this pattern is called the Bronze Mean.

It generates so-called Quasicrystals and the Famous Penrose Tilings.
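In mathematics the Bronze Mean is usually defined as the positive root of x² = 3x + 1, and, like the Golden Mean for the Fibonacci numbers, it is the limiting ratio of a simple recurrence. A small check in Python:

```python
import math

# The bronze mean: the positive root of x**2 = 3*x + 1.
bronze = (3 + math.sqrt(13)) / 2

# It is the limiting ratio of the recurrence a(n) = 3*a(n-1) + a(n-2),
# just as the golden mean is for the Fibonacci recurrence.
a, b = 1, 3
for _ in range(20):
    a, b = b, 3 * b + a

print(round(bronze, 6))  # 3.302776
print(round(b / a, 6))   # 3.302776
```

The “3” in the defining equation is the same 3 the text keeps returning to: each term of the recurrence is built from three copies of its predecessor plus one remainder.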

The Bronze Mean is represented by the Ancient Structure of the Sri Yantra (“Divine Machine”).

Goethe was not the real discoverer of Morphology. The knowledge was already there 8000 years ago.


About the Observer and Second Order Cybernetics

A PDF About the Morphology of Music.

The origins of life and context-dependent languages

A Website About the Morphology of Botanic Systems

A Website About the Morphology of Architectural Systems

A Plant Simulator using Morphology

About Intelligent Design

The Mathematical Proof of Gödel of the Existence of God

About Bacteria 

About the Bronze Mean

About the Trinity

About the Nine Spiritual Bodies of Ancient Egypt

Saturday, August 1st, 2009

The last examples of ancient Egyptian Hieroglyphs date from 450 AD. After that time the Egyptian script and language were replaced by the Coptic script and language, which in turn were replaced by the Arabic language in the 11th century.

From that time on nobody was able to understand the Hieroglyphs. At this moment a completely new interpretation of the Hieroglyphs is emerging, showing Egypt as a highly technically advanced culture.

When scientists started to interpret the hieroglyphs they believed the hieroglyphs were nothing more than primitive picture writing. Much later they discovered that the hieroglyphic script is Phonetic just like our own language. The Pictures represent an Alphabet.

The Puzzle of the Alphabet of the hieroglyphs was solved by Jean-François Champollion who had a profound knowledge of the old Coptic language still used in the Coptic Church.

When the Egyptian Alphabet was finally deciphered the scientists could read the texts but they had no idea what the texts meant. To understand the language they used a theory.

In line with the theory of a primitive pictorial language, they supposed that the Egyptian Texts were written by Primitive Pagans to describe Religious Rituals. The Scientists translated the texts literally, just like current translation software does.

Linguistics, the scientific study of natural language, has evolved a lot since the first scientists started to translate the hieroglyphs. We now know that the deep structure of languages is related to almost Universal Metaphors. Texts always contain a Hidden Meaning.

The knowledge of the old Cultures of the East has greatly advanced since the time of Champollion (1832). We now know much more about ancient Egyptian Mathematics and Physics, and our own knowledge of Mathematics and Physics has also greatly advanced.

This makes it possible to see and understand things we were not able to See 150 years ago.

Scranton and Harvey have developed a completely different and controversial interpretation of the Hieroglyphs. They are convinced that most of the Ancient Texts are not about Rituals but about Physical Practices based on Advanced Knowledge of the Physical Universe.

Scranton researched the Symbols of the Dogon in West Africa. The Dogon Culture, just like the Coptic Language, still exists, and it was possible to interview one of the Shamans and ask him to explain their symbols.

These interviews took place around 1930, and the researchers Griaule and Dieterlen documented them very well. They showed that the Symbols of the Dogon and the Symbols of the old Egyptians are highly comparable. Scranton, a computer expert, analyzed the Symbols and found a striking resemblance with current String Theory.

Clesson Harvey has discovered a new way to translate the Pyramid Texts. The Pyramid Texts are the oldest known “religious” texts in the world. They date from between 2400 and 2300 BC. According to Harvey the Pyramid Texts have nothing to do with religion. He believes the Pyramid Texts are a Technical Manual to Operate a very Powerful Machine.

The conclusions of Harvey and Scranton are the same. The Egyptians were using highly advanced knowledge about the Physics of the Universe. This knowledge was used to build powerful machines that were controlled with the Mind.

About the Cycles

Egyptian Theory is based on Cycles. The Old Scientists knew that every Thing in the Physical Universe is governed by Cycles. This is the principle of Ma’at.

They knew that one of the most important cycles is the Cycle of Precession. This Cycle takes 25,771.5 years.

Every time the Signs of the Zodiac moved in the Sky, the Egyptian Religion was adjusted to the new Sign (Taurus, Ram, Pisces, Aquarius …). They also knew that the Pole Star moves with the Precession. At this moment the Pole Star is Polaris.

Clesson Harvey discovered that the Pyramid Texts are focused on Polaris, which means that the Knowledge of the Pyramid Texts is at least 25,771.5 years old.

About the Nine Spiritual Bodies

The Egyptians were aware of the Spiritual Bodies of the Human Being. This knowledge is still not accepted in Mainstream Psychology. Mainstream Psychologists call everything that is outside their Theory ab-normal or Para-Normal.

Some Psychologists Accept the Para-Normal and are trying to explain Phenomena like Out of the Body Experiences and Near-Death Experiences. When they do this they often make use of theories that were created in China, India and Egypt a long time ago. These old theories show a remarkable resemblance.

The Egyptians knew that the Universe was a Self-Referential Structure, a Fractal. They knew that one principle repeats itself on every Level. The Structure of the Universe is repeated in the Human Body and the Human Cells. If the Universe is a Nine-Fold pattern, the Spiritual Body also has to be a Nine-Fold Pattern.

The Nine Bodies were named Ren (the Name of the Body, the Chemical Structure, the Strong Force), Ab (the Heart, the Center, the Consciousness, The Center of Gravity), Akh (Shiner, The Luminous Body, the Force of Decay, Radiation, the Weak Force), Khaibit (Shadow, Aura, the Electro Magnetic Field of the Body), Ka, Ba , Khat, Sahu and Sekhem.

A person’s Ren (his DNA, his Blueprint, Date/Time & Place of Birth) was given to him at birth and would live for as long as that name was spoken. A cartouche (a magical rope) was often used to surround the name and protect it for eternity. The knowledge of the True Name could destroy a man. If somebody knew the Date/Time and Place of Birth of a Person, he could, with the Use of Astrology, find out everything he wanted to know.

It was also believed that if a man knew the name of a god or a demon, and addressed him by it, he was bound to answer him and to do whatever he wished.

The Scientists still don’t understand the concepts of Ka, Ba, Khat and Sekhem.

The concept of the Sahu is well known in many old cultures. In Tibetan mysticism the Sahu is called the Dharmakaya (“the Truth Body”). In the Christian Gnostic tradition it is called “the Resurrection Body”. In Sufism it is called “the Most Sacred Body” (wujud al-aqdas) and the “Supra-Celestial Body” (asli haqiqi), and in Taoism it is called “the Diamond Body”. Those who have attained it are called “the Immortals” and “the Skywalkers” (the Djedi).

The Sahu is an InterStellar Space-Ship steered by the Soul (Ba). It accompanies the human being in its endless cycle of birth, death, transformation, and rebirth. The Sahu contains the Projected Personalities (Ren) in the Matrix, the Memories (Akasha), the spiritual Aims and Purposes of every Incarnation of the Soul in every part of the Multi-Verse.

If we understand the Nine-Fold Pattern it is not so difficult to map the first four Spiritual Bodies (Ren, Ab, Akh, Khaibit) to the Four Forces or Four Elements of our Physical Universe.

The next five Bodies are very different. They are related to the Egyptian Underworld governed by Osiris. The Egyptian Underworld is a representation of our Seven Twin Universes associated with what the current scientists call Dark Matter.

The Ka (Dark Matter Chemical Body), the Ba (Dark Matter Heart, the Center, Consciousness, Soul), the Khat (Dark Matter Etheric Body) and the Sahu (Dark Matter Light Body) exist in a different Space/Time.

The Twin Universes are only reachable by the Portal of the Sekhem, the Singularity, the Primal Void, the Hole in Time. The “Dark Matter” Bodies can be controlled when the Chemical Body (Ego, Personality, Ka) and Consciousness (Ab, the Heart) merge. At that moment they are able to use the Supra-Celestial Body, the Sahu, the Vessel.

About Timing and Time Travel

The Portal of Sekhem is only open during a short period of time at a special place on the Earth Grid. This moment appears when we move from the Sign of Pisces (the Fishes) to the Sign of Aquarius (NOW!).

When we want to operate the Time-Machine we have to Breathe in our Sahu, the indestructible Container of the Human Soul, our Light Body. To use our Sahu we have to Clean Ourselves (Live from the Heart, Ab), Meditate (Become the Observer) and Practice the Breathing Techniques of the Old Scientists.

According to Clesson Harvey we are living in a very special Time/Space. It is the Time/Space that opens up the possibility to use the Tremendous Power behind the Dark Matter. This power makes it possible to play with the Laws of Gravity. We are able to lift Heavy Masses (Telekinesis). We are also able to move to another Time/Space in the Multi-Universe (The Underworld).

The only thing that is still missing to operate the Machine is a Magic Crystal, the Stone of Destiny, the BnBn-Stone. One of the (many) theories is that this Crystal was stolen out of the Pyramid by Moses during the Reign of Pharaoh Akhenaten, put into the Ark of the Covenant, moved to Chartres by the Knights Templar, taken over by the Cathars and finally put into a Secret Cave close to Lourdes.


About the BnBn-Stone

About the Nine-Fold Pattern in China

About the Egyptian Underworld and Dark Matter

About the Spiritual Bodies of Egypt by Dr. Janet Cunningham

About the Nine-Fold Pattern of the Spiritual Bodies

About the Theory of Clesson Harvey

The Website of Clesson Harvey

Videos of the theory of Scranton

About (Software) Quality

Tuesday, January 20th, 2009

When I attended the University of Leiden, Software-Development was in its infancy. In 1969 just a few people were programming, for the simple reason that the number of computers was very small. It took a lot of time (many weeks), intelligence and perseverance to create a small working software-program.

At that time the effect of a software-program on other people was very small. Software-programs were used by the programmers themselves to solve their own problems.

When User-Interfaces, Databases and Telecommunication appeared it became possible to create software for Many Non-Programmers, Users. The software-systems got bigger and programmers had to cooperate with other programmers.

When the step from One-to-Many was made in the process of software-development and exploitation, Software-Quality became a very important issue.

What is Software?

A Software-program is a sequence of sentences written in a computer-language. When you speak and write you use a natural language. When you write a computer program you use an artificial, designed, language.

The difference between natural and artificial languages is small. Esperanto is a constructed language that became a natural language. Perhaps all the natural languages were constructed in the past.

Software programs are very detailed prescriptions of something a computer has to do. The specifications of a software-program are written in a natural language (Pseudo-Code, Use-Case).

To create Software we have to transform Natural Language into Structured Language. The big problem is that Natural Language is a Rich Language. It not only contains Structural components but also Emotional (Values), Imaginative ((Visual) Metaphors) and Sensual (Facts) Components. The most expressive form of human language is Speech.

In speech the Tonality of the Voice and the Body Language also contain a lot of information about the Sender. When you want to create Software you have to remove the Emotional, Imaginative and Sensual components from Human Language.

What is Quality?

According to the International Organization for Standardization (ISO), Quality is “the degree to which a set of inherent characteristics fulfills requirements”. According to the ISO, the quality of a software-program is the degree to which the software-coding is in agreement with its specification.

Because a specification is written in natural language, Quality has to do with the precision of the transformation of one language (the natural) to another language (the constructed).

According to Six Sigma, Quality is measured by the number of defects in an implementation of the specification of the software.
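Six Sigma makes this countable through its standard DPMO measure (defects per million opportunities). A sketch in Python, with made-up figures for illustration:

```python
def dpmo(defects, units, opportunities_per_unit):
    """Defects Per Million Opportunities, the basic Six Sigma metric."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# Hypothetical example: 25 defects found in 1,000 programs,
# each program offering 50 opportunities for a defect.
print(dpmo(25, 1000, 50))  # 500.0
```

The DPMO value is then mapped onto a sigma level; the point here is simply that the Six Sigma view reduces Quality to something you can count.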

Another view on Quality is called Fitness for Use. In this case Quality is “what the Customer wants” or “what the Customer is willing to pay for”.

If you look carefully at all the Views on Quality, the Four World Views of Will McWhinney appear.

Six Sigma is the Sensory View on Quality (Facts), ISO is the Unity View on Quality (Procedures, Laws, Rules) and Fitness for Use is the Social View on Quality (Stakeholders).

The last worldview of McWhinney, the Mythic, the View of the Artist, is represented by the Aesthetical view on Quality. Something is of high quality when it is Beautiful.

The Four Perspectives of McWhinney look at something we name “Quality”. We can specify the concept “Quality” by combining the Four definitions or we can try to find out what is behind “the Four Views on Quality”.

The Architect Christopher Alexander wrote many books about Quality. Interestingly enough, he named the “Quality” behind the Four Perspectives the “Quality without a Name”. Later in his life he defined this Quality as the “Force of Life”.

What Happened?

In the beginning of software-development the Artists, the Mythics, created software. Creating high quality software was a craft and a real challenge. To create, a programmer had to overcome a high resistance.

The “creative” programmers solved many problems and shared their solutions. Software-development changed from an Art into a Practice. The Many Different Practices were Standardized and United into one Method. The Method made it possible for many people to “learn the trade of programming”.

When an Art turns into a Method, the Aesthetic, the Quality that Has No Name, Life Itself, disappears. The Controller, Quality Management (ISO), has tried to solve this problem and has given many names to the Quality without a Name. Many Aspects of Software Quality are now standardized and programmed into software.


It is impossible to Program the Social Emotions and the Mythic Imagination.


Software developers don’t use Methods and Standards because deep within they are Artists. The big difference is that they don’t solve their own problems anymore. They solve the problems of the users that are interviewed by the designers.


The Users don’t want the Designers to tell the Programmers to create something they want to create themselves (the Not-Invented-Here Syndrome). They also don’t know what the programmers, instructed by the designers, will create, so they wait until the programmers are finished and then tell them that they want something else.

What Went Wrong?

The first Computer, the Analytical Engine of Charles Babbage, contained four parts, called the Mill (the Central Processing Unit, the Operating System), the Store (the database), the Reader, and the Printer. The Analytical Engine and its successors were based on the Concept of the Factory. In a Factory the Users, the Workers, the Slaves, have to do what the Masters, the Programmers, tell them to do.

A part of the Analytic Engine of Charles Babbage

The Scientists modeled successful programmers but they forgot to model one thing, the Context. At the time the old fashioned programming artists were active, software was made to support the programmer himself. The programmer was the User of his Own software-program.

At this moment the Factory is an “old-fashioned” concept. In the Fifties the Slaves started to transform into Individuals but the Factory-Computer and the Practices of the Old Fashioned Programmers were not abandoned.

To cope with the rising power of the Individual the old methods were adapted, but the old paradigm of the Slave was not removed. The Slave became a Stakeholder, but his main role is to act Emotionally. He has the power to “Like or Dislike” and “To Buy or not to Buy”.

The big Mistake was to believe that it is possible to program Individuals.

What To Do?

The Four Worldviews of Quality Move Around Life Itself.

According to Mikhail Bakhtin Life Itself is destroyed by the Process of Coding (“A code is a deliberately established, killed context“).

When you want to make software you have to keep Life Alive.

The Paradigm-Shift you have to make is not very difficult. Individual Programmers want to make Software for Themselves so Individual Users want to Do the Same!

At this moment the Computer is not a tool to manage a factory anymore. It has become a Personal tool.

It is not very difficult to give individuals that Play the Role of Employee tools to Solve their own Problems.

When they have solved their own problems they will Share the Solutions with other users.

If this happens their activities will change from an Individual Act of Creation into a Shared Practice.

If People Share their Problems and Solutions, their Joy and their Sorrow, they Experience the Spirit,  the Force of Life, the Quality that has no Name.


How to Analyze a Context

About the Human Measure

About the Autistic Computer

About the Worldviews of Will McWhinney

About Christopher Alexander

About Computer Languages

About Mikhail Bakhtin

About Ontologies

About Model Driven Software Development

About the Illusion of Cooperation

About the Analytic Engine of Charles Babbage

About the Analytic Engine of Thomas Fowler

About Human Scale Tools

About Old-Fashioned Programming

How to Make Sure that Everybody Believes what We are Believing: About Web 3.0

Thursday, December 20th, 2007

This morning I discovered a new term Web 3.0. According to the experts Web 2.0 is about “connecting people” and Web 3.0 is about “connecting systems”.

Web 1.0 is the “good old Internet”. The “good old Internet” was created by the US Department of Defense to ensure that people and systems would stay connected in a state of War with the Russians. Later Tim Berners-Lee and the W3C added a new feature, “hypertext”, connecting documents by reference.

As you see, everybody is all the time talking about connecting something to something. In the first phase we connected “systems”. Later we connected “people”. Now we want to connect “systems” again. We are repeating the goal, but for some reason we never reach it.

In every stage of the development of our IT-technology we are connecting people, software (dynamics) and documents (statics), and we reuse the same solutions all over again.

Could it be that the reused solutions are not the “real” solutions? Do we have to look elsewhere? Perhaps we don’t want to look elsewhere because we love to repeat the same failures all over again and again. If everything is perfect we just don’t know what to do!

There is an article about Web 3.0 in Wikipedia.

Two subjects are shaping Web 3.0. The first is the Semantic Web and the other is the use of Artificial Intelligence, Data- & Text Mining to detect interesting Patterns.

The Semantic Web wants to make it possible to Reason with Data on the Internet. It uses Logic to do this. The Semantic Web wants to standardize Meaning.
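What “reasoning with data” means can be shown with a toy sketch: facts are stored as (subject, predicate, object) triples, and a logical rule derives new facts that nobody entered. The place names and the predicate below are invented for the example; real Semantic Web stacks use RDF and OWL rather than Python sets, but the principle is the same.

```python
# Toy triple store: facts are (subject, predicate, object) triples,
# and logic derives new triples from the ones we already have.
facts = {
    ("Amsterdam", "locatedIn", "Netherlands"),
    ("Netherlands", "locatedIn", "Europe"),
}

def infer_transitive(triples, predicate):
    """Close one predicate under transitivity, the way OWL treats a
    property declared owl:TransitiveProperty."""
    closed = set(triples)
    changed = True
    while changed:
        changed = False
        for s, p, o in list(closed):
            if p != predicate:
                continue
            for s2, p2, o2 in list(closed):
                if p2 == predicate and s2 == o and (s, p, o2) not in closed:
                    closed.add((s, p, o2))  # derive a new fact
                    changed = True
    return closed

inferred = infer_transitive(facts, "locatedIn")
print(("Amsterdam", "locatedIn", "Europe") in inferred)  # True
```

The derived triple (“Amsterdam is located in Europe”) was never stated; the Logic produced it from the standardized Meaning of the predicate.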

The Semantic Web uses an “old-fashioned” paradigm of Reasoning and Language: it supposes that human language is context-independent. Its designers have to suppose this, because otherwise they would be unable to define a Computer Language (OWL) at all.

It is widely accepted that the interpretation of Language depends on a Situation, a Culture and the Genesis of the Human itself. Every Human is a Unique Creation. A Human is certainly not a Robot.

A widespread implementation of the Semantic Web will lead to a World Wide Standardization of Meaning based on the English Language. The Western Way of Thinking will finally become fixed and dominant.

The Semantic Web will increase the use of the Conduit Metaphor. The Conduit Metaphor has infected the English Language on a large scale. The Conduit Metaphor supposes that Humans are disconnected Objects. The disconnected Sender (an Object) is throwing Meaning (Fixed Structures, Objects) at the disconnected Receiver (An Object).

The Conduit Metaphor blocks the development of shared meaning (Dialogue) and Innovation (Flow). The strange effect of Web 3.0 will be a further disconnection. I think you understand now why we have to start all over again and again to connect people, software and content.

Why Good programmers have to be Good Listeners

Friday, June 29th, 2007

Edsger Wybe Dijkstra (1930-2002) was a Dutch Computer Scientist. He received the 1972 Turing Award for fundamental contributions in the area of programming languages.

One of the famous statements of Dijkstra is “Besides a mathematical inclination, an exceptionally good mastery of one’s native tongue is the most vital asset of a competent programmer“.

Why is this so important?

People communicate externally and internally (!) in their native tongue. If they use another language, many of the nuances of the communication are lost. When people with different native languages communicate, they have to translate the communication into their internal language.

A computer language is also a language, but it is a language from which every nuance is gone. With the term nuance (I am a Dutch native speaker) I mean something that could also be translated as meaning. A computer language is formal and human communication is informal. We communicate much more than we are aware of when we speak.

So Programming is a Transformation of the Human Domain of Meaning to the Machine-Domain of Structure.

A programmer with a mathematical inclination (being analytical) AND an exceptionally good mastery of his or her native language is the only one who can build a bridge between the two worlds.

When he (or she, women are better at this!!!) is doing this, he knows he is throwing away a lot of value, but that is the consequence of IT. Machines are not humans (People that are Mad act like Machines).

Machines are very good at repetition. Humans don’t like repetition, so Machines and Humans are able to create a very useful complementary relationship.

The person who understood this very well was Sjir Nijssen. Together with many others he developed something called NIAM. NIAM has generated many dialects: ORM, FORM, RIDDLE, FCO-IM and DEMO. The basic idea of all these methods is to analyze human communication in terms of the sentences we speak. They take the verbs and the nouns (and of course the numbers) out of a sentence and create a semantic model of the so-called Universe of Discourse.
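The idea of pulling a fact type out of a sentence can be sketched in a few lines. This is my own toy illustration, not Nijssen’s actual NIAM tooling: the object instances are marked with quotes for simplicity, the nouns around them become the roles, and the remaining words form the predicate of the fact type.

```python
import re

def parse_fact(sentence):
    """Split an elementary fact sentence into its object instances
    (the quoted values) and its predicate (the sentence with each
    value replaced by a placeholder)."""
    objects = re.findall(r"'([^']+)'", sentence)
    predicate = re.sub(r"'[^']*'", "<object>", sentence)
    return predicate, objects

predicate, objects = parse_fact(
    "The Employee 'Smith' works for the Department 'Sales'."
)
print(predicate)  # The Employee <object> works for the Department <object>.
print(objects)    # ['Smith', 'Sales']
```

The predicate (“Employee works for Department”) is the reusable fact type; the quoted values are the facts that fill its roles.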

What Nijssen understood was that a computer is able to register FACTS (reality we don’t argue about anymore) and that facts are stored in a database. If we all agree about the facts, we can use the facts to start reasoning. Want to know more about reasoning? Have a look at this website.
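Registering facts and reasoning over them is exactly what a relational database does. A minimal sketch (table and column names invented for the example): the agreed-upon facts go into a table, and a query derives an answer nobody typed in.

```python
import sqlite3

# Register the agreed-upon facts in a database table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE works_for (employee TEXT, department TEXT)")
con.executemany(
    "INSERT INTO works_for VALUES (?, ?)",
    [("Smith", "Sales"), ("Jones", "Sales"), ("Brown", "IT")],
)

# Once the facts are registered, a query can reason over them:
rows = con.execute(
    "SELECT department, COUNT(*) FROM works_for "
    "GROUP BY department ORDER BY department"
).fetchall()
print(rows)  # [('IT', 1), ('Sales', 2)]
```

The head counts per department were never entered as facts; they follow from the facts we agreed on.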

To create a program that supports the user, a good programmer has to be a good listener and a highly skilled observer. Users are mostly not aware of their Universe of Discourse. They are immersed in their environment (their CONTEXT). Many techniques have been developed to help the observer recreate the context without killing it (Bakhtin). Have a look at User-Centered Design to find out more about this subject.

Want to read more about Dijkstra? Read The Lost Construct.