Posts Tagged ‘Language’

About Morphology or How Alan Turing Made the Dream of Goethe Come True

Tuesday, November 17th, 2009

The Ancient Greeks believed that the images of waking life and dreams came from the same source, Morpheus (Μορφέας, Μορφεύς), “He who Shapes“.

The Science of the Shapes, Morphology, was created and named by Goethe in his botanical writings (“Zur Morphologie“, 1817).

Goethe used comparative anatomical methods to discover a primal plant form that would contain all the others – the Urpflanze. Goethe, being a Romantic Idealist, hoped that Morphology would Unify Science and Art.

The Urpflanze shows itself also in the Lungs and River systems

“The Primal Plant is going to be the strangest creature in the world, which Nature herself shall envy me. With this model and the key to it, it will be possible to go on forever inventing plants and know that their existence is logical”. “Nature always plays, and from this playing produces her great variety. Had I the time in this brief span of life, I am confident I could extend it to all the realms of Nature – the whole realm”.

Goethe (Wikipedia)

A hundred years later, in the 1920s, Goethe’s dream came true. Morphology moved outside Biology to other parts of Science through works such as D’Arcy Thompson’s On Growth and Form, Oswald Spengler’s The Decline of the West (a morphology of history), Carl O. Sauer’s The Morphology of Landscape, Vladimir Propp’s Morphology of the Folktale and Alfred North Whitehead’s Process and Reality.

Goethe observed nature and reflected on similar structures. He believed that there was something behind this similarity, an archetypal plant.

According to Goethe the archetypal plant was the leaf (“While walking in the Public Gardens of Palermo it came to me in a flash that in the organ of the plant which we are accustomed to call the leaf lies the true Proteus who can hide or reveal himself in all vegetal forms. From first to last the plant is nothing but leaf“).

Today scientists know why the leaf is the most important structure of the plant: it is a solar collector full of photosynthetic cells.

The energy of the Sun makes it possible to transform the water gathered by the roots and the carbon dioxide gathered from the air by the leaves into sugar and oxygen. Plants are structures with many leaves, and these leaves shield one another from the sunlight and water they need to collect.

To solve this problem a plant has to optimize its structure to collect enough Sunlight and Water. The process of Optimization is not a Centrally Coordinated action. Every leaf tries to find the best place in the Sun on its own. This place determines the growth of the next level of branches and leaves.

Goethe observed a pattern and deduced a structure, the leaf, the Urpflanze. What Goethe really observed was not a Static Urpflanze but the Dynamic Process of the Branching of all kinds of leaves in all kinds of plants (Morpho-Genesis).

The leaves of the plant are not the main target of the morphogenesis of the plant. The visible External and the invisible Internal Forms or Organs are just some of the many solutions of an equation with many variables and constraints. The optimal solution is reached by experimenting (“Nature always plays”).

Many solutions fail but some survive (Survival of the Fittest). When a solution survives it is used as a Foundation to find new rules for more specific problems (Specialization). When the environment, the context, changes, old rules have to be replaced by new rules (a Paradigm Shift).

The Fractal Geometry of Nature

New mathematical paradigms in the field of Machines and Languages (Alan Turing, The Chemical Basis of Morphogenesis) and the Self-Referential Geometry of Nature (Benoît Mandelbrot, The Fractal Geometry of Nature) have stimulated further investigation in the Field of Morphology.

In 1931, in a paper entitled On Formally Undecidable Propositions of Principia Mathematica and Related Systems, Gödel proved that it is impossible to define a theory of arithmetic that is both Self-Consistent and Complete. Gödel’s paper destroyed the ambition of the Mathematicians of that time to define one theory that explains everything.

In 1936 Alan Turing produced a paper entitled On Computable Numbers, with an Application to the Entscheidungsproblem. In this paper he defined a Universal Machine, now called a Turing Machine. A Turing Machine contains an infinite tape that can move backwards and forwards and a reading/writing device that changes the tape. The Turing Machine can carry out every computation, and therefore represent every formal Theory, we can Imagine.

Turing proved that the kinds of questions the machine cannot answer are questions about its own Behaviour. The machine is Unable to Reflect on Itself. It needs another, independent machine, an Observer or Monitor, to do this.

In this way Turing re-proved the Incompleteness Theorem and the Undecidability results of Gödel in a very simple way.

The ENIAC

During the Second World War Turing helped to Crack the Codes of the Germans. At that time the first computers were built (ENIAC, Colossus).

It was very difficult to Program a Computer. This problem was eased when Noam Chomsky defined the Theory of Formal Grammars in 1955 (The Logical Structure of Linguistic Theory).

When you want to define a Language you need two things: an Alphabet of symbols and Rules. The symbols are the End-Nodes (Terminals) of the Network of Possibilities that is produced when the Rules (Non-Terminals) are Applied. The Alphabet and the (Production or Rewriting) Rules together are called a Formal Grammar.

If the Alphabet contains an “a” and a “p”, the rules S→AAP, A→“a” and P→“p” produce the result “aap”. Of course this system can be replaced by the simple rule S→“aap”. The output becomes an infinite string when one of the rules contains a Self-Reference. The rules A→“a” and S→AS produce an Infinite String of “a”s (“aaaaaaaaaaaaaaaaaa….”).
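These two grammars are small enough to run. The following is a minimal Python sketch (not from the original post) of a rewriter that repeatedly replaces the first non-terminal it finds; the step bound is an assumption added so the self-referential grammar terminates:

```python
# A tiny rewriting demo of the grammars described above.
# Non-terminals are the keys of `rules`; everything else is a terminal.

def derive(start, rules, max_steps=50):
    """Repeatedly replace the first non-terminal using `rules`."""
    s = start
    for _ in range(max_steps):
        for i, ch in enumerate(s):
            if ch in rules:                      # found a non-terminal
                s = s[:i] + rules[ch] + s[i + 1:]  # rewrite it
                break
        else:
            return s                             # only terminals left: done
    return s  # truncated: a self-referential grammar never stops on its own

print(derive("S", {"S": "AAP", "A": "a", "P": "p"}))   # -> aap
print(derive("S", {"S": "AS", "A": "a"}, max_steps=12))  # -> aaaaaaS (cut off)
```

The second call illustrates the point of the paragraph: without the bound, the derivation of S→AS would go on forever.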

The system becomes more complicated when we also put terminals and non-terminals together on the Left Side. The System S→aBSc, S→abc, Ba→aB and Bb→bb produces strings like “abc”, “aabbcc” and “aaabbbccc”. In fact it produces all the strings a^n b^n c^n with n>0.
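One can check this claim mechanically. Here is a hedged Python sketch (my own, not from the post) that explores every possible rewrite of these four rules up to a length bound and asks whether a target string is derivable:

```python
# Breadth-first search over all rewrites of the context-sensitive grammar
# S -> aBSc | abc, Ba -> aB, Bb -> bb (length-bounded so the search is finite).

RULES = [("S", "aBSc"), ("S", "abc"), ("Ba", "aB"), ("Bb", "bb")]

def derivable(target, start="S", max_len=10):
    seen, frontier = {start}, [start]
    while frontier:
        nxt = []
        for s in frontier:
            for lhs, rhs in RULES:
                i = s.find(lhs)
                while i != -1:                       # rewrite every occurrence
                    t = s[:i] + rhs + s[i + len(lhs):]
                    if len(t) <= max_len and t not in seen:
                        seen.add(t)
                        nxt.append(t)
                    i = s.find(lhs, i + 1)
        frontier = nxt
    return target in seen

print(derivable("abc"))      # True  (n = 1)
print(derivable("aabbcc"))   # True  (n = 2)
print(derivable("aabbc"))    # False (unbalanced)
```

The derivation found for “aabbcc” is exactly the one the rules suggest: S ⇒ aBSc ⇒ aBabcc ⇒ aaBbcc ⇒ aabbcc.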

The inventor of the theory of Formal Grammar, Chomsky, defined a Hierarchy of Languages. The most complex languages in his hierarchy are called Context-Dependent and Unrestricted. They represent complex networks of nodes.

A language where the left-hand side of each production rule consists of a single non-terminal symbol is called a Context Free language. Context Free Languages are used to define Computer Languages, and they are defined by a hierarchical structure of nodes. Human Languages depend on the context of the words that are spoken.

It is therefore impossible to describe a Human Language, Organisms, Organisations and Life Itself with a Context Free Computer Language.

Context Free Systems with very simple rule-systems produce natural and mathematical structures. The System A → AB, B → A models the Growth of Algae and the Fibonacci Numbers.
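This is Lindenmayer’s original algae system, and it is easy to reproduce. A small Python sketch (mine, not the post’s); note that, unlike the grammars above, an L-system rewrites all symbols in parallel:

```python
# Lindenmayer's algae L-system: A -> AB, B -> A.
# The lengths of the successive strings are the Fibonacci numbers.

def lsystem(axiom, rules, steps):
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)  # rewrite every symbol at once
    return s

RULES = {"A": "AB", "B": "A"}
for n in range(7):
    print(n, lsystem("A", RULES, n))
# string lengths: 1, 2, 3, 5, 8, 13, 21, ...
```

Each generation contains the previous two laid end to end, which is exactly why the Fibonacci numbers appear.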

A Recognizer or Parser determines whether a string was produced by a formal grammar. Parsers are used to check a Program written in a Formal (Context Free) Language and to translate it to the level of the Operating System of the Computer.

Regular and Context Free Grammars are easily recognized because the process of parsing is linear (causal, step by step). The structure of the language is a hierarchy.

The recognizer (called a Push-Down Automaton) needs only a small memory, a stack, to keep the books.
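The “small memory” is literally a stack. As an illustration (a sketch of mine, using the genuinely context-free language a^n b^n rather than the context-sensitive a^n b^n c^n above), a recognizer only has to push while reading a’s and pop while reading b’s:

```python
# A minimal push-down recognizer for the context-free language a^n b^n (n >= 1).
# The stack is the only memory the machine needs.

def accepts(s):
    stack = []
    seen_b = False
    for ch in s:
        if ch == "a" and not seen_b:
            stack.append(ch)        # push one marker per 'a'
        elif ch == "b" and stack:
            seen_b = True
            stack.pop()             # pop one marker per 'b'
        else:
            return False            # wrong symbol, wrong order, or too many b's
    return seen_b and not stack     # accepted iff every 'a' was matched

print(accepts("aabb"))   # True
print(accepts("aaabb"))  # False
```

No such stack discipline exists for a^n b^n c^n: that is precisely why it needs a context-dependent grammar.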

Context Dependent (e.g. L-system) and Unrestricted Grammars are difficult to recognize, or are not recognizable in practice, because the parser needs a huge, sometimes Infinite, Memory or Infinite Time to complete its task.

To find the Context the Recognizer has to jump backwards and forwards through the infinite string to detect the pattern.

If the network loops the recognizer will Never Stop (“The Halting Problem“).

Turing proved that the Halting Problem is Undecidable. We will Never Know for Sure if an Unrestricted Grammar contains Loops.

The Rules and the Output of Unrestricted Grammars Change and never stop Changing. Our Reality is certainly Context Dependent and perhaps Unrestricted.

Parsing or Recognizing resembles the process of Scientific Discovery. A theory, a Grammar of a Context-Free System (“aaaaaaaaaaa…”), is recognizable (testable) in Finite Time with a Finite Memory. Theories that are Context Dependent or Unrestricted cannot be proved, although the Output of the Theory generates Our Observation of Nature. In this case we have to trust Practice and not Theory.

A 3D Cellular Automaton

In 2002 the Mathematician Stephen Wolfram wrote the book A New Kind of Science.

In this book he tells about his long-term Experiments with his own Mathematical Program Mathematica. Wolfram defined a System to Generate and Experiment with Cellular Automata.

Wolfram believes that the Science of the Future will be based on Trial and Error using Theory Generators (Genetic Algorithms). The big problem with Genetic Algorithms is that they generate patterns we are unable to understand. We cannot find Metaphors and Words to describe the Patterns in our Language System.
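Wolfram’s elementary cellular automata show how little is needed to generate such patterns. A short Python sketch of Rule 30, one of his standard examples (the wrap-around edges and grid size are my assumptions, added only to keep the demo small):

```python
# Wolfram's elementary cellular automaton Rule 30: each cell's next state
# depends only on itself and its two neighbours (wrapping at the edges).

RULE = 30

def step(cells):
    n = len(cells)
    return [(RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

cells = [0] * 15
cells[7] = 1                      # a single live cell in the middle
for _ in range(8):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```

From one live cell and one eight-bit rule, the output grows into the irregular triangle Wolfram made famous: a pattern we can generate but struggle to describe in words.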

This problem was addressed by the famous Mathematician Leibniz, who called it the Principle of Sufficient Reason.

Leibniz believed that our Universe was based on Simple Understandable Rules that are capable of generating Highly Complex Systems.

It is now very clear that the Self-Referential Structures, the Fractals of Mandelbrot, are a solution to this problem.

The Scientific Quest at this moment is to find the most simple Fractal Structure that is capable of explaining the Complexity of our Universe. It looks like this fractal has a lot to do with the Number 3.

It is sometimes impossible to define a structured process to recognize (to prove) a Grammar. Therefore it is impossible to detect the rules of Mother Nature by a Structured process. The rules of Mother Nature are detected by Chance, just like Goethe discovered the Urpflanze. Science resembles Mother Nature Herself.

When a Grammar is detected it is possible to use this grammar as a Foundation to find new solutions for more specific problems (Specialization, Add More Rules) or when the system is not able to respond to its environment it has to Change the Rules (a Paradigm Shift). All the time the result of the System has to be compared with Mother Nature herself (Recognizing, Testing, Verification).

Turing proved that if Nature is equivalent to a Turing machine we, as parts of this machine, cannot generate a complete description of its functioning.

In other words: a Turing Machine, a Scientific Theory, can be a very useful tool to help humans design another, improved Turing Machine, a new Theory, but it is not capable of doing so on its own. A Scientific Theory, a System, cannot answer Questions about Itself.

The solution to this problem is to Cooperate. Two or more (Human) Machines, a Group, are able to Reflect on the Other. When a new solution is found, the members of the Group have to Adapt to it, drop their own Egoistic Theory, and move on to a New Level of Understanding.

Each of the individuals has to alter its Own Self and Adapt to that of the Group. It has been shown that Bacteria use this Strategy, which is why they are unbeatable by our tactics to destroy them.

Turing proved that Intelligence requires Learning, which in turn requires the Human Machine to have sufficient Flexibility, including Self Alteration capabilities. It is further implied that the (Human) Machine should have the Freedom to make Mistakes.

Perfect Human Machines will never Detect the Patterns of Nature because they get Stuck in their Own Theory of Life.

The Patterns of Turing

The Only ONE who is able to Reflect on the Morphogenesis of Mother Nature is the Creator of the Creator of Mother Nature, The Void.

Gregory Chaitin extended the theories of Gödel and Turing and argued that we will never be able to understand The Void.

The Void is beyond our Limits of Reason. Therefore the first step in Creation will always be a Mystery.

At the end of his life (he committed suicide in 1954) Alan Turing started to investigate Morphology.

As you can see, the Patterns of Alan Turing are created by combining many Triangles. The Triangle is called the Trinity in the Ancient Sciences.

According to the Tao Te Ching, “The Tao produced One; One produced Two; Two produced Three; Three produced All things”, which means that the Trinity is the Basic Fractal Pattern of the Universe.

In modern Science this pattern is called the Bronze Mean.

It generates so-called Quasi-Crystals and the famous Penrose Tilings.

The Bronze Mean is represented by the Ancient Structure of the Sri Yantra (“Divine Machine”).

Goethe was not the real discoverer of Morphology. The knowledge was already there 8000 years ago.

LINKS

About the Observer and Second Order Cybernetics

A PDF About the Morphology of Music.

The origins of life and context-dependent languages

A Website About the Morphology of Botanic Systems

A Website About the Morphology of Architectural Systems

A Plant Simulator using Morphology

About Intelligent Design

The Mathematical Proof of Gödel of the Existence of God

About Bacteria 

About the Bronze Mean

About the Trinity

About the Nine Spiritual Bodies of Ancient Egypt

Saturday, August 1st, 2009

Guide to Iconic Rolex Watch Collections

A Guide to Iconic Rolex Watches

Shopping for a luxury watch? Rolex is one of the world’s most well-known luxury watch brands, and you may have come to find that there are many collections to choose from.

 

Over the years, Rolex has released many different collections of watches, each with its own unique designs and features. From the Submariner to the Milgauss, each Rolex collection tells a story, embodies unique physical characteristics, and includes some stand-out models. This guide will give you a closer look at some of the most celebrated Rolex watches you can shop for, starting with the ever-iconic Submariner collection.


The Rolex Submariner

The Rolex Submariner watch was launched in 1953 and is known as the first divers’ wristwatch. It is waterproof to a depth of 1,000 feet. It is characterized by a graduated rotatable bezel, a luminescent display, and large hands and hour markers for optimum performance and visibility underwater. The bezel insert, manufactured by Rolex from a hard, corrosion-resistant ceramic, has a special chemical composition that cannot corrode. Today, the Submariner remains an iconic timepiece enjoyed by both diving professionals and everyday watch enthusiasts. Standout models include the 50th anniversary “Kermit” watch and the “Hulk.” Read more about special edition Rolex watches here.


The Rolex Datejust

Known by Rolex as “the watch for the dates to remember,” the Rolex Datejust was created in 1945 and symbolizes Rolex’s definition of elegance. It is known as the first self-winding waterproof chronometer wristwatch to feature a window displaying the date (hence the name). The date is magnified by a cyclops lens, offering comfort, legibility, and daily time management. It’s an ideal timepiece for the daily active watch wearer. Rolex continuously releases new Datejusts to perfectly adapt to all the personalities of its wearers. Standout models include Datejust models with unique dials and bracelets.


The Rolex GMT-Master II

The Rolex GMT-Master II is the successor to the model created in 1955 for airline pilots. It is the ideal watch for crisscrossing the globe, according to Rolex. With an additional 24-hour hand and a two-color rotatable graduated bezel, the GMT-Master II is known for simultaneously displaying two timezones. It also features a chromalight display that allows the hands and hour markers filled with luminescent material to emit a long-lasting glow in the dark. The two-color bezel makes this Rolex style instantly recognizable by onlookers. Standout models include the GMT-Master II “Pepsi,” “Sprite,” “Batman,” “Batgirl,” and “Root Beer.”


The Rolex Day-Date

First conceptualized in 1956, the Rolex Day-Date is a widely recognized watch that displays both the day and the date. The technology of displaying both the day and the date was revolutionary for the time. Learn what makes a fake Rolex Day-Date fake in this post.

On the Day-Date, the date is located at 3 o’clock, and the day is spelled out in full at 12 o’clock. Rolex offers different languages for its wearers to choose from to allow for cultural identity expression. Today, the day and date capability makes the Day-Date a precious timepiece for everyday use. According to Rolex, this style presents the balance between elegance and technical excellence, which has earned it the nickname “presidents’ watch.” Standout models include platinum diamond versions.

About (Software) Quality

Tuesday, January 20th, 2009

When I attended the University of Leiden, Software Development was in its infancy. In 1969 just a few people were programming, for the simple reason that the number of computers was very small. It took a lot of time (many weeks), intelligence and perseverance to create a small working software program.

At that time the effect of a software program on other people was very small. Software programs were used by the programmers themselves to solve their own problems.

When User-Interfaces, Databases and Telecommunication appeared it became possible to create software for Many Non-Programmers, Users. The software-systems got bigger and programmers had to cooperate with other programmers.

When the step from One-to-Many was made in the process of software development and exploitation, Software Quality became a very important issue.

What is Software?

A Software program is a sequence of statements written in a computer language. When you speak and write you use a natural language. When you write a computer program you use an artificial, designed language.

The difference between natural and artificial languages is small. Esperanto is a constructed language that became a natural language. Perhaps all the natural languages were constructed in the past.

Software programs are very detailed prescriptions of something a computer has to do. The specifications of a software-program are written in a natural language (Pseudo-Code, Use-Case).

To create Software we have to transform Natural Language into Structured Language. The big problem is that Natural Language is a Rich Language. It not only contains Structural components but also Emotional (Values), Imaginative ((Visual) Metaphors) and Sensual Components (Facts). The most expressive human language is Speech.

In this case the Tonality of the Voice and the Body Language also contains a lot of information about the Sender. When you want to create Software you have to remove the Emotional, Imaginative and Sensual components out of Human Language.

What is Quality?

According to the International Organization for Standardization (ISO), Quality is “the degree to which a set of inherent characteristics fulfills requirements”. According to the ISO the quality of a software program is the degree to which the software coding is in agreement with its specification.

Because a specification is written in natural language, Quality has to do with the precision of the transformation of one language (the natural) to another language (the constructed).

According to Six Sigma, Quality is measured by the number of defects in an implementation of the specification of the software.
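Six Sigma usually normalizes that defect count to defects per million opportunities (DPMO). A minimal Python sketch of the calculation, with purely hypothetical numbers of my own:

```python
# Six Sigma expresses quality as defects per million opportunities (DPMO).
# The figures below are invented, just to illustrate the formula.

def dpmo(defects, units, opportunities_per_unit):
    return defects / (units * opportunities_per_unit) * 1_000_000

# e.g. 25 defects found in 500 delivered features,
# each checked against 8 requirements:
print(dpmo(25, 500, 8))  # -> 6250.0
```

The DPMO figure is then mapped onto a "sigma level"; the lower the DPMO, the closer the implementation is to its specification.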

Another view on Quality is called Fitness for Use. In this case Quality is “what the Customer wants” or “what the Customer is willing to pay for”.

If you look carefully at all the Views on Quality, the Four World Views of Will McWhinney appear.

Six Sigma is the Sensory View on Quality (Facts), ISO is the Unity View on Software (Procedures, Laws, Rules) and Fitness for Use is the Social View on Quality (Stakeholders).

The last worldview of McWhinney, the Mythic, the View of the Artist, is represented by the Aesthetical view on Quality. Something is of high quality when it is Beautiful.

The Four Perspectives of McWhinney look at something we name “Quality”. We can specify the concept “Quality” by combining the Four definitions or we can try to find out what is behind “the Four Views on Quality”.

The Architect Christopher Alexander wrote many books about Quality. Interestingly enough, he named the “Quality” behind the Four Perspectives the “Quality without a Name”. Later in his life he defined this Quality as the “Force of Life”.

What Happened?

In the beginning of software-development the Artists, the Mythics, created software. Creating high quality software was a craft and a real challenge. To create, a programmer had to overcome a high resistance.

The “creative” programmers solved many problems and shared their solutions. Software-development changed from an Art into a Practice. The Many Different Practices were Standardized and United into one Method. The Method made it possible for many people to “learn the trade of programming”.

When an Art turns into a Method, the Aesthetic, the Quality that Has No Name, Life Itself, disappears. The Controller, Quality Management (ISO), has tried to solve this problem and has given many names to the Quality without a Name. Many Aspects of Software Quality are now standardized and programmed into software.

But…

It is impossible to Program the Social Emotions and the Mythic Imagination.

So…………

Software developers don’t use Methods and Standards because deep within they are Artists. The big difference is that they don’t solve their own problems anymore. They solve the problems of the users that are interviewed by the designers.

And…..

The Users don’t want the Designers to tell the Programmers to create something they want to create themselves (the Not-Invented Here Syndrome). They also don’t know what the programmers, instructed by the designers will create, so they wait until the programmers are finished and tell them that they want something else.

What Went Wrong?

The first Computer, the Analytical Engine of Charles Babbage, contained four parts, called the Mill (the Central Processing Unit), the Store (the Memory), the Reader, and the Printer. The Analytical Engine and its successors were based on the Concept of the Factory.

In a Factory the Users, the Workers, the Slaves, have to do what the Masters, the Programmers, tell them to do. The Scientists modeled successful programmers but they forgot to model one thing: the Context. At the time the old-fashioned programming artists were active, software was made to support the programmer himself. The programmer was the User of his own software program.

At this moment the Factory is an “old-fashioned” concept. In the Fifties the Slaves started to transform into Individuals but the Factory-Computer and the Practices of the Old Fashioned Programmers were not abandoned.

To cope with the rising power of the Individual the old methods were adapted, but the old paradigm of the Slave was not removed. The Slave became a Stakeholder, but his main role is to act Emotionally. He has the power to “Like or Dislike”, “To Buy or Not to Buy”.

The big Mistake was to believe that it is possible to program Individuals.

What To Do?

The Four Worldviews of Quality Move Around Life Itself.

According to Mikhail Bakhtin Life Itself is destroyed by the Process of Coding (“A code is a deliberately established, killed context“).

When you want to make software you have to keep Life Alive.

The Paradigm-Shift you have to make is not very difficult. Individual Programmers want to make Software for Themselves so Individual Users want to Do the Same!

At this moment the Computer is not a tool to manage a factory anymore. It has become a Personal tool.

It is not very difficult to give individuals that Play the Role of Employee the tools to Solve their own Problems.

When they have solved their own problems they will Share the Solutions with other users.

If this happens their activities will change from an Individual Act of Creation into a Shared Practice.

If People Share their Problems and Solutions, their Joy and their Sorrow, they Experience the Spirit, the Force of Life, the Quality that has no Name.

LINKS

How to Analyze a Context

About the Human Measure

About the Autistic Computer

About the Worldviews of Will McWhinney

About Christopher Alexander

About Computer Languages

About Mikhail Bakhtin

About Ontologies

About Model Driven Software Development

About the Illusion of Cooperation

About the Analytic Engine of Charles Babbage

About the Analytic Engine of Thomas Fowler

About Human Scale Tools

About Old-Fashioned Programming

How to Make Sure that Everybody Believes what We are Believing: About Web 3.0

Thursday, December 20th, 2007

This morning I discovered a new term Web 3.0. According to the experts Web 2.0 is about “connecting people” and Web 3.0 is about “connecting systems”.

Web 1.0 is the “good old Internet”. The “good old Internet” was created by the US Department of Defense (ARPA) to prevent people and systems from being disconnected in a state of War with the Russians. Later Tim Berners-Lee and the W3C added a new feature, “hypertext”, connecting documents by reference.

As you can see, everybody is talking all the time about connecting something to something. In the first phase we connected “systems”. Later we connected “people”. Now we want to connect “systems” again. We are repeating the goal, but for some reason we never reach it.

In every stage of the development of our IT-technology we are connecting people, software (dynamics) and documents (statics) and reuse the same solutions all over again.

Could it be that the reused solutions are not the “real” solutions? Do we have to look elsewhere? Perhaps we don’t want to look elsewhere because we love to repeat the same failures over and over again. If everything were perfect we just wouldn’t know what to do!

There is an article about Web 3.0 in Wikipedia.

Two subjects are shaping Web 3.0. The first is the Semantic Web; the other is the use of Artificial Intelligence, Data & Text Mining, to detect interesting Patterns.

The Semantic Web wants to make it possible to Reason with Data on the Internet. It uses Logic to do this. The Semantic Web wants to standardize Meaning.

The Semantic Web uses an “old fashioned” paradigm about Reasoning and Language. It supposes that human language is context-independent. Its designers have to suppose this, because otherwise they would be unable to define a Computer Language (OWL) at all.

It is widely accepted that the interpretation of Language depends on a Situation, a Culture and the Genesis of the Human itself. Every Human is a Unique Creation. A Human is certainly not a Robot.

The effect of a widespread implementation of the Semantic Web will be a World Wide Standardization of Meaning based on the English Language. The Western Way of Thinking will finally become fixed and dominant.

The Semantic Web will increase the use of the Conduit Metaphor. The Conduit Metaphor has infected the English Language on a large scale. The Conduit Metaphor supposes that Humans are disconnected Objects. The disconnected Sender (an Object) is throwing Meaning (Fixed Structures, Objects) at the disconnected Receiver (An Object).

The Conduit Metaphor blocks the development of shared meaning (Dialogue) and Innovation (Flow). The strange effect of Web 3.0 will be a further disconnection. I think you understand now why we have to start all over again and again to connect people, software and content.

Why Good programmers have to be Good Listeners

Friday, June 29th, 2007

Edsger Wybe Dijkstra (1930-2002) was a Dutch Computer Scientist. He received the 1972 Turing Award for fundamental contributions in the area of programming languages.

One of the famous statements of Dijkstra is “Besides a mathematical inclination, an exceptionally good mastery of one’s native tongue is the most vital asset of a competent programmer“.

Why is this so important?

People communicate externally and internally (!) in their native tongue. If they use another language, much of the nuance of the communication is lost. When people of different languages communicate, they have to translate the communication into their internal language.

A computer language is also a language, but one where every nuance is gone. With the term nuance (I am a Dutch native speaker) I mean something that could also be translated as meaning. A computer language is formal; human communication is informal. We communicate much more than we are aware of when we speak.

So Programming is a Transformation of the Human Domain of Meaning to the Machine-Domain of Structure.

A programmer with a mathematical inclination (being analytical) AND an exceptionally good mastery of his or her native language is the only one who can build a bridge between the two worlds.

When he (or she – women are better at this!) is doing this, he knows he is throwing away a lot of value, but that is the consequence of IT. Machines are not humans (People that are Mad act like Machines).

Machines are very good in repetition. Humans don’t like repetition so Machines and Humans are able to create a very useful complementary relationship.

The person that understood this very well was Sjir Nijssen. He developed with many others something called NIAM. NIAM has generated many dialects called ORM, FORM, RIDDLE, FCO-IM, DEMO. The basic idea of all these methods is to analyze human communication in terms of the sentences we speak. It takes out of a sentence the verbs and the nouns (and of course the numbers) and creates a semantic model of the so called Universe of Discourse.

What Nijssen understood was that a computer is able to register FACTS (reality we don’t argue about anymore) and that facts are stored in a database. If we all agree about the facts, we can use the facts to start reasoning. Want to know more about reasoning? Have a look at this website.

To create a program that supports the user, a good programmer has to be a good listener and a highly skilled observer. Users are mostly not aware of their Universe of Discourse. They are immersed in their environment (their CONTEXT). Many techniques have been developed to help the observer recreate the context without killing the context (Bakhtin). Have a look at User-Centered Design to find out more about this subject.

Want to read more about Dijkstra? Read The Lost Construct.