## Posts Tagged ‘problem’

Monday, January 5th, 2009

Sri Yantra

The teachings of Ancient Civilizations are often Self-Referential.

The same knowledge is shown on many levels.

What we see with our Senses is an Illusion. Behind this Illusion lies a deeper structure.

The last and deepest level of the ancient teachings is always related to Numbers, Geometry and the Trinity.

This blog is about one of the most important geometric structures of the Trinity called the Sri Yantra.

In the ancient teachings a problem is defined and the teacher gives a clue about how to solve it.

If the pupil has solved a problem he is able to move to a deeper level.

The whole idea is that real knowledge, wisdom, is only discovered, when the pupil has solved the puzzle of life him- or herself.

Let’s have a look at a Deeper Level.

A deep level is related to a number called Phi. Phi is called the Golden Ratio or the Divine Proportion. It is the positive root of the quadratic equation x**2 - x - 1 = 0.

It is also the solution of the proportion a:b = b:(a+b) and the limit of the ratio of consecutive terms of the Fibonacci sequence x(n+2) = x(n+1) + x(n), and it appears in the geometric structure of the Pentagram, the Fifth Element (the Quintessence, the Ether) and the Logarithmic (Golden) Spiral.
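These relationships can be sketched in a few lines of Python (an illustration added here, not part of the ancient teachings): Phi computed as the positive root of x**2 - x - 1 = 0 coincides with the limit of the ratio of consecutive Fibonacci numbers, and it satisfies the proportion a:b = b:(a+b).

```python
import math

# Phi as the positive root of x**2 - x - 1 = 0 (quadratic formula)
phi = (1 + math.sqrt(5)) / 2  # ~1.6180339887

# The Fibonacci sequence: x(n+2) = x(n+1) + x(n)
fib = [1, 1]
for _ in range(30):
    fib.append(fib[-1] + fib[-2])

# The ratio of consecutive Fibonacci numbers converges to Phi
ratio = fib[-1] / fib[-2]

# Phi also satisfies the proportion a:b = b:(a+b)
a, b = 1.0, phi
assert abs(a / b - b / (a + b)) < 1e-12

print(phi, ratio)
```

Running this shows the two computations agreeing to many decimal places, which is the sense in which the quadratic, the proportion and the sequence are "the same" number.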

Phi is the pattern behind the Egyptian Pyramids, the Stock Market, Harmony in Music and Architecture and many other fields of science including Physics.

Let us first have a look at the way the old teachers have hidden the knowledge of the Divine Proportion in their teachings.

Plato

A beautiful example is Plato.

Plato was an initiate of the Mathematikoi, the Secret Society of Pythagoras. Pythagoras was himself initiated in the Secret Societies of Egypt.

What do you think of this problem-statement:

“What are the most perfect bodies that can be constructed, four in number, unlike one another, but such that some can be generated out of one another by resolution? … If we can hit upon the answer to this, we have the truth concerning the generation of earth and fire and of the bodies that stand as proportionals between them” (Timaeus 53e).

and

“Two things cannot be rightly put together without a Third; there must be some bond of union between them. … and the fairest bond is that which makes the most complete fusion of itself and the things which it combines, and proportion (analogia) is best adapted to effect such a union”.

and

“For whenever in any three numbers, whether cube or square, there is a mean, which is to the last term what the first term is to it, and again, when the mean is to the first term as the last term is to the mean – then the mean becoming first and last, and the first and last both becoming means, they will all of them of necessity come to be the same, and having become the same with one another will be all one [Timaeus 31b-32a]“.

In the last citation Plato is formulating a mathematical problem related to the four bodies A, B, C, D with the three proportions A:B = C:D = (A+B):(C+D) = (C+D):(A+B+C+D). This problem is unsolvable if you don’t have a clue where to start.

This problem is solved when you realize that the Divine Proportion has many strange relationships that are very useful to solve the puzzle.

These relationships can be found if you know everything there is to know about Triangles and Triangles are again related to the Trinity (“Two things cannot be rightly put together without a Third“).

The Trinity comes back in the structure of the Dialogues of Plato. They are divided into Three Parts (and an introduction).

The structure of the dialogues relates to itself.

The knowledge of the self-reference of the Dialogues is also a clue to solve more complicated puzzles about the Dialogues themselves.

Just like the famous book of Hofstadter (Gödel, Escher, Bach), the dialogues show a new layer every time they are read with newly acquired Insight.

The dialogues of Plato are organized according to the model he wants to teach. There are seven layers (-3, -2, -1, 0, 1, 2, 3) related to the Seven Mirror-Universes (or Hells and Heavens) of Our Universe (Eight, the Whole) in our Multi-Universe.

The Seven is a combination of Two Trinities with the Zero (The Void) in the middle. The Eighth (2**3) State is the Dialogue, The Whole, itself.

The Seven Layers are divided into Three Sections (the Trinity), so the total number of clues is 7×3 = 21, plus 1 (the Whole) = 22.

22 divided by 7 is an approximation of the number π (Pi).

Pi relates the Square to the Circle.

The Square represents the Playing Board of the Universe. On this Board we, the Humans, play our Game of Free Will.

Our Free Will is an Illusion. We are controlled by the Matrix.

The Circle represents the Wheel of Fortune, the Matrix, that Governs the Seven Universes in our Multi-Universe and the Game of Life.

What is Plato Trying to Explain?

The Golden Ratio

The Divine proportion is the basic concept behind Harmony.

The Divine Proportion is Not Symmetrical so Harmony is not related to Balance.

If everything were balanced, the Universe would never have been created.

Harmony is Balanced Unbalance.

An Architecture is beautiful when there is a slight unbalance in the Design.

This Unbalance shows the Sign of the Creator.

The Universe is created out of an Unbalance between Two Forces, the Positive and the Negative, the Good and the Bad.

The Two forces (-1,0,1) are divided into Four Forces (-2,-1,0,1,2) with the One in the Middle (Five) and are expanded into Seven Levels (-3,-2,-1,0,1,2,3). The Four Forces are “the most perfect bodies that can be constructed, four in number“.

The Seven Levels are related to the Circle. The Four Forces are related to the Square. The Universe oscillates between The Square and The Circle.

When the Square, the Game We Play with the Four Forces and the Circle (The Five Fold Cycle, Our Destiny) are in Balance the Human is in the Tao and Magic happens.

Every division is a split of the Trinity into Trinities until a state of Balance is reached. At that time the process reverses.

Mount Meru

The unbalance of the Good (+1) and the Bad (-1) is an Illusion. They Cooperate to Create Harmony (0).

Behind the perceived Unbalance is a balancing principle, the Divine Proportion. This principle brings everything Back into the Balance of the One who is the Void (0).

What we don’t know is to be known if we understand the progression of the Divine Spiral.

The Future, the Third Step, is a Combination of Two Steps in the Past (The Fibonacci Sequence), nothing more.

Is there a Deeper Structure Behind the Divine Proportion?

Golden Mean Spiral

The Four Forces (Control, Desire, Emotion(Compassion) and the Whole of the Trinity, Imagination) and the related sacred geometry were a guiding principle for the Imagination and the E-Motivation of many Western Scientists.

They tried to Control the Chaos of the Desires of the Senses by enforcing the Rules of Scientific Falsification.

The Process of Falsification destroyed Human Intuition.

The Western Scientists forgot to look at the Source of Intuition, the Center, the Quintessence (The Fifth, Consciousness).

The principle behind the Quintessence (Ether, Chi, Prana) is related to higher-order symmetries and is a solution of a generalization of the equation that generates the Divine Proportion, x**2 - x - 1 = 0.

This generalization is x**2 - px - q = 0, or x(n+2) = p*x(n+1) + q*x(n). The solutions of this equation are called the Metallic Means.

When p = 1 and q = 1 the Divine Proportion comes back again.

When p = 3 and q = 1 a new sequence, x**2 - 3x - 1 = 0 or x(n+2) = 3*x(n+1) + x(n), appears: the Bronze Mean.

The Bronze Mean generates the pattern 1, 1, 4, 13, 43, …: the pattern of the Sri Yantra.

When the Bronze Mean is expanded as a Continued Fraction it shows a very beautiful pattern of “3”s: [3; 3, 3, 3, …]. The Golden Mean is a continued fraction of “1”s.

There are many more “Metallic Means” (other p’s and q’s). They are related to all kinds of symmetries and fractal patterns.
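The definitions above can be checked with a short Python sketch (the function names are my own, chosen for the example): each Metallic Mean is the positive root of x**2 - px - q = 0, and also the limit of the ratio of consecutive terms of the recurrence.

```python
import math

def metallic_mean(p, q):
    """Positive root of x**2 - p*x - q = 0, via the quadratic formula."""
    return (p + math.sqrt(p * p + 4 * q)) / 2

def sequence(p, q, n, x0=1, x1=1):
    """First n terms of the recurrence x(n+2) = p*x(n+1) + q*x(n)."""
    xs = [x0, x1]
    while len(xs) < n:
        xs.append(p * xs[-1] + q * xs[-2])
    return xs

golden = metallic_mean(1, 1)   # ~1.618..., the Divine Proportion
bronze = metallic_mean(3, 1)   # ~3.302...

# The Bronze Mean pattern quoted in the text:
print(sequence(3, 1, 5))       # [1, 1, 4, 13, 43]

# Continued-fraction view: the Bronze Mean satisfies x = 3 + 1/x,
# i.e. [3; 3, 3, 3, ...], just as the Golden Mean is [1; 1, 1, ...].
x = 3.0
for _ in range(20):
    x = 3 + 1 / x
print(abs(x - bronze) < 1e-9)  # True
```

The same two functions reproduce the Golden Mean with p = q = 1, and any other Metallic Mean by varying p and q.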

The Bronze Mean shows that behind the Trinity of the Golden Mean lies another Trinity (and another Trinity and …).

Penrose Tiling

The Bronze Mean is the generator of so-called Quasicrystals.

Quasicrystals play a very important role in the Electro-Magnetic Structures of our Body, the Collagens. Collagen is the most abundant protein in mammals, making up about 25% to 35% of whole-body protein content. The Collagens in our body are used to explain the Ancient Chinese Science of Acupuncture.

Quasicrystals are ordered like “normal” Crystals, but their very complex symmetry never repeats periodically.

They are ordered AND not-ordered.

One of the most beautiful examples of the patterns behind Quasi Crystals are the Penrose Tilings.

They were developed by the famous physicist Roger Penrose. He used the tilings to show his insights about consciousness.

Penrose believes that Our Universe is just like our Body a Quasi Crystal, a Hall of Mirrors. We the Souls travel all the Paths of this Magnificent Fluent Crystal.

Was the Bronze Mean Known by the Ancient Architects?

Tripura

The most important implementation of the Bronze Mean can be seen in the Sri Yantra (“Sacred Device”).

The Sri Yantra is related to the Red Triple Goddess of Creation, Tripura also named Lalita (“She Who Plays“).

The Sri Yantra is generated out of the FiveFold Pattern or Creation and the Four Forces of Destruction.

It contains 9 interlocking isosceles triangles: 4 point upwards, representing the male energy Shiva, while the other 5 point downwards, representing the female energy Shakti.

The standard form of the Sri Yantra contains a total of 43 triangles. The centre of the Yantra holds a Bindu, which represents the Void.

The FiveFold Pattern of Creation moves with the Clock. A pattern that moves with the clock is a  generating pattern.

It moves away from the Void and generates space. The FiveFold Pattern is the pattern of the Universe. It creates Universes, Galaxies and Planets. The Pattern moves around the Celestial Center of Creation, the Black Hole.

The FourFold Pattern moves Against the Clock and is a destructing pattern. It dissolves space and moves back to the void.  The FourFold pattern is the pattern of the Human Being and Earth.

The combination of both patterns is a Moebius Ring (the symbol of infinity) with the Celestial Centre in the Middle. The FiveFold/FourFold pattern resembles the Ninefold Egyptian Pesedjet and the Ninefold Chinese Lo Shu Magic Square.

“From the fivefold Shakti comes creation and from the fourfold Fire dissolution. The sexual union of five Shaktis and four Fires causes the chakra to evolve” (Yogini Hridaya, Heart of the Yogini Tantra).

In Pakistan the Mother Goddess (Sharika) is represented by a diagram that contains “one central basic point that represents the core of the whole cosmos; 3 circles around it and 4 gates to enter, with 43 triangles shaping the corners“.

The Penrose Tilings and many other quasi crystals can also be found in Ancient Roman, Islamic and Christian Architecture (Pompei, Alhambra, Taj Mahal, Chartres). The tilings are an expression of the Game of Life and were used to build Educational Buildings (Pyramids, Cathedrals,..) to teach and show the old teachings.

Kepler’s Model of the Solar System

Kepler (1571-1630), a German Mathematician and Astronomer (author of The Cosmographic Mystery), and Albrecht Dürer (1471-1528), a German Painter, already knew about patterns like the Penrose Tilings, but until Penrose rediscovered them nobody knew that they knew.

The new scientists (re)discovered old patterns that were known by the old scientists.

What is the Meaning of the Bronze Mean?

The Bronze Mean shows the effect of a continuous division of the Universe in Trinities.

It shows that the Universe (and other levels) is suddenly moving from an ordered state to a chaotic state.

This chaotic, not predictable, state is not chaotic at all when you understand the patterns behind chaos.

In our Universe chaos is always ordered. Chaos is an effect of something that is happening in a higher (not Sensible) Dimension or A Higher Consciousness.

The Two Brains of Paul Steinhardt

The writer of an important article about Penrose Tilings and Islamic Art, Paul Steinhardt, is, like Roger Penrose, a well-known physicist.

He has created a new theory about the Universe based on Four Forces AND the Quintessence.

In this theory the Universe is Cyclic. It is expanding and contracting.

The expansion of the Universe ends when the Two Major Structures (-1,0, 1) in the Universe, called Membranes or Branes, are in Balance with the Center (0, the Void).

The membranes are higher dimensional Squares that are in parallel.

The Branes at both sides split into many similar cell-like structures. We live in one of the Cells of the Universe.

The Others, our Twins, live on the other membranes and are not aware of our existence until the Branes come into Balance.

At that moment the Twin Universes are Connected.

Scientists don’t know when this will happen but the Old Scientists who could travel the Multi-Universe with their United Brains knew.

It would happen at a very special Alignment of the Five Fold Center of Creation of the Milky Way with the FourFold Cross of the Destruction of Earth.

The Bronze mean is the Master-Pattern of our Multi-Universe.

The pattern 1, 1, 4, 13, 43 is in its 42nd enfolding and soon we will experience the 43rd step, a Merge of the Left and the Right Brain of the Super Consciousness, Adam Kadmon.

How to Raise the Djed

About the Nine-Fold Pattern of the Egyptian Pesedjed

Everything you want to know about the Divine Proportion

About The Indefinite Dyad and the Golden Section: Uncovering Plato’s Second Principle

About the Self-Referential Structure of the Dialogues of Plato

About the Law of Three of Gurdjieff

Paul Steinhardt, About Penrose Tilings and Ancient Islamic Art

About Penrose Tilings and the Alhambra

About the Geometric Patterns in Ancient Structures

An interview with Roger Penrose about the relationship between Consciousness and Tilings

A lot of information (including simulations) about the Cyclic Universe of Paul Steinhardt

A simple model for the formation of a complex organism

How life emerged out of one Quasicrystal

About Quasicrystals and Sacred Geometry

### About The Limits of Reason

Sunday, August 3rd, 2008

You can always find an infinite number of equations that fit a finite set of points.

When the set of points changes, the equation changes. This is a major problem when you want to find a general pattern. The solution is to assume that the pattern behind the set of points has to be a Simple Equation (or a Simple Law).

A theory has to be simpler than the data it explains; otherwise it does not explain anything.

To define Simplicity we have to define a tool that measures the simplicity of an equation. Mathematicians have tried to solve this problem in many different ways. The problem seemed unsolvable until computers and software languages were invented.

A law of nature is a piece of software, a computer algorithm, and instead of trying to measure the complexity of a law via the size of an equation, we now consider the size of programs, the number of bits in the software that implements a theory.

If every theory is represented by a string of bits we are able to analyze what a computer (our “thinking mind”) is able to represent. The problem is transformed to the problem of representation. Behind this problem lies the problem of Compression.

Our Reality is represented by the simplest equation (the shortest, most compressed, binary string) that, when it is expanded, reproduces the highly complex binary string that represents our reality.

Gottfried Wilhelm Leibniz

One of the conditions we have to add is the condition of “understandability”. Perhaps the expression exists but we are unable to grasp the law. Leibniz calls this the principle of sufficient reason.

Leibniz formulated this principle as follows: “Dieu a choisi celuy qui est… le plus simple en hypotheses et le plus riche en phenomenes” (God has chosen that which is the most simple in hypotheses and the most rich in phenomena)”. “Mais quand une regle est fort composée, ce qui luy est conforme, passe pour irrégulier” (But when a rule is extremely complex, that which conforms to it passes for random)”.

The interesting point in the statements of Leibniz is the term “irrégulier”. It is translated by the term “random”. This term can be interpreted in many ways. In the world of Statistics it means that a certain event is unpredictable. In algorithmic terms it means that we are unable to find a pattern behind the pattern we observe. A random pattern is an essential pattern: it cannot be compressed.
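The idea that "random means incompressible" can be illustrated with an everyday tool. The sketch below (Python) uses the size of a zlib-compressed string as a crude, practical stand-in for algorithmic complexity; it is an illustration of the principle, not Chaitin's formal definition. A patterned string shrinks dramatically, a random-looking one barely shrinks at all.

```python
import os
import zlib

# Highly regular data: a very short "law" (repeat "abc") generates it
patterned = b"abc" * 10_000          # 30,000 bytes

# Random data: no pattern for the compressor to exploit
random_ish = os.urandom(30_000)      # 30,000 bytes

size_p = len(zlib.compress(patterned))
size_r = len(zlib.compress(random_ish))

# The patterned string compresses to well under 1% of its length;
# the random string stays at (or slightly above) its original size.
print(size_p, size_r)
```

In the vocabulary of this blog: the patterned string has a Simple Law behind it, the random string is "essential" and marks a Limit of Reason for the compressor.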

Science ends when we have found randomness and have reached the Limits of Reason.

Everybody has a Limit of Reason, and this limit expands over time, but for every mind that will ever be born there is an absolute Limit of Reason. When we have reached this limit we will know there are still patterns to find, but we will be unable to prove they are real patterns.

Gregory Chaitin

Gregory Chaitin is the expert of the Limits of Reason and he is highly influenced by Leibniz.

By running a program you can eventually discover that it halts, if it halts. When it halts you have found a theory. The problem is to decide when to give up on a program that does not seem to halt.

A great many special cases can be solved, but Turing showed that a general solution is impossible. No algorithm, no mathematical theory, can ever tell us which programs will halt and which will not.

We are never certain that we have found the final theory, because when we wait a little longer (collect more facts) we may find a better theory that explains what we want to explain (if we understand the theory).

We could use a computer to search for patterns (this already happens), but then the computer presents an incomprehensible theory (this already happens too) or it has to search a little longer. A computer could run “forever” when there is enough energy, but a human has a fixed lifetime. The Halting Problem shows that we cannot know how long “forever” is. We also will not have enough minds to analyze the output. The Halting Problem is proven to be unsolvable.
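The asymmetry described above can be sketched in a toy Python model (my own construction, not Turing's proof): running a program step by step under a budget can confirm "it halts", but an exhausted budget only ever yields "unknown", never "it will not halt".

```python
def run_with_budget(step, state, budget):
    """Run a program given as a step function: step(state) returns the
    next state, or None when the program halts. We can only ever report
    'halts' or 'unknown' -- never 'will never halt'."""
    for n in range(budget):
        state = step(state)
        if state is None:
            return ("halts", n + 1)
    return ("unknown", budget)

# A program that halts: count down from 10 to 0.
countdown = lambda s: None if s == 0 else s - 1
print(run_with_budget(countdown, 10, 1000))   # ('halts', 11)

# A program that loops forever: no finite budget can ever prove it.
loop = lambda s: s
print(run_with_budget(loop, 0, 1000))         # ('unknown', 1000)
```

However large we make the budget, the second answer stays "unknown"; deciding that it should be "never halts" for every possible program is exactly what Turing proved impossible.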

Chaitin defined a constant, Ω, that shows our progress in reaching the Limit of Reason: our progress toward the Incomprehensible.

We still have a long way to go.

The Halting Problem cannot be solved because we (the Humans) are unable to define the Limits of Reason. Even the Brightest Minds will not be able to understand all the patterns that are available in Our Universe. Even Mechanical Devices programmed by the Brightest minds will not solve the Mystery. Somewhere we will make a Mistake.

The Mistake will start a new process of Inquiry, and New Theories will be created that will always contain a Mistake. We will be Busy until Eternity to Create because we are not perfect. Only Perfect Solutions are Impossible.

I want to close this blog with a statement of Leibniz: “Sans les mathématiques on ne pénètre point au fond de la philosophie. Sans la philosophie on ne pénètre point au fond des mathématiques. Sans les deux on ne pénètre au fond de rien” (Without mathematics we cannot penetrate deeply into philosophy. Without philosophy we cannot penetrate deeply into mathematics. Without both we cannot penetrate deeply into anything).

Gregory Chaitin about the Principle of Sufficient Reason

About the Quest for the Perfect Language (a Talk by Chaitin about the book by Umberto Eco)

Leibniz forgot to mention the role of the Artist

### Why Software Layers always create new Software Layers

Wednesday, March 26th, 2008

The IT-Industry has evolved over nearly 50 years. In that timeframe it became the most influential business sector in Industry. Everybody is completely dependent on the computer and its software.

The IT-Industry has gone through various technology waves. The waves generated integration problems that were solved by the construction of abstraction layers. The layers not only solved problems. They also created new problems that were solved by other layers. The effect of all intertwining layers is an almost incomprehensible, not manageable, software-complex.

The main reason behind this development is the architecture of the general-purpose computer. It was developed to control and not to collaborate.

Charles Babbage designed the first computer, the Analytical Engine, in the 1830s, after his earlier Difference Engine of 1822, which was meant to automate the calculation of mathematical tables. His engine consisted of four parts called the Mill (the Central Processing Unit), the Store (the memory), the Reader, and the Printer. The machine was to be steam-driven and run by one attendant. The Reader used punched cards.

Babbage devised a notation to program the machine and worked together with the first programmer, Lady Ada Lovelace. The project of Babbage stopped because nobody wanted to finance him anymore.

It was not until 1954 that a real (business-) market for computers began to emerge by the creation of the IBM 650. The machines of the early 1950s were not much more capable than Charles Babbage’s Analytical Engine of the 1830s.

Around 1964 IBM gave birth to the general-purpose computer, the mainframe, in its 360-architecture (360 means all-round). The 360/370-architecture is one of the most durable artifacts of the computer age. It was so successful that it almost created a monopoly for IBM. Just one company, Microsoft, has succeeded in beating IBM, by creating the general-purpose computer for the consumer (the PC). Microsoft copied (parts of) the OS/2 operating system of IBM.

The current technical infrastructure looks a lot like the old-fashioned 360/370-architecture, but the processors are now located in many places. This was made possible by the sharp increase in bandwidth and the network-architecture of the Internet.

Programming a computer in machine code is very difficult. To hide the complexity a higher level of abstraction (a programming language) was created that shielded the complexity of the lower layer (the machine code). A compiler translated the program back to machine code. Three languages (Fortran, Algol and COBOL) were constructed. They covered the major problem areas (Industry, Science and Banking) of that time.

When the problem-domains interfered, companies were confronted with integration problems. IBM tried to unify all the major programming-languages (COBOL, Algol and Fortran) by introducing a new standard language, PL1. This approach failed. Companies did not want to convert all their existing programs to the new standard and programmers got accustomed to a language. They did not want to lose the experience they had acquired.

Integration by standardizing on one language has been tried many times (Java, C-Sharp). It will always fail for the same reasons. All the efforts to unify produce the opposite effect: an enormous diversity of languages, a Tower of Babel.

To cope with this problem a new abstraction layer was invented. The processes and data-structures of a company were analyzed and stored in a repository (an abstraction of a database). The program-generator made it possible to generate programs in all the major languages.

It was not possible to re-engineer all the legacy-systems to this abstraction-level. To solve this problem a compensating integration-layer, Enterprise Architecture Integration, was designed.

The PC democratized IT. Millions of consumers bought their own PC and started to develop applications using the tools available. They were, however, not capable of connecting their PCs to the mainframe and acquiring the data they needed out of the central databases of the company.

New integration layers (Client-Server Computing and Data-Warehouses) were added.

Employees connected their personal PC to the Internet and found out that they could communicate and share software with friends and colleagues all over the world. To prohibit the entrance of unwanted intruders, companies shielded their private environment by the implementation of firewalls. Employees were unable to connect their personal environment with their corporate environment.

A new integration problem, security, became visible and has to be solved.

It looks like every solution of an integration problem creates a new integration problem in the future.

The process of creating bridges to connect disconnect layers of software is going on and on. The big problem is that the bridges were not created out of a long time perspective. They were created bottom up, to solve an urgent problem.

IT-technology shows all the stages of a growing child. At this moment, companies have to manage and to connect many highly intermingled layers related to almost every step in the maturing process of the computer and its software.

Nobody understands the functionality of the whole and can predict the combined behavior of all the different parts. The effort to maintain and change a complex software-infrastructure is increasing exponentially.

The IT Industry has changed its tools and infrastructure so often that the software developer had to become an inventor.

He is constantly exploring new technical possibilities, unable to stabilize his craft. When a developer is used to a tool he does not want to replace it with another. Most developers do not get the time to gain experience with the new tools and technologies. They have to work on high-priority projects. Often the skills that are needed to make use of the new developments are hired from outside.

The effect is that the internal developers are focused on maintaining the installed base and get further behind. In the end, the only solution left is to outsource the IT department, which creates communication problems.

After more than 40 years of software-development, the complexity of the current IT-environment has become overwhelming. The related management costs are beginning to consume any productivity gain that they may be achieving from new technologies.

It is almost impossible to use new technology because 70 to 90% of the IT budget is spent on keeping existing systems running. If new functionality is developed, only 30% of the projects are successful.

If the complexity to develop software is not reduced, it will take 200 million highly specialized workers to support the billion people and businesses that will be connected via the Internet.

In the manufacturing industry, the principles of generalization and specialization are visible. Collaboration makes it possible to create flexible standards and a general-purpose infrastructure to support the standards.

When the infrastructure is established, competition and specialization starts. Cars use a standardized essential infrastructure that makes it possible to use standardized components from different vendors.

Car vendors are not competing on the level of the essential infrastructure. The big problem is that IT-Industry is still fighting on the level of the essential infrastructure, blocking specialization.

To keep their market share the software has to stay in the abstraction framework (the general purpose architecture) they are selling and controlling.

A new collaborative IT-infrastructure is arising. The new infrastructure makes it possible to specialize and simplify programs (now called services). Specialized messages (comparable to the components in the car industry), transported over the Internet, connect the services. This approach makes it much easier to change the connections between the services.

The World Wide Web Consortium (W3C), founded in October 1994, is leading the development of this new collaborative infrastructure. W3C has a commitment to look after the interest of the community instead of business. The influence of W3C is remarkable. The big competitive IT-companies in the market were more or less forced to use the standards created by the consortium. They were unable to create their own interpretation because the standards are produced as open source software.

The basis of the new collaborative foundation is XML (eXtensible Markup Language). XML is a flexible way to create “self-describing data” and to share both the format (the syntax) and the data on the World Wide Web. XML describes the syntax of information.
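A few lines of Python make the idea of "self-describing data" concrete: the tags carry the syntax of the record along with the values themselves, so a receiver can interpret the data without prior agreement on a binary layout. The invoice vocabulary below is invented for the example.

```python
import xml.etree.ElementTree as ET

# A small, self-describing XML document (element names are illustrative)
doc = """
<invoice currency="EUR">
  <customer>Acme BV</customer>
  <amount>150.00</amount>
</invoice>
"""

root = ET.fromstring(doc)

# The structure and the data travel together: we can discover both
print(root.tag)                        # invoice
print(root.attrib["currency"])         # EUR
print(root.findtext("customer"))       # Acme BV
print(float(root.findtext("amount")))  # 150.0
```

Any party that can parse XML can read both the format (the tag names and attributes) and the content, which is exactly the sharing of syntax and data the paragraph above describes.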

XML has enabled a new general-purpose technology-concept, called Web-Services. The concept is comparable to the use of containers in intermodal shipping. A container enables the transport of a diversity of goods (data, programs, content) from one point to another point. At the destination, the container can be opened. The receiver can rearrange the goods and send them to another place. He can also put the goods in his warehouse and add value by assembling a new product. When the product is ready it can be sent in a container to other assembly lines or to retailers to sell the product to consumers.

Web-Services facilitate the flow of complex data-structures (services, data, content) through the Internet. Services can rearrange data-structures, add value by combining them with other data-structures, and send the result to other services.

All kinds of specialized data-structures are defined that are meant to let specialized services act on them.

An example is taxation (XML TC). XML TC (a part of the Oasis standards organization) focuses on the development of a common vocabulary that will allow participants to unambiguously identify the tax related information exchanged within a particular business context. The benefits envisioned will include dramatic reductions in development of jurisdictionally specific applications, interchange standards for software vendors, and tax agencies alike. In addition, tax-paying constituents will benefit from increased services from tax agencies. Service providers will benefit due to more flexible interchange formats and reduced development efforts. Lastly, CRM, payroll, financial and other system developers will enjoy reduced development costs and schedules when integrating their systems with tax reporting and compliance systems.

Web-Services are the next shockwave that is bringing the IT-community into a state of fear and attraction. Their promise is lower development cost, and a much simpler architecture. Their threat is that the competition will make a better use of all the new possibilities.

The same pattern emerges. The installed base of software slows most of the companies down. They will react by first creating an isolated software-environment and will then have big problems in the future connecting the old part with the new part.

Web-Services will generate a worldwide marketplace for services. They are now a threat to all the current vendors of big software-packages. In essence, they have to rewrite all their legacy-software and make a split in generic components (most of them will be available for free) and essential services users really want to pay for.

Big software-vendors will transform themselves into specialized market places (service-portals) where users can find and make use of high-quality services. Other vendors will create advanced routing-centers where messages will be translated and sent to the appropriate processor.

It will be difficult for small service-providers to get the attention and the trust of companies and consumers to make use of their services. They will join in collaborative networks that are able to promote and secure their business (The Open Source Movement). It is impossible to tell whether they will survive in the still competitive environment, where the big giants have an enormous power to influence and a lot of money to create new services.

If the big giants succeed, history will repeat itself. The new emerging software-ecology will slowly lose its diversity.

Web-services are an example of the principles of mass-customization and customer innovation. All the software-vendors are restructuring their big chunks of software into components that can be assembled to create a system.

Small competitors and even customers will also create components. In due time the number of possible combinations of components that are able to create the same functionality will surpass the complexity a human (or a collective of human beings) can handle.


### Too Many Insights about IT-Architectures and IT-Strategy

Monday, November 19th, 2007

I have been responsible for IT-Architectures and IT-Strategy between 1984 and 1997. From 1997 until now I have reviewed many Architectures and Strategies when I was part of Meta Group (now Gartner).

An IT-Architecture is a System that describes the components of a Software System on an Abstract Level.

An IT-Strategy is a Process that contains Stages. In every Stage a new version of the IT-Architecture is implemented.

A Well-Formed IT-Strategy is able to Adapt the Old Version of the Architecture. When your Strategy has failed You have to Start All over Again.

There are two types of Systems. The first type contains Systems. The second type I call a Foundation. It is the level where we think “the Real Things are Happening”. The major problem is to define the Level of the Foundation.

If you look at your Own computer the Foundation lies deeper than you think. It depends on “What You Understand About a Computer“.

We could define the Foundation as the Operating System of Your Computer (most likely a Microsoft Operating System) but below this Foundation other Foundations are in Existence.

At a more abstract level you can see the real Problem. The problem is Containing.

If you use the Containing Metaphor you Never Stop but You Have to Stop Somewhere.

The level where you Stop is the level where you give the responsibility to An-Other. This can be an Organization, A Person or when we dig deep enough even Nature.

When you give the responsibility for something to an-other you have to Trust the Other and the Other has to take Care of You.

The reason why Architectures fail is that they are based on a Foundation that is not stable on the long term.

Then somebody starts to tinker with the Foundation and suddenly everything goes wrong. This happens all the time. The others leave you alone and you have to take care of yourself.

A solution was to create a Foundation that was able to withstand every Change at the Lower level.

This layer was called Middleware. It is situated somewhere between the UP and the DOWN of all the Layers. History has proven that this solution is not helpful.

Everything changes all the time.

I want to give you a Model to understand the complexity of the problem. I use a Horizontal and a Vertical layer, A Matrix. Layered architectures can be mapped on a Matrix, a Cube or a higher dimensional Structure, N-Dimensional Space.

Every component can be described by a point that is connected to N-variables. The links between the components are lines connecting the points.
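As a minimal sketch of this Matrix model (all component names and dimensions here are invented for illustration, not taken from any real system): each component is a point whose coordinates are its N variables, and links are lines connecting two points.

```python
# Minimal sketch: an architecture as points in N-dimensional space.
# Each component is a point with N coordinates (here: type, layer, owner);
# links are lines connecting two points. All names are illustrative.

components = {
    "order_form":  ("form",     "presentation", "in-house"),
    "order_logic": ("software", "business",     "in-house"),
    "order_db":    ("database", "data",         "vendor"),
}

links = [("order_form", "order_logic"), ("order_logic", "order_db")]

# Projecting onto one dimension (the "type" axis) yields a System Diagram.
by_type = {}
for name, (ctype, layer, owner) in components.items():
    by_type.setdefault(ctype, []).append(name)

print(by_type)
```

Projecting the points onto a single axis, as in the last lines, is exactly what the next paragraph calls "a picture of this dimension": a System Diagram.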

The first thing we could do is use one dimension to describe the “type” of the component (software, hardware, database, form, etc). If we create a picture of this dimension we have created a System Diagram. Many types of System Diagrams have been invented. They were called a Method or a Modeling Language. Every new Method created a “Method War” because the Users always thought that their Method was the Best.

I participated in many activities of the IFIP (International Federation for Information Processing). We tried to find the “Best Method” to Improve the Practice. It was proven that there is no best method.

Many roads lead to Rome. See “Information Systems Design Methodologies: Improving The Practice“, T.W. Olle, H.G. Sol and A.A. Verrijn-Stuart, (Eds.), North-Holland.

In the end the major Method Wars ended with a Compromise, UML. Compromises are always the worst solution for a problem. UML is a very complicated method.

If we start the Diagram with an Event and we have chosen the right Modeling Language, now called a Programming Language, we are able to “simulate” the System or to “generate” the software. Many Tools, Modelers, Simulators, Languages and Generators have been developed.

They also created wars (called Competition) between all kinds of Vendors. In the end many Tools were taken out of the market by the Vendors and the Users got stuck. They were unable to convert the old tools to the new tools or they simply did not take the time to do this. This problem is called the Legacy Problem.

The Market invented something to solve this problem called Reverse Engineering. Reverse Engineering proved to be a failure because the semantics, the meaning, of the software was gone.

When you deconstruct a car and show an engineer all the parts, he knows the parts belonged to a car. When you do this with something nobody ever knew existed, the only engineer capable of reconstructing the original is the engineer who constructed it.

When the software is old the original programmer is gone and nobody is able to understand what he was doing. Sometimes the software contains documentation: the programmer has written a Story about the Meaning of the Software. But programmers never took (and never take) the time to do this.

I want to get back to N-Dimensional Space. I hope you understand that when we use enough dimensions we are able to Model Everything.

We are also able to MAP Everything to Everything. Mapping or Converting could solve many problems.

There were Systems on the market that helped you to MAP one Structure to another Structure. An example was Rochade. I implemented Rochade when I was responsible for IT-Architectures and I know Rochade solved many problems with Legacy Systems.

Rochade used something called a Scanner or Parser. A Parser is a “piece of software” that translates a “piece of software” into another “piece of software”. It stored the data of the software (the meta-data) in a “general” format that could be translated to other formats.
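The idea of a scanner that stores meta-data in a general format can be sketched in a few lines. This is purely illustrative (it is not Rochade's actual format; the source fragment and field names are invented): it parses a piece of source code and keeps what it finds in a language-neutral dictionary that other tools could translate further.

```python
import re

# Illustrative sketch of a Rochade-style scanner (not Rochade's real
# format): parse a fragment of source code and store its meta-data
# (defined names and called names) in a general, language-neutral
# dictionary that another tool could translate to its own format.

source = """
def total(order):
    return tax(order) + net(order)
"""

meta = {
    "defines": re.findall(r"def\s+(\w+)", source),
    "calls":   re.findall(r"(\w+)\(", source),
}
print(meta)
```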

When you program in a Software Language the code is translated to another Language. This happens many times until the software reaches the Lowest Level, The Processor or the CPU.
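This translation chain is visible inside Python itself: source code is first translated to bytecode, one level closer to the CPU. A minimal demonstration using the standard `dis` module (the exact opcode names vary between Python versions, so none are hard-coded here):

```python
import dis

# Source code is translated one level down, toward the CPU:
# here a Python function is shown as its bytecode instructions.
def add(a, b):
    return a + b

ops = [ins.opname for ins in dis.get_instructions(add)]
print(ops)
```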

The CPU uses a Cycle to process a very simple language that consists of binary numbers. These numbers represent data or operations. The simplest operations are operations on a Set.

The whole concept of the CPU was invented by John von Neumann and is therefore named the Von Neumann Architecture.

The architecture of von Neumann has a big disadvantage called the Von Neumann bottleneck: the CPU is continuously forced to wait, because instructions and data have to travel one at a time over the single path between processor and memory.

The Von Neumann Computer is Wasting Time and Energy.

An alternative is the Parallel Architecture. Parallel computing has recently become the dominant paradigm in computer architectures. The main reason is the Rise of the Internet.

The Rise of the Internet started the Fall of Centralized IT-Architectures and Centralized IT-Strategy.

At this moment we need another approach to Manage or Control the software-production of a big company.

This approach can be found in the Open Source Movement.

If we use the Matrix Approach we can answer interesting questions.

First I introduce a Rule.

When we increase the amount of Dimensions we are able to make every point and every connection between points Unique. If we do the opposite we are able to make every point and connection The Same.

When people talk about the Reuse of a Component we are looking for a Dimension where some points and their connections are the Same.

I hope you see that it is possible to Reuse Everything and to Reuse Nothing. The Choice is Yours.
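The Rule above can be shown in a few lines (the component names and dimensions are invented for illustration): with all dimensions present every point is Unique; project a dimension away and points become The Same, which is exactly what Reuse looks for.

```python
# Minimal illustration of the Rule: with enough dimensions every point
# is Unique; project a dimension away and points become The Same.
points = [
    ("logger", "java", "team-a"),
    ("logger", "java", "team-b"),
    ("parser", "java", "team-a"),
]

# Full dimensionality: every component is unique.
assert len(set(points)) == 3

# Drop the "team" dimension: the two loggers collapse into one
# reusable component.
projected = {(name, lang) for name, lang, team in points}
assert len(projected) == 2
print(projected)
```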

This is also the Practice in Software. When I discovered that the Year-2000 problem could lead to a disaster I started a research-project with the CWI. The CWI developed a very intelligent parser that could create Software-Maps.

When we studied the maps we saw that some pieces of software came back all the time. These were “citations”. One programmer invented a new construct and others reused the construct all the time. The major difference with the Theory of Reuse was that MANY Parts were the Same.
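The hunt for these "citations" can be sketched as counting fragments that come back in every program. The token streams below are made up for illustration; the CWI parser worked on real source code, not on word lists like these.

```python
from collections import Counter

# Illustrative sketch of finding "citations": slide a window over token
# streams from several programs and keep the fragments that come back
# in every one of them. The token streams are invented for this example.
programs = [
    "open read check close".split(),
    "init open read check close done".split(),
    "open read check close log".split(),
]

def ngrams(tokens, n=3):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

counts = Counter(g for prog in programs for g in ngrams(prog))
citations = [g for g, c in counts.items() if c == len(programs)]
print(citations)
```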

When you dig deep enough you always find “The Same”.

The CPU is handling binary codes and when you would come from another planet you would not understand why all these 0′s and 1′s are creating a DIVERSITY. They create a DIVERSITY because Something is Interpreting the Sequence. This Something is also a Program. This program uses a Theory about Languages. Most of the time it supposes a Context Free Language. A Context Free Language is a language where the interpreter always moves in one direction. It processes a List.
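Such a one-direction interpreter fits in a dozen lines. The instruction set below (0 = push the literal that follows, 1 = add the top two values) is invented purely for illustration; the point is that the interpreter walks the flat List of codes in one direction and never looks back.

```python
# Illustrative sketch of "Something Interpreting the Sequence": an
# interpreter that moves in one direction over a flat list of codes.
# The instruction set (0 = push next literal, 1 = add) is invented.
def interpret(codes):
    stack, i = [], 0
    while i < len(codes):
        if codes[i] == 0:           # push the literal that follows
            stack.append(codes[i + 1])
            i += 2
        elif codes[i] == 1:         # add the top two values
            stack.append(stack.pop() + stack.pop())
            i += 1
    return stack[-1]

print(interpret([0, 2, 0, 3, 1]))   # push 2, push 3, add -> 5
```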

The Diversity a Computer Produces is based on one long List of Binary patterns. If we could analyze all the possible patterns we could find all the possible software-programs that could be built until Eternity, because the binary codes can be mapped to the Natural Numbers. We need only one dimension (a Line) to classify all the possible software-components in the World.
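The mapping from binary patterns to Natural Numbers is easy to demonstrate: any sequence of bytes is a single Natural Number, and the number can be turned back into the bytes. The tiny "program" below is just an illustrative byte string.

```python
# Illustrative sketch: any binary pattern (here, the bytes of a tiny
# "program") maps to one Natural Number and back, so a single Line of
# numbers suffices to index every possible program.
program = b"print('hi')"

n = int.from_bytes(program, "big")                    # program -> number
back = n.to_bytes((n.bit_length() + 7) // 8, "big")   # number -> program

assert back == program
print(n)
```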

In 1931 Gödel stated the so-called incompleteness theorems. He used the Natural Numbers to prove that a part of our Human Reality cannot be described by a Computer Program.

There is something “left for us” that the Machines cannot take over. This part is related to the Emotions and the Imagination. We cannot Automate them. If we do this we stop Innovation and Commitment.

Now I want to come back to IT-Architectures.

When You start an IT-Architecture Out of Nothing you start with a Small Amount of Dimensions. The world looks very simple to you. When you add detail you have to increase the amount of dimensions. The effect of this is that Everything Changes. New Possibilities arise. If you go on using the Top-Down Approach you will move into a State of Huge Complexity. Always start in the Middle!

At a certain moment You have to move to Reality. This means Programming Software. At that moment You encounter something you never thought of: the Software Legacy!

When you increase the Scope of your System and you leave the Boundaries of Your Company (the World of the Internet) the Complexity also increases. At that moment You encounter something you never thought of: Open Source. Millions of Possibilities arise and You don’t Know what to do!

Behind the Software are People. Some of them are creating Small Companies and they are doing things your company is also doing, but they do it much cheaper and faster.

What to do?

If You Can’t Beat them Join Them.

### What’s wrong with doctors

Monday, May 21st, 2007

Last night I read an article in the New York Review of Books (May 31, 2007). The article is called What’s Wrong with Doctors (written by Jerome Groopman, a cancer specialist). The article (and the book) is about the current Medical System. It shows everything that is wrong with this system.

The most important problem, according to Groopman, is that doctors Think. This problem is reinforced by the fact that they don’t take the time to listen to themselves (introspection) and their patients. The doctor is acting as a “Rule-Based” system that acts on the variables that are put into the system. He sees what he wants to see because his focus is on “fast delivery” of a solution. The patient is not important. He or she is just “a means to an end”, and the end is making a lot of money in a short time (“efficiency”) and/or achieving status.

Groopman attacks “Evidence Based Medicine”. This approach is based on statistics. The whole problem with statistics is comparable to the “rule-based-system”-approach. Again statistics just shows what you want to see. Statistics is a method to find patterns but the problem with patterns is that there is an infinite amount of patterns possible in every situation. In statistics you have to choose a “pattern-type” (for instance the pattern is linear) otherwise the whole approach is not working.
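The pattern-type problem can be made concrete with a few lines of arithmetic (the data is invented for illustration): a perfect, fully deterministic quadratic pattern is invisible to a statistic that assumes a linear pattern-type.

```python
# Illustrative sketch of the pattern-type problem: a perfect (quadratic)
# pattern is invisible to a statistic that assumes a linear pattern.
xs = [-3, -2, -1, 0, 1, 2, 3]
ys = [x * x for x in xs]    # a perfect, fully deterministic pattern

mean_x = sum(xs) / len(xs)
mean_y = sum(ys) / len(ys)
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))

# The linear covariance is exactly zero: the chosen pattern-type
# reports "no relationship" although the relationship is perfect.
print(cov)  # 0.0
```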

Groopman shows that statistics make doctors lazy. They trust somebody else (the bright guys) and don’t think for themselves anymore. Doctors have to be “lazy thinkers”: they don’t have the time to think because they have to produce fast solutions. So the problem is not that they are thinking; the real problem is that they are not thinking at all. They are acting as robots.

The pharmaceutical industry helps doctors to deliver fast solutions. They provide super-pills that solve everything. They provide doctors directly with corrupted information about the huge statistical effects of their inventions. To make the doctors happy they also provide them with gifts (luxury tours disguised as a conference).

The above is just the start of the attack on the medical system. Groopman digs deeper and shows that the fundamental problem is that we cannot fully understand the context of the patient. Every patient is unique and has a unique solution that brings him into balance.

Groopman shows that knowledge (thinking again) is a big burden when trying to understand a context. Knowledge acts as a filter. Knowledge shows just what it knows. This problem applies to both the doctor and the patient. Both know something (I’m having a problem, I can solve the problem) but their “knowing” shields the real context.

How can we ever solve this problem? The solution is really very simple. We can solve the problem by “not-thinking”. If we are “not-thinking” we know everything. Deep within us (in the great unknown) lies the solution, waiting. We don’t want to know this solution because it will destroy our patterns, and patterns are what make us feel comfortable.

The patient knows the solution already and wants confirmation from a loving and caring person who gives him the confidence that his process of change will bring him into a new balance.

The best caretakers are not found in the medical system. They are good friends or people with the gift of healing who can support the person to make the “unknown” known.