Determinism: From Wikipedia: in the history of science, Laplace’s demon was the first published articulation of scientific determinism, by Pierre-Simon Laplace in 1814. According to determinism, if someone (a superintelligence – aka Laplace’s demon) captured all information – knew the precise location and momentum of every atom (or sub-particle) in the universe – then all past and future values for any given time would be entailed (rigidly, exactly predictable); they could be calculated from the laws of classical mechanics. Laplace and others were absorbing and extrapolating this universal principle from a large wave of advances in classical mechanics in the eighteenth century that could, for instance, predict the future movements of all the heavenly bodies indefinitely into the future – limited only by the knowledge and accuracy of the starting state of all parameters. This would be the fruit of reductionism. The result is a complete, unitary view of a universe that is a machine (a “clockwork universe”) progressing from an early state to one outcome for any given future time. The outcome cannot be altered. There is no other, additional causality. As an example: reductionism would break the flight of an artillery shell down into its constituent formulae for gravity, momentum, air friction, etc. Each is simple and gives an exact result, and as they are accumulated they give an exact result, to more significant digits, for the final complex interaction. A characteristic is that the equations are a distillation of each and every variation of the relationships and outcomes of entire families of natural processes, and so entail less information than a description of every change at every moment. This is part of what is called “Reductionism.”
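This clockwork picture can be sketched in a few lines of code (the function name and parameter values below are my own invention, a toy, not real ballistics): each force law is a simple, exact formula, and accumulating them step by step yields one and only one outcome from a given starting state.

```python
import math

def shell_range(v0, angle_deg, drag_k=1e-4, dt=0.001, g=9.81):
    """Deterministic step-by-step flight of a shell: gravity plus air friction.
    Each force law is a simple, exact formula; accumulating them gives the
    complex trajectory -- and the same inputs always give the same range."""
    angle = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        vx += -drag_k * speed * vx * dt        # air friction
        vy += (-g - drag_k * speed * vy) * dt  # gravity plus air friction
        x += vx * dt
        y += vy * dt
    return x

# The clockwork view: identical starting state, identical outcome, every time.
print(shell_range(300.0, 45.0) == shell_range(300.0, 45.0))  # True
```

Nothing here ever surprises: the whole future of the shell is entailed by the starting numbers.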

Certain references to themes and ideas will be repeated. I will mark these by putting them in bold and a larger font.

My Purpose: I intend to question the conceptual, mathematical, and scientific assertion of exclusive “Strict Determinism” as defined above. I cannot logically and conclusively disprove exclusive Determinism, but I can review and question it as an assumption and reveal its many innate flaws. I can also attempt, at least, to propose an alternate view. Some say this alternate view is now mainstream, but consider, for example, a flourish of philosophical eloquence from Albert Einstein: “Everything is determined, the beginning as well as the end, by forces over which we have no control. It is determined for the insect, as well as for the star. Human beings, vegetables, or cosmic dust, we all dance to a mysterious tune, intoned in the distance by an invisible piper.” Notice: “as well as the end.”

I am not a scientist or mathematician, but the flaws are obvious, and they yield to a more common level of analysis, knowledge, and experience. The alternative that I think is much more believable is that, of course, there are many processes that are very dominantly deterministic. The difference is exclusive, clockwork Determinism versus a mixture of that deterministic causality and additional causality by emergent self-organization. The difference is huge. The former excludes the possibility of free will, among many other things. I also do not think a wholly strict deterministic causality would produce any life – and therefore no consciousness, culture, philosophy, or religion. There are, of course, many purely physical processes that have additional causality as well.

When I was taking Physics 101, I remember learning about Laplace’s conjecture, and I thought about wind over water causing waves. The basic thermodynamic laws governing water operate at the lowest molecular level and must head toward randomness.

Yet the equilibrium state, if enough external energy (wind) is applied, is not a flattened sameness but may be, for example, a 50 ft. wave pattern. This may repeat for miles. Where did the larger pattern come from? I know that at the thermodynamic level there is chance variance; any thermodynamic analysis at this smallest level can only be addressed by probability. But there is no immediate, scientific causality for a 50 ft. amplitude, a 200 ft. reach, and a three-mile breadth. Determinism is mixed in, and of course the whole system balances to the first law of thermodynamics. The difference in scales is on the order of magnitude of 10 to the 20th.

Many years later, when I heard about Complexity Theory and self-ordering systems in the book Complexity, I had recognition but no surprise. You see – anyone who wants to analyze deeply and follow their own questions may get some insight. These thoughts anticipated a complex system with energy input that is over-driven: the system deals with the energy heading toward chaos but self-organizes into a larger cyclical pattern on a completely different scale. Small-scale, energy-driven input —> chaos —> self-organization —> emergence of a new pattern at a higher level, from additional causality completely unexplainable by low-level deterministic causality – but never contradicting determinism; that would be magic. It is only additional causality. Is it possible that deterministic causality does not care (has no deterministic solution) at the lowest level of a system approaching chaos/randomness? While the cat (determinism) is away, the mice (emergent self-organization) play.
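The ocean-wave system is far beyond any simple model, but the bare pattern – a deterministic rule that, as it is driven harder, moves from equilibrium to cycles to chaos – can be sketched with the classic logistic map (a standard textbook toy, my own choice of illustration, nothing specific to waves):

```python
def logistic_orbit(r, x0=0.2, burn=1000, keep=8):
    """Iterate the deterministic rule x -> r*x*(1-x) and report the distinct
    values it settles into. The parameter r plays the role of the external
    energy driving the system."""
    x = x0
    for _ in range(burn):          # let transients die out
        x = r * x * (1 - x)
    orbit = set()
    for _ in range(keep):
        x = r * x * (1 - x)
        orbit.add(round(x, 6))
    return sorted(orbit)

print(len(logistic_orbit(2.5)))            # 1: gentle drive, a single equilibrium
print(len(logistic_orbit(3.2)))            # 2: harder drive, a cyclic pattern emerges
print(len(logistic_orbit(4.0, keep=100)))  # over-driven: chaos, no repeating pattern
```

The same exact rule produces equilibrium, then a larger cyclic pattern, then chaos, depending only on how hard it is driven – which is at least suggestive of the wind-and-water picture above.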

Does this limit emergent self-organization to marginal, low-level, or minor effects? An early form of algae oxygenated the entire earth. Some mice have big dreams.


From Wikipedia – “Complexity characterises the behaviour of a system or model whose components interact in multiple ways and follow local rules, meaning there is no reasonable higher instruction to define the various possible interactions”

The term is generally used to characterize something with many parts where those parts interact with each other in multiple ways, culminating in a higher order of emergence greater than the sum of its parts. The study of these complex linkages at various scales is the main goal of complex systems theory.

Alright, at this point we have to deal with my limitations. I have two principles that I admire. One is to always feel free to consider “It ain’t necessarily so”; the other is to freely admit, at some points, the three magic words: “I don’t know.” So what’s this stuff about emergence? There is something that moves complex systems to be creative – to have a property appear that was not caused by the simple underlying laws at the lowest levels. A little glib – yes. How does it work exactly? I don’t know. There are hints – vague ones. I love the glib and indefinite causality of “something.” I will address “something” later.

The initial inventory of all positions and dynamics, using the terminology of thermodynamics, is the original “information” of the Universe-system. In the clockwork model, this information should encompass the information of all future states.

Information: One possible problem is the concept of exact knowledge – a measurement (information) that approaches infinite precision in a continuum is a string of digits approaching the infinite, and it cannot be encompassed in a finite universe. I know that quantum theory seems to bound the lowest levels of a continuum, but I will not expound on an area where I have little knowledge. In the mixed model with additional causality, the needed information can exceed the capacity to encompass it.
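The “string of digits approaching the infinite” point can be made concrete with a small sketch (the choice of √2 is just my illustration of a point on a continuum): no finite string of digits ever pins the value down exactly, at any precision.

```python
from decimal import Decimal, getcontext

def residual_at(digits):
    """Best decimal approximation of sqrt(2) at a given precision, and the
    leftover error when it is squared. The residual never reaches zero:
    exactness would require an infinite string of digits."""
    getcontext().prec = digits
    root = Decimal(2).sqrt()
    getcontext().prec = 2 * digits + 2   # enough precision to square exactly
    return root * root - 2

for digits in (10, 50, 200):
    print(digits, residual_at(digits) != 0)   # True at every precision
```

More digits shrink the error but never eliminate it – exact knowledge of even one continuous quantity is an infinite amount of information.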

If a future state has more information than the initial state, that is impossible under the thermodynamic rules of determinism. Of course, if in addition to strict determination there is additional causality from emergent self-organization, then that would explain the surplus, unaccountable information. Life emerged at a specific but thermodynamically chance moment, from a soup of proteins that initially approached randomness at the lowest molecular level. It was emergent, with no deterministic necessity – no causality for that exact moment. Think of the information delta if that had happened a million years sooner or later (or one second, for that matter). This would exceed all the hard work of all the butterflies that ever existed in messing with tornado schedules. Think of all the information in the different molecular positions if chance is involved. And this is just the start. Think of all the new Additional Causality and the new rules of living things. All the evolution from the first RNA groups, then DNA, up until the present is determined by the deterministic setting plus the new rules – a platform – imposed by this living system.

The good thing about strict determinism is that it is one clear scientific rule that explains everything – by definition. It is easy and clear to define. If you try to have it both ways – some “inconsequential” processes and outcomes may not be caused by physical laws – then how can you define or know the limit of the “extra” causality? Remember the algae? If living things are not strictly bounded by determinism (emergent self-organization), then they run amok. Actually, seen from a totally detached, objective view, the human race does run amok quite often.

Reductionism in science: Scientific reductionism is the idea of reducing complex interactions and entities to the sum of their constituent parts – Momentum = Mass × Velocity – the result is an exact mathematical algorithm, and it is elegant. Some scientists extrapolated this into the belief that every single process in nature can be broken down this way. This is the orchestrating insight needed to support determinism across many complicated inputs and processes. For complex systems with many interactions, this concept is a huge weakness for determinism. Even in scientific circles, scientist-mathematicians like Dr. David Berlinski ask: where are the formulae in the Theory of Evolution? Where are the standardized units of measurement? When you proceed to the worlds of abstraction – culture, religion, ideology, etc. – there is a complete blowout. No mere butterflies there.

The same questions are unanswered in all the behavioral sciences. Where are the quantitative measurables, the linear algebra formulae, and the units for culture, consciousness, religion, feelings, associations? How can we rein in – no, exactly determine – the effects of abstractions throughout an individual’s life, and through millions of years of human history? These abstractions have their own rules and contribute to a deterministic outcome married to additional causality. On the internet, one observer asked – for just one tiny example – what are all the processes, in ultimate detail, that determined how an individual is affected by the shower scene in the movie Psycho? Twenty years later, an involuntary shudder is experienced. What is the precise chain of causality? What is the exact output? What are the units and measurables? There is no simple, pure-science answer here. Abstractions are involved. Once you start thinking this way, examples abound. When Christ, Gandhi, or Martin Luther King were born out of a chance process – what changes would follow? I will go further and say that in some realms – consciousness or behavioral science, for example – you can forget an Aristotelian edifice as a foundation. Instead, start in the middle: use initial observation, use science-based techniques, and just move ahead. See the other entries on my blog about dreaming and hypnosis. You cannot start from the lowest deterministic base and proceed upward from there. No one has succeeded to any degree. It would seem that anyone who indicated they were significantly closing in was bluffing.

If we look at the entire earth, populated with living things in every possible ecological opening, two things are undeniable. One: it does not seem possible that living things could arise out of groups of molecules organizing themselves, when the minimum criteria for even the first step are so high and complex. And two: there they are.

In the book The Prime Number Conspiracy there is a reading about trends at the frontiers of mathematics – an article, “In Mysterious Pattern, Math and Nature Converge,” on page 37. Various tests and analyses of number sequences are expected to increase in randomness, but a pattern emerges in many unrelated areas: the spacings between occurrences fall into an unexpected shared pattern. It is called Universality, and it appears “when systems that are very complex, consisting of many parts that strongly interact with each other to create a spectrum.” This surprising pattern appears in many disparate environments – which is why it is called Universality – “Something.” Yes, I know this is vague, and I do not understand where it comes from. But it is consistent with the ability of complex systems to defy the tendency toward randomness at some point and self-organize. It is the first step of a process that changes everything at higher scales. It seems to be the missing link or, at least, one part of it. Remember: this starts with fundamental math – the skeleton of reality.
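A toy numerical sketch of that spectrum pattern (my own construction, using the smallest possible “strongly interacting” system, a random symmetric 2×2 matrix): the gaps in the spectrum of an interacting system avoid being very small, in a way that gaps between independent random values never do.

```python
import math, random

random.seed(1)

def eigen_gap():
    """Gap between the two eigenvalues of a random symmetric 2x2 matrix
    [[a, b], [b, d]] -- the smallest toy of strongly interacting parts."""
    a, d, b = random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1)
    return 2 * math.sqrt(((a - d) / 2) ** 2 + b ** 2)

def tiny_gap_fraction(gaps, cut=0.1):
    """Fraction of gaps smaller than `cut` times the mean gap."""
    mean = sum(gaps) / len(gaps)
    return sum(1 for g in gaps if g < cut * mean) / len(gaps)

n = 20000
interacting = [eigen_gap() for _ in range(n)]
independent = [random.expovariate(1.0) for _ in range(n)]  # gaps of unrelated events

# The interacting spectrum shows the universal pattern: near-zero gaps are
# rare, while independent gaps show no such avoidance.
print(tiny_gap_fraction(interacting) < tiny_gap_fraction(independent))  # True
```

The same gap statistics reportedly show up in systems as different as atomic nuclei and bus schedules, which is the surprise the article is about.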

As a parallel thought, I challenge anyone to define randomness. The best mathematicians have come to realize that you cannot use any reduction; randomness rejects any generalization. There is a paradox between any encompassing insight and randomness. If mathematics cannot produce a function that computes randomness, how can the deterministic universe act like a computer that can support the random needs of a process gone complex? If mathematics is skeletal for reality, this may be why there is a rejection, at the lowest levels, of any function trying to emulate a “pure” random generator – (“Something”?).
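The point that no function can produce true randomness can be illustrated with any pseudo-random generator (the sketch is mine; the constants are the commonly published “Numerical Recipes” values): the whole random-looking stream is rigidly entailed by one small seed.

```python
def lcg_stream(seed, n, a=1664525, c=1013904223, m=2**32):
    """A linear congruential generator: a deterministic function dressed up
    as randomness. The entire stream is entailed by (seed, a, c, m)."""
    out, x = [], seed
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x)
    return out

# Same seed, same "random" numbers, every time -- a reduction (compression)
# that a truly random sequence, by definition, cannot have.
print(lcg_stream(42, 5) == lcg_stream(42, 5))  # True
```

A truly random sequence cannot be compressed down to a seed and a formula; anything a deterministic universe computes can be.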

If a deterministic process that needs randomness must fail, then it needs intelligence for a solution that minimizes entropy. The smallest unit of intelligence might be any mechanism of selectivity. This selectivity has no insight or foresight – it is accidental. Water going down a drain, or warm and cold air needing to interchange – they need to use a mathematical model: a helix. Any starting selectivity, even an accidental one, could start a chain reaction that engages all the molecules on a vastly higher scale and solves all the complex interactions into a reduced state of entropy at a lower energy cost. This reminds me of quantum computing. It is the opposite of, and a rebound from, going to infinite complexity – but not deterministic.