One conclusion that should be viable and understandable, though not always fully appreciated, is that a system can, absent mitigating factors and driven by a deterministic function, exhibit stable, periodic behavior that is constant and consistent. If that behavior cannot be established and maintained over time, it is not a system. An established system is likely to develop some resilience to drastic external or internal change, which means that fundamentally changing a system, particularly an entrenched one, requires far more energy than is often appreciated.

Another conclusion, far less intuitive, even cognitively dissonant, but just as viable, is that a deterministic function can produce a random output sequence. The alternative to a deterministic function is a stochastic one, in which the same input does not always produce the same output: some element of chance yields a random result, as in a fair coin-toss sequence. The behavioral orbits are unstable and aperiodic.

What we have, then, is a deterministic, rule-based system that, once past a limited region of predictability, behaves unpredictably despite being deterministic: a system in which the function has the property of being deterministic while the qualities of its output are random.

The Complexity course teaches that it is important to distinguish between the properties of a process or system that generates an outcome (the cause) and the properties of that outcome (the effect), especially in the long term.

The course demonstrates step by step that the logistic equation with r = 4 [where the next value of x is given by r·x(1 − x)] is as random as a fair coin-toss series. A statistical test would be unable to distinguish between the sequence produced by the logistic equation and one produced by random coin tosses.
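A minimal sketch of this comparison (my own illustration, not code from the course): iterate the logistic equation with r = 4 and record an H whenever the value lands below 0.5 and a T otherwise. The seed 0.3 and the H/T threshold are illustrative choices.

```python
def logistic(x, r=4.0):
    """One step of the logistic equation: x -> r * x * (1 - x)."""
    return r * x * (1.0 - x)

def coin_sequence(x0, steps):
    """Turn a deterministic orbit into an H/T string: H if x < 0.5, else T."""
    x, symbols = x0, []
    for _ in range(steps):
        x = logistic(x)
        symbols.append("H" if x < 0.5 else "T")
    return "".join(symbols)

# Every symbol is fully determined by the seed, yet the string
# is statistically indistinguishable from fair coin tosses.
print(coin_sequence(0.3, 20))
```

Counting heads over many iterations gives a fraction near one half, just as a fair coin would, even though nothing in the rule involves chance.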

The idea is that a deterministic dynamical system is capable of producing random, or in other words chaotic, behavior regardless of how close the system is to the annihilation population [reflected in the (1 − x) term, where x is between 0 and 1]. Keep in mind that we often have no idea what the annihilation population is, only that by mathematical logic there must be one. This applies, in a relatively simple sense, to a finite population of similar elements or units that die off or are eliminated and must be propagated to maintain or increase the population; over a limited number of time periods, the population will be eradicated if it is not.

This result of the logistic equation has been proven mathematically, exactly and rigorously; it can be deduced from first principles. The claim is rigorously established, not merely a computer or experimental result.

The long-term behavior of an aperiodic, or chaotic, orbit depends very sensitively on its initial conditions. In a dynamical system featuring the phenomenon known as "sensitive dependence on initial conditions" (SDIC), known more popularly as the "Butterfly Effect," even extremely small differences in initial conditions can grow to become exceedingly large. This idea applies to numerous dynamical systems, not just iterated functions. It also has a more formal mathematical definition, which is provided in the course.

To predict the behavior of a system with sensitive dependence requires knowing the initial condition with impossible accuracy. An example used in the course demonstrates that a difference of nanometers can produce very different results within a few time steps.

The course provides one example of tremendously improving the precision of a measurement to 15.00000001 when the true value is still 15.0; the prediction nevertheless becomes worthless after relatively few more time steps. To help visualize this degree of sensitivity: 15 meters is about as tall as a 5-story building, while 0.00000001 meters, about 10 nanometers, is roughly 1,000 times smaller than a single red blood cell and about 10 times larger than a single glucose molecule.

Something 15 meters long versus something 15 meters plus 10 glucose molecules long will thus exhibit completely different behavior after just a few time steps. Practically speaking, the difference between 15.0 and 15.00000001 isn't simply a matter of not having good enough measuring instruments. A very small error in the initial condition grows extremely rapidly, meaning long-term and even medium-term prediction are impossible. More accurate measurements can lead to more accurate and longer-term predictions, but we have to work exceedingly hard for only slightly better results. It is r, the growth factor, that makes the significant difference, not x.
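The growth of that tiny error can be sketched directly (my own illustration, with an arbitrary seed): start two orbits of the logistic equation with r = 4 differing by one part in a hundred million, the same order as 10 nanometers on 15 meters, and track the gap between them.

```python
def logistic(x, r=4.0):
    """One step of the logistic equation: x -> r * x * (1 - x)."""
    return r * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-8  # initial conditions differing by 1 part in 10^8
max_gap = 0.0
for step in range(60):
    x, y = logistic(x), logistic(y)
    max_gap = max(max_gap, abs(x - y))

# The gap roughly doubles per step on average, so within a few dozen
# steps the two orbits bear no resemblance to each other.
print(f"largest gap over 60 steps: {max_gap:.3f}")
```

By the end of the run the two orbits, which began closer than any instrument could distinguish, are separated by a large fraction of the whole unit interval.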

The course has us imagine one version of a hurricane's path hitting New York City and another version hitting North Carolina, based on a difference as tiny as the flapping of a butterfly's wings. Phenomena such as this are essentially unpredictable, because one can never measure values for something like this that are so accurate, or even physically meaningful. The course quotes James Gleick, from his book, Chaos, who explains that:

*‘It's like giving an extra shuffle to a deck of already well-shuffled cards. You know that it will change your luck, but there's no way of knowing how it will change it.’*

Even computers, limited by finite precision and having to round off numbers, can't calculate the true orbit we thought we were dealing with. The orbit a computer gives us is never the actual true orbit for a particular initial condition. The course explains that the computed orbit "shadows" a true orbit, a result known as the "shadowing lemma."
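One way to see the finite-precision effect (a sketch of my own; the seed 0.4 and the 60-digit precision are arbitrary choices): iterate the same seed once in ordinary double precision and once in 60-digit decimal arithmetic, treating the 60-digit orbit as a stand-in for the true orbit over this short run.

```python
from decimal import Decimal, getcontext

getcontext().prec = 60  # 60 significant digits vs. ~16 for a Python float

def step_float(x):
    return 4.0 * x * (1.0 - x)

def step_decimal(x):
    return 4 * x * (1 - x)  # same rule, evaluated in 60-digit Decimal arithmetic

xf, xd = 0.4, Decimal("0.4")  # the float 0.4 is already not exactly 4/10
max_gap = 0.0
for _ in range(80):
    xf, xd = step_float(xf), step_decimal(xd)
    max_gap = max(max_gap, abs(xf - float(xd)))

# Rounding error near the 16th digit is amplified until the double-precision
# orbit no longer resembles the (nearly true) 60-digit orbit.
print(f"largest gap between the two computed orbits: {max_gap:.3f}")
```

The double-precision orbit is still a legitimate-looking chaotic orbit; it simply belongs, approximately, to some nearby initial condition rather than to the seed we typed in, which is the point of the shadowing lemma.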

Chaos, as exhibited by the logistic equation, can then be defined in a mathematical sense. A dynamical system is "chaotic" if the following four criteria are met:

- The system has to be deterministic; iterated functions and differential equations certainly are. A dynamical system is just a deterministic rule: if one knows the rule and one knows (with infinite precision) the initial condition, then the trajectory is unique. It's determined.
- The system's orbits are bounded, unable to reach infinity. The logistic equation’s orbits start between zero and one and stay between zero and one.
- The orbits also have to be aperiodic: they never repeat, never follow the exact same path twice, and never settle into a cycle. Since the boundedness requirement already rules out orbits going off to infinity, bounded aperiodic orbits are confined to the unit interval and yet never repeat.
- The system has sensitive dependence on initial conditions, as was demonstrated with r = 4 for the logistic equation.
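Two of these criteria, boundedness and aperiodicity, can be spot-checked numerically for a sample orbit (my own sketch; the seed 0.3 and 1,000-step window are arbitrary, and a finite computation can only fail to find a repeat, never prove aperiodicity):

```python
def logistic(x, r=4.0):
    """One step of the logistic equation: x -> r * x * (1 - x)."""
    return r * x * (1.0 - x)

x, orbit = 0.3, []
for _ in range(1000):
    x = logistic(x)
    orbit.append(x)

bounded = all(0.0 <= v <= 1.0 for v in orbit)  # never leaves the unit interval
distinct = len(set(orbit))                     # an exact repeat would signal a cycle
print(f"bounded: {bounded}, distinct values: {distinct} of {len(orbit)}")
```

Determinism is built into the rule itself, and sensitive dependence is the property demonstrated above with the two nearby seeds; this check only illustrates the two orbit-level criteria.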

This definition also provides a more in-depth understanding despite our inability to reach infinitely fine precision. For myself, having a mathematical foundation as a basis for understanding provides a great deal of confidence, but a confidence that can check itself: not the confidence to predict where the storm will turn, but the confidence to navigate the best possible course.

As Prof. Feldman advises, we sometimes have to invert our usual thinking of starting with an equation, because we don't get handed equations; we get handed life, and sometimes we can turn it into data. We can't assume that the world is made up of things that are either orderly or random and that these are separate. It seems reasonable that we are wrong in thinking that, though they may get jumbled together, order and randomness are separate things needing different types of explanation and different means of management.

One can get disorder from an orderly system; one can get deterministic randomness. We then need to think about determinism and randomness in a completely different way: in a sense they are two sides of the same coin, not complete opposites. The relationship between randomness and order is more subtle than we might have thought.