Anders Lindman
Penultimate Amazing, joined Sep 17, 2010, 13,833 messages
If you think about Complexity in terms of Information, I think that the useful energy involved depends on our ability to deal with uncertain and/or redundant conditions.
Such conditions are usually treated as "white noise" that has to be reduced in order to get the desired results with minimum energy.
I think that the tendency of "getting maximum results by using minimum energy" can quickly lead us into triviality and atrophy if redundancy and uncertainty are not taken as natural and challenging conditions, which stimulate us to research beyond our current cozy spotlight.
I have found that the mainstream of the mathematical development of the past 3,000 years is based on a step-by-step serial view of the researched (abstract or not) things, and as a result it uses techniques that are tuned to deal with strict-only information.
Take for example the intuitive concept of Set, which is based on a collection of distinct things such that, for example, {a,a,b}={a,b}.
It means that a property like redundancy, as found in {a,a,b}, is not taken as an essential case of information but as a case that has to be reduced into {a,b} in order to get the optimal conditions for analytic research, which is basically a step-by-step serial way of thinking.
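The set-versus-multiset distinction described above can be sketched in Python: the built-in set discards repetition, while a multiset (here, collections.Counter, used purely as an illustration) keeps the redundancy as information in its own right.

```python
from collections import Counter

# A set collapses redundancy: {a,a,b} is the same object as {a,b}.
assert {'a', 'a', 'b'} == {'a', 'b'}

# A multiset keeps the repetition, so {a,a,b} and {a,b} stay distinct.
assert Counter('aab') != Counter('ab')
print(Counter('aab'))  # Counter({'a': 2, 'b': 1})
```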
By using only analytic methods, our ability to think in parallel is neither stimulated nor developed.
Uncertainty and Redundancy are actually natural results of parallel thinking, which deals with information at once, where order and strict distinction are not its natural characteristics.
Take for example the concept of Superposition of opposites, which is understood in terms of an addition of values that sum to 1, for example: 0.5True + 0.5False.
This approach weakens the involved opposites in order to avoid logical contradiction, where this logical contradiction is a direct result of step-by-step thinking, which can't deal at once with opposites like True and False.
But under parallel thinking, 1True + 1False is simply uncertain information of "strong" superposition, which may or may not collapse into a certain 1True or a certain 1False.
For more details about this non-standard view, please look at http://www.scribd.com/doc/18453171/International-Journal-of-Pure-and-Applied-Mathematics-Volume-49 .
Ok, that's similar to my guess in another post that what you mean by entropy is related to redundancy, which can be seen as a kind of noise.
Yeah, that could be true. And redundancy is also related to uncertainty, at least as far as I can see. For example, if we have the strings ab, aab, aaab, aaaab and bab, then only the strings ab and bab are free from 'noise', so to speak, and thereby free from uncertainty in some sense.
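One illustrative reading of "redundancy as noise" in those strings (my own sketch, not something established in the thread) is run-length encoding: strings with repeated runs compress, while ab and bab do not.

```python
from itertools import groupby

def run_length(s):
    """Run-length encode a string: each (char, count) pair records a run.
    Any count above 1 marks redundancy in the string."""
    return [(ch, len(list(run))) for ch, run in groupby(s)]

for s in ['ab', 'aab', 'aaab', 'aaaab', 'bab']:
    enc = run_length(s)
    redundant = any(count > 1 for _, count in enc)
    print(s, enc, 'redundant' if redundant else 'no redundancy')

print(run_length('aaab'))  # [('a', 3), ('b', 1)]
```

Under this reading, ab and bab are exactly the strings whose every run has length 1, which matches the claim that only they are free from 'noise'.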