No, it's based on the fact that you maxed out at 3, and that you only had 3 runs of 3 or more.
Boy, you are drowning in statistical white noise. Almost any sufficiently large sample of random numbers will yield low-probability, statistically insignificant patterns or groupings. Consider:
122221222222222112221222122112
This obviously came from a non-random process (e.g., a person hitting 1's and 2's on a keyboard), as I will clearly demonstrate:
- In the first fifteen digits, there are thirteen 2's! The odds of that happening are very low (a rough calculation follows after this list). Clearly, someone had their finger on the "2" key a little too long, then threw a couple of 1's in there so it wouldn't look too obvious.
- In the next fifteen numbers, there are three times as many 1's as in the first fifteen numbers. Again, not too likely for that to happen. Looks like someone was trying to even out the set by adding three times as many 1's in the 2nd half.
- The runs of two or more, in order, are four 2's, nine 2's, two 1's, three 2's, three 2's, two 2's, and two 1's. A very improbable set of runs! Also, you can see the runs drop off dramatically in size after the nine 2's. This is obviously a sign that the person was afraid another long run would make the sequence appear deliberate.
- The distribution is twenty-two 2's and eight 1's. A fairly improbable distribution. But a skewed distribution is just the sort of thing someone would pull because large random sequences often don't appear random.
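For what it's worth, here is how low those odds actually come out. A minimal sketch, assuming each digit is an independent, fair 50/50 pick between 1 and 2 (my assumption, purely for illustration; nothing above specifies the generating process):

```python
from math import comb

def prob_at_least(n, k):
    """P(at least k 2's among n fair 50/50 digits)."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2**n

print(prob_at_least(15, 13))  # thirteen or more 2's in the first fifteen digits: ~0.0037
print(prob_at_least(30, 22))  # twenty-two or more 2's out of all thirty digits:  ~0.0081
```

So "very improbable" here means roughly one-in-a-few-hundred, not one-in-a-trillion, which is exactly the point.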
When you multiply all these improbabilities together, it is clearly the work of someone just hitting 1's and 2's on their keyboard. What will they pull next? A maximum of three runs of three in a 55-number sequence?
In other words, you're taking statistical white noise and making huge extrapolations based on arbitrary patterns/groupings you've picked out. A maximum of three runs of three numbers is improbable, but not significant, just as nothing but two- and three-digit runs following a long run of nine (as in the example above) is improbable but not significant. Maybe it's a person not wanting the sequence to look too deliberate after a long run, or maybe it's just chance.
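To put a rough number on that, here is a quick Monte Carlo sketch of the very pattern in question, assuming a fair 50/50 generator and the 55-digit length mentioned above (both assumptions are mine, for illustration only):

```python
import random

def run_lengths(seq):
    """Lengths of the maximal runs of identical digits."""
    runs, count = [], 1
    for prev, cur in zip(seq, seq[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return runs

trials, hits = 100_000, 0
for _ in range(trials):
    seq = [random.choice("12") for _ in range(55)]
    runs = run_lengths(seq)
    # "Maxed out at 3" with exactly three runs of three or more:
    if max(runs) == 3 and sum(r >= 3 for r in runs) == 3:
        hits += 1

print(hits / trials)  # small, but nowhere near one-in-trillions
```

The exact fraction will wobble from run to run, but it lands squarely in "improbable yet unremarkable" territory, which is the whole point.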
Contrast that with patterns that are statistically significant (nothing but HTHTHTHTHTHTHTHTH..., all heads, all tails, etc.). But you don't have that in the first sequence of numbers I gave many posts ago. You have a piece of improbable white noise you've convinced yourself is statistically relevant. Well, I think thirteen 2's out of the first fifteen numbers is even more improbable and just as indicative of intent. Especially the number of small runs right after that large group of 2's! So clearly, I created the above set of numbers.
Oops!
Edit: To tie this all in to the FT argument:
You're arguing that a universe that collapses in on itself after .00012879 seconds is extremely improbable and statistically significant. True, the specific amount of time is improbable, but it is not significant, because it is easily explained by chance alone. There is no reason to invoke a multiverse to explain that particular amount of time, nor a fine-tuner.
However, if there are trillions of ways the universe could have gone, and only a few are life-permitting, then that is both extremely improbable and statistically significant. Chance alone fails to explain it, unless combined with a multiverse or an oscillating universe. Some examples of the precise values needed:
"The anthropic constraints associated with the formation of galaxies involve various cosmological parameters, such as the density of the matter in the universe, the amplitude of the initial density fluctuations, the photon-to-baryon ratio and the cosmological constant (an extra term Einstein introduced into his field equations for cosmological reasons and which may cause the universe to accelerate). Some of these parameters might be determined by processes in the early universe rather than being prescribed freely as part of the initial conditions. However, as Martin Rees discussed, even small deviations from the observed values of such parameters would exclude the formation of structures like galaxies and their subsequent fragmentation into stars."
"Heinz Oberhummer, who has studied this resonance in more detail, reported some beautiful work showing how the amount of oxygen and carbon produced in red giant stars varies with the strength and range of the nucleon interactions. His work indicates that the nuclear interaction must be tuned to at least 0.5% if one is to produce both these elements to the extent required for life."
"It seems that aG must be roughly a20 for both "convective" and "radiative" stars to exist (prerequisites for planets and supernovae, respectively) and roughly aW4 for neutrinos to eject the envelope of a star in a supernova explosion (necessary for the dissemination of heavy elements). These "coincidences" might be regarded as examples of the strong anthropic principle."
And the introduction to the article was very interesting:
"Cosmologists who study the link between life in the universe and the values of the physical constants were once viewed with suspicion by other scientists. But a recent high-profile conference at Cambridge showed that the subject is fast becoming academically respectable."
http://physicsworld.com/cws/article/print/3