As always, plenty of unrelated data....
Except that's not true. Twice now you've dismissed my thorough coverage of a topic with these one-liner deflections, then you come back a day or so later and try to regurgitate that same coverage, which is now somehow relevant again. Only you try to pretend you're the one teaching it, as if it were something you knew all along. Do you really think people don't see through these obvious stunts?
...and complete lack of knowledge of the concept of randomness. Just stop pretending that you know something about statistics and follow this link
https://en.wikipedia.org/wiki/Random_sequence
Yes, any beginning student is familiar with what a random sequence is. What I displayed, and what you obviously cannot understand, is how the constructs of randomness are actually used to achieve the desired ends in the statistical control of data collection and analysis, and how the randomization itself becomes moot once the ends are met. Further, in your rush to present an elementary concept as if it were some great cosmic truth, you have ignored what I said about the nature of randomness and how it applies to your assertion that "randomness" prevents any data from ever being removed from a dataset.
The sine qua non of randomness is independence. What makes a sequence truly random is the property that the value of any number in the sequence is wholly independent of any other value. Wikipedia doesn't really get into that, and that seems to have limited your understanding of the subject. If we say that no two elements in a random sequence can depend in any way upon each other, then this severely limits what we can say about relationships within and among sets suggested by any rule we invent as functions of such a sequence. That we sometimes assign meaning to those sequence values for one purpose or another does not mean that the properties which attach to those purposes somehow trickle back to the numbers themselves, or that the numbers must remain faithful to those meanings.
Random variables that generate such sequences stand in for quantities we do not know, or cannot observe, but about which we know some things generally. A sequence of values generated by a fair die, for example, is expected to be a random sequence governed by the rule that each of the six possible values can occur with equal probability at each roll, independent of any prior or subsequent roll. That's something we know generally about a fair die, and stands in for the complex Newtonian dynamics behavior we know actually determines the outcome. That we play various games with dice in which we assign different meaning to those outcomes does not violate the randomness of the underlying process. The random sequence serves the process of game play. That's not to say that all other elements of game play must also adhere to randomness. The rules of game play are entirely separate from the properties of Newtonian dynamics, and from the random sequence that represents them.
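The fair-die model described above is easy to sketch in code. Here is a minimal Python illustration (entirely my own; the variable names are hypothetical and this has no connection to any experiment discussed here) of what independence buys you: the chance of rolling a six does not change just because the previous roll was a six.

```python
import random
from collections import Counter

random.seed(1)

# Simulate a fair die: six equiprobable faces, each roll independent
# of every prior or subsequent roll.
rolls = [random.randint(1, 6) for _ in range(60_000)]

# Marginal frequency of each face: roughly 1/6 apiece.
marginal = Counter(rolls)
p_six = marginal[6] / len(rolls)

# Conditional frequency of a six given the previous roll was a six.
# Independence says this should match the marginal rate (~1/6).
after_six = [b for a, b in zip(rolls, rolls[1:]) if a == 6]
p_six_after_six = after_six.count(6) / len(after_six)

print(round(p_six, 3), round(p_six_after_six, 3))  # both near 0.167
```

The point of the sketch is that the model captures only what we know generally about the die; the Newtonian dynamics that actually determine each outcome never appear.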
In the empirical sciences, a sequence produced by a random variable stands in for quantities the experimenter should not know, because if he knew them he might apply an unconscious bias and skew the results. Zimbardo tossed a coin to determine whether any given subject should be a Guard or a Prisoner. That process stood in for what he or the other experimenters might have thought about the propriety of such an assignment. All the variables that could have contributed, consciously or unconsciously, to the assignment of role were deliberately set aside. Zimbardo didn't get to say things like, "Hm, I think Harry would be a good guard," or "I think it would be interesting to see how the big Hispanic guy fares as a prisoner."
But then of course once the random sequence had done its job, the rest of the experiment was governed by the rules invented to apply to Guards and Prisoners, one of which was that Guards had to honestly pretend to be guards. Similarly, once the dice are rolled, the rules of craps determine what happens next. If the player breaks one of those rules and is dismissed from the game, it doesn't change the random process that provided the antecedent to that rule.

This is where it's very important to understand independence. The assignment of role -- Guard or Prisoner -- doesn't create any sort of new dependence within the group, or new independence between the groups, that the prior randomness cares about. Tom and Dick may both have been assigned as Guards, but that doesn't mean that Tom's and Dick's coin tosses are now somehow dependent. They are related only by the contrived rule that was driven by a random variable, not by any property of the variable itself. Nor, if Harry is a prisoner, does this mean that any sort of dependent disjunction now exists between him and either Tom or Dick. Indeed, even if Dick gets hit by a bus and killed, his consequent withdrawal from the Guard group has bugger-all to do with the independence of the coin toss that put him there in the first place. The independence property says we can remove any arbitrary element from a random sequence and the sequence will remain random. It's vitally important that this property hold. Contrary to your assertion, randomness says we must be able to remove any arbitrary element of the sequence without the sequence collapsing into non-randomness.
The key concept there being any arbitrary element. We can't remove a number based on its value or on a relationship of its value to any other value. "Remove every third element" is okay. "Remove all the even numbers" is not, because a random sequence must be able to produce an even number at equal probability with an odd one. "Remove each number that is greater in value than its predecessor" isn't okay either, because that would constrain the value of every item according to its place in the sequence. "Remove the first N elements" is valid. In your rush to pontificate, you forgot that Jahn's REG used a shift register that happily discarded some of the random bits, not based on their value but upon where each bit occurred in the sequence, whatever its value. Commensurately, Zimbardo classified one of his subjects as an outlier based not upon which group he belonged to -- not upon whether the coin came up heads or tails in his case -- but upon violation of the rules of the experiment. That in no way affected the randomness of the remaining Guard group. The coin tosses that drove the rule that placed them there were still a random sequence of coin tosses, because their tosses were completely independent of the toss that put the outlier in that group.
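The distinction between position-based and value-based removal can be demonstrated with a small Python sketch (again my own illustration, not anything from the REG or prison studies): dropping every third element leaves the face frequencies uniform, while dropping the evens destroys uniformity outright.

```python
import random
from collections import Counter

random.seed(2)
seq = [random.randint(1, 6) for _ in range(60_000)]

def face_freqs(s):
    """Relative frequency of each die face 1..6 in sequence s."""
    c = Counter(s)
    return {face: round(c[face] / len(s), 3) for face in range(1, 7)}

# Position-based removal ("remove every third element"): the rule never
# looks at a value, so what survives is still a uniform random sequence.
every_third_removed = [x for i, x in enumerate(seq) if i % 3 != 2]

# Value-based removal ("remove all the even numbers"): the rule depends
# on the values, and the survivors can no longer be uniform over 1..6.
evens_removed = [x for x in seq if x % 2 != 0]

print(face_freqs(every_third_removed))  # each face still near 1/6
print(face_freqs(evens_removed))        # faces 2, 4, 6 now at 0.0
```

The same logic covers "remove the first N elements" (a position rule, harmless) versus "remove each number greater than its predecessor" (a value rule, fatal).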
Similarly, Palmer chose to disregard Operator 010 not based on whatever process selected her as a subject from the pool, but upon the conformance of her subsequent data to the expected distribution of results and upon suspicion of violation of a control protocol (i.e., the volitional variable, which seemed to defeat even Operator 010 every time). The chastisement you received for wanting to withdraw data was not because it violated randomness but because it violated homogeneity. I explained this previously. Initially, random sequences are used to achieve homogeneity. Thereafter, homogeneity is the property the experimenters endeavor to preserve.
Your knowledge of statistics is limited to elementary concepts you frantically Google for the day of, combined with your arrogant-yet-simplistic theorization for how things "must" work. And as such you can't possibly deal with an actual informed discussion of the topic, so now you're propping up all the standard excuses for why you don't have to deal with your most competent critics. Your act is nothing but bluff-and-bluster and gaslighting. Good luck finding an audience who will endure that, or even be fooled by it for very long.