Richard Gage Blueprint for Truth Rebuttals on YouTube by Chris Mohr

Just checking - do you know what a margin of error is?
Almost certainly ;)

Let me be a bit clearer, now that we've seen how ignorant MM is.
There's a lot of it around.

The margin of error reflects the range of results obtained when measuring the value of a variable physical property; it's fundamentally statistical in nature, so it's commonly expressed in standard deviations.
So you should probably be talking about uncertainty instead :)
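To make the distinction concrete, here is a minimal sketch (in Python, with made-up readings, purely for illustration) of what quantifying that uncertainty might look like for repeated measurements of a feature that should not be moving:

[code]
import statistics

# Hypothetical repeated position readings (pixels) of a feature that should be static.
readings = [100.02, 99.97, 100.05, 99.94, 100.01, 100.03]

mean = statistics.mean(readings)        # best estimate of the position
sd = statistics.stdev(readings)         # spread of individual readings
sem = sd / len(readings) ** 0.5         # uncertainty of the mean itself

print(f"mean = {mean:.3f} px, sd = {sd:.4f} px, sem = {sem:.4f} px")
# The "uncertainty" in question is characterised by the spread (sd/sem),
# not by a single quoted "margin of error" figure.
[/code]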

So where did MM make a stupid mistake? Well, he asked how large a margin of error it would take for the acceleration not to be freefall.
He said "how great", but that's just being pedantic, innit.

femr2's answer: No, sorry, even your inequality's the wrong way round.
lol.

Quite some effort to respond to a simple ">0" :)
 
You also could have just acknowledged that he was right.
I don't really think he should be talking about...
margin of error
...at all, but sure, if you choose to specify margin of error in this context. Perhaps tfk will turn up and bore everyone silly with protracted discussion of enormous levels of uncertainty using the most long-winded notation possible.

Of course, in the context of the data being discussed, and any non-zero "margin of error", it is still impossible to say if any particular value is a particular exact value.
 
Perhaps tfk will turn up and bore everyone silly with protracted discussion of enormous levels of uncertainty using the most long-winded notation possible.

You just described what an engineer does. (somehow it makes sense to other engineers). :D


Do you believe the NIST reports should have been written for laymen? (slightly off topic).
 
"Just checking - do you know what a margin of error is?..."

I was defending Chris7's point, at least until someone legitimately addresses it and proves it wrong.

Chris was addressing the real effect of noise on the accuracy of the individual data point readings;

"...By "noise" you are referring to the variance in the data points.

The point I want Chris Mohr to understand is - tfk's interpretation of the data points showing variances in velocity is incorrect, that's just "noise".

Wrong. You have acknowledged that the variation in the data points is "noise", not a variation of acceleration.

FFA is a very unique acceleration. When something is measured by several sources to be FFA within the margin of error, then it is considered to be FFA. That is the way NIST and Chandler or any reasonable person would describe it. ...
"

I wanted to know what % of the data plot readings was actually noise, and what effect that had on the 'real margin of error'.

For instance, how many, if any, of these data plots would not support FFA if the % of noise was subtracted?

Try and keep on topic David.

MM
 
I was defending Chris7's point, at least until someone legitimately addresses it and proves it wrong.

Chris was addressing the real effect of noise on the accuracy of the individual data point readings;



I wanted to know what % of the data plot readings was actually noise, and what effect that had on the 'real margin of error'.

For instance, how many, if any, of these data plots would not support FFA if the % of noise was subtracted?

Try and keep on topic David.

MM
Why can't you state a mechanism that would support a FFA as evidence of a CD? Are your handlers forbidding you to address this?

:confused:
 
I wanted to know what % of the data plot readings was actually noise, and what effect that had on the 'real margin of error'.

For instance, how many, if any, of these data plots would not support FFA if the % of noise was subtracted?

Whut?

Really, whut? Can you perhaps give a specific hypothetical example of what all this might mean?

I must say in all candor that if you are as confused as the text I just quoted implies, I would have no idea how to persuade you that a point has been legitimately addressed. If you really want to address some question about one of the plots, I think you'll somehow have to explain your question with reference to one of the plots.

This topic really isn't so hard. What is c7's evidence that the variation in measured accelerations is merely "noise"? As for his assertion, "When something is measured by several sources to be FFA within the margin of error, then it is considered to be FFA," first of all, what is the basis of that assertion? whence that serene, authoritative passive voice? Second, how could NIST's measurement of average acceleration -- regardless of its 'margin of error' -- be used to prove that the acceleration was constant?
 
I was defending Chris7's point, at least until someone legitimately addresses it and proves it wrong.

He has repeatedly been legitimately addressed and proven wrong.

1. If he cannot say the acceleration was exactly g from one time slice to the next, he cannot say that the acceleration was constantly g. Therefore he cannot legitimately claim a constant acceleration of FFA. Margin of error, you know.
2. In any case, he has not shown a scientific derivation eliminating causes of g acceleration other than his theory of CD for this particular incident. So g acceleration (your FFA) is not proof of CD.
 
Any reasonable and competent person would also quantify the margin of error in order to support that statement. Would you care to do so?
In this video Mr. Chandler states that the result is within 1% of FFA: 9.885 m/s² ± 0.456 [margin of error].
In NYC, g = 9.802 m/s², which is well within that margin of error.
He uses the terminology "Indistinguishable from free fall".
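A quick arithmetic check of those figures, taking the quoted ±0.456 at face value as the half-width of the interval (just the arithmetic, nothing more):

[code]
g_nyc = 9.802                    # m/s^2, quoted local value of g in NYC
a, half_width = 9.885, 0.456     # m/s^2, Chandler's figure and the quoted +/- value

print(a - half_width <= g_nyc <= a + half_width)    # True: 9.429 <= 9.802 <= 10.341
print(f"{100 * (a - g_nyc) / g_nyc:.1f}% above g")  # ~0.8%, i.e. "within 1%"
[/code]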

NIST did not say what the margin of error was but their measurement was within 1/10th of 1%. That is as close as could be measured and the difference is negligible - too small to be considered.

Both NIST and Chandler stated what a scientist or any reasonable person would state. WTC 7 fell at FFA for ~100'.

Only someone with a desire to deny anything that proves CD would try to deny that WTC 7 fell at FFA for ~100'.

I keep repeating these facts because y'all keep ignoring/denying them and insisting that WTC 7 did not fall at FFA. You are kidding no one but yourselves.

You can lie to yourself if you want, but for Chris Mohr to engage in mean-spirited, obnoxious putdowns, claiming that NIST and Chandler don't know what they are talking about, while appealing to the "authority" of biased anonymous posters and using those opinions as a valid rebuttal in a rebuttal video, is unconscionable.
 
Looking at the "margin of error" question, my limited understanding is that the data points created in the NIST Report were within the margin of error of their own measurements. I made that clear in my video 18, that we MAY be looking at >g or MAY be looking at margin of error.

If I understand correctly, FEMR is saying that his measurements are more precise, and the margin of error is smaller and some data points fall outside the margin of error and therefore strongly point to actual >g at some times. FEMR is this correct?

Chris7 your last post is unworthy of you and says more about you than about me.
 
Looking at the "margin of error" question, my limited understanding is that the data points created in the NIST Report were within the margin of error of their own measurements. I made that clear in my video 18, that we MAY be looking at >g or MAY be looking at margin of error.
That is called sophistry. You are trying to create a doubt about FFA when there is none.

The slightest movement in the camera, several miles away would account for the momentary >g anomaly.

FEMR's momentary >g anomaly is not the same as your ><>< interpretation, which you should now know is incorrect.

FEMR's graph confirms the NIST and Chandler statements that WTC 7 descended at FFA [and momentarily >g if you want to believe that] for ~100 feet.

FEMR's results, like Chandler's and NIST's, mean that all the supporting structure provided NO resistance after a descent of only 4 or 5 feet, when the exterior columns would still be providing considerable resistance; i.e., they were removed.
 
Only someone with a desire to deny anything that proves CD would try to deny that WTC 7 fell at FFA for ~100'.

Right here is the heart of your error.

You don't seem to understand that while you think that FFA would prove CD, most of us don't. So, while you apparently have a strong incentive to believe that "WTC 7 fell at FFA for ~100'," we don't have any corresponding incentive to deny it. For us it's an empirical issue.

FEMR's graph confirms the NIST and Chandler statements that WTC 7 descended at FFA [and momentarily >g if you want to believe that] for ~100 feet.

The question isn't whether someone wants to believe it; the question is whether it's supported by the data. Facially, femr2's analysis supports it.

Moreover, if average acceleration is very close to g over some time period, and if acceleration is > g for part of the period, then acceleration must be < g for some other part of that period. The math doesn't care what anyone wants to prove about CD.
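A toy example of that last point, with invented numbers chosen only to make the bookkeeping obvious:

[code]
g = 9.81   # m/s^2

# Hypothetical per-slice accelerations over five equal time slices (m/s^2):
a = [9.2, 9.9, 10.3, 10.1, 9.55]

print(f"average = {sum(a) / len(a):.2f} m/s^2")  # 9.81: the average is indistinguishable from g
print([x > g for x in a])                        # [False, True, True, True, False]:
                                                 # yet individual slices sit both above and below g
[/code]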
 
In this video Mr. Chandler states that the result is within 1% of FFA: 9.885 m/s² ± 0.456 [margin of error].
±0.456 is not the "margin of error". Uncertainty was not determined, and is certainly greater, probably by an order of magnitude.

NIST did not say what the margin of error was
Correct.

but their measurement was within 1/10th of 1%. That is as close as could be measured and the difference is negligible - too small to be considered.
Incorrect.

Both NIST and Chandler stated what a scientist or any reasonable person would state. WTC 7 fell at FFA for ~100'.
Incorrect...

NIST say their estimation of approximate and average acceleration is equivalent to the acceleration of gravity during that period.

...again, NIST say their estimation of approximate and average acceleration is equivalent to the acceleration of gravity during that period.

In addition, their data is flawed.

Only someone with a desire to deny anything that proves CD would try to deny that WTC 7 fell at FFA for ~100'.
Incorrect. Only a stubborn fool would continue to willfully omit a simple "~" before the "FFA".

I keep repeating these facts
You are repeating your personal and deliberate "version" of the "facts", and ignoring the "truth" in the process. A very silly idea.

because y'all keep ignoring/denying them and insisting that WTC 7 did not fall at FFA. You are kidding no one but yourselves.
My god, man. Why on Earth do you think there is a reason to NOT say ~FFA instead? AT FFA is false. At ~FFA is true. The distance is a bit out, more like ~80ft, but as I've said to you before... I have no big issue if you interpret the data as showing ~80ft of ~FFA.

Continuing to argue EXACTLY "FFA" is ... not very clever.
 
C7's stubborn insistence on repeating false things wrt the acceleration profile of some points on some collapsing assembly has only one explanation in my opinion:

C7 not only thinks that "FFA proves CD" (false), but that also "only and exactly FFA proves CD" (even falser).
 
Looking at the "margin of error" question, my limited understanding is that the data points created in the NIST Report were within the margin of error of their own measurements.
That doesn't make much sense. NIST do not state uncertainty.

If I understand correctly, FEMR is saying that his measurements are more precise
Correct.

and the margin of error is smaller
I wouldn't use that metric, but yes :) Uncertainty has not been fully established.

and some data points fall outside the margin of error and therefore strongly point to actual >g at some times. FEMR is this correct?
No, though I am confident >g is indeed reached.

I am confident that the general profile "shape" is true. Similar profile shape emerges for multiple datasets, multiple viewpoints and multiple smoothing methods...


There is also close agreement on profile magnitude, although all magnitude is subject to error imposed by translation from pixels to real-world units.

Checking static variance alone (the variance of the profile during the time the traced location is assumed to be static)...

...it is trivial to suggest ±3 ft/s², perhaps more.
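For anyone who wants to reproduce that kind of sanity check, a rough sketch of the idea (the values below are placeholders, not the actual extracted trace):

[code]
import statistics

# Acceleration values (ft/s^2) derived from the trace during the pre-release
# period, when the traced feature is assumed static and the true value is zero.
# Placeholder numbers, not the real data.
static_accel = [0.8, -1.9, 2.4, -0.6, 1.1, -2.8, 0.3, 1.7]

sd = statistics.stdev(static_accel)
print(f"static-period scatter ~ +/-{2 * sd:.1f} ft/s^2")
# Variations of this size in the collapse-period trace cannot be
# distinguished from measurement noise.
[/code]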
 
You are trying to create a doubt about FFA when there is none.
~FFA.

The slightest movement in the camera, several miles away would account for the momentary >g anomaly.
No, it would not. Similar profile shape (and >g behaviour) from multiple viewpoints.

The effect of camera motion is also minimised by subtraction of static point data...

Note the vertical scale. Sub-pixel scale motion of the camera is already subtracted from the data.
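If the subtraction step isn't clear, the idea is simply this (illustrative numbers only, not the real traces):

[code]
import numpy as np

# Per-frame vertical pixel positions (invented numbers):
y_building = np.array([500.0, 500.1, 500.4, 501.0, 502.1])  # traced building feature
y_static = np.array([120.0, 120.1, 120.1, 120.2, 120.1])    # fixed background feature

# Camera shake moves both features together, so subtracting the static trace's
# motion removes (most of) the common camera motion from the building trace.
y_corrected = y_building - (y_static - y_static[0])
print(y_corrected)
[/code]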

FEMR's graph confirms the NIST and Chandler statements that WTC 7 descended at FFA [and momentarily >g if you want to believe that] for ~100 feet.
No, that is YOUR interpretation. Not far off, but...

FEMR's results
My results :) Not YOURS...

[qimg]http://femr2.ucoz.com/_ph/7/2/5930538.jpg[/qimg]

My acceleration graph shows:

a) Rapid increase in acceleration from release to somewhat over-g in approximately 1s.

At the end of this period, the NW corner had descended ~9ft

b) Slow reduction in acceleration to approximately g over approximately 1.5s.

At the end of this period, the NW corner had descended ~83ft

c) More rapid reduction in acceleration to roughly constant velocity over approximately 2s.

At the end of this period, the NW corner had descended ~270ft


If you use the velocity graph...
[qimg]http://femr2.ucoz.com/_ph/7/2/135024800.jpg[/qimg]
...you'll obviously miss some profile shape detail, but you could say...

~1.75s at ~FFA (and I'd not complain too much)


I'd be okay with... ~1.75s of ~FFA, of which >~0.5s is over-g ... for the NW corner.
 
The effect of camera motion is also minimised by subtraction of static point data...
[qimg]http://femr2.ucoz.com/_ph/7/2/468167389.jpg[/qimg]
Note the vertical scale. Sub-pixel scale motion of the camera is already subtracted from the data.

It should also be highlighted that this data is for the Dan Rather viewpoint.

The Cam#3 viewpoint allows higher fidelity data extraction...

...with ~0.01 pixel variance on static point data (camera motion), which is pretty darn spectacular, but as the viewpoint perspective is skewed, that data is less suited to deriving velocity and acceleration data.

Needless to say, NIST's decision to use that viewpoint, without static point extraction, without perspective correction, ..., was not wonderfully wise.
 
Chris, it might help to drill down here.

Looking at the "margin of error" question, my limited understanding is that the data points created in the NIST Report were within the margin of error of their own measurements.

First let me say that my training is inter alia in survey sampling; "margin of error" has (in principle) an exact meaning in that context. I am not aware of a comparably exact meaning in other contexts, but by analogy, the idea is that we can estimate bounds on the error(s) in our measurements. Initially, NIST and others measure displacement (vertical and/or horizontal) from some reference position. Each such measurement is made with error.

I'm about to walk you through something you may already understand -- perhaps more exactly than I am about to explain it -- so that I can refer to it when I react to your statement.

Imagine a graph with time on the X axis and displacement on the Y axis. Each point has a vertical error bar, implying that the true displacement at that time could be anywhere on that bar. (That's really inexact, but in this context I can't be more exact without inventing some assumptions.) Then we can imagine arbitrarily many possible "real" displacement curves that go through all those error bars. I desperately want to complicate this account, but I think that's complicated enough for the moment.

Now, imagine a velocity graph based on the displacement graph. Our best estimate of the average velocity between two consecutive measurements will depend on the change in displacement. Since the displacements are measured with error, the velocities are estimated with somewhat more error, so the error bars are longer, and the possible "real" velocity curves are more varied. (Crudely, if one displacement is 50 plus-or-minus 1, and the next is 60 plus-or-minus 1, then the change in displacement could be 8, 12, or anything in between: i.e., 10 plus-or-minus 2.)

Then we can make an acceleration graph based on the velocity graph; same general idea, mutatis mutandis. So all the error bars get longer again.
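If a small numerical sketch helps, here is the 50 ± 1 / 60 ± 1 example pushed one more step, using crude worst-case interval arithmetic rather than a proper statistical model; the numbers are invented for illustration:

[code]
dt = 1.0   # time between measurements (s); 1 s keeps the arithmetic readable

def diff(a, b):
    """Worst-case interval difference: central values subtract, errors add."""
    return ((b[0] - a[0]) / dt, (a[1] + b[1]) / dt)

d1, d2, d3 = (50.0, 1.0), (60.0, 1.0), (72.0, 1.0)  # displacements as (value, error)

v1 = diff(d1, d2)   # (10.0, 2.0)  velocity estimate: the error bar has doubled
v2 = diff(d2, d3)   # (12.0, 2.0)
a1 = diff(v1, v2)   # ( 2.0, 4.0)  acceleration estimate: doubled again
print(v1, v2, a1)
[/code]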

Now, back to your words: "the data points created in the NIST Report were within the margin of error of their own measurements." It's not obvious what you mean by "data points." Most often it would refer to the "measurements" themselves -- but I don't think you intended to offer a tautology. Two paragraphs down I offer a possible interpretation of what you said.

At a high level, I think you may mean that given the possible measurement error in NIST's displacement estimates, their data are (maybe!!) consistent with a constant rate of acceleration. Thus, if NIST put error bars on their displacement estimates, and then derived a velocity graph (with bigger error bars), a straight line (denoting a constant change in velocity) would fit through that set of error bars. Equivalently, if NIST went on to derive an acceleration graph (with even bigger error bars), a horizontal line (denoting zero change in acceleration) would fit through those error bars.

Maybe that last is the picture in your mind: the acceleration estimates are all pretty close to g; every error bar overlaps g; equivalently (if all the error bars are the same size), each point (acceleration estimate) is "within an error bar" of g. I don't usually think of derived estimates as "data points," but it's not unreasonable.
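One way to make that picture concrete: check whether a single constant value (g) passes through every acceleration error bar. A toy version, with invented estimates and error-bar widths:

[code]
g = 32.2   # ft/s^2

# Invented acceleration estimates and error-bar half-widths (ft/s^2):
accel = [30.5, 33.1, 34.0, 31.8, 32.6]
half_bars = [3.0, 3.0, 3.0, 3.0, 3.0]

consistent = all(abs(a - g) <= h for a, h in zip(accel, half_bars))
print(consistent)   # True: a horizontal line at g fits through every error bar,
                    # so the data cannot rule out constant FFA (nor prove it).
[/code]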

(Or you may not have been referring to constant acceleration at all, but only to whether NIST's estimate of average acceleration is within measurement error of g. I'm not sure that question is inherently of much interest, so I won't walk through that scenario right now.)

If I understand correctly, FEMR is saying that his measurements are more precise, and the margin of error is smaller and some data points fall outside the margin of error and therefore strongly point to actual >g at some times.

Again, if this means that femr2's displacement measurements have smaller error bars (confidence intervals, perhaps), and therefore his acceleration estimates have smaller error bars, and some of those error bars don't overlap g but are always > g, that works.

What has been missing is a way to assign those "error bars," or more formally, to model the error.
 
Here are people who worked on 9/11 matters in one way or another. All had specialized training in whatever aspect of the work they did. Not one has come forward to support the 9/11 Truth version of events: 7,000+ FBI agents who conducted a three-year 9/11 investigation; 1,500 people who worked the Flight 93 crash scene; 40,000 people who worked the piles at Ground Zero; 55 FBI Evidence Response Teams at Fresh Kills in New York; 8,000+ people who worked the scene at the Pentagon.

There are at least five peer-reviewed journal articles in ASCE publications that conclude the buildings came down by fire, and those journal articles have been cited in other articles and used to redesign fire codes in tall buildings (meaning that the articles have peer respect). NIST was peer-reviewed by four independent civil engineering firms and concluded fire brought the buildings down. "What Did and Did Not Cause the Collapse of the WTC Towers in NY" is another such paper, produced by the Department of Civil Engineering at Northwestern University (Structural Engineering Report No 07-05 / C605c).
 
Checking static variance alone (the variance of the profile during the time the traced location is assumed to be static)...
...it is trivial to suggest ±3 ft/s², perhaps more.

Hey, can you point me to the data you used there? Maybe at some point I can work up an illustration of what I was laboriously trying to describe in words re: measurement error.
 