Tony Szamboti
Illuminator
Joined: Jun 2, 2007 · Messages: 4,976
It's simple arithmetic. The discrepancy between the raw and the smoothed data comes about because of the smoothing procedure you're applying. It should be obvious that a smoothing algorithm removes spurious peaks and troughs in data; that is, after all, exactly what a smoothing algorithm is supposed to do. Unfortunately, it will just as easily remove significant peaks and troughs in data. Since you're looking for a significant trough, you shouldn't be smoothing the data. If your data is noisy enough that you need to smooth it, then it's too noisy to detect the presence or absence of the feature you're looking for.
This is first year undergraduate stuff, if that.
Dave
If WTC 1 had collapsed naturally, the deceleration discontinuity would be quite large, and discontinuities of that size certainly show themselves in the Verinage demolitions.
The large deceleration seen in all of the Verinage demolitions, measured with the same technique, completely refutes your argument here that we are smoothing out any real deceleration.
You are forced to argue that we are missing the discontinuity, yet you haven't taken any measurements yourself. The reason is that there was no deceleration in the WTC 1 collapse: you have no real argument, and taking your own measurements would prove yourself wrong.
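The disagreement above is really about how much of a feature a smoother destroys, and that depends on the feature's width relative to the smoothing window. A rough sketch (all numbers hypothetical, not taken from any WTC or Verinage measurement) with a simple boxcar moving average shows that a one-frame glitch is heavily attenuated, while a deceleration sustained over many frames passes through the same smoother nearly intact:

```python
import numpy as np

def moving_average(v, window=5):
    """Boxcar smoother of the kind commonly applied to frame-by-frame velocity data."""
    kernel = np.ones(window) / window
    return np.convolve(v, kernel, mode="same")

# Hypothetical velocity trace: steady descent at -30 m/s over 60 frames.
v = np.full(60, -30.0)

# Case 1: a one-frame measurement glitch of +10 m/s (noise).
noisy = v.copy()
noisy[30] += 10.0

# Case 2: a ten-frame sustained deceleration of +10 m/s (a real impact signature).
real = v.copy()
real[25:35] += 10.0

# How much of each feature survives a 5-frame moving average at frame 30?
spike_after = moving_average(noisy)[30] - (-30.0)
trough_after = moving_average(real)[30] - (-30.0)

print(f"1-frame glitch retained:   {spike_after:.1f} of 10.0 m/s")
print(f"10-frame feature retained: {trough_after:.1f} of 10.0 m/s")
```

With a 5-frame window the glitch is cut to 2.0 m/s while the sustained feature keeps its full 10.0 m/s amplitude, so whether smoothing could hide a real deceleration depends entirely on how brief that deceleration would be compared to the window used.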
Dave, it really is important to your credibility that you tell us what other work you do besides posting on this forum.