
Merged Discussion of femr's video data analysis

We obviously view images in a different way.

.....
Behaviour of descent is the focus. From clearer details of behaviour over time, clearer details of mechanism can be determined...such as the implications for the significant over-g period.

If you see no value in revealing the very short period of time to reach g, and exceed it for a while, that's entirely up to you.
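For illustration of what "revealing the over-g period" amounts to in practice, here is a minimal sketch (Python/NumPy, with a made-up smoothed acceleration profile standing in for any real trace; the numbers are assumptions, not femr2's or NIST's data) that picks out the contiguous span where acceleration exceeds g:

[code]
import numpy as np

G = 9.81  # m/s^2

# Made-up smoothed acceleration profile and timestamps; stand-in values only.
t = np.linspace(12.0, 16.0, 121)                 # seconds
a = 9.0 + 1.2 * np.exp(-((t - 13.8) / 0.4)**2)   # synthetic bump above g

over = np.flatnonzero(a > G)
if over.size:
    print("over-g from %.2f s to %.2f s (%.2f s total)"
          % (t[over[0]], t[over[-1]], t[over[-1]] - t[over[0]]))
else:
    print("no over-g interval in this synthetic profile")
[/code]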

What's your estimate for the error bands of the data in m/s2? I think it may have been discussed but I can't remember what the number was.
 
My point is that ignoring data is what keeps the JREF balloon afloat. You have a recorded history of it.
 
AE, anyone who believes that accelerations close to or exceeding g are normal doesn't know anything about the mechanics behind any collapse, including intentional demolitions, has probably dropped a butter knife off a table.
fixed.
 
Your personal frame of reference is entirely up to you.

The relevance of yours remains in question.

Your opinion is noted. You shouldn't have wasted any of your time at all discussing 9/11. Easy eh ? ;)

Ah - that retort came too quickly. I explained below why you should have stopped (and not done Poly(n)s). You, on the other hand, don't follow up with an explanation for your opinion.

Splendid.

You're welcome.
(Qualification: I have not actually studied your sources, analysed your methods and repeated your work to figure out that your data is good. I just couldn't find a major mistake at quick glance, and am giving you the benefit of the doubt. Of course I do note that generally your data points don't come with a margin of error, as they should, or you don't use that m.o.e. in any subsequent steps. So to that extent, your data is not so good. Maybe still better than NIST's, but I don't care too much).

In your opinion. I disagree. Physics is certainly in mind, just not embedded within the Poly(10) equation, which is, as has been suggested a number of times, a form of compression only.
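As a rough sketch of the "compression" point (Python/NumPy, synthetic position samples; the degree, noise level and time base are illustrative assumptions, not femr2's actual settings): a degree-10 polynomial reduces the whole trace to eleven coefficients, and acceleration then falls out as the analytic second derivative of that fit.

[code]
import numpy as np

# Synthetic position-vs-time samples standing in for a traced roofline
# (illustrative values only).
t = np.linspace(0.0, 6.0, 180)                       # seconds
true_a = 9.81 * np.clip((t - 1.0) / 2.0, 0.0, 1.0)   # made-up ramp toward g
dt = t[1] - t[0]
pos = -np.cumsum(np.cumsum(true_a)) * dt**2          # crude double integration
pos_noisy = pos + np.random.normal(0.0, 0.05, t.size)

# The Poly(10) step: 180 samples "compressed" into 11 coefficients.
fit = np.poly1d(np.polyfit(t, pos_noisy, 10))

# Acceleration as the second derivative of the fitted polynomial.
accel = fit.deriv(2)(t)
print(accel[::30])   # sparse printout of the smoothed acceleration curve
[/code]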

Yup. So you agree all that work on Poly(10) is a waste of time?

It just doesn't need to be anything more, and by definition the low parameter count of the NIST function could not possibly reveal any subtle variation in trend. All basically a repeat of #1324.

You have yet to convince me that the subtle variations in trend are signal and not noise.

If they are signal, a non-wasted use of time would be a search for a physical model that explains the wiggles. Maybe some wave function...

Numerous methods reveal the same underlying trend. The Savitzky-Golay curve is simply smoothed data. Differentiating between low level noise and low level signal is impractical and pointless.

Impractical maybe, pointless certainly not. Numerous methods recreate noise and reduce signal.

The data contains some low level noise. The derived profiles contain some noise, or are distorted to some extent by it. Just deal with it :) It's clear that the resultant profile is as accurate as any yet presented.

The trend the S-G profile reveals correlates closely with the Poly(10) and Poly(50) curves generated by a completely different method.

If you want to provide a more accurate and detailed profile, by all means. Until you do, it's the best available.

The most accurate and detailed profile is the (unsmoothed) data itself.
 
AE, anyone who believes that accelerations close to or exceeding g are normal doesn't know anything about the mechanics behind any collapse, including intentional demolitions.

I posted earlier about Frank Greening's first analysis, made days after the release of the NIST function. He is a debunker and was probably the best poster on this forum.

He could see the problems with near g acceleration and the greater than g hump from his very first post in 2008.


In a different universe it is considered normal. He understood more in the first couple of days than you have understood in 3 years.

MT, that kind of baseless assertion is not helpful. You're not a structural engineer, and if I recall correctly neither is Frank Greening, even though he is obviously very adept at physics.

Acceleration close to G is, IMO, not a big deal from the engineering viewpoint. It is remarkable and interesting, of course, but I think I understand the mechanics of it. The principle is relatively simple: you had an internal collapse of structure with partially intact connections to the exterior structure. The building structure was being destroyed E to W, with the most intact structure to the west. When the exterior columns finally buckled, the structure they were supporting was attached to a mass that was already accelerating, hence the near-G interval of collapse, which was relatively brief.
I don't see a problem with it.
 
AE, anyone who believes that accelerations close to or exceeding g are normal doesn't know anything about the mechanics behind any collapse, including intentional demolitions.

I posted earlier about Frank Greening's first analysis, made days after the release of the NIST function. He is a debunker and was probably the best poster on this forum.

He could see the problems with near g acceleration and the greater than g hump from his very first post in 2008.


In a different universe it is considered normal. He understood more in the first couple of days than you have understood in 3 years. Where in your universe do buildings collapse with this acceleration range?

Personal incredulity (yours and others') leads you to put the conclusion before any argument.

I see a strawman too, an appeal to authority, ...
 
Yup. So you agree all that work on Poly(10) is a waste of time?

Not at all, it demonstrates the inherent distortions of the data smoothing. And it pretty much kills [ETA: Femr2's] previous attempt to measure the exact time interval above G..... or the claim that the NIST data are 'wrong' (any more than Poly(50) is 'wrong' or 'right', you see)
 
They serve to filter out noise and clearly reveal the underlying trend, as can be seen by comparison with a simple symmetric-difference derivation of acceleration, which shows the same general profile but with much higher noise...
[image: femr2's plot comparing the symmetric-difference acceleration with the smoothed profile]

They reveal more detail in the profile than the NIST model, which is their purpose. They do indeed smooth the profile, yes.
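For anyone who wants to see that comparison concretely, a minimal sketch (Python/SciPy, synthetic data; the 30 fps time base, window length and polynomial order are illustrative picks, not necessarily the settings behind the plot above): the central-difference second derivative is faithful but noisy, while the Savitzky-Golay derivative fits a low-order polynomial in a sliding window and returns its second derivative.

[code]
import numpy as np
from scipy.signal import savgol_filter

# Synthetic noisy position samples, not the real trace.
dt = 1.0 / 30.0                                     # assumed ~30 fps
t = np.arange(0.0, 6.0, dt)
pos = -0.5 * 9.0 * np.maximum(t - 1.5, 0.0)**2      # made-up descent
pos = pos + np.random.normal(0.0, 0.02, t.size)     # pixel-noise stand-in

# Symmetric (central) difference second derivative: unsmoothed, hence noisy.
accel_cd = (pos[2:] - 2.0 * pos[1:-1] + pos[:-2]) / dt**2

# Savitzky-Golay second derivative: local polynomial fit in a sliding window.
accel_sg = savgol_filter(pos, window_length=31, polyorder=3, deriv=2, delta=dt)

# Same general profile, much lower scatter in the smoothed version.
print("central-difference std:", accel_cd.std())
print("Savitzky-Golay std    :", accel_sg.std())
[/code]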


Wordplay.

My intent is clear, and the results a significant improvement upon pre-existing similar information.
They filter out the noise? How? Reference that please. Show me the book which says that. Thank you in advance. Thank you very much.
 
Compared to the S-G curves the Poly(10) and Poly(50) both introduce distortions at various points.
I'd hate to see someone try to claim much precision out of the Poly curves, frankly. I don't think that's possible.

Polynomial expansion around a point of interest has been performed successfully for centuries.

The question is what happens when you deviate from the region around which you are expanding. Accuracy of the acceleration profile will depend on the region, with areas around the endpoints being naturally suspect.

The reason is simple: it is very difficult to reproduce the curvature of a shape if you are missing half the shape. Around the endpoints the shape of the curve is lost, and retrieving curvature has serious limits.

(acceleration being curvature)
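A minimal sketch of the endpoint problem (Python/NumPy, synthetic data with a known constant acceleration; the numbers are assumptions, not the WTC7 trace): the second derivative of a global degree-10 fit is typically far less trustworthy near the ends of the data than in the middle.

[code]
import numpy as np

# Synthetic trace: at rest until t = 1 s, then constant -9.81 m/s^2.
t = np.linspace(0.0, 5.0, 150)
pos = np.where(t > 1.0, -0.5 * 9.81 * (t - 1.0)**2, 0.0)
pos_noisy = pos + np.random.normal(0.0, 0.03, t.size)

fit = np.poly1d(np.polyfit(t, pos_noisy, 10))
accel = fit.deriv(2)(t)

# Recovered curvature (acceleration) mid-span vs near the trailing endpoint;
# the known answer at both times is -9.81 m/s^2.
print("mid-span :", accel[len(t) // 2])
print("near end :", accel[-2])
[/code]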
 
Compared to the S-G curves the Poly(10) and Poly(50) both introduce distortions at various points.
Indeed.

I'd hate to see someone try to claim much precision out of the Poly curves, frankly. I don't think that's possible.
There seems to be a grey understanding of what *precision* means in this context. In my view you're asking about detail, and in that sense the variation in acceleration over time (the trend) revealed by all my methods provides an increase in detail over what we have with the NIST curve.

But in terms of indicating general trends, I much prefer the S-G, because it represents the data better than all the others.
I agree, and it is not subject to the, imo, misplaced need for others to liken it to a model simply because it involves an equation. Always a problem with techy-types, as I know well :)

Relating back to building mechanics and engineering, I agree with the other commenters who see column buckling up to about the 12.5s mark, then the ensuing rapid acceleration from about 12.5s to about 14.5s, then a period of diminishing acceleration.
Looks reasonable.

I also accept Femr2's view that acceleration might have exceeded 32 ft/s² (i.e. g) briefly.
I'm not quite sure why you assert it as my view. The data from all profiles, including that from NIST, includes a period of over-g acceleration.

It would be interesting to come up with a method which could explain the approximate nature of this data, and the futility of trying to assign some absolute value to it.
:) Perhaps replicating the NIST linear regression using my data would be one such method? Could be funny.
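Not claiming this is NIST's exact procedure, but as a sketch of what a regression-style estimate looks like (Python/NumPy, synthetic data; the interval and values are assumptions): fit the constant-acceleration model h(t) = h0 + v0*t + 0.5*a*t^2 by ordinary least squares over a chosen interval and read off a.

[code]
import numpy as np

# Synthetic height-vs-time stand-in (illustrative only).
dt = 1.0 / 30.0
t = np.arange(0.0, 4.0, dt)
h = 300.0 - 0.5 * 9.6 * np.maximum(t - 0.5, 0.0)**2
h = h + np.random.normal(0.0, 0.05, t.size)

# Constant-acceleration model fitted by least squares over 1.0 s .. 3.0 s.
mask = (t >= 1.0) & (t <= 3.0)
A = np.column_stack([np.ones(mask.sum()), t[mask], 0.5 * t[mask]**2])
(h0, v0, a), *_ = np.linalg.lstsq(A, h[mask], rcond=None)

print("fitted acceleration over the interval: %.2f m/s^2" % a)
[/code]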

Note: In my view, the suggestion of >freefall interval is actually helpful in that it could only occur if the 'core' collapse were pulling on the exterior structure
It's not supposed to be unhelpful. It's simply honest observation and reporting of behaviour apparent in all datasets.

My data shows a significantly higher peak acceleration, but it's data for the NW corner, which clearly was subject to higher acceleration. (So don't throw out the high value just because it's too far above g for you to be comfortable ;) )
 
My data shows a significantly higher peak acceleration, but it's data for the NW corner, which clearly was subject to higher acceleration. (So don't throw out the high value just because it's too far above g for you to be comfortable ;) )

Nobody's throwing it out. That is a non sequitur.
 
Behaviour of descent is the focus. From clearer details of behaviour over time, clearer details of mechanism can be determined...such as the implications for the significant over-g period.

That's where a relationship to a model would come in handy.

;)
 
What's your estimate for the error bands of the data in m/s2? I think it may have been discussed but I can't remember what the number was.
I estimate positional variance of +/- 0.2 pixels for the Dan Rather Vertical Component Data...
 
Except that the building wasn't built out of pixels. Hence the question to you:

What's your estimate for the error bands of the data in m/s2? I think it may have been discussed but I can't remember what the number was.
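For what it's worth, turning +/- 0.2 px into an m/s² band needs two numbers that aren't in the thread: a metres-per-pixel scale and the frame rate. A back-of-envelope sketch (Python; the 0.3 m/px scale and 29.97 fps are purely assumed, and the propagation is for a plain frame-by-frame central difference, not a smoothed estimator):

[code]
import math

sigma_px = 0.2          # stated positional scatter, in pixels
scale_m_per_px = 0.3    # ASSUMED metres per pixel for the camera view
fps = 29.97             # ASSUMED frame rate
dt = 1.0 / fps

sigma_pos = sigma_px * scale_m_per_px   # scatter in metres

# Central-difference second derivative a = (x1 - 2*x2 + x3) / dt^2:
# independent errors add in quadrature, sqrt(1 + 4 + 1) = sqrt(6).
sigma_accel = math.sqrt(6.0) * sigma_pos / dt**2
print("1-sigma acceleration scatter ~ %.0f m/s^2 per frame triple" % sigma_accel)

# The huge per-frame number is exactly why raw frame-by-frame acceleration
# is unusable and why smoothing / fitting over many frames is needed.
[/code]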
 
You're welcome.
(Qualification: I have not actually studied your sources, analysed your methods and repeated your work to figure out that your data is good. I just couldn't find a major mistake at quick glance, and am giving you the benefit of the doubt. Of course I do note that generally your data points don't come with a margin of error, as they should, or you don't use that m.o.e. in any subsequent steps. So to that extent, your data is not so good. Maybe still better than NIST's, but I don't care too much).
Noted.

So you agree all that work on Poly(10) is a waste of time?
Not at all. Quite a bit of this discussion about it, however, absolutely was.

You have yet to convince me that the subtle variations in trend are signal and not noise.
That does not change my confidence level in the validity of the presented trends.

If they are signal, a non-wasted use of time would be a search for a physical model that explains the wiggles. Maybe some wave function...
It's a chaotic real-world event. It's unlikely the *wiggles* conform to a studied and known behaviour, but if you come up with anything by all means grab the raw data and give it a whirl.

Impractical maybe, pointless certainly not. Numerous methods recreate noise and reduce signal.
By all means grab the data, eliminate as much noise as possible and let me know the process. That would be great.

My current preferred method is the Savitzky-Golay filter.

The most accurate and detailed profile is the (unsmoothed) data itself.
I've provided that too ;)
 
Acceleration close to G is, IMO, not a big deal from the engineering viewpoint. It is remarkable and interesting, of course, but I think I understand the mechanics of it.

Name one example on the planet earth of a building reaching that range.

As also pointed out by Frank Greening, it is way, way outside the range of even known demolitions.

You make up some physics story and your fellow posters just nod.

And you call that "science".
 
Name one example on the planet earth of a building reaching that range.

As also pointed out by Frank Greening, it is way, way outside the range of even known demolitions.

You make up some physics story and your fellow posters just nod.

And you call that "science".

1) For starters, we don't have a lot of footage of 47-story buildings collapsing, so lack of data is a problem. But you're making a modified 'no precedent' argument, which doesn't actually prove anything.

Sometimes things happen that have not happened before.....that's what a precedent is, y'know. :)

2) Frank Greening is neither a structural engineer nor a demolition expert. So why you would reference him puzzles me. It comes across as an attempted argument from authority.

it is way, way outside the range of even known demolitions.
Gee, way to quantify it... but to your point, that makes it very unlike a controlled demolition, by Frank's metrics.

3) You haven't made a point-by-point rebuttal of my outline, nor have you made any engineering argument.

4) That's just another bare assertion, you've been doing that a lot....:(

Feel free to make an engineering argument.
 
LOL

A truther saying someone's ignoring data?

Precious.

Yes, femr2's measurements also show the posters here and the NIST to be living in a dream wrt WTC1 early motion.

You basically faked data to support south wall failure. JREF posters still tend to believe in that fake data and base their whole understanding of WTC1 collapse initiation upon it.

This is why real data is so important.
 
