• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you still see problems, let me know.

Merged Discussion of femr's video data analysis

I will make some effort towards finding data on some CDs for you to look at.
Fine, though I am not likely to action it until each of the points we have been discussing can be resolved.

Quite a few points have drifted by in the course of recent discussion. I'm not a fan of leaving final opinions hanging and unstated.
 
Ultimately this is an argument about standards. Femr2 feels that the investigation NIST performed did not attain sufficient detail to meet his personal standards. Others disagree -- not because they claim femr2 is wrong about whether or not it satisfied femr2's standards, but because it satisfies their own standards, which differ from femr2's...
I agree it is about standards, but I suggest that you are missing one further relevant standard and the context where that standard is needed: any situation where the essence of the issue is that lower quality data leads to erroneous conclusions whilst higher quality data can lead to more accurate ones.

We have such a situation in this thread with the current debate. And the higher standard of data is not needed merely to satisfy femr2's personal interest - it may do that BUT it is needed to satisfy the requirements of the topic under discussion - independent of femr2's personal wishes.

The context for the start of this current discussion was that claims by cmatrix were erroneous and at least in part the error was the result of relying on lower quality data. Calling into play the higher quality data was a necessary part of addressing those erroneous claims - a process which requires the errors in the lower quality data to be identified.

The fact that one lot of data comes from NIST and the other from femr2 is actually irrelevant EXCEPT that it attracts a high level of noise which I do not need to explain to members familiar with this sub forum.

And none of the preceding detracts from the technical need to establish that what I describe as 'lower' and 'higher' quality data is in fact truly 'lower' or 'higher' respectively. That in itself being a legitimate topic for discussion/debate.
 
And do you not think it matters that such infamous metrics as 2.25s of freefall and 40 percent longer than freefall are actually... wrong?

Are there two numbers generated within the NIST report that needed to be accurate more than those two?

And yet you seem confused about motivation and relevance. Odd.

I'd say that since those two numbers have no bearing on the result of the investigation, they are accurate enough to give an indication of collapse time.

They are probably two of the least important numbers in the report, so much so that one of them wasn't even included in the draft report.
 
I'd say that since those two numbers have no bearing on the result of the investigation, they are accurate enough to give an indication of collapse time.

They are probably two of the least important numbers in the report, so much so that one of them wasn't even included in the draft report.
Both statements true enough.

But those are not the reasons why the figures are being discussed in the current discussion in this thread.
 
Calling into play the higher quality data was a necessary part of addressing those erroneous claims - a process which requires the errors in the lower quality data to be identified.
I don't see such errors - rather, misinterpretations of the accuracy of the NIST data and of NIST's own interpretation of that data, plus baseless assumptions built on a personal assessment of what the numbers mean. Since that is one of the premises for this discussion, I think it's important to highlight it.

Plus, the unjustified adornments with which femr2 accompanies his discussion also introduce quite a lot of noise and have raised the question of irrelevance.
 
They are probably two of the least important numbers in the report, so much so that one of them wasn't even included in the draft report.
That rather depends upon which audience you are referring to.

Assuming that your presence here is to resolve the concerns of your target audience in regard to the events, then your target audience consists almost entirely of folk who attach a great deal of significance to those two metrics, and little importance to other values you may personally consider more important.

As I have regularly highlighted, you cannot hope to change stances such as...
cmatrix said:
However, if their theory is to believed, the 2.25 seconds of free fall must have resulted from near-simultaneous buckling and breaking of the 58 perimeter columns and most of the 25 core columns over eight stories.
...without highlighting that the literal interpretation of the number is incorrect, because there was not 2.25s of freefall. There was in fact effectively zero period of *freefall*; instead there was a constantly changing acceleration throughout the descent, with variation at all points across the dimensions of the building.

Again, a more representative acceleration profile is...
http://femr2.ucoz.com/_ph/7/628055186.png

...and it is clear that only two brief instants would satisfy the description of *gravitational acceleration*, with all other points in time falling below or above it.

Does the profile contain some amount of error? Of course. But what would not significantly change is the trend/shape.
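
To put that reading of the profile into concrete terms, here is a minimal Python sketch of how one could count the samples sitting within a small band of g. The acceleration profile in it is invented purely for illustration, not the traced data; for a profile shaped like the one above, only a handful of points near the two crossings qualify.

Code:
# Count how many samples of an acceleration profile sit within a small band of g.
# The profile below is invented purely to show the bookkeeping - it is NOT the traced data.
import numpy as np

g = 9.81                                          # m/s^2
t = np.linspace(0.0, 6.0, 180)
acc = 14.0 * np.sin(0.9 * t) * np.exp(-0.15 * t)  # stand-in varying profile (downward positive)

tol = 0.25                                        # tolerance band around g (m/s^2)
at_g = np.abs(acc - g) < tol
print(np.count_nonzero(at_g), "of", t.size, "samples within", tol, "m/s^2 of g")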

As folk seem more concerned about the source of information, here is what tfk generated from the data...
[attached image: 51514794.png]


Of course, variation in method changes the profile elements, but the trend is pretty clear.

Again, as poly fit order is increased, oscillations in sections of the curve are emphasised...
http://femr2.ucoz.com/_ph/7/408829093.gif
 
Whilst I'm at it, I think it prudent to highlight the main reason that I am able to produce smooth acceleration profiles.

This information goes all the way back to post #46 in this thread.

A high order poly fit is first applied to the raw position/time data...
[attached image: WTC7DanRatherDropSmall.png]

A 5964x3720 pixel version so you can actually see what's occurring...
http://femr2.ucoz.com/WTC7DanRatherDrop.png
View the large image to see how closely and consistently the curve fit adheres to the underlying data.

The dydx functions of the freely available XlXtrFun Excel plugin are then used to generate the derived velocity and acceleration curves.

XlXtrFun said:
Three Dimensional Interpolate and Extrapolate Functions; Two Dimensional Interpolate, Extrapolate, Cubic Spline, Polynomial Curve Fitting, Line Intersections, Minima, Maxima, First Derivative, Second Derivative, and other functions.

XlXtrFun.xll is a collection of functions which extends the capabilities of Microsoft Excel; developed primarily to facilitate, interpolation of 2-dimensional and 3-dimensional data, and simplify 2-variable curve fitting. XlXtrFun has been used for years by engineering and research and development personnel on every continent who need to interpolate, extrapolate, and curve fit data rapidly, reliably, and with a virtually non-existent learning curve.
Available here: http://www.xlxtrfun.com/XlXtrFun/XlXtrFun.htm
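
For anyone without Excel to hand, a rough Python equivalent of the fit-then-differentiate idea looks like the sketch below. It is not the XlXtrFun workflow itself, the position/time data is synthetic, and the fit order is kept far lower than the ~50 used in this thread to stay numerically well-conditioned.

Code:
# Sketch of the fit-then-differentiate approach; NOT the original Excel/XlXtrFun workflow.
# Synthetic position/time samples stand in for the traced north-face descent.
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)
t = np.linspace(0.0, 6.0, 180)                 # ~30 samples/s over 6 s
pos = -0.5 * 7.0 * t**2                        # illustrative sub-g descent (m)
pos += rng.normal(0.0, 0.15, t.size)           # add tracing noise

# High-order least-squares fit to the raw position/time data
# (a lower order than 50 is used here for numerical conditioning).
p = Polynomial.fit(t, pos, deg=15)

vel = p.deriv(1)    # first derivative  -> velocity curve
acc = p.deriv(2)    # second derivative -> acceleration curve

print(acc(3.0))     # fitted acceleration (m/s^2) at t = 3 s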
 
That the effective *order* of the resultant acceleration curve is relatively low (even when the derived data order is also high) is, in my opinion, an indication of its correctness, given that the input position/time curve is high order (50).

I'm sure there will be opinions on this comment...
 
High-order polynomials matched to a graph over a fixed finite domain will prove a pretty useless tool for examining the physical meaning within observed motion.

It is just a math trick without much practical use. Since physics should be practical and useful above all, anyone examining features of motion will use piecewise fitting around small areas.

The piecewise fits will give practical information for motion around any chosen time. It is just like the basic ideas behind calculus. By using small delta t (time) and small delta z (position), you can derive velocity, acceleration and change in acceleration around any moment.

Global attributes are pretty meaningless. The meaningful physical values can all be extracted locally around any chosen time t.

Newton's invention of calculus works the same way. Calculus is the result of taking delta t and delta z to zero. Nothing fancy.

It is just like doing calculus the old fashioned way.
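
As a rough illustration of the piecewise idea described above (a sketch on synthetic data, not anyone's actual analysis): fit a low-order polynomial inside a short window around each time of interest and read the local velocity and acceleration straight from the fitted coefficients.

Code:
# Sliding-window (piecewise) quadratic fit: local velocity/acceleration near a chosen t0.
# Synthetic data only - a real analysis would use the traced position/time samples.
import numpy as np

def local_kinematics(t, pos, t0, half_width=0.5):
    """Fit z ~ c0 + c1*(t-t0) + c2*(t-t0)^2 inside [t0-h, t0+h]."""
    mask = np.abs(t - t0) <= half_width
    c0, c1, c2 = np.polynomial.polynomial.polyfit(t[mask] - t0, pos[mask], deg=2)
    return c1, 2.0 * c2          # v(t0) = c1, a(t0) = 2*c2

rng = np.random.default_rng(1)
t = np.linspace(0.0, 6.0, 180)
pos = -0.5 * 7.0 * t**2 + rng.normal(0.0, 0.15, t.size)

v, a = local_kinematics(t, pos, t0=3.0)
print(v, a)    # roughly -21 m/s and -7 m/s^2 for this synthetic input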
 
The piecewise fits will give practical information for motion around any chosen time.
Whilst I agree that piecewise fits are an improvement over a fit across the full time domain, the effective end result is simply an increased full time-domain order.

Perhaps it would be insightful to do so, and cross-check the results.

I suggest the end result full-domain curve trend would not significantly change...if it could in fact be pieced back together at all.

Will probably do the leg-work anyway.
 
It is just a math trick without much practical use.
I don't agree.

By applying the fit at the earliest step - the raw position/time data - its function is to significantly reduce the noise in the data.

It is clear from the large image I provided that there is very close correlation between the actual curve and the poly fit curve.

In this context I'm not looking to find *mini jolts* or very small variation in instantaneous acceleration, but instead to determine the trend.

As I'm sure you know, I've also derived velocity and acceleration data with other methods, including simple moving average smoothing, with similar (but much noisier) graph results.

There must be some form of trade-off between noise reduction and localised acceleration detail, or we end up with less useful graphical data due to extreme amplification of underlying noise via sequential differentiation steps...
[attached image: 82136974.png]


You can see the trend in there, but it's not too useful.
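
To illustrate the trade-off being described (again a synthetic sketch, not the actual tracing data): double finite-differencing of noisy position samples amplifies the noise enormously, whereas differencing a moving-average-smoothed series keeps the underlying trend visible.

Code:
# Why raw double-differencing is nearly useless: each gradient step amplifies the noise.
# Synthetic constant-acceleration data; smoothing is a plain moving average.
import numpy as np

rng = np.random.default_rng(2)
dt = 1.0 / 30.0                                # ~30 fps sampling
t = np.arange(0.0, 6.0, dt)
pos = -0.5 * 7.0 * t**2 + rng.normal(0.0, 0.05, t.size)

def accel_by_differencing(z):
    v = np.gradient(z, dt)
    return np.gradient(v, dt)

raw_acc = accel_by_differencing(pos)

window = 15                                    # moving-average width (samples)
kernel = np.ones(window) / window
smooth_acc = accel_by_differencing(np.convolve(pos, kernel, mode='same'))

print(np.std(raw_acc), np.std(smooth_acc))     # raw spread is far larger than smoothed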
 
I'm glad I don't have to argue against you regularly.
Eh ? :confused:

As I said, I agree that piece-wise is *better*, but also that there are reasons why I've done it the way I have.

I'll more than likely also do a piece-wise fit and cross-check, but feel free to... ;)

(Quite how the separate fits will bolt back together to form a full time domain profile I'm not sure...)
 
Femr2, I'm going to the cabin for a few days, no internet, no phone, but I will make some effort towards finding data on some CDs for you to look at.

Looking forward to working on some orchestra pieces and doing some snowshoeing.

cheers

AE



"E. Yarimer at London University appears to be one of the few engineer/scientists who has studied real building demolitions by explosives. He has at least two papers on this subject, both written back in the 1990s, (and unfortunately hard to find on the web).

Here is a quote from the first (1994) paper:

“The current practice in controlled demolition (CD) by explosives is to pre-weaken the building on most floors, and to blast only a portion of the floors, for example one floor in two, or one floor in three. Even so, the number of charges to be placed in individual boreholes can be large: up to 6000 charges have been used depending on the size of the job. The blast floors will readily disintegrate, but the non-blast floors need the force of the impact in order to break-up. Even on the blast floors, the perimeter walls above the ground floor are usually not charged for safety reasons, and they are expected to break up by impact. The entire process is driven by gravity but the downward velocities are attenuated by the energy absorption at the point of impact, and the motion will accelerate less than a case of free fall; it may even decelerate. A spectacular case of decelerating motion was that of Northaird Point in London in 1985, which came to rest with 10 floors still intact.”

In his second, (1996) paper, Yarimer used electronic and photographic timing devices to study a number of real CDs. One of great interest to the present discussion was the 1995 demolition of a 20-story high-rise known as Sandwell East Tower. This demolition showed - as was observed for some other CDs studied by Yarimer - a latency period of ~ 1.5 seconds before significant bulk motions were detected.

I have taken Yarimer’s data to look at the accelerations for the Sandwell East Tower CD. Some time-drop data for the first 5 seconds are: 0 s, 0 m; 1 s, 0 m; 2 s, 1.8 m; 3.0 s, 10 m; 4.0 s, 22.3 m; 5.0 s, 35.9 m. These data show the collapse was well below free fall. Indeed, Yarimer states in his discussion of this data: “Near time t = 0, the calculated accelerations are influenced by the observed latency, thus lifting the estimate of the upwards reaction force.” It appears that even Yarimer had t(0) problems!

Nevertheless, I have analysed Yarimer’s data (with allowance for the t(0) problem) using the same approach many of us have applied to WTC 7 collapse data. What is most significant is that, even with a time shift of ~ 1.5 seconds, the Sandwell East building fell only about 40 meters in the first 4 seconds of bulk motion with an acceleration of no more than 5 m/s^2. And let’s remember that this was observed for a real-world CD on a 20-story building. Scaling this result to a 47-story, (WTC-7-sized building), I would predict a 50 % collapse to take at least 6 seconds and allowing for a latency period of about 1.5 seconds, a full collapse to take ~ 10 seconds or more.

D. Isobe et al. have carried out finite element calculations on a 20-story steel framed building subjected to a Kobe-wave type of seismic collapse. Isobe found that incremental collapse begins
after an initial 26-second period of vibration during which time plastic hinges are formed and column fractures occur near the ground level of the building. The modelled structure was 50 % collapsed about 10 seconds after the first bulk downward motion, and still only about 35 % collapsed after 14 seconds!

Thus we see experimental and theoretical confirmation that the global collapse of a 20-story building would take at least 10 seconds to partially collapse from deliberate man-made explosive or natural seismic trauma to lower portions of its structure. "

From this post: http://the911forum.freeforums.org/did-wtc-7-fall-too-fast-t85.html#p1107


Isobe link here: http://www.kz.tsukuba.ac.jp/~isobe/seismic-e.html


Dr Yarimer may be a good place to start. No use reinventing the wheel.
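
As a quick cross-check of the Sandwell East figures quoted above, simple central differences on the time-drop points as given (a sketch, not part of Yarimer's own analysis) yield accelerations well below g:

Code:
# Central-difference velocities/accelerations from the time-drop points quoted above
# (Yarimer's Sandwell East data as reproduced in the linked post).
import numpy as np

t = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])     # s
d = np.array([0.0, 0.0, 1.8, 10.0, 22.3, 35.9])  # m of drop

v = np.gradient(d, t)        # m/s
a = np.gradient(v, t)        # m/s^2

for ti, ai in zip(t, a):
    print(f"t = {ti:.0f} s : a ~ {ai:+.1f} m/s^2")
# Peak value comes out under 5 m/s^2, consistent with the quoted 'no more than 5 m/s^2'.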
 
I have repeatedly stated reasons why.

And do you not think it matters that such infamous metrics as 2.25s of freefall and 40 percent longer than freefall are actually... wrong?
No. How does it affect the analysis of the collapse starting at the first sign of movement of the penthouse (the real "release point")?
Are there two numbers generated within the NIST report that needed to be accurate more than those two?

Yes. All the structural member and connection strength numbers.

I'll respond to the rest of the post later when I have more time. I have 6 trees I need to fell and cut up. Hey, maybe I'll videotape them and see if they reach "free-fall".

;)
 
I don't agree.

By applying the fit at the earliest step - the raw position/time data - its function is to significantly reduce the noise in the data.

It is clear from the large image I provided that there is very close correlation between the actual curve and the poly fit curve.

In this context I'm not looking to find *mini jolts* or very small variation in instantaneous acceleration, but instead to determine the trend.

As I'm sure you know, I've also derived velocity and acceleration data with other methods, including simple moving average smoothing, with similar (but much noisier) graph results.

There must be some form of trade-off between noise reduction and localised acceleration detail, or we end up with less useful graphical data due to extreme amplification of underlying noise via sequential differentiation steps...


You can see the trend in there, but it's not too useful.
Do you make up this nonsense yourself? You failed to fill in the details.
 
No. How does it affect the analysis of the collapse starting at the first sign of movement of the penthouse (the real "release point")?
A valid point, BUT the data is supposed to relate to *the north face*, rather than other structures. I agree that there is a release point for the East penthouse well in advance, though I assume you are not suggesting that that point in time marks the beginning of facade vertical descent?

For the *no* there, all I can do is repeat that cmatrix et al are not going to let go of the literal interpretation until they can be shown why a literal interpretation is misleading (aka wrong).

Hey, maybe I'll videotape them and see if they reach "free-fall".

;)
Remember to measure its height then. Static camera. As near the center of the height as possible. Minimal/zero camera tilt. ;)
 
From the quote in my last post:

"In his second, (1996) paper, Yarimer used electronic and photographic timing devices to study a number of real CDs. One of great interest to the present discussion was the 1995 demolition of a 20-story high-rise known as Sandwell East Tower. This demolition showed - as was observed for some other CDs studied by Yarimer - a latency period of ~ 1.5 seconds before significant bulk motions were detected.

I have taken Yarimer’s data to look at the accelerations for the Sandwell East Tower CD. Some time-drop data for the first 5 seconds are: 0 s, 0 m; 1 s, 0 m; 2 s, 1.8 m; 3.0 s, 10 m; 4.0 s, 22.3 m; 5.0 s, 35.9 m. These data show the collapse was well below free fall. Indeed, Yarimer states in his discussion of this data: “Near time t = 0, the calculated accelerations are influenced by the observed latency, thus lifting the estimate of the upwards reaction force.” It appears that even Yarimer had t(0) problems!

Nevertheless, I have analysed Yarimer’s data (with allowance for the t(0) problem) using the same approach many of us have applied to WTC 7 collapse data. What is most significant is that, even with a time shift of ~ 1.5 seconds, the Sandwell East building fell only about 40 meters in the first 4 seconds of bulk motion with an acceleration of no more than 5 m/s^2. And let’s remember that this was observed for a real-world CD on a 20-story building. Scaling this result to a 47-story, (WTC-7-sized building), I would predict a 50 % collapse to take at least 6 seconds and allowing for a latency period of about 1.5 seconds, a full collapse to take ~ 10 seconds or more.

D. Isobe et al. have carried out finite element calculations on a 20-story steel framed building subjected to a Kobe-wave type of seismic collapse. Isobe found that incremental collapse begins
after an initial 26-second period of vibration during which time plastic hinges are formed and column fractures occur near the ground level of the building. The modelled structure was 50 % collapsed about 10 seconds after the first bulk downward motion, and still only about 35 % collapsed after 14 seconds!

Thus we see experimental and theoretical confirmation that the global collapse of a 20-story building would take at least 10 seconds to partially collapse from deliberate man-made explosive or natural seismic trauma to lower portions of its structure."


Are there cases of real demolitions which experience anything close to the accelerations seen in WTC 7?

After many years of talking, isn't that the place to begin asking questions about the WTC 7 acceleration rates?


Are there any real cases of natural collapse or demolition which experience such extreme acceleration rates?

Why not ask this question first?
 
