
Care to Comment

You are an obvious obfuscator who never adds to the conversation.

What I said about deceleration and amplification here is correct, yet you insist on making feigned posts about its accuracy.

Take a hike.
Incorrect on all points again, Tony. At least you're consistent.
 
The Sauret video from which the data were taken runs at 30 frames per second, and if one took data every frame it would be extremely noisy. Even with the measurements taken every 167 milliseconds, some noise will still be present.

Symmetric differencing does generate an average about a data point and thus will smooth out noise. However, when charting data like this it is the trend which is significant.

Using every data point by simple differencing essentially doubles the noise.

The question one needs to ask is why the average between two data points on either side of the point in question causes a higher velocity than the previous average. How did it get to be greater?

This is the reason for regression analysis.
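A minimal sketch of what the two differencing schemes do, using invented position samples 167 ms apart rather than the paper's actual measurements:

[code]
# Minimal sketch (hypothetical numbers, not the paper's data): simple forward
# differencing vs. symmetric (central) differencing on positions sampled
# every 0.167 s.
dt = 0.167  # seconds between samples

# Invented roofline drop measurements (metres); the brief slowdown between
# the 4th and 5th samples is put in purely for illustration.
x = [0.00, 0.14, 0.42, 0.84, 1.20, 1.82, 2.52]

# Simple differencing: one velocity per interval, full time resolution,
# but any position noise enters each estimate directly.
v_simple = [(x[i + 1] - x[i]) / dt for i in range(len(x) - 1)]

# Symmetric differencing: each velocity is the average of the two adjacent
# simple differences, which smooths noise but also halves the resolution.
v_symmetric = [(x[i + 1] - x[i - 1]) / (2 * dt) for i in range(1, len(x) - 1)]

print([round(v, 2) for v in v_simple])     # the one-interval slowdown is visible
print([round(v, 2) for v in v_symmetric])  # after averaging it has disappeared
[/code]

Whether that averaging is hiding noise or hiding a real jolt is, of course, the point in dispute here.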

I know some will say that the measurement resolution is not sufficient to discern whether or not a jolt took place, but that is a feigned argument for two reasons: one, the trend is obviously increasing, and two, the size of the jolt required is much larger than what could be masked by one data point.

The premise of the Missing Jolt paper is valid for the reasons stated above.

In any case, we will be redoing the measurements with a more sophisticated system called Tracker, which is in the Open Source Physics project and is available on the Internet. I will make the results of the new data set public.

It would probably be good for some of you guys here to do some measurements yourself.
How can you be sure that the points in question are merely noise?
 
Easy to claim as you are also the only one to claim to be able to make sense of what you are saying.
I can make sense of it now, but I strongly believe Tony has changed the way he interprets "acceleration" in order to cover up his previous mistake, rather than admit error. He seems to be a bit sensitive about his technical savvy.
 
Easy to claim as you are also the only one to claim to be able to make sense of what you are saying.

You don't seem to understand, and it would be best if you continued to use figures to express your argument, as you did one time earlier.
 
How can you be sure that the points in question are merely noise?

That is what I believe, and I did explain that to prove it isn't, one has to answer the question of how the average about the data point in question can be higher than the previous average.

However, I also said we will be taking a new data set with a more sophisticated system.
 
Originally Posted by D'rok
This is about as straightforward and clear cut as can be.

Response, Tony? It's right there in the table on page 7 of your paper.

That appears to be true.

I'll take a look at it.
<snip>
However, I also said we will be taking a new data set with a more sophisticated system.


Dewey/Truman Pre-pared Headlines
...

Szamboti Admits Error - Now Debunker All Is Forgiven.
Ryan And Tony Trade Beers At Hooters

...

Szamboti Complains His Original Data Doesn't Prove Premise Therefore Data Wrong
Promises To Torture New Data Until It Does

All Is Normal
As You Were.
 
Dewey/Truman Pre-pared Headlines
...

Szamboti Admits Error - Now Debunker All Is Forgiven.
Ryan And Tony Trade Beers At Hooters

...

Szamboti Complains His Original Data Doesn't Prove Premise Therefore Data Wrong
Promises To Torture New Data Until It Does

All Is Normal
As You Were.

Basque, you forgot this part.

I am sure some here will say that the measurement resolution is not sufficient to discern whether or not a jolt took place, but that is a feigned argument for two reasons: First is that the trend is obviously increasing, and secondly the size of the jolt required is much higher than what could be indicated by one data point.

The premise of the Missing Jolt paper is valid for the reasons stated above.


Another important thing that nobody here has even attempted to address is why measurements on all of the Verinage demolitions show a very pronounced deceleration for much more than one data point and yet we don't see that with the WTC 1 measurements.
 
Originally Posted by D'rok
This is about as straightforward and clear cut as can be.

Response, Tony? It's right there in the table on page 7 of your paper.

That appears to be true.

I'll take a look at it.
<snip>
However, I also said we will be taking a new data set with a more sophisticated system.
Basque, you forgot this part.

I am sure some here will say that the measurement resolution is not sufficient to discern whether or not a jolt took place, but that is a feigned argument for two reasons: First is that the trend is obviously increasing, and secondly the size of the jolt required is much higher than what could be indicated by one data point.

The important thing that nobody here has even attempted to address is why measurements on all of the Verinage demolitions show a very pronounced deceleration for much more than one data point and yet we don't see that with the WTC 1 measurements.

The premise of the Missing Jolt paper is valid for the reasons stated above.

Dewey/Truman Pre-pared Headlines
...

Szamboti Admits Error - Now Debunker All Is Forgiven.
Ryan And Tony Trade Beers At Hooters

...

Szamboti Complains His Original Data Doesn't Prove Premise Therefore Data Wrong
Promises To Torture New Data Until It Does

All Is Normal
As You Were.
 
Another important thing that nobody here has even attempted to address is why measurements on all of the Verinage demolitions show a very pronounced deceleration for much more than one data point and yet we don't see that with the WTC 1 measurements.


I'm disappointed you don't read all my posts.

My bolding


Wrong. TS's own data (W.D. Clinger's velocity-vs-time chart) show this negative slope; therefore a deceleration occurred (assuming TS's data), and therefore the Missing Jolt paper, which is based on there being no deceleration, is wrong.


If there is a loss of velocity in the Verinage demolitions, it is because, unlike at the Towers, the falling loads of the block above are equally distributed onto the block below. And TS knows this.

Wrong – Balzac-Vitry was not a column and beam structure. It was a precast concrete loadbearing wall structure.
Balzac-Vitry Tower
[qimg]http://www.cg94.fr/files/diaporama/11190/11199p.jpg[/qimg]

Tannen Towers
“Architectural precast concrete wall panels that act as loadbearing elements in a building are both a structurally efficient and economical means of transferring floor and roof loads through the structure and into the foundation. …

The 32-story Tannen Towers condominium project in Atlantic City, New Jersey, completed in 1987, uses portal frames at the base and bearing walls in the upper levels (see Fig. 36). The building is subdivided from top to bottom by a central corridor. A row of 37 ft (11.3 m) long bearing walls, which are typically 8 in. (203 mm) thick, runs along either side of the corridor. The walls cantilever 11 ft (3.35 m) beyond the face of the base structure on both sides of the building. To stabilize the structure, the design links pairs of bearing walls across the corridor with steel ties: back-to-back angles reinforced with a continuous plate.

The entire structure was built using precast, prestressed concrete hollow core slabs, balcony slabs, precast load bearing walls, stairs and landings.”

http://www.pci.org/view_file.cfm?file=PR-24.PDF


Another example of a concrete loadbearing wall structure
[qimg]http://www.cpci.ca/images/sectionpics/potm/22005/1.jpg[/qimg]
 
Another important thing that nobody here has even attempted to address is why measurements on all of the Verinage demolitions show a very pronounced deceleration for much more than one data point and yet we don't see that with the WTC 1 measurements.

Verinage was specifically engineered to exhibit the perfect limiting case of an aligned column to column impact, a "luxury" that none of the towers had.

Considering you've made it abundantly clear you believe no such tilt exists with tower 1, and you want to avoid WTC 2 altogether I'll be very surprised if it ever sinks in...
 
That is what I believe...
That's a little underwhelming

...and I did explain that to prove it isn't, one has to answer the question of how the average about the data point in question can be higher than the previous average.

However, I also said we will be taking a new data set with a more sophisticated system.

Sounds like you're throwing your own data under the bus rather than re-consider your conclusion. Actually, it sounds like your conclusion is pre-determined rather than arrived at.
 
That's a little underwhelming



Sounds like you're throwing your own data under the bus rather than re-consider your conclusion. Actually, it sounds like your conclusion is pre-determined rather than arrived at.

Hardly.
 
Verinage was specifically engineered to exhibit the perfect limiting case of an aligned column to column impact, a "luxury" that none of the towers had.

Considering you've made it abundantly clear you believe no such tilt exists with tower 1, and you want to avoid WTC 2 altogether I'll be very surprised if it ever sinks in...

Take a real hard look at the Balzac-Vitry demolition. The whole upper section shifts to the side by several feet, so it cannot be having a perfect column-on-column impact, yet it still shows a serious deceleration.

The reality that a deceleration should occur even without perfect column alignment is overwhelming. The fact that it doesn't in WTC 1 is overwhelming proof that the strength of the structure below was largely removed before impact.
 
Thank you, Tony, for acknowledging that symmetric differencing is a form of data smoothing. Although you still have not acknowledged that your raw unsmoothed data show an actual decrease in velocity, I guess we should thank you for not repeating your usual denial of that fact.

Symmetric differencing does generate an average about a data point and thus will smooth out noise.
Yes, and it also smooths out signal.

Using every data point by simple differencing essentially doubles the noise.
Yes, and it also improves the resolution by a factor of two (compared to symmetric differencing).

The question one needs to ask is why the average between two data points on either side of the point in question causes a higher velocity than the previous average. How did it get to be greater?
I see two possible interpretations of this question.

You might be asking what could cause the downward velocity of a collapsing structure to increase with time, in which case the answer is gravity.

On the other hand, you might be asking us to explain to you why velocities calculated by balanced (symmetric) differencing aren't the same as those calculated by simple forward or backward differencing, in which case the answer is that balanced differencing is a form of data smoothing that effectively degrades the resolution of your data by a factor of two.

Both interpretations of your question leave me to wonder whether your paper's first author might have been responsible for the technical aspects of your paper, with the second author brought in to handle the religious aspects.

This is the reason for regression analysis and when charting data like this it is the trend which is significant.
Nonsense. Everyone knows the trend (average acceleration) is about 0.7g. If that 0.7g trend were the only significant issue, we wouldn't be having this conversation.

The central question of your paper, silly though it be, is whether that 0.7g average is (1) a fairly smooth acceleration, as would be expected from the smeared-out collision caused by a tilt, or (2) the smoothed-out average of a jerky, jolting acceleration, as expected by MacQueen and Szamboti.
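To make that distinction concrete, here is a toy illustration (all numbers invented) of why the 0.7g average by itself cannot settle the question:

[code]
# Toy illustration (invented numbers): two downward acceleration histories
# with the same ~0.7 g average.  The trend alone cannot distinguish a smooth
# collapse from a jerky, jolting one.
g = 9.81  # m/s^2

smooth = [0.7 * g] * 6                 # steady 0.7 g throughout
jolting = [1.0 * g, 1.0 * g, -0.2 * g,  # includes a brief deceleration
           1.0 * g, 0.2 * g, 1.2 * g]

average = lambda a: sum(a) / len(a)
print(round(average(smooth) / g, 2))   # 0.7
print(round(average(jolting) / g, 2))  # 0.7
[/code]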

I am sure some here will say that the measurement resolution is not sufficient to discern whether or not a jolt took place,
Yes, there are competent people here. One of them (Dave Rogers) has analyzed the situation correctly.

but that is a feigned argument for two reasons: First is that the trend is obviously increasing, and secondly the size of the jolt required is much higher than what could be indicated by one data point.
(Insert laughing dogs here.)

"first is that the trend is obviously increasing"
The object is being accelerated downward by gravity. No one in his right mind would expect the smoothed trend of the downward component of its velocity vector to be anything other than increasing.

"the size of the jolt required is much higher than what could be indicated by one data point."
A single data point measures position at a single point in time. It can't show anything about velocity. To say anything at all about velocity, you have to look at two or more data points. Even two data points can't show anything about acceleration. To say anything at all about acceleration, you have to look at three or more data points.
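As a minimal worked example (positions invented, 0.167 s apart, nothing to do with the paper's table):

[code]
# Minimal worked example (hypothetical positions, not the paper's data):
# two position samples give one velocity estimate; three samples give two
# velocities and hence one acceleration estimate.
dt = 0.167                       # seconds between samples
x0, x1, x2 = 0.00, 0.30, 0.75    # invented positions in metres

v01 = (x1 - x0) / dt             # velocity over the first interval
v12 = (x2 - x1) / dt             # velocity over the second interval
a = (v12 - v01) / dt             # acceleration: needs all three points

print(round(v01, 2), round(v12, 2), round(a, 2))
[/code]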

Finally, and most important: Trends and averages can't show anything about the presence or absence of jolts. In this context, jolts are brief decreases in the downward component of the instantaneous acceleration vector, and cannot be estimated without estimates of the downward component of the instantaneous velocity vector. Trends and averages say nothing about brief changes in instantaneous velocity or instantaneous acceleration.

If you look at MacQueen and Szamboti's three data points for position at times 1.5, 1.67, and 1.83 seconds, they show a decrease in velocity: this is exactly the kind of jolt that MacQueen and Szamboti say is missing from their data. That's an epic fail.

The premise of the Missing Jolt paper is valid for the reasons stated above.
As I have explained above, your "reasons stated above" are laughable.

I can't just accept your claim on the basis of your technical authority, either. For at least four months now, you have been having obvious difficulties with grade school arithmetic.

In any case, we will be redoing the measurements with a more sophisticated system called Tracker, which is in the Open Source Physics project and is available on the Internet. I will make the results of the new data set public.

It would probably be good for some of you guys here to do some measurements yourself.
You and MacQueen made a truly extraordinary claim that's contradicted by your own raw data. It would have been a good idea for you to have realized that before publishing your paper instead of after.

Competent peer review would have helped.
 
The reality that a deceleration should occur even without perfect column alignment is overwhelming. The fact that it doesn't in WTC 1 is overwhelming proof that the strength of the structure below was largely removed before impact.

The Balzac-Vitry building was more robustly built than the towers, so the deceleration should be far greater than observed in the WTC towers. I'll let some number-cruncher work out how much. Then you can compare notes.

You handwave away too many obvious differences.
 
The graph in post #272 is not an accurate representation of the data in the Missing Jolt paper.

Yes it is. It is an accurate representation of your raw data before smoothing.

There is no velocity value in the table on page 8 which is less than the one that preceded it in time, so there is never a loss in velocity, and thus there would be no negative slope in a velocity curve made from that data.

The values in the table on page 8 are derived by applying a two-point smoothing algorithm to the raw data, which is why there are no negative slopes in your velocity curve. Your manipulation of the data has removed the very phenomenon you were trying to prove was not present, which is convenient to say the least. You're either incompetent or a liar; there is no third alternative here.
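A minimal sketch of that effect, with made-up velocity values rather than anything from the paper:

[code]
# Minimal sketch (made-up velocities, not the paper's table): a two-point
# average of adjacent values hides a single one-sample drop.
v_raw = [1.8, 2.5, 2.2, 3.7, 4.2]   # contains one decrease (2.5 -> 2.2)
v_smoothed = [round((v_raw[i] + v_raw[i + 1]) / 2, 2)
              for i in range(len(v_raw) - 1)]

print(v_smoothed)   # [2.15, 2.35, 2.95, 3.95] -- strictly increasing, drop gone
[/code]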

You still never answered my question on whether you have a job other than replying to posts on this forum.

The forum rules forbid me to advise you on where you can stick your personal questions.

Dave
 
Another important thing that nobody here has even attempted to address is why measurements on all of the Verinage demolitions show a very pronounced deceleration for much more than one data point and yet we don't see that with the WTC 1 measurements.

That is an outright, barefaced lie. You've been told several times that the verinage structures are completely different in construction to the WTC towers, and more importantly that the structure in a verinage demolition is removed symmetrically so that the upper block falls without a tilt. There is very much more area for the vertical structures to collide directly, the fall is vertical so that the impact is axial, and the impact is simultaneous across the entire structure. The absence of all three of these features in the WTC collapses results in either the absence of a jolt or a much smaller one; the clearly visible 2 g jolt in your raw data, which you've removed by data smoothing, is easily large enough.

Dave
 
This is the reason for regression analysis and when charting data like this it is the trend which is significant.

Good god, I didn't realise it was possible for you to come up with a more moronic excuse. You're looking for discontinuities in the data, and you're using regression analysis to look for overall trends? If I even have to explain why that's idiotic, you're probably incapable of understanding the answer.

Dave
 
That is what I believe, and I did explain that to prove it isn't, one has to answer the question of how the average about the data point in question can be higher than the previous average.

It's simple arithmetic. The discrepancy between the raw and the smoothed data comes about because of the smoothing procedure you're applying. It should be obvious that a smoothing algorithm removes spurious peaks and troughs in data; that is, after all, exactly what a smoothing algorithm is supposed to do. Unfortunately, it will just as easily remove significant peaks and troughs in data. Since you're looking for a significant trough, you shouldn't be smoothing the data. If your data is noisy enough that you need to smooth it, then it's too noisy to detect the presence or absence of the feature you're looking for.

This is first year undergraduate stuff, if that.

Dave
 
