Merged Discussion of femr's video data analysis

I think what beachnut was trying to point out is that, without some engineering understanding of the processes you're investigating, blindly applying a smoothing function to experimental data is a simple GIGO analysis. At least, that's what I took away from the bits of his post that femr2 chose to ignore completely.

Smoothing data should only ever be done as a necessary evil. It's based on a belief that the information you're removing is necessarily noise, rather than signal. Without any justification for that belief - for example, if the information is being used to try to understand an unknown mechanism, which one might think was the starting point for anyone trying to overturn an established understanding - then smoothing shouldn't even be attempted. So the answer to the question "What smoothing method is better than Savitzky-Golay" is, therefore, "Not smoothing the data at all."

Dave
 
Truther dishonesty?

no....

Nuh, that would be assuming too much. Beachnut is pretty notorious for ranting, and it is not always clear what is information, and what is rant. In that particular case, in reply to femr's "You have a smoothing method better than Savitzky-Golay ?", I read beachnut's "For a gravity collapse of a building? Yes, but then I am an engineer, and I do this for a living; engineering." as "yes, engineering is better than Savitzky-Golay, and I say this because I am an engineer". But it isn't altogether clear whether this is what he meant at all.
 
You snipped all words from beachnut's post 1860 except one: "Yes". Had you quoted in full, you might have noticed the answer. I think it was: "engineering" (as opposed to pure math).

"engineering" is not a smoothing method.

Do you have a smoothing method better than Savitzky-Golay for the purpose here (revealing the underlying acceleration profile trend for position/time data extracted from video) ? Beachnut clearly doesn't, or he'd have waded in with his size fours.
 
Do you have a smoothing method better than Savitzky-Golay for the purpose here (revealing the underlying acceleration profile trend for position/time data extracted from video) ?

You're assuming that there is a significant underlying trend masked by random noise. Until you've got some basis for that assumption, you're better off using the real data rather than a mathematical abstraction derived from it. Therefore, as I just said, using no smoothing at all is a better smoothing method than Savitzky-Golay in this specific circumstance.

Dave
 
"engineering" is not a smoothing method.

Do you have a smoothing method better than Savitzky-Golay for the purpose here (revealing the underlying acceleration profile trend for position/time data extracted from video) ? Beachnut clearly doesn't, or he'd have waded in with his size fours.

Read Dave's latest reply. It came before mine. Do you read posts in reverse order?

The answer is: No smoothing at all is the best smoothing algorithm, unless you have first already understood the process and know that smoothing will remove noise, not signal.
But when you are collecting data in an effort to understand the process, obviously, understanding the process comes later.

As it is, we cannot know which smoothing algorithm is the best, unless we already understand the process we are modelling with our data.
 
I think what beachnut was trying to point out is that, without some engineering understanding of the processes you're investigating, blindly applying a smoothing function to experimental data is a simple GIGO analysis. At least, that's what I took away from the bits of his post that femr2 chose to ignore completely.
The varying smoothing methods I have applied have not been done blindly in the slightest. Numerous differing smoothing methods have been utilised, with various different datasets, from various different video segments.

Guess what .... they all reveal the same trend.

Do you have a smoothing method better than Savitzky-Golay for the datasets in the context of discussion ?

I didn't ignore beachnut's waffle. Engineering is not a smoothing method. Therefore the rest of his post was irrelevant to the question.

Smoothing data should only ever be done as a necessary evil.
Of course. Are you suggesting that high sample rate data containing noise should be differentiated twice WITHOUT smoothing ? :jaw-dropp

It's based on a belief that the information you're removing is necessarily noise, rather than signal.
Nonsense. There will be removal of both to some extent, and some noise will also remain. Some higher frequency signal will of course be lost, but as I'm not looking for stuff like *mini jolts* it's really not important for the purpose at hand: revealing the underlying acceleration profile trend.

Without any justification for that belief
LOL. Your belief, not mine.

So the answer to the question "What smoothing method is better than Savitzky-Golay" is, therefore, "Not smoothing the data at all."
Nonsense. Deriving second order data from high sample rate data containing noise WITHOUT smoothing is a really bad idea.


Do you have a better smoothing method ?

The high-order single poly-fit received some criticism for its behaviour near end-points, which is fair enough. Symmetric difference and running average both have time-based issues, so is either better than S-G ?

Anyone ? ;)
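
For anyone who fancies trying it, here's a minimal sketch in Python/SciPy. The frame rate, noise level and S-G window below are purely illustrative assumptions, not values taken from any actual trace:

Code:
import numpy as np
from scipy.signal import savgol_filter

# Illustrative sketch only: a synthetic position trace, NOT real roofline data.
# Frame rate, noise level and window length are assumed for demonstration.
fps = 60.0                                        # assumed frame rate
dt = 1.0 / fps
t = np.arange(0.0, 5.0, dt)
true_acc = -0.3 * 9.81                            # constant downward acceleration
pos = 0.5 * true_acc * t**2                       # ideal position
pos += np.random.normal(scale=0.05, size=t.size)  # ~5 cm tracking jitter

# Naive second derivative: central (symmetric) differences applied twice.
acc_raw = np.gradient(np.gradient(pos, dt), dt)

# Savitzky-Golay: local cubic fit over ~0.5 s, second derivative returned directly.
acc_sg = savgol_filter(pos, window_length=31, polyorder=3, deriv=2, delta=dt)

print("true acceleration           :", true_acc)
print("raw double-difference error :", np.std(acc_raw - true_acc))
print("Savitzky-Golay error        :", np.std(acc_sg - true_acc))

On synthetic data like this the raw double difference buries the constant acceleration in amplified noise, while the S-G second derivative recovers it; the usual caveat about behaviour in the first and last half-window still applies.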
 
You're assuming that there is a significant underlying trend masked by random noise.
Nonsense. See previous post. Numerous datasets, numerous smoothing methods, numerous different video segments, numerous different trace locations....same end result. Slight differences in trend, to be expected, but otherwise... :rolleyes:

Until you've got some basis for that assumption
See above. Just because you make inept assumptions does not mean you are right, y'know ;)

you're better off using the real data rather than a mathematical abstraction derived from it.
Deriving second order data from high sample rate data containing noise WITHOUT smoothing is a really bad idea.

Therefore, as I just said, using no smoothing at all is a better smoothing method than Savitzky-Golay in this specific circumstance.
Nonsense. Deriving second order data from high sample rate data containing noise WITHOUT smoothing is a really bad idea. Why don't you have a go... (I already have plenty of graphs which show the acceleration data without any smoothing if you're not able to do it y'self)
 
The answer is: No smoothing at all is the best smoothing algorithm, unless you have first already understood the process and know that smoothing will remove noise, not signal.
But when you are collecting data in an effort to understand the process, obviously, understanding the process comes later.
ROFL. Blind leading the blind.

As it is, we cannot know which smoothing algorithm is the best, unless we already understand the process we are modelling with our data.
So you don't have a smoothing method better than S-G for this purpose ? You only have to say so, you know. Not knowing how many datasets exist is no excuse, and neither is not knowing that the same trend is revealed using varying smoothing methods, varying trace locations, varying video segments...

The Savitzky–Golay method essentially performs a local polynomial regression (of degree k) on a series of values (of at least k+1 points which are treated as being equally spaced in the series) to determine the smoothed value for each point. Methods are also provided for calculating the first up to the fifth derivatives.

The main advantage of this approach is that it tends to preserve features of the distribution such as relative maxima, minima and width, which are usually 'flattened' by other adjacent averaging techniques (like moving averages, for example).
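
For what it's worth, SciPy's implementation exposes exactly those options. A minimal sketch, with the window length, polynomial order and frame spacing chosen purely for illustration (not the values used for any of the traces in this thread):

Code:
import numpy as np
from scipy.signal import savgol_filter

dt = 1.0 / 60.0                            # assumed frame spacing, illustration only
position = np.linspace(0.0, -10.0, 300)    # placeholder trace; substitute real data

# Local cubic fits over a 21-frame window; deriv selects which derivative is returned.
smoothed     = savgol_filter(position, window_length=21, polyorder=3)
velocity     = savgol_filter(position, window_length=21, polyorder=3, deriv=1, delta=dt)
acceleration = savgol_filter(position, window_length=21, polyorder=3, deriv=2, delta=dt)

The default edge handling (mode='interp') fits a single polynomial to the first and last window rather than padding the data, which is worth keeping in mind when looking at the ends of a trace.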

Any suggestions for a better method ? Anyone ?
 
Just because you make inept assumptions does not mean you are right, y'know ;)

I'm not the one making the assumptions here. You are.

Deriving second order data from high sample rate data containing noise WITHOUT smoothing is a really bad idea.

So is applying a smoothing algorithm without understanding of the physical processes you're trying to model.

Dave
 
ROFL. Blind leading the blind.

Careful!

So you don't have a smoothing method better than S-G for this purpose ?

Leading question. So, have you stopped beating your wife yet?

You only have to say so, you know. Not knowing how many datasets exist is no excuse, and neither is not knowing that the same trend is revealed using varying smoothing methods, varying trace locations, varying video segments...

How do you know that all these algorithms don't smooth away signal? Answer: You don't, unless you know already what's going on.

Any suggestions for a better method ? Anyone ?

Yes. No smoothing.
 
Careful!

Leading question. So, have you stopped beating your wife yet?

How do you know that all these algorithms don't smooth away signal? Answer: You don't, unless you know already what's going on.

Yes. No smoothing.

You must smooth the data using some fancy algorithm to fool the 911 truthers into thinking you are using science, not technobabble nonsense. He can't explain how this relates to 911 conspiracies, or draw a conclusion beyond attacking NIST.
 
The varying smoothing methods I have applied have not been done blindly in the slightest.
(Insert laughing dog here.)

Numerous differing smoothing methods have been utilised, with various different datasets, from various different video segments.

Guess what .... they all reveal the same trend.
Yes, and that general trend is consistent with NIST's data and models. Had femr2's dismissal of NIST's work been justified, we'd have had to dismiss femr2's work as well.

The reason we don't have to dismiss femr2's work is that femr2's dismissals of NIST's analysis were, to put it as charitably as possible, hyperbole.

Nonsense. Deriving second order data from high sample rate data containing noise WITHOUT smoothing is a really bad idea.
True.

Numerous datasets, numerous smoothing methods, numerous different video segments, numerous different trace locations....same end result. Slight differences in trend, to be expected, but otherwise... :rolleyes:
True.

No one has ever made a credible claim that the small differences between femr2's analyses and NIST's are significant. I'm glad femr2 has finally conceded that point.
 
I'm not the one making the assumptions here. You are.
LMAO. Here is a list of the assumptions you have made merely in the last couple of posts...

I think what beachnut was trying to point out is that
Assumption.

without some engineering understanding of the processes you're investigating
Assumption.

blindly applying a smoothing function to experimental data is a simple GIGO analysis
Assumption.

At least, that's what I took away from the bits of his post that femr2 chose to ignore completely.
Assumption.

Smoothing data should only ever be done as a necessary evil.
Assumption.

It's based on a belief that the information you're removing is necessarily noise, rather than signal.
Assumption.

Without any justification for that belief
Assumption.

for example, if the information is being used to try to understand an unknown mechanism, which one might think was the starting point for anyone trying to overturn an established understanding
Assumption.

So the answer to the question "What smoothing method is better than Savitzky-Golay" is, therefore, "Not smoothing the data at all."
Utter nonsense in this context.

You're assuming that there is a significant underlying trend masked by random noise.
Nope. Revealed with numerous different datasets. Numerous different video segments. Numerous different trace locations. Numerous different smoothing methods (and no smoothing at all).

So, yet another assumption from you, Dave. Tsk, tsk...

Until you've got some basis for that assumption
Which I have. You are assuming I don't :rolleyes:

you're better off using the real data rather than a mathematical abstraction derived from it
Nonsense. Suggest you re-read this thread from the first page. Highly relevant to the current discussion AGAIN.

Therefore, as I just said, using no smoothing at all is a better smoothing method than Savitzky-Golay in this specific circumstance.
In this specific circumstance using no smoothing is utter nonsense. Shame.

How many assumptions ? :rolleyes:
 
Yes, and that general trend is consistent with NIST's data and models.
In what way is the NIST acceleration profile data consistent with mine ?

Had femr2's dismissal of NIST's work been justified, we'd have had to dismiss femr2's work as well.
What dismissal are you referring to ?

The reason we don't have to dismiss femr2's work is that femr2's dismissals of NIST's analysis were, to put it as charitably as possible, hyperbole.
Be specific.

Aiii.

Aiii.

No one has ever made a credible claim that the small differences between femr2's analyses and NIST's are significant.
NIST didn't generate any detailed acceleration profile data at all. I see a repost of the issues with the NIST trace data looming :rolleyes:

I'm glad femr2 has finally conceded that point.
I have conceded no such thing.
 
... NIST didn't generate any detailed acceleration profile data at all. I see a repost of the issues with the NIST trace data looming :rolleyes: ...
This is it.

Why does NIST need a detailed acceleration profile for a gravity collapse caused by fire?
 
