beachnut said: femr2 smooths acceleration using silly methods

You have a smoothing method better than Savitzky-Golay? Then what is it?
I note no response yet, beachnut...

beachnut said: femr2 smooths acceleration using silly methods

You have a smoothing method better than Savitzky-Golay? Then what is it?
Quote: Truther dishonesty?

no....
You snipped all words from beachnut's post 1860 except one: "Yes". Had you quoted in full, you might have noticed the answer. I think it was: "engineering" (as opposed to pure math).
Do you have a smoothing method better than Savitzky-Golay for the purpose here (revealing the underlying acceleration profile trend for position/time data extracted from video)?
"engineering" is not a smoothing method.
Do you have a smoothing method better than Savitzky-Golay for the purpose here (revealing the underlying acceleration profile trend for position/time data extracted from video)? Beachnut clearly doesn't, or he'd have waded in with his size fours.
Quote: I think what beachnut was trying to point out is that, without some engineering understanding of the processes you're investigating, blindly applying a smoothing function to experimental data is a simple GIGO analysis. At least, that's what I took away from the bits of his post that femr2 chose to ignore completely.

The varying smoothing methods I have applied have not been done blindly in the slightest. Numerous differing smoothing methods have been utilised, with various different datasets, from various different video segments.
Quote: Smoothing data should only ever be done as a necessary evil.

Of course. Are you suggesting that high sample rate data containing noise should be derived twice WITHOUT smoothing?

Quote: It's based on a belief that the information you're removing is necessarily noise, rather than signal.

Nonsense. There will be removal of both to some extent, and some noise will also remain. Some higher frequency signal will of course be lost, but as I'm not looking for stuff like *mini jolts* it's really not important data for the purpose in hand... revealing the underlying acceleration profile trend.
Quote: Without any justification for that belief

LOL. Your belief, not mine.
Quote: So the answer to the question "What smoothing method is better than Savitzky-Golay" is, therefore, "Not smoothing the data at all."

Nonsense. Deriving second order data from high sample rate data containing noise WITHOUT smoothing is a really bad idea.
Quote: You're assuming that there is a significant underlying trend masked by random noise.

Nonsense. See previous post. Numerous datasets, numerous smoothing methods, numerous different video segments, numerous different trace locations... same end result. Slight differences in trend, to be expected, but otherwise...
Quote: Until you've got some basis for that assumption

See above. Just because you make inept assumptions does not mean you are right, y'know.
Quote: you're better off using the real data rather than a mathematical abstraction derived from it.

Deriving second order data from high sample rate data containing noise WITHOUT smoothing is a really bad idea.
Quote: Therefore, as I just said, using no smoothing at all is a better smoothing method than Savitzky-Golay in this specific circumstance.

Nonsense. Deriving second order data from high sample rate data containing noise WITHOUT smoothing is a really bad idea. Why don't you have a go... (I already have plenty of graphs which show the acceleration data without any smoothing, if you're not able to do it y'self.)
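For anyone who does want to have a go, here is a minimal sketch of the issue in Python. The frame rate, noise level, and constant-acceleration toy signal are assumptions for illustration only, not femr2's actual trace data:

```python
# Sketch: why double-differencing noisy position/time data without smoothing
# is a bad idea. All numbers (frame rate, noise level, true acceleration)
# are illustrative assumptions, not values from any actual video trace.
import numpy as np
from scipy.signal import savgol_filter

fps = 59.94                                  # assumed video frame rate
dt = 1.0 / fps
t = np.arange(0, 5, dt)
pos = 0.5 * (-9.81) * t**2                   # assumed constant-acceleration fall
pos += np.random.normal(0, 0.01, t.size)     # ~1 cm of tracking noise

# Raw second difference: the tracking noise is divided by dt**2 (a factor of
# ~3600 here), so centimetre jitter swamps the ~10 m/s^2 signal.
accel_raw = np.diff(pos, n=2) / dt**2
print("unsmoothed:  std = %6.1f m/s^2" % accel_raw.std())

# Savitzky-Golay second derivative: a local cubic fit over ~0.5 s windows
# recovers the underlying trend from the same data.
accel_sg = savgol_filter(pos, window_length=31, polyorder=3, deriv=2, delta=dt)
print("S-G deriv=2: mean = %6.2f m/s^2 (true -9.81)" % accel_sg.mean())
```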
Quote: The answer is: No smoothing at all is the best smoothing algorithm, unless you have first already understood the process and know that smoothing will remove noise, not signal. But when you are collecting data in an effort to understand the process, obviously, understanding the process comes later.

ROFL. Blind leading the blind.

Quote: As it is, we cannot know which smoothing algorithm is the best, unless we already understand the process we are modelling with our data.

So you don't have a smoothing method better than S-G for this purpose? You only have to say so, you know. It is not an excuse that you do not know how many datasets exist. It is not an excuse that you do not know that the same trend is revealed with usage of varying smoothing methods, varying trace locations, varying video segments...
The Savitzky–Golay method essentially performs a local polynomial regression (of degree k) on a series of values (of at least k+1 points which are treated as being equally spaced in the series) to determine the smoothed value for each point. Methods are also provided for calculating the first up to the fifth derivatives.
The main advantage of this approach is that it tends to preserve features of the distribution such as relative maxima, minima and width, which are usually 'flattened' by other adjacent averaging techniques (like moving averages, for example).
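A minimal sketch of that description, using SciPy's implementation; the toy signal, window length and polynomial degree below are arbitrary assumptions for illustration:

```python
# Illustration of the local polynomial regression described above, via SciPy.
import numpy as np
from scipy.signal import savgol_filter

x = np.linspace(0, 4 * np.pi, 200)
y = np.sin(x) + np.random.normal(0, 0.1, x.size)   # noisy test signal

# Degree-3 polynomial fitted over each 21-point window (points treated as
# equally spaced), evaluated at the centre to give the smoothed value.
smoothed = savgol_filter(y, window_length=21, polyorder=3)

# The same local fits also yield derivatives directly (deriv up to polyorder);
# delta is the sample spacing, used to scale the derivative.
dx = x[1] - x[0]
dy = savgol_filter(y, window_length=21, polyorder=3, deriv=1, delta=dx)

# Unlike a 21-point moving average, the local fits preserve the height and
# width of the sine's maxima and minima rather than flattening them.
```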
Just because you make inept assumptions does not mean you are right, y'know.
Deriving second order data from high sample rate data containing noise WITHOUT smoothing is a really bad idea.
ROFL. Blind leading the blind.
So you don't have a smoothing method better than S-G for this purpose?

You only have to say so, you know. It is not an excuse that you do not know how many datasets exist. It is not an excuse that you do not know that the same trend is revealed with usage of varying smoothing methods, varying trace locations, varying video segments...

Any suggestions for a better method? Anyone?
Careful!
Leading question. So you did not stop beating your wife yet?
How do you know that all these algorithms don't smooth away signal? Answer: You don't, unless you know already what's going on.
Yes. No smoothing.
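For what "smoothing away signal" looks like in practice, here is a minimal sketch; the step "jolt", noise level and window lengths are assumptions chosen purely for illustration:

```python
# Sketch: a genuine sharp feature (a step "jolt") survives a narrow
# Savitzky-Golay window but is smeared away by an over-wide one.
import numpy as np
from scipy.signal import savgol_filter

t = np.linspace(0, 2, 400)
signal = np.where(t > 1.0, 1.0, 0.0)               # a real feature, not noise
noisy = signal + np.random.normal(0, 0.05, t.size)

narrow = savgol_filter(noisy, window_length=11, polyorder=3)
wide = savgol_filter(noisy, window_length=151, polyorder=3)

# The narrow window keeps the step nearly as sharp as the original; the wide
# window spreads it across most of its width, i.e. signal has been removed.
print("max jump per sample, narrow: %.2f" % np.abs(np.diff(narrow)).max())
print("max jump per sample, wide:   %.2f" % np.abs(np.diff(wide)).max())
```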
Quote: The varying smoothing methods I have applied have not been done blindly in the slightest.

(Insert laughing dog here.)
Quote: Numerous differing smoothing methods have been utilised, with various different datasets, from various different video segments. Guess what... they all reveal the same trend.

Yes, and that general trend is consistent with NIST's data and models. Had femr2's dismissal of NIST's work been justified, we'd have had to dismiss femr2's work as well.
Quote: Nonsense. Deriving second order data from high sample rate data containing noise WITHOUT smoothing is a really bad idea.

True.
Quote: Numerous datasets, numerous smoothing methods, numerous different video segments, numerous different trace locations... same end result. Slight differences in trend, to be expected, but otherwise...

True.
Quote: I'm not the one making the assumptions here. You are.

LMAO. Here is a list of the assumptions you have made merely in the last couple of posts...
Quote: I think what beachnut was trying to point out is that
Assumption.

Quote: without some engineering understanding of the processes you're investigating
Assumption.

Quote: blindly applying a smoothing function to experimental data is a simple GIGO analysis
Assumption.

Quote: At least, that's what I took away from the bits of his post that femr2 chose to ignore completely.
Assumption.

Quote: Smoothing data should only ever be done as a necessary evil.
Assumption.

Quote: It's based on a belief that the information you're removing is necessarily noise, rather than signal.
Assumption.

Quote: Without any justification for that belief
Assumption.

Quote: for example, if the information is being used to try to understand an unknown mechanism, which one might think was the starting point for anyone trying to overturn an established understanding
Assumption.
Quote: So the answer to the question "What smoothing method is better than Savitzky-Golay" is, therefore, "Not smoothing the data at all."

Utter nonsense in this context.
Quote: You're assuming that there is a significant underlying trend masked by random noise.

Nope. Revealed with numerous different datasets. Numerous different video segments. Numerous different trace locations. Numerous different smoothing methods (and no smoothing at all).
Quote: Until you've got some basis for that assumption

Which I have. You are assuming I don't.
Quote: you're better off using the real data rather than a mathematical abstraction derived from it

Nonsense. Suggest you re-read this thread from the first page. Highly relevant to the current discussion AGAIN.
Quote: Therefore, as I just said, using no smoothing at all is a better smoothing method than Savitzky-Golay in this specific circumstance.

In this specific circumstance using no smoothing is utter nonsense. Shame.
Quote: Yes, and that general trend is consistent with NIST's data and models.

In what way is the NIST acceleration profile data consistent with mine?
Quote: Had femr2's dismissal of NIST's work been justified, we'd have had to dismiss femr2's work as well.

What dismissal are you referring to?
Quote: The reason we don't have to dismiss femr2's work is that femr2's dismissals of NIST's analysis were, to put it as charitably as possible, hyperbole.

Be specific.
Quote: True.

Aiii.

Quote: True.

Aiii.
Quote: No one has ever made a credible claim that the small differences between femr2's analyses and NIST's are significant.

NIST didn't generate any detailed acceleration profile data at all. I see a repost of the issues with the NIST trace data looming...
Quote: I'm glad femr2 has finally conceded that point.

I have conceded no such thing.
Quote: What dismissal are you referring to?

Your entire JREF existence, and that of the poster formerly known as Major_Tom, is dedicated to dismissing the NIST report.
This is it:

Quote: NIST didn't generate any detailed acceleration profile data at all. I see a repost of the issues with the NIST trace data looming...