I noticed that some x values are repeated up to five times, and that the selection of a trajectory through such a region (depending on what the algorithm does) could vary between:
c     c     c
   vs b vs
b
            b
a     a     a
(i.e. after dropping some number of points from the input series, the middle point either falls midway between the other two, or creates an "elbow" close to the previous or following point).
I thought about working out a way to maintain a stack, such that from the first (bottom) to the last (top) stack element, there would be at most three distinct x values (that seems to be the extent of the "oscillation" you're seeing).
Once a fourth x value appears in the next point, review the stack and output the points that comprise the smoothest trajectory. But that's a lot of work for a potentially insignificant gain in "accuracy".
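The stack idea above could be sketched roughly like this. Everything here is an assumption about the details you left open: the points are `(x, y)` tuples, and "smoothest trajectory" is approximated by keeping, for each repeated x, the point whose y is the median of the ys seen at that x.

```python
from statistics import median

def flush(stack, out):
    """For each distinct x on the stack, keep the point whose y is the
    median of the y values seen at that x -- one (hypothetical) way of
    picking the smoothest trajectory through an oscillating region."""
    by_x = {}
    for x, y in stack:
        by_x.setdefault(x, []).append(y)
    for x, ys in by_x.items():          # insertion order == x order
        out.append((x, median(ys)))

def smooth(points):
    """Buffer points while they span at most three distinct x values;
    when a fourth distinct x appears, review the stack and emit."""
    out, stack = [], []
    for x, y in points:
        xs = {px for px, _ in stack}
        if x not in xs and len(xs) == 3:
            flush(stack, out)
            stack = []
        stack.append((x, y))
    flush(stack, out)                   # drain whatever remains
    return out
```

For example, `smooth([(1,1),(1,3),(2,2),(3,3),(4,4)])` collapses the two points at x=1 to their median, giving `[(1, 2), (2, 2), (3, 3), (4, 4)]`.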
UPDATE: As for your "primary" goal of "preserving the original data" as much as possible, as opposed to creating a smoothed sequence with adjustments affecting all data points: given that the input appears to be quantized, preserving that original data actually preserves the nature of the measuring device rather than the nature of the thing being measured. There may well be value in that, but depending on what your downstream processes are supposed to do with the data, you might rather give them a "fully smoothed" version: that represents a presumably reasonable fiction, as opposed to the quantized, jagged fiction created by your measuring device.
Another update: obviously, by preserving the original data - but not necessarily using it as-is for certain downstream processes - you'll be able to make comparisons if/when you use a different device (or a different configuration of the current device).
given that the input appears to be quantized,
I agree the data is quantized. It appears to be an artifact of the digitisation of the analogue values produced by the sensing device (I'm still waiting for confirmation from the equipment manufacturer).
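As a quick sanity check that the data really is quantized, one rough approach is to scale the values to integers at an assumed resolution and take the GCD of successive differences; a non-trivial GCD suggests a fixed quantization step. The helper name and the assumed 0.001 resolution below are mine, not from the thread.

```python
from math import gcd

def quantum(values, scale=1000):
    """Estimate the quantization step of a series: scale to integers
    (assuming 'scale' covers the device's resolution) and take the GCD
    of the absolute successive differences."""
    ints = [round(v * scale) for v in values]
    g = 0
    for a, b in zip(ints, ints[1:]):
        g = gcd(g, abs(b - a))
    return g / scale
```

On a series such as `[0.0, 0.025, 0.05, 0.1]` this reports a step of `0.025`.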
depending on what your downstream processes are supposed to do with the data,
The cleanup is required because the downstream processing -- FEM software -- interpolates between the supplied values using cubic spline interpolation as the simulation converges. It therefore requires the input data to be monotonic, so that it can produce a 'single-valued cubic spline fit'.
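One minimal way to satisfy that monotonicity requirement, purely as a sketch (the thread doesn't say how duplicates should be resolved; averaging the ys at each repeated x is my assumption), is to collapse runs of equal x values to a single point so x becomes strictly increasing:

```python
def make_monotonic(points):
    """Collapse consecutive points that share the same x to one point
    (mean y, purely illustrative), so x is strictly increasing and a
    single-valued cubic spline fit becomes possible."""
    out = []
    for x, y in points:
        if out and out[-1][0] == x:
            # accumulate y and a count for the repeated x
            px, total, n = out[-1]
            out[-1] = (px, total + y, n + 1)
        else:
            out.append((x, y, 1))
    return [(x, total / n) for x, total, n in out]
```

For example, `make_monotonic([(1, 1), (1, 3), (2, 4)])` yields `[(1, 2.0), (2, 4.0)]`, which a spline routine can then fit as a function of x.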
My choice to avoid producing a "fully smoothed" fit is because I've seen bad interactions between pre-processed fitting and the fitting done internally by the software. These manifest as interminable oscillations in the Newton-Raphson iterations, resulting in extremely extended run times.
you'll get to do comparisons if/when you use a different device (or a different configuration of the current device).
The data is supplied to me; I only get one dataset per sample, but lots of (physically and chemically) different samples. I have no control over how it is produced.