The following question is detailed in 1D, with time as the independent variable; similar questions could apply in other dimensions.
In several signal processing techniques, such as blind source separation (BSS), filter banks, or deconvolution, one may wish to estimate a signal $x(t)$ but only recovers a scaled and delayed estimate $s\,x(t+d)$, where $s$ is a scale factor and $d$ a delay. In higher dimensions, rotations, shears, and many other transformations can be added. One might even stumble upon warped data, $x_{s,d,w}(t) = s\,x(t/w+d)$, as in super-resolution for instance.
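To make the model concrete, here is a minimal NumPy sketch (the helper name `observe` and the linear-interpolation choice are mine) that builds a sampled version of $x_{s,d,w}$ from samples of $x$:

```python
import numpy as np

def observe(x, t, s=1.0, d=0.0, w=1.0):
    """Return samples of s * x(t / w + d), using linear interpolation
    of the sampled reference x defined on the time grid t."""
    return s * np.interp(t / w + d, t, x, left=0.0, right=0.0)

# Example: a Gaussian pulse, then a scaled, delayed, slightly warped copy.
t = np.linspace(0.0, 10.0, 1001)
x = np.exp(-0.5 * ((t - 5.0) / 0.3) ** 2)
y = observe(x, t, s=0.7, d=-1.2, w=1.05)
```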
In theory, one can estimate $s$ and $d$ in the continuous domain with local correlation or Fourier transforms (How to match 2 signals that have same information, though shifted and scaled). The warping $w$ might be estimated with the scale transform or wavelet representations. I have read several BSS papers and books, asked people, and attended conferences, yet was not able to find a standard, or at least a usable, metric.
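For the delay/scale part only (no warping), a rough sketch of the correlation-based estimate could look like the following; sub-sample refinement of the peak and windowing are deliberately left out:

```python
import numpy as np

def estimate_delay_scale(y, x):
    """Estimate an integer delay d (in samples) and a scale s such that
    y[n] ~ s * x[n - d], via the peak of the cross-correlation."""
    n = len(x)
    c = np.correlate(y, x, mode="full")          # lags from -(n-1) to n-1
    d = int(np.argmax(np.abs(c))) - (n - 1)      # lag of the correlation peak
    x_shifted = np.roll(x, d)                    # align the reference (circularly)
    s = np.dot(y, x_shifted) / np.dot(x_shifted, x_shifted)  # least-squares scale
    return d, s
```

In practice one would restrict the lag search range and refine the peak (e.g. by parabolic interpolation) to reach a sub-sample delay.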
In image processing (and it works on 1D signals as well), the Structural Similarity (SSIM) index somehow compensates for offset and variance differences.
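For reference, a minimal, global (single-window) version of the SSIM formula applied to 1D signals could be sketched as below; the constants follow the usual $C_1=(k_1 L)^2$, $C_2=(k_2 L)^2$ convention, and the original SSIM uses local windowed averages rather than one global window:

```python
import numpy as np

def global_ssim(x, y, data_range=None, k1=0.01, k2=0.03):
    """Single-window SSIM between two 1D signals: compares means (offset),
    variances (contrast/scale) and covariance (structure)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    if data_range is None:
        data_range = max(x.max() - x.min(), y.max() - y.min())
    c1, c2 = (k1 * data_range) ** 2, (k2 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cxy + c2)) / ((mx**2 + my**2 + c1) * (vx + vy + c2))
```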
- Are there practical error metrics to compare the original $x(t)$ with the transformed $x_{s,d,w}(t)$ in the context of sampled signals and noise? Indeed, the discretization induced by sampling complicates the comparison (imagine, for instance, a $1$-sample spike on the sampling grid that is delayed by a non-integer time; see the sketch after this list), as does the noise.
- Should one resort to asymmetric quantities such as divergences?
- Can other signal properties help (bandpass, sparse, positive, etc.)?
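To make the sampling issue in the first bullet concrete, the sketch below (my own illustration, assuming a periodic/FFT model of the signal) delays a 1-sample spike by half a sample: the pointwise $\ell_2$ error against the original is large even though the underlying continuous spike is unchanged, which is exactly what a useful metric should forgive:

```python
import numpy as np

def fractional_delay(x, delay):
    """Circularly delay x by a (possibly non-integer) number of samples,
    using a linear phase ramp in the frequency domain."""
    n = len(x)
    freqs = np.fft.rfftfreq(n)                    # cycles per sample
    phase = np.exp(-2j * np.pi * freqs * delay)
    return np.fft.irfft(np.fft.rfft(x) * phase, n=n)

spike = np.zeros(64)
spike[32] = 1.0                                   # 1-sample spike on the grid
shifted = fractional_delay(spike, 0.5)            # delayed by half a sample
print(np.linalg.norm(spike - shifted))            # large, despite the "same" spike
```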
Forgetting about the warping, I have tried to minimize a standard $\ell_p$ norm, with $s$ and $d$ (and possibly $w$) as parameters, after smoothing both signals. I am not satisfied with the complexity or the outcomes, and the procedure is a bit tedious.
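For what it is worth, the kind of minimization I mean looks roughly like this (a sketch rather than my exact code), optimizing $s$ and a fractional delay $d$ with a derivative-free method; the warping and the smoothing step are omitted:

```python
import numpy as np
from scipy.optimize import minimize

def fractional_delay(x, delay):
    """Circular fractional delay via a frequency-domain phase ramp."""
    n = len(x)
    freqs = np.fft.rfftfreq(n)
    return np.fft.irfft(np.fft.rfft(x) * np.exp(-2j * np.pi * freqs * delay), n=n)

def lp_misfit(params, y, x, p=2):
    """l_p norm between the observation y and s * x delayed by d samples."""
    s, d = params
    return np.linalg.norm(y - s * fractional_delay(x, d), ord=p)

def fit_scale_delay(y, x, p=2, s0=1.0, d0=0.0):
    """Minimize the l_p misfit over (s, d) with a derivative-free method;
    the result is sensitive to the initial delay guess d0."""
    res = minimize(lp_misfit, x0=[s0, d0], args=(y, x, p), method="Nelder-Mead")
    return res.x, res.fun
```

A cross-correlation peak (as in the earlier sketch) is a natural initializer for $d$, which mitigates the local minima of the $\ell_p$ landscape, but the overall cost and the sensitivity to noise are what I find unsatisfying.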