I'm working on a project where we measure the solderability of components. The measured signal is noisy. We need to process the signal in real time so that we can recognize the change that begins at around 5000 milliseconds.
My system takes a sample (a real value) every 10 milliseconds, but it can be adjusted to a slower sampling rate.
- How can I detect this drop at 5000 milliseconds? (See the sketch after this list for the kind of approach I have in mind.)
- What do you think about the signal-to-noise ratio? Should we focus on getting a cleaner signal first?
- Another problem is that every measurement gives different results, and sometimes the drop is even smaller than in this example.
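To make the question concrete, here is a minimal sketch of the kind of online processing I have in mind: a moving-average filter over the incoming samples plus a threshold check against a pre-drop baseline. The class name, window size, and thresholds are placeholders I made up for illustration and would have to be tuned against the real measurements; I am not sure this is the right approach.

```python
# Minimal sketch: online moving-average smoothing plus a drop check.
# All numbers (window size, threshold, simulated signal) are made-up
# placeholders, not values from the real system.

from collections import deque


class DropDetector:
    def __init__(self, window=50, drop_threshold=0.2):
        # window: number of 10 ms samples in the moving average (50 -> 500 ms)
        # drop_threshold: relative drop below the baseline that counts as a hit
        self.window = deque(maxlen=window)
        self.baseline = None
        self.drop_threshold = drop_threshold

    def update(self, sample):
        """Feed one sample; return True once a drop is detected."""
        self.window.append(sample)
        if len(self.window) < self.window.maxlen:
            return False  # not enough history yet
        smoothed = sum(self.window) / len(self.window)
        if self.baseline is None:
            self.baseline = smoothed  # lock in the pre-drop level
            return False
        # Flag when the smoothed signal falls a fixed fraction below baseline
        return smoothed < self.baseline * (1.0 - self.drop_threshold)


if __name__ == "__main__":
    import random

    detector = DropDetector()
    for i in range(10000):          # i * 10 ms of simulated time
        value = 1.0 + random.gauss(0, 0.05)
        if i >= 500:                # simulated drop starting at 5000 ms
            value -= 0.3
        if detector.update(value):
            print(f"Drop detected at ~{i * 10} ms")
            break
```

Because the detection is relative to a baseline learned from the early part of the measurement, it should tolerate some run-to-run variation in absolute levels, but a fixed relative threshold will obviously miss drops that are smaller than the noise after smoothing.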
Links to data files (they are not the same as the ones used for the plots, but they reflect the latest state of the system):
- https://docs.google.com/open?id=0B3wRYK5WB4afV0NEMlZNRHJzVkk
- https://docs.google.com/open?id=0B3wRYK5WB4afZ3lIVzhubl9iV0E
- https://docs.google.com/open?id=0B3wRYK5WB4afUktnMmxfNHJsQmc
- https://docs.google.com/open?id=0B3wRYK5WB4afRmxVYjItQ09PbE0
- https://docs.google.com/open?id=0B3wRYK5WB4afU3RhYUxBQzNzVDQ