high resolution timer

Erwin Rol mailinglists at erwinrol.com
Thu Jan 19 22:22:54 UTC 2006


On Thu, 2006-01-19 at 17:04 -0500, Gu, John A. (US SSA) wrote:
> I have an ADC (Analog-to-Digital Converter, Maxim 197) integrated with an
> SBC (Single Board Computer) running the Linux kernel. The ADC has a
> sample rate of 10K per second (100 usec interval). The ADC can only
> operate in a single trigger-and-read fashion. Creating a driver is a
> good way to handle it, but I don't see any proper timer or delay logic
> available for my situation. Using udelay() would block all the other
> processes.

Does the ADC signal the CPU via an IRQ, or do you have to poll it to see
if it is ready? In either case it is highly unlikely that you can do that
with a standard kernel, since you only have a small (< 100 us) window
to get your sample, and in the normal kernel there can be latencies of
up to several hundred ms, so you would lose 1000 or more samples.

I suggest you look at the RTAI project and re-ask your question on their
mailing list, because data acquisition is nothing special there and
people using RTAI do it a lot.

The principle in your case would be a real-time task, with a latency less
than the valid sample window, that puts the values into a FIFO, and then
another, less critical task that does "something" with those values. A
rough sketch of that idea follows below.
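Below is a minimal, hypothetical sketch of that structure as an RTAI
kernel module: a periodic hard real-time task samples the ADC every
100 us and pushes each value into an RTAI FIFO. The
adc_trigger_and_read() helper, the FIFO number and the buffer size are
all assumptions; the actual MAX197 trigger/read sequence depends on how
it is wired up on your board.

/*
 * Hypothetical sketch: periodic RTAI task sampling an ADC at 10 kHz
 * and pushing the samples into an RTAI FIFO (/dev/rtf0).
 */
#include <linux/module.h>
#include <rtai_sched.h>
#include <rtai_fifos.h>

#define SAMPLE_FIFO   0              /* appears as /dev/rtf0 in user space */
#define FIFO_SIZE     (16 * 1024)    /* bytes of buffering (assumption)    */
#define PERIOD_NS     100000         /* 100 us = 10 kHz sample rate        */

static RT_TASK sampler_task;

/* Placeholder: trigger a conversion and read the result from the MAX197.
 * Replace with the outb()/inb() sequence your board actually needs.      */
static inline short adc_trigger_and_read(void)
{
        return 0;
}

static void sampler(long arg)
{
        short sample;

        while (1) {
                sample = adc_trigger_and_read();
                /* Non-blocking put into the FIFO; user space drains it.  */
                rtf_put(SAMPLE_FIFO, &sample, sizeof(sample));
                rt_task_wait_period();   /* sleep until the next 100 us slot */
        }
}

static int __init sampler_init(void)
{
        RTIME period = nano2count(PERIOD_NS);

        rtf_create(SAMPLE_FIFO, FIFO_SIZE);
        rt_set_oneshot_mode();
        start_rt_timer(period);
        rt_task_init(&sampler_task, sampler, 0, 4096, 0, 0, NULL);
        rt_task_make_periodic(&sampler_task, rt_get_time() + period, period);
        return 0;
}

static void __exit sampler_exit(void)
{
        rt_task_delete(&sampler_task);
        stop_rt_timer();
        rtf_destroy(SAMPLE_FIFO);
}

module_init(sampler_init);
module_exit(sampler_exit);
MODULE_LICENSE("GPL");

The less critical side is then just a normal user-space program that
opens /dev/rtf0 and read()s the samples at its own pace, outside any
real-time constraint.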

- Erwin




