Hi All,
Been doing a bit of reading on PI detectors that use a commonly available microcontroller such as the ATmega8 or a PIC. I was hoping for some clarification on analogue-to-digital conversion time, as it seems to me to be one of the factors holding back amateur PI projects.
Coming into this I was thinking that, since these things run at 16-20 MHz, it would be relatively easy to sample every 1 µs along the decay curve, take the difference from the "no metal present" curve, and life would be great. Now it seems to me you get one shot per pulse: sample early (catching gold), or sample later to catch the longer-lasting response of higher-conductivity targets (silver, iron, coins etc.). This being due to the fact that a single ADC conversion takes about 150-200 µs.
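
For reference, this is the sort of single-sample read I had in mind on the ATmega8 (just my reading of the datasheet, so the register setup may well need correcting; I'm assuming a 16 MHz clock and the /128 prescaler, which gives a 125 kHz ADC clock):

#include <avr/io.h>
#include <stdint.h>

/* Assumes ATmega8 at 16 MHz, AVcc reference, channel ADC0.
   ADC clock = 16 MHz / 128 = 125 kHz, and a normal conversion
   is 13 ADC clocks, so roughly 104 us per reading. */
static void adc_init(void)
{
    ADMUX  = (1 << REFS0);                 /* AVcc reference, channel ADC0 */
    ADCSRA = (1 << ADEN)                   /* enable the ADC */
           | (1 << ADPS2) | (1 << ADPS1) | (1 << ADPS0);  /* prescaler /128 */
}

static uint16_t adc_read(void)
{
    ADCSRA |= (1 << ADSC);                 /* start a conversion */
    while (ADCSRA & (1 << ADSC))           /* busy-wait until it completes */
        ;
    return ADC;                            /* 10-bit result */
}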
Have I got this correct? And if so, since the conversion takes such a long time, is the value returned an average of the analogue input over that 150 µs, or how does that work?
Is there a way on these MCUs to choose how long the sample is taken over, or is that something we can only do with higher-powered chips or an external fast ADC?
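
The only knob I can see in the datasheet is the ADC prescaler (the ADPS bits in the sketch above): a normal conversion is 13 ADC clocks, so at 16 MHz that works out to about 13 x 128 / 16 MHz = 104 µs with the /128 prescaler, or 13 x 16 / 16 MHz = 13 µs at /16, though the datasheet says resolution suffers once the ADC clock goes above 200 kHz. Am I reading that right?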
Thanks guys, I'm looking for a starting point here, like I'm sure a lot of others are. I believe the way forward is a digital solution, so I'm wondering what others have done or are doing on the amateur front.
Cheers, rickodetrader
