Greetings, gentlemen! I have a question about the stability of the VDI scale.
In homemade metal detectors, a piece of ferrite (e.g., a fragment of a ferrite core from a transformer or antenna) is used to set the origin of the VDI scale. Branded metal detectors have nothing of the kind: they are calibrated at the factory, and the user needs no additional objects for setup.
How is this achieved? The laws of physics apply equally to homemade and branded metal detectors.
After calibrating the VDI scale with ferrite (which has its own VDI of minus 90 degrees), we set the origin (reference point) of the VDI scale for all other metals.
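To make the calibration step concrete, here is a minimal sketch of how a VDI value can be derived from demodulated I/Q samples and rotated so that ferrite sits at the −90 degree origin. The function name and arguments are my own illustration, not any particular detector's firmware; `measured_ferrite_deg` is the raw phase observed while sweeping the ferrite piece during calibration.

```python
import math

def vdi_degrees(i_sample, q_sample, measured_ferrite_deg=-90.0):
    """Map a target's I/Q response to the VDI scale.

    i_sample, q_sample: in-phase/quadrature amplitudes of the demodulated
    RX signal (a real detector would average many samples).
    measured_ferrite_deg: raw phase recorded for ferrite at calibration.
    The scale is rotated so that ferrite reads exactly -90 degrees.
    """
    raw_deg = math.degrees(math.atan2(q_sample, i_sample))
    correction = measured_ferrite_deg - (-90.0)  # how far ferrite drifted
    return raw_deg - correction
```

With this rotation, a ferrite target measured at −85° (because the electronics have drifted) would still be reported at −90°, and every other target's VDI shifts by the same correction.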
But the electronic components of a metal detector change their parameters with ambient temperature. Because of this, the reference point of the VDI scale drifts, and as a result the VDI values of all other targets shift as well.
How can I stabilize the VDI scale? I assume the drift can be tracked through changes in the phase of the current in the TX circuit and in the phase of the voltage in the RX circuit.
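The drift-tracking idea above can be sketched in a couple of lines. This assumes, as a simplification, that the temperature drift shifts the TX-current phase and the RX phase by roughly the same amount, so subtracting the observed TX-phase change corrects the VDI reading; that assumption is mine, not a claim about any commercial design.

```python
def compensated_vdi(raw_vdi_deg, tx_phase_deg, tx_phase_at_cal_deg):
    """Subtract TX-circuit phase drift from a raw VDI reading.

    tx_phase_deg: current TX-current phase (e.g., measured via a shunt).
    tx_phase_at_cal_deg: the same phase recorded at calibration time.
    Assumes RX drift tracks TX drift one-to-one (a simplification).
    """
    drift = tx_phase_deg - tx_phase_at_cal_deg
    return raw_vdi_deg - drift
```

For example, if the TX-current phase has moved by +2° since calibration, a raw reading of 40° would be reported as 38°.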
I understand how to measure the signal phase in the TX circuit: it is enough to install a current-sensing shunt. But how do I directly measure the voltage phase in the RX circuit (assuming the receiving circuit is resonant and the RX capacitor sits directly in the coil)?
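If both the TX reference (from the shunt) and the RX voltage (tapped through a high-impedance buffer so the resonant circuit is not loaded) can be digitized by the same ADC, the relative phase can be estimated numerically instead of with extra analog circuitry. A minimal sketch, assuming both channels are sampled at the same rate over an integer number of signal periods:

```python
import numpy as np

def phase_between(ref, sig):
    """Estimate the phase of `sig` relative to `ref`, in degrees.

    Both arrays must be sampled at the same rate over an integer
    number of periods of the (shared) operating frequency.
    Uses the FFT bin of the dominant frequency in the reference.
    """
    spec_ref = np.fft.rfft(ref)
    spec_sig = np.fft.rfft(sig)
    k = np.argmax(np.abs(spec_ref[1:])) + 1  # dominant non-DC bin
    return np.degrees(np.angle(spec_sig[k]) - np.angle(spec_ref[k]))
```

The same approach works with a single-bin DFT (a lock-in / Goertzel-style correlation) if a full FFT is too heavy for the MCU; the key point is that only the phase difference between the two channels matters, so slow drift common to both cancels out.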
How do branded metal detectors ensure VDI scale stability for the user?
Is there perhaps a patent or a scientific article on this subject?