# MoreDat

A Student Network for Collaboration and Sharing


# Converting Sensor Output from 0 to 3 VDC to 0 to 1.8 VDC

Hi,

I am using a soil moisture sensor whose output voltage ranges from 0 to 3 VDC, which I am feeding to the ADC of the MCU.

The ADC is a 10-bit ADC, and its maximum reference voltage is 1.8 VDC. Because the reference voltage is lower than the sensor's output voltage, I cannot measure the sensor output once it exceeds 1.8 V. What should I do to solve this problem?


### Replies to This Discussion

The simplest solution would be to use a voltage divider. Bear in mind, though, that your sensor probably has very low current capability, so you will need to use large-value resistors. Since we have to convert 3 V to 1.8 V, we need resistor values that give a ratio of 1.8/3 = 0.6. Using the voltage divider rule, Vout = [R2/(R1+R2)] × Vin, we already know the required ratio, so to find the resistor values we set 0.6 = R2/(R1+R2). Now we select a large standard value for R2, such as 1 MΩ, and solve for R1: rearranging gives R1 = R2 × (1/0.6 − 1) ≈ 667 kΩ. You may have to try different R2 values to arrive at an R1 value you can actually buy. R1 and R2 may need to be precision resistors with 1% tolerance in order to get good results.
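For illustration, the divider arithmetic described above can be sketched in Python. The 0.6 target ratio and the 1 MΩ choice for R2 come from the reply; the helper function name is mine:

```python
# Solve ratio = R2 / (R1 + R2) for R1, given a chosen R2.
def r1_for_ratio(r2_ohms, ratio):
    return r2_ohms * (1.0 / ratio - 1.0)

target_ratio = 1.8 / 3.0   # 0.6, to scale a 3 V signal to 1.8 V
r2 = 1_000_000             # 1 MOhm, as suggested above
r1 = r1_for_ratio(r2, target_ratio)

print(round(r1))  # ~666667 Ohms; the nearest standard 1% (E96) value is 665 kOhm
```

With a 665 kΩ standard value for R1, the actual ratio works out to about 0.601, well within what 1% resistors can hold.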

If the voltage divider alone does not work out for you, then you will need to buffer or amplify the sensor output (for example with an op-amp) so it can supply enough current to drive the divider. Of course, if you amplify the signal, you will probably need to change the resistor values in the divider accordingly.
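On the firmware side, the MCU also has to undo the divider's scaling to recover the actual sensor voltage. A minimal sketch, assuming the 10-bit ADC and 1.8 V reference from the question and the 0.6 divider ratio from the reply:

```python
ADC_MAX = 1023        # full-scale count of a 10-bit ADC
V_REF = 1.8           # ADC reference voltage (from the question)
DIVIDER_RATIO = 0.6   # R2 / (R1 + R2), from the reply

def adc_to_sensor_volts(counts):
    v_adc = counts / ADC_MAX * V_REF  # voltage actually seen at the ADC pin
    return v_adc / DIVIDER_RATIO      # undo the divider to get the sensor's output

print(round(adc_to_sensor_volts(1023), 6))  # full scale maps back to 3.0 V
```

This way a full-scale reading corresponds to the sensor's 3 V maximum rather than 1.8 V, which is the whole point of adding the divider.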