electronics forum beginner
Joined: 12 Jul 2006
Posted: Mon Aug 07, 2006 8:06 am    Post subject:
Clipping "-ve" voltage of output signal to 0V
Hi all, I've got a slight problem at hand. I'm doing signal processing and need to use an ADC to digitize the signal.
The signal varies between ±5 V, but the differential inputs of the ADC only accept positive voltages.
I'm using an LM358: one half as an inverting op-amp (gain = -1) to handle the negative part of the input signal, and the other half as a non-inverting op-amp (gain = 1) to handle the positive part.
I've connected the V+ to +5 V and V- to GND.
Since I'm using an inverting op-amp, I expect a negative output when the input is positive, and vice versa.
When I measured the output on a scope, the positive output signal is a perfect inversion of the negative input signal. Where the output should be negative, I expected 0 V since it is clipped at GND; instead I get a small positive voltage hump.
Are there any additional components I need to add to 'kill off' the hump?
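For reference, here is a minimal numerical sketch of what the two channels should ideally look like: the ±5 V input split into two unipolar signals, each clipped at 0 V on the low side. This is only a model of the intended behavior (the signal names and test waveform are made up for illustration), not a simulation of the LM358 circuit, so it shows what the scope trace would be if there were no hump.

```python
import numpy as np

# Hypothetical test waveform standing in for the +/-5 V input signal.
t = np.linspace(0.0, 1.0, 1000)
v_in = 5.0 * np.sin(2.0 * np.pi * 5.0 * t)

# Non-inverting path (gain = +1): passes the positive half, clipped at 0 V (GND).
v_pos = np.clip(v_in, 0.0, 5.0)

# Inverting path (gain = -1): flips the negative half positive, clipped at 0 V.
v_neg = np.clip(-v_in, 0.0, 5.0)

# In the ideal case, whenever one channel is active the other sits at
# exactly 0 V; the small positive hump on the scope is a deviation from this.
print(v_pos.min(), v_neg.min())
```

Subtracting the two channels after digitizing recovers the original bipolar signal, since v_pos - v_neg equals v_in everywhere in the ±5 V range.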