# too much input voltage?

#### camphor1122

Aug 12, 2014
Hi guys,

I have a buffer amplifier running from a single 5 V supply rail.
Suppose the analogue input signal to the buffer sometimes exceeds 5 V (0 V to 5.5 V). In that case the output can't follow the input above 5 V, so clipping and data loss will occur.
Since this is a problem: is there a way to track the input voltage, detect when it exceeds 5 V, and scale the output down so it doesn't clip, while still preserving the maximum possible output swing?

Any help is appreciated!

#### (*steve*)

##### ¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Jan 21, 2010
You could attenuate the input signal slightly so that 0 - 5.5V becomes 0 - 4.5V. But then it is no longer a unity gain buffer.

You would be best off ensuring the input does not exceed the limits for the device (often the supply rails), or connecting diodes between the input and the supply rails to clamp excursions beyond them.
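The attenuation option can be sized with a quick calculation. Below is a minimal sketch assuming a plain two-resistor divider ahead of the buffer; the 10 kΩ bottom resistor is an illustrative value, not from the thread:

```python
# Size a resistive divider so a 0-5.5 V input fits a 0-4.5 V range,
# keeping the buffer (5 V single supply) out of clipping.

V_IN_MAX = 5.5    # worst-case input voltage
V_OUT_MAX = 4.5   # target maximum at the buffer input

# Divider transfer: Vout = Vin * R2 / (R1 + R2), so the required ratio is:
ratio = V_OUT_MAX / V_IN_MAX

# Pick a bottom resistor (assumed value) and solve for the top one.
R2 = 10_000                    # ohms, illustrative
R1 = R2 * (1 / ratio - 1)      # ohms

print(f"ratio = {ratio:.3f}")
print(f"R1 = {R1:.0f} ohms, R2 = {R2} ohms")
```

Note the divider output impedance (R1 parallel with R2) now drives the buffer input, and as steve says, the stage is no longer unity gain.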

#### Harald Kapp

##### Moderator
Nov 17, 2011
That's what a compressor does. Note that the circuit will no longer be linear, since a compressor changes its gain depending on the signal level.

#### KrisBlueNZ

##### Sadly passed away in 2015
Nov 28, 2011
A compressor acts on the amplitude of an AC signal (usually an audio-frequency signal). In this case, I think the input is just a voltage level, not an AC signal with varying amplitude, and its range may exceed the buffer amplifier's 0~5 V range.

The best answer to the question depends on a number of factors. Please describe the whole project and application in detail so we can suggest the best way(s) of detecting overvoltage.
