As the title states, I'm looking for criteria on how best to match a power amp (or mono amps) with a preamp. I've researched this a bit in the past, and read some saying that matching the output voltage of a preamp to the input sensitivity of the power amp is the way to do it. I also saw it mentioned that if an amp has high enough voltage gain, then matching the preamp's output voltage to the amp's input sensitivity is not all that important. There may have been other factors brought up, but I can't remember them now.
To sum up, the research never provided me with a "warm/fuzzy" answer. I guess I am looking for some comments on whether I understand the concepts used to match a preamp with a power amp. Maybe the best way to get meaningful feedback is to just explain the way I understand it, then have others comment:
My understanding of output voltage and input sensitivity:
Output voltage of a preamp is the voltage of the signal at the preamp's line outputs. Some preamps will increase the voltage of the signal given to them (active), and others will not (passive). Input sensitivity of a power amp is the input voltage that will drive it to its maximum rated power (measured at a given speaker impedance).
Since I have a passive preamp, let's use that as an example. The DAC feeding it is listed as having a 2 V output. Using this value as the input voltage to the passive preamp, the maximum output of the preamp will be 2 V, and each step down from maximum on the preamp's volume control will reduce the output voltage. So the preamp should be able to output anywhere from 0 V to 2 V to the inputs of the power amps.
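To make the passive attenuation concrete, here is a minimal sketch that treats the passive preamp as an ideal resistive voltage divider. This is a simplifying assumption: it ignores source impedance, the loading effect of the power amp's input impedance, and pot taper, and the resistor values below are made up purely for illustration.

```python
def divider_output(v_in, r_top, r_bottom):
    """Ideal resistive divider: output taken across r_bottom.

    A passive volume control can only attenuate, so the output
    can never exceed v_in.
    """
    return v_in * r_bottom / (r_top + r_bottom)

# Hypothetical 10 kOhm pot fed by the 2 V DAC output.
# Wiper all the way up: the full 2 V passes through.
print(divider_output(2.0, 0.0, 10_000))   # 2.0
# Wiper set with 2.8 kOhm above it and 7.2 kOhm below: 1.44 V out.
print(divider_output(2.0, 2_800, 7_200))  # 1.44
```

One caveat: real volume pots are usually audio (logarithmic) taper, so a knob at 72% of its rotation generally does not pass 72% of the voltage.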
Let's say the power amps have a voltage gain of 34 and an input sensitivity of 1.44 V into an 8 ohm load. Then the passive preamp, set to pass about 72% of its maximum output voltage (1.44 V / 2 V), will provide the 1.44 V needed to allow the amps to reach their maximum rated power into 8 ohms.
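The relationship between gain, input sensitivity, and rated power can be checked with a quick calculation. One assumption here: the "gain of 34" is treated as a linear voltage ratio (not dB), and power into a resistive load is taken as V²/R.

```python
import math

def max_power(v_in, voltage_gain, load_ohms):
    """Power into a resistive load for a given input voltage,
    assuming the amp stays linear (no clipping)."""
    v_out = v_in * voltage_gain
    return v_out ** 2 / load_ohms

def input_sensitivity(rated_power, voltage_gain, load_ohms):
    """Input voltage needed to reach rated power into the given load."""
    return math.sqrt(rated_power * load_ohms) / voltage_gain

# Numbers from the example: gain of 34 (linear), 1.44 V sensitivity, 8 ohms.
rated = max_power(1.44, 34, 8)                    # implied power rating
print(round(rated, 1))                            # 299.6
print(round(input_sensitivity(rated, 34, 8), 2))  # 1.44
```

So with these (assumed) figures the amps would be rated at roughly 300 W into 8 ohms, and a 2 V source through a passive preamp has enough voltage to drive them to full power, with some margin to spare.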
I would also like to verify my understanding of how exactly to measure the input sensitivity of a power amp, but will wait until seeing some feedback on the statements above. Also, if there are other important factors to consider when matching a preamp to a power amp, please feel free to chime in on them. Links to helpful information are also greatly appreciated.
Thanks,
John