Quoted from shockwave in another thread:
"Perhaps a better demonstration showing how amplifier output impedance does not impact reflected power would be using the modern solid state HF rig with an automatically calibrated SWR meter. As Bob mentioned the output can only be matched to 50 ohms at one value of output power. Designers choose this value to be at full rated power to get the best efficiency.
If you were to set the drive level on CW to produce a 100 watt carrier, this would place the output impedance of the amplifier very close to the 50 ohm design goal (assuming your radio is rated at 100 watts). Care to guess how much the output impedance of the amplifier rises to when you drop the carrier to 10 watts?
Output impedance must rise with the reduction in collector current caused by the reduced drive. By simple Ohm's law, if a reduction in drive causes the current to drop by ten times at the same voltage, the output impedance has gone up ten times. Yet the automatically calibrated SWR meter looking at the load will remain rock-stable throughout the drive range."
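To put the numbers in that quote side by side, here's my own back-of-the-envelope sketch in Python. The load-side math is just P, V, I into 50 ohms; the source-side voltage swing and collector currents are placeholders I picked only to mirror the quote's "same voltage, one-tenth the current" example, not measurements from any particular rig.

```python
import math

R_LOAD = 50.0  # ohms: the design load the SWR meter is calibrated against

def load_side(p_out):
    """RMS voltage and current a 50 ohm load sees at p_out watts."""
    v = math.sqrt(p_out * R_LOAD)   # V = sqrt(P * R)
    i = math.sqrt(p_out / R_LOAD)   # I = sqrt(P / R)
    return v, i

for p in (100.0, 10.0):
    v, i = load_side(p)
    print(f"{p:5.1f} W carrier: load sees {v:5.1f} V, {i:5.3f} A -> V/I = {v/i:.0f} ohms")

# Amplifier side, following the quote's simplification: roughly the same
# voltage swing but one tenth the collector current at reduced drive, so the
# Ohm's-law estimate of the source impedance rises tenfold.  Both numbers
# below are assumed placeholders, not measured values.
V_SWING = 50.0                  # volts, assumed collector voltage swing
for i_c in (1.0, 0.1):          # amperes, full drive vs. reduced drive
    print(f"source-side estimate: {V_SWING / i_c:6.1f} ohms at {i_c:.1f} A")
```

The load-side ratio stays at 50 ohms at either power level, and that's all the SWR meter ever looks at; only the source-side estimate moves, which is the quote's point.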
So, the part I'm still chewing on is this: you're adding resistance to the radio's output to knock the power down enough for the amplifier to handle the drive. Doesn't that added resistance add ohms to the radio's output impedance, creating a mismatch between the radio's output impedance and the input impedance the amplifier is designed to be driven from? The amplifier is seeing the correct drive, but at the wrong impedance? Is it not worth worrying about, or am I totally off here?
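For what it's worth, here's the picture in my head reduced to numbers (Python again). The single series resistor, the ideal 50 ohm ports, and the 100 W starting point are all assumptions I'm making just to illustrate the concern, not a claim about how such a pad is actually built.

```python
# Idealized numbers for the worry above: radio = 50 ohm Thevenin source,
# amplifier input = 50 ohm resistive load, "added resistance" = one series
# resistor.  All assumed for illustration, not a real pad design.
R_SOURCE = 50.0   # ohms, radio's nominal output impedance
R_AMP_IN = 50.0   # ohms, amplifier's nominal input impedance

def with_series_resistor(r_series, p_matched=100.0):
    """Drive power reaching the amp input and the impedance each side sees."""
    # The loop current shrinks as r_series grows; power goes as current squared.
    scale = ((R_SOURCE + R_AMP_IN) / (R_SOURCE + r_series + R_AMP_IN)) ** 2
    p_drive = p_matched * scale
    z_behind_amp = R_SOURCE + r_series      # what the amp input "looks back" into
    z_ahead_of_radio = r_series + R_AMP_IN  # what the radio "looks into"
    return p_drive, z_behind_amp, z_ahead_of_radio

for r in (0.0, 50.0, 150.0):
    p, z_back, z_fwd = with_series_resistor(r)
    print(f"R = {r:5.1f} ohm: drive ~{p:5.1f} W, amp sees {z_back:5.1f} ohm "
          f"source, radio sees {z_fwd:5.1f} ohm load")
```

That's the mismatch I'm asking about: the drive power does come down, but both boxes end up looking into something other than 50 ohms.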