The math doesn't add up. 10 times the power doesn't make a difference until you get to 100 watts, then only 5 times will make a difference?
Signal meters measure in dB, and decibels are logarithmic:
3dB is double
6dB is quadruple
9dB is eightfold
10dB is tenfold.
Also, because dB add together, a 20-times increase = 10dB + 3dB = 13dB.
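If you want to check the arithmetic yourself, the power-ratio-to-dB conversion is just 10·log₁₀(ratio). A quick sketch in Python (the function name `db` is my own, just for illustration):

```python
import math

def db(power_ratio):
    """Decibels for a given power ratio: dB = 10 * log10(ratio)."""
    return 10 * math.log10(power_ratio)

for ratio in (2, 4, 8, 10, 20):
    print(f"{ratio:>2}x power = {db(ratio):.1f} dB")
```

This prints 3.0 dB for double, 6.0 dB for quadruple, 9.0 dB for eightfold, 10.0 dB for tenfold, and 13.0 dB for twentyfold, matching the list above.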
The IARU standard is 6dB per S point; however, on a CB S meter one S point = 3dB.
Going from 4W to 20W (7dB) will see your signal rise about 2 S points on a CB; it won't take an S3 signal to S9+.
Going from 10W to 100W (10dB) will see an S1 signal rise to no more than S4. It won't take you to S9+.
Going from 100W to 500W is 7dB, a little over 2 S points on a CB, barely 1 on ham gear.
Going from 500W to 1.5kW is under 5dB, only about one S point.
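All of the jumps above follow the same pattern: convert the power ratio to dB, then divide by the meter's dB-per-S-point figure. A small sketch, assuming the 6dB (IARU ham) and 3dB (typical CB meter) scales mentioned earlier; `s_points_gained` is a name I've made up for illustration:

```python
import math

def s_points_gained(p_old, p_new, db_per_s_point):
    """S points a power increase is worth on a given meter scale."""
    gain_db = 10 * math.log10(p_new / p_old)
    return gain_db / db_per_s_point

# IARU ham standard: 6 dB per S point; typical CB meter: ~3 dB
for old_w, new_w in ((4, 20), (10, 100), (100, 500), (500, 1500)):
    ham = s_points_gained(old_w, new_w, 6)
    cb = s_points_gained(old_w, new_w, 3)
    print(f"{old_w}W -> {new_w}W: {ham:.1f} S points (ham), {cb:.1f} S points (CB)")
```

Running it shows why big power jumps buy so little: even 500W to 1.5kW is under one S point on a ham meter.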
When I used to contest on ham radio I ran 100W and came in the top 20% of all entries, beating 1.5kW stations. Going from 100W to 1.5kW is ~12dB: just 2 S points on ham gear, and only 4 S points even on a CB meter, for all the problems that come with running 1.5kW.
Then let's put that radio in front of a good old CB linear. Now the difference between 6W and 20W might make more of an impact, even more if we were talking real power and not hack-job watts.
Unless you're overdriving the amp, in which case all it'll do is transmit however many hundred watts mostly on every frequency except the one shown on the channel selector. A power meter will tell you how much power you're putting out in total; it won't tell you whether all of that power is actually on the channel you're tuned to. For that you need a spectrum analyser.