Nothing at all wrong with using 75 ohm cable with a "50 ohm radio". The feedpoint impedance of a properly installed dipole is a lot closer to 75 ohms than to 50.
That really depends on the antenna in use.
As you say, with a dipole it won't make much difference whether you use 75 ohm or 50 ohm cable, as long as all the other cable characteristics are the same. Either way you end up with roughly a 1.5:1 mismatch, whether it sits at the antenna end (with 50 ohm coax) or at the radio/tuner end (with 75 ohm coax).
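If you want to put numbers on that 1.5:1 figure, here's a minimal Python sketch of the standard reflection-coefficient arithmetic for a purely resistive load. The function name is just mine for illustration:

```python
# Minimal sketch: VSWR for a purely resistive load on a line of a
# different characteristic impedance (values are the ones discussed above).
def vswr(z_load, z_line):
    """Return VSWR for a purely resistive load z_load on a line z_line."""
    gamma = abs(z_load - z_line) / (z_load + z_line)  # reflection coefficient magnitude
    return (1 + gamma) / (1 - gamma)

print(vswr(75, 50))  # 1.5 -- a ~75 ohm dipole on 50 ohm coax
print(vswr(50, 75))  # 1.5 -- the 50 ohm rig end of a 75 ohm run
```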
The same goes for a single run of 75 ohm coax cut to an odd multiple of an electrical quarter wavelength, feeding a quad with a feed impedance of roughly 100 ohms. A quarter-wave section acts as an impedance transformer, turning the ~100 ohm impedance at one end into roughly 50 ohms at the other.
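For anyone who hasn't seen it, the quarter-wave transformer relation is Z_in = Z0^2 / Z_load. A quick sketch, assuming a lossless line and a purely resistive load:

```python
# Quarter-wave transformer arithmetic for the quad example above,
# assuming a lossless line and a purely resistive load.
def quarter_wave_zin(z0, z_load):
    """Input impedance of an odd quarter-wave multiple of line z0
    terminated in z_load: Z_in = z0**2 / z_load."""
    return z0 ** 2 / z_load

print(quarter_wave_zin(75, 100))  # 56.25 ohm -- near enough to 50 for most rigs
```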
There are no doubt other coax impedances that could be used too, depending on the antenna in use and the required impedance transformation, but 75 ohm quarter-wave transformers are the most common.
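And to see why 75 ohm happens to suit the quad case so well: the ideal matching-section impedance is the geometric mean of the two impedances being matched. A quick illustrative sketch:

```python
import math

# Ideal quarter-wave matching-section impedance: the geometric mean
# of the two impedances being matched (illustrative sketch).
def ideal_z0(z_in, z_load):
    return math.sqrt(z_in * z_load)

print(ideal_z0(50, 100))  # ~70.7 ohm -- which is why 75 ohm coax works so well here
```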