Okay, here's what I take exception to.
If the input and output impedances (transmitter and antenna) are the same as the characteristic impedance of the coax, then the velocity factor of the coax plays no part in the resulting SWR. The only time the feed line's velocity factor matters at all is when the coax is used for impedance matching or phasing/timing. In those cases, the difference between the feed line's characteristic impedance and the antenna's input impedance, along with the line's length, is the 'variable' you are playing with.
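To put some numbers on that, here's a minimal sketch in Python using the standard lossless-line formula for input impedance. The frequency, lengths, and antenna impedances are assumed values I'm picking just for illustration, not anything from the original post. With a matched antenna the length and velocity factor drop out completely; they only come back into play when the coax itself is doing the matching, like a quarter-wave section.

import math

C = 299_792_458.0  # speed of light, m/s

def input_impedance(z_load, z0, length_m, freq_hz, velocity_factor):
    # Lossless line: Zin = Z0 * (ZL + j*Z0*tan(bl)) / (Z0 + j*ZL*tan(bl))
    wavelength_in_line = C * velocity_factor / freq_hz
    bl = 2 * math.pi * length_m / wavelength_in_line   # electrical length, radians
    t = math.tan(bl)
    return z0 * (z_load + 1j * z0 * t) / (z0 + 1j * z_load * t)

def swr(z, z_ref=50.0):
    # SWR that a meter referenced to z_ref would indicate for impedance z
    gamma = abs((z - z_ref) / (z + z_ref))
    return (1 + gamma) / (1 - gamma)

freq = 14.2e6  # assumed operating frequency

# 1) Matched antenna on 50-ohm coax: SWR is 1.0 for any length and any velocity factor.
for vf in (0.66, 0.82):
    for length in (5.0, 12.3, 20.0):
        print("matched:", vf, length, round(swr(input_impedance(50.0, 50.0, length, freq, vf)), 2))

# 2) Coax used as a matching section: a quarter wave of 75-ohm line transforms an
#    assumed 112.5-ohm antenna to 50 ohms, but only if the physical length is cut
#    using the cable's actual velocity factor.
vf_actual = 0.66
quarter_wave = (C / freq) * vf_actual / 4   # cut correctly for VF = 0.66
wrong_cut    = (C / freq) * 0.82 / 4        # cut assuming the wrong VF
print("correct cut:", round(swr(input_impedance(112.5, 75.0, quarter_wave, freq, vf_actual)), 2))
print("wrong cut:  ", round(swr(input_impedance(112.5, 75.0, wrong_cut, freq, vf_actual)), 2))

The matched cases all print 1.0 no matter what you feed in for length or velocity factor; only the matching-section case cares whether the velocity factor was right when the line was cut.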
If I'm understanding the setup correctly (just a single feed line of whatever length, supposedly with an impedance of 50 ohms), then the most likely/obvious point of contention is the antenna's input impedance. So, since that can be varied, retune the antenna. Already suggested.
An SWR difference of 0.3 is probably down to the normal spread in characteristic impedance between any two 'chunks' of coax. Not to mention the possible (and common) differences in readings between any two SWR meters!
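For a rough sense of that last point, assume (purely for illustration) a resistive 40-ohm antenna and two 'identical' 50-ohm cables sitting at opposite edges of a typical plus-or-minus 2-ohm tolerance:

# SWR on the line for a purely resistive load is just the ratio of the impedances,
# so two '50-ohm' cables at the edges of a +/- 2-ohm tolerance put different SWRs
# on the line for the same assumed 40-ohm antenna.
for z0 in (48.0, 52.0):
    print("Z0 =", z0, "ohms -> SWR on the line =", round(z0 / 40.0, 2))

What a 50-ohm-referenced meter at the shack end actually indicates will wander a bit around those values depending on line length, and then the meters themselves disagree, so a few tenths of spread between two setups isn't surprising.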
- 'Doc