In the old analog meters there was basically only a current meter,
often 50uA full scale, and when measuring voltage a suitable series
resistor was switched in so that it drew 50uA at the full-scale reading.
So a 10V range would have a total resistance of 10V/50uA = 200k
(which would be the resistance of the meter itself plus the series R).
At 10V measured voltage there is 50uA through the 200k resistance.
Looking at this, any range will have a resistance of 1V/50uA = 20k per
volt of range, hence "20K per volt". The 100V range will be 2M.
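As a quick check of the arithmetic, here is a small Python sketch (the 50uA full-scale figure and the two ranges are just the example values used above):

```python
FULL_SCALE_A = 50e-6               # 50 uA full-scale movement (example value)
OHMS_PER_VOLT = 1 / FULL_SCALE_A   # 1 V / 50 uA = 20 kohm per volt

for v_range in (10, 100):
    total_r = v_range * OHMS_PER_VOLT   # meter resistance plus series R
    print(f"{v_range:>3} V range: {total_r / 1e3:.0f} k total resistance")
#  10 V range: 200 k total resistance
# 100 V range: 2000 k total resistance
```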
This is no longer true for modern DVMs. They usually have a 10M
series resistor on the input with selectable resistors to ground to
make a voltage divider that outputs the desired voltage for the ADC.
So, depending on the range you select, the input resistance will be
10M plus a small value that will get smaller when you select a higher
range.
Therefore there is no fixed "K per volt" input resistance anymore, and
selecting a higher range will not result in a higher resistance.
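For the DVM side, here is a minimal sketch of that divider idea; the 10M series resistor is the value from the text, but the ADC full-scale voltage (200mV here) and the ranges are only assumed for illustration:

```python
R_SERIES = 10e6    # 10M input resistor (per the text)
V_ADC_FS = 0.2     # assumed full-scale voltage wanted at the ADC (200 mV)

for v_range in (2, 20, 200):          # assumed example ranges
    # Choose the bottom resistor so a full-scale input gives V_ADC_FS:
    #   V_adc = V_in * R_bottom / (R_SERIES + R_bottom)
    r_bottom = R_SERIES * V_ADC_FS / (v_range - V_ADC_FS)
    r_input = R_SERIES + r_bottom
    print(f"{v_range:>3} V range: R_bottom = {r_bottom / 1e3:7.1f} k, "
          f"input resistance = {r_input / 1e6:.3f} M")
```

For these example ranges the totals work out to roughly 11.11M, 10.10M and 10.01M, so the extra resistance on top of the 10M shrinks as the range goes up and the input resistance stays close to 10M on every range.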
However, as can already be seen, "20K per volt" does not by itself tell
you the input resistance that applies to a given measurement.
It depends on the selected range, and available ranges vary between
meters. One may have ranges of 10-30-100 and another maybe 10-50-200.
When you need to measure a 24V testpoint, on one meter it may be on
the 30V range (and thus 600k resistance) and on another meter it would
be the 50V range (and thus 1M resistance).
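A small sketch of how that plays out; the range tables and the 20k/V figure are the ones from this example, the two meters themselves are hypothetical:

```python
OHMS_PER_VOLT = 20e3   # the 20 k/V analog-meter figure from above
V_TEST = 24            # the 24 V testpoint

def pick_range(voltage, ranges):
    """Smallest available range that still covers the voltage."""
    return min(r for r in ranges if r >= voltage)

for ranges in ((10, 30, 100), (10, 50, 200)):
    r = pick_range(V_TEST, ranges)
    print(f"ranges {ranges}: use the {r} V range "
          f"-> {r * OHMS_PER_VOLT / 1e3:.0f} k input resistance")
# ranges (10, 30, 100): use the 30 V range -> 600 k input resistance
# ranges (10, 50, 200): use the 50 V range -> 1000 k input resistance
```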
It is expected that the person doing the measurement understands how
this could affect the result, if it does at all.
(When measuring a supply voltage there should not be a noticeable
difference; when measuring in a high-impedance signal circuit there
could be.)
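To put a number on that last remark, here is a rough sketch of the loading error; the 24V and the 600k/1M meter resistances are from the example above, while the source resistances are just assumed values for a stiff supply and for high-impedance nodes:

```python
def loaded_reading(v_source, r_source, r_meter):
    """Voltage the meter actually sees: the source resistance and the
    meter's input resistance form a voltage divider."""
    return v_source * r_meter / (r_source + r_meter)

V = 24.0
for r_source in (1.0, 10e3, 100e3):    # assumed: supply rail vs. high-Z nodes
    for r_meter in (600e3, 1e6):       # the 30 V-range and 50 V-range meters
        v = loaded_reading(V, r_source, r_meter)
        err = 100 * (V - v) / V
        print(f"Rsource = {r_source:8.0f} ohm, Rmeter = {r_meter / 1e3:5.0f} k: "
              f"reading {v:6.3f} V ({err:5.2f} % low)")
```

With a 1 ohm source the error is negligible on either meter, but at 100k source resistance the 600k meter reads roughly 14% low and the 1M meter about 9% low, so the two meters would also disagree with each other.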