The voltage produced by the supply is variable and controlled by the tube itself.
If you put the two wires close together, a spark will jump the gap. It takes a certain voltage to make a spark, and that voltage is controlled by the distance between the two electrodes (red wire and earth).
If you hold the wires 10 cm apart then a much bigger spark will happen, or, if the distance is too great, you will get no spark at all.
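As a very rough sketch of the gap-versus-voltage idea, short gaps in dry air at sea level are often quoted as breaking down around 30 kV/cm; the constant, the function name and the whole estimate are assumptions for illustration, since real gaps vary a lot with electrode shape, humidity and waveform:

```python
# Crude spark-gap voltage estimate, assuming the commonly quoted
# ~30 kV/cm breakdown figure for dry air at sea level. Real gaps
# depend heavily on electrode geometry, humidity and waveform.
BREAKDOWN_KV_PER_CM = 30.0

def estimated_kv(gap_cm: float) -> float:
    """Rough peak voltage (kV) needed to jump gap_cm of air."""
    return BREAKDOWN_KV_PER_CM * gap_cm

print(estimated_kv(1.0))   # a 1 cm gap suggests roughly 30 kV
```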
If you want to look at the voltage produced then you need an oscilloscope, not a meter. The voltage waveform shape is important and is controlled by the power supply. The big resistor you showed a picture of is not useful at all for making a potential divider.
Some arithmetic.
Let's assume that the voltage is around 30,000 volts. You want to measure it with a meter that can stand 1,000 volts, so you need about 1/30th of the actual voltage at your measuring point.
We need to take the 30,000 and divide it by 30, and we can do this easily with two resistors.
30,000 > resistor > test point > resistor > ground
We also have a limit of about 20 mA, because if we try to draw more than that we will damage the power supply, but we do need to be somewhere near it to have the supply operating in its normal region, so let's assume 15 mA.
30,000 volts at 15 mA gives a resistance of 2 million ohms by Ohm's law. That means R1 + R2 = 2,000,000 ohms.
R1 and R2 need splitting in a ratio that gives us 1/30th of the voltage at the measuring point, so if we make R1 1.9 Mohm and R2 66 kohm then the voltage at the test point will be about 1/30th of the voltage being put out by the power supply.
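The arithmetic above can be sketched in a few lines of Python; the 30 kV, 15 mA and 30:1 figures are the assumed values from this example, and the exact split comes out slightly different from the rounded 1.9 Mohm / 66 kohm quoted:

```python
# Resistive divider design for the assumed example values:
# 30 kV supply, ~15 mA bleed current, 30:1 division ratio.
V_SUPPLY = 30_000.0   # volts
I_TARGET = 0.015      # amps (15 mA)
RATIO = 30            # want 1/30th of the supply at the test point

# Total divider resistance from Ohm's law: R = V / I
r_total = V_SUPPLY / I_TARGET          # 2,000,000 ohms
# Split so V_test / V_supply = R2 / (R1 + R2) = 1/RATIO
r2 = r_total / RATIO                   # ~66.7 kohm
r1 = r_total - r2                      # ~1.93 Mohm

v_test = V_SUPPLY * r2 / (r1 + r2)
print(f"R1 = {r1:.0f} ohm, R2 = {r2:.0f} ohm, test point = {v_test:.0f} V")
```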
Having set those resistor values we need to be sure that we do not overload them: with the 1.9 Mohm resistor dropping about 29,000 volts the power dissipated is roughly 440 watts, so it needs to be a pretty damned powerful resistor.
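As a check on that dissipation figure (using the example's assumed 29 kV across the 1.9 Mohm top resistor):

```python
# Power in the top resistor of the divider, P = V^2 / R,
# using the assumed example values from the text.
V_R1 = 29_000.0       # volts dropped across the top resistor
R1 = 1_900_000.0      # ohms

p_r1 = V_R1 ** 2 / R1
print(f"R1 dissipates roughly {p_r1:.0f} W")   # around 440 W
```

The same answer falls out of P = V x I: 29,000 V at about 15 mA is around 435 W, which is why an ordinary bench resistor will not survive.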
That is the theory of it, but in practice you just look at how big a spark it will make to see whether the power supply is OK.