I've used both methods with no problems, though in all cases it was direct-drive (non-multiplexed).
If you use an NPN cathode driver, it's easy to implement current-regulation with an emitter resistor. The nice thing about NPNs is that they can be driven by low-voltage I/O. Not just 3.3V, but even lower. The base-current is on the order of 200uA, so even wimpy I/Os can drive them.
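As a quick sketch of the emitter-resistor math: the emitter sits roughly one Vbe below the drive voltage, so the resistor sets the current. The component values below are illustrative assumptions, not from any specific design:

```python
# Emitter-resistor current regulation with an NPN cathode switch.
# The emitter sits at about (V_drive - Vbe), so R_E sets the tube current.
# All values are illustrative assumptions.

V_DRIVE = 3.3     # logic-level base drive (V)
V_BE = 0.7        # typical base-emitter drop (V)
I_TUBE = 0.0025   # assumed 2.5 mA target cathode current

R_E = (V_DRIVE - V_BE) / I_TUBE
print(f"Emitter resistor ~ {R_E:.0f} ohms")  # ~1040 ohms; nearest standard value works
```

Note this keeps the transistor out of hard saturation, which is exactly why the current stays regulated rather than being set by the anode resistor alone.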
If you use an NMOS cathode driver, you can still implement current-regulation, but you will want a higher gate-drive voltage to swamp out the uncertainty due to part-to-part variation in Vgs(on). I use 10-12V. Drive-current is essentially zero (the gate only draws current while switching).
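To see why the higher gate drive helps, assume the regulation is done with a source resistor (one common scheme; the numbers below are illustrative): the current is (Vgate - Vgs)/Rs, so a Vgs spread that is large relative to Vgate wrecks the regulation at 3.3V but barely matters at 12V.

```python
# Effect of Vgs(on) spread on source-resistor current regulation.
# Assumed numbers: 2.5 mA target, Vgs(on) nominally 2.0 V with +/-1 V
# part-to-part spread. These are illustrative, not from a datasheet.

I_TARGET = 0.0025   # 2.5 mA
V_GS_NOM = 2.0      # nominal Vgs(on), V
V_GS_SPREAD = 1.0   # +/- part-to-part variation, V

for v_gate in (3.3, 12.0):
    r_s = (v_gate - V_GS_NOM) / I_TARGET          # pick Rs for the nominal part
    i_lo = (v_gate - (V_GS_NOM + V_GS_SPREAD)) / r_s
    i_hi = (v_gate - (V_GS_NOM - V_GS_SPREAD)) / r_s
    spread_pct = (i_hi - i_lo) / I_TARGET * 100
    print(f"{v_gate:5.1f} V gate drive: current spread ~ {spread_pct:.0f}% of target")
```

With these assumed numbers, the current spread works out to roughly 150% of target at 3.3V gate drive but only about 20% at 12V, which is the "swamping" effect in action.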
If you use an HV5530 (or similar) and want to regulate the current, you will need to do that on the anode side. My latest clock (fourteen IN-18 tubes) does this. Each tube requires its own regulator; cheap insurance for expensive tubes.
Only my first clock design used a resistor to limit tube current. It's worked well over the years, but being paranoid about tube characteristics drifting as the tubes age, I now use current-regulators. It's probably overkill, but I enjoy analog design work. My dying wish is to see my nixie clocks outlive me, and I hope to be around another 40-ish years.