It has to do with the difference between DC power and AC power.
DC power is static; AC power is dynamic .... that is, it's different
at every point along the 60 Hz (60 cycles per second) sine wave. As I
understand it (and I don't entirely), you have to measure the voltage
and the current over the whole sine wave and average them out, and
multiplying those two figures together gives you the VA, or volt-amp,
rating. That's "apparent" power, and it isn't the same thing as
wattage, which is the real power actually delivered and is still how a
DC circuit is rated. So .... forgetting all of that, for our purposes
in the alarm trade, just figure that anything listed in VA is,
generally speaking, good for approximately 50 to 60 percent of that VA
rating .... in watts.
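To put some numbers on it, here's a quick back-of-the-envelope sketch
in Python (the 16.5 volt / 40 VA transformer is just a common example,
and the 0.5 to 0.6 factor is nothing more than the rule of thumb
above, not a measured value for any particular device):

    # Rough estimate of usable watts from a transformer's VA rating,
    # using the 50 to 60 percent rule of thumb. The factor is an
    # assumption, not a spec for any real device.
    def va_to_watts(va_rating, factor=0.55):
        return va_rating * factor

    # Example: a typical 16.5 V / 40 VA alarm transformer
    print(va_to_watts(40, 0.5))   # about 20 W at the low end
    print(va_to_watts(40, 0.6))   # about 24 W at the high end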
You can google it, but it all comes down to the difference in the
formulas when applying Ohm's Law to AC and DC circuits. Most people
don't even know that there's an AC version of Ohm's Law, or that it's
different from the DC one.
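For the curious, here's a rough sketch of what that difference looks
like (the resistance and reactance numbers are made up purely for
illustration):

    import math

    # DC Ohm's law: voltage = current x resistance
    def dc_voltage(current, resistance):
        return current * resistance

    # AC Ohm's law: resistance gets replaced by impedance, which also
    # takes in reactance from coils (XL) and capacitors (XC).
    def ac_impedance(resistance, xl, xc):
        return math.sqrt(resistance**2 + (xl - xc)**2)

    def ac_voltage(current, resistance, xl, xc):
        return current * ac_impedance(resistance, xl, xc)

    # Made-up numbers, just to show the formulas come out different:
    print(dc_voltage(2, 10))         # 20 V across a plain 10 ohm resistor
    print(ac_voltage(2, 10, 8, 3))   # about 22.4 V once reactance is in the mix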
I never did find out why, after years of spec'ing transformers in
watts, they all of a sudden decided to change from watts to VA, but I
think it may have had something to do with the growth in the use of
UPS power supplies. They needed a way to figure out how much battery
power (DC watts) was necessary to provide enough line voltage output
(AC watts), so people could determine what size UPS to get. I say that
because most of the explanations you see online mention UPS power
supplies as examples. But .... it could be that's just an easy way to
show the difference.
Google it if you need greater detail. It's all in the math.