AFAIK, as you said, the bit depth refers to the wav file, not time code.
Your 80-bit description of LTC is correct: it's digital bits encoded with biphase-mark modulation as an audio-frequency signal. So the time code bits become an audio waveform, which gets A-D converted when it's recorded into the wav file, and then D-A converted on playback for your target device, which takes the biphase signal and pulls the bits back out. :-)
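If it helps to make that concrete, here's a rough Python sketch of just the modulation side (biphase mark) writing a bit stream into a wav file. The sample rate, frame rate, amplitude, and the placeholder bit pattern are my own assumptions for the demo; a real LTC frame has a specific 80-bit layout with user bits, a sync word, and parity that I'm not building here.

```python
# Rough sketch: biphase-mark ("FM") encoding of a bit stream into 16-bit
# PCM audio, the same modulation LTC uses. Frame assembly (the real 80-bit
# layout, sync word, parity) is omitted; the payload below is a placeholder.

import wave
import struct

SAMPLE_RATE = 48000                      # audio sample rate (assumed)
FPS = 30                                 # time code frame rate (assumed)
BITS_PER_FRAME = 80                      # LTC frames are always 80 bits
BIT_RATE = BITS_PER_FRAME * FPS          # 2400 bits/s at 30 fps
SAMPLES_PER_BIT = SAMPLE_RATE // BIT_RATE
AMPLITUDE = 20000                        # peak level of the square wave

def biphase_mark(bits, level=1):
    """Yield one sample level (+1/-1) per output sample for each bit.

    Biphase mark rules: the level always flips at the start of a bit cell;
    a '1' bit flips again in the middle of the cell, a '0' bit does not.
    """
    half = SAMPLES_PER_BIT // 2
    for bit in bits:
        level = -level                   # transition at every cell boundary
        for i in range(SAMPLES_PER_BIT):
            if bit and i == half:
                level = -level           # mid-cell transition encodes a '1'
            yield level

def write_wav(path, bits):
    """Write the modulated bit stream as a mono 16-bit PCM wav file."""
    samples = [AMPLITUDE * s for s in biphase_mark(bits)]
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)                # 16-bit samples
        w.setframerate(SAMPLE_RATE)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))

if __name__ == "__main__":
    # Placeholder payload: 80 alternating bits, NOT a valid LTC frame.
    write_wav("biphase_demo.wav", [i % 2 for i in range(BITS_PER_FRAME)])
```

Decoding goes the other way: the reader measures the spacing between transitions, and a short interval means a '1' half-cell while a full-length interval means a '0', which is how the target device pulls the bits back out regardless of polarity or playback level.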
Someday we might get a networked time code standard :-) SMPTE was looking at updating time code a couple of years ago; I'll follow up on that when I'm on sabbatical in the spring...
John
p.s. I had a book from one time code interface manufacturer (EECO) that
said the origins of LTC go back to the Apollo program, where it was used
to synchronize tapes.