I don't think that "flicker-free" translates to 30 fps, as on a modern display. The long sustain will probably stabilise the image a lot, so something like 12 fps may still have been perceived as flicker-free. (E.g., Spacewar! runs at more or less 15 fps.)
The Type 33 Manual reads,
> Symbol Display Time
> Approximately 96 microseconds from start of first memory cycle to end of display cycle, plus 3 microseconds for each dot intensified.
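Just to put numbers on this, here's a quick back-of-the-envelope calculation in Python (the 96 µs base time and the 3 µs per intensified dot are from the manual quote above; the dot counts are hypothetical examples):

```python
# Per the Type 33 manual: ~96 us from start of the first memory cycle to the
# end of the display cycle, plus 3 us for each dot actually intensified.
BASE_US = 96.0
US_PER_DOT = 3.0

def symbol_time_us(dots_intensified: int) -> float:
    """Approximate display time for one symbol, in microseconds."""
    return BASE_US + US_PER_DOT * dots_intensified

# Hypothetical dot counts: a sparse symbol vs. a fully lit 35-position matrix.
for dots in (10, 20, 35):
    print(f"{dots:2d} dots -> {symbol_time_us(dots):.0f} us")
```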
General operations are described as follows:
> The computer determines the location of the left end of a horizontal line along which symbols are to be plotted and loads this location into the X counter and Y buffer as in the point plotting mode, but does not initiate the 35-microsecond setup delay or permit a display. It next determines the intensity level, whether incrementing for additional symbols is needed, and the character size format for the line; and loads this information into the intensity and format buffers. Then it selects the first half of the particular symbol word to be generated and determines if it is to be a subscript, and loads this into a shift register with a command that initiates the symbol matrix plotting sequence. The computer must now select the last half of the symbol word and wait for the Type 33 to give a completion signal when the first half of the symbol word has been displayed, then transfer the last half of the symbol word to the shift register with a command that starts the symbol matrix plotting sequence again from the place where it left off. The Type 33 will again return a completion signal to the computer when it has finished displaying the character and has incremented the X counter to the beginning location of the next symbol.
Loading X and Y is specified as 30 microseconds in total.
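Read as a driver loop on the PDP-1 side, the sequence amounts to something like the following sketch. The function names are mine, just stand-ins for the respective IOT sequences, and here they only log what would happen:

```python
# Sketch of the PDP-1 side of plotting one line of symbols, following the
# manual's description. load_xy, load_intensity_format, send_half_word and
# wait_for_completion are placeholders, not actual PDP-1 mnemonics.

def load_xy(x, y):
    print(f"load X={x}, Y={y} (about 30 us; no 35 us setup delay, no display)")

def load_intensity_format(intensity, size_format):
    print(f"load intensity={intensity}, size format={size_format}")

def send_half_word(half, subscript=False):
    print(f"send half-word {half:06o}, subscript={subscript} -> start matrix plotting")

def wait_for_completion():
    print("wait for Type 33 completion signal")

def plot_symbol_line(x, y, intensity, size_format, symbols):
    """symbols: iterable of (first_half, second_half) pattern word pairs."""
    load_xy(x, y)
    load_intensity_format(intensity, size_format)
    for first_half, second_half in symbols:
        send_half_word(first_half)   # Type 33 starts the matrix plotting sequence
        wait_for_completion()        # first half of the symbol word displayed
        send_half_word(second_half)  # plotting resumes where it left off
        wait_for_completion()        # symbol done, X counter already advanced

plot_symbol_line(0, 0, intensity=3, size_format=0, symbols=[(0o52525, 0o125252)])
```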
The internal timing sequence seems to be started by a PLT pulse (which distinguishes this IOT command [sdb] from the usual dpy) at TP7, which also seems to be the baseline for the internal timing.
A (negative) clear display pulse (CDP) is generated 1.1 microseconds after the IOT instruction, and a load display pulse (LDP) is generated 2.2 microseconds after the CDP (3.3 microseconds after the IOT instruction).
After each of the two pattern words has been displayed, a DDP (display done pulse) is generated, indicating that the PDP-1 should send the next pattern or, on the second DDP, that the display is complete.
For the actual display, there are two 1-microsecond delays (combined, the 2.0-microsecond matrix timing delay, during which bits are shifted and tested) and a 3-microsecond intensification delay for each bit. Meaning, processing a single set bit should take 5 microseconds.
As the second 1-microsecond delay triggers the intensification delay, the intensification delay is NOT generated for unset bits. (Meaning, there's no constant timing; rather, it depends on the count of set bits.)
The first of these 1-microsecond delays is the shift counter delay, generating the shift-and-count pulse (SCP); there are 35 SCPs in total.
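If I read this correctly, the time for one half-word then depends only on how many of its bits are set. A minimal sketch of that arithmetic (the two 1 µs delays and the 3 µs intensification delay are from the description above; the 17/18-bit half-word lengths come from the SCP count, and the example bit pattern is made up):

```python
# Per-bit timing: 1 us shift-counter delay + 1 us test delay for every bit,
# plus a 3 us intensification delay only when the bit is set.
SHIFT_TEST_US = 2.0     # the two 1 us delays of the matrix timing chain
INTENSIFY_US = 3.0

def half_word_time_us(bits: int, n_bits: int) -> float:
    """Time to step through one pattern half-word of n_bits bits."""
    set_bits = bin(bits & ((1 << n_bits) - 1)).count("1")
    return n_bits * SHIFT_TEST_US + set_bits * INTENSIFY_US

# Example: a 17-bit half-word with 9 bits set -> 17*2 + 9*3 = 61 us.
print(half_word_time_us(0b10101010101010101, 17))
```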
The overall timing sequence is kind of complicated (two nearly identical blocks of two minor blocks each, each consisting of 6 SCP-enable signals and a final SPL signal generating the last shift-count – how does this add up to 15?), and I don't really understand it.
Anyways,
1) 30 µs …… transfer X and Y coords
(we may already be at 35 µs here – for the point plotting mode, a 35 µs initial delay is specified.)
2) intensification decoding
3) char. drop / subscript bit decoding
4) PLT 1: 3.2 µs + 17 × 2 µs + 3 µs for each set bit
5) DDP
6) PLT 2: 3.2 µs + 18 × 2 µs + 3 µs for each set bit
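Putting the pieces together, a rough estimate for one two-half-word symbol (the 3.2 µs setup figures, the 17/18-bit split and the per-bit delays are as read from the prints; the set-bit counts in the example are made up, and steps 2) and 3) are left out because I have no timing figures for them):

```python
# Rough total for one symbol along the lines of steps 1) - 6) above.
XY_TRANSFER_US = 30.0   # step 1: transfer X and Y coordinates
SETUP_US = 3.2          # per PLT block, from the timing chain
SHIFT_TEST_US = 2.0     # the two 1 us delays per bit position
INTENSIFY_US = 3.0      # only for set bits

def symbol_total_us(set_bits_first: int, set_bits_second: int) -> float:
    plt1 = SETUP_US + 17 * SHIFT_TEST_US + set_bits_first * INTENSIFY_US
    plt2 = SETUP_US + 18 * SHIFT_TEST_US + set_bits_second * INTENSIFY_US
    return XY_TRANSFER_US + plt1 + plt2

# Example: 10 set bits in the first half-word, 12 in the second:
# 30 + 67.2 + 75.2 = 172.4 us.
print(f"{symbol_total_us(10, 12):.1f} us")
```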