If you are getting very precise (microsecond-level) delays in LabVIEW, then they are being executed in hardware, not by the CPU as Bonsai does: the delay is being programmed into something deterministic like an FPGA, a microcontroller, or some specialized circuit. Anything that relies on the host computer to generate delays is going to be non-deterministic and jittery.
The way Bonsai interacts with an Arduino (when using the Bonsai.Arduino library) is via the Firmata protocol. AFAIK this basically creates a local model of all the I/O provided by the Arduino, along with a remote-procedure-call protocol that sends commands from interactions with that model to the actual hardware. This is cool, but it is going to be terrible for real-time applications, or for applications where the phase of digital events needs to be precise, because each pin manipulation incurs a large overhead involving the host CPU, USB communication, command decoding, and finally the microcontroller port manipulation.
An easy way to improve this situation is to send a serial string to the Arduino to trigger the event, and then have the delay defined in a custom Arduino sketch.
e.g.
void setup() {
  Serial.begin(9600);
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  // Nothing to do here; serialEvent() runs between loop() iterations
  // whenever serial data has arrived.
}

// Pulse the LED high for 1 ms when any character is received.
void serialEvent() {
  digitalWrite(LED_BUILTIN, HIGH);
  delay(1);
  digitalWrite(LED_BUILTIN, LOW);
  // Discard the received byte(s). (Serial.flush() no longer does this:
  // since Arduino 1.0 it waits for *outgoing* data to finish sending.)
  while (Serial.available() > 0) {
    Serial.read();
  }
}
combined with the attached Bonsai workflow. I have not tested any of this, but I have used a very similar approach in the past.