Hi David,
Well, I wasn't aware that it did... but the actual random forests implementation comes from OpenCV, so conceivably there is some GPU-based optimization in there that I don't know about. I don't think so, though.
So I don't have a definitive answer to your question; I don't think CUDA would increase the performance, but if you can benchmark it and find that this is wrong, that would be interesting to know. This situation may change for QuPath at some point, and I'd personally rather have a CUDA-compatible GPU than not have one - given the direction the field is going, and the kinds of processing likely to be required in the future.
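If you do want to try benchmarking this outside QuPath, a minimal sketch along these lines might be a starting point. It uses OpenCV's Python bindings to time RTrees training on synthetic data and to check whether the OpenCV build can see a CUDA device at all; the data sizes and tree parameters here are arbitrary placeholders, not QuPath's actual settings.

```python
import time

import cv2
import numpy as np

# Whether this OpenCV build was compiled with CUDA support
# (standard pip builds of opencv-python report 0 here)
print("CUDA-enabled devices visible to OpenCV:",
      cv2.cuda.getCudaEnabledDeviceCount())

# Synthetic training data - sizes are arbitrary, just for timing
rng = np.random.default_rng(42)
n_samples, n_features = 20_000, 30
samples = rng.random((n_samples, n_features), dtype=np.float32)
responses = rng.integers(0, 2, size=n_samples).astype(np.int32)

# Train an OpenCV random trees classifier and time it;
# the parameters are placeholders, not QuPath's settings
rtrees = cv2.ml.RTrees_create()
rtrees.setMaxDepth(10)
rtrees.setTermCriteria((cv2.TERM_CRITERIA_MAX_ITER, 100, 0))

t0 = time.perf_counter()
rtrees.train(samples, cv2.ml.ROW_SAMPLE, responses)
print(f"Training took {time.perf_counter() - t0:.2f} s")
```

If training time is the same on a CUDA-enabled build of OpenCV as on a plain CPU build, that would be consistent with my expectation that RTrees runs entirely on the CPU.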
I'd be less surprised if the rendering of the image (and overlays) on screen already used a GPU, if available, to improve the repainting speed. This might be most noticeable when training a classifier interactively, because the screen needs to update more frequently to recolor the objects as their classifications change.
Regards,
Pete