From a mathematical standpoint, clearly radians are a far more natural
choice of unit than anything else for trig. But it is not as certain
that they are a good choice for programming.
For implementing "sin", you first have to reduce the argument modulo 2π.
I would have thought it would be easier to do that if the units were
turns, as the modulo reduction would then simply be taking the
fractional part of the number. You may also want to reduce to a 0°-90°,
or 0-π/2, range for the main calculation.
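A minimal sketch of that turn-based reduction (function names are my own, and the core evaluation just delegates to math.sin where a real implementation would use its own polynomial on the reduced range):

```python
import math

def reduce_turns(x):
    """Reduce an angle in turns to [0, 1) - just the fractional part."""
    return x - math.floor(x)

def sin_turns(x):
    """sin of an angle given in turns, folded down to a 0..1/4-turn core.

    Quadrant symmetries: sin(x + 1/2 turn) = -sin(x),
    and sin(1/2 turn - x) = sin(x).
    """
    x = reduce_turns(x)
    sign = 1.0
    if x >= 0.5:            # third/fourth quadrant: negate
        x -= 0.5
        sign = -1.0
    if x > 0.25:            # second quadrant: reflect about the quarter turn
        x = 0.5 - x
    # Core evaluation on [0, 1/4] turn; stand-in for the real polynomial.
    return sign * math.sin(2.0 * math.pi * x)
```

Note that the whole reduction is a floor, a couple of compares, and a subtraction or two - no division by 2π anywhere.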
If you are doing the main calculation by Taylor series, radians are the
nicest units. But I believe it is more common to use Chebyshev
polynomials, and for those there is no great advantage in radians. It's
a very long time since I looked at Chebyshevs, but I think your most
efficient method would be to map 0°-90° to the range -1 to 1 - i.e.,
centred around 45°. This brings you to an ideal unit of an eighth of a
turn.
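That mapping can be sketched as follows - a Chebyshev fit of sin over the first quadrant, with the 0-1/4-turn range mapped to [-1, 1] so the centre of the interval is 45°. The helper names are mine, and the degree is just a plausible choice, not tuned:

```python
import math

def cheb_fit(f, deg):
    """Chebyshev coefficients of f on [-1, 1] via the cosine-node formula."""
    n = deg + 1
    nodes = [math.cos(math.pi * (j + 0.5) / n) for j in range(n)]
    fv = [f(t) for t in nodes]
    c = []
    for k in range(n):
        s = sum(fv[j] * math.cos(math.pi * k * (j + 0.5) / n)
                for j in range(n))
        c.append(2.0 * s / n)
    c[0] *= 0.5
    return c

def cheb_eval(c, t):
    """Clenshaw recurrence for sum_k c_k * T_k(t)."""
    b1 = b2 = 0.0
    for ck in reversed(c[1:]):
        b1, b2 = 2.0 * t * b1 - b2 + ck, b1
    return t * b1 - b2 + c[0]

# t in [-1, 1] corresponds to x = (t + 1)/8 turns, i.e. angles 0 to 90°.
COEFFS = cheb_fit(lambda t: math.sin(2.0 * math.pi * (t + 1.0) / 8.0), 12)

def sin_first_quadrant(x_turns):
    """sin for angles in [0, 1/4] turn, using the fit above."""
    t = 8.0 * x_turns - 1.0     # map eighth-of-turn units onto [-1, 1]
    return cheb_eval(COEFFS, t)
```

The point is that the unit conversion disappears into the affine map onto [-1, 1] that Chebyshev evaluation needs anyway, so turns cost nothing extra here.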
Turns divided by some power of two would also be ideal for faster but
less accurate implementations, such as CORDIC or tables with linear or
cubic interpolation.
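For the table case, a sketch of why power-of-two turn units are convenient (table size and names are my own choices): the integer part of the scaled angle is the table index and the remaining fraction is the interpolation weight, with no division anywhere.

```python
import math

TABLE_BITS = 8
TABLE_SIZE = 1 << TABLE_BITS        # 256 entries per full turn
# One guard entry at the end so i + 1 never runs off the table.
SIN_TABLE = [math.sin(2.0 * math.pi * i / TABLE_SIZE)
             for i in range(TABLE_SIZE + 1)]

def sin_table_lerp(x_turns):
    """sin via table lookup plus linear interpolation; angle in turns."""
    x = (x_turns - math.floor(x_turns)) * TABLE_SIZE
    i = int(x)                      # table index: integer part
    frac = x - i                    # interpolation weight: fractional part
    return SIN_TABLE[i] + frac * (SIN_TABLE[i + 1] - SIN_TABLE[i])
```

In a fixed-point setting the same split is just a shift and a mask, which is where the power-of-two really pays off.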
From the user viewpoint, radians are a natural unit for a lot of
applications - as are degrees. Turns, or turns divided by a power of
two, are also very useful for many applications, but there is no
consensus about what power of two to use.
And the user viewpoint is the important one here - the value can be
scaled before doing the calculations.
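The scaling-at-the-interface point can be sketched as thin wrappers over one core routine (names hypothetical; here the core is simply math.sin in radians):

```python
import math

def sin_degrees(d):
    """Degree-based user interface; scale to radians, then compute."""
    return math.sin(math.radians(d))

def sin_from_turns(x):
    """Turn-based user interface; same idea, different scale factor."""
    return math.sin(2.0 * math.pi * x)
```

Whatever unit the internal implementation prefers, each user-facing unit costs only one multiply at the boundary.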
So choice number one for programming languages should be radians, except
perhaps for languages aimed at beginners or kids, where degrees might be
more appropriate. And if you have a second set, degrees would be the
choice, then followed by turns. (Half turns - SinPi and CosPi - seem
odd to me.)