The conversions in the nuke-default profile rely on 1D LUTs, and those LUTs are currently defined over [-0.125, 1.125] (in the output color space).
So the expected behavior is that linear values in [-0.0096749, 1.308] map to [-0.125, 1.125] in sRGB. The equivalent for Cineon is that linear values in [-0.008942, 36.099] map to [-0.125, 1.125] in Cineon.
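If it helps, here's a quick way to sanity check those endpoint values. This is just a sketch, assuming the nuke-default curves match the standard sRGB EOTF and Nuke's Cineon log-to-lin with a 95/685 black/white point on a 10-bit scale; the shipped LUT files are the authoritative source.

```python
def srgb_to_linear(c):
    """Standard sRGB EOTF; the linear toe segment also covers values below 0."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def cineon_to_linear(c, black=95.0, white=685.0):
    """Cineon log-to-lin, normalized so the white point maps to linear 1.0."""
    offset = 10.0 ** ((black - white) / 300.0)
    return (10.0 ** ((1023.0 * c - white) / 300.0) - offset) / (1.0 - offset)

# Evaluate both curves at the LUT endpoints -0.125 and 1.125.
for f in (srgb_to_linear, cineon_to_linear):
    lo, hi = f(-0.125), f(1.125)
    print(f"{f.__name__}: [-0.125, 1.125] covers linear [{lo:.6g}, {hi:.6g}]")

# Prints approximately:
#   srgb_to_linear:   [-0.125, 1.125] covers linear [-0.0096749, 1.3083]
#   cineon_to_linear: [-0.125, 1.125] covers linear [-0.00894, 36.099]
```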
Is this what you're seeing?
The choice of 1.125 is sort of arbitrary. I wanted a bit of extra range in there to preserve a limited amount of overshoot/undershoot (otherwise [0, 1] would have sufficed). But if you have a use case that needs a greater range, I'm happy to generate larger tables.
This issue accidentally slipped off my radar; I'll do my best to get to it ASAP, independent of whether this matches what you're seeing.
-- Jeremy