Also, why the arbitrary range of 0 … 100? Where practical, you should use the full range of the integral type (a bit of a problem for Int, which is 64-bit on OS X) and probably 0 ..< 1 for floating-point types.
And I think you should drop the dependency on UIKit. As it stands, an OS X developer can’t use your code without hacking bits out.
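To sketch what I mean (names here are hypothetical, not from your code): using only Darwin’s `arc4random`, you can cover the full `UInt32` range and produce floating-point values in 0 ..< 1, with no UIKit import at all, so the same code compiles for iOS and OS X.

```swift
import Darwin // arc4random lives here; no UIKit/AppKit needed

extension UInt32 {
    // Full range of UInt32 — no arbitrary 0 ... 100 cap.
    static func random() -> UInt32 {
        return arc4random()
    }
}

extension Double {
    // Uniform value in the half-open range 0 ..< 1.
    // Dividing by UInt32.max + 1 (not UInt32.max) keeps 1.0 excluded.
    static func random() -> Double {
        return Double(arc4random()) / (Double(UInt32.max) + 1)
    }
}
```

For a full 64-bit `Int` you would need to combine two 32-bit draws, which is where the “bit of a problem” above comes in.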