So there are three that are essentially identical: "Const Val Type", "Type",
and "Const Functors." To veterans, this is not surprising.
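(For readers who haven't seen these patterns, here is a minimal sketch of the three fast styles. All names and signatures below are invented for illustration; the original benchmark's definitions aren't reproduced here.)

```julia
# Hypothetical sketch of the three dispatch styles that benchmark identically.
abstract type Scheme end
struct Continuous <: Scheme end

# "Type": dispatch on the type object itself.
pay(::Type{Continuous}, r, t) = exp(r * t)

# "Val Type": dispatch on a Val singleton instance.
pay(::Val{:continuous}, r, t) = exp(r * t)

# "Const Val Type": binding the Val instance to a const means its concrete
# type is known to inference at every call site.
const CONT = Val{:continuous}()

# "Const Functors": a const-bound callable; its (concrete) closure type is
# likewise known to inference.
const paycont = (r, t) -> exp(r * t)

pay(Continuous, 0.15, 2.0)
pay(CONT, 0.15, 2.0)
paycont(0.15, 2.0)
```

All three calls compile to the same specialized code because in each case the compiler can see a concrete type at the call site.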
What might be more surprising is that "Val Type" isn't fast. This is because
the type isn't inferable: the whole point of Val is to wrap things that aren't
types and turn them into types, so the compiler doesn't know the concrete type
of Val{EContinuous}. You can use the "function barrier" technique to work
around this kind of instability
(http://docs.julialang.org/en/latest/manual/performance-tips/#separate-kernel-functions),
or (as you did) define consts.
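(As a sketch, with invented names: the outer function may be type-unstable, but passing the value through a function call gives the compiler a chance to specialize the inner kernel on the concrete type it actually receives.)

```julia
# The element type of `data` is unknown when `unstable` is compiled, but
# `kernel` gets specialized for whichever concrete vector type arrives.
function kernel(x::AbstractVector)   # compiled once per concrete element type
    s = zero(eltype(x))
    for v in x
        s += v
    end
    return s
end

function unstable(flag::Bool)
    # Type-unstable: `data` is either Vector{Float64} or Vector{Int}.
    data = flag ? [1.0, 2.0, 3.0] : [1, 2, 3]
    # Function barrier: one dynamic dispatch here, then a fast typed loop.
    return kernel(data)
end
```

Only the single call through the barrier pays the dynamic-dispatch cost; the loop inside `kernel` runs at full speed.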
Likewise, the "Functors" approach would work out mostly fine if instead of
timing at global scope you wrote your timings like this:
function time_functors(n, fc, fs, fe)
    s = 0.0
    for i = 1:n
        s += fc(0.15, 2.0)
        s += fs(0.15, 2.0)
        s += fe(0.15, 2.0)
    end
    return s
end

time_functors(1, fc, fs, fe)   # run once first to compile
@time time_functors(10000000, fc, fs, fe)
Accumulating into s prevents the compiler from noticing that you don't
actually use these values and eliding the entire loop. (The loop isn't always
elided without the s, but accumulating is good defensive practice when timing
things.)
(There's actually a small penalty for the non-const functors that I don't
immediately understand, but I lack time to follow it up right now.)
--Tim
> Results:
>
> julia> include("enum-vs-dispatch.jl")
> enums with switch code
> 323.515 milliseconds
> Dispatch via Val Type
> 1.068 seconds (30000 k allocations: 458 MB, 1.96% gc time)
> Dispatch via Const Val Type
> 145.355 milliseconds
> Dispatch via Type
> 148.047 milliseconds