Sodium

Stefan Karpinski

Mar 7, 2013, 1:08:46 AM
to Julia Dev
http://labs.umbrella.com/2013/03/06/announcing-sodium-a-new-cryptographic-library/

I had previously looked at wrapping NaCl but gave up on it for the reasons listed in this post, so Sodium looks like a great option for creating Julia crypto bindings if anyone's interested.

Stefan Karpinski

Mar 7, 2013, 1:28:35 AM
to Julia Dev

Alessandro "Jake" Andrioni

Mar 13, 2013, 10:52:49 PM
to juli...@googlegroups.com
Is anyone currently working on this? I've wrapped a couple of
functions and written a hexadecimal string to Vector{Uint8} converter,
so I'd like to know whether I'm duplicating someone else's work and/or
whether there's any potential for cooperation.

Thanks,
Alessandro
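
For illustration, here is a minimal sketch (in present-day syntax, not Alessandro's actual code) of the kind of hex-string-to-byte-vector converter he mentions; note that Base.hex2bytes now does the same job out of the box:

    # Minimal sketch of a hex string -> Vector{UInt8} converter (illustrative
    # only; Base.hex2bytes covers this in current Julia).
    function hexstring_to_bytes(s::AbstractString)
        isodd(length(s)) && error("hex string must have an even number of digits")
        return [parse(UInt8, s[i:i+1]; base = 16) for i in 1:2:length(s)]
    end

    hexstring_to_bytes("deadbeef")   # => UInt8[0xde, 0xad, 0xbe, 0xef]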

Viral Shah

Mar 13, 2013, 11:50:32 PM
to juli...@googlegroups.com, jake...@gmail.com
It would be great if Clang.jl could be used here. 

-viral

Alessandro "Jake" Andrioni

Mar 14, 2013, 1:00:15 AM
to juli...@googlegroups.com
Yeah, I saw the work done by Isaiah, and it helps. The problem is that
most of the API uses just unsigned char* for its arguments and the
documentation isn't that great, so I ended up having to look up the
source of every wrapped function to understand it correctly, which
slowed me down a lot.

(Also, Isaiah, I really have to thank you for your work: it exposes
the real exported name, which freed me from digging through two or
three headers for all the #defines.)

Isaiah Norton

Mar 14, 2013, 1:17:07 AM
to juli...@googlegroups.com
Yup, originally it was resolving to the underlying type during generation, but keeping the typedef'd names in the function sig was very intentional. It took a (long) while to get the macro working properly, but I wanted to avoid obviating documentation that might exist somewhere, for exactly this reason. Thanks!
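
As an aside, a made-up illustration of that trade-off (hypothetical library and type names, not actual wrap_c output): resolving a typedef loses the name the C documentation uses, while keeping it as an alias preserves that link.

    # C header (hypothetical):  typedef uint32_t chunk_len_t;
    #                           int process(chunk_len_t n);
    const chunk_len_t = UInt32

    # typedef resolved away during generation:
    process_resolved(n) = ccall((:process, "libexample"), Cint, (UInt32,), n)

    # typedef'd name kept, so the Julia signature still reads like the header:
    process_kept(n) = ccall((:process, "libexample"), Cint, (chunk_len_t,), n)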

Isaiah Norton

Mar 14, 2013, 1:25:58 AM
to juli...@googlegroups.com
s/obviating/obscuring/

Isaiah Norton

Mar 14, 2013, 2:24:23 AM
to juli...@googlegroups.com
I updated the wrappings in that gist to the newest version of wrap_c. Now they should actually work!
https://gist.github.com/ihnorton/5106056


Alessandro "Jake" Andrioni

Mar 14, 2013, 2:37:00 AM
to juli...@googlegroups.com
Much better and cleaner now! I'll try to look at it seriously in the
morning, and maybe upload the small bit of work I've already done to GitHub.

Stefan Karpinski

Mar 14, 2013, 3:39:00 AM
to Julia Dev
I feel like having the library name as the first argument to the @c macro would be easier to read, but I guess having it last makes it possible for it to be optional, defaulting to the main binary.

Dahua

Mar 15, 2013, 9:47:07 AM
to juli...@googlegroups.com
Since more and more packages (especially those ported from C) are using this @c macro, should we consider moving it to Base, so that it doesn't need to be copy-and-pasted into every such package?

Isaiah Norton

Mar 15, 2013, 10:05:35 AM
to juli...@googlegroups.com
> @c macro would be easier to read, but I guess having it last makes it possible for it to be optional, defaulting to the main binary.

In most cases there will be only one shared library per Julia file, so it is only necessary to read to the end once.

> @c macro -- should we consider moving it to base?

I think the macro is a little bit inefficient right now (too many evals). I will try to make a less hand-wavy version, or just emit straight AST as Toivo has suggested. I was trying to enhance the readability of the headers by having a clean, one-line C-style declaration per function, but maybe this is not worth the extra cost.
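
To make the idea concrete, here is a minimal, eval-free sketch of this kind of macro (assumed semantics and made-up names, not the actual @c from the gist): a one-line, C-style declaration expands straight into a ccall wrapper.

    # Sketch only: `@c_sketch ReturnType name(ArgTypes...) "lib"` expands
    # directly into a ccall wrapper, with no eval involved.
    macro c_sketch(ret, sig, lib)
        fname    = sig.args[1]                      # e.g. :fmod
        argtypes = sig.args[2:end]                  # e.g. [:Cdouble, :Cdouble]
        args     = [Symbol("arg", i) for i in eachindex(argtypes)]
        quote
            function $(esc(fname))($(args...))
                ccall(($(QuoteNode(fname)), $(esc(lib))), $(esc(ret)),
                      ($(map(esc, argtypes)...),), $(args...))
            end
        end
    end

    # Hypothetical usage (the library name is platform-dependent; on some
    # Linux systems "libm" has to be spelled "libm.so.6"):
    @c_sketch Cdouble fmod(Cdouble, Cdouble) "libm"
    fmod(7.0, 3.0)   # => 1.0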

Stefan Karpinski

Mar 15, 2013, 10:42:27 AM
to Julia Dev
The @c macro is definitely not well baked enough to go into Base.

Robert Ennis

Mar 15, 2013, 4:25:09 PM
to juli...@googlegroups.com
I've only taken a quick look, but it appears that @c and gencindex.jl are more general and straightforward than GetC.jl?  How exactly do @c and @get_c_fun compare/differ, if they do at all?  I assume that the key difference is in @c's dependence on clang being present on the host system?

--Rob

Isaiah Norton

Mar 16, 2013, 1:25:32 AM
to juli...@googlegroups.com
Hi Robert,

> it appears that @c and gencindex.jl are more general and straightforward than GetC.jl
> How exactly do @c and @get_c_fun compare/differ, if they do at all?
 
Well, Clang.cindex/wrap_c is separate (see below), but as far as GetC and @c go, they are basically doing the same thing, I think.

> I assume that the key difference is in @c's dependence on clang being present on the host system?

@c has no dependency; it's just a macro providing some syntax sugar for ccall/ret/args/lib.

The Clang.jl wrapper generator ("wrap_c") does require Clang. In terms of generality: it has a full C and C++ parser underneath, although there are limits to what is exposed by the libclang interface (no real show stoppers for C AFAIK). Right now the generator only supports C, but the C++ parsing has a few potential uses.
 
Isaiah

Tim Holy

Mar 16, 2013, 7:36:59 AM
to juli...@googlegroups.com
On Saturday, March 16, 2013 01:25:32 AM Isaiah Norton wrote:
> Right now the generator only supports C, but the C++ parsing
> has a few potential uses.

Definitely. In case it's any help whatsoever I recently updated the @cpp macro
and put it in a package.

--Tim

Tom Short

Mar 18, 2013, 1:22:25 PM
to julia-dev
Just emitting straight AST as Toivo suggested is an interesting idea. It would probably make packages load a bit faster. It might also help with compatibility with Julia 0.1.



Robert Ennis

Apr 8, 2013, 11:31:15 PM
to juli...@googlegroups.com
Sorry for the delay on this, but I was asking because I figure the @c macro and friends are in a more refined state than GetC.jl at the moment.  Plus, lists of @c calls read more cleanly to me, and they dovetail nicely with Clang output, which has a strong base of developers behind it.  Lastly, since @c and GetC.jl are essentially doing the same thing (and I don't have a lot of time to keep an eye on GetC.jl for the next month or two, unless Jasper or someone else wants to take it over again), I propose to the community that we adopt @c as the "official" FFI for Julia.  Let me know what you think.

--Rob

Isaiah Norton

Apr 9, 2013, 12:12:52 AM
to juli...@googlegroups.com
As I usually understand "FFI", ccall is already precisely that for Julia (a Foreign Function Interface). @c is just sugar on top, and it's short enough that I'm not sure it's worth making into a package. Of course, folks are welcome to cherry-pick it if it's useful (I know there are various similar macros floating around in other packages).

Also, FWIW, @c has been updated to be eval-free.

Isaiah
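
For readers following along, a bare ccall (the built-in FFI Isaiah is referring to) looks like this; the unqualified symbol works on typical Unix systems because strlen is already loaded with the C runtime:

    n = ccall(:strlen, Csize_t, (Cstring,), "sodium")   # => 6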

Alessandro "Jake" Andrioni

Apr 9, 2013, 1:35:45 AM
to juli...@googlegroups.com
Well, I'm quite happy right now with ccall and Clang.jl; I just hope things won't end up like the SWIG/Cython/ctypes divergence that we have in Python.

Isaiah, please keep posting here as the C++ support improves, too: I'd love to wrap NTL (it's C++-only). I plan to re-implement some of its utilities in Catalan.jl sooner or later (like the LLL lattice basis reduction algorithm), but it's always nice to have a more tested and probably more efficient implementation available.

Isaiah Norton

Apr 9, 2013, 10:32:18 AM
to juli...@googlegroups.com
> I'd love to wrap NTL (it's C++-only). I plan to re-implement some of its utilities in Catalan.jl sooner or later (like the LLL lattice basis reduction algorithm), but it's always nice to have a more tested and probably more efficient implementation available.

Is this the NTL you are referring to - 
If so, can you comment on how much templating is used in the library?

Bill Hart

Apr 9, 2013, 10:52:51 AM
to juli...@googlegroups.com
I believe the latest NTL, only recently released, uses much more templating.

Bill.

Bill Hart

Apr 9, 2013, 10:58:23 AM
to juli...@googlegroups.com
In fact, here is an excerpt from an email by NTL's author (private correspondence, but by no means secret) about the new NTL:

"I recently released NTL v6.  This represents two major improvements:
1) a faster small-prime FFT
2) use of templates instead of the very retro macros that were used
   (I think compilers are now pretty good with their template support)."

Bill.

Alessandro "Jake" Andrioni

Apr 9, 2013, 10:59:30 AM
to juli...@googlegroups.com
Yes, until 6.0 there were no templates, but now they are used in some of the basic data structures (vectors, matrices, etc).

Isaiah Norton

Apr 9, 2013, 11:59:23 AM
to juli...@googlegroups.com
Thanks for the information Bill and Jake. This looks like it could be a good test case for figuring out template support if/when I decide to take that plunge. Calling basic templated functions might be possible by extracting the (gonzo) mangled name for each specialization with clang. If that works, then some variation of the vfunc trick should work for templated member functions. There is a big caveat though: the specializations you want must be compiled into the library.

Anyway it will be neat to try this, but won't happen very soon because there is still a ways to go with the basic vfunc project.
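
To sketch what that might look like (hypothetical library and symbol; in practice the real mangled name would be extracted with clang or nm rather than written by hand): suppose a library explicitly instantiates template <typename T> T square(T x) for double. Under the Itanium C++ ABI that specialization mangles to something like _Z6squareIdET_S0_, and the instantiation can then be called directly.

    # Hypothetical: libexample ships an explicit instantiation of
    #     template <typename T> T square(T x);   // with T = double
    # whose mangled name (extracted with clang or `nm libexample.so | c++filt`)
    # comes out as something like the symbol below.
    square_double(x::Float64) =
        ccall((:_Z6squareIdET_S0_, "libexample"), Cdouble, (Cdouble,), x)

    # square_double(3.0)  # => 9.0, but only if libexample really compiled in
    #                     # that specialization -- Isaiah's caveat above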

Bill Hart

Apr 9, 2013, 12:08:01 PM
to juli...@googlegroups.com
Templates really aren't useful in this sense. 

Consider a template meta-program which is designed to replace:

a = b*c

with 

mul(a, b, c)

such that the latter does not have to copy b*c into a temporary or create a new object to assign to a.

When doing a = b*c from Julia, some compile-time (== runtime in Julia) transformation needs to be done. In that case you neither use the C++ template (you may just as well call the underlying functions directly, without any template processing) nor get the performance improvement, because the equivalent of the template expansion is done at runtime in Julia.

Bill.
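
A present-day Julia aside (not from the thread) that makes the distinction concrete: the allocation-free form has to be spelled out as an explicit call; it is not recovered from the * syntax the way an expression-template library recovers it at C++ compile time.

    using LinearAlgebra

    b = rand(100, 100)
    c = rand(100, 100)
    a = similar(b)

    a = b * c        # allocates a brand-new matrix for the product
    mul!(a, b, c)    # writes the product into the existing `a`, no temporary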