I saw an interesting project: "This project proposes to implement a very simple persistent storage mechanism for Julia variables so that data can be saved to and loaded from disk with a consistent interface that is agnostic of the underlying storage layer."
Hi Jeff,
> they relied on a 3rd party to containerize a Python program for transmission
That is due more to the peculiarities of Python's serialization module
than to anything intrinsic to creating a Spark binding. (E.g. Python's pickle
format doesn't have support for serializing code and closures, so some
extra code was required.) This isn't an issue in Julia since
Base.serialize() already has the needed functionality. An initial
implementation of a Spark binding done in the same style as PySpark is
available at http://github.com/jey/Spock.jl
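To illustrate the limitation Jey describes, here is a minimal Python sketch (not from the original thread; `make_adder` is a made-up example) showing that pickle cannot serialize a closure defined at function scope:

```python
import pickle

def make_adder(n):
    # A closure capturing n; Spark-style APIs need to ship such
    # functions to worker processes.
    def add(x):
        return x + n
    return add

try:
    pickle.dumps(make_adder(3))
except (pickle.PicklingError, AttributeError) as e:
    # pickle serializes functions by reference (module + qualified name),
    # so a local closure cannot be pickled at all.
    print("pickle failed:", e)
```

This is why PySpark carries extra serialization machinery for shipping functions, whereas Julia's Base.serialize can serialize closures out of the box.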
-Jey
What is this Moore Foundation? Are you talking about the Gordon and Betty Moore Foundation? (that's all that I could find that looked like it might fund a project on Google)
Just what is required for a company to fund a Julia Summer of Code project? (I've been advocating that the startup I'm consulting for fund a student next summer... [assuming the company is going strong, which I think it will, and that Julia is going strong, ditto]). We've been talking about what things we could make some form of open source (MIT, non-commercial only, whatever...), and how we could contribute to the Julia community.
This is both a proposal and a call for interested undergraduate and graduate students:

Automatic differentiation is a technique for computing exact numerical derivatives of user-provided code, as opposed to using finite difference approximations, which introduce approximation errors. These techniques have a number of applications in statistics, machine learning, optimization, and other fields. Julia as a language is particularly suitable for implementing automatic differentiation, and the existing capabilities are already beyond those of SciPy and MATLAB. We propose a project with the following components:

1. Experiment with the new fast tuple and SIMD features of Julia 0.4 to develop a blazing fast stack-allocated implementation of DualNumbers with multiple epsilon components. Integrate with existing packages like Optim, JuMP, NLsolve, etc., and measure the performance gains over existing implementations.

2. Combine this work with the ForwardDiff package, which aims to provide a unified interface to different techniques for forward-mode automatic differentiation, including for higher-order derivatives.

3. Time permitting, take a step towards the reverse mode of automatic differentiation. Possible projects include developing a new implementation of reverse-mode AD based on the expression-graph format used by JuMP, or contributing to existing packages such as ReverseDiffSource and ReverseDiffOverload.

There are quite a number of interesting projects in this area (some with avenues for publication), so we can adjust the work according to the student's interests. An ideal student should be interested in experimenting with state-of-the-art techniques to make code fast. No mathematical background beyond calculus is needed. See juliadiff.org for more info.

Co-mentors: Miles Lubin and Theodore Papamarkou

If this sounds cool and interesting to you, do get in touch!
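For a flavor of the dual-number idea behind component 1, here is a minimal sketch of forward-mode AD (illustrative Python, not the proposed stack-allocated Julia implementation; the `Dual` class and `derivative` helper are made-up names):

```python
class Dual:
    """Dual number val + eps*e with e^2 = 0; eps carries the derivative."""
    def __init__(self, val, eps=0.0):
        self.val, self.eps = val, eps

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.eps + other.eps)
    __radd__ = __add__

    def __mul__(self, other):
        # Product rule: (a + b*e)(c + d*e) = ac + (ad + bc)*e
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.eps + self.eps * other.val)
    __rmul__ = __mul__

def derivative(f, x):
    # Seed the epsilon component with 1.0; the epsilon component of
    # f(x + 1.0*e) is exactly f'(x), with no truncation error.
    return f(Dual(x, 1.0)).eps

# f(x) = 3x^2 + 2x  =>  f'(2) = 6*2 + 2 = 14
print(derivative(lambda x: 3*x*x + 2*x, 2.0))  # 14.0
```

Because the derivative propagates through exact arithmetic rules rather than a finite difference quotient, there is no step-size tuning and no approximation error; the proposal's "multiple epsilon components" generalizes this to several directional derivatives at once.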
Sorry, that should have been June 1.
Congratulations, looks like a great list!