If you meant vectorization (as opposed to looping) then I'm afraid the example doesn't address that.
Yeah, vectorisation.
I used DataVec (from deeplearning4j) to transform big chunks of data. DataVec uses Spark, which parallelises the computation. Pretty cool.
Are there plans to create a data-transformation module for Cortex?
Here's an example that loads data from a CSV file and generates the input vectors used in training: http://viewer.gorilla-repl.org/view.html?source=github&user=shark8me&repo=inclojure-cortex&path=cortex-examples/ws/occupancy.cljw
Cortex is focused on neural nets alone and will likely not include ETL functionality. You'll have to use core.matrix or another library for transformation and dataset processing.
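For what it's worth, here's a rough sketch of what a vectorised transform could look like with core.matrix: standardising each feature column in one expression rather than looping over rows. The sample data is made up for illustration; `mean` and `sd` come from the `clojure.core.matrix.stats` namespace.

```clojure
(require '[clojure.core.matrix :as m]
         '[clojure.core.matrix.stats :as stats])

;; made-up sample rows, e.g. temperature and humidity readings
(def rows [[23.1 0.27] [25.4 0.45] [22.8 0.31]])

(def X (m/matrix rows))

;; subtract the per-column mean and divide by the per-column standard
;; deviation; core.matrix broadcasts the mean/sd vectors across rows
(def X-std (m/div (m/sub X (stats/mean X))
                  (stats/sd X)))
```

The whole transformation stays a single expression over the matrix, which is also what lets core.matrix dispatch it to a faster backend (e.g. vectorz-clj) without changing the code.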
For big data ETL, you might want to use a Clojure Spark wrapper such as Flambo or Sparkling.
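As a hypothetical sketch only (the app name, input data, and transform function below are invented, and you should check the Sparkling docs for the exact API and its serialisation setup), a Sparkling-based transform looks roughly like:

```clojure
(require '[sparkling.conf :as conf]
         '[sparkling.core :as spark])

;; local Spark context for experimentation; on a cluster you'd point
;; conf/master at the cluster instead of "local[*]"
(def sc (-> (conf/spark-conf)
            (conf/master "local[*]")
            (conf/app-name "etl-sketch")
            (spark/spark-context)))

;; distribute some rows and apply a transform across the executors
(def result
  (->> (spark/parallelize sc [[1.0 2.0] [3.0 4.0]])
       (spark/map (fn [[a b]] [(* 2 a) (* 2 b)]))
       (spark/collect)))
```

Flambo exposes a very similar wrapped-RDD API, so the same shape applies there.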
Regards
Kiran