No, the link works and the GitHub docs are being deprecated/moved. The live site is best, but you can submit PRs for the new docs at https://github.com/actionml/docs.actionml.com

On Jun 13, 2016, at 6:08 AM, Alex Simes <alex.s...@gmail.com> wrote:
Looking at the link, I'm guessing they meant to direct to this guide: google is your friend ;)
On Sunday, June 12, 2016 at 11:12:31 PM UTC+2, Gopal Patwa wrote:
The above link gives me this error:
Oops, looks like there's no route on the client or the server for url: "http://actionml.com/docs/small_ha_cluster."
On Sunday, June 12, 2016 at 1:09:27 AM UTC-7, Federico Reggiani wrote:
You can try this: http://actionml.com/docs/small_ha_cluster
org.apache.predictionio.workflow.CreateServer references sparkContext, yes. My answer below was needlessly verbose.

On Mar 28, 2017, at 8:41 AM, Marius Rabenarivo <mariusra...@gmail.com> wrote:
Is it possible to run the driver outside the host where I'll deploy the engine? I want to run the driver outside the server where I'll run the PredictionServer, as Spark will only be used for launching there. I'm reading the Spark documentation right now for insight into how to do it, but I want to know if someone has tried something similar.

2017-03-28 19:34 GMT+04:00 Pat Ferrel <p...@occamsmachete.com>:
Spark must be installed locally (so spark-submit will work), but Spark is only used to launch the PredictionServer; no Spark job is run for the UR during query serving. We typically train on a Spark driver machine that is effectively part of the Spark cluster, and deploy on a server separate from the cluster, so the cluster can be stopped when not training and no AWS charges are incurred. So yes, you can, and there are often good reasons to do so. See the Spark overview here: http://actionml.com/docs/intro_to_spark
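A minimal sketch of that split, assuming a standalone Spark master at spark://spark-master:7077 and the same engine directory on both machines (hostnames and memory settings are illustrative, not from the thread):

```shell
# On the training machine (near or inside the Spark cluster):
# arguments after "--" are passed through to spark-submit.
pio build
pio train -- --master spark://spark-master:7077 --driver-memory 4g --executor-memory 4g

# On the separate serving machine (Spark installed only so spark-submit exists):
# pio deploy launches CreateServer via spark-submit, but serving runs locally
# and sends no jobs to the cluster, so local mode is sufficient.
pio deploy -- --master local
```

The cluster only needs to be up for the train step; the deploy server keeps serving queries after the cluster is stopped.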
On Mar 27, 2017, at 11:48 PM, Marius Rabenarivo <mariusra...@gmail.com> wrote:
Hello,
For the pio train command, I understand that I can use another machine with PIO, Spark Driver, Master and Worker.
But is it possible to deploy on a machine without Spark installed locally, given that deployment uses spark-submit and org.apache.predictionio.workflow.CreateServer references sparkContext?
I'm using UR v0.4.2 and PredictionIO 0.10.0.
Regards,
Marius
P.S. I also posted in the ActionML Google group forum: https://groups.google.com/forum/#!topic/actionml-user/9yNQgVIODvI
2017-03-30 22:08 GMT+04:00 Marius Rabenarivo <mariusra...@gmail.com>:
For the host where we run the training, do we have to put the paths to ES_CONF_DIR and HADOOP_CONF_DIR in pio-env.sh even if we use remote Elasticsearch and Hadoop clusters?
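For context, ES_CONF_DIR and HADOOP_CONF_DIR point at directories of client config files, and those files are what name the remote hosts, so the usual pattern is to keep local copies of the remote clusters' client configs on the training host. A sketch of the relevant pio-env.sh lines, with illustrative paths not taken from the thread:

```shell
# pio-env.sh (fragment) -- paths are examples only.
# These directories hold local copies of the remote clusters' client configs;
# the config files themselves (elasticsearch.yml, core-site.xml, hdfs-site.xml)
# are what point at the remote Elasticsearch and Hadoop hosts.
ES_CONF_DIR=/usr/local/elasticsearch/config
HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
```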