Docker and Node-Red Flows/Credentials


Ben Ward

Mar 30, 2015, 4:46:45 PM
to node...@googlegroups.com
Hi all,

I've been trying to get a Docker image of Node-RED working. I started quite successfully with C P Swan's build, which is possibly out of date (and never trust a monolithic binary from a security researcher ;-) ). Then I discovered Dave CJ's notes on the subject: https://gist.github.com/dceejay/9435867

After some tweaks I've managed to get a Dockerfile (https://gist.github.com/crouchingbadger/c0a0e3eba4242baebd07) which pulls the right OS image, downloads Node-RED and runs it. However, I can't figure out two things:
  1. How to include a flows_<hostname>.json file from the host OS
  2. How to include a flows_<hostname>_cred.json file without accidentally exposing my credentials.
Docker may not be the right way to go, so feel free to tell me it's a waste of time.

Thanks
Ben

Dave C-J

Mar 30, 2015, 5:46:28 PM
to node...@googlegroups.com

Ben,

Is the flows file the same or specific to each container? If the same, then you can copy it over at build time, then point to it in the run command, i.e. use a fixed name. Do you need to pass in any credentials at all? Can the user of the container re-enter them from the web page? If not, then yes, they'll need to be copied over.
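
A minimal sketch of that build-time approach (the base image, paths and fixed flow-file name here are my assumptions, not something from this thread):

```dockerfile
# Hypothetical Dockerfile sketch: bake the flows in at build time and
# point Node-RED at a fixed file name so the container hostname is irrelevant.
FROM node:0.10
RUN npm install -g --unsafe-perm node-red
COPY flows.json /root/.node-red/flows.json
COPY flows_cred.json /root/.node-red/flows_cred.json
# Fixed flow file passed on the command line - no hostname involved
CMD ["node-red", "--userDir", "/root/.node-red", "flows.json"]
```

Note that COPY-ing the credentials file does mean it ends up in the image layers, so the image itself then needs to be treated as a secret.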

Ben Ward

Mar 30, 2015, 8:07:52 PM
to node...@googlegroups.com
Hi Dave,

Thanks for responding.


On 30 March 2015 at 22:46, Dave C-J <dce...@gmail.com> wrote:

Is the flows file the same or specific to each container ?

For the moment I'm just trying to get any flows file in there. Hopefully generic.
 

If the same then you can copy it over at build time, then point to it in the run command... i.e. use a fixed name.

OK, I set the run options in the Dockerfile so that the user directory is /root/node-red and the flows file is /root/node-red/flows_badgermedia.json, which is the one with all my flows.

Then I used sudo docker exec -it <containerid> /bin/bash and copied both JSON files over from a mounted volume.

Then I committed it and stopped the container. That didn't work - the files weren't there or weren't used. I also tried to HUP the process but that killed the container. I can't seem to make those files stick.
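
(Editor's note, an assumption the thread doesn't confirm: docker commit snapshots the container's filesystem but skips anything living under a VOLUME path, so files dropped into a mounted volume silently vanish from the committed image - which would explain the symptom above. A hedged sketch of the copy-and-commit sequence, with made-up container and image names:)

```shell
# Assumed names: running container "nodered", new image "mynodered:flows".
# docker cp avoids exec-ing a shell into the container at all.
docker cp flows_badgermedia.json nodered:/root/node-red/flows_badgermedia.json
docker cp flows_badgermedia_cred.json nodered:/root/node-red/flows_badgermedia_cred.json
# Snapshot the container. This will NOT capture files under a VOLUME mount.
docker commit nodered mynodered:flows
docker run -d -p 1880:1880 mynodered:flows
```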

Do you need to pass in any credentials at all? Can the user of the container re-enter them from the web page? If not, then yes, they'll need to be copied over.

I was trying to automate the process, possibly to allow deployment of generic appliance-y things. I can't remember any more. I might just copy & paste and go to bed :-)

Thanks for the help

Ben

Dave C-J

Mar 31, 2015, 11:01:00 AM
to node...@googlegroups.com
Ben

You need to copy them over as part of the build (using COPY within the Dockerfile).

To answer a previous point - yes, the slim versions don't include gyp etc., so they won't compile any npms with native code in them - but all the core Node-RED ones will (should) fall back to either pre-compiled versions or native JS ones - so the default install should still run (despite there being warnings that look like errors, etc.)
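
If a native npm module really is needed on one of the slim images, the build toolchain can be installed for the build and stripped again afterwards - a sketch assuming a Debian-based image (the package and module names are illustrative, not from this thread):

```dockerfile
# Illustrative: install node-gyp's prerequisites, build the native module,
# then remove the toolchain again to keep the image small.
RUN apt-get update \
 && apt-get install -y --no-install-recommends python make g++ \
 && npm install --unsafe-perm node-red-node-serialport \
 && apt-get purge -y make g++ \
 && apt-get autoremove -y \
 && rm -rf /var/lib/apt/lists/*
```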

Claudiu O

Mar 31, 2015, 12:13:50 PM
to node...@googlegroups.com
Ben, I don't know if this helps in any way but the way I went about this was using a volume:

docker run <some options> -v /home/pi/.node-red:/root/.node-red <image_name>

This way the flows files, new nodes and library files are shared between the host and the container. I am not sure if this is the best way but it works for me. I also placed a settings.js in my local dir that changes the flows file name to flows.json (because the flows file is named flows_<machine_name>.json by default, and the container ID/hostname changes with each restart).

I apologize if I misunderstood your question and this doesn't help. If it does and if you are interested in more detail, I have a way too long post here.

   claudiu
