HELP!! docker/run-dev.sh problem!!!

Pasquale Iodice

Jan 25, 2015, 5:46:55 PM
to lum...@googlegroups.com
Following this guide https://github.com/lumifyio/lumify , I have some problems at step 6. Step 5 is successful with no errors, but when I put this into the shell:

[root@localhost docker]# ./run-dev.sh

I have several errors that seem to be permission errors.

Please help. Thanks for your support!

starting
Generating SSH1 RSA host key:                              [  OK  ]
Starting sshd:                                             [  OK  ]

Starting ZooKeeper
---------------------------------------------------------------
JMX enabled by default
Using config: /opt/zookeeper/bin/../conf/zoo.cfg
Starting zookeeper ... /opt/zookeeper/bin/zkServer.sh: line 113: /tmp/zookeeper/zookeeper_server.pid: Permission denied
FAILED TO WRITE PID

Starting Hadoop
---------------------------------------------------------------
**************** FORMATING NAMENODE ****************
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
15/01/25 22:38:43 INFO namenode.NameNode: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = lumify-dev/172.17.0.8
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 2.3.0
STARTUP_MSG:   classpath = /opt/hadoop/etc/hadoop/:/opt/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/opt/hadoop/share/hadoop/common/lib/jackson-core-asl-1.8.8.jar:/opt/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/opt/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/opt/hadoop/share/hadoop/common/lib/hadoop-annotations-2.3.0.jar:/opt/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/opt/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/opt/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/opt/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/opt/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/opt/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/opt/hadoop/share/hadoop/common/lib/jackson-xc-1.8.8.jar:/opt/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/opt/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.8.8.jar:/opt/hadoop/share/hadoop/common/lib/junit-4.8.2.jar:/opt/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/opt/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.8.8.jar:/opt/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/opt/hadoop/share/hadoop/common/lib/hadoop-auth-2.3.0.jar:/opt/hadoop/share/hadoop/common/lib/asm-3.2.jar:/opt/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/opt/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/opt/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/opt/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/opt/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/opt/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/opt/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/opt/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/opt/hadoop/share/hadoop/common/lib/commons-io-2.4.jar:/opt/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/opt/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/opt/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/opt/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/opt/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/opt/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/opt/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/opt/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/opt/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/opt/hadoop/share/hadoop/common/lib/activation-1.1.jar:/opt/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/opt/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/opt/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/opt/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/opt/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/opt/hadoop/share/hadoop/common/lib/xz-1.0.jar:/opt/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/opt/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/opt/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/opt/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/opt/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/opt/hadoop/share/hadoop/common/lib/zookeeper-3.4.5.jar:/opt/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/opt/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/opt/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/opt/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/opt/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/opt/hadoop/share/hadoop/common/hadoop-common-2.3.0.jar:/opt/hadoop/share/hadoop/common/hadoop-common-2.3.0-tests.jar:/opt/hadoop/share/hadoop/common/hadoop-nfs-2.3.0.jar:/opt/hadoop/share/hadoop/hdfs:/opt/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/opt/hadoop/share/hadoop/hdfs/li
b/jackson-core-asl-1.8.8.jar:/opt/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/opt/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/opt/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/opt/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/opt/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.8.8.jar:/opt/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/opt/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/opt/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/opt/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/opt/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/opt/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/opt/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/opt/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/opt/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/opt/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/opt/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/opt/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/opt/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/opt/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/opt/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/opt/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/opt/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.3.0.jar:/opt/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.3.0-tests.jar:/opt/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.3.0.jar:/opt/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/opt/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.8.8.jar:/opt/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/opt/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/opt/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/opt/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/opt/hadoop/share/hadoop/yarn/lib/jackson-xc-1.8.8.jar:/opt/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/opt/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.8.8.jar:/opt/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.8.8.jar:/opt/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/opt/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/opt/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/opt/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/opt/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/opt/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/opt/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/opt/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/opt/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/opt/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/opt/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/opt/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/opt/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/opt/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/opt/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/opt/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/opt/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/opt/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/opt/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/opt/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/opt/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/opt/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/opt/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.5.jar:/opt/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/opt/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/opt/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/opt/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.3.0.jar:/opt/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.3.0.jar:/opt/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.3.0.jar:/opt/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-
2.3.0.jar:/opt/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.3.0.jar:/opt/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.3.0.jar:/opt/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.3.0.jar:/opt/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.3.0.jar:/opt/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.3.0.jar:/opt/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.3.0.jar:/opt/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/opt/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/opt/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.1.jar:/opt/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/opt/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.3.0.jar:/opt/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/opt/hadoop/share/hadoop/mapreduce/lib/junit-4.10.jar:/opt/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/opt/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/opt/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.8.8.jar:/opt/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/opt/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/opt/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/opt/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/opt/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/opt/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/opt/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/opt/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/opt/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/opt/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/opt/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/opt/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.3.0.jar:/opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.3.0.jar:/opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.3.0-tests.jar:/opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.3.0.jar:/opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.3.0.jar:/opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.3.0.jar:/opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.3.0.jar:/opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.3.0.jar:/opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.3.0.jar:/opt/hadoop/contrib/capacity-scheduler/*.jar
STARTUP_MSG:   build = http://svn.apache.org/repos/asf/hadoop/common -r 1567123; compiled by 'jenkins' on 2014-02-11T13:40Z
STARTUP_MSG:   java = 1.7.0_71
************************************************************/
15/01/25 22:38:43 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
Formatting using clusterid: CID-cd8c7ad5-062f-4b17-975c-cba881919254
15/01/25 22:38:44 INFO namenode.FSNamesystem: fsLock is fair:true
15/01/25 22:38:44 INFO namenode.HostFileManager: read includes:
HostSet(
)
15/01/25 22:38:44 INFO namenode.HostFileManager: read excludes:
HostSet(
)
15/01/25 22:38:44 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
15/01/25 22:38:44 INFO blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
15/01/25 22:38:45 INFO util.GSet: Computing capacity for map BlocksMap
15/01/25 22:38:45 INFO util.GSet: VM type       = 64-bit
15/01/25 22:38:45 INFO util.GSet: 2.0% max memory 889 MB = 17.8 MB
15/01/25 22:38:45 INFO util.GSet: capacity      = 2^21 = 2097152 entries
15/01/25 22:38:45 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
15/01/25 22:38:45 INFO blockmanagement.BlockManager: defaultReplication         = 3
15/01/25 22:38:45 INFO blockmanagement.BlockManager: maxReplication             = 512
15/01/25 22:38:45 INFO blockmanagement.BlockManager: minReplication             = 1
15/01/25 22:38:45 INFO blockmanagement.BlockManager: maxReplicationStreams      = 2
15/01/25 22:38:45 INFO blockmanagement.BlockManager: shouldCheckForEnoughRacks  = false
15/01/25 22:38:45 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
15/01/25 22:38:45 INFO blockmanagement.BlockManager: encryptDataTransfer        = false
15/01/25 22:38:45 INFO blockmanagement.BlockManager: maxNumBlocksToLog          = 1000
15/01/25 22:38:45 INFO namenode.FSNamesystem: fsOwner             = root (auth:SIMPLE)
15/01/25 22:38:45 INFO namenode.FSNamesystem: supergroup          = supergroup
15/01/25 22:38:45 INFO namenode.FSNamesystem: isPermissionEnabled = true
15/01/25 22:38:45 INFO namenode.FSNamesystem: HA Enabled: false
15/01/25 22:38:45 INFO namenode.FSNamesystem: Append Enabled: true
15/01/25 22:38:45 INFO util.GSet: Computing capacity for map INodeMap
15/01/25 22:38:45 INFO util.GSet: VM type       = 64-bit
15/01/25 22:38:45 INFO util.GSet: 1.0% max memory 889 MB = 8.9 MB
15/01/25 22:38:45 INFO util.GSet: capacity      = 2^20 = 1048576 entries
15/01/25 22:38:45 INFO namenode.NameNode: Caching file names occuring more than 10 times
15/01/25 22:38:45 INFO util.GSet: Computing capacity for map cachedBlocks
15/01/25 22:38:45 INFO util.GSet: VM type       = 64-bit
15/01/25 22:38:45 INFO util.GSet: 0.25% max memory 889 MB = 2.2 MB
15/01/25 22:38:45 INFO util.GSet: capacity      = 2^18 = 262144 entries
15/01/25 22:38:45 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
15/01/25 22:38:45 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
15/01/25 22:38:45 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
15/01/25 22:38:45 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
15/01/25 22:38:45 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
15/01/25 22:38:45 INFO util.GSet: Computing capacity for map Namenode Retry Cache
15/01/25 22:38:45 INFO util.GSet: VM type       = 64-bit
15/01/25 22:38:45 INFO util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
15/01/25 22:38:45 INFO util.GSet: capacity      = 2^15 = 32768 entries
15/01/25 22:38:45 WARN namenode.NameNode: Encountered exception during format: 
java.io.IOException: Cannot create directory /tmp/hadoop-root/dfs/name/current
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:311)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:523)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:544)
at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:147)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:829)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1218)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1325)
15/01/25 22:38:45 FATAL namenode.NameNode: Exception in namenode join
java.io.IOException: Cannot create directory /tmp/hadoop-root/dfs/name/current
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:311)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:523)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:544)
at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:147)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:829)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1218)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1325)
15/01/25 22:38:45 INFO util.ExitUtil: Exiting with status 1
15/01/25 22:38:45 INFO namenode.NameNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at lumify-dev/172.17.0.8
************************************************************/
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
Starting namenodes on [lumify-dev]
lumify-dev: chown: changing ownership of `/opt/hadoop/logs': Permission denied
lumify-dev: starting namenode, logging to /opt/hadoop/logs/hadoop-root-namenode-lumify-dev.out
lumify-dev: /opt/hadoop/sbin/hadoop-daemon.sh: line 157: /tmp/hadoop-root-namenode.pid: Permission denied
lumify-dev: /opt/hadoop/sbin/hadoop-daemon.sh: line 151: /opt/hadoop/logs/hadoop-root-namenode-lumify-dev.out: Permission denied
lumify-dev: head: cannot open `/opt/hadoop/logs/hadoop-root-namenode-lumify-dev.out' for reading: No such file or directory
lumify-dev: /opt/hadoop/sbin/hadoop-daemon.sh: line 166: /opt/hadoop/logs/hadoop-root-namenode-lumify-dev.out: Permission denied
lumify-dev: /opt/hadoop/sbin/hadoop-daemon.sh: line 167: /opt/hadoop/logs/hadoop-root-namenode-lumify-dev.out: Permission denied
localhost: chown: changing ownership of `/opt/hadoop/logs': Permission denied
localhost: starting datanode, logging to /opt/hadoop/logs/hadoop-root-datanode-lumify-dev.out
localhost: /opt/hadoop/sbin/hadoop-daemon.sh: line 157: /tmp/hadoop-root-datanode.pid: Permission denied
localhost: /opt/hadoop/sbin/hadoop-daemon.sh: line 151: /opt/hadoop/logs/hadoop-root-datanode-lumify-dev.out: Permission denied
localhost: head: cannot open `/opt/hadoop/logs/hadoop-root-datanode-lumify-dev.out' for reading: No such file or directory
localhost: /opt/hadoop/sbin/hadoop-daemon.sh: line 166: /opt/hadoop/logs/hadoop-root-datanode-lumify-dev.out: Permission denied
localhost: /opt/hadoop/sbin/hadoop-daemon.sh: line 167: /opt/hadoop/logs/hadoop-root-datanode-lumify-dev.out: Permission denied
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /opt/hadoop/logs/hadoop-root-secondarynamenode-lumify-dev.out
0.0.0.0: chown: changing ownership of `/opt/hadoop/logs': Permission denied
0.0.0.0: /opt/hadoop/sbin/hadoop-daemon.sh: line 157: /tmp/hadoop-root-secondarynamenode.pid: Permission denied
0.0.0.0: /opt/hadoop/sbin/hadoop-daemon.sh: line 151: /opt/hadoop/logs/hadoop-root-secondarynamenode-lumify-dev.out: Permission denied
0.0.0.0: head: cannot open `/opt/hadoop/logs/hadoop-root-secondarynamenode-lumify-dev.out' for reading: No such file or directory
0.0.0.0: /opt/hadoop/sbin/hadoop-daemon.sh: line 166: /opt/hadoop/logs/hadoop-root-secondarynamenode-lumify-dev.out: Permission denied
0.0.0.0: /opt/hadoop/sbin/hadoop-daemon.sh: line 167: /opt/hadoop/logs/hadoop-root-secondarynamenode-lumify-dev.out: Permission denied
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
starting yarn daemons
chown: missing operand after `/opt/hadoop/logs'
Try `chown --help' for more information.
starting resourcemanager, logging to /opt/hadoop/logs/yarn--resourcemanager-lumify-dev.out
/opt/hadoop/sbin/yarn-daemon.sh: line 125: /tmp/yarn--resourcemanager.pid: Permission denied
/opt/hadoop/sbin/yarn-daemon.sh: line 124: /opt/hadoop/logs/yarn--resourcemanager-lumify-dev.out: Permission denied
head: cannot open `/opt/hadoop/logs/yarn--resourcemanager-lumify-dev.out' for reading: No such file or directory
/opt/hadoop/sbin/yarn-daemon.sh: line 129: /opt/hadoop/logs/yarn--resourcemanager-lumify-dev.out: Permission denied
/opt/hadoop/sbin/yarn-daemon.sh: line 130: /opt/hadoop/logs/yarn--resourcemanager-lumify-dev.out: Permission denied
localhost: chown: changing ownership of `/opt/hadoop/logs': Permission denied
localhost: starting nodemanager, logging to /opt/hadoop/logs/yarn-root-nodemanager-lumify-dev.out
localhost: /opt/hadoop/sbin/yarn-daemon.sh: line 125: /tmp/yarn-root-nodemanager.pid: Permission denied
localhost: /opt/hadoop/sbin/yarn-daemon.sh: line 124: /opt/hadoop/logs/yarn-root-nodemanager-lumify-dev.out: Permission denied
localhost: head: cannot open `/opt/hadoop/logs/yarn-root-nodemanager-lumify-dev.out' for reading: No such file or directory
localhost: /opt/hadoop/sbin/yarn-daemon.sh: line 129: /opt/hadoop/logs/yarn-root-nodemanager-lumify-dev.out: Permission denied
localhost: /opt/hadoop/sbin/yarn-daemon.sh: line 130: /opt/hadoop/logs/yarn-root-nodemanager-lumify-dev.out: Permission denied
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
safemode: Call From lumify-dev/172.17.0.8 to lumify-dev:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Starting Accumulo
---------------------------------------------------------------
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
ls: Call From lumify-dev/172.17.0.8 to lumify-dev:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
Creating accumulo user in hdfs
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
mkdir: Call From lumify-dev/172.17.0.8 to lumify-dev:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
chown: Call From lumify-dev/172.17.0.8 to lumify-dev:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
Starting monitor on lumify-dev
/opt/accumulo-1.6.1/bin/start-server.sh: line 78: /opt/accumulo-1.6.1/logs/monitor_lumify-dev.out: Permission denied
Starting tablet servers .... done
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
Starting tablet server on lumify-dev
/opt/accumulo-1.6.1/bin/start-server.sh: line 78: /opt/accumulo-1.6.1/logs/tserver_lumify-dev.out: Permission denied
2015-01-25 22:39:22,722 [vfs.UniqueFileReplicator] WARN : Unexpected error creating directory /tmp/accumulo-vfs-cache-716@lumify-dev-root
2015-01-25 22:39:22,760 [vfs.UniqueFileReplicator] WARN : Unexpected error creating directory /tmp/accumulo-vfs-cache-716@lumify-dev-root
2015-01-25 22:39:24,621 [fs.VolumeManagerImpl] WARN : dfs.datanode.synconclose set to false in hdfs-site.xml: data loss is possible on system reset or power loss
2015-01-25 22:39:24,628 [server.Accumulo] INFO : Attempting to talk to zookeeper
Thread "org.apache.accumulo.master.state.SetGoalState" died Failed to connect to zookeeper (localhost:2181) within 2x zookeeper timeout period 30000
java.lang.RuntimeException: Failed to connect to zookeeper (localhost:2181) within 2x zookeeper timeout period 30000
at org.apache.accumulo.fate.zookeeper.ZooSession.connect(ZooSession.java:117)
at org.apache.accumulo.fate.zookeeper.ZooSession.getSession(ZooSession.java:161)
at org.apache.accumulo.fate.zookeeper.ZooReader.getSession(ZooReader.java:35)
at org.apache.accumulo.fate.zookeeper.ZooReaderWriter.getZooKeeper(ZooReaderWriter.java:50)
at org.apache.accumulo.fate.zookeeper.ZooReader.getChildren(ZooReader.java:59)
at org.apache.accumulo.server.Accumulo.waitForZookeeperAndHdfs(Accumulo.java:246)
at org.apache.accumulo.master.state.SetGoalState.main(SetGoalState.java:44)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.accumulo.start.Main$1.run(Main.java:141)
at java.lang.Thread.run(Thread.java:745)
Starting master on lumify-dev
/opt/accumulo-1.6.1/bin/start-server.sh: line 78: /opt/accumulo-1.6.1/logs/master_lumify-dev.out: Permission denied
Starting garbage collector on lumify-dev
/opt/accumulo-1.6.1/bin/start-server.sh: line 78: /opt/accumulo-1.6.1/logs/gc_lumify-dev.out: Permission denied
Starting tracer on lumify-dev
/opt/accumulo-1.6.1/bin/start-server.sh: line 78: /opt/accumulo-1.6.1/logs/tracer_lumify-dev.out: Permission denied

Starting Elasticsearch
---------------------------------------------------------------

Starting RabbitMQ
---------------------------------------------------------------
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
The following plugins have been enabled:
  mochiweb
  webmachine
  rabbitmq_web_dispatch
  amqp_client
  rabbitmq_management_agent
  rabbitmq_management
Offline change; changes will take effect at broker restart.

Starting Lumify Config
---------------------------------------------------------------
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /opt/elasticsearch/logs/elasticsearch.log (Permission denied)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
at java.io.FileOutputStream.<init>(FileOutputStream.java:142)
at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:223)
at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
at org.apache.log4j.PropertyConfigurator.configure(PropertyConfigurator.java:440)
at org.elasticsearch.common.logging.log4j.LogConfigurator.configure(LogConfigurator.java:105)
at org.elasticsearch.bootstrap.Bootstrap.setupLogging(Bootstrap.java:94)
at org.elasticsearch.bootstrap.Bootstrap.main(Bootstrap.java:178)
at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:32)
log4j:ERROR Either File or DatePattern options are not set for appender [file].
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /opt/elasticsearch/logs/elasticsearch_index_indexing_slowlog.log (Permission denied)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
at java.io.FileOutputStream.<init>(FileOutputStream.java:142)
at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:223)
at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:672)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:516)
at org.apache.log4j.PropertyConfigurator.configure(PropertyConfigurator.java:440)
at org.elasticsearch.common.logging.log4j.LogConfigurator.configure(LogConfigurator.java:105)
at org.elasticsearch.bootstrap.Bootstrap.setupLogging(Bootstrap.java:94)
at org.elasticsearch.bootstrap.Bootstrap.main(Bootstrap.java:178)
at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:32)
log4j:ERROR Either File or DatePattern options are not set for appender [index_indexing_slow_log_file].
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /opt/elasticsearch/logs/elasticsearch_index_search_slowlog.log (Permission denied)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
at java.io.FileOutputStream.<init>(FileOutputStream.java:142)
at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:223)
at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:672)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:516)
at org.apache.log4j.PropertyConfigurator.configure(PropertyConfigurator.java:440)
at org.elasticsearch.common.logging.log4j.LogConfigurator.configure(LogConfigurator.java:105)
at org.elasticsearch.bootstrap.Bootstrap.setupLogging(Bootstrap.java:94)
at org.elasticsearch.bootstrap.Bootstrap.main(Bootstrap.java:178)
at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:32)
log4j:ERROR Either File or DatePattern options are not set for appender [index_search_slow_log_file].
{1.4.1}: Initialization Failed ...
- ElasticsearchIllegalStateException[Failed to obtain node lock, is the following location writable?: [/opt/elasticsearch/data/elasticsearch]]
IOException[failed to obtain lock on /opt/elasticsearch/data/elasticsearch/nodes/49]
IOException[Cannot create directory: /opt/elasticsearch/data/elasticsearch/nodes/49]
mkdir: Call From lumify-dev/172.17.0.8 to lumify-dev:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
mkdir: Call From lumify-dev/172.17.0.8 to lumify-dev:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
mkdir: Call From lumify-dev/172.17.0.8 to lumify-dev:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
mkdir: Call From lumify-dev/172.17.0.8 to lumify-dev:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
put: Call From lumify-dev/172.17.0.8 to lumify-dev:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
put: Call From lumify-dev/172.17.0.8 to lumify-dev:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
put: Call From lumify-dev/172.17.0.8 to lumify-dev:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
chmod: Call From lumify-dev/172.17.0.8 to lumify-dev:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused


Jeff Kunkle

Jan 26, 2015, 9:38:34 AM
to lum...@googlegroups.com
Hi Pasquale, what host OS are you running? Windows, Linux, Mac? Are you running all these commands within a VM?

Pasquale Iodice

Jan 26, 2015, 10:48:09 AM
to lum...@googlegroups.com
Hi Jeff, thanks for your support. I am running CentOS 7 in VMware. Is the VM the problem? I'm now installing CentOS 7 on my HD. Do you think it is better if I try without the VM? Thanks.

Joe Ferner

Jan 26, 2015, 10:57:08 AM
to lum...@googlegroups.com
Yes. I do not think docker likes to be run inside a VM. Try running the command from outside the VM.

Pasquale Iodice

Jan 26, 2015, 4:42:33 PM
to lum...@googlegroups.com
I tried to install CentOS 7 on my HD and rebuilt everything. When I try run-dev.sh I have the same problems.

Joe Ferner

Jan 26, 2015, 4:58:30 PM
to lum...@googlegroups.com
What are the permissions on "docker/lumify-dev-persistent"?

ls -la docker/lumify-dev-persistent

David Singley

Jan 26, 2015, 5:06:38 PM
to lum...@googlegroups.com
please run:

docker ps

find the container id of the lumifyio/dev image

then run:

docker inspect <the container id>

please reply with the output

Pasquale Iodice

Jan 27, 2015, 5:35:37 AM
to lum...@googlegroups.com

[pasquale@localhost Scrivania]$ ls -la /home/pasquale/Scrivania/lumify-master/docker/lumify-dev-persistent

I have:

total 4
drwxrwxr-x. 5 pasquale pasquale   36 Jan 26 22:02 .
drwxrwxr-x. 4 pasquale pasquale 4096 Jan 26 22:02 ..
drwxrwxr-x. 6 pasquale pasquale   66 Jan 26 22:02 opt
drwxrwxr-x. 3 pasquale pasquale   22 Jan 26 22:02 tmp
drwxrwxr-x. 5 pasquale pasquale   38 Jan 26 22:02 var

[pasquale@localhost Scrivania]$ docker ps

I have:

2015/01/27 11:20:07 Get http:///var/run/docker.sock/v1.15/containers/json: dial unix /var/run/docker.sock: no such file or directory

so I first ran:

[pasquale@localhost Scrivania]$ sudo systemctl start docker

and then:

[pasquale@localhost Scrivania]$ sudo docker ps -l

I have:

CONTAINER ID   IMAGE                 COMMAND                CREATED        STATUS   PORTS   NAMES
134b9f90d31f   lumifyio/dev:latest   "/opt/docker-entrypo   13 hours ago                    tender_almeida

then I ran:

[pasquale@localhost Scrivania]$ sudo docker inspect 134b9f90d31f

I have:

[{
    "AppArmorProfile": "",
    "Args": [
        "start"
    ],
    "Config": {
        "AttachStderr": true,
        "AttachStdin": true,
        "AttachStdout": true,
        "Cmd": [
            "start"
        ],
        "CpuShares": 0,
        "Cpuset": "",
        "Domainname": "",
        "Entrypoint": [
            "/opt/docker-entrypoint.sh"
        ],
        "Env": [
            "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/opt/jdk/bin:/opt/maven/bin:/opt/zookeeper/bin:/opt/hadoop/bin:/opt/accumulo/bin:/opt/elasticsearch/bin:/opt/rabbitmq/sbin:/opt/jetty/bin",
            "JAVA_HOME=/opt/jdk",
            "_JAVA_OPTIONS=-Djava.net.preferIPv4Stack=true",
            "MVN_HOME=/opt/maven",
            "ZOOKEEPER_HOME=/opt/zookeeper",
            "HADOOP_PREFIX=/opt/hadoop",
            "HADOOP_COMMON_HOME=/opt/hadoop",
            "HADOOP_HDFS_HOME=/opt/hadoop",
            "HADOOP_MAPRED_HOME=/opt/hadoop",
            "HADOOP_YARN_HOME=/opt/hadoop",
            "HADOOP_CONF_DIR=/opt/hadoop/etc/hadoop",
            "YARN_CONF_DIR=/opt/hadoop/etc/hadoop"
        ],
        "ExposedPorts": {
            "15672/tcp": {}, "2181/tcp": {}, "22/tcp": {}, "2888/tcp": {}, "3888/tcp": {},
            "50010/tcp": {}, "50020/tcp": {}, "50030/tcp": {}, "50060/tcp": {}, "50070/tcp": {},
            "50075/tcp": {}, "50090/tcp": {}, "50091/tcp": {}, "50095/tcp": {}, "5672/tcp": {},
            "5673/tcp": {}, "8020/tcp": {}, "8032/tcp": {}, "8042/tcp": {}, "8080/tcp": {},
            "8088/tcp": {}, "8443/tcp": {}, "9000/tcp": {}, "9200/tcp": {}, "9300/tcp": {},
            "9997/tcp": {}, "9999/tcp": {}
        },
        "Hostname": "lumify-dev",
        "Image": "lumifyio/dev",
        "Memory": 0,
        "MemorySwap": 0,
        "NetworkDisabled": false,
        "OnBuild": null,
        "OpenStdin": true,
        "PortSpecs": null,
        "StdinOnce": true,
        "Tty": true,
        "User": "",
        "Volumes": {
            "/opt/elasticsearch-1.4.1/data/": {},
            "/opt/jetty/webapps": {},
            "/opt/lumify": {},
            "/opt/lumify-source": {},
            "/opt/rabbitmq_server-3.4.1/var": {},
            "/tmp": {},
            "/var/lib/hadoop-hdfs": {},
            "/var/local/hadoop": {},
            "/var/log": {}
        },
        "WorkingDir": "/home/root/lumify"
    },
    "Created": "2015-01-26T21:11:48.154694479Z",
    "Driver": "devicemapper",
    "ExecDriver": "native-0.2",
    "HostConfig": {
        "Binds": [
            "/home/pasquale/Scrivania/lumify-master/docker/lumify-dev-persistent/var/local/hadoop:/var/local/hadoop",
            "/home/pasquale/Scrivania/lumify-master/docker/lumify-dev-persistent/opt/elasticsearch/data:/opt/elasticsearch/data",
            "/home/pasquale/Scrivania/lumify-master/docker/lumify-dev-persistent/opt/rabbitmq/var:/opt/rabbitmq/var",
            "/home/pasquale/Scrivania/lumify-master/docker/lumify-dev-persistent/opt/lumify:/opt/lumify",
            "/home/pasquale/Scrivania/lumify-master/docker/lumify-dev-persistent/opt/jetty/webapps:/opt/jetty/webapps",
            "/home/pasquale/Scrivania/lumify-master/docker/..:/opt/lumify-source",
            "/home/pasquale/Scrivania/lumify-master/docker/lumify-dev-persistent/tmp:/tmp",
            "/home/pasquale/Scrivania/lumify-master/docker/lumify-dev-persistent/var/lib/hadoop-hdfs:/var/lib/hadoop-hdfs",
            "/home/pasquale/Scrivania/lumify-master/docker/lumify-dev-persistent/var/log:/var/log"
        ],
        "CapAdd": null,
        "CapDrop": null,
        "ContainerIDFile": "",
        "Devices": [],
        "Dns": null,
        "DnsSearch": null,
        "ExtraHosts": null,
        "Links": null,
        "LxcConf": [],
        "NetworkMode": "bridge",
        "PortBindings": {
            "15672/tcp": [{ "HostIp": "", "HostPort": "15672" }],
            "2181/tcp": [{ "HostIp": "", "HostPort": "2181" }],
            "50010/tcp": [{ "HostIp": "", "HostPort": "50010" }],
            "50020/tcp": [{ "HostIp": "", "HostPort": "50020" }],
            "50030/tcp": [{ "HostIp": "", "HostPort": "50030" }],
            "50060/tcp": [{ "HostIp": "", "HostPort": "50060" }],
            "50070/tcp": [{ "HostIp": "", "HostPort": "50070" }],
            "50075/tcp": [{ "HostIp": "", "HostPort": "50075" }],
            "50090/tcp": [{ "HostIp": "", "HostPort": "50090" }],
            "50095/tcp": [{ "HostIp": "", "HostPort": "50095" }],
            "5672/tcp": [{ "HostIp": "", "HostPort": "5672" }],
            "5673/tcp": [{ "HostIp": "", "HostPort": "5673" }],
            "8020/tcp": [{ "HostIp": "", "HostPort": "8020" }],
            "8032/tcp": [{ "HostIp": "", "HostPort": "8032" }],
            "8042/tcp": [{ "HostIp": "", "HostPort": "8042" }],
            "8080/tcp": [{ "HostIp": "", "HostPort": "8080" }],
            "8088/tcp": [{ "HostIp": "", "HostPort": "8088" }],
            "8443/tcp": [{ "HostIp": "", "HostPort": "8443" }],
            "9000/tcp": [{ "HostIp": "", "HostPort": "9000" }],
            "9200/tcp": [{ "HostIp": "", "HostPort": "9200" }],
            "9300/tcp": [{ "HostIp": "", "HostPort": "9300" }],
            "9997/tcp": [{ "HostIp": "", "HostPort": "9997" }],
            "9999/tcp": [{ "HostIp": "", "HostPort": "9999" }]
        },
        "Privileged": false,
        "PublishAllPorts": false,
        "RestartPolicy": {
            "MaximumRetryCount": 0,
            "Name": ""
        },
        "SecurityOpt": null,
        "VolumesFrom": null
    },
    "HostnamePath": "",
    "HostsPath": "",
    "Id": "134b9f90d31f13e067e7de23b1e3d10e2d8f0de3f66755b09fefffc5fd16ab06",
    "Image": "ba6dcab7124b70f11f755fc1f20566f80284ba54c85f4636057c2fe72f62d578",
    "MountLabel": "system_u:object_r:svirt_sandbox_file_t:s0:c375,c1010",
    "Name": "/tender_almeida",
    "NetworkSettings": {
        "Bridge": "",
        "Gateway": "",
        "IPAddress": "",
        "IPPrefixLen": 0,
        "MacAddress": "",
        "PortMapping": null,
        "Ports": null
    },
    "Path": "/opt/docker-entrypoint.sh",
    "ProcessLabel": "system_u:system_r:svirt_lxc_net_t:s0:c375,c1010",
    "ResolvConfPath": "",
    "State": {
        "ExitCode": 0,
        "FinishedAt": "0001-01-01T00:00:00Z",
        "Paused": false,
        "Pid": 0,
        "Restarting": false,
        "Running": false,
        "StartedAt": "0001-01-01T00:00:00Z"
    },
    "Volumes": null,
    "VolumesRW": null
}]

Pasquale Iodice

Jan 27, 2015, 6:15:37 AM
to lum...@googlegroups.com
When I run ./run-dev.sh I have an SELinux alarm. I attached the screenshot.
Schermata del 2015-01-27 12:11:22.png
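
Given the SELinux alert and the svirt_sandbox_file_t MountLabel in the docker inspect output above, a likely workaround (an assumption based on these symptoms, not something confirmed in this thread) is to relabel the bind-mounted host directories so the container is allowed to write to them:

# hedged sketch: relabel the persistent directory for container access on an SELinux host
# (the path is the one shown earlier in this thread; adjust to your checkout)
sudo chcon -Rt svirt_sandbox_file_t /home/pasquale/Scrivania/lumify-master/docker/lumify-dev-persistent
# or, purely as a test, switch SELinux to permissive mode temporarily:
sudo setenforce 0

If the permission errors go away in permissive mode, SELinux labeling is the culprit.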

David Singley

Jan 27, 2015, 10:07:43 AM
to lum...@googlegroups.com

Pasquale Iodice

Jan 29, 2015, 5:46:20 AM
to lum...@googlegroups.com
OK. I'm now at step 7 of the https://github.com/lumifyio/lumify guide. I resolved step 6 by running on Ubuntu 14.04.
Now when I put

 mvn package -P web-war -pl web/war -am -DskipTests -Dsource.skip=true

I have:

Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
[INFO] Scanning for projects...
[ERROR] Could not find the selected project in the reactor: web/war -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:


Thanks to all.

Joe Ferner

Jan 29, 2015, 9:58:00 AM
to lum...@googlegroups.com
Are you running that command from the root of where you cloned lumify?
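
For reference, a minimal sketch of what Joe is asking about, assuming the clone lives at the path shown later in this thread (the -pl web/war module path only resolves against the root pom.xml):

cd ~/Desktop/lumify-master    # repository root of the lumify clone
mvn package -P web-war -pl web/war -am -DskipTests -Dsource.skip=true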

Pasquale Iodice

Jan 29, 2015, 10:09:15 AM
to lum...@googlegroups.com
Sorry but I am a newbie.
After run-dev.sh

this is my prompt now:    bash-4.1#

I put the command here:

bash-4.1# mvn package -P web-war -pl web/war -am -DskipTests -Dsource.skip=true

Pasquale Iodice

Jan 29, 2015, 10:19:29 AM
to lum...@googlegroups.com

If I try from the lumify-master folder:

pasquale@ubuntu:~/Desktop/lumify-master$ mvn package -P web-war -pl web/war -am -DskipTests -Dsource.skip=true

I have:

Warning: JAVA_HOME environment variable is not set.
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] Lumify Root: Build Tools
[INFO] Lumify
[INFO] Lumify: Web
[INFO] Lumify: Web: Client API
[INFO] Lumify: Core
[INFO] Lumify: Core
[INFO] Lumify: Core: Plugins
[INFO] Lumify: Core: Plugin: Model: BigTable
[INFO] Lumify: Core: Plugin: Model: RabbitMQ
[INFO] Lumify: Core: Plugin: Model: Secure Graph
[INFO] Lumify: Web: Base
[INFO] Lumify: Web: War
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Lumify Root: Build Tools 0.4.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-resources-plugin:2.3:resources (default-resources) @ lumify-build-tools ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-compiler-plugin:2.0.2:compile (default-compile) @ lumify-build-tools ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-resources-plugin:2.3:testResources (default-testResources) @ lumify-build-tools ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory /home/pasquale/Desktop/lumify-master/build-tools/src/test/resources
[INFO] 
[INFO] --- maven-compiler-plugin:2.0.2:testCompile (default-testCompile) @ lumify-build-tools ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.10:test (default-test) @ lumify-build-tools ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ lumify-build-tools ---
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Lumify 0.4.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- buildnumber-maven-plugin:1.2:create (create-build-number) @ lumify ---
[INFO] Checking for local modifications: skipped.
[INFO] Updating project files from SCM: skipped.
[INFO] Executing: /bin/sh -c cd /home/pasquale/Desktop/lumify-master && git rev-parse --verify HEAD
[INFO] Working directory: /home/pasquale/Desktop/lumify-master
[INFO] Storing buildNumber: null at timestamp: 1422544650471
[INFO] Executing: /bin/sh -c cd /home/pasquale/Desktop/lumify-master && git rev-parse --verify HEAD
[INFO] Working directory: /home/pasquale/Desktop/lumify-master
[INFO] Storing buildScmBranch: UNKNOWN_BRANCH
[INFO] 
[INFO] --- buildnumber-maven-plugin:1.2:create-timestamp (create-formatted-timestamp) @ lumify ---
[INFO] 
[INFO] --- template-resgen-maven-plugin:0.11:generate-resource (gen-build-properties) @ lumify ---
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Lumify: Web 0.4.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- buildnumber-maven-plugin:1.2:create (create-build-number) @ lumify-web-group ---
[INFO] Checking for local modifications: skipped.
[INFO] Updating project files from SCM: skipped.
[INFO] Executing: /bin/sh -c cd /home/pasquale/Desktop/lumify-master/web && git rev-parse --verify HEAD
[INFO] Working directory: /home/pasquale/Desktop/lumify-master/web
[INFO] Storing buildNumber: null at timestamp: 1422544652128
[INFO] Executing: /bin/sh -c cd /home/pasquale/Desktop/lumify-master/web && git rev-parse --verify HEAD
[INFO] Working directory: /home/pasquale/Desktop/lumify-master/web
[INFO] Storing buildScmBranch: UNKNOWN_BRANCH
[INFO] 
[INFO] --- buildnumber-maven-plugin:1.2:create-timestamp (create-formatted-timestamp) @ lumify-web-group ---
[INFO] 
[INFO] --- template-resgen-maven-plugin:0.11:generate-resource (gen-build-properties) @ lumify-web-group ---
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Lumify: Web: Client API 0.4.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Lumify Root: Build Tools .......................... SUCCESS [3.050s]
[INFO] Lumify ............................................ SUCCESS [2.316s]
[INFO] Lumify: Web ....................................... SUCCESS [0.029s]
[INFO] Lumify: Web: Client API ........................... FAILURE [0.704s]
[INFO] Lumify: Core ...................................... SKIPPED
[INFO] Lumify: Core ...................................... SKIPPED
[INFO] Lumify: Core: Plugins ............................. SKIPPED
[INFO] Lumify: Core: Plugin: Model: BigTable ............. SKIPPED
[INFO] Lumify: Core: Plugin: Model: RabbitMQ ............. SKIPPED
[INFO] Lumify: Core: Plugin: Model: Secure Graph ......... SKIPPED
[INFO] Lumify: Web: Base ................................. SKIPPED
[INFO] Lumify: Web: War .................................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 10.525s
[INFO] Finished at: Thu Jan 29 07:17:32 PST 2015
[INFO] Final Memory: 24M/132M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project lumify-client-api: Could not resolve dependencies for project io.lumify:lumify-client-api:jar:0.4.1-SNAPSHOT: Could not find artifact com.sun:tools:jar:0 at specified path /usr/lib/jvm/java-7-openjdk-amd64/jre/../lib/tools.jar -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :lumify-client-api


Joe Ferner

Jan 29, 2015, 10:50:19 AM
to lum...@googlegroups.com
Looks like you are running OpenJDK. You'll need to install Oracle's JDK to compile.
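
A minimal sketch of pointing the build at an Oracle JDK; the install path below is an assumption, use wherever you unpacked it:

export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_71     # hypothetical Oracle JDK 7 location
export PATH="$JAVA_HOME/bin:$PATH"
mvn -version                                  # should now report the Oracle JDK
ls "$JAVA_HOME/lib/tools.jar"                 # the artifact the earlier error could not find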

Joe Ferner

Jan 29, 2015, 10:50:33 AM
to lum...@googlegroups.com
I'll update the readme to make it clearer.

Pasquale Iodice

Jan 29, 2015, 12:35:09 PM
to lum...@googlegroups.com

I installed the Oracle JDK but I have a build failure message at the end:

[INFO] Reactor Summary:
[INFO] 
[INFO] Lumify Root: Build Tools .......................... SUCCESS [3.375s]
[INFO] Lumify ............................................ SUCCESS [2.503s]
[INFO] Lumify: Web ....................................... SUCCESS [0.391s]
[INFO] Lumify: Web: Client API ........................... SUCCESS [34.773s]
[INFO] Lumify: Core ...................................... SUCCESS [0.567s]
[INFO] Lumify: Core ...................................... SUCCESS [16:37.452s]
[INFO] Lumify: Core: Plugins ............................. SUCCESS [0.254s]
[INFO] Lumify: Core: Plugin: Model: BigTable ............. SUCCESS [1:25.800s]
[INFO] Lumify: Core: Plugin: Model: RabbitMQ ............. SUCCESS [53.909s]
[INFO] Lumify: Core: Plugin: Model: Secure Graph ......... SUCCESS [2:55.895s]
[INFO] Lumify: Web: Base ................................. SUCCESS [2:30.100s]
[INFO] Lumify: Web: War .................................. FAILURE [6:57.628s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 32:11.264s
[INFO] Finished at: Thu Jan 29 09:31:42 PST 2015
[INFO] Final Memory: 64M/261M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2.1:exec (WebApp Dependencies and Minification) on project lumify-web-war: Command execution failed. Process exited with an error: 3 (Exit value: 3) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :lumify-web-war


David Singley

Jan 29, 2015, 12:50:46 PM
to lum...@googlegroups.com
There's likely an error before the Maven summary you've included. Take a closer look for error messages about grunt, npm, or bower.
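
One hedged way to capture the underlying error (standard Maven flags; the log file name is arbitrary):

mvn package -P web-war -pl web/war -am -DskipTests -Dsource.skip=true -e 2>&1 | tee build.log
grep -iE 'grunt|npm|bower' build.log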

Pasquale Iodice

Jan 30, 2015, 8:23:49 AM
to lum...@googlegroups.com
David and Joe, thanks for your support. I am now at step

11. Inside the docker image run Jetty:
  1. /opt/jetty/bin/jetty.sh start

Sorry but I am a newbie. Please can you tell me what I have to put into my shell.
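
For what it's worth, a sketch under the assumption that the dev container is already running from run-dev.sh (docker exec requires Docker 1.3 or later):

sudo docker ps                                  # note the container id of the lumifyio/dev container
sudo docker exec -it <container id> /bin/bash   # open a shell inside it
/opt/jetty/bin/jetty.sh start                   # then run the guide's step 11 command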

Pasquale Iodice

Jan 30, 2015, 8:27:37 AM
to lum...@googlegroups.com
This is my docker images list:


REPOSITORY          TAG                 IMAGE ID            CREATED             VIRTUAL SIZE
lumifyio/dev        latest              463e4ba84400        25 hours ago        1.595 GB
<none>              <none>              99b9e8dbebb5        27 hours ago        1.595 GB
ubuntu              latest              5ba9dab47459        42 hours ago        188.3 MB
centos              centos6             510cf09a7986        3 weeks ago         202.6 MB

Pasquale Iodice

Jan 30, 2015, 9:29:40 AM
to lum...@googlegroups.com
I tried it by reading the Docker documentation, in this way:

sudo docker run -t -i lumifyio/dev /bin/bash

bash-4.1# /opt/jetty/bin/jetty.sh start

Now I have this result:

Starting Jetty: Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
. 2015-01-30 14:06:29.375:INFO::main: Logging initialized @2992ms
2015-01-30 14:06:29.843:WARN:oejs.HomeBaseWarning:main: This instance of Jetty is not running from a separate {jetty.base} directory, this is not recommended.  See documentation at http://www.eclipse.org/jetty/documentation/current/startup.html
2015-01-30 14:06:29.891:INFO::main: Redirecting stderr/stdout to /opt/jetty-distribution-9.2.6.v20141205/logs/2015_01_30.stderrout.log
OK Fri Jan 30 14:06:33 UTC 2015


If I try to open a browser and go to http://lumify-dev:8080/ I get "server not found". But re-running
bash-4.1# /opt/jetty/bin/jetty.sh start

I have:

Starting Jetty: Already Running 33!

Please help. Thanks a lot


Joe Ferner

Jan 30, 2015, 9:57:38 AM
to lum...@googlegroups.com
Couple things to try...

Run netstat -anop | grep LISTEN and look for 8080.

Another thing would be to look in the Lumify logs. In your docker container look in the directory /opt/lumify/logs. This should have the Lumify web application log.
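
Two more hedged checks beyond Joe's suggestions: "server not found" in the browser usually means the lumify-dev hostname does not resolve on the host, and a container started with a plain docker run (as above) publishes no ports, unlike the run-dev.sh container whose inspect output earlier in the thread shows 8080 bound to the host. A sketch:

sudo docker inspect <container id> | grep IPAddress     # the running container's IP
echo '172.17.0.8  lumify-dev' | sudo tee -a /etc/hosts  # IP here is the example value from this thread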

Pasquale Iodice

Jan 30, 2015, 1:27:05 PM
to lum...@googlegroups.com
bash-4.1#  netstat -anop | grep LISTEN Look for 8080

grep: Look: No such file or directory
grep: for: No such file or directory
grep: 8080: No such file or directory

and I do not have the directory /opt/lumify/logs

Pasquale Iodice

Jan 31, 2015, 3:54:52 AM
to lum...@googlegroups.com
Ehm, sorry...

bash-4.1# /opt/jetty/bin/jetty.sh start

Starting Jetty: Already Running 36!


bash-4.1#  netstat -anop | grep LISTEN  
            
tcp        0      0 0.0.0.0:8080                0.0.0.0:*                   LISTEN      -                   off (0.00/0/0)
tcp        0      0 0.0.0.0:8443                0.0.0.0:*                   LISTEN      -                   off (0.00/0/0)

Pasquale Iodice

Jan 31, 2015, 4:38:23 AM
to lum...@googlegroups.com
I tried to run /opt/start.sh within the docker image and I had:

Starting Lumify Config
---------------------------------------------------------------
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
put: `/opt/lumify-source/config/opencv/haarcascade_frontalface_alt.xml': No such file or directory
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
put: `/opt/lumify-source/config/opennlp/*': No such file or directory
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
put: `/opt/lumify-source/config/knownEntities/dictionaries/*': No such file or directory
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
bash-4.1# /opt/jetty/bin/jetty.sh start
Starting Jetty: Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
2015-01-31 09:32:05.073:INFO::main: Logging initialized @1138ms
2015-01-31 09:32:05.168:WARN:oejs.HomeBaseWarning:main: This instance of Jetty is not running from a separate {jetty.base} directory, this is not recommended.  See documentation at http://www.eclipse.org/jetty/documentation/current/startup.html
2015-01-31 09:32:05.218:INFO::main: Redirecting stderr/stdout to /opt/jetty-distribution-9.2.6.v20141205/logs/2015_01_31.stderrout.log
OK Sat Jan 31 09:32:07 UTC 2015

Jeff Kunkle

Feb 5, 2015, 8:49:57 AM
to lum...@googlegroups.com
Hi Pasquale. Do you have someone local to you with Docker experience that can help debug some of the Docker issues? I think you'd get things resolved quicker than we can do via email.

Pasquale Iodice

Feb 5, 2015, 9:16:31 AM
to lum...@googlegroups.com
Thanks a lot for your help. I finally installed Lumify and now I would like to understand how to begin with it. PS: if you think it can be helpful, I can post my step-by-step guide to install Lumify on Ubuntu 14.04. I'd like to make a contribution.

Joe Ferner

Feb 5, 2015, 9:58:31 AM
to lum...@googlegroups.com
That would be great. If you have any tips, can you add them to the documentation and submit a GitHub pull request?

suhas meena

Apr 6, 2015, 5:11:19 AM
to lum...@googlegroups.com
Hi Joe,
I am using CentOS 6.2. Kindly describe your steps to install Lumify, and Docker too, in more detail.

suhas meena

Apr 21, 2015, 4:37:33 AM
to lum...@googlegroups.com
Hi,
I am getting the following error when I execute the mvn command inside the Lumify server that I get after executing docker/run-dev.sh. Kindly help me ASAP; am I doing something wrong? Moreover, when I execute java -version I get JDK 7 even though my system does not have JDK 7 installed. The error is as follows.

[ERROR] Could not find the selected project in the reactor: web/war -> [Help 1]

[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:

balu

Jun 1, 2015, 10:21:16 PM
to lum...@googlegroups.com
Hi Pasquale,

I'm in the process of setting up a Lumify instance. If you have completed the installation guide, could you please share it with us? It'll be of great help to the whole community that's trying to set this up.

Thanks in advance,
Siva.