17/06/15 08:11:20 INFO yarn.Client: Application report for application_1497278430484_0012 (state: ACCEPTED)
17/06/15 08:11:21 INFO yarn.Client: Application report for application_1497278430484_0012 (state: ACCEPTED)
17/06/15 08:11:22 INFO yarn.Client: Application report for application_1497278430484_0012 (state: ACCEPTED)
17/06/15 08:11:23 INFO yarn.Client: Application report for application_1497278430484_0012 (state: ACCEPTED)
17/06/15 08:11:24 INFO yarn.Client: Application report for application_1497278430484_0012 (state: ACCEPTED)
17/06/15 08:11:25 INFO yarn.Client: Application report for application_1497278430484_0012 (state: ACCEPTED)
17/06/15 08:11:26 INFO yarn.Client: Application report for application_1497278430484_0012 (state: ACCEPTED)
17/06/15 08:11:27 INFO yarn.Client: Application report for application_1497278430484_0012 (state: ACCEPTED)
17/06/15 08:11:28 INFO yarn.Client: Application report for application_1497278430484_0012 (state: FINISHED)
17/06/15 08:11:28 INFO yarn.Client:
client token: N/A
diagnostics: Uncaught exception: org.apache.hadoop.yarn.exceptions.InvalidResourceRequestException: Invalid resource request, requested virtual cores < 0, or requested virtual cores > max configured, requestedVirtualCores=7, maxVirtualCores=4
at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.validateResourceRequest(SchedulerUtils.java:288)
at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.normalizeAndValidateRequest(SchedulerUtils.java:248)
at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.normalizeAndvalidateRequest(SchedulerUtils.java:264)
at org.apache.hadoop.yarn.server.resourcemanager.RMServerUtils.normalizeAndValidateRequests(RMServerUtils.java:206)
at org.apache.hadoop.yarn.server.resourcemanager.ApplicationMasterService.allocate(ApplicationMasterService.java:464)
at org.apache.hadoop.yarn.api.impl.pb.service.ApplicationMasterProtocolPBServiceImpl.allocate(ApplicationMasterProtocolPBServiceImpl.java:60)
at org.apache.hadoop.yarn.proto.ApplicationMasterProtocol$ApplicationMasterProtocolService$2.callBlockingMethod(ApplicationMasterProtocol.java:99)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
ApplicationMaster host: 192.168.100.7
ApplicationMaster RPC port: 0
queue: default
start time: 1497514278785
final status: FAILED
tracking URL: http://siem.novalocal:8088/proxy/application_1497278430484_0012/
user: hogzilla
Exception in thread "main" org.apache.spark.SparkException: Application application_1497278430484_0012 finished with failed status
at org.apache.spark.deploy.yarn.Client.run(Client.scala:1180)
at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1226)
at org.apache.spark.deploy.yarn.Client.main(Client.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/06/15 08:11:28 INFO util.ShutdownHookManager: Shutdown hook called
17/06/15 08:11:28 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-b370f33b-062b-408c-b01c-706c1f0fc806
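
The diagnostics line above is the actual cause: the job requested 7 virtual cores per container, but this YARN scheduler is capped at 4. A minimal sketch of the two usual ways out, assuming start-hogzilla.sh wraps a spark-submit call; the exact flags in that script are an assumption, so adapt them to whatever it really passes:

  # Option 1: request fewer cores per executor so the allocation fits under the cap
  spark-submit --master yarn --executor-cores 4 ...

  # Option 2: raise the cap instead, by setting yarn.scheduler.maximum-allocation-vcores
  # to 7 or more in yarn-site.xml on the ResourceManager, then restart it, e.g.:
  $HADOOP_HOME/sbin/yarn-daemon.sh stop resourcemanager
  $HADOOP_HOME/sbin/yarn-daemon.sh start resourcemanager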
~/bin/stop-hogzilla.sh
~/bin/start-hogzilla.sh
sflow {
collector { ip=127.0.0.1 udpport=6343 }
pcap { dev=eth0 }
tcp {}
}

# ps aux | grep -i flow
hogzilla  3078 0.0 0.0 13208 2196 ?     S   Jun12 0:00 /bin/bash /home/hogzilla/bin/start-sflow2hz.sh
root     27725 0.8 0.0 89724 2272 pts/3 Sl+ 08:06 0:03 hsflowd -P -dd -f /etc/hsflowd.conf
hogzilla 27920 0.0 0.0  4148  676 pts/4 S+  08:08 0:00 sflowtool -p 6343 -l
hogzilla 27921 0.0 0.0 41204 2140 pts/4 S+  08:08 0:00 /home/hogzilla/bin/sflow2hz -h 127.0.0.1 -p 9090
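
For reference, that hsflowd.conf plus the process list shows the whole sFlow chain: hsflowd exports samples to the local collector on UDP 6343, sflowtool decodes them, and sflow2hz forwards the result to 127.0.0.1:9090, presumably the HBase Thrift server that Hogzilla writes its flows through. A quick sanity check, assuming standard tooling; the HBase install path below is an assumption:

  # Confirm both the sFlow collector port and the Thrift port have listeners
  ss -lnup | grep 6343
  ss -lntp | grep 9090

  # If nothing is listening on 9090, start the HBase Thrift server
  /usr/local/hbase/bin/hbase-daemon.sh start thrift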
17/06/29 08:58:20 INFO yarn.Client: Application report for application_1497278430484_0073 (state: FINISHED)
17/06/29 08:58:20 INFO yarn.Client:
client token: N/A
diagnostics: User class threw exception: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Thu Jun 29 08:58:19 UTC 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68475: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: com/yammer/metrics/core/Gauge row 'hogzilla_flows,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=siem.novalocal,16201,1497278446413, seqNum=0
ApplicationMaster host: 192.168.100.7
ApplicationMaster RPC port: 0
queue: default
start time: 1498726582092
final status: FAILED
user: hogzilla
Exception in thread "main" org.apache.spark.SparkException: Application application_1497278430484_0073 finished with failed status
at org.apache.spark.deploy.yarn.Client.run(Client.scala:1180)
at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1226)
at org.apache.spark.deploy.yarn.Client.main(Client.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/06/29 08:58:20 INFO util.ShutdownHookManager: Shutdown hook called
17/06/29 08:58:20 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-dc2a87b2-0a8d-4fe2-8595-af5572b10f02
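
This second failure is a different problem: the job reaches YARN now, but the HBase client inside it dies with NoClassDefFoundError: com/yammer/metrics/core/Gauge, so every lookup of hbase:meta times out and retries until RetriesExhaustedException. That class lives in the old Yammer metrics-core 2.x jar, which the HBase client depends on but Spark does not ship. A minimal sketch of the usual workaround, assuming the jar sits in the HBase lib directory and that start-hogzilla.sh builds a spark-submit command you can edit (the path and version below are assumptions, check your hbase/lib):

  # Ship the Yammer metrics jar that the HBase client needs
  spark-submit --master yarn \
    --jars /usr/local/hbase/lib/metrics-core-2.2.0.jar \
    ...  # keep the rest of the original Hogzilla arguments

  # If the driver still throws the same error, also put it on the driver classpath:
  #   --conf spark.driver.extraClassPath=/usr/local/hbase/lib/metrics-core-2.2.0.jar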