problem with class loading


Victor

Sep 16, 2012, 9:34:49 PM
to ve...@googlegroups.com
Hi,

I have been experimenting with Vert.x. When loading a Hadoop configuration, I run into the following error. It seems that loading the Hadoop Configuration class causes a class-loading conflict. I would appreciate some help on how to overcome this problem.

Exception in Java verticle script 
java.lang.LinkageError: loader constraint violation: loader (instance of org/vertx/java/deploy/impl/ParentLastURLClassLoader) previously initiated loading for a different type with name "org/w3c/dom/Document"
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:791)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at org.vertx.java.deploy.impl.ParentLastURLClassLoader.loadClass(ParentLastURLClassLoader.java:60)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1335)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1251)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1192)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:461)
at com.nokia.geovisualizer.vertx.HttpServer.start(HttpServer.java:17)
at org.vertx.java.deploy.impl.VerticleManager$9.run(VerticleManager.java:642)
at org.vertx.java.core.impl.Context$2.run(Context.java:118)
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.processEventQueue(AbstractNioWorker.java:361)
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:245)
at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:35)
at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:102)
at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
at java.lang.Thread.run(Thread.java:722)


Here is the code that causes the above error:

HttpServer.java

import java.io.UnsupportedEncodingException;
import java.util.Properties;

import org.apache.hadoop.conf.Configuration;

import org.vertx.java.core.Handler;
import org.vertx.java.core.http.HttpServerRequest;
import org.vertx.java.deploy.Verticle;

public class HttpServer extends Verticle {

    public void start() {
        Configuration hadoop_conf = new Configuration();
        hadoop_conf.set("good", "value");    // this is line 17
        ...

Thanks

Victor

Brian Lalor

Sep 16, 2012, 9:39:38 PM
to ve...@googlegroups.com
On Sep 16, 2012, at 9:34 PM, Victor <mingd...@gmail.com> wrote:

Exception in Java verticle script 
java.lang.LinkageError: loader constraint violation: loader (instance of org/vertx/java/deploy/impl/ParentLastURLClassLoader) previously initiated loading for a different type with name "org/w3c/dom/Document"

You've probably got org.w3c.dom.Document in a JAR in your module's own lib dir; since it's already provided by the JRE and has been previously loaded, Vert.x's classloader is trying to load it from that JAR and causing the above exception.  I bet if you remove that JAR from your module's lib dir the problem will go away.
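A quick way to confirm this diagnosis is to check which class loader actually provides org.w3c.dom.Document at runtime. The sketch below is my own illustration (the class name CheckDocumentLoader is not from this thread): on a plain JVM, or when the class comes from the JRE as intended, the loader is the bootstrap loader and shows up as null; if you call describeLoader from inside the verticle's start() method and a JAR in the module's lib dir is shadowing the class, you would instead see the module's ParentLastURLClassLoader.

```java
// Sketch: report which class loader defined a given class.
// For classes supplied by the JRE itself (like org.w3c.dom.Document
// normally is), getClassLoader() returns null, meaning the bootstrap
// loader. Any non-null loader means some JAR on a child classpath
// (e.g. the Vert.x module's lib dir) is providing its own copy.
public class CheckDocumentLoader {

    static String describeLoader(Class<?> c) {
        ClassLoader cl = c.getClassLoader();
        return cl == null ? "bootstrap (JRE)" : cl.toString();
    }

    public static void main(String[] args) {
        System.out.println(describeLoader(org.w3c.dom.Document.class));
    }
}
```

Running this standalone should print "bootstrap (JRE)"; the interesting case is calling describeLoader from inside the verticle, before and after removing the offending JAR.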

Victor

Sep 17, 2012, 12:19:01 PM
to ve...@googlegroups.com
blalor:

you are awesome! Thank you!

Victor


Brian Lalor

Sep 17, 2012, 12:19:44 PM
to ve...@googlegroups.com
On Sep 17, 2012, at 12:19 PM, Victor <mingd...@gmail.com> wrote:

you are awesome! Thank you!

Happy to help. :-)