>>> Stuart Lewis <s.l...@auckland.ac.nz> 7/11/2011 6:06 PM >>>
Hi Joshua,
The error message below suggests that you are running OpenJDK. Instead, try installing the official Sun JDK:
-
https://wiki.duraspace.org/display/DSPACE/Installing+DSpace+1.7+on+Ubuntu#InstallingDSpace1.7onUbuntu-%28OPTIONAL%29ChangetousingSun%2FOracleJavaJDK
This will hopefully help.
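[Editor's note: a quick way to confirm which JVM the media filter is actually using before reinstalling. This is a sketch of the usual Debian/Ubuntu mechanism (`update-alternatives`); exact package names and paths vary by release and are not taken from this thread.]

```shell
# Show which JVM binary "java" currently resolves to; on OpenJDK the
# path typically contains "openjdk", on the Sun/Oracle JDK it does not
readlink -f "$(command -v java)" || echo "java not on PATH"

# On Debian/Ubuntu, list the installed JDKs and switch the default
# interactively (run manually; requires root):
# sudo update-alternatives --config java
```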
Thanks,
Stuart Lewis
Digital Development Manager
Te Tumu Herenga The University of Auckland Library
Auckland Mail Centre, Private Bag 92019, Auckland 1142, New Zealand
Ph: +64 (0)9 373 7599 x81928

On 12/07/2011, at 3:33 AM, Joshua Gomez wrote:
> I ran across the following error last week while trying to run the media filter (I'm using DSpace 1.7.2, on Ubuntu 10.04.2):
>
> Exception in thread "Thread-2" java.lang.UnsatisfiedLinkError: /usr/lib/jvm/java-6-openjdk/jre/lib/amd64/libmanagement.so: /usr/lib/jvm/java-6-openjdk/jre/lib/amd64/libmanagement.so: cannot open shared object file: Too many open files
> at java.lang.ClassLoader$NativeLibrary.load(Native Method)
> at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1750)
> at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1667)
> at java.lang.Runtime.loadLibrary0(Runtime.java:840)
> at java.lang.System.loadLibrary(System.java:1047)
> at sun.security.action.LoadLibraryAction.run(LoadLibraryAction.java:67)
> at sun.security.action.LoadLibraryAction.run(LoadLibraryAction.java:47)
> at java.security.AccessController.doPrivileged(Native Method)
> at sun.management.ManagementFactory.<clinit>(ManagementFactory.java:485)
> at java.lang.management.ManagementFactory.getPlatformMBeanServer(ManagementFactory.java:521)
> at org.dspace.kernel.DSpaceKernelManager.unregisterMBean(DSpaceKernelManager.java:178)
> at org.dspace.servicemanager.DSpaceKernelImpl.destroy(DSpaceKernelImpl.java:211)
> at org.dspace.servicemanager.DSpaceKernelImpl.doDestroy(DSpaceKernelImpl.java:233)
> at org.dspace.servicemanager.DSpaceKernelImpl$1.run(DSpaceKernelImpl.java:78)
>
> It appears this error is not a problem with DSpace so much as the server environment, but I'm wondering if anyone else has come across such an error before?
>
> This error occurred after the new items had been filtered and the script was in the process of updating the index. I recently added about a thousand items with very poor OCR (they were very old documents), and our index word limit is not set, so I am wondering if the index has become too large due to all the new "words" from the bad OCR. Many of the PDFs we added are pretty large, so I'm also wondering if it's just a memory issue.
>
> Any thoughts?
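[Editor's note: the "cannot open shared object file: Too many open files" message means the JVM hit its per-process file-descriptor limit, so Joshua's hypothesis can be checked directly while the media filter runs, independent of which JDK is installed. A minimal sketch, assuming Linux with `/proc` mounted; `$$` here inspects the current shell — substitute the media-filter JVM's PID:]

```shell
# Per-process open-file limit for the current shell (inherited by
# processes it launches, including the media filter's JVM)
ulimit -n

# Count the file descriptors a running process currently holds
# (here: this shell itself; use the JVM's PID in practice)
ls /proc/$$/fd | wc -l
```

If the descriptor count approaches the limit during indexing, the limit can be raised for the dspace user, e.g. via a `nofile` entry in `/etc/security/limits.conf` (the exact value is site-specific).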