Hi everyone,
for a while I've been experiencing memory issues when marshalling large files.
I've monitored the memory usage with some crude profiling:
25,000 locations: 72MB
50,000 locations: 140MB
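If you want to take the same kind of crude measurement yourself, one simple option is to compare the used heap before and after the marshalling step. A rough sketch using only java.lang.Runtime (marshalKml() is just a placeholder for your own marshalling code):

Runtime rt = Runtime.getRuntime();
rt.gc(); // best effort, only to make the numbers a bit less noisy
long before = rt.totalMemory() - rt.freeMemory();

marshalKml(); // placeholder for your own marshalling code

long after = rt.totalMemory() - rt.freeMemory();
System.out.println("Used heap delta: " + (after - before) / (1024 * 1024) + " MB");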
So I've looked for ways to mitigate this. One approach is to marshal the document in chunks instead of all at once: write the enclosing elements yourself with an XMLStreamWriter and marshal each Placemark individually. Here are some useful links:
Example:
import java.io.StringWriter;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamWriter;

// JAXB_FRAGMENT stops JAXB from writing an XML prologue before each Placemark
JAXBContext context = JAXBContext.newInstance(Placemark.class);
Marshaller m = context.createMarshaller();
m.setProperty(Marshaller.JAXB_FRAGMENT, Boolean.TRUE);

StringWriter sw = new StringWriter();
XMLStreamWriter xmlOut = XMLOutputFactory.newFactory().createXMLStreamWriter(sw);

// write the document envelope by hand
xmlOut.writeStartDocument("UTF-8", "1.0");
xmlOut.writeStartElement("kml");
xmlOut.writeDefaultNamespace("http://www.opengis.net/kml/2.2"); // KML 2.2 default namespace
xmlOut.writeNamespace("xal", "urn:oasis:names:tc:ciq:xsdschema:xAL:2.0");
xmlOut.writeStartElement("Document");

// iterate through your placemarks here, marshalling them one at a time
for (Location location : locations) { // 'locations' stands for your own data source
    Placemark placemark = new Placemark();
    // ... populate the placemark from the location ...
    m.marshal(placemark, xmlOut);
}
xmlOut.writeEndElement(); // Document
xmlOut.writeEndElement(); // kml
xmlOut.writeEndDocument();
xmlOut.flush();
xmlOut.close();
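A note on the StringWriter above: it still buffers the whole document in memory, so if the result is going to a file anyway you can hand the output stream to the factory instead and keep the write fully streaming. A small sketch (the file name is just an example):

import java.io.FileOutputStream;
import java.io.OutputStream;

OutputStream out = new FileOutputStream("placemarks.kml"); // example file name
XMLStreamWriter xmlOut = XMLOutputFactory.newFactory().createXMLStreamWriter(out, "UTF-8");
// ... same writeStartDocument / marshal loop / writeEndElement calls as above ...
xmlOut.close();
out.close();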
This is an intermediate solution that sacrifices some elegance, but it has let me reduce memory usage by at least 60%:
25,000 locations: 20MB
50,000 locations: 40MB
I believe a similar approach can be used when parsing large documents.
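For the parsing direction, the trick should be the same: walk the file with an XMLStreamReader and hand only one element at a time to a JAXB Unmarshaller via unmarshal(reader, Placemark.class), instead of unmarshalling the whole tree. A rough, untested sketch (the Placemark class and the file name placemarks.kml are just assumptions to make it concrete):

import java.io.FileInputStream;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Unmarshaller;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

JAXBContext context = JAXBContext.newInstance(Placemark.class);
Unmarshaller u = context.createUnmarshaller();
XMLStreamReader xmlIn = XMLInputFactory.newFactory()
        .createXMLStreamReader(new FileInputStream("placemarks.kml")); // example file name

while (xmlIn.hasNext()) {
    if (xmlIn.getEventType() == XMLStreamConstants.START_ELEMENT
            && "Placemark".equals(xmlIn.getLocalName())) {
        // unmarshal just this element; the reader ends up right after its end tag
        Placemark placemark = u.unmarshal(xmlIn, Placemark.class).getValue();
        // ... process the placemark, then let it be garbage collected ...
    } else {
        xmlIn.next();
    }
}
xmlIn.close();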
I hope someone finds this useful.
Emanuele