Taro-san,
Thanks for suggesting the solution.
> A major concern is the DB extraction performance; my approach may be slow
> for large volumes of DB files.
I ran a small experiment based on org.sqlite.SQLiteJDBCLoader, and
it was reasonably fast for my purpose.
It took about 10-11 seconds to extract a 90 MB DB file
(Pentium 4, 3 GHz, 1 GB RAM, Windows XP, JDK 6; used heap,
Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory(),
was about 500 KB).
FYI, below is the code I used.
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;

public class Unjar {
    public static void main(String[] args) {
        long t0 = System.currentTimeMillis();
        String tempFolder = new File(System.getProperty("java.io.tmpdir")).getAbsolutePath();
        File extractedFile = new File(tempFolder, "sqlite.db");
        InputStream reader = null;
        FileOutputStream writer = null;
        try {
            // The DB file is read from the classpath, i.e. from inside the jar.
            reader = Unjar.class.getResourceAsStream("/sqlite.db");
            writer = new FileOutputStream(extractedFile);
            byte[] buffer = new byte[1024];
            int bytesRead;
            while ((bytesRead = reader.read(buffer)) != -1) {
                writer.write(buffer, 0, bytesRead);
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            // Close in finally so the streams are released even if the copy fails.
            try { if (writer != null) writer.close(); } catch (Exception ignored) { }
            try { if (reader != null) reader.close(); } catch (Exception ignored) { }
        }
        long t1 = System.currentTimeMillis();
        System.out.println("Extraction done in " + (t1 - t0) / 1000.0 + " sec.");
    }
}
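One side note on the performance concern: a 1 KB buffer means roughly 90,000 read() calls for a 90 MB file, so if extraction ever gets slow on larger DBs, a bigger buffer is a cheap first tweak. A minimal sketch of the same copy loop with a 64 KB buffer (class and method names are mine, and the demo copies an in-memory stream just so it is self-contained):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class CopyBench {
    // Same loop as in Unjar, but with a 64 KB buffer to cut down
    // the number of read() calls on large files.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[64 * 1024];
        long total = 0;
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Demo: copy 1,000,000 bytes from an in-memory source.
        byte[] data = new byte[1000000];
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(data), sink);
        System.out.println("copied " + copied + " bytes");
    }
}
```

Whether it helps in practice depends on the underlying streams; it is worth timing on a real jar before committing to it.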
Thanks,
-Hideki