Hi,
I want to parse JSON strings of about 5986 bytes each, and I need the parsing to be really fast: a throughput of at least 70k transactions per second. That means 70k strings of roughly 5986 bytes parsed per second, and each of those 70k JSON documents is different.
All my hopes are on Jackson, as it is the more mature library. Is there a way I can achieve this with Jackson? I tried ObjectMapper, but it takes a lot of time to parse the string.
Please let me know if what I'm trying to achieve is not feasible, or whether there is a different way to do it.
The machine configuration I tested on was an Intel Core i5, 2.5 GHz, 8 GB RAM.
The very basic code that I ran is below:
long startTime = System.currentTimeMillis();
JsonFactory factory = new JsonFactory();
ObjectMapper mapper = new ObjectMapper(factory);
JsonNode root = null;
for (int i = 0; i < 70000; i++) {
    try {
        root = mapper.readTree(jsonString);
    } catch (IOException e) { // JsonProcessingException is a subclass of IOException
        e.printStackTrace();
    }
}
long endTime = System.currentTimeMillis();
System.out.println("jackson processing --- " + (endTime - startTime));
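(As a side note on the measurement itself: timings like this are very sensitive to JVM warm-up. A sketch of the same loop with a warm-up pass first and a single reused ObjectMapper, where the class name and the stand-in payload are invented for illustration, might look like this.)

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ParseBench {
    // One shared mapper: creating it is expensive, reusing it is cheap and thread-safe.
    static final ObjectMapper MAPPER = new ObjectMapper();

    // Parse the same document `iters` times and return elapsed milliseconds.
    static long timeParses(String json, int iters) throws Exception {
        JsonNode root = null;
        long start = System.nanoTime();
        for (int i = 0; i < iters; i++) {
            root = MAPPER.readTree(json);
        }
        return (System.nanoTime() - start) / 1000000L;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"id\":1,\"items\":[{\"sku\":\"a\",\"qty\":2}]}"; // stand-in payload
        timeParses(json, 10000);            // warm-up, so the JIT compiles the parse path
        long ms = timeParses(json, 70000);  // the measured run
        System.out.println("70k parses took " + ms + " ms");
    }
}
```

The warm-up run is discarded; only the second loop is reported, which usually gives a noticeably lower (and more representative) figure than a cold start.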
You received this message because you are subscribed to the Google Groups "jackson-user" group.
Hi Tatu, thanks for the extensive reply. This really helps. However, the problem I face is that I do not have the POJOs for this JSON; I simply get the JSON string from an API. As for your question about the generation of the strings, that is taken care of by the API. So, in short, the API fires every second and sends me 70k JSON strings with different values, which is why I have to parse each and every document and fetch the values. It's like sitting at the server for a bunch of very busy stores across the continent on Thanksgiving day: every time an item is billed, I get the JSON and have to process it. So it seems the only option left for me is to write the code using the streaming API, right?
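(If the streaming API is the route, a minimal sketch of what that looks like with jackson-core's JsonParser: walk the token stream and pull out only the fields you need, never building a tree. The class name, the "total" field, and the payload shape here are all invented for illustration; the real documents are the deeply nested ones from the API.)

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

public class StreamingDemo {
    static final JsonFactory FACTORY = new JsonFactory(); // reuse one factory for all parses

    // Pull a single numeric field out of a document without building a JsonNode tree.
    // Note: this matches a field named "total" at any nesting depth.
    static double extractTotal(String json) throws Exception {
        double total = 0;
        JsonParser p = FACTORY.createParser(json);
        try {
            while (p.nextToken() != null) {
                if (p.getCurrentToken() == JsonToken.FIELD_NAME
                        && "total".equals(p.getCurrentName())) {
                    p.nextToken();              // advance from the field name to its value
                    total = p.getDoubleValue();
                }
            }
        } finally {
            p.close();
        }
        return total;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"store\":\"s1\",\"total\":42.5,\"items\":[{\"sku\":\"a\",\"qty\":2}]}";
        System.out.println("total = " + extractTotal(json));
    }
}
```

The win over readTree is that nothing is allocated for the parts of the document you skip, which matters when only a handful of fields out of a ~6 KB payload are actually needed.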
And yes, I'm using a multicore system, and the strings would be sent to different threads to process. This is just for testing purposes.
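(On the multithreading point: a fully configured ObjectMapper is thread-safe, so one instance can be shared by all worker threads. A sketch of fanning payloads out over a fixed pool, where the class name, the "txn" field, and the payload counts are invented for illustration:)

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.atomic.AtomicLong;

public class ParallelParse {
    // One shared, configured mapper: safe to use concurrently from many threads.
    static final ObjectMapper MAPPER = new ObjectMapper();

    // Parse every payload on a fixed-size pool and return the sum of their "txn" fields.
    static long parseAll(List<String> payloads, int nThreads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(nThreads);
        final AtomicLong sum = new AtomicLong();
        List<Future<?>> futures = new ArrayList<Future<?>>();
        for (final String json : payloads) {
            futures.add(pool.submit(new Runnable() {
                public void run() {
                    try {
                        JsonNode root = MAPPER.readTree(json);
                        sum.addAndGet(root.get("txn").asLong());
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }));
        }
        for (Future<?> f : futures) {
            f.get(); // wait for completion and surface any failures
        }
        pool.shutdown();
        return sum.get();
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for the 70k per-second payloads; the real shape comes from the API.
        List<String> payloads = new ArrayList<String>();
        for (int i = 0; i < 70000; i++) {
            payloads.add("{\"txn\":" + i + ",\"amount\":9.99}");
        }
        int cores = Runtime.getRuntime().availableProcessors();
        System.out.println("sum of txn ids = " + parseAll(payloads, cores));
    }
}
```

Since parsing independent strings is embarrassingly parallel, throughput should scale close to linearly with cores here; a pool sized to the core count is a reasonable starting point.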
On Thursday, October 30, 2014 3:53:59 PM UTC+5:30, gangum wrote:
Tatu, first of all, thanks for your interest in my post; this is really helpful. :) Well, it's just that I'm short on time for delivery, and parsing the JSON string is the only option I have right now. Another thing that adds to the complexity is that this JSON is created by a utility that generates Apache Avro-compliant JSON. The streaming API seems a bit of a pain to write, as the nesting in my JSON is very deep, and I was planning to avoid that effort, but I guess that's what I have to do for now. Maybe later I'll get hold of the Avro-generated classes, and that could make my life easier. :)
Still, where can I find the benchmarking classes? I'm sure there must be some performance tests for strings and byte arrays of various sizes. What was the optimal performance using ObjectMapper for strings? I'm really curious whether what I'm trying to achieve is possible at all, because I tested other APIs as well and none of them were able to give the TPS I need. Not sure if I'm headed in the right direction... :(