How to save runtime code coverage data to cloud-based storage


yutia...@snapchat.com

May 11, 2017, 8:57:56 PM
to JaCoCo and EclEmma Users
I'm working on a project to collect code coverage for end-to-end tests of our server-side code. We used to use Cobertura for this, but we are interested in moving to JaCoCo.

Our service runs on Google App Engine, which brings a lot of restrictions: no file read/write and no socket servers are allowed. These limitations leave only one way to get the test result file out, which is Google Cloud Storage; the idea is to send the coverage data as a byte array to storage. The Cobertura GitHub wiki provides a Java reflection method that collects the data, and we used a servlet to trigger that collection and transmission.

Since Google App Engine does not allow a Java agent, we are forced to use offline instrumentation. I just tried the JaCoCo Ant task and got the instrumented classes, but I wonder if there is something similar we could do with JaCoCo to trigger data collection and transmission from a servlet.

Marc R. Hoffmann

May 12, 2017, 1:53:42 AM
to jac...@googlegroups.com
Hi,

JaCoCo offers a runtime API for this:
http://www.jacoco.org/jacoco/trunk/doc/api/org/jacoco/agent/rt/package-summary.html

With

byte[] execdata = RT.getAgent().getExecutionData(false);

you can get the execution data within App Engine and transfer/store
it by an appropriate mechanism.

If you get JaCoCo running with Google App Engine, please let us know!

Regards,
-marc

yutia...@snapchat.com

May 13, 2017, 6:36:05 PM
to JaCoCo and EclEmma Users
Hi Marc,

Thanks for your suggestion! I tried getExecutionData and successfully ran the pipeline: offline instrument -> deploy to GAE -> collect data. I got the .exec file in Google Cloud Storage, but I get:
Caused by: java.io.IOException: Invalid execution data file.
at org.jacoco.core.data.ExecutionDataReader.read(ExecutionDataReader.java:89)
at org.jacoco.core.tools.ExecFileLoader.load(ExecFileLoader.java:59)
at org.jacoco.ant.ReportTask.loadExecutionData(ReportTask.java:514)
... 25 more
when I try to generate the report from the exec file. The file is sent through Google's API; I write the byte[] to the output stream and save it as a .exec file. I'm not sure whether the invalid file is due to this byte[]-to-file save process. With Cobertura we used to save the data file by writing the whole data object, and it could be parsed after download.

Yutian Song

Marc R. Hoffmann

May 14, 2017, 11:35:33 AM
to jac...@googlegroups.com
Hi Yutian,

the byte[] content is exactly what JaCoCo expects as an exec file.
Please double-check that you write the byte content as is, without any
modifications or additions. E.g. with standard Java APIs:

java.io.OutputStream.write(byte[])
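As a minimal sketch using only standard Java APIs, writing the array verbatim on the machine where the report is later built (App Engine itself forbids file writes) and verifying the round trip could look like this; the temp-file name and the placeholder bytes are just examples:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

public class WriteExecBytes {
    public static void main(String[] args) throws IOException {
        // Placeholder for the byte[] returned by the agent.
        byte[] execData = {0x01, (byte) 0xC0, (byte) 0xC0, 0x10, 0x07};

        // Write the array exactly as received: no wrapper stream that
        // could add its own headers or framing.
        Path target = Files.createTempFile("coverage", ".exec");
        Files.write(target, execData);

        // The file on disk must be byte-for-byte identical to the array.
        byte[] readBack = Files.readAllBytes(target);
        System.out.println(Arrays.equals(execData, readBack));
    }
}
```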

Maybe you can show the snippet of how you write the data?

Also the stacktrace should give some more details in "Caused by ..."
further down.

Regards,
-marc

yutia...@snapchat.com

May 14, 2017, 1:03:14 PM
to JaCoCo and EclEmma Users
Hi Marc,

Thanks for your reply on the weekend! Here is the code I use to send data to Google Cloud Storage:
GcsService gcsService = GcsServiceFactory.createGcsService(RetryParams.getDefaultInstance());
GcsFilename gcsFileName = new GcsFilename(bucketName, fileName);
GcsFileOptions.Builder fileOptionsBuilder = new GcsFileOptions.Builder();

GcsFileOptions fileOptions = fileOptionsBuilder.build();
GcsOutputChannel outputChannel = gcsService.createOrReplace(gcsFileName, fileOptions);
ObjectOutputStream outStream = new ObjectOutputStream(Channels.newOutputStream(outputChannel));
final ExecutionDataWriter writer = new ExecutionDataWriter(outStream);
log.info("start to write to GCS");
data.collect(writer, writer, reset);
log.info("finish write to gcs");

The GCS library is Google's release from Maven Central:
<dependencies>
  <dependency>
    <groupId>com.google.appengine.tools</groupId>
    <artifactId>appengine-gcs-client</artifactId>
    <version>0.5</version>
  </dependency>
</dependencies>

I don't know how to attach the data file here, but it is only 2 KB in size, which makes me think I may not be using it correctly. Although getExecutionData didn't produce an error message, the agent seems not to have started, based on what I receive. I will try the reflection method to see whether it is a cross-module dump issue, since the trigger for getting the data is in a dependency module's servlet.

Yutian Song

Marc R. Hoffmann

May 14, 2017, 1:11:50 PM
to jac...@googlegroups.com
Hi,

first of all, the ObjectOutputStream must not be used. It adds its own
headers, which will definitely corrupt the exec file. Just write the data
to a plain OutputStream:

IAgent agent = RT.getAgent();
byte[] data = agent.getExecutionData(false);

<Google Stuff>

OutputStream outStream = Channels.newOutputStream(outputChannel);
outStream.write(data);
outStream.close();
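The corruption Marc describes can be demonstrated with standard Java APIs alone: ObjectOutputStream writes a serialization stream header (0xAC 0xED) before any payload, so the stored bytes no longer start with the content that was written. A small sketch, with placeholder payload bytes:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;

public class StreamHeaderDemo {
    public static void main(String[] args) throws IOException {
        byte[] payload = {0x01, (byte) 0xC0, (byte) 0xC0}; // placeholder payload

        // Plain OutputStream: the bytes land in the output verbatim.
        ByteArrayOutputStream plain = new ByteArrayOutputStream();
        plain.write(payload);

        // ObjectOutputStream: a serialization header precedes the payload.
        ByteArrayOutputStream wrapped = new ByteArrayOutputStream();
        ObjectOutputStream oos = new ObjectOutputStream(wrapped);
        oos.write(payload);
        oos.flush();

        System.out.println("plain starts with payload: "
                + (plain.toByteArray()[0] == payload[0]));
        System.out.println("wrapped starts with payload: "
                + (wrapped.toByteArray()[0] == payload[0]));
    }
}
```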

Regards,
-marc

yutia...@snapchat.com

May 15, 2017, 1:48:56 PM
to JaCoCo and EclEmma Users
Hi Marc,

I have changed the code and also tried reflection to trigger the dump, but the exec file stays at 1.91 KB and is not readable. I think the size may tell us something: in the past, our JaCoCo exec files were at least 300 KB. The current one suggests that even though the agent call finds a running JaCoCo agent, the data it returns does not match what we planned to collect (via the Ant instrumentation). Is there anything I can use to inspect the data at runtime, like printing something, so I can verify it quickly?

Regards,

Yutian Song

Marc R. Hoffmann

May 15, 2017, 2:12:09 PM
to jac...@googlegroups.com
Hi Yutian,

what value does data.length have on the deployed app? Is it exactly the
length you see later in the written exec file? Can you please post the new
version of your code which retrieves the execution data from the agent
and writes it as a GAE resource?

It is possible to dump the data content within App Engine using
JaCoCo APIs, see this example:

http://www.jacoco.org/jacoco/trunk/doc/examples/java/ExecDump.java

Just wrap the data array in a ByteArrayInputStream instead of using a
FileInputStream.
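If pulling the JaCoCo core jar into the app is inconvenient, a cruder sanity check is possible with standard APIs alone: assuming the current binary format written by ExecutionDataWriter, a valid exec file begins with a 0x01 block marker followed by the 0xC0C0 magic number. A hypothetical helper:

```java
public class ExecFileCheck {
    // A JaCoCo exec file begins with the block-header byte 0x01
    // followed by the two magic bytes 0xC0 0xC0 (then a format version).
    static boolean looksLikeExecFile(byte[] data) {
        return data.length >= 3
                && data[0] == 0x01
                && data[1] == (byte) 0xC0
                && data[2] == (byte) 0xC0;
    }

    public static void main(String[] args) {
        byte[] validHeader = {0x01, (byte) 0xC0, (byte) 0xC0, 0x10, 0x07};
        byte[] serialized = {(byte) 0xAC, (byte) 0xED, 0x00, 0x05};
        System.out.println(looksLikeExecFile(validHeader));
        System.out.println(looksLikeExecFile(serialized));
    }
}
```

A file that fails this check (for example, one that starts with Java serialization magic) will be rejected by the report task with "Invalid execution data file".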

But as said before: the problem in your case is the transport of the
data array to the server where you create the report.

Also I asked you to show the full stack trace (incl. "Caused By") so we
might see what is broken here.

Cheers,
-marc

yutia...@snapchat.com

May 15, 2017, 4:34:54 PM
to JaCoCo and EclEmma Users
Hi Marc,

Thanks for your reply! I will try it and send you the detailed information as soon as I have it. I'm a little busy today, but I will have an update for you tomorrow!

Yutian Song

yutia...@snapchat.com

May 16, 2017, 5:23:42 PM
to JaCoCo and EclEmma Users
Hi Marc,

I have valid data now! I think I found the issue: the previous trigger for the getExecutionData method was in a dependency module, and because I hadn't instrumented that module's compiled library, the agent I got belonged to the dependency module. The Ant report task is configured to use the target module's class files, so there was a class-file mismatch, which resulted in the invalid data file error. The issue went away when I put the trigger for dumping data inside the target module.

I think the current issue is resolved, but I still need to look at how to get offline instrumentation working on the dependency modules, since many of our services depend on various common modules.

Thank you so much for the patient explanation and your weekend time!

Yutian Song

Marc R. Hoffmann

May 16, 2017, 5:25:26 PM
to jac...@googlegroups.com
Glad to hear that at least the exec file issue got resolved!

Cheers,
-marc