Our service runs on Google App Engine, which brings a lot of trouble: no file read/write allowed, no socket servers allowed. These limitations leave only one way to read the test result file: through Google Cloud Storage, i.e. we send the code coverage data somewhere as a byte array. The Cobertura GitHub wiki provides a Java reflection method that collects the data, and we tried using a servlet to trigger this collection and transmission.
Google App Engine does not allow attaching a Java agent, which forces us to use offline instrumentation. I just tried the Ant task for JaCoCo and got the instrumented classes, but I wonder if there is something similar we could do to trigger data collection and transmission in a servlet with JaCoCo.
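To make the question concrete, here is a minimal sketch of the kind of servlet trigger I have in mind. It assumes jacocoagent.jar is on the runtime classpath next to the offline-instrumented classes; RT.getAgent() and IAgent.getExecutionData(boolean) are JaCoCo's agent runtime API, while the servlet itself is made up:

import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.jacoco.agent.rt.IAgent;
import org.jacoco.agent.rt.RT;

// Hypothetical servlet that dumps the coverage data collected by the
// offline-instrumented classes. Requires jacocoagent.jar on the classpath.
public class CoverageDumpServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        IAgent agent = RT.getAgent();
        // true = reset the probes after dumping
        byte[] execData = agent.getExecutionData(true);
        resp.setContentType("application/octet-stream");
        resp.getOutputStream().write(execData);
    }
}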
Thanks for your suggestion! I tried getExecutionData and successfully ran offline instrument -> deploy to GAE -> collect data. I got the .exec file in Google Cloud Storage, but I get:
Caused by: java.io.IOException: Invalid execution data file.
at org.jacoco.core.data.ExecutionDataReader.read(ExecutionDataReader.java:89)
at org.jacoco.core.tools.ExecFileLoader.load(ExecFileLoader.java:59)
at org.jacoco.ant.ReportTask.loadExecutionData(ReportTask.java:514)
... 25 more
when I try to generate the report from the exec file. The file is sent through Google's API: I write the byte[] to an output stream and save it as a .exec file. I'm not sure whether the invalid file comes from this byte[]-to-file save process. We used to save the Cobertura data file by writing out the whole data object, and it could still be parsed after download.
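My understanding is that the byte[] returned by getExecutionData() is already in the .exec file format, so it has to be written to the file unchanged; any extra framing (for example Java serialization via ObjectOutputStream) would corrupt the header. A minimal sketch of what I mean, with a made-up helper name:

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Hypothetical helper: writes the dump bytes verbatim, with no extra framing,
// so the .exec header stays intact for the JaCoCo report task.
public final class ExecFileSaver {
    public static void save(byte[] execData, String path) throws IOException {
        OutputStream out = new FileOutputStream(path);
        try {
            out.write(execData);
        } finally {
            out.close();
        }
    }
}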
Yutian Song
Thanks for your reply at the weekend! Here is the code I used to send data to Google Cloud Storage:
// Imports: java.io.OutputStream, java.nio.channels.Channels,
// com.google.appengine.tools.cloudstorage.*, org.jacoco.core.data.ExecutionDataWriter
GcsService gcsService = GcsServiceFactory.createGcsService(RetryParams.getDefaultInstance());
GcsFilename gcsFileName = new GcsFilename(bucketName, fileName);
GcsFileOptions fileOptions = new GcsFileOptions.Builder().build();
GcsOutputChannel outputChannel = gcsService.createOrReplace(gcsFileName, fileOptions);
// ExecutionDataWriter needs the raw OutputStream; an extra ObjectOutputStream
// wrapper would add Java-serialization header bytes that corrupt the .exec format.
OutputStream outStream = Channels.newOutputStream(outputChannel);
final ExecutionDataWriter writer = new ExecutionDataWriter(outStream);
log.info("start to write to GCS");
data.collect(writer, writer, reset);
// Close the stream so the channel is flushed and the object is finalized in GCS.
outStream.close();
log.info("finish write to GCS");
The GCS library is Google's release from Maven Central:
<dependencies>
  <dependency>
    <groupId>com.google.appengine.tools</groupId>
    <artifactId>appengine-gcs-client</artifactId>
    <version>0.5</version>
  </dependency>
</dependencies>
I don't know how to attach the data file here, but it is only about 2 KB in size, which makes me think I may not be using it correctly. Although getExecutionData didn't return an error, the agent doesn't seem to have started, based on what I receive. I will try the reflection method to see whether this is a cross-module dump issue, since the trigger for getting the data lives in a dependency module's servlet.
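The reflection variant I plan to try is essentially this (a sketch; RT.getAgent() and getExecutionData(boolean) are the real agent runtime API, the wrapper class is made up):

// Reflection-based dump, so the triggering module needs no compile-time
// dependency on the JaCoCo agent jar.
public final class ReflectiveDump {
    public static byte[] dump() throws Exception {
        Object agent = Class.forName("org.jacoco.agent.rt.RT")
                .getMethod("getAgent")
                .invoke(null);
        return (byte[]) agent.getClass()
                .getMethod("getExecutionData", boolean.class)
                .invoke(agent, true); // true = reset probes after dumping
    }
}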
Yutian Song
I have changed the code and also tried reflection to trigger the dump, but the exec file stays at 1.91 KB and is still not readable. I think the size may tell us something: when we had working JaCoCo reports, the exec file was at least 300 KB. The current one suggests that even though getExecutionData finds a running JaCoCo agent, the data it returns doesn't match what we planned to collect (via the Ant instrumentation). Is there anything I can use to track the data at runtime, like printing something, so I can verify it quickly?
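One quick check I'm thinking of (a sketch using JaCoCo's ExecFileLoader; the class name is made up) is to load the .exec file locally and print which classes it actually recorded, to see whether they belong to the module we instrumented:

import java.io.File;
import java.io.IOException;
import org.jacoco.core.data.ExecutionData;
import org.jacoco.core.tools.ExecFileLoader;

// Prints the class names recorded in a .exec file. An empty or wrong-module
// list would confirm the dump came from a different set of instrumented
// classes than the report expects.
public final class ExecFileInspector {
    public static void main(String[] args) throws IOException {
        ExecFileLoader loader = new ExecFileLoader();
        loader.load(new File(args[0]));
        for (ExecutionData data : loader.getExecutionDataStore().getContents()) {
            System.out.println(data.getName());
        }
    }
}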
Regards,
Yutian Song
Thanks for your reply! I will try it and send you the detailed information as soon as I have it. I'm a little busy today, but I will have an update for you tomorrow!
Yutian Song
I got valid data now! I think I found the issue: the previous trigger for the getExecutionData method was inside a dependency module, and because I hadn't instrumented that module's compiled classes, the agent I reached belonged to the dependency module. The Ant report task is configured to use the target module's class files, so there was a class-file mismatch, which produced the invalid data file error. The issue went away once I put the trigger for dumping the data inside the target module.
I think the current issue is resolved, but I still need to look at how to get offline instrumentation working on the dependency modules, since a lot of our services depend on various common modules.
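For instrumenting the dependency modules, I guess the Ant instrument task can simply be pointed at each module's classes, something like this sketch (paths and module names are illustrative; jacocoant.jar per the JaCoCo docs):

<taskdef uri="antlib:org.jacoco.ant"
         resource="org/jacoco/ant/antlib.xml"
         classpath="lib/jacocoant.jar"/>

<!-- Instrument the target module and its common dependency; the instrumented
     classes (plus jacocoagent.jar) then go on the runtime classpath, so a
     single dump covers all modules. -->
<jacoco:instrument destdir="target/classes-instrumented"
                   xmlns:jacoco="antlib:org.jacoco.ant">
  <fileset dir="target/classes" includes="**/*.class"/>
  <fileset dir="../common-module/target/classes" includes="**/*.class"/>
</jacoco:instrument>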
Thank you so much for the patient explanation and your weekend time!
Yutian Song