Testing large file downloads.


Aris Green

unread,
Jun 8, 2012, 12:10:44 PM6/8/12
to gat...@googlegroups.com
I am new to Scala and Gatling.  I have gone over the examples and some introductions to Scala.  I am trying to determine whether Gatling could be used for testing involving large file downloads.  Our current tests hash files (~50 to 200 MiB) as they are downloaded.  The file is not saved, but the hash is validated.  This will not work if the HTTP request in the Gatling test buffers the entire response before the user can access it.  We need to access the response in chunks using a stream.

Rather than spending a week learning Scala and Gatling, I wanted to see if this was possible.  If not, I'll try The Grinder.  The current load test we use is just hand-coded C# that is not run under any tool.  I want to move toward an open source tool.

Stéphane Landelle

unread,
Jun 8, 2012, 12:22:39 PM6/8/12
to gat...@googlegroups.com
Hi,

We hadn't thought of this use case, so no, it's not doable with the current API.
AsyncHttpClient's AsyncHandler provides a callback where you get a byte array for each chunk as it arrives, so that's probably something that would suit your needs.
If you're not in a big hurry and can wait for a next version, we can give it some thought.

Cheers,

Stephane
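
The AsyncHandler idea above can be sketched with plain JDK classes. The callback names below only mirror that style and are not the real AsyncHttpClient API; the point is that each chunk updates a digest and is then discarded, so memory use stays constant regardless of file size.

```scala
import java.security.MessageDigest

// Minimal sketch of chunk-by-chunk hashing: each body part updates the
// digest and is then thrown away, so the full body is never in memory.
// Method names only mirror the AsyncHandler callback style; this is not
// the real AsyncHttpClient API.
final class StreamingHasher(algorithm: String = "MD5") {
  private val digest = MessageDigest.getInstance(algorithm)

  // Invoked once per received chunk.
  def onBodyPart(chunk: Array[Byte]): Unit = digest.update(chunk)

  // Invoked when the response completes; returns the hex checksum.
  def onCompleted(): String = digest.digest().map(b => f"$b%02x").mkString
}
```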


Stéphane Landelle

unread,
Jun 12, 2012, 1:53:38 AM6/12/12
to gat...@googlegroups.com
Hi there,

Just to let you know that I've pushed checksum check support to master, with md5 and sha1 built-ins:

I need to perform further tests, but this should be out in 1.2.2, to be released by the end of the week.

Cheers,

Stephane


Stephen Kuenzli

unread,
Jun 12, 2012, 9:59:27 PM6/12/12
to gat...@googlegroups.com
Awesome!  I was just thinking about how to implement this feature, and there it is.  I plan to use it.

Stephen



Stéphane Landelle

unread,
Jun 14, 2012, 10:11:23 AM6/14/12
to gat...@googlegroups.com
Just to let you know: Gatling 1.2.2 is out, so you can use those new checks now. Just go grab it!


Aris Green

unread,
Jun 27, 2012, 11:49:32 AM6/27/12
to gat...@googlegroups.com
I appreciate all the feedback.  Sorry I have not gotten back sooner.  I went ahead and started using The Grinder, but I realize that once I understand Scala better and know how to use Gatling, Gatling may have more to offer.  Essentially, what I needed to do was download a lot of files, hash them as they download, and not save them to disk.  I was checking the hash values against known correct values to test for success.  So the logic is this: download files from several processes and threads in chunks, updating a hash for each chunk, but not saving the chunk.  This keeps the whole file from being buffered into memory.  I bet it's possible to do in Scala; it's just that I already know Python, so Jython was not too hard.  In Scala, I'd need to do the downloads and hashing in a loop.

Many thanks,
Aris
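
The chunked download-and-hash loop described above can be sketched in Scala with JDK streams alone (any InputStream stands in for the HTTP response here, so this is a sketch of the idea, not a Grinder or Gatling script): a DigestInputStream updates the hash as bytes flow through, and the bytes themselves are discarded.

```scala
import java.io.InputStream
import java.security.{DigestInputStream, MessageDigest}

// Read a stream in fixed-size chunks, updating a digest per chunk and
// discarding the data, so at most one buffer's worth is ever in memory.
def hashStream(in: InputStream, algorithm: String = "MD5"): String = {
  val digest = MessageDigest.getInstance(algorithm)
  val din    = new DigestInputStream(in, digest)
  val buffer = new Array[Byte](8192)
  try {
    while (din.read(buffer) != -1) () // reading updates the digest
  } finally din.close()
  digest.digest().map(b => f"$b%02x").mkString
}
```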

Stéphane Landelle

unread,
Jun 27, 2012, 12:00:36 PM6/27/12
to gat...@googlegroups.com
Hello Aris,

As I explained, Gatling now supports checksum checks out of the box.
For example, you can write something like:

http("Downloading file").get("http://my-host/my-path").check(md5.is("0xA59E79AB53EEF2883D72B8F8398C9AC3"))

Of course, the checksum is updated as each file part is received, not computed against a fully buffered body.
Generally speaking, Gatling discards file parts unless you need them for something else, for example checking the whole body with a regex.

So in your case, as you only want to compute the checksum, file parts will indeed be discarded.

Cheers,

Stéphane
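
What a check like md5.is("...") has to do can be sketched with JDK classes only. This mirrors the behaviour described above (digest each part, drop it, compare the hex result with the expected value), not Gatling's actual implementation.

```scala
import java.security.MessageDigest

// Digest body parts one by one, discarding each, then compare the final
// hex string against an expected checksum, case-insensitively and
// allowing an optional "0x" prefix as in the example above.
def md5Matches(parts: Iterator[Array[Byte]], expectedHex: String): Boolean = {
  val digest = MessageDigest.getInstance("MD5")
  parts.foreach(digest.update) // each part is digested, then dropped
  val actual = digest.digest().map(b => f"$b%02x").mkString
  actual.equalsIgnoreCase(expectedHex.stripPrefix("0x"))
}
```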



Aris Green

unread,
Jun 28, 2012, 12:56:34 AM6/28/12
to gat...@googlegroups.com
Thanks, that is what I would have needed.  I guess I did not see that earlier.  I did see the note about not loading the whole response body into memory.

Stéphane Landelle

unread,
Jun 28, 2012, 2:12:30 AM6/28/12
to gat...@googlegroups.com
Hello Aris,

I hope you will have time to try Gatling someday.
Anyway, this feature was worth asking for, and other users are glad to have it, so thank you.

Regards,

Stéphane


Gaurav Seth

unread,
Aug 3, 2017, 5:05:36 AM8/3/17
to Gatling User Group, slan...@excilys.com
Hello Stéphane,

I am using Gatling to download large files, up to 10 GB, but I noticed I am only able to download files up to about 500 MB; anything over 1 GB gives an error. I am using an sbt project and have allocated 12 GB of memory:

val javaMemoryFlags = List("-Xmx12g")
javaOptions in GatlingKeys.Gatling := GatlingKeys.overrideDefaultJavaOptions(javaMemoryFlags: _*),

Below is the call. As part of the test I need to download the file to disk, but it cannot download files larger than about 500 MB. Can you please suggest something? I am able to upload huge files.

def downloadFile(): HttpRequestBuilder = {
  http("Download file")
    .get(s"$baseUrl/sdes/download/${fileToDownload}")
    .check(status.is(200), status.not(500))
    .extraInfoExtractor { extraInfo => List(getExtraInfo(extraInfo)) }
}

def getExtraInfo(extraInfo: ExtraInfo): Unit = {
  println("************** Printing response metrics for the file download **************")
  println("Request URL: " + extraInfo.request.getUrl)
  println("Has response body ------>: " + extraInfo.response.hasResponseBody)
  println("Response body length ----->: " + extraInfo.response.bodyLength)
  println("Response body is received ----->: " + extraInfo.response.isReceived)
  println("File downloaded in seconds ----->: " + extraInfo.response.responseTimeInMillis / 1000)
  println("Response status code ----->: " + extraInfo.response.statusCode)
  val downloadFileResponse: String = extraInfo.response.body.string
  val timestamp = new java.text.SimpleDateFormat("MM_dd_yyyy_h:mm:ss").format(new Date())
  val newDownloadFileName = s"${timestamp}_${fileToDownload}"
  println("New file name is ---> " + newDownloadFileName)
  writeToFile(downloadFileResponse, s"/tmp/download/${newDownloadFileName}")
}

def writeToFile(response: String, filePath: String): Unit = {
  val writer = new BufferedWriter(new OutputStreamWriter(new FileOutputStream(filePath)))
  try writer.write(response)
  catch { case e: IOException => e.printStackTrace() }
  finally writer.close()
}





java.lang.OutOfMemoryError: Java heap space

Dumping heap to java_pid4863.hprof ...
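
The heap dump above is consistent with the whole body being buffered: `extraInfo.response.body.string` materializes the entire response in memory before it is written out. A bounded-memory alternative, sketched below with JDK classes only (in the real test `in` would be the response stream), copies the stream to disk in fixed-size chunks:

```scala
import java.io.{FileOutputStream, InputStream}

// Copy a stream to a file in fixed-size chunks so at most one buffer's
// worth of the body is ever in memory, unlike response.body.string,
// which holds the entire body at once. Returns the number of bytes copied.
def streamToFile(in: InputStream, filePath: String): Long = {
  val out    = new FileOutputStream(filePath)
  val buffer = new Array[Byte](8192)
  var total  = 0L
  try {
    var read = in.read(buffer)
    while (read != -1) {
      out.write(buffer, 0, read)
      total += read
      read = in.read(buffer)
    }
  } finally {
    out.close()
    in.close()
  }
  total
}
```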



Regards
Gaurav