Fwd: Re: Parallel AES Project


Vijay Saraswat

Dec 6, 2010, 5:30:06 AM
to coms49...@googlegroups.com, amk...@columbia.edu, ppp...@columbia.edu
What is the error? What is the size? Are you trying to read the same
file at the same time from multiple places?

You should not see any such error...

Please copy the coms49...@googlegroups.com email list.

Best,
Vijay

Yes we definitely plan to run on multiple places. Currently we are
facing an issue in scaling up input size. As the plaintext size goes
beyond a certain limit, we get an error saying the file cannot be
opened. We suspect it has to do with some limit on the number of
streams opened on a handle and/or the number of handles opened on a file.

We intend to approach Prof Saraswat and you after class tomorrow.

Thanks,

Pranay

Quoting Martha Kim <mar...@cs.columbia.edu>:

> This is a single-place AES encrypter. If the single-place version is
> scaling well (is it?) you should move on to multiplace. When we ran it
> we saw 3x speedup for NTHREADS=8 over NTHREADS=1. You will want to use
> larger inputs, as right now the tests run in under a second. That is not
> a great deal of work to be parallelizing. Do you plan to run on multiple
> places? Once you have attained good single-place scaling you can move on
> in that direction. Be sure to use office hours for help if necessary.
>


Pranay Prabhakar

Dec 10, 2010, 6:57:11 PM
to Vijay Saraswat, mak...@columbia.edu, coms49...@googlegroups.com, amk...@columbia.edu

The error we are getting is:

x10.io.FileNotFoundException: <FileName>

We are getting this error for multiple files that already exist. The errors start appearing once the plaintext size goes beyond a certain value.

We are running our code on a single place. The only part of the code that is parallelized is also the part that throws the exception:

I am attaching the code snippet for your perusal, in the file parallelAESSnippet.txt

Thanks,

Pranay

parallelAESSnippet.txt


Vijay Saraswat

Dec 11, 2010, 6:22:39 AM
to coms49...@googlegroups.com
Pranay ---

Ensure you are not reading the same file in multiple asyncs simultaneously. This might cause a problem on some operating systems.

Does your code always fail, or does it work in some cases (e.g. when the file sizes are small)?

Pranay Prabhakar

Dec 11, 2010, 11:34:06 AM
to coms49...@googlegroups.com, Vijay Saraswat, mar...@cs.columbia.edu, kul.ab...@gmail.com

I apologize for the multiple copies of the previous email. Something
went wrong with the client.

Since we are running a block-encryption routine, we have one Plaintext
file, and the idea is to fetch data blocks and process them
independently. So reading the same file at different offsets is what
we are doing.
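
For concreteness, a hypothetical reconstruction of the pattern described above (the actual code is in the attached parallelAESSnippet.txt; names like fileName, numBlocks, BLOCK, and encryptBlock are placeholders, the syntax is approximate X10 2.x, and whether the reader exposes a skip method is also an assumption):

```x10
// Each async opens its own reader on the shared plaintext file and
// seeks to its block's offset. With many blocks this means many
// simultaneous open handles on one file, which can hit the process's
// open-file limit -- consistent with a FileNotFoundException that
// appears only once the plaintext (and block count) grows.
finish for (b in 0..(numBlocks-1)) async {
    val in = new File(fileName).openRead();
    in.skip(b * BLOCK);          // jump to this async's block
    // ... read BLOCK bytes and call encryptBlock(...) ...
    in.close();
}
```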

Also, we are getting the error only when the plaintext scales beyond
1024K. Up to that value, things work well.

The code is being run on Athos.

Thanks,
Pranay


Quoting Vijay Saraswat <vi...@saraswat.org>:

> Pranay ---
>
> Ensure you are not reading the same file in multiple asyncs
> simultaneously. This might cause a problem on some operating systems.
>
> Does your code always fail, or does it work in some cases (e.g. when
> the file sizes are small)?
>
> On 12/10/2010 7:00 PM, Pranay Prabhakar wrote:
>>

>> The error we are getting is: x10.io.FileNotFoundException: <FileName>

Vijay Saraswat

Dec 11, 2010, 12:51:41 PM
to coms49...@googlegroups.com
I don't think that will work. Read the entire file in, in a single async,
into an array. Then use multiple asyncs to process the array in parallel.
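
A minimal sketch of this structure, assuming X10 2.x syntax, with placeholder names (fileName, BLOCK, encryptBlock) for the project-specific parts and assuming a size() method on File:

```x10
import x10.io.File;

// One activity performs all the I/O: a single reader, a single handle.
val f = new File(fileName);
val in = f.openRead();
val n = f.size() as Int;              // assuming File.size() is available
val data = new Array[Byte](n);
for (i in 0..(n-1)) data(i) = in.readByte();
in.close();

// The asyncs then only touch the in-memory array, so no file handles
// are shared or multiplied across activities.
finish for (b in 0..(n/BLOCK - 1)) async {
    encryptBlock(data, b * BLOCK);    // each async works on its own block
}
```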

Pranay Prabhakar

Dec 11, 2010, 2:23:01 PM
to coms49...@googlegroups.com, Vijay Saraswat

Hello,

If the plaintext is on the order of gigabytes, won't this approach
overflow memory? Or should we read a small portion of the file into
memory at a time, in only one thread, process that portion, and repeat
this for the entire file?

Martha Kim

Dec 11, 2010, 2:47:10 PM
to coms49...@googlegroups.com, Vijay Saraswat
Hi Pranay,

Is it essential that you process GB files to get good speedups?  Can you process something smaller (so that Vijay's approach will work) and still show speedups?

If you must process a file that is multiple GB in size, you can work in stages, reading it in chunks that fit in memory and operating on each chunk as Vijay proposed.  Read the first chunk of the file into memory and operate on it in parallel with multiple asyncs.  When done, read in the next chunk and repeat.
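
A sketch of that staged loop, under the same assumptions as before (approximate X10 2.x syntax; fileName, CHUNK, BLOCK, encryptBlock, and totalSize are placeholders):

```x10
val in = new File(fileName).openRead();
val buf = new Array[Byte](CHUNK);
var remaining:Long = totalSize;
while (remaining > 0) {
    val len = Math.min(remaining, CHUNK as Long) as Int;
    for (i in 0..(len-1)) buf(i) = in.readByte();  // single reader, one handle
    finish for (b in 0..(len/BLOCK - 1)) async {   // parallelism within the chunk
        encryptBlock(buf, b * BLOCK);
    }
    remaining -= len;
}
in.close();
```

Memory use is bounded by CHUNK regardless of total file size, at the cost of a serial read phase between parallel phases.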

Martha