write to s3 gae example

cmin

Jun 4, 2009, 8:07:02 AM
to jclouds
I've been trying to use this library with S3, together with
commons-fileupload's FileItemStream. Here is an example:

FileItemIterator iter = fileUpload.getItemIterator(request);
Map<String, InputStream> smap = context.createInputStreamMap("bucketname");
while (iter.hasNext()) {
    FileItemStream item = iter.next();
    InputStream stream = item.openStream();
    if (!item.isFormField()) {
        smap.put("name", stream);
    }
    stream.close();
}
This is the exception I'm receiving. Note that I am testing this in my
local development environment.
[java] org.jclouds.aws.s3.S3ResponseException: S3Error{code='null', message='null', requestId='89303FB8A0273A88', requestToken='v8ylCBnCcelQThjOA9/fIeTu99eFUcoocV73OPzrjcj8nGKJTV31byvlFSmtiWrI', context='{}'}

Any help would be appreciated.
Thanks

Adrian Cole

Jun 4, 2009, 2:09:52 PM
to jcl...@googlegroups.com
Hello.

Sorry to hear you are having trouble. Can you please provide more details about the Java version and the version of the GAE SDK you are using?

Cheers,
-Adrian

cmin

Jun 4, 2009, 2:52:08 PM
to jclouds
Hi Adrian

Thanks for the reply. I'm using Java 1.5 on Mac, and the latest GAE
SDK, 1.2.1.

I saw in the GAE logs that it's getting:
<stderr>: [Fatal Error] :-1:-1: Premature end of file.

I will try switching from the commons-fileupload streaming API to the
more traditional DiskFileItemFactory implementation, and will report
back if that makes a difference. It could also be the Flash uploader
I'm using.

thanks

Adrian Cole

Jun 4, 2009, 3:16:42 PM
to jcl...@googlegroups.com
No problem.

Also, you need to ensure that you are creating your context properly
so that it uses the Google App Engine plugin:

------- from http://code.google.com/p/jclouds/wiki/Maven#Google_App_Engine_for_Java_Sample
S3 from Google App Engine

There is no change to the API in order to switch your code to use
URLFetchService from within GAE. However, you do have to configure
your connection differently:

S3Context context = S3ContextFactory.createS3Context(accesskeyid, secretkey,
        new URLFetchServiceClientModule());

Here are the dependencies needed to use Google's URLFetchService:

<dependency>
  <groupId>org.jclouds</groupId>
  <artifactId>jclouds-s3</artifactId>
  <version>1.0-beta-1</version>
</dependency>
<dependency>
  <groupId>org.jclouds</groupId>
  <artifactId>jclouds-gae</artifactId>
  <version>1.0-beta-1</version>
</dependency>
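
To tie these pieces together with the upload loop from the first message, something along these lines should work. This is only a sketch: the class name, the credential placeholders, and the choice of item.getName() as the key are illustrative, and the import packages (in particular where URLFetchServiceClientModule lives) are inferred, so adjust them to match the beta you are using.

import java.io.IOException;
import java.io.InputStream;
import java.util.Map;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.commons.fileupload.FileItemIterator;
import org.apache.commons.fileupload.FileItemStream;
import org.apache.commons.fileupload.FileUploadException;
import org.apache.commons.fileupload.servlet.ServletFileUpload;
import org.jclouds.aws.s3.S3Context;
import org.jclouds.aws.s3.S3ContextFactory;
import org.jclouds.gae.config.URLFetchServiceClientModule; // package assumed

public class S3UploadServlet extends HttpServlet {
    // placeholders: supply real credentials and bucket name
    private static final String ACCESS_KEY_ID = "accesskeyid";
    private static final String SECRET_KEY = "secretkey";
    private static final String BUCKET = "bucketname";

    @Override
    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // the GAE module routes requests through URLFetchService instead of opening sockets
        S3Context context = S3ContextFactory.createS3Context(ACCESS_KEY_ID, SECRET_KEY,
                new URLFetchServiceClientModule());
        Map<String, InputStream> smap = context.createInputStreamMap(BUCKET);
        try {
            FileItemIterator iter = new ServletFileUpload().getItemIterator(request);
            while (iter.hasNext()) {
                FileItemStream item = iter.next();
                if (!item.isFormField()) {
                    InputStream stream = item.openStream();
                    try {
                        // key each upload by its file name rather than a constant
                        smap.put(item.getName(), stream);
                    } finally {
                        stream.close();
                    }
                }
            }
        } catch (FileUploadException e) {
            throw new ServletException(e);
        }
    }
}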

cmin

Jun 4, 2009, 6:38:52 PM
to jclouds
Hi Adrian

I have included that, as well as the log4j module.
The Google documentation, http://code.google.com/appengine/kb/java.html#fileforms,
recommends the streaming API.
Can you point me to a unit test or example of posting to S3?
The only example I've found just lists buckets.


Adrian Cole

Jun 5, 2009, 2:48:54 AM
to jcl...@googlegroups.com
Hello, again!

As I'm sure you've noticed, there is no sample that posts to a bucket
via GAE at the moment. I have created an issue [1] to have one made
available, since Google is advertising the method you mention as a
recommended practice.


In the meantime, here are related integration tests:
http://code.google.com/p/jclouds/source/browse/tags/1.0-beta-1/s3/src/test/java/org/jclouds/aws/s3/commands/PutObjectIntegrationTest.java
http://code.google.com/p/jclouds/source/browse/tags/1.0-beta-1/s3/src/test/java/org/jclouds/aws/s3/S3InputStreamMapTest.java
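
For a quick local smoke test of the same path those tests exercise, the InputStreamMap can also be driven outside a servlet. Again just a sketch built from the calls already shown in this thread; the two-argument createS3Context overload, the import packages, and the credentials/bucket are assumptions:

import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.util.Map;

import org.jclouds.aws.s3.S3Context;
import org.jclouds.aws.s3.S3ContextFactory;

public class PutObjectSmokeTest {
    public static void main(String[] args) throws Exception {
        // plain context here; pass new URLFetchServiceClientModule() when running on GAE
        S3Context context = S3ContextFactory.createS3Context("accesskeyid", "secretkey");
        Map<String, InputStream> map = context.createInputStreamMap("bucketname");

        // put a small object, then verify it is visible through the same map
        map.put("hello.txt", new ByteArrayInputStream("hello jclouds".getBytes()));
        System.out.println("uploaded? " + map.containsKey("hello.txt"));
    }
}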

I hope this helps!
-Adrian
[1] http://code.google.com/p/jclouds/issues/detail?id=50

Adrian Cole

Jun 7, 2009, 6:33:23 AM
to jcl...@googlegroups.com
Update on this issue.  After some digging, it seems there is a problem with GAE manipulating the Host header of requests.  This would make most requests fail, and is pretty severe.  I'm currently working around this issue [1] and will cut a new beta shortly thereafter.

Regards,
-Adrian
[1] http://code.google.com/p/jclouds/issues/detail?id=52

Adrian Cole

Jun 7, 2009, 10:51:41 PM
to jcl...@googlegroups.com, mina...@gmail.com
I have deployed a new snapshot (1.0-SNAPSHOT) which includes a major workaround to a GAE limitation.  This will give you a chance to test this while we prepare to cut a new beta.

In the meantime, we are going to reproduce your use case in an integration test so that we can ensure this always works moving forward [1].

Thanks for your patience.
-Adrian

[1] http://code.google.com/p/jclouds/issues/detail?id=50

cmin

Jun 7, 2009, 11:02:36 PM
to jclouds
Hi Adrian

Thanks for the update. I will grab the latest and let you know.
I appreciate it.


cmin

Jun 8, 2009, 4:02:01 AM
to jclouds
Hi Adrian

I tried out the latest snapshot and have verified that the upload works
in the hosted environment, but I am receiving a timeout exception in my
local dev environment. The stack trace follows:

[java] java.io.IOException: http method PUT against URL https://<host>.amazonaws.com:443/image timed out.
[java] at com.google.appengine.api.urlfetch.URLFetchServiceImpl.handleApplicationException(URLFetchServiceImpl.java:56)
[java] at com.google.appengine.api.urlfetch.URLFetchServiceImpl.fetch(URLFetchServiceImpl.java:30)
[java] at org.jclouds.gae.URLFetchServiceClient.submit(URLFetchServiceClient.java:90)
[java] at org.jclouds.gae.URLFetchServiceClient.submit(URLFetchServiceClient.java:60)
[java] at org.jclouds.aws.s3.internal.LiveS3Connection.putObject(LiveS3Connection.java:133)
[java] at org.jclouds.aws.s3.internal.LiveS3Connection.putObject(LiveS3Connection.java:123)
[java] at org.jclouds.aws.s3.internal.LiveS3InputStreamMap.putInternal(LiveS3InputStreamMap.java:261)
[java] at org.jclouds.aws.s3.internal.LiveS3InputStreamMap.put(LiveS3InputStreamMap.java:245)
[java] at org.jclouds.aws.s3.internal.LiveS3InputStreamMap.put(LiveS3InputStreamMap.java:49)


Any help is appreciated.

Adrian Cole

Jun 8, 2009, 4:37:52 AM
to jcl...@googlegroups.com, jclou...@googlegroups.com
Good to hear about the hosted environment! On the dev environment, I've noticed that the SDK uses commons httpclient.

You could check this out by turning up some debug logging: http://hc.apache.org/httpclient-3.x/logging.html

What size is the image? What happens if you use non-SSL instead?

S3ContextFactory.createContext(AWSAccessKeyId, AWSSecretAccessKey)
        .withHttpSecure(false).withHttpPort(80);
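
If you go the debug route, the httpclient 3.x logging page above amounts to setting a few system properties before any request is made; roughly the following in your dev-mode bootstrap (SimpleLog shown here, use whatever commons-logging backend you already have):

// enable wire-level logging for commons httpclient 3.x
System.setProperty("org.apache.commons.logging.Log",
        "org.apache.commons.logging.impl.SimpleLog");
System.setProperty("org.apache.commons.logging.simplelog.showdatetime", "true");
System.setProperty("org.apache.commons.logging.simplelog.log.httpclient.wire", "debug");
System.setProperty("org.apache.commons.logging.simplelog.log.org.apache.commons.httpclient",
        "debug");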

Best of luck,
-Adrian

cmin

Jun 8, 2009, 5:42:02 AM
to jclouds
Hi Adrian

It appears the timeout may be related to this:

http://code.google.com/p/googleappengine/issues/detail?id=1358

As of now the timeout can't be set, which makes it very difficult to
test against S3.
Thanks again.
