Precomputed data served by Amazon S3


tahar...@gmail.com

Aug 28, 2017, 9:12:38 AM
to Neuroglancer
Hi

I asked a few weeks ago about serving precomputed data stored on localhost; you gave me valuable advice and it works.

I am now trying to put my data on Amazon S3 (public access), but I am facing some issues:

- The data is served by Amazon S3 with public access:


- The data works well on localhost but not on Amazon S3.

- I added this statement to the bucket policy:

        {
            "Sid": "AllowPublicRead",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::elasticbeanstalk-eu-west-1-945165447763/*"
        }

- I have this CORS configuration:

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>Authorization</AllowedHeader>
</CORSRule>
<CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>PUT</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>Content-Type</AllowedHeader>
    <AllowedHeader>x-amz-acl</AllowedHeader>
    <AllowedHeader>origin</AllowedHeader>
</CORSRule>
</CORSConfiguration>

- Neuroglancer is able to retrieve the info file from the S3 server, but it is unable to download the chunk files. When I inspect the HTTP responses I get many 404 Not Found errors like this one:


In fact I am not sure, but in order to make it work it would be better to have:

Thank you a lot for your help.

Tahar

Jeremy Maitin-Shepard

Aug 28, 2017, 2:11:25 PM
to tahar...@gmail.com, Neuroglancer
If I understand correctly, you have uploaded the chunks as:


but Neuroglancer is expecting them as:


The solution would be to simply rename the files.

Currently there isn't support for customizing the URL format for retrieving the chunks; that could be a useful feature if you are interested in implementing it. The same issue should occur when serving them from localhost, though.
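For reference, the naming the precomputed backend requests can be sketched as follows. `chunkKey` mirrors the path construction in backend.ts; `underscoreKey` is a hypothetical helper (not part of Neuroglancer) for renaming keys from a slash-separated, gzip-suffixed layout into that format:

```typescript
// Sketch of the chunk naming the precomputed backend requests:
//   <path>/<x0>-<x1>_<y0>-<y1>_<z0>-<z1>
// (mirrors src/neuroglancer/datasource/precomputed/backend.ts)
function chunkKey(path: string, chunkPosition: number[], chunkDataSize: number[]): string {
  const ranges = [0, 1, 2].map(
      i => `${chunkPosition[i]}-${chunkPosition[i] + chunkDataSize[i]}`);
  return `${path}/${ranges.join('_')}`;
}

// Hypothetical rename helper: convert a key laid out as
//   raw/112nm/1024-1536/512-1024/0-2.gz
// into the name Neuroglancer asks for:
//   raw/112nm/1024-1536_512-1024_0-2
function underscoreKey(slashKey: string): string {
  const parts = slashKey.split('/');
  const [x, y, z] = parts.slice(-3);
  return [...parts.slice(0, -3), [x, y, z.replace(/\.gz$/, '')].join('_')].join('/');
}

// chunkKey('raw/112nm', [1024, 512, 0], [512, 512, 2])
//   -> 'raw/112nm/1024-1536_512-1024_0-2'
```

A bulk rename over an S3 bucket could then map each existing key through `underscoreKey` and copy the object to the new key.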

--
You received this message because you are subscribed to the Google Groups "Neuroglancer" group.
To unsubscribe from this group and stop receiving emails from it, send an email to neuroglancer+unsubscribe@googlegroups.com.
To post to this group, send email to neurog...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/neuroglancer/4ed9c5a5-9467-4b77-bb5e-a93a6fedef83%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

tahar...@gmail.com

Aug 29, 2017, 4:10:25 AM
to Neuroglancer, tahar...@gmail.com
Hi

Thanks a lot for this advice. I will try this solution. 

Thanks,

Tahar

tahar...@gmail.com

Aug 30, 2017, 4:15:11 AM
to Neuroglancer, tahar...@gmail.com
Hi, thanks for your advice. In fact I think I am facing a deeper problem; please let me explain.

- From localhost, I am perfectly able to retrieve the chunks and visualize them without any problem. I follow this layout:

Root/Path/raw/col/slices.gz, e.g. Root/raw/112nm/1024-1536/512-1024/0-2.gz

But when I try to do the same with my Amazon bucket I get this error:


So I have been debugging https://github.com/google/neuroglancer/blob/master/src/neuroglancer/datasource/precomputed/backend.ts and found this at line 46:

path = `${parameters.path}/${chunkPosition[0]}-${chunkPosition[0] + chunkDataSize[0]}_` +
    `${chunkPosition[1]}-${chunkPosition[1] + chunkDataSize[1]}_` +
    `${chunkPosition[2]}-${chunkPosition[2] + chunkDataSize[2]}`;


So I changed the pattern to match my layout:

path = `${parameters.path}/${chunkPosition[0]}-${chunkPosition[0] + chunkDataSize[0]}/` +
    `${chunkPosition[1]}-${chunkPosition[1] + chunkDataSize[1]}/` +
    `${chunkPosition[2]}-${chunkPosition[2] + chunkDataSize[2]}.gz`;

I just replaced "_" with "/" and added ".gz" at the end.

- Neuroglancer with local data crashed with this error:

Error retrieving chunk precomputed:volume:http://localhost:8085/precomputed/raw//112nm:4,2,0: Error: Raw-format chunk is 374149 bytes, but 524288 * 1 = 524288 bytes are expected. 

This error appears both for data served from the AWS bucket and from localhost, but at least it is able to find the chunks.


Thanks a lot for your support, 

Tahar


On Monday, August 28, 2017 at 8:11:25 PM UTC+2, Jeremy Maitin-Shepard wrote:

tahar...@gmail.com

Aug 30, 2017, 8:26:28 AM
to Neuroglancer
Please let me explain better:

1. I renamed my files as you suggested (e.g., http://s3-eu-west-1.amazonaws.com/elasticbeanstalk-eu-west-1-945165447763/raw/112nm/1024-1536_512-1024_0-2).
1.a. With local data: the error is

 XMLHttpRequest cannot load http://localhost:8085/precomputed/raw_wo_underscore/28nm/5632-6144_4096-4608_0-2. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:8080' is therefore not allowed access. The response had HTTP status code 404.

1.b With data on Amazon: the error is 

Error retrieving chunk precomputed:volume:https://s3-eu-west-1.amazonaws.com/elasticbeanstalk-eu-west-1-945165447763/raw_wo_underscore//112nm:2,2,0: Error: Raw-format chunk is 412630 bytes, but 524288 * 1 = 524288 bytes are expected.

2. About your remark "The same issue should occur with serving them from localhost, though": actually no, it was working correctly with local data with the pattern raw/112nm/1024-1536/512-1024/0-2.gz.

3. I tried to modify https://github.com/google/neuroglancer/blob/master/src/neuroglancer/datasource/precomputed/backend.ts at line 46, from:

path = `${parameters.path}/${chunkPosition[0]}-${chunkPosition[0] + chunkDataSize[0]}_` +
    `${chunkPosition[1]}-${chunkPosition[1] + chunkDataSize[1]}_` +
    `${chunkPosition[2]}-${chunkPosition[2] + chunkDataSize[2]}`;

to:

path = `${parameters.path}/${chunkPosition[0]}-${chunkPosition[0] + chunkDataSize[0]}/` +
    `${chunkPosition[1]}-${chunkPosition[1] + chunkDataSize[1]}/` +
    `${chunkPosition[2]}-${chunkPosition[2] + chunkDataSize[2]}.gz`;

I just replaced "_" with "/" and added ".gz" at the end, since that worked well with local data, but:

- Neuroglancer with local data no longer works.
- Data served from AWS does not appear.

I get this error in both cases:

Error retrieving chunk precomputed:volume:http://localhost:8085/precomputed/raw//112nm:4,2,0: Error: Raw-format chunk is 374149 bytes, but 524288 * 1 = 524288 bytes are expected. 

This error appears both for data served from the AWS bucket and from localhost, but at least it is able to find the chunks.

Thanks a lot for your help, 

Tahar

Jeremy Maitin-Shepard

Aug 30, 2017, 2:11:23 PM
to Tahar Nguira, Neuroglancer
I'm a bit confused about how Neuroglancer is able to deal with data in the pattern raw/112nm/1024-1536/512-1024/0-2.gz.

Perhaps you are using some code external to the official neuroglancer project to support this?  Can you tell me the full URL in your browser when using neuroglancer with local data in that pattern, so that I can see what it is doing?

As far as this error: Error: Raw-format chunk is 374149 bytes, but 524288 * 1 = 524288 bytes are expected.

That is because the precomputed data source does not itself handle gzip encoding. However, if you set all of your files on S3 to have the following extra header:

Content-Encoding: gzip

then the browser will automatically gzip-decode them on the fly when they are retrieved, and it should work.
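To illustrate why the size check fails without that header: the browser hands Neuroglancer the raw gzip bytes, which are shorter than the 524288-byte chunk the info file promises. A small Node/TypeScript sketch using the built-in zlib module (the voxel contents here are just filler):

```typescript
import {gzipSync, gunzipSync} from 'zlib';

// A 512 x 512 x 2 uint8 chunk is 524288 bytes uncompressed.
const chunk = Buffer.alloc(512 * 512 * 2, 7);  // filler voxel data
const compressed = gzipSync(chunk);

// Without `Content-Encoding: gzip`, Neuroglancer receives `compressed`
// as-is, so its length check sees fewer bytes than expected:
console.log(compressed.length < chunk.length);  // true

// With the header set, the browser gunzips transparently and the decoder
// sees the full chunk:
console.log(gunzipSync(compressed).length);  // 524288
```

The header can be attached at upload time, e.g. with `aws s3 cp --content-encoding gzip ...`.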


Jeremy Maitin-Shepard

Aug 30, 2017, 2:15:58 PM
to Tahar Nguira, Neuroglancer
By the way, there is documentation of the precomputed data format here:

tahar...@gmail.com

Aug 31, 2017, 10:21:29 AM
to Neuroglancer, tahar...@gmail.com
Dear Jeremy, 

Yes, it is very confusing, so let me tell you how I did it:

cd neuroglancer
npm run dev-server  # runs a dev server on localhost:8080
docker run --name neuroglancer -v /home/finch/Documents/renders/neuroglancer/precomputed:/precomputed:ro -p 8085:80 -d ylep/neuroglancer
npm run build:watch

and then from the browser (Chrome):
http://localhost:8080
and then
precomputed://http://localhost:8085/precomputed/raw, and then everything works nicely.
Data layout: Root/raw/...nm/1024-1536/512-1024/0-2

Concerning Content-Encoding: gzip, thanks a lot for this suggestion. I will check it as soon as possible and let you know.

Best regards, 

Tahar


tahar...@gmail.com

Aug 31, 2017, 11:39:05 AM
to Neuroglancer, tahar...@gmail.com
Dear Jeremy, thanks a lot for your availability. Let me reply to this question:

-Perhaps you are using some code external to the official neuroglancer project to support this?  Can you tell me the full URL in your browser when using neuroglancer with local data in that pattern, so that I can see what it is doing?

Concerning this question: I don't use any external code; I just follow the official guidelines stated in:

and 

Thanks, 

Tahar

Jeremy Maitin-Shepard

Aug 31, 2017, 12:45:56 PM
to Tahar Nguira, Neuroglancer
I looked into this a bit more. What is happening is that the webserver started by your docker container is doing these transformations automatically. See the gzip_static and alias directives here:
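For context, the relevant nginx behavior can be sketched roughly as follows; the exact paths and directives in the ylep/neuroglancer image may differ, so treat this only as an illustration of how gzip_static answers a request for `foo` from `foo.gz`:

```nginx
location /precomputed/ {
    # Serve files from the mounted data directory.
    alias /precomputed/;
    # If .../1024-1536_512-1024_0-2 is requested and only
    # .../1024-1536_512-1024_0-2.gz exists on disk, serve the .gz file
    # with "Content-Encoding: gzip" so the browser decompresses it.
    gzip_static always;
}
```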



tahar...@gmail.com

Sep 1, 2017, 4:30:51 AM
to Neuroglancer, tahar...@gmail.com
Dear Jeremy, 

It works now. Thanks a lot for your help. In fact it was due to the content encoding. 

Best regards, 

Tahar