Re: NBLAST Performance


Gregory Jefferis

Mar 3, 2018, 12:19:21 AM
to David Chong Hermes, nat-...@googlegroups.com
You need to convert the neurons from nm scale to microns by dividing by 1000.

Say you have neurons n1 and n2 that you have fetched from catmaid.

dp1 = dotprops(n1/1e3, resample=1, k=5)

does everything you need in one go. Likewise dp2. Then pass them to nblast.
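For example, here is a minimal end-to-end sketch; the skeleton ids are placeholders and it assumes you have already set up a connection with catmaid::catmaid_login():

library(catmaid)
library(nat.nblast)

# fetch two neurons by skeleton id (ids below are placeholders)
n1 = read.neuron.catmaid(123456)
n2 = read.neuron.catmaid(123457)

# convert nm -> microns, resample at 1 micron, 5 nearest neighbours
dp1 = dotprops(n1/1e3, resample=1, k=5)
dp2 = dotprops(n2/1e3, resample=1, k=5)

# pairwise NBLAST scores in each direction
nblast(dp1, dp2)
nblast(dp2, dp1)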

Best wishes,

Greg.

Sent from my iPhone

> On 3 Mar 2018, at 03:44, David Chong Hermes <dt...@cam.ac.uk> wrote:
>
> Dear Dr. Jefferis,
>
> I am a Part II student in Dr. Liria Masuda-Nakagawa's lab, and I have been working with data from the Drosophila L1 larva connectome project on CATMAID. I was interested in using NBLAST for morphology comparisons in order to do some clustering, and I obtained an example script demonstrating how to pull data with the R API developed by your lab and format it into the dotprops (dps) format that NBLAST requires. I managed to get NBLAST working, but converting the data to dotprops and comparing a pair of neurons took quite a long time, on the order of hours. I was wondering whether this behaviour is normal and, if not, what the problem might be.
>
> Thank you.
>
> Sincerely,
> David Chong
>

Jingpeng Wu

Oct 30, 2018, 11:18:12 AM
to nat-user
I have a similar performance issue. It took more than 4 hours to classify about 1000 neurons from EM. The neurons probably have too many nodes due to the high resolution of EM.

Greg Jefferis

Jan 7, 2019, 1:52:55 PM
to nat-user
Dear Jingpeng,

I am not sure whether you resolved your issue, or at what stage it occurred. David Chong's issue was in preparing the neurons in dotprops format for NBLAST, because they were not calibrated in units of microns. I am not sure whether your issue was at this stage or when calculating all-by-all NBLAST similarity scores for hierarchical clustering. The time taken to compute these all-by-all scores scales with n^2 (where n is the number of neurons) and roughly m log m (where m is the number of segments per neuron, since each pairwise score is a set of nearest-neighbour lookups). For 1000 neurons with 1000 segments apiece, I would expect one CPU to take a few minutes. You can use multiple cores to do the calculation:

# see https://github.com/jefferis/p1neurons for some example neurons
library(p1neurons)
library(nat.nblast)
# register a parallel backend with 8 cores
doMC::registerDoMC(cores=8)
# all-by-all NBLAST scores for the p1s dotprops neuronlist
p1aba = nblast_allbyall(p1s, .parallel=TRUE)
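If the goal is hierarchical clustering, the score matrix can then be passed to the nhclust helper from nat.nblast (a minimal sketch; the choice of 4 groups is illustrative):

hc = nhclust(scoremat=p1aba)
plot(hc)
# cut the dendrogram into e.g. 4 clusters
groups = cutree(hc, k=4)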

If your neurons have many very small branches, you can consider simplifying them before running NBLAST; see https://jefferis.github.io/elmr/reference/simplify_neuron.html and the sketch below. This removes all but the largest n branches; n of 10-20 already gives me the key structures of most of my neurons. You should also make sure, when using the dotprops function, that you resample so that your segments have a length appropriate to the natural curvature of the neuron. I find 1 micron works well for fly neurons, but you might want a bigger number for bigger neurons.
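As a sketch, assuming ns_raw is a neuronlist still in nm (the name and n=10 are illustrative):

library(elmr)
# drop all but the 10 largest branches of each neuron
ns = nlapply(ns_raw, simplify_neuron, n=10)
# convert to microns and make dotprops, resampling at 1 micron
dps = dotprops(ns/1e3, resample=1, k=5)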

Best wishes,

Greg.

Greg Jefferis

May 20, 2019, 4:50:56 PM
to nat-user
Dear Jingpeng,

For reference, there is a fairly recent function

http://jefferis.github.io/elmr/reference/prune_twigs.html

that we use to remove many small branches from EM-traced neurons prior to NBLAST.
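A minimal sketch, assuming n is a neuron already calibrated in microns (the 5 micron threshold is illustrative):

library(elmr)
# remove terminal twigs shorter than 5 microns, then make dotprops
n_pruned = prune_twigs(n, twig_length=5)
dp = dotprops(n_pruned, resample=1, k=5)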

Best, Greg.
