small-baselines (error)


beibei chen

Aug 2, 2011, 6:14:27 AM
to mai...@googlegroups.com

Dear Hooper team and MAINSAR members,

   Sorry to bother you. I'm processing a large dataset over a large
region of interest with StaMPS. I have 29 ASAR SLCs, my region of
interest is 21000 lines × 4800 pixels, and I am using the “small-baselines”
method, for which I selected 116 small-baseline pairs.

   I used “mt_prep 0.6 8 7 50 200” to form 56 patches. I get an error
when I run “stamps(4,4)”; it looks like this:


########################################
############ StaMPS Step 4 #############
########################################

Weeding selected pixels...
   PARM: weed_time_win=180
   PARM: weed_standard_dev=0.3
   PARM: weed_max_noise=0.1
   PARM: weed_zero_elevation='n'
   PARM: small_baseline_flag='y'
182777 low D_A PS, 0 high D_A PS

step_name =

DROP NOISY

Opening psweed.1.node.
Constructing Delaunay triangulation by divide-and-conquer method.
Delaunay milliseconds:  330

Writing psweed.2.node.
Writing psweed.2.ele.
Writing psweed.2.edge.

Output milliseconds:  1593
Total running milliseconds:  2110

Statistics:

  Input vertices: 182777

  Mesh vertices: 182777
  Mesh triangles: 365521
  Mesh edges: 548297
  Mesh exterior boundary edges: 31

   10000 PS of 182777 processed
   20000 PS of 182777 processed
   30000 PS of 182777 processed
   40000 PS of 182777 processed
   50000 PS of 182777 processed
   60000 PS of 182777 processed
   70000 PS of 182777 processed
   80000 PS of 182777 processed
   90000 PS of 182777 processed
   100000 PS of 182777 processed
   110000 PS of 182777 processed
   120000 PS of 182777 processed
   130000 PS of 182777 processed
   140000 PS of 182777 processed
   150000 PS of 182777 processed
   160000 PS of 182777 processed
   170000 PS of 182777 processed
   180000 PS of 182777 processed
0 PS kept after dropping noisy pixels
psver currently: 1
psver now set to: 2
/media/Linux/PSProject/INSAR_20051214/SMALL_BASELINES/PATCH_41
psver currently: 1
psver now set to: 1

########################################
############ StaMPS Step 4 #############
########################################

Weeding selected pixels...
   PARM: weed_time_win=180
   PARM: weed_standard_dev=0.3
   PARM: weed_max_noise=0.1
   PARM: weed_zero_elevation='n'
   PARM: small_baseline_flag='y'
222477 low D_A PS, 0 high D_A PS

step_name =

DROP NOISY

Opening psweed.1.node.
Constructing Delaunay triangulation by divide-and-conquer method.
Delaunay milliseconds:  414

Writing psweed.2.node.
Writing psweed.2.ele.
Writing psweed.2.edge.

Output milliseconds:  2854
Total running milliseconds:  3495

Statistics:

  Input vertices: 222477

  Mesh vertices: 222477
  Mesh triangles: 444922
  Mesh edges: 667398
  Mesh exterior boundary edges: 30
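As a side note for readers comparing the two runs: the mesh statistics printed by the triangulation step are internally consistent. For a planar Delaunay triangulation of V points with B hull vertices, T = 2V − 2 − B triangles and E = 3V − 3 − B edges, so the edge count (and with it the memory the next step needs) grows linearly with the number of selected PS. A small check against both logs in this message (assuming the reported "exterior boundary edges" equal the hull vertex count, which holds for a triangulated convex hull):

```python
# Check Triangle's reported mesh statistics against Euler's relations for a
# planar Delaunay triangulation of V points with B vertices on the hull:
#   triangles T = 2*V - 2 - B,   edges E = 3*V - 3 - B
def mesh_counts_consistent(v, t, e, b):
    """True if the triangle/edge counts match the vertex and hull counts."""
    return t == 2 * v - 2 - b and e == 3 * v - 3 - b

print(mesh_counts_consistent(182777, 365521, 548297, 31))  # first patch: True
print(mesh_counts_consistent(222477, 444922, 667398, 30))  # failing patch: True
```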

??? Error using ==> unknown
Out of memory. Type HELP MEMORY for your options.

Error in ==> ps_weed at 285
    dph_space=(ph_weed(edges(:,3),:).*conj(ph_weed(edges(:,2),:)));

Error in ==> stamps at 147
            ps_weed(0,1);

>>
Should I divide into more patches or fewer patches when running “mt_prep”?
I have tried mt_prep 0.6 8 8 50 200, mt_prep 0.6 7 6 50 200, and mt_prep 0.6 6 5 50 200; all of them also give the "Out of memory" error.

I have also set weed_standard_dev to 0.8/0.6/0.4 and weed_max_noise to 0.6/0.4/0.2; the "Out of memory" error remains.

I am very worried; I have already spent 4 months on this.

                                                                          beibei  chen

David Bekaert

Aug 4, 2011, 3:06:05 AM
to mai...@googlegroups.com
Hi,
 
Your triangulated network has more edges than the previous one,
so I think it is due to the number of PS candidates.
Check whether the other patches, where the step succeeds, also have fewer edges in their triangulated networks.
 
The weeding requires a triangulated network, and it is just after this step that you run out of memory.
The selection based on weed_standard_dev and weed_max_noise comes after the point where you get the memory error,
so varying these parameters should not affect the out-of-memory error you are getting.
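To see why the edge count is what matters here: the failing line in ps_weed (line 285 in the trace above) builds a complex phase difference for every edge of the triangulation and every interferogram at once. A rough estimate of the size of that one array (a sketch only: it assumes MATLAB complex doubles at 16 bytes per element and the 116 small-baseline pairs mentioned earlier in the thread, and ignores the temporaries MATLAB creates while evaluating the expression):

```python
# Rough footprint of
#   dph_space = ph_weed(edges(:,3),:) .* conj(ph_weed(edges(:,2),:))
# for the failing patch: one complex double (16 bytes) per edge per interferogram.
n_edges = 667398   # "Mesh edges" reported for the failing patch
n_ifgs = 116       # small-baseline pairs in this dataset
bytes_needed = n_edges * n_ifgs * 16
print(f"{bytes_needed / 2**30:.2f} GiB")  # about 1.15 GiB for the result alone
```

Since the two indexed copies of ph_weed and the conj() temporary each need comparable space, the peak demand is several times this, which can exceed what the MATLAB process can allocate in one contiguous block. This is why fewer edges (i.e. fewer PS candidates per patch) helps while the weed_* thresholds do not.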
 
What you could try is to reduce the number of PS candidates by modifying percent_random in step 3.
Do this first for the failing patch alone and see if it makes a difference.
Note that this will most likely result in a patch-like feature after merging.
 
It would be best to process the full dataset with the same parameters.
So perhaps you can decrease the patch size further (this would require reprocessing starting from mt_prep).
You could also try a different value for the threshold on the amplitude difference dispersion.
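Since per-patch memory scales roughly with the number of pixels (and hence PS candidates) in a patch, a finer patch grid in mt_prep shrinks the footprint proportionally. A back-of-the-envelope sketch (the 10×10 grid is a hypothetical example, not a value from this thread; patch overlap is ignored):

```python
# Per-patch pixel counts for different mt_prep patch grids over the
# 21000 x 4800 pixel crop described at the top of the thread
# (mt_prep <da_thresh> <n_patches_range> <n_patches_azimuth> ..., as used above).
region = 21000 * 4800                          # total pixels in the crop
for n_rg, n_az in [(8, 7), (8, 8), (10, 10)]:  # (10, 10) is hypothetical
    n_patches = n_rg * n_az
    print(n_patches, "patches ->", region // n_patches, "pixels/patch")
```

Note that the 7×6 and 6×5 grids tried earlier in the thread go the wrong way: they produce fewer, larger patches, which makes the per-patch memory demand worse, not better.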
 
David
 
PS: weed_max_noise is really low; weed_standard_dev is reasonable.
 

 
--
You received this message because you are subscribed to the Google Groups "MAINSAR" group.
To post to this group, send email to mai...@googlegroups.com.
To unsubscribe from this group, send email to mainsar+u...@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/mainsar?hl=en.

beibei chen

Aug 4, 2011, 3:28:18 AM
to mai...@googlegroups.com
Dear David,
 
   Thank you very much, I will try it following your advice.
  
    Best wishes!
 
                                                     beibei  chen
2011/8/4 David Bekaert <bekaer...@gmail.com>

LI YINGHUI

Aug 10, 2011, 6:58:35 AM
to mai...@googlegroups.com
Hi,
Do you use the latest version of StaMPS? The parameter 'weed_time_win' is set to 730 instead of 180 in the latest version. I'm just curious; maybe this is not the problem.

2011/8/4 beibei chen <chenbe...@gmail.com>

beibei chen

Aug 11, 2011, 1:55:11 AM
to mai...@googlegroups.com
Thank you very much for your help!
Best wishes!
 
                  smile


 
2011/8/10 LI YINGHUI <lyh...@gmail.com>