Network inversion MemoryError

Robert Zinke

Jul 31, 2019, 4:55:25 PM
to MintPy

Hello,

I am building a time series with MintPy from ~370 Sentinel-1 interferograms processed as ARIA GUNW products (generated with the ISCE software), spanning more than 50 epochs in time. I set up smallbaselineApp to run on our server, which has ~128 GB of RAM. The run goes smoothly through the initial steps and most of the invert_network step -- it cleanly processes all 210/210 patches -- then fails with the following error message:
...
------- Processing Patch 210 out of 210 --------------
reading unwrapPhase in (0, 8360, 7200, 8400) * 372 ...
use input reference phase
skip pixels with zero/nan value in all interferograms
number of pixels to invert: 127770 out of 288000 (44.4%)
inverting pixels with valid phase in all  ifgrams (127609 pixels) ...
inverting pixels with valid phase in some ifgrams (161 pixels) ...
[==================================================] 161/161 pixels
--------------------------------------------------
converting phase to range
calculating perpendicular baseline timeseries
Traceback (most recent call last):
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/smallbaselineApp.py", line 1069, in <module>
    main()
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/smallbaselineApp.py", line 1059, in main
    app.run(steps=inps.runSteps, plot=inps.plot)
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/smallbaselineApp.py", line 1003, in run
    self.run_network_inversion(sname)
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/smallbaselineApp.py", line 538, in run_network_inversion
    mintpy.ifgram_inversion.main(scp_args.split())
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/ifgram_inversion.py", line 1261, in main
    ifgram_inversion(inps.ifgramStackFile, inps)
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/ifgram_inversion.py", line 1213, in ifgram_inversion
    write2hdf5_file(ifgram_file, metadata, ts, temp_coh, ts_std, num_inv_ifg, suffix='', inps=inps)
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/ifgram_inversion.py", line 545, in write2hdf5_file
    ts_obj.write2hdf5(data=ts, dates=date_list, bperp=pbase, metadata=metadata)
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/objects/stack.py", line 283, in write2hdf5
    data = np.array(data, dtype=np.float32)
MemoryError

I take this to mean that there is not enough RAM available to calculate the perpendicular baseline time series. Are there any fixes for this? Any help is greatly appreciated!

Please let me know if you need additional information.

Cheers,
Rob

Yunjun Zhang

Jul 31, 2019, 6:50:31 PM
to MintPy
Hi Rob,

It looks like your interferograms cover very large swaths (I'm guessing ~2 GB each), so you are hitting the memory limit of your machine with our currently available implementation. Block-by-block writing to disk is under development and will solve the memory issue. With our limited number of developers we cannot promise a release date any time soon, but we are doing our best to include the inversion part as soon as we can.
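To give a rough sense of where the memory goes, here is a back-of-the-envelope sketch (the grid dimensions are inferred from the patch log in your post; the number of acquisitions is a placeholder, not your actual value):

# Rough estimate of the time-series array that write2hdf5() copies with
# np.array(data, dtype=np.float32). Dimensions inferred from the patch log;
# num_dates is a placeholder -- substitute your own value.
import numpy as np

width, length = 7200, 8400      # full-swath grid, from the patch boxes in the log
num_dates = 100                 # hypothetical number of SAR acquisitions
bytes_per_copy = num_dates * length * width * np.dtype(np.float32).itemsize
print(f"one float32 time-series copy: {bytes_per_copy / 1024**3:.1f} GiB")
# Several arrays of this size (ts, ts_std, plus the extra copy made by
# np.array) coexist in memory on top of the interferogram patches already
# loaded, which helps explain how even a ~128 GB machine runs out of RAM.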

I have committed a simple change (https://github.com/insarlab/MintPy/pull/163), which should reduce the memory usage to some extent. Please update your code to the latest development version to use it. In the meantime, if possible, try to process a smaller region by setting up the bounding box while running ARIA-tools, or later while running MintPy.
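On the MintPy side, the subset option lives in the smallbaselineApp config file; a sketch with placeholder coordinates (S:N, W:E in degrees):

## subset the area of interest before processing
mintpy.subset.lalo = 33.0:34.5,-118.5:-117.0
## or in row/column coordinates:
# mintpy.subset.yx = 1000:3000,500:4000

On the ARIA-tools side, the bounding-box option of its preparation scripts can be used to crop the products before they ever reach MintPy.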

Yunjun


Robert Zinke

Jul 31, 2019, 6:56:39 PM
to MintPy
Hi Yunjun,

Thanks very much for the explanation and the GitHub commit. I will update my code version and experiment to find the best region size to use.

Cheers,
Rob

Robert Zinke

Jul 31, 2019, 7:52:51 PM
to MintPy
One quick follow-up. Before crashing with the memory error, MintPy produced three .h5 files (avePhaseVelocity.h5, aveSpatialCoh.h5, maskConnComp.h5). I can view these files in QGIS and the results appear valid. Are these considered "final results" in the processing routine, or are further actions typically performed on them?

Thanks,
Rob



Yunjun Zhang

Jul 31, 2019, 8:44:45 PM
to MintPy
Hi Rob,

They are results from previous steps, and more files will be generated along the way. There are descriptions in the notebook tutorial (https://nbviewer.jupyter.org/github/insarlab/MintPy/blob/master/docs/tutorials/smallbaselineApp.ipynb); I recommend you take a look at it.

Yunjun


Robert Zinke

Jul 31, 2019, 8:46:45 PM
to MintPy
Will do. Thanks!


Yunjun Zhang

Aug 13, 2019, 6:38:49 PM
to MintPy
Hi Rob,

Heresh has committed the "block-by-block writing to disk" version of the network inversion step to GitHub (https://github.com/insarlab/MintPy/pull/181). Update to the latest development version on GitHub to try it if you are interested.
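For those curious about the idea, here is a minimal illustration of block-by-block writing with h5py (a sketch only, not the actual implementation in the pull request; invert_patch is a hypothetical stand-in for the per-patch inversion):

# Pre-allocate the full dataset on disk, then invert and write one patch of
# rows at a time, so the complete time-series cube never has to sit in RAM.
import h5py
import numpy as np

num_dates, length, width = 100, 8400, 7200   # placeholder dimensions
with h5py.File('timeseries.h5', 'w') as f:
    ds = f.create_dataset('timeseries', shape=(num_dates, length, width),
                          dtype=np.float32)
    for row0 in range(0, length, 200):           # e.g. 200-line patches
        row1 = min(row0 + 200, length)
        ts_patch = invert_patch(row0, row1)      # hypothetical per-patch inversion
        ds[:, row0:row1, :] = ts_patch           # write this patch straight to disk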

Cheers,

Yunjun

Robert Zinke

Aug 14, 2019, 11:32:22 PM
to MintPy
Hello,

Thanks very much for working on this! I pulled the most recent version of MintPy and re-ran the process. Unfortunately, I encountered a similar memory error, this time at the deramp step:

******************** step - deramp ********************
Remove for each acquisition a phase ramp: linear
remove_ramp.py timeseries.h5 -s linear -m maskTempCoh.h5 -o timeseries_ramp.h5 --update
--------------------------------------------------
update mode: ON
1) output file timeseries_ramp.h5 NOT found.
run or skip: run.
remove linear ramp from file: timeseries.h5
read mask file: maskTempCoh.h5
reading data ...

Traceback (most recent call last):
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/smallbaselineApp.py", line 1069, in <module>
    main()
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/smallbaselineApp.py", line 1059, in main
    app.run(steps=inps.runSteps, plot=inps.plot)
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/smallbaselineApp.py", line 1012, in run
    self.run_phase_deramping(sname)
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/smallbaselineApp.py", line 776, in run_phase_deramping
    mintpy.remove_ramp.main(scp_args.split())
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/remove_ramp.py", line 122, in main
    datasetName=inps.dset)
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/utils/utils1.py", line 675, in run_deramp
    data = readfile.read(fname)[0]
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/utils/readfile.py", line 204, in read
    data = read_hdf5_file(fname, datasetName=datasetName, box=box)
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/utils/readfile.py", line 280, in read_hdf5_file
    data = ds[slice_flag, box[1]:box[3], box[0]:box[2]]
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "/u/sar-r0/rzinke/python/miniconda3/envs/MintPy/lib/python3.7/site-packages/h5py/_hl/dataset.py", line 562, in __getitem__
    arr = numpy.ndarray(mshape, new_dtype, order='C')
MemoryError

Yunjun Zhang

Aug 15, 2019, 12:06:31 AM
to MintPy
Hi Rob,

First, I am glad that the processing made it through the "invert_network" step. Several other steps, including "deramp", have a similar memory limitation, which is what caused the error you saw. The same block-by-block processing and writing strategy needs to be applied to all of those steps to be completely free of this issue. It is on our to-do list.

For now, before loading the data, please try to a) process a smaller region by setting up the bounding box (recommended), or b) apply extra multilooking to your dataset to reduce the spatial resolution.
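As an illustration of option (b), extra multilooking is simply block averaging of the grid; a minimal numpy sketch of the operation (not the MintPy script itself):

# Average non-overlapping ylooks x xlooks blocks of a 2-D array (edges trimmed);
# e.g. 4 extra looks in each direction shrink the memory footprint ~16x at the
# cost of spatial resolution.
import numpy as np

def multilook(data, ylooks=4, xlooks=4):
    nrows = (data.shape[0] // ylooks) * ylooks
    ncols = (data.shape[1] // xlooks) * xlooks
    trimmed = data[:nrows, :ncols]
    return trimmed.reshape(nrows // ylooks, ylooks,
                           ncols // xlooks, xlooks).mean(axis=(1, 3))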

Yunjun


Heresh Fattahi

Aug 15, 2019, 12:29:22 AM
to Yunjun Zhang, MintPy
Thank you, Rob, for the update. This is good news. Your processing has actually passed the inversion, which, as Yunjun mentioned, is the only step we have modified so far. Since the fix works for the inversion, we will continue with the other steps.

Heresh
 

Robert Zinke

Aug 15, 2019, 12:40:09 AM
to MintPy
Great, glad to hear it!
I processed a smaller area with fewer dates, and the dataset made it through the entire process.

Thanks,
Rob
