Error while running the scenario damage demo


Hyeuk Ryu

Nov 13, 2016, 10:06:10 PM
to OpenQuake Users
Hi,

I've run the scenario damage demo from the demos directory and got the following error message. Any thoughts?
Thanks.

Regards,

Hyeuk

(open.quake) ~/open.quake/demos/risk/ScenarioDamage$ oq run job_hazard.ini,job_risk.ini --export=csv,xml
[2016-11-14 13:57:40,690 #679 INFO] Using engine version 2.1.0
[2016-11-14 13:57:40,690 #679 INFO] Using hazardlib version 0.21.0
[2016-11-14 13:57:40,707 #679 INFO] Reading the exposure
[2016-11-14 13:57:43,334 #679 INFO] Read 9144 assets
[2016-11-14 13:58:04,702 #679 INFO] exported gmf_data: [u'/tmp/gmf-ChiouYoungs2008()-PGA_679.csv']
[2016-11-14 13:58:04,705 #679 INFO] exported realizations: [u'/tmp/realizations_679.csv']
[2016-11-14 13:58:04,714 #679 INFO] Internal size of the GMFs: 1.5 MB
[2016-11-14 13:58:12,254 #679 INFO] exported gmf_data: [u'/tmp/gmf-rlz-000_679.xml']
[2016-11-14 13:58:12,292 #680 INFO] Read 1432 hazard site(s)
[2016-11-14 13:58:12,301 #680 INFO] Reading the exposure
[2016-11-14 13:58:15,189 #680 INFO] Read 9144 assets within the region_constraint and discarded 0 assets outside the region
[2016-11-14 13:58:16,115 #680 WARNING] Associated 9144 assets to 2273 sites, 0 discarded
[2016-11-14 13:58:35,500 #680 INFO] Built 21 risk inputs
[2016-11-14 13:58:35,502 #680 INFO] Submitting  "scenario_damage" tasks
[2016-11-14 13:58:35,921 #680 INFO] Sent 2.95 MB of data in 21 task(s)
[2016-11-14 13:58:37,306 #680 INFO] scenario_damage   4%
[2016-11-14 13:58:37,311 #680 CRITICAL] 
Traceback (most recent call last):
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/calculators/base.py", line 192, in run
    self.result = self.execute()
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/calculators/base.py", line 663, in execute
    res = starmap(self.core_task.__func__, all_args).reduce()
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/commonlib/parallel.py", line 459, in reduce
    for res in iter_result:
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/commonlib/parallel.py", line 265, in __iter__
    self.save_task_data(mon)
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/commonlib/parallel.py", line 278, in save_task_data
    mon.flush()
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/baselib/performance.py", line 173, in flush
    child.flush()
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/baselib/performance.py", line 178, in flush
    hdf5.extend3(self.hdf5path, 'performance_data', data)
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/baselib/hdf5.py", line 76, in extend3
    extend(dset, array)
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/baselib/hdf5.py", line 62, in extend
    dset.resize((newlength,) + array.shape[1:])
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/h5py/_hl/dataset.py", line 350, in resize
    self.id.set_extent(size)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/_objects.c:2687)
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/_objects.c:2645)
  File "h5py/h5d.pyx", line 274, in h5py.h5d.DatasetID.set_extent (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/h5d.c:4070)
ValueError: Unable to set extend dataset (No write intent on file)
Traceback (most recent call last):
  File "/Users/hyeuk/open.quake/bin/oq", line 11, in <module>
    sys.exit(oq())
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/commands/__main__.py", line 50, in oq
    parser.callfunc()
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/baselib/sap.py", line 186, in callfunc
    return self.func(**vars(namespace))
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/baselib/sap.py", line 245, in main
    return func(**kw)
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/commands/run.py", line 150, in run
    _run(job_ini, concurrent_tasks, pdb, loglevel, hc, exports, params)
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/commands/run.py", line 121, in _run
    exports, params, monitor)
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/commands/run.py", line 87, in run2
    hazard_calculation_id=hc_id, **params)
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/calculators/base.py", line 192, in run
    self.result = self.execute()
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/calculators/base.py", line 663, in execute
    res = starmap(self.core_task.__func__, all_args).reduce()
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/commonlib/parallel.py", line 459, in reduce
    for res in iter_result:
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/commonlib/parallel.py", line 265, in __iter__
    self.save_task_data(mon)
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/commonlib/parallel.py", line 278, in save_task_data
    mon.flush()
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/baselib/performance.py", line 173, in flush
    child.flush()
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/baselib/performance.py", line 178, in flush
    hdf5.extend3(self.hdf5path, 'performance_data', data)
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/baselib/hdf5.py", line 76, in extend3
    extend(dset, array)
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/openquake/baselib/hdf5.py", line 62, in extend
    dset.resize((newlength,) + array.shape[1:])
  File "/Users/hyeuk/open.quake/lib/python2.7/site-packages/h5py/_hl/dataset.py", line 350, in resize
    self.id.set_extent(size)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/_objects.c:2687)
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/_objects.c:2645)
  File "h5py/h5d.pyx", line 274, in h5py.h5d.DatasetID.set_extent (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/h5d.c:4070)
ValueError: Unable to set extend dataset (No write intent on file)

Michele Simionato

Nov 14, 2016, 1:04:00 AM
to OpenQuake Users
The recommended command for running calculations is `oq engine`:

 $  oq engine --run job_hazard.ini,job_risk.ini --exports=csv,xml

`oq engine` is tested and, as you will see, it works. `oq run` is a leftover of the previous oq-lite project: it is not documented, not tested, and could disappear without any notice. You can use it at your own peril.
That said, this is a curious bug that I will have to investigate.
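
[Editor's note] For context, the ValueError at the end of the traceback is the generic h5py error raised when a resizable dataset is extended through a file handle that was opened without write access ("no write intent"). Here is a minimal sketch that reproduces the same message with plain h5py rather than OpenQuake code; the file path and dataset name are made up for illustration:

    import h5py

    # Create a file containing a resizable dataset
    # (maxshape=(None,) allows it to be extended later).
    with h5py.File('/tmp/demo.hdf5', 'w') as f:
        f.create_dataset('performance_data', shape=(0,),
                         maxshape=(None,), dtype='f8')

    # Reopen the file read-only: this handle has no write intent.
    with h5py.File('/tmp/demo.hdf5', 'r') as f:
        # Extending the dataset now fails with something like
        #   ValueError: Unable to set extend dataset (no write intent on file)
        # (the exact exception type can vary across h5py versions).
        f['performance_data'].resize((10,))

In the traceback above, the same call chain (mon.flush -> extend3 -> resize) hits this condition, i.e. the calculation's datastore file appears to have been opened read-only at the moment the performance monitor tried to flush its data.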

     Michele

Hyeuk Ryu

Nov 14, 2016, 2:22:15 AM
to OpenQuake Users
Hi Michele,

Thanks for the kind explanation. I have not tested 'oq-lite' thoroughly, but it worked fine for hazard calculations.

Regards,

Hyeuk
 

