Probabilistic risk issue - vulnerability file


Lisa Jusufi

Oct 6, 2025, 6:00:10 AM
to OpenQuake Users
Dear OpenQuake team,

I am having trouble running a probabilistic loss assessment. The error I get concerns the mapping of the exposed assets to their respective vulnerability curves. The vulnerability functions are the ones from GEM.

I have used the same vulnerability files for scenario loss calculations, and I didn't have a problem running those.

I would really appreciate it if somebody could look at this issue and help me.


PS. I have a problem attaching the hazard file here. I am using the one from ESHM20.

Kindest regards,
Lisa
1 - Probabilistic_risk_annual.zip

Anirudh Rao

Oct 6, 2025, 6:09:31 AM
to OpenQuake Users
Hi Lisa, 

Could you please report the specific error message you are getting?

Thanks,
Anirudh

Lisa Jusufi

Oct 6, 2025, 6:13:40 AM
to OpenQuake Users

2025-10-06T09:19:51.64,INFO,SpawnProcess-1/6340,lisaj@Lisa running C:\Users\lisaj\AppData\Local\Temp\lisaj\calc_464\1 - Probabilistic_risk_annual\job_risk.ini [--hc=427]

2025-10-06T09:19:51.64,INFO,SpawnProcess-1/6340,Using engine version 3.23.2

2025-10-06T09:19:51.91,WARNING,SpawnProcess-1/6340,Using 6 processpool workers

2025-10-06T09:19:52.10,INFO,SpawnProcess-1/6340,Checksum of the inputs: 1940304701 (total size 10.36 MB)

2025-10-06T09:20:03.24,INFO,SpawnProcess-1/6340,Reading C:\Users\lisaj\AppData\Local\Temp\lisaj\calc_464\1 - Probabilistic_risk_annual\Exp_Schools_RNM.xml

2025-10-06T09:20:03.35,INFO,SpawnProcess-1/6340,Read 1_129 assets in 0.11s from C:\Users\lisaj\AppData\Local\Temp\lisaj\calc_464\1 - Probabilistic_risk_annual\Exp_Schools_RNM.csv

2025-10-06T09:20:03.39,INFO,SpawnProcess-1/6340,Inferred exposure mesh in 0.02 seconds

2025-10-06T09:20:03.41,INFO,SpawnProcess-1/6340,Associated 1_129 assets (of 1_129) to 917 sites (of 8_217)

2025-10-06T09:20:03.41,INFO,SpawnProcess-1/6340,Found 43 taxonomies with ~26.3 assets each

2025-10-06T09:20:03.42,INFO,SpawnProcess-1/6340,The most common taxonomy is MUR/LWAL+DNO/H1-R with 233 assets

2025-10-06T09:20:03.46,INFO,SpawnProcess-1/6340,Reducing risk model from 4206 to 36 risk functions

2025-10-06T09:20:03.46,INFO,SpawnProcess-1/6340,minimum_asset_loss={'occupants': 0, 'structural': 0}

2025-10-06T09:20:03.47,INFO,SpawnProcess-1/6340,Storing risk model

2025-10-06T09:20:03.62,INFO,SpawnProcess-1/6340,Building 1_000 realizations

2025-10-06T09:20:03.64,INFO,SpawnProcess-1/6340,Building risk inputs from 1000 realization(s)

2025-10-06T09:20:03.72,INFO,SpawnProcess-1/6340,Building 1_000 realizations

2025-10-06T09:23:56.53,INFO,SpawnProcess-1/6340,Built 1129 risk inputs

2025-10-06T09:23:57.69,WARNING,SpawnProcess-1/6340,Sent 13 classical_risk tasks, 537.07 MB

2025-10-06T09:24:05.92,INFO,SpawnProcess-1/6340,Received 1 * 86 B in 8 seconds [unpik=0.00s] from classical_risk {'tot': '86 B'}

2025-10-06T09:24:06.03,ERROR,SpawnProcess-1/6340,Traceback (most recent call last):
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\openquake\engine\engine.py", line 208, in run_calc
    calc.run(shutdown=True)
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\openquake\calculators\base.py", line 339, in run
    raise exc from None
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\openquake\calculators\base.py", line 326, in run
    self.result = self.execute()
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\openquake\calculators\base.py", line 1272, in execute
    return smap.reduce(self.combine, self.acc)
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\openquake\baselib\parallel.py", line 924, in reduce
    return self.submit_all().reduce(agg, acc)
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\openquake\baselib\parallel.py", line 608, in reduce
    for result in self:
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\openquake\baselib\parallel.py", line 594, in __iter__
    yield from self._iter()
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\openquake\baselib\parallel.py", line 584, in _iter
    out = result.get()
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\openquake\baselib\parallel.py", line 414, in get
    raise etype(msg)
ZeroDivisionError:
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\openquake\baselib\parallel.py", line 437, in new
    val = func(*args)
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\openquake\calculators\classical_risk.py", line 38, in classical_risk
    crmodel = monitor.read('crmodel')
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\openquake\baselib\performance.py", line 398, in read
    return pickle.loads(dset[()])
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\openquake\risklib\scientific.py", line 405, in __setstate__
    self.init()
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\openquake\risklib\scientific.py", line 256, in init
    self._mlr_i1d = interpolate.interp1d(self.imls, self.mean_loss_ratios)
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\scipy\interpolate\_interpolate.py", line 267, in __init__
    _Interpolator1D.__init__(self, x, y, axis=axis)
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\scipy\interpolate\_polyint.py", line 58, in __init__
    self._set_yi(yi, xi=xi, axis=axis)
  File "C:\Users\lisaj\anaconda3\envs\oqe\lib\site-packages\scipy\interpolate\_polyint.py", line 131, in _set_yi
    self._y_axis = (axis % yi.ndim)
ZeroDivisionError: integer division or modulo by zero
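
The failure mechanism in this traceback can be illustrated without OpenQuake at all. The final frame computes `axis % yi.ndim`; if the unpickled `mean_loss_ratios` array comes back 0-dimensional (as appears to happen for constant-valued vulnerability functions), `ndim` is 0 and Python's integer modulo raises exactly this error. A minimal sketch (not OpenQuake code; the scalar array stands in for a constant-valued function):

```python
import numpy as np

# The traceback ends at `self._y_axis = (axis % yi.ndim)` inside SciPy's
# interp1d machinery. A 0-dimensional y array makes `yi.ndim` equal to 0,
# so the modulo operation divides by zero.
mean_loss_ratios = np.asarray(1.0)  # 0-d stand-in for a constant-valued function
print(mean_loss_ratios.ndim)        # 0

try:
    _ = -1 % mean_loss_ratios.ndim  # what _set_yi computes for the default axis=-1
    raised = False
except ZeroDivisionError as exc:
    raised = True
    print(type(exc).__name__)       # ZeroDivisionError
```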

Anirudh Rao

Oct 6, 2025, 6:24:10 AM
to OpenQuake Users
I wonder if the job settings in the hazard job file are leading to some kind of saturation of the hazard curve values. Would it be possible to share your job_hazard.ini file too?

Lisa Jusufi

Oct 6, 2025, 6:42:43 AM
to OpenQuake Users

Dear Anirudh,

I have a problem attaching the hazard file directly here because it's too large.
This is the link to it: https://we.tl/t-Fwggi2ekWH 

Note that I managed to run probabilistic damage with the same hazard file and with the fragility curves of GEM without a problem.

Kind regards,
Lisa

Anirudh Rao

Oct 8, 2025, 5:18:48 AM
to OpenQuake Users
Thanks Lisa, we'll try to see if we can reproduce the issue.

Lisa Jusufi

Oct 10, 2025, 4:12:53 AM
to OpenQuake Users
Thank you. Looking forward to your assistance.

Anirudh Rao

Oct 15, 2025, 5:41:07 AM
to OpenQuake Users
Hi Lisa,

We confirm we are able to replicate the error on our end: https://github.com/gem/oq-engine/issues/10885

However, it might take some time to fully debug as the developers are away until next week for a workshop. Appreciate your patience!

Best regards,
Anirudh

Lisa Jusufi

Oct 21, 2025, 3:49:48 AM
to OpenQuake Users

Hi Anirudh,

Thank you for the update and for confirming the issue. I am waiting for the debugging process to be completed.

Best regards,
Lisa

Lisa Jusufi

Oct 31, 2025, 5:26:48 AM
to OpenQuake Users

Hi Anirudh,

I wanted to kindly ask if there are any updates on the debugging process for this issue. These analyses are related to my PhD, so it would help me a lot to know if there is an estimated timeframe for when it might be resolved.

Also, please let me know if there are any workarounds I could consider in the analysis (it doesn't matter if it's not the most efficient way), i.e. something I can do myself to avoid this issue.

Thank you again for your help and support!

Best regards,
Lisa

Michele Simionato

Nov 19, 2025, 3:53:27 AM
to OpenQuake Users
Hi Lisa, the workaround is to remove or comment out the following line in the job_risk.ini file:

# occupants_vulnerability_file = vulnerability_fatalities.xml

This avoids the issue with vulnerability functions that have constant values. Tested, and it works with the current master of the engine (i.e., you have to reinstall it).
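
For readers applying this workaround later, the relevant portion of the job_risk.ini would look roughly like the sketch below. Only the commented-out occupants line is taken from this thread; the structural vulnerability file name is illustrative:

```ini
# illustrative file name -- keep whatever structural model the job already uses
structural_vulnerability_file = vulnerability_structural.xml
# disabled to work around the ZeroDivisionError triggered by
# constant-valued occupants vulnerability functions
# occupants_vulnerability_file = vulnerability_fatalities.xml
```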

Lisa Jusufi

Nov 19, 2025, 5:23:47 AM
to OpenQuake Users
Dear Michele,

Thank you for your response. I reinstalled the current master of the engine and ran the analysis again. Now I don't have the interpolation issue anymore.

However, now I have another issue (I don't know if it is related to the previous one): the results don't seem realistic.
I am using a taxonomy mapping scheme where some taxonomy strings are associated with multiple vulnerability functions, each with a specified weight. For these cases (i.e., taxonomies linked to more than one vulnerability function), the classical risk calculation results in complete loss for all PoE values.

Only when a taxonomy string is related to a single vulnerability function do the results look reasonable.
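
For context, a taxonomy mapping of the kind described here spreads one exposure taxonomy over several vulnerability models, with weights that sum to 1. A sketch is below; the first taxonomy string is taken from the calculation log above, while the mapped model names and weights are purely illustrative, and the second column is named `conversion` or `risk_id` depending on the engine version:

```csv
taxonomy,risk_id,weight
MUR/LWAL+DNO/H1-R,MUR+CLBRS/LWAL+DNO/H1,0.6
MUR/LWAL+DNO/H1-R,MUR+STRUB/LWAL+DNO/H1,0.4
```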

Kind regards,
Lisa

Anirudh Rao

Nov 19, 2025, 6:01:24 AM
to OpenQuake Users
Are you seeing the complete loss results for risk_investigation_time = 50, or risk_investigation_time = 1? Or both?

Lisa Jusufi

Nov 19, 2025, 6:14:38 AM
to OpenQuake Users
I see it for both. 

Anirudh Rao

Nov 20, 2025, 6:43:30 AM
to OpenQuake Users
Thanks for checking; there was indeed a subtle bug still present, which has been fixed in https://github.com/gem/oq-engine/pull/10982. The fix should be available in the Windows nightly release by tomorrow.

Lisa Jusufi

Nov 21, 2025, 5:29:22 AM
to OpenQuake Users
Thank you very much.
It's all good now!

Kind regards,
Lisa