Hi all,
As far as I understand, the only way to compute per-year loss exceedance curves is to set ses_per_logic_tree_path = 1 and investigation_time = N (e.g. 100,000), rather than the other way around, and then post-process the results stored in the losses_by_event.csv file (taking advantage of the "year" column).
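For reference, here is a minimal post-processing sketch along those lines. The column names ('year' and 'structural') are assumptions and should be adjusted to whatever header your losses_by_event.csv export actually contains:

import pandas as pd
import numpy as np

# Sketch: build an empirical per-year loss exceedance curve from the event loss table.
ev = pd.read_csv('losses_by_event.csv')
investigation_time = 100_000  # years, as set in the job configuration

# Aggregate event losses into annual losses; years with no events contribute zero loss
# and are implicitly accounted for through the investigation_time denominator below.
annual = ev.groupby('year')['structural'].sum()

# Annual rate of exceedance: for each observed annual loss, the number of
# simulated years with an equal or larger loss divided by the total number of years.
losses = np.sort(annual.values)[::-1]
exceedance_rate = np.arange(1, len(losses) + 1) / investigation_time

curve = pd.DataFrame({'loss': losses, 'annual_rate_of_exceedance': exceedance_rate})
print(curve.head())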
1) For my analysis I specified an investigation_time of 100,000 years, but I noticed that the year column only contains values up to about 65,000 (for about 20,000 ruptures). I changed the random seed in case it was some very unlikely statistical fluke, but I got the same result. Could you help me understand how this works?
2) I would also like to have the event loss table by tag. Among other things, this would allow me to obtain the per-year loss exceedance curves for each tag (i.e. administrative unit). Is there any way to extract such information (I did not find anything in the hdf5 file), and if not, do you foresee that something like that could become possible in the future?
Dear Michele and Anirudh,

Many thanks for your quick responses. I was not reading the datastore properly; I have found it now, thanks again.

I see that the ses_id variable is also stored as '<u2', so I guess I might need to combine the ses_per_logic_tree_path and investigation_time values to reach 100,000 years in total (investigation_time × SES) while keeping each of them below 65535.
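To make the arithmetic explicit, here is a small check of that idea. The specific split (2 × 50,000) is only an assumption about one way to stay under the 16-bit limit, not a verified recommendation for the engine:

import numpy as np

# '<u2' is a little-endian 16-bit unsigned integer, so the largest
# representable year/ses index is 65535.
print(np.iinfo(np.uint16).max)  # 65535

# One possible combination keeping both values below 65535 while still
# covering 100,000 effective years per logic-tree path:
ses_per_logic_tree_path = 2
investigation_time = 50_000
print(ses_per_logic_tree_path * investigation_time)  # 100000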