OpenQuake updates


Sriram

May 30, 2013, 4:53:17 AM
to openqu...@googlegroups.com
Dear OpenQuake team,

I have a question regarding the frequent OpenQuake updates. I have been accepting all the updates that come through the Ubuntu Update Manager, but yesterday I tried to run a new demo that I pulled from GitHub, the one with the characteristic source model. My local copy of OpenQuake raised an error saying characteristicFaultSource is not a defined parameter.

So I went on to clone the new oq-engine, oq-hazardlib, etc. locally and followed the instructions on the wiki to install them. Then I had a problem with speedups (not found), so I copied the oq-hazardlib build folder from the old version (which I had backed up earlier) into the new oq-hazardlib. That removed the missing-speedups error, but now when I run my old hazard calculation models I get some errors (mostly NRML related), and I had to make a lot of changes to my models.

What I would like to know is: how do I keep up with the constant updates to OpenQuake? Simply accepting the updates from the PPA didn't seem to update much. Also, is there a source for the changelog of each update?

Thank you,

Best wishes,
Sriram

Lars Butler

May 30, 2013, 4:57:37 AM
to openqu...@googlegroups.com
Hi Sriram,

Can you please paste the full error message output? (You can use pastebin.com or similar.)

Regarding the frequent changes, as we're approaching a stable v1.0 release of the oq-engine, development will be slowing down a bit and future changes will be much less disruptive.

At the moment, there is no condensed changelog.
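
If you are working from the git checkouts, the commit history is the closest thing to a changelog for now. Something along these lines (plain git, nothing OpenQuake-specific) should show what changed in each repository since your last update:

    cd oq-engine
    git log --oneline --since="2 weeks ago"   # one-line summary of recent commits
    cd ../oq-hazardlib
    git log --oneline --since="2 weeks ago"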

Thanks,
Lars


Sriram

May 30, 2013, 5:10:29 AM
to openqu...@googlegroups.com
Hey Lars,

Earlier, I had this error: "Geodetic and Geoutil speedups are not found." I checked the new oq-hazardlib folder and found that the build module is missing there. I simply copied this build folder from my backup of the older OpenQuake version into the new oq-hazardlib folder. This seems to have suppressed the error for now, but is this the correct remedy for the problem?

Thank you Lars,

Peace,
Sriram

Lars Butler

May 30, 2013, 5:30:56 AM
to openqu...@googlegroups.com
Hi Sriram,

Those are actually warnings, not errors. Getting that message will not cause any problems; it simply indicates that your current software configuration is not using the C optimizations, but is instead using the (slower) Python version of some specific utilities. If the warning goes away, everything _should_ be fine. Did you install hazardlib from an Ubuntu package, or from source?
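
For context, these optional speedups are typically wired in with an import-time fallback. A minimal sketch of that pattern (the module and function names below are only illustrative, not the actual hazardlib ones):

    import math
    import warnings

    def _geodetic_distance_python(lon1, lat1, lon2, lat2):
        # slow pure-Python haversine distance in km, used as the fallback
        lon1, lat1, lon2, lat2 = map(math.radians, (lon1, lat1, lon2, lat2))
        h = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371.0 * math.asin(math.sqrt(h))

    try:
        # hypothetical compiled extension, built from the C sources by setup.py
        from _geo_speedups import geodetic_distance
    except ImportError:
        warnings.warn("geodetic speedups are not found, using the slower Python code")
        geodetic_distance = _geodetic_distance_python

The warning only tells you which of the two implementations ended up being used.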

If you installed from source, this page may be useful: https://github.com/gem/oq-hazardlib/wiki/Installing-C-extensions-from-git-repository
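
For a source install, the extensions are usually compiled with the standard setuptools commands, roughly (assuming the checkout's setup.py declares the C extensions; the wiki page above is the authoritative reference):

    cd oq-hazardlib
    python setup.py build_ext --inplace   # compile the speedup modules next to the Python sources
    python setup.py install               # or put the checkout on PYTHONPATH instead

Copying a build/ folder from an older checkout can silence the warning, but the compiled modules may not match the newer Python code, so rebuilding is the safer route.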

Cheers,
Lars


Sriram

May 30, 2013, 5:55:17 AM
to openqu...@googlegroups.com
Hey Lars,

I first installed OpenQuake from source, but then I had an error with Python paths (engine.utils module not found). So I tried installing from the package instead, and this time it worked. I am running an analysis now; everything seems to be fine, but significantly slower. Either way, if I have some other problem I will get back to you.

Thanks a lot, Lars, for your quick responses.

Peace,
Sriram

Sriram

May 30, 2013, 9:46:24 AM
to openqu...@googlegroups.com
Hey Lars,

I have tried to run the hazard/EventBasedPSHA demo. The job failed with the following error. Please advise.

sriram@sriram-K56CM:~$ openquake --rh /usr/openquake/engine/demos/hazard/EventBasedPSHA/job.ini 
[2013-05-30 15:36:59,011 hazard #7 sriram-K56CM PROGRESS MainProcess/19245 root] **  pre_executing (hazard)
[2013-05-30 15:36:59,028 hazard #7 sriram-K56CM PROGRESS MainProcess/19245 root] **  initializing sources
[2013-05-30 15:36:59,155 hazard #7 sriram-K56CM PROGRESS MainProcess/19245 root] **  initializing site model
[2013-05-30 15:36:59,222 hazard #7 sriram-K56CM PROGRESS MainProcess/19245 root] **  initializing realizations
[2013-05-30 15:36:59,378 hazard #7 sriram-K56CM PROGRESS MainProcess/19245 root] **  executing (hazard)
[2013-05-30 15:37:00,456 hazard #7 sriram-K56CM PROGRESS MainProcess/19245 root] **  building arglist
[2013-05-30 15:37:00,477 hazard #7 sriram-K56CM PROGRESS MainProcess/19245 root] **  spawning 10 tasks of kind ses_and_gmfs
[2013-05-30 15:37:29,413 hazard #7 sriram-K56CM CRITICAL MainProcess/19245 root] Calculation failed with exception: 'insert or update on table "performance" violates foreign key constraint "uiapi_performance_oq_job_fk"
DETAIL:  Key (oq_job_id)=(4) is not present in table "oq_job".
'
Traceback (most recent call last):
  File "/usr/bin/openquake", line 9, in <module>
    load_entry_point('openquake.engine==0.9.2', 'console_scripts', 'openquake')()
  File "/usr/lib/python2.7/dist-packages/openquake/engine/bin/oqscript.py", line 614, in main
    args.exports)
  File "/usr/lib/python2.7/dist-packages/openquake/engine/bin/oqscript.py", line 280, in run_hazard
    job, log_level, log_file, exports, 'hazard')
  File "/usr/lib/python2.7/dist-packages/openquake/engine/engine.py", line 351, in run_calc
    _do_run_calc(job, exports, calc, job_type)
  File "/usr/lib/python2.7/dist-packages/openquake/engine/engine.py", line 428, in _do_run_calc
    calc.execute()
  File "/usr/lib/python2.7/dist-packages/openquake/engine/calculators/hazard/event_based/core.py", line 491, in execute
    self.parallelize(self.core_calc_task, self.task_arg_gen())
  File "/usr/lib/python2.7/dist-packages/openquake/engine/calculators/base.py", line 98, in parallelize
    tasks.parallelize(task_func, argblock, lambda _: None)
  File "/usr/lib/python2.7/dist-packages/openquake/engine/utils/tasks.py", line 91, in parallelize
    _map_reduce(task_func, task_args, noagg, None)
  File "/usr/lib/python2.7/dist-packages/openquake/engine/utils/tasks.py", line 63, in _map_reduce
    for result in taskset.apply_async():
  File "/usr/lib/python2.7/dist-packages/celery/result.py", line 350, in iterate
    propagate=propagate)
  File "/usr/lib/python2.7/dist-packages/celery/result.py", line 95, in get
    interval=interval)
  File "/usr/lib/python2.7/dist-packages/celery/backends/amqp.py", line 147, in wait_for
    raise self.exception_to_python(meta["result"])
psycopg2.IntegrityError: insert or update on table "performance" violates foreign key constraint "uiapi_performance_oq_job_fk"
DETAIL:  Key (oq_job_id)=(4) is not present in table "oq_job".

[2013-05-30 15:37:29,951 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Traceback (most recent call last):
[2013-05-30 15:37:29,951 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] File "/usr/lib/python2.7/logging/__init__.py", line 857, in emit
[2013-05-30 15:37:29,952 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] stream.write(ufs % msg)
[2013-05-30 15:37:29,952 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] IOError: [Errno 5] Input/output error
[2013-05-30 15:37:29,952 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Logged from file tasks.py, line 149
[2013-05-30 15:37:29,951 hazard #7 sriram-K56CM CRITICAL PoolWorker-4/17761 root] Error occurred in task: Job 8 is not running
[2013-05-30 15:37:29,953 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Traceback (most recent call last):
[2013-05-30 15:37:29,954 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] File "/usr/lib/python2.7/logging/__init__.py", line 867, in emit
[2013-05-30 15:37:29,954 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] stream.write(fs % msg)
[2013-05-30 15:37:29,955 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] IOError: [Errno 5] Input/output error
[2013-05-30 15:37:29,955 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Logged from file tasks.py, line 150
[2013-05-30 15:37:29,953 hazard #7 sriram-K56CM ERROR PoolWorker-4/17761 root] JobCompletedError('Job 8 is not running',)
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/openquake/engine/utils/tasks.py", line 140, in wrapped
    raise JobCompletedError('Job %d is not running' % job_id)
JobCompletedError: Job 8 is not running
[2013-05-30 15:37:29,983 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Traceback (most recent call last):
[2013-05-30 15:37:29,984 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] File "/usr/lib/python2.7/logging/__init__.py", line 857, in emit
[2013-05-30 15:37:29,985 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] stream.write(ufs % msg)
[2013-05-30 15:37:29,987 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] IOError: [Errno 5] Input/output error
[2013-05-30 15:37:29,988 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Logged from file tasks.py, line 149
[2013-05-30 15:37:29,983 hazard #7 sriram-K56CM CRITICAL PoolWorker-4/17761 root] Error occurred in task: Job 8 is not running
[2013-05-30 15:37:29,989 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Traceback (most recent call last):
[2013-05-30 15:37:29,990 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] File "/usr/lib/python2.7/logging/__init__.py", line 867, in emit
[2013-05-30 15:37:29,990 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] stream.write(fs % msg)
[2013-05-30 15:37:29,991 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] IOError: [Errno 5] Input/output error
[2013-05-30 15:37:29,991 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Logged from file tasks.py, line 150
[2013-05-30 15:37:29,989 hazard #7 sriram-K56CM ERROR PoolWorker-4/17761 root] JobCompletedError('Job 8 is not running',)
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/openquake/engine/utils/tasks.py", line 140, in wrapped
    raise JobCompletedError('Job %d is not running' % job_id)
JobCompletedError: Job 8 is not running
[2013-05-30 15:37:30,015 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Traceback (most recent call last):
[2013-05-30 15:37:30,016 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] File "/usr/lib/python2.7/logging/__init__.py", line 857, in emit
[2013-05-30 15:37:30,017 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] stream.write(ufs % msg)
[2013-05-30 15:37:30,017 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] IOError: [Errno 5] Input/output error
[2013-05-30 15:37:30,018 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Logged from file tasks.py, line 149
[2013-05-30 15:37:30,015 hazard #7 sriram-K56CM CRITICAL PoolWorker-4/17761 root] Error occurred in task: Job 8 is not running
[2013-05-30 15:37:30,019 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Traceback (most recent call last):
[2013-05-30 15:37:30,019 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] File "/usr/lib/python2.7/logging/__init__.py", line 867, in emit
[2013-05-30 15:37:30,020 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] stream.write(fs % msg)
[2013-05-30 15:37:30,021 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] IOError: [Errno 5] Input/output error
[2013-05-30 15:37:30,021 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Logged from file tasks.py, line 150
[2013-05-30 15:37:30,019 hazard #7 sriram-K56CM ERROR PoolWorker-4/17761 root] JobCompletedError('Job 8 is not running',)
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/openquake/engine/utils/tasks.py", line 140, in wrapped
    raise JobCompletedError('Job %d is not running' % job_id)
JobCompletedError: Job 8 is not running
[2013-05-30 15:37:30,048 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Traceback (most recent call last):
[2013-05-30 15:37:30,048 hazard #7 sriram-K56CM WARNING PoolWorker-2/17759 celery] Traceback (most recent call last):
[2013-05-30 15:37:30,049 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] File "/usr/lib/python2.7/logging/__init__.py", line 857, in emit
[2013-05-30 15:37:30,049 hazard #7 sriram-K56CM WARNING PoolWorker-2/17759 celery] File "/usr/lib/python2.7/logging/__init__.py", line 857, in emit
[2013-05-30 15:37:30,050 hazard #7 sriram-K56CM WARNING PoolWorker-2/17759 celery] stream.write(ufs % msg)
[2013-05-30 15:37:30,050 hazard #7 sriram-K56CM WARNING PoolWorker-2/17759 celery] IOError: [Errno 5] Input/output error
[2013-05-30 15:37:30,049 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] stream.write(ufs % msg)
[2013-05-30 15:37:30,050 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] IOError: [Errno 5] Input/output error
[2013-05-30 15:37:30,051 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Logged from file tasks.py, line 149
[2013-05-30 15:37:30,048 hazard #7 sriram-K56CM CRITICAL PoolWorker-4/17761 root] Error occurred in task: Job 8 is not running
[2013-05-30 15:37:30,051 hazard #7 sriram-K56CM WARNING PoolWorker-2/17759 celery] Logged from file tasks.py, line 149
[2013-05-30 15:37:30,052 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Traceback (most recent call last):
[2013-05-30 15:37:30,052 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] File "/usr/lib/python2.7/logging/__init__.py", line 867, in emit
[2013-05-30 15:37:30,048 hazard #7 sriram-K56CM CRITICAL PoolWorker-2/17759 root] Error occurred in task: Job 8 is not running
[2013-05-30 15:37:30,053 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] stream.write(fs % msg)
[2013-05-30 15:37:30,053 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] IOError: [Errno 5] Input/output error
[2013-05-30 15:37:30,054 hazard #7 sriram-K56CM WARNING PoolWorker-4/17761 celery] Logged from file tasks.py, line 150
[2013-05-30 15:37:30,055 hazard #7 sriram-K56CM WARNING PoolWorker-2/17759 celery] Traceback (most recent call last):
[2013-05-30 15:37:30,051 hazard #7 sriram-K56CM ERROR PoolWorker-4/17761 root] JobCompletedError('Job 8 is not running',)
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/openquake/engine/utils/tasks.py", line 140, in wrapped
    raise JobCompletedError('Job %d is not running' % job_id)
JobCompletedError: Job 8 is not running
[2013-05-30 15:37:30,056 hazard #7 sriram-K56CM WARNING PoolWorker-2/17759 celery] File "/usr/lib/python2.7/logging/__init__.py", line 867, in emit
[2013-05-30 15:37:30,057 hazard #7 sriram-K56CM WARNING PoolWorker-2/17759 celery] stream.write(fs % msg)
[2013-05-30 15:37:30,058 hazard #7 sriram-K56CM WARNING PoolWorker-2/17759 celery] IOError: [Errno 5] Input/output error
[2013-05-30 15:37:30,059 hazard #7 sriram-K56CM WARNING PoolWorker-2/17759 celery] Logged from file tasks.py, line 150
[2013-05-30 15:37:30,054 hazard #7 sriram-K56CM ERROR PoolWorker-2/17759 root] JobCompletedError('Job 8 is not running',)
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/openquake/engine/utils/tasks.py", line 140, in wrapped
    raise JobCompletedError('Job %d is not running' % job_id)
JobCompletedError: Job 8 is not running
[2013-05-30 15:37:31,065 hazard #7 - ERROR MainProcess/19246 supervisor] job process 19245 crashed or terminated
Calculation 7 failed

Thank you,

Peace,
Sriram