
Lustre-based storage and spring.db file


Isabelle Gordon

Mar 12, 2020, 11:30:18 AM3/12/20
to emspring
Hi Carsten, 

I am struggling to produce a spring.db file when running jobs on our cluster. Our two main directories, /home and /scratch, are Lustre-based, and according to our IT manager, SQLite and Lustre are not compatible. However, there is another directory on the local disk of the compute nodes, /tmp, that could be used. Is there any way I could direct the code to produce the database file in this separate directory? Please let me know if you have any ideas on how I could work around this issue.
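One workaround sometimes used for SQLite-on-network-filesystem problems is to keep the database file on node-local disk and point a symlink at it from the Lustre working directory. A minimal sketch (the directory layout and helper name are hypothetical, not part of SPRING):

```python
import os

def redirect_db_to_local(workdir, local_base="/tmp"):
    """Create spring.db on node-local disk and symlink it into `workdir`.

    Hypothetical workaround sketch: the program still sees ./spring.db in
    the Lustre working directory, but the bytes live on local disk. Note
    that whether SQLite resolves the symlink when placing its journal file
    depends on the SQLite version, so this needs testing on the cluster.
    """
    local_dir = os.path.join(local_base, os.environ.get("USER", "user"), "spring_db")
    os.makedirs(local_dir, exist_ok=True)
    local_db = os.path.join(local_dir, "spring.db")
    open(local_db, "a").close()               # make sure the target file exists
    link = os.path.join(workdir, "spring.db")
    if os.path.lexists(link):                 # replace any stale link/file
        os.remove(link)
    os.symlink(local_db, link)
    return link
```

This only helps if every process that touches the database runs on the same node, since /tmp is not shared between nodes.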

ERROR Message:

INFO:compute_average_variance_and_eigenimages_on_orignal_stack:
INFO:compute_average_variance_and_eigenimages_on_orignal_stack:Averages and variances of aligned stack computed.
logged on Thu, 12 Mar 2020 09:54:39
INFO:compute_average_variance_and_eigenimages_on_orignal_stack:Eigenimages of aligned stack computed.
logged on Thu, 12 Mar 2020 09:54:45

Traceback (most recent call last):
  File "/export/src/spring_v0-86-1661/bin/segmentclass", line 171, in <module>
    sys.exit(spring.segment2d.segmentclass.main())
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/emspring-0.86.1661-py2.7.egg/spring/segment2d/segmentclass.py", line 813, in main
    stack.classify()
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/emspring-0.86.1661-py2.7.egg/spring/segment2d/segmentclass.py", line 801, in classify
    self.finish_classification(avgstack, varstack, self.infilestack, self.output_avgstack, self.output_varstack)
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/emspring-0.86.1661-py2.7.egg/spring/segment2d/segmentclass.py", line 759, in finish_classification
    session = self.create_database_with_stack_id_entries(segment_count)
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/emspring-0.86.1661-py2.7.egg/spring/segment2d/segmentclass.py", line 729, in create_database_with_stack_id_entries
    session = SpringDataBase().setup_sqlite_db(base)
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/emspring-0.86.1661-py2.7.egg/spring/csinfrastr/csdatabase.py", line 646, in setup_sqlite_db
    base.metadata.create_all(engine)
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/SQLAlchemy-1.1.15-py2.7-linux-x86_64.egg/sqlalchemy/sql/schema.py", line 3949, in create_all
    tables=tables)
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/SQLAlchemy-1.1.15-py2.7-linux-x86_64.egg/sqlalchemy/engine/base.py", line 1929, in _run_visitor
    conn._run_visitor(visitorcallable, element, **kwargs)
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/SQLAlchemy-1.1.15-py2.7-linux-x86_64.egg/sqlalchemy/engine/base.py", line 1538, in _run_visitor
    **kwargs).traverse_single(element)
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/SQLAlchemy-1.1.15-py2.7-linux-x86_64.egg/sqlalchemy/sql/visitors.py", line 121, in traverse_single
    return meth(obj, **kw)
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/SQLAlchemy-1.1.15-py2.7-linux-x86_64.egg/sqlalchemy/sql/ddl.py", line 712, in visit_metadata
    [t for t in tables if self._can_create_table(t)])
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/SQLAlchemy-1.1.15-py2.7-linux-x86_64.egg/sqlalchemy/sql/ddl.py", line 687, in _can_create_table
    table.name, schema=effective_schema)
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/SQLAlchemy-1.1.15-py2.7-linux-x86_64.egg/sqlalchemy/dialects/sqlite/base.py", line 1181, in has_table
    connection, "table_info", table_name, schema=schema)
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/SQLAlchemy-1.1.15-py2.7-linux-x86_64.egg/sqlalchemy/dialects/sqlite/base.py", line 1572, in _get_table_pragma
    cursor = connection.execute(statement)
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/SQLAlchemy-1.1.15-py2.7-linux-x86_64.egg/sqlalchemy/engine/base.py", line 939, in execute
    return self._execute_text(object, multiparams, params)
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/SQLAlchemy-1.1.15-py2.7-linux-x86_64.egg/sqlalchemy/engine/base.py", line 1097, in _execute_text
    statement, parameters
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/SQLAlchemy-1.1.15-py2.7-linux-x86_64.egg/sqlalchemy/engine/base.py", line 1189, in _execute_context
    context)
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/SQLAlchemy-1.1.15-py2.7-linux-x86_64.egg/sqlalchemy/engine/base.py", line 1402, in _handle_dbapi_exception
    exc_info
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/SQLAlchemy-1.1.15-py2.7-linux-x86_64.egg/sqlalchemy/util/compat.py", line 203, in raise_from_cause
    reraise(type(exception), exception, tb=exc_tb, cause=cause)
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/SQLAlchemy-1.1.15-py2.7-linux-x86_64.egg/sqlalchemy/engine/base.py", line 1182, in _execute_context
    context)
  File "/export/src/spring_v0-86-1661/lib/python2.7/site-packages/SQLAlchemy-1.1.15-py2.7-linux-x86_64.egg/sqlalchemy/engine/default.py", line 470, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) disk I/O error [SQL: u'PRAGMA table_info("segments")']
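The "disk I/O error" at the bottom comes straight from sqlite3, typically when the filesystem rejects the POSIX locks SQLite takes (a common symptom on Lustre mounts without flock support). A quick way to probe whether a given directory can host an SQLite database (the function name is illustrative, not part of SPRING):

```python
import os
import sqlite3
import sys
import tempfile

def sqlite_ok(directory):
    """Create, write, and read a throwaway SQLite db in `directory`.

    Returns True if SQLite's locking and I/O work there; on a mount
    without working locks, sqlite3 would typically raise
    OperationalError ("disk I/O error" or "database is locked").
    """
    path = os.path.join(directory, "probe.db")
    try:
        conn = sqlite3.connect(path)
        conn.execute("CREATE TABLE t (x INTEGER)")
        conn.execute("INSERT INTO t VALUES (1)")
        conn.commit()
        ok = conn.execute("SELECT x FROM t").fetchone() == (1,)
        conn.close()
        return ok
    except sqlite3.OperationalError:
        return False
    finally:
        if os.path.exists(path):
            os.remove(path)

if __name__ == "__main__":
    # Probe a directory given on the command line, defaulting to /tmp.
    target = sys.argv[1] if len(sys.argv) > 1 else tempfile.gettempdir()
    print(target, "->", "OK" if sqlite_ok(target) else "SQLite I/O problem")
```

Running this once against /home, /scratch, and /tmp on a compute node would confirm where the database can safely live.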


Version

spring --version && springenv python -c 'import platform; print platform.platform()'

Spring environment loaded.
GUI from package Emspring-0.86.1661
Spring environment loaded.
Linux-2.6.32-696.20.1.el6.centos.plus.x86_64-x86_64-with-centos-6.10-Final



Thank you for your time and help.   


Isabelle Gordon
Washington University in St. Louis - School of Medicine 
Neurology Department 

Carsten Sachse

Mar 12, 2020, 4:34:43 PM3/12/20
to emspring
Hi Isabelle,

You are the first person to report such a compatibility issue with a Lustre-based file system, and there may not be an easy fix for this more general problem; I assume your IT manager has already looked into it. As a workaround, you can certainly work on the local drive and restrict the computing to a single node. If your nodes have many CPUs, this can still be quite efficient. By design, SPRING needs a file system that is accessible to every CPU job.
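The single-node workaround can be scripted as a stage-in / compute / stage-out pattern in the job script. A sketch (file names, the helper, and the commented-out command are illustrative, not SPRING's actual interface):

```python
import os
import shutil
import tempfile

def run_on_local_disk(inputs, results_dir, local_base="/tmp"):
    """Stage inputs to node-local disk, run there, copy results back.

    Sketch of the suggested workaround: because the job runs with its
    working directory on local disk, spring.db is created locally and
    never touches Lustre; only the finished files are copied back.
    Assumes the whole job runs on a single node.
    """
    workdir = tempfile.mkdtemp(dir=local_base)        # node-local scratch
    for f in inputs:
        shutil.copy(f, workdir)                        # stage in
    # subprocess.run(["segmentclass", ...], cwd=workdir)  # real job here
    open(os.path.join(workdir, "spring.db"), "a").close()  # stand-in output
    os.makedirs(results_dir, exist_ok=True)
    for name in os.listdir(workdir):
        shutil.copy(os.path.join(workdir, name), results_dir)  # stage out
    shutil.rmtree(workdir)                             # clean local disk
    return results_dir
```

Cleaning up the local scratch at the end matters on shared nodes, since /tmp is usually small and not purged between jobs on every cluster.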

Best wishes,


Carsten