haMSM error

Megha Parashar

Jan 28, 2026, 2:12:44 PM
to westpa-users
Hello all,
While running the haMSM-WE, I am getting this error:
```
===== Restart 0, Run 2 initializing =====

Run:
 w_init --tstate-file tstate.file --bstate-file bstates/bstates.txt --sstate-file None --segs-per-state 2

exception caught; shutting down
-- ERROR    [w_run] -- error message: [Errno 24] Too many open files
-- ERROR    [w_run] -- Traceback (most recent call last):
```

I decreased the number of bins just to check whether the bins were playing a role here, but I am still getting the same error. I could not figure out how to resolve it, or why it occurs in the first place.

Thanks,
Megha

Jeremy Leung

Jan 28, 2026, 2:49:38 PM
to westpa-users
Hi Megha,

It's hard to tell exactly what is causing this error just from your output snippet, but to work around it you can increase your ulimit, which is UNIX's cap on the number of open files per process. The default is often 1024, which may be too low depending on how many files you have to deal with. Also check your get_pcoord.sh to make sure you're not (unnecessarily) opening too many files at the same time!

```
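# raise the soft limit on open file descriptors for this shell and the processes it launches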
ulimit -n 5012
```
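
For reference, here is a quick way to see where you currently stand; this is a minimal sketch, assuming a Linux machine, where `<pid>` is a placeholder for the running `w_run` process ID (`lsof` may need to be installed separately):

```
# current soft and hard limits on open file descriptors
ulimit -Sn
ulimit -Hn

# count the file descriptors held by a running process (replace <pid>)
ls /proc/<pid>/fd | wc -l
# or, if lsof is available:
lsof -p <pid> | wc -l
```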

You might also try a different work manager (`--work-manager='threads'` or `--work-manager='processes'`) and see whether the way Python/bash/the shell deals with open files changes. The `--debug` flag will help too.
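
As a rough sketch (assuming you call `w_run` directly from your run script; the log file name here is arbitrary):

```
# try the 'processes' work manager with verbose logging enabled
w_run --work-manager='processes' --debug &> west-debug.log
```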

Best,

Jeremy L.

---
Jeremy M. G. Leung, PhD
Research Assistant Professor, Chemistry (Chong Lab)
University of Pittsburgh | 219 Parkman Avenue, Pittsburgh, PA 15260
jml...@pitt.edu | [He, Him, His]