Hi Daniel,
We're using AtoM 2.7.1.
The CSV had around 1,000 rows, but I noticed something interesting:
- If there are no digitalObjectURL values (links to digital objects stored in our AWS S3 bucket), the import works fine even with 3,000+ records.
- If digitalObjectURL values are present, errors start once the file has ~500+ records (after roughly 15 minutes).
Here's the error log:
PHP Fatal error: require(): Failed opening required '/usr/share/nginx/atom/vendor/symfony/lib/exception/sfDatabaseException.class.php' (include_path='/usr/share/nginx/atom:/usr/share/nginx/atom/vendor/symfony/lib/plugins/sfPropelPlugin/lib/vendor:/usr/share/nginx/atom/vendor/symfony/lib/plugins/sfPropelPlugin/lib/task:/usr/share/php') in /usr/share/nginx/atom/vendor/symfony/lib/autoload/sfCoreAutoload.class.php on line 100
Full command I used:
sudo -u www-data /usr/bin/php -d memory_limit=-1 /usr/share/nginx/atom/symfony csv:import /home/deploy/csv-files-archival-descriptions/2026-02-09.csv
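In case it helps narrow things down, here is a rough sketch of how I could split the CSV into smaller chunks (with the header repeated in each) to bisect the failing row range. The file name and columns below are placeholders, not my real paths or schema:

```shell
# Placeholder CSV standing in for the real export (toy data, made-up columns)
csv=descriptions.csv
printf 'legacyId,title,digitalObjectURL\n1,A,\n2,B,\n3,C,\n' > "$csv"

# Keep the header line, then split the data rows into fixed-size chunks
header=$(head -n 1 "$csv")
tail -n +2 "$csv" | split -l 2 - chunk_

# Prepend the header to each chunk so every file is a valid import on its own
for f in chunk_*; do
  { echo "$header"; cat "$f"; } > "$f.csv" && rm "$f"
done
```

Each chunk_*.csv could then be fed to csv:import separately to see which range triggers the error.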
Let me know if you need anything else.
Thank you!
Best,
Ekaterina
On Saturday, February 14, 2026 at 06:42:15 UTC+9, Daniel Lovegrove wrote: