Hi Brad,
In general we recommend against increasing the PHP timeout limit for public AtoM sites precisely because of the problem you are experiencing: allowing longer-running requests creates more load on the server. In many cases requests for large EAD files (e.g. photographic-material;ead?sf_format=xml) are made by search crawlers, which exacerbates the problem. In a future version of AtoM we would like to move the expensive work of generating large EAD documents to a background process instead of generating the document each time it is requested, but this enhancement will require community sponsorship to make it a reality.
In most cases when we've noticed significant slowdowns in AtoM performance, the bottleneck is MySQL maxing out the CPU. You can check whether this is the bottleneck in your case by running "uptime" on your MySQL server and looking at the load average - for 4 CPU cores, a load of 4 or higher means that processes are waiting for CPU time. You can confirm that MySQL is the process using most of the CPU by using "top" or "htop" to check the resources being used by the mysqld process. While it's true that AtoM doesn't usually show very long query times, it does make a lot of queries, especially for long operations like generating large EAD finding aids.
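For example, on a 4-core server you might see something like this (the numbers below are only illustrative, not from your system):

    $ uptime
     14:02:10 up 12 days,  3:11,  1 user,  load average: 5.12, 4.87, 4.60

A load average that stays above 4 on 4 cores, combined with mysqld sitting at the top of the CPU column in "top" or "htop", would point to MySQL as the bottleneck.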
I think it's unlikely that the number of available PHP processes is your bottleneck. Assuming you are using our installation instructions, the php-fpm "pm" settings (e.g. pm.max_children, pm.start_servers) listed should allow serving up to 30 PHP processes at the same time. However, you could try increasing the value of "pm.max_children" to see if it helps (make sure to restart php5-fpm after changes). You can count the number of php-fpm processes running with "ps fauxwwww | grep -c php5-fp[m]".
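For reference, the pm settings live in the php-fpm pool configuration - on a typical php5-fpm install the path is something like /etc/php5/fpm/pool.d/atom.conf, but adjust for your setup. The values below are only illustrative, not a recommendation for your server:

    pm = dynamic
    pm.max_children = 30
    pm.start_servers = 10
    pm.min_spare_servers = 5
    pm.max_spare_servers = 15

    ; after editing, restart the service, e.g.:
    ; sudo service php5-fpm restart

If you do raise pm.max_children, keep an eye on memory use, since each PHP worker can consume a significant amount of RAM.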
You may also want to analyze your web logs - in many cases where we've seen significant slowdowns in AtoM, they were caused by search engine crawlers making a lot of requests in a short amount of time, often tens of thousands of requests a day. We've had good results with adding a Crawl-delay directive to robots.txt to slow down requests (30 is a good initial value to try) and with blocking particularly demanding or unwanted crawlers.
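For example, a robots.txt at the root of your site along these lines (the blocked user-agent below is just a placeholder for whichever crawler your logs show is misbehaving):

    User-agent: *
    Crawl-delay: 30

    User-agent: SomeAggressiveBot
    Disallow: /

Note that Crawl-delay is honoured by some crawlers (e.g. Bing, Yandex) but not by Googlebot, whose crawl rate is managed separately through Google's webmaster tools.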
I hope that helps!
Best regards,
David