Browsing facets problem


Scott Renton

May 29, 2024, 5:37:14 AM
to archivesspac...@lyrasislists.org
Hi folks

Wondering if anyone has seen this before. We have just put in a test instance of ArchivesSpace with a view to a live upgrade. It runs searches perfectly fast, but browsing Creators or Subjects takes an age for very small result sets, and times out with an error on bigger ones (timeout error below), rendering it pretty unusable. I know the facet links are far more complicated than basic searches, but generally they return pretty speedily. I've tried a number of configurations and reindexes, but to no avail. If anyone has any ideas, I'd be keen to give them a try.

Cheers
Scott


Running:
AS: version 3.4.1 (the same database seems to be running stably in live on version 3.2.0)
VM: Rocky Linux 8, 16G, 4 CPUs
Java: openjdk version "1.8.0_412", currently given 6G Xmx (had been on 10G)
Solr: initially running from the provided Dockerfile (8.10.1) in a container, but I've also tried a VM-level 8.11.3 with the same behaviour. I've experimented with SOLR_HEAP (initially 512M, currently up at 4G) and made some light-touch adjustments to solrconfig.xml. It is now taking an age to index, perhaps in contrast to what I'd expect!
MySQL: MariaDB 10.6.18, same my.cnf setup as the stable instance (my.cnf below). Haven't yet truncated deleted_records as there are only 1500 records in there.
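For reference, the SOLR_HEAP experiments above map onto lines like these in Solr's bin/solr.in.sh. This is a sketch: the values are the ones tried above, and the GC logging line is an optional extra (not part of this setup) for checking whether the heap itself is the bottleneck:

```shell
# bin/solr.in.sh (sketch; values from the experiments described above)
SOLR_HEAP="4g"            # was 512m initially; overrides SOLR_JAVA_MEM when set
# Optional (assumption, not from this setup): GC logging to watch heap pressure
GC_LOG_OPTS="-verbose:gc"
```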

Typical successful call (clicking on the person filter on a search):
I, [2024-05-29T10:22:54.804516 #1639909]  INFO -- : Started GET "/search?facets%5B%5D=primary_type&facets%5B%5D=source&facets%5B%5D=rules&filter_term%5B%5D=%7B%22primary_type%22%3A%22agent_person%22%7D&q=john&sort=score+desc" for 10.64.94.3 at 2024-05-29 10:22:54 +0100
I, [2024-05-29T10:22:54.807392 #1639909]  INFO -- : Processing by SearchController#do_search as HTML
I, [2024-05-29T10:22:54.807539 #1639909]  INFO -- :   Parameters: {"facets"=>["primary_type", "source", "rules"], "filter_term"=>["{\"primary_type\":\"agent_person\"}"], "q"=>"john", "sort"=>"score desc"}

Typical failing call (clicking on a specific person facet):
I, [2024-05-29T10:23:49.077545 #1639909]  INFO -- : Started GET "/search?filter_term%5B%5D=%7B%22creators%22%3A%22Anderson%2C+Tom%2C+1910-1991+%284927%29%22%7D&q=john&sort=score+desc" for 10.64.94.3 at 2024-05-29 10:23:49 +0100
I, [2024-05-29T10:23:49.082467 #1639909]  INFO -- : Processing by SearchController#do_search as HTML
I, [2024-05-29T10:23:49.082641 #1639909]  INFO -- :   Parameters: {"filter_term"=>["{\"creators\":\"Anderson, Tom, 1910-1991 (4927)\"}"], "q"=>"john", "sort"=>"score desc"}

my.cnf setup (referenced above):
[mysqld]
basedir = /usr
bind-address = 127.0.0.1
datadir = /var/lib/mysql
expire_logs_days = 10
key_buffer_size = 16M
log-error = /apps/logs/mysql/mariadb.log
max_allowed_packet = 16M
max_binlog_size = 100M
max_connections = 151
pid-file = /var/lib/mysql/mysqld.pid
port = 3306
skip-external-locking
skip-ssl
socket = /var/lib/mysql/mysql.sock
ssl = false
thread_cache_size = 8
thread_stack = 256K
tmpdir = /tmp
user = mysql

Timeout error (referenced above):
WARNING: ERROR: couldn't handle exception (response is committed)
java.io.IOException: java.util.concurrent.TimeoutException: Idle timeout expired: 30001/30000 ms
at org.eclipse.jetty.util.SharedBlockingCallback$Blocker.block(org/eclipse/jetty/util/SharedBlockingCallback.java:257)
at org.eclipse.jetty.server.HttpOutput.channelWrite(org/eclipse/jetty/server/HttpOutput.java:270)
at org.eclipse.jetty.server.HttpOutput.write(org/eclipse/jetty/server/HttpOutput.java:896)
at java.io.OutputStream.write(java/io/OutputStream.java:75)
at java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:498)
at org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:456)
at org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:317)
at uri_3a_classloader_3a_.jruby.rack.response.invokeOther1:write(uri_3a_classloader_3a_/jruby/rack/uri:classloader:/jruby/rack/response.rb:178)
at uri_3a_classloader_3a_.jruby.rack.response.write_body(uri:classloader:/jruby/rack/response.rb:178)
at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1821)
at org.jruby.RubyArray$INVOKER$i$0$0$each.call(org/jruby/RubyArray$INVOKER$i$0$0$each.gen)
at org.jruby.RubyClass.finvokeWithRefinements(org/jruby/RubyClass.java:514)
at org.jruby.RubyBasicObject.send(org/jruby/RubyBasicObject.java:1755)
at org.jruby.RubyBasicObject$INVOKER$i$send.call(org/jruby/RubyBasicObject$INVOKER$i$send.gen)
at apps.www.archivesspace.gems.gems.rack_minus_2_dot_2_dot_6_dot_2.lib.rack.body_proxy.invokeOther1:__send__(apps/www/archivesspace/gems/gems/rack_minus_2_dot_2_dot_6_dot_2/lib/rack//apps/www/archivesspace/gems/gems/rack-2.2.6.2/lib/rack/body_proxy.rb:41)
at apps.www.archivesspace.gems.gems.rack_minus_2_dot_2_dot_6_dot_2.lib.rack.body_proxy.method_missing(/apps/www/archivesspace/gems/gems/rack-2.2.6.2/lib/rack/body_proxy.rb:41)
at org.jruby.RubyClass.finvokeWithRefinements(org/jruby/RubyClass.java:512)
at org.jruby.RubyBasicObject.send(org/jruby/RubyBasicObject.java:1755)
at org.jruby.RubyBasicObject$INVOKER$i$send.call(org/jruby/RubyBasicObject$INVOKER$i$send.gen)
at apps.www.archivesspace.gems.gems.rack_minus_2_dot_2_dot_6_dot_2.lib.rack.body_proxy.invokeOther1:__send__(apps/www/archivesspace/gems/gems/rack_minus_2_dot_2_dot_6_dot_2/lib/rack//apps/www/archivesspace/gems/gems/rack-2.2.6.2/lib/rack/body_proxy.rb:41)
at apps.www.archivesspace.gems.gems.rack_minus_2_dot_2_dot_6_dot_2.lib.rack.body_proxy.method_missing(/apps/www/archivesspace/gems/gems/rack-2.2.6.2/lib/rack/body_proxy.rb:41)
at org.jruby.RubyClass.finvokeWithRefinements(org/jruby/RubyClass.java:512)
at org.jruby.RubyBasicObject.send(org/jruby/RubyBasicObject.java:1755)
at org.jruby.RubyBasicObject$INVOKER$i$send.call(org/jruby/RubyBasicObject$INVOKER$i$send.gen)
at apps.www.archivesspace.gems.gems.rack_minus_2_dot_2_dot_6_dot_2.lib.rack.body_proxy.invokeOther1:__send__(apps/www/archivesspace/gems/gems/rack_minus_2_dot_2_dot_6_dot_2/lib/rack//apps/www/archivesspace/gems/gems/rack-2.2.6.2/lib/rack/body_proxy.rb:41)
at apps.www.archivesspace.gems.gems.rack_minus_2_dot_2_dot_6_dot_2.lib.rack.body_proxy.method_missing(/apps/www/archivesspace/gems/gems/rack-2.2.6.2/lib/rack/body_proxy.rb:41)
at org.jruby.RubyClass.finvokeWithRefinements(org/jruby/RubyClass.java:512)
at org.jruby.RubyBasicObject.send(org/jruby/RubyBasicObject.java:1755)
at org.jruby.RubyBasicObject$INVOKER$i$send.call(org/jruby/RubyBasicObject$INVOKER$i$send.gen)
at apps.www.archivesspace.gems.gems.rack_minus_2_dot_2_dot_6_dot_2.lib.rack.body_proxy.invokeOther1:__send__(apps/www/archivesspace/gems/gems/rack_minus_2_dot_2_dot_6_dot_2/lib/rack//apps/www/archivesspace/gems/gems/rack-2.2.6.2/lib/rack/body_proxy.rb:41)
at apps.www.archivesspace.gems.gems.rack_minus_2_dot_2_dot_6_dot_2.lib.rack.body_proxy.method_missing(/apps/www/archivesspace/gems/gems/rack-2.2.6.2/lib/rack/body_proxy.rb:41)
at org.jruby.RubyClass.finvokeWithRefinements(org/jruby/RubyClass.java:495)
at org.jruby.RubyBasicObject.send(org/jruby/RubyBasicObject.java:1729)
at org.jruby.RubyKernel.send(org/jruby/RubyKernel.java:2182)
at org.jruby.RubyKernel$INVOKER$s$send.call(org/jruby/RubyKernel$INVOKER$s$send.gen)
at uri_3a_classloader_3a_.jruby.rack.response.invokeOther78:send(uri_3a_classloader_3a_/jruby/rack/uri:classloader:/jruby/rack/response.rb:177)
at uri_3a_classloader_3a_.jruby.rack.response.write_body(uri:classloader:/jruby/rack/response.rb:177)
at uri_3a_classloader_3a_.jruby.rack.response.invokeOther3:write_body(uri_3a_classloader_3a_/jruby/rack/uri:classloader:/jruby/rack/response.rb:98)
at uri_3a_classloader_3a_.jruby.rack.response.respond(uri:classloader:/jruby/rack/response.rb:98)
at org.jruby.rack.AbstractRackDispatcher.process(org/jruby/rack/AbstractRackDispatcher.java:33)
at org.jruby.rack.AbstractFilter.doFilter(org/jruby/rack/AbstractFilter.java:66)
at org.eclipse.jetty.servlet.FilterHolder.doFilter(org/eclipse/jetty/servlet/FilterHolder.java:201)
at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(org/eclipse/jetty/servlet/ServletHandler.java:1601)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(org/eclipse/jetty/servlet/ServletHandler.java:548)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(org/eclipse/jetty/server/handler/ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(org/eclipse/jetty/security/SecurityHandler.java:600)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(org/eclipse/jetty/server/handler/HandlerWrapper.java:127)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(org/eclipse/jetty/server/handler/ScopedHandler.java:235)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(org/eclipse/jetty/server/session/SessionHandler.java:1624)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(org/eclipse/jetty/server/handler/ScopedHandler.java:233)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(org/eclipse/jetty/server/handler/ContextHandler.java:1434)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(org/eclipse/jetty/server/handler/ScopedHandler.java:188)
at org.eclipse.jetty.servlet.ServletHandler.doScope(org/eclipse/jetty/servlet/ServletHandler.java:501)
at org.eclipse.jetty.server.session.SessionHandler.doScope(org/eclipse/jetty/server/session/SessionHandler.java:1594)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(org/eclipse/jetty/server/handler/ScopedHandler.java:186)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(org/eclipse/jetty/server/handler/ContextHandler.java:1349)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(org/eclipse/jetty/server/handler/ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(org/eclipse/jetty/server/handler/ContextHandlerCollection.java:234)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(org/eclipse/jetty/server/handler/HandlerWrapper.java:127)
at org.eclipse.jetty.server.Server.handle(org/eclipse/jetty/server/Server.java:516)
at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(org/eclipse/jetty/server/HttpChannel.java:400)
at org.eclipse.jetty.server.HttpChannel.dispatch(org/eclipse/jetty/server/HttpChannel.java:645)
at org.eclipse.jetty.server.HttpChannel.handle(org/eclipse/jetty/server/HttpChannel.java:392)
at org.eclipse.jetty.server.HttpConnection.onFillable(org/eclipse/jetty/server/HttpConnection.java:277)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(org/eclipse/jetty/io/AbstractConnection.java:311)
at org.eclipse.jetty.io.FillInterest.fillable(org/eclipse/jetty/io/FillInterest.java:105)
at org.eclipse.jetty.io.ChannelEndPoint$1.run(org/eclipse/jetty/io/ChannelEndPoint.java:104)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(org/eclipse/jetty/util/thread/strategy/EatWhatYouKill.java:338)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(org/eclipse/jetty/util/thread/strategy/EatWhatYouKill.java:315)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(org/eclipse/jetty/util/thread/strategy/EatWhatYouKill.java:173)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(org/eclipse/jetty/util/thread/strategy/EatWhatYouKill.java:137)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(org/eclipse/jetty/util/thread/QueuedThreadPool.java:883)
at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(org/eclipse/jetty/util/thread/QueuedThreadPool.java:1034)
at java.lang.Thread.run(java/lang/Thread.java:750)
Caused by: java.util.concurrent.TimeoutException: Idle timeout expired: 30001/30000 ms
at org.eclipse.jetty.io.IdleTimeout.checkIdleTimeout(IdleTimeout.java:171)
at org.eclipse.jetty.io.IdleTimeout.idleCheck(IdleTimeout.java:113)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)


==========

Scott Renton

Digital Library Development & Systems

Floor F East

Argyle House

515219

The University of Edinburgh is a charitable body, registered in Scotland, with registration number SC005336. Is e buidheann carthannais a th’ ann an Oilthigh Dhùn Èideann, clàraichte an Alba, àireamh clàraidh SC005336.

Joshua D. Shaw

May 29, 2024, 8:01:43 AM
to archivesspac...@lyrasislists.org
Hi Scott

I've seen this before running on my dev laptop when we moved to 3.3.1 (from 3.1.1). There's probably something right above the error in the logs that will give you a response time for a previous operation that's over the limit. In my case it was a notification thread (controlled by: AppConfig[:notifications_backlog_ms] = 60000), but it all turned out to be red herrings (as far as I can tell). 

In my case, the actual problem seems to have been the Solr timeout and the indexer config - basically Solr was failing to commit records in a timely fashion.

We've tuned our AppConfig with the following, but these will depend on your situation.

# Solr commit time
AppConfig[:indexer_solr_timeout_seconds] = 7200

# PUI thread count
AppConfig[:pui_indexer_records_per_thread] = 50
AppConfig[:pui_indexer_thread_count] = 2

We have not fiddled with the actual solrconfig.xml. Locally I'm running

  1. Java 11
  2. MariaDB 10.11.8 (the latest 10.11 release - not 10.11.7, which we've had issues with)
  3. Solr 8.11.2
  4. 4G memory allocated each for the app and Solr
  5. Nothing special in my.cnf

I'd fiddle with the Solr and indexer settings and rerun the index. How many records do you have? We have about 800k AOs and 20k resources and our vanilla index run (without plugin additions) takes about 6-8 hours when I run locally (bare metal on an M2 Mac). Containerization usually adds 25-50% to that locally.
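Rerunning the index from scratch generally means clearing the indexer's state files so every record is treated as new. A minimal sketch, using a throwaway /tmp layout so the commands are safe to run verbatim (on a real install, stop ArchivesSpace first and point AS_HOME at your actual home, e.g. /apps/www/archivesspace):

```shell
# Simulated layout under /tmp so this is safe to run anywhere;
# on a real install use $AS_HOME/data/indexer_state (and indexer_pui_state).
AS_HOME=/tmp/archivesspace-demo
mkdir -p "$AS_HOME/data/indexer_state"
touch "$AS_HOME/data/indexer_state/2_archival_object.dat"

# Removing the .dat state files makes the indexer re-send everything on next start.
rm -f "$AS_HOME"/data/indexer_state/*.dat
ls -1 "$AS_HOME/data/indexer_state" | wc -l   # count of remaining state files (0)
```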

Unfortunately I never found a definitive answer and was more interested in getting things working than doing a deep dive into creating a reproducible error, so this is more of a 'these changes fixed it for us' type reply.

Best,
Joshua



Scott Renton

May 30, 2024, 5:15:12 AM
to Joshua D. Shaw, archivesspac...@lyrasislists.org
Thanks Joshua, much appreciated. This isn't huge: 126000 AOs and 50000-ish digital objects. But the index on the live version is over 4GB, which sounds pretty massive.

I've changed some of the settings as per your experience (the PUI isn't even enabled for this, as an external site pulls data via the API; I should have said this problem is all in the frontend).

AppConfig[:indexer_records_per_thread] = 50
AppConfig[:indexer_solr_timeout_seconds] = 7200 

I had the thread count at 8 when I reindexed, but put it back to 4 (I only have 4 CPUs on this machine). The index is complete, and interestingly its size has come down from 4.2GB to 2.9GB, which sounds positive.

No change to the behaviour though, and other than the size of the index not really anything to suggest I'm getting closer. I do notice that it's not just facet browsing it struggles with, but also searches with more than one term.

I have another installation running v3.4, and it's totally fine (more content, but interestingly a smaller index, with solr_heap unchanged at 512M in fact). I wonder if it's worth putting 3.2 back on (especially as you'd said you'd seen similar problems going from 3.1 to 3.3, Joshua).

Still struggling!
Scott


Joshua D. Shaw

May 30, 2024, 7:39:45 AM
to Scott Renton, archivesspac...@lyrasislists.org
Hey Scott

That's interesting that one instance works OK, but the other does not. Is there a qualitative difference in the data between the instance that works fine vs this one?

Here are a couple of other things that might be useful to check/try.

  1. Wasn't sure if you had the PUI indexing turned off altogether or not, but if you haven't: AppConfig[:pui_indexer_enabled] = false and rerun the indexer.
  2. During the indexer run, do you see any errors - especially ones that mention timeouts?
  3. Does checking/searching Solr directly yield the same problem(s)?
  4. Can you see anything in the Solr logs themselves?
  5. Solr can take a looooong time to finalize the data - even after the index round claims to have finished. I've seen it take up to several hours before things are actually done.

For comparison, our Solr data is about 9GB, but we have few subjects and few agents linked to archival objects. A large number of objects linked to a single subject has been the culprit for some when it comes to slow/problem searches.

Best,
Joshua



Scott Renton

May 30, 2024, 12:41:49 PM
to Joshua D. Shaw, archivesspac...@lyrasislists.org
Thanks again Joshua.

Not hugely different. The other application has 145474 records and reports 1.99GB as its index. Its config is almost out of the box.

To answer the questions...

  1. Yeah, pui disabled as is pui_indexer.
  2. Running a separate log for the indexer, set to debug, no errors (index round lasted from 2024-05-29T13:52:46 to I, [2024-05-29T19:59:41] However, its last message was this:  INFO -- : Thread-2932: Staff Indexer [2024-05-29 19:59:41 +0100] ~~~ Indexed 89600 of 126837 archival_object records in repository Tad_Sandbox. The application SAYS there are 126837 records, but it seems to stop indexing at 89600? Both the application and SOLR admin panel report 198830 records in total, which this adds up to.  In indexer_state "May 29 21:00 2_archival_object.dat". So it reported as finished an hour later, even if we didn't see the logging report. That does seem  a bit odd.
  3. Good question. If I search for "john mackay" (9319 entries) (https://test.tad.is.ed.ac.uk/search?utf8=%E2%9C%93&q=john+mackay), it takes 50 seconds in the application , and 1.66 seconds in total in the SOLR admin panel (http://lac-archivesspace-test2.is.ed.ac.uk:8983/solr/#/archivesspace/query?q=john%20mackay&q.op=AND&indent=true&rows=50). If I click on this name as a facet, it will never come back, but I think seeing this searching discrepancy is useful as it gives evidential numbers. I'm sure if I can nail the cause both searching and browsing facets will improve.
  4. In that indexing run, only the warnings about security in SOLR Logging. No actual errors.
  5. I know what you mean- I have seen that in AS. Been finished a good 22 hours now though.
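One way into point 3: the admin-panel query can also be issued as a direct /select call, and Solr's debugQuery=true parameter asks Solr to break down where its own time goes. This sketch only prints the curl command (hostname and core name taken from the URLs above; assumes plain standalone Solr with no auth in front) rather than hitting the network:

```shell
# Build the direct Solr query behind the 50s-vs-1.66s comparison above.
# debugQuery=true adds Solr's own timing breakdown to the response.
SOLR=http://lac-archivesspace-test2.is.ed.ac.uk:8983/solr/archivesspace
Q='john%20mackay'
echo "curl '$SOLR/select?q=$Q&q.op=AND&rows=50&debugQuery=true'"
```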

There is quite a consistent structure to the data in this instance. Only 1 repo, 3 series, about 17000 sub-series, and the rest items (it is not traditional archives, but an archive of tapes and tracks).

A couple of oddities in there, I guess.

Cheers yet again
Scott



Blake Carver

May 30, 2024, 8:08:35 PM
to archivesspac...@lyrasislists.org
>> "Staff Indexer [2024-05-29 19:59:41 +0100] ~~~ Indexed 89600 of 126837 archival_object records in repository Tad_Sandbox. The application SAYS there are 126837 records, but it seems to stop indexing at 89600? "


If that's really the final line, something doesn't seem right. It should finish up with something more like "Indexed 126837 archival_object records in 8956 seconds".

I wouldn't be surprised if it crashes and starts over after a little while from that line. If you grep for "Indexed", can you see the "Indexed" count start over? I don't think it always logs the crash; rather, it just starts over at 1 again.
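Sketched on a fabricated three-line log excerpt (the real file is typically logs/archivesspace.out, and the exact message text varies), the check looks like this: if the running count suddenly drops back to a small number, the indexer has started over.

```shell
# Fabricated excerpt standing in for the real indexer log.
cat > /tmp/indexer_sample.log <<'EOF'
INFO -- : Staff Indexer ~~~ Indexed 89500 of 126837 archival_object records
INFO -- : Staff Indexer ~~~ Indexed 89600 of 126837 archival_object records
INFO -- : Staff Indexer ~~~ Indexed 100 of 126837 archival_object records
EOF

# Pull out just the running counts; a sudden drop means a restart.
grep -o 'Indexed [0-9]*' /tmp/indexer_sample.log
# Indexed 89500
# Indexed 89600
# Indexed 100   <- count reset: the round restarted
```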


Scott Renton

May 31, 2024, 6:24:35 AM
to Archivesspac...@lyrasislists.org, Blake Carver
Thanks Blake, much appreciated. Re-extracted the data, reloaded, reindexed. Indexing round did complete, and I got the appropriate messages (it was slower, but I was running fewer threads).

I, [2024-05-31T09:57:49.164126 #1815845]  INFO -- : Thread-2932: Staff Indexer [2024-05-31 09:57:49 +0100] Indexed 126837 records in 32975 seconds

Hate to say it, but it's now running slower than yesterday! Even the single-term searches, or browsing at type level, which were returning quickly, are now taking an age. Everything is nice and fast in the Solr admin panel though. I note that the time-consuming queries are very heavy on CPU for the mariadb process.
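Given that mariadb CPU observation, one hedged suggestion (not something tried in this thread): MariaDB's slow query log would show which queries the frontend is generating and where the time goes. Something like this in my.cnf, followed by a restart (the log path here is only an assumption, chosen to match the existing log directory):

```ini
[mysqld]
slow_query_log = 1
slow_query_log_file = /apps/logs/mysql/slow.log   # assumed path; match your log dir
long_query_time = 2                               # seconds; log anything slower
log_queries_not_using_indexes = 1
```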

I wonder: would there be any way to get some time with yourself or a colleague to talk this through?

Best wishes
Scott



Scott Renton

Jun 3, 2024, 11:27:57 AM
to archivesspac...@lyrasislists.org, Blake Carver

Hi everyone, 

Just a quick note to say that a move to v3.5 has resolved this issue, as per Blake's suggestion. I guess if anyone else sees it on v3.3 or v3.4, that would be the obvious fix.

Thanks Blake and Joshua for looking at this, it's really massively appreciated.

Cheers
Scott



