Hi folks,
Thanks for all the help to date on Arelle. One thing I've experienced (and seen mentioned by others) is that Arelle is reasonably computation-intensive, and it can take hours to run even on reasonably powerful hardware. I'm mostly interested in validating DORA register of information xBRL-CSV files.
With that in mind, I'm trying to understand what a "recommended" spec would be for an Arelle processing machine - i.e., which parameters actually make a difference.
I'm running locally on my M1 MacBook Pro (not super powerful, I know) with 32GB of RAM. Right now it's taking 12-24 hours to validate a not-terribly-complicated DORA package.
I see the Arelle GUI running at 100% of one CPU core and pretty much exactly 1GB of memory while processing, and up to 2-3GB of memory when working on views. The Arelle CLI seems to top out at 100% of one CPU and ~2-3GB of memory, even when more is available. This is a little concerning, since it suggests Arelle won't max out the resources available on the machine, though I'm sure better hardware would still help. It's also possible the Mac build scales less well than the builds for other OSes.
I don't really need the GUI, so would probably run from the CLI. Looking for a recommendation on:
* Which OS does Arelle run best on, if it matters? I can get a Linux or Windows VM very easily.
* How many processor cores can Arelle scale to use? I don't want to request 16 and have 15 sit idle.
* How much memory can Arelle make use of for processing tasks?
* Are there any config settings in the CLI or GUI that would allow Arelle to use more resources? There's nothing I see that's obvious in the CLI docs.
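For context, this is roughly the shape of my CLI invocation. File names and paths are placeholders, and I'm only using the standard arelleCmdLine options I know of, so please correct me if there's a better way to drive it:

```shell
# Sketch of my current run; paths and package names are placeholders.
# --file:     the xBRL-CSV report package to validate
# --packages: the DORA taxonomy package, if it isn't already cached
# --validate: run validation on the loaded filing
# --logFile:  write validation messages to a file instead of stdout
arelleCmdLine \
  --file /path/to/dora-report-package.zip \
  --packages /path/to/dora-taxonomy-package.zip \
  --validate \
  --logFile validation.log
```

Happy to share the actual command and log output if that helps diagnose where the time is going.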
Mike