Hi all,
Sorry if this has been asked and answered before.
Has anyone created a script/SQL query, or could provide a combination of command-line flags, to produce a 'report' for different time scales (per day, per week, per arbitrary interval) from the historic data in the stored slurmdbd records, and could share it?
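(For anyone following up: `sreport cluster utilization start=... end=...` covers a single window, so per-day or per-week buckets need either a loop over windows or post-processing of accounting output. Below is a minimal, hedged sketch that aggregates CPU-hours per day from `sacct --parsable2 --noheader -o Start,Elapsed,AllocCPUS` output; the field names are real sacct fields, but the parsing and the sample data are illustrative assumptions, and jobs spanning midnight are attributed entirely to their start day for simplicity.)

```python
from collections import defaultdict

def parse_elapsed(elapsed):
    """Convert sacct Elapsed ('[D-]HH:MM:SS') to seconds."""
    days = 0
    if "-" in elapsed:
        d, elapsed = elapsed.split("-")
        days = int(d)
    h, m, s = (int(x) for x in elapsed.split(":"))
    return days * 86400 + h * 3600 + m * 60 + s

def cpu_hours_per_day(sacct_lines):
    """Sum CPU-hours per calendar day from 'Start|Elapsed|AllocCPUS' lines.

    Simplification: a job is charged entirely to its start day, even
    if it ran past midnight.
    """
    totals = defaultdict(float)
    for line in sacct_lines:
        start, elapsed, cpus = line.strip().split("|")
        day = start.split("T")[0]  # '2021-04-12T09:00:00' -> '2021-04-12'
        totals[day] += parse_elapsed(elapsed) * int(cpus) / 3600.0
    return dict(totals)

# Sample data standing in for real sacct output (assumption, not real records)
sample = """2021-04-12T09:00:00|02:00:00|4
2021-04-12T10:00:00|1-00:00:00|2
2021-04-13T08:30:00|00:30:00|8""".splitlines()

print(cpu_hours_per_day(sample))  # per-day CPU-hour totals
```

Weekly buckets would work the same way, keying on the ISO week of the start date instead of the calendar day.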
An additional question: has anyone got PySLURM running with Slurm version 20.x.x?
Many thanks in advance.
Cheers,
-Frank
I'll never miss an opportunity to plug XDMoD for anyone who doesn't want to write custom analytics for every metric. I've managed to dig a little into its API to extract current values for the number of jobs completed and the number of CPU-hours provided, and insert those into a single-slide presentation for introductory meetings.
You can see a working version of it for the NSF XSEDE facilities at https://xdmod.ccr.buffalo.edu
From:
slurm-users <slurm-use...@lists.schedmd.com> on behalf of Hadrian Djohari <hx...@case.edu>
Date: Tuesday, April 13, 2021 at 8:11 AM
To: Slurm User Community List <slurm...@lists.schedmd.com>
Subject: Re: [slurm-users] derived counters
Hi all, many thanks for all the hints. The link in the latest reply points to an impressive switchboard.
Cheers,
-Frank
Before you get too excited about it: we had a terrible time trying to get GPU metrics out of it. We finally abandoned it and switched to Grafana with Prometheus and InfluxDB. Good luck to you, though.