I think you have a problem with your units.
"Summing" a gauge doesn't make much sense, because the sum will vary depending on how many data points you take. What you want is to integrate it over time. In that case, the problem is with your terminology and your units. "500GB of RAM over the last 24h" doesn't make much sense, unless you're talking about an average. If you want integrated usage, then your units are in the wrong format: do you mean 500GB-days? 500GB-seconds?
It's like measuring power from your home. If you put a kettle on, the instantaneous power usage might be 2kW. But the electricity company doesn't charge you in kW, it charges in kWh (energy consumed). Leave the kettle on for 30 minutes and your meter will click up to 1kWh. Leave it on for a whole hour and you'll be charged for 2kWh.
So if you are using 16GB of RAM, and you used it for 1 hour, then your usage is 16GB-hours, or 57600GB-seconds, or 0.667GB-days.
Anyway: you should find there are already CPU-usage metrics which are counters (in CPU-seconds), so those aren't a problem.
For memory usage, which is just a gauge, you can integrate the memory usage over a time period as follows:
* take the *average* memory usage over the period of interest (say 24 hours). If this is across multiple machines, then sum the averages.
* convert bytes to GB (divide by 1024^3 — strictly speaking that gives GiB, but it's close enough here)
* if you want the answer in GB-days, you already have it: an average over one day, expressed in GB, *is* the integral in GB-days
* if you want the answer in GB-seconds, multiply by the window length in seconds (86400 in this example).
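In code, the steps above amount to the following minimal sketch (the sample values are made up for illustration):

```python
# Integrate a sampled memory gauge over a 24h window.
# Hypothetical samples: memory usage in bytes, scraped at a fixed interval.
samples_bytes = [16 * 1024**3] * 12          # e.g. 12 scrapes, all at 16GB

avg_bytes = sum(samples_bytes) / len(samples_bytes)
avg_gb = avg_bytes / 1024**3                 # convert to GB

window_seconds = 24 * 3600                   # the 24-hour window
gb_days = avg_gb * (window_seconds / 86400)  # average over one day == GB-days
gb_seconds = avg_gb * window_seconds         # or scale to GB-seconds

print(gb_days)     # 16.0
print(gb_seconds)  # 1382400.0
```

A constant 16GB held for one day comes out as 16 GB-days, or 1,382,400 GB-seconds, matching the kettle arithmetic above.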
If you want to accumulate this into a counter, then it's harder. I believe you can use a recording rule to accumulate, by adding to its own previous value (ugh). Or you could make a custom exporter which handles this logic.
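For the recording-rule trick, a sketch might look like the following — assuming Prometheus, a hypothetical source metric `container_memory_usage_bytes`, and an evaluation interval that matches the averaging window (all of these are assumptions, adjust to your setup):

```yaml
groups:
  - name: memory_integral
    interval: 5m   # evaluation interval must match the window and multiplier below
    rules:
      - record: job:memory_usage:gb_seconds_total
        # Previous accumulated value (or 0 on the first evaluation),
        # plus the average usage over the last 5m, converted to GB and
        # multiplied by 300 seconds to get this interval's GB-seconds.
        expr: >
          (job:memory_usage:gb_seconds_total or vector(0))
          + sum(avg_over_time(container_memory_usage_bytes[5m])) / 1024^3 * 300
```

Note the `or vector(0)` to seed the series on the first evaluation; without it the rule never produces a value. This is exactly the self-referencing hack alluded to above, with the usual caveats (gaps in evaluation lose data, and restarts of the rule start the counter over unless the old series is still in storage).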