First, thanks for the reply.
After posting this discussion, I put together a sheet that covers all of my Apps Script projects and increments a count each time UrlFetchApp is called. It tracked roughly 25,000 calls to UrlFetchApp in some capacity. (Based on when I turned it on, that would account for about 2/3 of a day's usage.)
I am using a Workspace account, so I'm assuming the 100k/day URL Fetch quota is what applies to this situation.
To answer your questions:
1. Judging from the tracker, I don't expect that I'm surpassing 100k, but as I have been learning these systems from scratch over the past 9 months, I most likely have some less-than-efficient code.
2. Since I'm trying to build dashboards that are as close to real time as possible, only a handful of the data can be cached, such as data that wouldn't change very often or only changes daily. (I just started trying to work caching into my code because of this issue.)
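For context, the caching idea I've started trying looks roughly like this. The names here are illustrative, and I'm using a plain in-memory map as a stand-in so the sketch runs anywhere; in Apps Script itself the same pattern maps onto CacheService (`cache.put(key, value, ttlSeconds)` / `cache.get(key)`):

```javascript
// Sketch of TTL caching for slow-changing data (illustrative names).
// fetchFn is whatever actually hits the API; ttlMs is how long a
// result stays fresh before we fetch again.
function makeCachedFetcher(fetchFn, ttlMs) {
  const cache = new Map(); // key -> { value, expiresAt }
  return function (key) {
    const hit = cache.get(key);
    const now = Date.now();
    if (hit && hit.expiresAt > now) {
      return hit.value; // serve the cached copy, no fetch call
    }
    const value = fetchFn(key); // only hit the API on a miss or expiry
    cache.set(key, { value, expiresAt: now + ttlMs });
    return value;
  };
}
```

So daily-changing data gets a long TTL and never costs more than a handful of fetches per day, while the truly live data still goes straight to the API.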
One huge note to add here: I created a library with a sort of "base API call function," so that I can just add the library to other Apps Script projects and call the function with the necessary parameters rather than building the entire UrlFetchApp call every single time.
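The base function is something along these lines, heavily simplified. I've made the fetch pluggable here so the sketch runs outside Apps Script; the real version just calls `UrlFetchApp.fetch(url)` directly:

```javascript
// Simplified sketch of a shared "base API call" library function.
// fetchImpl stands in for UrlFetchApp.fetch; baseUrl/path/params are
// whatever the calling script needs for its endpoint.
function baseApiCall(baseUrl, path, params, fetchImpl) {
  const query = Object.entries(params || {})
    .map(([k, v]) => encodeURIComponent(k) + '=' + encodeURIComponent(v))
    .join('&');
  const url = baseUrl + path + (query ? '?' + query : '');
  return fetchImpl(url); // in Apps Script: UrlFetchApp.fetch(url)
}
```

Each dashboard script just imports the library and calls this one function, so every UrlFetchApp call in my projects funnels through the same place (which is also how I was able to bolt the call counter on).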
I guess my question now is: should I rethink creating live dashboards altogether, or is there some best practice I just haven't found yet for dealing with live data like this?