Hello all,
I'm running into an issue with the caching system that Rez uses to manage its packages.
I'm using the python API in a launcher application to create resolved contexts for various tools and environments.
It seems that while the tool is open, if a new package family or new package version is added to a package repository, Rez will not pick it up if the package resources have already been searched.
An example of how I'm grabbing all of the available packages to display to the user:
from rez.config import config
from rez.packages_ import iter_package_families

def get_all_packages():
    seen = set()
    for family_resource in iter_package_families(paths=config.packages_path):
        for package_resource in family_resource.iter_packages():
            # De-duplicate on (name, version) across repositories.
            key = (package_resource.name, package_resource.version)
            if key in seen:
                continue
            seen.add(key)
            yield package_resource
The goal of the launcher is to be a tool that can be up at all times and will always resolve contexts against the latest package information.
I've found that I can force Rez to check for new packages by clearing the cache:
from rez.packages_ import package_repository_manager
package_repository_manager.clear_caches()
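To make the problem concrete, here is a minimal pure-Python analogue of what I think is happening (no rez involved; all names here are my own): a memoised listing keeps returning stale results until its cache is explicitly cleared.

```python
import os
import tempfile

_cache = {}  # maps a repository path to its cached listing

def list_packages(path):
    """Return the cached directory listing, scanning disk only on a cache miss."""
    if path not in _cache:
        _cache[path] = sorted(os.listdir(path))
    return _cache[path]

def clear_caches():
    """Drop all cached listings so the next call re-scans the disk."""
    _cache.clear()

repo = tempfile.mkdtemp()
open(os.path.join(repo, "foo-1.0"), "w").close()
print(list_packages(repo))  # ['foo-1.0']

# A new package lands in the repository while the tool is running...
open(os.path.join(repo, "bar-2.0"), "w").close()
print(list_packages(repo))  # still ['foo-1.0'] -- the cache is stale

clear_caches()
print(list_packages(repo))  # ['bar-2.0', 'foo-1.0'] -- visible after clearing
```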
This seems like a somewhat inefficient way to make sure that Rez is resolving to the current state of the package repositories.
Is this behavior intended, or is the cache actually more intelligent than I'm giving it credit for, and there's something I need to be doing differently?
Has anyone found a way of solving this, or is clearing the cache the correct way?
Thanks all!