Hello API testers,
To help new testers understand how noise is added when summary reports are generated, we are launching a new simulation tool: Noise Lab. The tool helps you build an intuition for noise management by walking you through the levers available in the Attribution Reporting API configuration, so you can estimate the level of noise in the output.
Main use cases of Noise Lab:
Build up your understanding of the parameters that impact noise
Quickly test out strategies to reduce noise impact
The tool requires no technical setup and is modeled on the real API: the simulation adds noise with the same mechanisms the aggregation service uses.
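To give a feel for that mechanism, here is a minimal sketch of adding Laplace noise to an aggregated value. The parameter values (an L1 contribution budget of 65536 and an epsilon of 10) are illustrative assumptions, and the real aggregation service uses a discrete distribution rather than this simplified continuous one:

```python
import math
import random

# Assumed parameters for illustration only; check the aggregation
# service documentation for the values in your deployment.
L1_BUDGET = 65536   # max total contribution per source event (assumed)
EPSILON = 10.0      # privacy budget (assumed)

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution
    via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_summary_value(true_value: float, epsilon: float = EPSILON) -> float:
    """Simplified sketch: add noise with scale L1_BUDGET / epsilon,
    the way a Laplace mechanism would protect the summed contributions."""
    return true_value + laplace_noise(L1_BUDGET / epsilon)
```

Note that the noise scale depends only on the budget and epsilon, not on the true value, which is why small true counts are hit proportionally much harder than large ones.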
For more details about noise, Noise Lab features, and a demo, please check out this video.
The tool is also accompanied by a user guide that walks you through the different scenarios Noise Lab supports.
What should you do?
If you are just getting started with the Attribution Reporting API, use this simulation tool to better understand the impact of noise and to test your noise-reduction strategies.
Resources:
Noise Lab web app → https://goo.gle/noise-lab
Noise Lab open-source code → https://goo.gle/noise-lab-oss-code
Noise Lab user guide → https://goo.gle/noise-lab-guide
RMSRE → https://goo.gle/rmsre
Give feedback or request a feature → https://github.com/GoogleChromeLabs/privacy-sandbox-dev-support
Simulation library → https://goo.gle/45Sb6wz
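The RMSRE metric linked above (root-mean-square relative error with a threshold T) is one way to quantify how much noise distorts your measurements. A minimal sketch, assuming the common RMSRE_T definition where the threshold caps the relative-error blow-up on small true values (the default threshold here is an arbitrary illustration):

```python
import math

def rmsre(true_values, noisy_values, threshold: float = 5.0) -> float:
    """RMSRE_T: root-mean-square of (noisy - true) / max(true, T).
    The threshold T keeps tiny true values from dominating the error."""
    squared_ratios = [
        ((noisy - true) / max(true, threshold)) ** 2
        for true, noisy in zip(true_values, noisy_values)
    ]
    return math.sqrt(sum(squared_ratios) / len(squared_ratios))

# Example: three buckets; the small bucket's error is capped by T=5.
error = rmsre([100.0, 2.0, 50.0], [110.0, 5.0, 45.0])
```

A lower RMSRE means the noisy summary values track the true values more closely; comparing RMSRE across configurations is the usual way to judge which Noise Lab parameter settings work best for your data.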