BQML Cost for Agencies / Minimum recommended amount of BQ Data


Pascal Röllin

Feb 24, 2025, 2:54:40 AM
to Instant BQML and Vertex Users
Hi everyone, 

I recently learned about BQML & CRMint, and this is quite an interesting topic for me. As someone working at an agency, I wondered what it would cost to scale across 10-100 GA4 properties. If I use one CRMint instance and add all properties/datasets to it, will it spin up a new Vertex AI training instance for each job, or will it run one job after another? What should I expect in terms of cost here? And is there a simple way to optimize this, for example by using the same BigQuery project for all of the BQ data?

Also, I have many properties that did not export to BQ until now. I'm turning the export on now; how long should I expect to wait until there's enough data to get some useful output?

Thanks in advance

Pascal 

Instant BQML and Vertex Users

Feb 25, 2025, 1:01:41 AM
to Instant BQML and Vertex Users

Hi Pascal,

Thanks for your interest in Instant BQML and Vertex! It's a great tool for agencies looking to scale AI-powered marketing solutions.

Let's address your questions:

Scaling CRMint for Multiple GA4 Properties:

  • Model Deployment: Typically, we see customers deploy a unique Vertex AI model per property, and even a unique model per marketing objective/use case (purchase propensity, product propensity, event propensity, etc.), since GA4 properties usually measure and capture different but related information. The Instant Vertex approach configures both the training and prediction pipelines for you. Check out the Instant Vertex Training Program (https://docs.google.com/presentation/d/1dzdSCqhKjysMFA_dMCxSrkYtycRPiS2PJMr83gpJQ94/edit?usp=sharing) to see it in action; a minimal per-property sketch also follows this list.
  • CRMint Scalability: CRMint itself can definitely scale to handle many pipelines. I've personally seen a CRMint instance with 1700+ pipelines. This did take some backend configuration and spacing out of the scheduled jobs, but it can handle it.
  • Agency Architecture: From an agency perspective, it might make more sense to deploy a unique CRMint application in each customer's Google Cloud Platform project, wherever their GA4 BigQuery export is pointed. This depends on whether you're exporting data to a centralized location that you manage or to individual customer instances that you have access to.

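For illustration only, here's a minimal sketch of the "one model per property, per objective" idea, training a plain BigQuery ML model per GA4 export from Python. This is not the pipeline Instant Vertex or CRMint configures for you; the project, dataset, feature, and label definitions below are placeholders you'd replace with your own.

# Minimal sketch only -- NOT the Instant Vertex pipeline. It trains one
# purchase-propensity model per GA4 property directly with BigQuery ML.
# Project and dataset names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-agency-project")  # placeholder project

# One GA4 BigQuery export dataset per property (placeholder IDs).
ga4_datasets = ["analytics_111111111", "analytics_222222222"]

for dataset in ga4_datasets:
    sql = f"""
    CREATE OR REPLACE MODEL `my-agency-project.{dataset}.purchase_propensity`
    OPTIONS (model_type = 'LOGISTIC_REG', input_label_cols = ['label']) AS
    SELECT
      COUNTIF(event_name = 'page_view') AS page_views,       -- example feature
      COUNTIF(event_name = 'session_start') AS sessions,     -- example feature
      MAX(IF(event_name = 'purchase', 1, 0)) AS label        -- purchaser vs. non-purchaser
    FROM `my-agency-project.{dataset}.events_*`
    WHERE _TABLE_SUFFIX BETWEEN
      FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY))
      AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
    GROUP BY user_pseudo_id
    -- A production pipeline would separate the feature window from the
    -- label window to avoid leakage; this only shows the per-property shape.
    """
    client.query(sql).result()  # blocks until each training job finishes
    print(f"Trained purchase_propensity in {dataset}")
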
Cost Considerations:

  • Default Pipeline Cost: The default Instant Vertex pipeline will cost you roughly $240 per model per month (a rough estimate across multiple properties is sketched after this list).
  • Cost Breakdown: The default Instant Vertex pipeline trains 1x per week (~$80/month) and predicts 4x per day (~$160/month), which you can modify depending on your cost constraints. These costs can fluctuate depending on the amount of data you're analyzing.
  • CRMint Cost: The cost of running an instance of CRMint is approximately $60/month.

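To make those numbers concrete, here's a back-of-the-envelope estimate in Python. The per-model and per-instance figures are just the approximate defaults quoted above; actual spend will vary with data volume, schedule, and region.

# Rough monthly cost sketch using the approximate figures quoted above.
TRAINING_PER_MODEL = 80     # ~$/month with the default weekly training
PREDICTION_PER_MODEL = 160  # ~$/month with the default 4x-daily predictions
CRMINT_INSTANCE = 60        # ~$/month per CRMint deployment

def estimate_monthly_cost(num_models: int, num_crmint_instances: int = 1) -> int:
    """Approximate monthly cost for default Instant Vertex pipelines."""
    per_model = TRAINING_PER_MODEL + PREDICTION_PER_MODEL  # ~$240/model/month
    return num_models * per_model + num_crmint_instances * CRMINT_INSTANCE

# 10 properties, one model each, one shared CRMint instance: ~$2,460/month.
print(estimate_monthly_cost(10))
# 100 properties with a CRMint deployment per customer project: ~$30,000/month.
print(estimate_monthly_cost(100, 100))
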
Data Requirements and Timeline:

  • Start Early: There's no need to delay! Once you have data flowing from GA4 into BigQuery using the native export, you can begin building models.
  • Minimum Data: You'll need at least 1,000 unique visitors over the time period you're analyzing, with at least one positive example (e.g., a purchaser) and one negative example (e.g., a non-purchaser); a quick readiness check is sketched after this list.
  • Continuous Improvement: Since the model is retrained weekly and uses data collected up to a year in the past (if available), each week your model and predictions will accumulate more data and improve. Get started building machine learning models optimized to your customers' objectives as soon as the data hits BigQuery.
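
If it helps, here's a hypothetical readiness check against the GA4 export, testing the 1,000-unique-visitor minimum plus at least one purchaser and one non-purchaser over the last year. The project and dataset IDs are placeholders, and 'purchase' stands in for whatever positive event your objective uses.

# Hypothetical minimum-data check against a GA4 BigQuery export.
# Placeholders: project ID, dataset ID, and the 'purchase' event name.
from google.cloud import bigquery

client = bigquery.Client(project="my-agency-project")

sql = """
SELECT
  COUNT(DISTINCT user_pseudo_id) AS unique_visitors,
  COUNT(DISTINCT IF(event_name = 'purchase', user_pseudo_id, NULL)) AS purchasers
FROM `my-agency-project.analytics_111111111.events_*`
WHERE _TABLE_SUFFIX BETWEEN
  FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 365 DAY))
  AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
"""
row = list(client.query(sql).result())[0]
non_purchasers = row.unique_visitors - row.purchasers
ready = row.unique_visitors >= 1000 and row.purchasers >= 1 and non_purchasers >= 1
print(f"visitors={row.unique_visitors}, purchasers={row.purchasers}, ready={ready}")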