GA4 | newsletter

MeasureSummit + some other urgent updates

— Balazs Vajna

Hi Folks,

Just a quick off-schedule newsletter with a few time-sensitive updates.

MeasureSummit spring edition next week

After the last MeasureSummit edition, it was announced that in 2026 there would be two events, given the enormous interest on both the presenter and the attendee side. The next one is approaching very soon: it will be held next week, on 12th and 13th May, between 9am and 3pm (EST).

If you haven't done so yet, sign up for the free event on the MeasureSummit home page. If you already know you'll want the option to watch presentations you've missed, or re-watch ones you found particularly useful (or get the transcripts to feed your AI knowledge bank and/or the slides), get the All Access Pass while it's on sale. From experience, it's not possible (or very hard) to watch all the interesting ones during the event itself.

Of course it's hard to tell only from the presentation titles which ones will have a BigQuery focus, but here are the ones that look promising in this regard:

  • Data Modeling with GA4 & GSC in BigQuery by Marco Giordano
  • Anomaly Detection in GA4 at Scale by Siavash Kanani
  • The Power of Data Warehouse by Benoit Olivier Weber (even if it's not BigQuery it'll be highly relevant)
  • How We Made dbt Core Work by Roman Petrochenkov (again, not sure it'll be in BigQuery, but it'll surely touch on SQL and be relevant)
  • Advanced Attribution Tweaks with AI by Alex Ignatenko

Surely there will be others too, but these stood out based on either the title or some comments by the author. Mine will of course mention BigQuery, but the talk will be tool-stack agnostic, providing a review of the different options for data activation, i.e. using the data directly to drive value.

Google Ads Data Transfer backfill limit

The Google Ads BigQuery Data Transfer used to have no retention limit and could be backfilled indefinitely. This is about to change: after 1st June, you can "only" backfill 37 months' worth of data.

While this is a very generous limit and I presume 99% of users will not be affected, it's worth being aware of it, and there might be edge cases where it introduces extra complexity: e.g. Marketing Mix Modelling (if you want 4-5 years of data) or long-term customer cohort / LTV calculations.
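To gauge whether your own history needs fit under the new limit, here's a small Python sketch. It is pure stdlib and the helper names are made up for illustration (they are not part of any Google API): it computes the earliest date still backfillable under a 37-month limit and checks whether a desired history window fits.

```python
from datetime import date

def months_back(today: date, months: int) -> date:
    """Return the date `months` calendar months before `today` (day clamped)."""
    total = today.year * 12 + (today.month - 1) - months
    year, month = divmod(total, 12)
    month += 1
    # Clamp the day to the last valid day of the target month.
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    days_in_month = [31, 29 if leap else 28, 31, 30, 31, 30,
                     31, 31, 30, 31, 30, 31][month - 1]
    return date(year, month, min(today.day, days_in_month))

def window_fits(today: date, desired_start: date, limit_months: int = 37) -> bool:
    """True if a backfill starting at `desired_start` stays within the limit."""
    return desired_start >= months_back(today, limit_months)

# Earliest backfillable day as of the 1st June change-over:
print(months_back(date(2025, 6, 1), 37))                 # 2022-05-01
# A 4+ year MMM window would fall outside the limit:
print(window_fits(date(2025, 6, 1), date(2021, 1, 1)))   # False
```

If the check comes back False for data you care about, that's the signal to pull the older history out before the deadline.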

This is an API limitation and will affect all the API-based data retrieval methods, not just the data transfers. The platforms affected are Google Ads, Google Analytics and Search Ads 360. Have a look at this LinkedIn post for more details.

This is a tight deadline and 1st June is approaching fast: if you want to retrieve some very old data before it hits, see some tricks on the GA4BigQuery platform in this free article, and an even faster way in this premium article.

Google Tag Manager containers will become Google Tags

Using Simo Ahava's words: at Google Marketing Live (May 21st), GTM is getting its most significant update in years. For more details, I'll just refer to his LinkedIn post: have a read, as it's worth thinking about this in advance of the change rolling out.

--

I know I've been quiet with BigQuery updates for far too long: aiming to get the March updates out next week and the April ones shortly after that.

Until then, happy querying,

Balazs