GA4 | newsletter

#19 - Conversational Analytics in BigQuery and other updates in January

Balazs Vajna

Hi there,

A warm welcome to all new subscribers who joined since the start of 2026.

Time flies, and as usual, lots of things have happened both in the BigQuery "labs" and in the digital analytics community. Let's have a closer look.

Conversational Analytics is now generally available

We live in a turbulent era, flooded with LLM and other generative-AI advice and options, while we try to get a grip and find tangible ways of using AI to increase productivity.

Well, here is one.

Conversational Analytics has been around since late October; you might have seen people posting about this feature, or you may have used the pre-release sign-up form yourself. Now it appears in every project as the "Agents" item in the left-side menu.

New Agents tab in the BigQuery menu

The tool promises to enable us to "chat with our data", something stakeholders have long aspired to, and which is pretty much sci-fi-grade cool. And I have to say, it's pretty decent indeed: you can set up an agent in a relatively short time, with a couple of guardrails to make sure its answers are relevant and accurate (and that the cost doesn't blow up). It interprets your question and gets you the answer; it can draw charts as well, so it provides a neat way to drill into the data for ad-hoc analyses.

Conversational analytics window: configuring and testing an agent

There are several configuration options to get it to work well:

  • Knowledge source: which objects (tables, UDFs, etc.) the agent is allowed to use
  • Natural-language instructions: always-applied guidance, such as soft rules (e.g. which fields to use or avoid) or extra context about the data
  • Verified queries: your own SQL queries that answer certain questions, to guide the LLM and anchor the analyses in the "ground truth"
  • Glossary: metadata that helps Gemini understand your data better
  • Labels: for subsequent cost analysis, especially if you query across projects
  • Query size limit: to make sure you don't rack up a big bill by asking the wrong question on the wrong table
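To make the "verified queries" idea concrete, here is a minimal sketch of what such an anchor query could look like against a GA4 BigQuery export. The project and dataset names are hypothetical placeholders, not from the article:

```sql
-- Verified query for the question:
-- "How many purchases did we have per day over the last week?"
-- `my-project.analytics_123456` is a placeholder GA4 export dataset.
SELECT
  PARSE_DATE('%Y%m%d', event_date) AS day,
  COUNT(*) AS purchases
FROM `my-project.analytics_123456.events_*`
WHERE event_name = 'purchase'
  AND _TABLE_SUFFIX BETWEEN
      FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
      AND FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
GROUP BY day
ORDER BY day
```

Registering a query like this tells the agent exactly how "purchases per day" should be computed, instead of letting the LLM guess at event names and date logic.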

It's not a magic bullet, but I think it has a good chance of adoption. Just like dashboards, this is best built on a previously curated and purposefully structured data mart (in the "widest" sense of the term, be it a schema or a one-big-table). I don't think it's time to declare that "dashboards are dead", but it is a good alternative, especially for self-serve analytics. If you want to dig deeper, there are some good discussions in the comment sections of this, this and this LinkedIn post.

πŸ’‘
It does need proper config to work; just like plugging raw data into a BI tool is usually not a good idea, asking it questions about a raw table without enough prior context will not magically get you correct answers. So our jobs are not in danger yet. (?)

New Data Transfers

With some recent price hikes in ELT land, companies are "waking up" and looking for alternatives to their expensive connectors. BigQuery seems to be going after this market aggressively. Two new transfers to watch:

  • Shopify: imports customer, order, product (etc.) data. In the e-commerce world, the back-end "golden truth" data is almost a necessity alongside the GA4 data, as the latter often does not capture everything, or gets a little "noisy". Together, though, the two become very powerful: this data gives you the right revenue and margin information, while GA4 lets you spot behavioral patterns that you can tie to the financials.
  • Mailchimp: enables performance analysis of owned media, similar to how advertising campaigns are evaluated. More concretely, this provides the "left side" of the performance information, starting from "impressions" (i.e. emails sent) and "clicks", which we can then continue with sessions and conversions from GA4 (and CRM) data. (You still have to UTM-tag your emails properly to be able to tie it all together.)
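The email-to-session stitching described above could be sketched roughly as follows. All table names are illustrative placeholders (the actual transfer schemas may differ), and the join assumes a UTM naming convention that carries the campaign identifier into GA4:

```sql
-- Hypothetical sketch: tie email sends/clicks to downstream GA4 sessions
-- via UTM parameters. Table and field names are placeholders.
WITH email_performance AS (
  SELECT
    campaign_id,
    COUNTIF(action = 'sent')  AS emails_sent,
    COUNTIF(action = 'click') AS clicks
  FROM `my-project.mailchimp.campaign_activity`
  GROUP BY campaign_id
),
ga4_sessions AS (
  SELECT
    collected_traffic_source.manual_campaign_name AS campaign,
    COUNT(DISTINCT CONCAT(user_pseudo_id,
      (SELECT value.int_value FROM UNNEST(event_params)
       WHERE key = 'ga_session_id'))) AS sessions
  FROM `my-project.analytics_123456.events_*`
  WHERE collected_traffic_source.manual_medium = 'email'
  GROUP BY campaign
)
SELECT e.campaign_id, e.emails_sent, e.clicks, g.sessions
FROM email_performance AS e
JOIN ga4_sessions AS g
  ON g.campaign = CAST(e.campaign_id AS STRING)
```

The join only works if utm_campaign is populated consistently in the email links, which is exactly the point of the UTM-tagging caveat above.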

New UI features

Gemini creeps into all sorts of different corners of BigQuery, and there have been some quieter additions too. One is a much better autocomplete in the SQL query window. It's not perfect, but I've actually found myself accepting the suggested string several times. Fortunately it's not invasive; you can simply ignore it and keep typing if the recommendation isn't useful. You might have noticed this yourself, as it seems to be available in all projects now.

Smart autocomplete in SQL editor powered by Gemini in BigQuery

The other change is similarly subtle and can end up being irritating if you don't know how to deal with it.

In the Google Cloud console for BigQuery, editor tabs now have two states: "temporary" or "permanent". The purpose is to prevent accidental tab proliferation, but it can be very confusing if you're not aware of the new behavior.

  • Previously, when you clicked on a new resource, it was opened in a new tab.
  • Now, when you click on a new resource, its details replace the content of the current tab.
  • If the tab name is shown in italics (which is the case in the above example), it indicates a "preview" or temporary state, and means it will be replaced when you open anything else.
  • To make a temporary tab permanent: double-click the tab name. The name will be shown with normal (not italic) font. Now when you open another resource, it'll open in a new (temporary) tab.
  • To open a new permanent tab: Press Ctrl (or Command on macOS) and click the resource/table name in the Explorer menu.
Permanent and temporary tabs in BigQuery Studio (note the italic font on the right)

Keep in mind that you can also split the view or close other tabs: click the drop-down arrow next to the tab name and select Split tab to left / right or Close other tabs, respectively.

Other BigQuery updates

  • A new way to do "vibe querying" (if you're into that sort of thing) utilizes the commenting feature in SQL: explain what you want in a comment between /* */, then "expand" the query using Gemini. Credit to this LinkedIn post for putting it on my radar.
  • A new security policy has been rolled out in Dataform and a couple of other services (BigQuery Notebooks, Pipelines and Data Preparations). Workflows can no longer be run with the default Dataform service account: you have to create a custom one, or run them with named users' privileges. The Service Account User role needs to be added to all principals involved in running and modifying workflows.
πŸ’‘
New repositories have to comply from 19th Jan onwards. Existing repositories have leeway until the second half of April.
Changing the user that runs the workflow in Dataform (new policy update).
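To illustrate the comment-driven "vibe querying" mentioned above: you write the intent as a SQL comment and let Gemini expand it. The dataset name below is a placeholder, and the expansion shown is only one plausible result, not guaranteed output:

```sql
/* Daily active users for the last 7 days from the GA4 export,
   split by device category */

-- A plausible Gemini expansion of the comment above:
SELECT
  event_date,
  device.category AS device_category,
  COUNT(DISTINCT user_pseudo_id) AS active_users
FROM `my-project.analytics_123456.events_*`
WHERE _TABLE_SUFFIX >= FORMAT_DATE('%Y%m%d',
        DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
GROUP BY event_date, device_category
ORDER BY event_date
```

As with Conversational Analytics, the quality of the expansion depends heavily on how precisely you describe the intent in the comment.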

Community posts around BigQuery

Other relevant news from the community

That's all for now. I hope you've enjoyed it! The month of January did not leave us empty-handed either!

Best regards,

Balazs


P.S.: many thanks to my colleague Gergo for explaining the new query tab behavior.