Google BigQuery: Automation Powerhouse for Google Ads

This article describes potential use cases of Google BigQuery for automating the management of your Google Ads accounts. It’s based on four months of intensive personal experience. I don’t know everything, but I know enough to share some thoughts and basic directions. Before diving deep into BigQuery, you have to know SQL. If you don’t know SQL yet, you can at least learn what’s possible (and feel the need to learn SQL). So let’s get started.

This article is super long, so let me give you a preview of the covered topics:

  1. Why Should You Use BigQuery for PPC?
  2. How to Set Up a Project in Google Cloud Platform?
  3. How to Link BigQuery with Your Google Ads Accounts?
  4. What Do You See After Linking?
  5. How to Get Other Data to BigQuery?
  6. The First Set of Queries You Should Prepare
  7. A Few Suggestions for Google Ads Automation via BigQuery
  8. How to Get Data from BigQuery Back to Google Ads?
  9. Conclusion

Why Should You Use BigQuery for PPC?

By now, you are probably asking what BigQuery actually is. BigQuery is “a module” inside Google Cloud Platform. Google’s definition is “Google BigQuery is an enterprise data warehouse that solves this problem by enabling super-fast SQL queries using the processing power of Google’s infrastructure. Simply move your data into BigQuery and let us handle the hard work.” OK, so now we know BigQuery is a powerful data warehouse against which you can run SQL queries. If you are a frequent reader of this website, you know it has mainly dealt with Power Query/Power BI so far. Think of BigQuery as a more advanced “Power Query” which runs on Google’s servers and is controlled purely by SQL. So basically, you offload all the processing to Google’s servers and operate everything through your web browser.

So now to answer the basic question: Why should you use BigQuery for managing your Google Ads account? Because you can create custom data flows such as:

  1. Your Google Ads campaigns generate data
  2. You load the data to BigQuery
  3. You run SQL transformation queries against this data inside BigQuery
  4. You take the results of the queries and load them back to Google Ads. The format is just a set of “rows”.
  5. You schedule the entire process so it runs over and over and over

And now some more specific use cases where BigQuery comes in handy:

  1. Your standard Google Ads scripts do not finish within 30 minutes because your account is too big
  2. You are not really a fan of JavaScript, but you already know SQL
  3. You can create custom bidding logic which, e.g., takes into account both average position AND impression share
  4. You can keep pausing/activating keywords as the products on their landing pages go in and out of stock
  5. You can create custom bidding algorithms which align bids with current budget pacing so you don’t overspend by the end of the month
  6. You can add new keywords based on search queries
  7. You can add new negative keywords based on poorly performing search terms
  8. You can automate campaign creation based on inputs in Google Sheets
  9. You can enrich Google Ads data with your in-house data – you can create custom bidding mechanisms based on the lifetime value of your customers, or you can, e.g., pair Google click IDs (=gclids) with data in your CRM and upload only the “good leads” back to Google Ads (a small sketch follows this list)
  10. You want to build your own database of gclids which lasts more than 30 days
  11. And many more. Basically, you are limited only by your imagination and needs, because you import a complete snapshot of your Google Ads accounts with all metrics and all columns
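To make use case 9 concrete, here is a minimal, hedged sketch. It assumes two hypothetical tables you would maintain yourself – `GclidLog` (click IDs collected from Google Ads) and `CrmLeads` (lead quality synced from your CRM) – and the output columns only roughly mirror an offline conversion upload:

```sql
-- Pair Google click IDs with CRM lead quality and keep only the "good leads".
-- GclidLog and CrmLeads are hypothetical tables; adjust names and columns.
SELECT
  g.Gclid                                               AS GoogleClickId,
  'Qualified Lead'                                      AS ConversionName,
  FORMAT_TIMESTAMP('%Y-%m-%d %H:%M:%S', l.QualifiedAt)  AS ConversionTime,
  l.DealValue                                           AS ConversionValue
FROM `project.DatasetName.GclidLog` AS g
JOIN `project.DatasetName.CrmLeads` AS l
  ON g.Gclid = l.Gclid
WHERE l.Quality = 'good'
```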

Limits

I am using the word “powerhouse” in the title of this article, so let me explain that. BigQuery can hold billions of rows, so if you have huge accounts with millions of items (keywords, ads), you can easily store this data in BigQuery, while you would have a hard time working with it in Power Query in Excel. There, you would have two problems: even getting the data into Power Query, and then querying it.

So how long does it take to run a complex SQL query in BigQuery against, say, all 1,000,000 keywords in your accounts? You would be looking at one minute, maybe two. Super heavy queries with multiple joins take about five minutes for me.

Even in BigQuery, queries can sometimes fail – that usually happens when you use too many ORDER BY clauses in your SQL. Remove them and you will be fine.

Queries in BigQuery can run for a maximum of 2 hours. I hit this limit once when I had an error in my SQL: all the rows were getting multiplied several times, and the end result would have been several billion rows.

Other than ORDER BY and major flaws in your SQL, there are really no extra limits – at least I have not hit any, and some of my tables hold more than 10,000,000 rows. 🙂

Pricing

Now the important question – how much does BigQuery cost? It’s not free, but it’s quite affordable when you use it for managing your Google Ads accounts. Here is a breakdown of the costs:

  1. You pay for the “data transfer” of your Google Ads accounts. You pay for every Google Ads customer ID: 1 account = 2 USD per month. If you link an MCC containing 10 accounts to BigQuery, you will pay 20 USD per month.
  2. You pay for running SQL queries; however, you can query 1 TB per month per project for free. Every extra TB costs 5 USD. I do exceed 1 TB in a big MCC which has many automation features pushed through BigQuery.
  3. You pay for storage – there are “active” and “long-term” storage tiers.

Here is how the cost breakdown looks in one of my projects:

You can actually run SQL against all your operations in BigQuery to get the exact cost per BigQuery SKU. You can see that I am paying almost 40 USD for my AdWords MCC link (20 accounts in the MCC), then 11 USD for active storage, and 10.35 USD for “Analysis” = querying the data. This means I queried more than 2 TB in October 2018. The rest of the costs are close to 0 USD. Overall, I should be paying around 60 USD per month in this project; however, I paid only 48 USD in October. Why is that? Some active storage also comes for free every month. More details about pricing can be found here.
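If you enable the GCP billing export to BigQuery, you can reproduce a per-SKU breakdown like the one above with a query along these lines (a sketch; the export table name is a placeholder and depends on your billing account ID):

```sql
-- Cost per BigQuery SKU from the standard GCP billing export.
SELECT
  sku.description     AS Sku,
  ROUND(SUM(cost), 2) AS CostUsd
FROM `project.billing.gcp_billing_export_v1_XXXXXX`
WHERE service.description = 'BigQuery'
  AND invoice.month = '201810'   -- October 2018
GROUP BY Sku
ORDER BY CostUsd DESC
```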

So say 50 USD per month. Is that too expensive? No, it’s not. When you get to this level of spend in BigQuery, it likely means you are spending hundreds of thousands of dollars in Google Ads, so the cost of using BigQuery is really negligible.

How to Set Up a Project in Google Cloud Platform?

When you want to start using BigQuery, you need to set up your first Google Cloud Platform project. I will refer you to an older article where I describe how to do that. Read more here.

How to Link BigQuery with Your Google Ads Accounts?

After you have your project ready and you finally get to the BigQuery UI, it’s time to connect your Google Ads account. So let’s start.

The first thing you need to do is create a dataset where the Google Ads data will be stored. Find the Create Dataset button:

Fill out the parameters. A tip which can save you many headaches in the future: always create datasets in the same location across your project. If you have table 1 in dataset A sitting in the US and you want to join it with table 2 from dataset B sitting in the EU, you will not be able to – and you cannot change the location AFTER you save the dataset. You can only query tables in datasets in the same location. Set “Never” in Default table expiration. Press Create dataset.

Voilà, here is your new dataset:

I already had an “AdWordsData” dataset there before writing this article. Now it’s time to fill your shiny new dataset with data from Google Ads. Find the Transfers button:

You will be taken to the old UI (unless you already started there). Hit Enable:

Then Add Transfer:

And now it’s time to pay attention:

Steps:

  1. Select “Google Ads” in Source
  2. Enter some user-friendly name for the transfer – you can change it later.
  3. Set up your refresh window. The max is 30 days; use 30, it’s the best option. Latent conversions are attributed back to the date of the last click (not the date they happen), so this setting makes sure that even 30-day-old data will be up to date, because BigQuery will always reload the last 30 days of data.
  4. Set the schedule to every 24 hours. You can run it less frequently, but I don’t see any reason to refresh less than once a day.
  5. Select the destination dataset which you created a few minutes ago.
  6. Enter your Google Ads customer ID. You can use either the 1111111111 or the 111-111-1111 format. You can enter your MCC ID and all accounts in that MCC will get downloaded, OR you can enter the ID of a single account. It’s up to you. If you are running an MCC but you only need 2 of its accounts transferred to BigQuery, you can either create a separate dataset and transfer for each Google Ads account, OR create 2 transfers which load all the data into 1 dataset.
  7. Click “Advanced” and enable “Send email notifications” – you will get an email if a data load fails.
  8. Press the “Add” button and wait 5 minutes. In case you’re prompted for account details in a pop-up window, go ahead and enter them and allow BigQuery access to Google Ads.

Next, you will see something like this:

At the bottom, you will see all the days of data which are currently being transferred to BigQuery. At this point, you are probably wondering which Google Ads reports are being sent to BigQuery. All of them + all the columns :). That’s the beauty. You will have all available Google Ads data sitting in BigQuery, ready for your SQL magic!

If you want to get historical Google Ads data as well, you can schedule a backfill (the max for 1 run is 180 days, so you have to repeat the backfill manually if you have several years of data). 1 day of data can take up to an hour to transfer, so you have to be patient at the beginning.

As a side note, I should also mention there is a Google Ads script (called BigQuery Exporter) which exports only selected reports to BigQuery, not all of them like the transfer setup does. The problem with the BigQuery Exporter script is that it’s not very reliable (it can fail for no reason), it can run only for 30 minutes like any other script, it does not have a refresh window, and overall it’s just a bad choice. The positive side is that it’s free no matter how many accounts you have, and it will give you user-friendly reports from the very beginning. If you are interested in using the BigQuery Exporter script, read more here.

What Do You See After Linking?

Okay, so you have linked your Google Ads account with BigQuery and run the first sync. After a few minutes, your dataset will start getting filled with various tables. Basically, any piece of data about your campaigns which is accessible via the Google Ads API will be loaded into BigQuery tables and views. This is how it looks in the old BigQuery UI:

I will now switch to the new beta UI, since this is what Google will be pushing from now on. The same view as before, in the new UI:

You will see hundreds of views and tables. What’s the difference between a view and a table? A “view” is a pre-saved SQL query over the underlying tables. When you want to preview a view, you need to run the query itself. When you want to preview a table, you just click the Preview button and see what’s inside.

How do you tell what’s a view and what’s a table? The leading icons are different:

You can see the query behind a “view”:

You can see that the views always add a new column “_LATEST_DATE” with yesterday’s date (the snapshots were taken on 2018-11-04) on top of the underlying table of a similar name. When the underlying table is called “DatasetName.p_Keyword_ACCOUNTID”, the view will be called “DatasetName.Keyword_ACCOUNTID”, without the “p_”.
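This naming convention gives you an easy way to query the latest snapshot of any entity. A minimal sketch (replace `project`, `DatasetName`, and `ACCOUNTID` with your own values; the column names follow the transfer’s AdWords report schema, so verify them in the Schema tab):

```sql
-- Latest snapshot of all keywords: the _DATA_DATE = _LATEST_DATE filter
-- keeps only the freshest daily snapshot exposed by the view.
SELECT
  CampaignId,
  AdGroupId,
  CriterionId,
  Criteria,          -- the keyword text
  KeywordMatchType,
  Status
FROM `project.DatasetName.Keyword_ACCOUNTID`
WHERE _DATA_DATE = _LATEST_DATE
```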

You can also preview the table:

(I am not showing the preview because my sample Google Ads account has no data.) You can also see the “Schema” of the table, showing what each column means.

After you get a little more oriented, you will see that there are 2 types of tables/views. The first set is the simple “performance” data split by date. The second set is a snapshot of the “attributes” of the entities in the table. So, for example, the DatasetName.p_Keyword_ACCOUNTID table contains information about the keywords:

So you can see a keyword’s CPC bid, Final URL, Approval Status, etc. What you cannot see is the keyword’s performance, such as Impressions & Clicks. Those are stored in a different table – in DatasetName.p_KeywordBasicStats_ACCOUNTID:

And to create even more mess, performance data is spilled across multiple “stats” tables – so you have “BasicStats”, “ConversionStats”, “CrossDeviceStats”, and so on. I will not go into more detail, since this is a topic for 10 articles on its own. 🙂 What I am trying to say here is that getting some meaningful data for the very first time can look like an impossible task.
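As a foretaste of joining the two families of tables, here is a hedged sketch that pairs keyword attributes with their basic stats (column names follow the transfer’s report schema – verify them against your own Schema tab – and note that Cost arrives in micros):

```sql
-- Keyword attributes joined to daily performance stats.
SELECT
  k.Criteria          AS Keyword,
  k.KeywordMatchType  AS MatchType,
  s.Date,
  SUM(s.Impressions)  AS Impressions,
  SUM(s.Clicks)       AS Clicks,
  SUM(s.Cost) / 1e6   AS Cost    -- micros -> currency units
FROM `project.DatasetName.Keyword_ACCOUNTID` AS k
JOIN `project.DatasetName.KeywordBasicStats_ACCOUNTID` AS s
  ON  k.CriterionId = s.CriterionId
  AND k.AdGroupId   = s.AdGroupId
  AND k.CampaignId  = s.CampaignId
WHERE k._DATA_DATE = k._LATEST_DATE
GROUP BY Keyword, MatchType, s.Date
```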

How to Get Other Data to BigQuery?

When it comes to uploading data other than Google Ads data to BigQuery, you have a few options: you can create an empty table, you can load data from Google Cloud Storage, you can do a manual upload, or you can use data stored in your Google Drive:

Manual uploads support these file types: CSV, JSON, Avro, Parquet, ORC. (I have no idea what the last 3 options mean)

Things get a little more interesting in the case of Google Drive:

You can use a Google Sheet as a table (only one sheet from the spreadsheet, though) or you can use a CSV stored in your Google Drive. So, for example, if you input some manual data on a daily basis and you want to pass it to BigQuery, the best option is definitely Google Sheets.

BigQuery also has its own API through which you can upload your custom data (say, data from Facebook, Bing Ads, or your CRM). These integrations are not as straightforward as the “clickable” options in the UI. You need to write a custom piece of code and run it on your server on a regular basis in order to pass fresh data to BigQuery. In case you need help with custom data, we have already developed proprietary solutions for a few clients (Facebook, Bing, custom CRM integrations, loading multiple sheets from a Google Sheets spreadsheet). You can contact us for more info.

Another option is a new product from Supermetrics called Supermetrics for BigQuery, which can sync data from all the major marketing platforms to BigQuery. See more here: GO TO SUPERMETRICS

The First Set of Queries You Should Prepare

When you start looking around the imported tables and views, you will notice that the imported data is not very user-friendly. Let’s take, for example, our “DatasetName.p_Keyword_ACCOUNTID” table:

So you get CampaignId and AdGroupId columns, but no “Campaign” or “AdGroup” column with user-friendly names. That’s just the way it is, unfortunately. After seeing this problem, you will quickly figure out that the first thing you need to do is create a few default views listing the items in your accounts, such as:

  1. “Accounts view” – a list of all account names and IDs, in case you are importing an MCC
  2. “Campaign view” – a list of all campaign names with campaign IDs and other campaign attributes, along with the account name joined from #1. Then similarly:
  3. “AdGroup view”
  4. “Keyword view”
  5. “Ad view”
  6. Any other elements you are interested in

You can write your SQL directly in BigQuery or in a text editor of your choice (I am using Sublime). Writing code in BigQuery looks like this:

The highlighted buttons allow you to save the query as a view OR save the results as a table.

Here is sample SQL showing how to create a list of all campaigns along with user-friendly account names (assume I have already created the “account view”; I write the query in Sublime and then CTRL+C, CTRL+V it into the BigQuery window above):
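Since the screenshot may be hard to read, here is a hedged reconstruction of what such a view can look like (table and column names follow the transfer’s schema; `project`, `DatasetName`, and `MCCID` are placeholders):

```sql
-- Campaign list view: campaign attributes plus friendly account names.
SELECT
  a.AccountDescriptiveName AS Account,
  c.ExternalCustomerId     AS AccountId,
  c.CampaignId,
  c.CampaignName           AS Campaign,
  c.CampaignStatus
FROM `project.DatasetName.Campaign_MCCID` AS c
JOIN `project.DatasetName.Customer_MCCID` AS a
  ON c.ExternalCustomerId = a.ExternalCustomerId
WHERE c._DATA_DATE = c._LATEST_DATE
  AND a._DATA_DATE = a._LATEST_DATE
```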

After you are done creating the item-list views, you can create “performance” views which do have nice, user-friendly names of accounts, campaigns, ad groups, and keywords… So you will need to create views such as:

  1. Account performance with nice friendly names
  2. Campaign performance with nice friendly names
  3. Ad group performance with nice friendly names
  4. Keyword performance with nice friendly names
  5. Ad performance with nice friendly names

So let’s have a look at how the campaign performance view query looks:

And part 2:
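Condensed into a hedged sketch, the core of such a view boils down to joining the daily campaign stats to the campaign list view created earlier (`CampaignListView` is the hypothetical name of that view):

```sql
-- Campaign performance with friendly account and campaign names.
SELECT
  cl.Account,
  cl.Campaign,
  s.Date,
  SUM(s.Impressions) AS Impressions,
  SUM(s.Clicks)      AS Clicks,
  SUM(s.Cost) / 1e6  AS Cost   -- Cost arrives in micros
FROM `project.DatasetName.CampaignBasicStats_MCCID` AS s
JOIN `project.DatasetName.CampaignListView` AS cl
  ON  s.CampaignId         = cl.CampaignId
  AND s.ExternalCustomerId = cl.AccountId
GROUP BY cl.Account, cl.Campaign, s.Date
```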

You can see that creating even a very basic performance view looks frightening, but don’t worry – once you get past your first project, you will get used to it. What’s more, you can just copy the views between projects and change only the account IDs and dataset names.

So after you create your account, campaign, ad group, keyword, and ad performance views, you can move to the next phase – running the custom queries which will help you automate your Google Ads account.

A Few Suggestions for Google Ads Automation via BigQuery

If you made it this far, you are probably wondering what you get for all the pain of creating even just the basic views. So here are some real use cases we have developed so far (a small sketch illustrating #1 follows the list):

  1. Complete build-out of a search campaign structure based on a “feed” of inputs stored in Google Sheets – we had to employ SQL and BigQuery’s processing power, since we had to generate around 3M keywords and hundreds of thousands of custom ads, everything based on super custom rules. If new items appear in our feed, we can easily generate campaigns, keywords, and ads with a few clicks (we could even send all this new content to Google Ads automatically if we wanted to).
  2. Automatically adding good search terms as new keywords (if your campaign structure allows).
  3. Automatically adding bad search terms as new negative keywords.
  4. Pausing/activating keywords based on the content of your website – for this exercise, we built a custom crawler which runs daily through the client’s site, collecting information about their offering. We then pair this crawled data with Final URLs in Google Ads. If certain conditions are met, we pause/activate keywords.
  5. Creating custom bidding strategies by combining Average Position AND Impression Share.
  6. Creating a custom bidding strategy which makes sure your account doesn’t run too hot or too cold against your monthly budget.
  7. Various checks for empty ad groups (no KWs, no Ads).
  8. Check for duplicate items (ads, keywords).
  9. Custom conversion uploads based on the combination of Google Ads data (gclids) and data stored in client’s CRM – handy for clients when you want to upload only good leads and not “all leads”.
  10. Custom adjustments of campaigns’ geo targeting based on information stored in a feed
  11. Custom adjustments of campaigns’ budgets
  12. … anything else on your mind which is changeable via the API or bulk uploads
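To give a flavor of use case 1, here is a tiny, hedged sketch of mass-generating keywords from a feed. It assumes a hypothetical `Feed` table (e.g., linked from Google Sheets) with `ProductName` and `City` columns; a real build-out involves far more elaborate rules:

```sql
-- Cross-join feed rows with keyword prefixes to generate keyword candidates.
SELECT
  CONCAT(f.ProductName, ' | ', f.City)            AS Campaign,
  f.ProductName                                   AS AdGroup,
  CONCAT(prefix, ' ', f.ProductName, ' ', f.City) AS Keyword,
  'Phrase'                                        AS MatchType
FROM `project.DatasetName.Feed` AS f
CROSS JOIN UNNEST(['buy', 'cheap', 'best']) AS prefix  -- hypothetical prefixes
```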

Obviously, showing the exact steps and full SQL code for samples 1-12 would result in 12 articles of similar length to this one – beyond the small sketch above, I am not going to do that; I just want to give you some inspiration about what’s possible.

On top of performing various automation tasks, BigQuery is super useful for very data-heavy custom reports – e.g., when you need to prepare a location analysis involving millions of rows, BigQuery will have no trouble processing such a volume of data.

How to Get Data from BigQuery Back to Google Ads?

The final step: you have your queries in BigQuery ready and you need to get the results into Google Ads. How do you achieve that? You can take multiple routes.

Let’s assume you have a budget pacing query which returns a set of keywords that need their bids adjusted. The results look like this:

The query returns action, campaign, ad group, keyword, match type, changed max CPC, campaign ID, and ad group ID columns, and now you need to push these results into Google Ads.
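For illustration, here is a hedged sketch of the final SELECT that shapes such results (`KeywordPerformanceView` and the pacing columns are hypothetical placeholders; the headers get renamed to match the bulk upload template on export):

```sql
-- Shape query results into rows a Google Ads Bulk Upload understands.
SELECT
  'Edit'                 AS Action,
  Campaign,
  AdGroup,
  Keyword,
  MatchType,
  ROUND(MaxCpc * 0.9, 2) AS NewMaxCpc,  -- e.g. cut bids 10% when pacing runs hot
  CampaignId,
  AdGroupId
FROM `project.DatasetName.KeywordPerformanceView`
WHERE MonthToDateCost > MonthToDateBudget  -- hypothetical pacing condition
```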

Scenario 1:

  1. Run queries in BigQuery
  2. Load the formatted data to Google Sheets via Supermetrics for Google Sheets. The guys from Supermetrics have built a BigQuery connector which allows you to schedule your queries.
  3. Use a Google Ads script to pull the data from Google Sheets and apply it as a “Bulk Upload” (or an upload involving iterators in Google Ads). The sample screen above is suitable for upload via “Bulk Upload”.

Scenario 1 allows you to double-check the results of your queries in Google Sheets whenever you want, but you may hit Google Sheets limits if you try to load too many rows from BigQuery into Google Sheets.

Scenario 2:

  1. Run queries in BigQuery
  2. Apply the results of your queries immediately as a Bulk Upload in Google Ads with our custom Google Ads script. The Google Ads script for this approach can be downloaded here. When you open the script, I recommend reading through the comments in the code.

Bulk Upload is a quite underused way of making changes in Google Ads – you can change hundreds of thousands of items by uploading 1 CSV file, something you cannot do with regular scripts involving iterators. On the other hand, the number of elements which can be changed via Bulk Uploads is limited. Read more about supported entities here.

If you need a custom script to upload the results of BigQuery queries into Google Ads (e.g., to change campaign locations, which are not supported by Bulk Uploads), contact us for more info.

Conclusion

At the beginning of this article, I mentioned this flow:

  1. Your Google Ads campaigns generate data
  2. You load the data to BigQuery
  3. You run SQL transformation queries against this data inside BigQuery
  4. You take the results of the queries and load them back to Google Ads. The format is just a set of “rows”.
  5. You schedule the entire process so it runs over and over, by scheduling your Google Ads scripts & Supermetrics queries

And that circle is now closed. :)

I do realize this was a very long article. I tried to give you an idea of what BigQuery can do, how difficult it is to set up, and what obstacles you will face along the way.

BigQuery turned into a PPC automation tool is not for everybody, and not that many people actually know about it. It requires SQL skills, a little knowledge of Google Ads scripts, a quite lengthy initial learning curve, and a not-so-easy setup process.

On the other hand, once you master the basics, the extreme processing power of BigQuery can deliver results which Google Ads scripts simply cannot achieve on very big accounts, due to their limits.

If you found this article helpful, share it with your PPC tribe 🙂

20 thoughts on “Google BigQuery: Automation Powerhouse for Google Ads”

  1. Hi,

    Thanks for posting so much valuable content.

    I sometimes have scripts which run longer than 30 minutes, and I also have problems when trying to analyse n-grams for 2 million search terms.

    For this last need, I tried your n-gram analyser with Excel, but Excel didn’t manage to handle that much data.

    Do you think n-gram analysis is possible with BigQuery, and with the same ease as in Excel?

    Thanks for your work 🙂

  2. Oh wow, I’ve never run into another digital marketer that utilized SQL for campaign, ad group, keyword, etc. builds in PPC like I did.

    Nice to see that this is picking up momentum; a lot of people don’t realize that we are living in a golden age of “cheap data connections”. Utilizing BigQuery, APIs, etc. allows you to run lean yet make a very large impact on bigger accounts.

    A lot of my side clients thought I was a full-fledged agency with 20+ employees – nope, just me by myself and great tools 🙂

  3. Hi,

    I have Bing campaigns in Google Search Ads 360. How can I bring that data to BigQuery?

    1. Ask your Search Ads 360 account rep; they will enable BQ exports if you are a big enough advertiser 🙂

      Otherwise you’d have to load it somehow via the Bing API => BQ API.

      DZ

  4. Hi – a question on backfilling. My previous coworker was using BigQuery to pull in Google Ads info, but now that he’s gone, I see that its refresh window has only been set to 1 day (not 30). Our conversions are off by 2-3% over 18 months, and I think it’s because BigQuery hasn’t been catching the latent conversions.

    I’ve since changed our refresh window to 30 days. However, that only will correct the last month of data.

    If I do manual backfills month by month, since the day we started pulling data (June 2018), will that delete and re-insert all the data into the BigQuery partition tables? I’m not an expert and don’t want to create duplicates in the data we currently have.

    Thank you! Your site has great content!

    1. Hi Kari,
      this is a great question but I don’t have a firm answer for you. I have not faced this situation yet.

      I think no matter what you do, the data won’t get duplicated.
      I am just not sure if the data can be “refreshed” when “some data already exists” for a given date.

      My guess is that Google will overwrite rows in BQ which do not match the current state in the engine. In other words, if you re-backfill the data, it should make your conversion gap lower.

      Why don’t you just try to backfill one day of data where your conversions are not currently matching to see what’s gonna happen?

      DZ

    2. Hey Kari,

      I am having the same trouble, but I cannot find the place in which to set the refresh window.
      Can you please guide me on where to set it? There is little information about it online.

      Thanks, Ido (ido@albert.ai).

  5. Hey Daniel, thank you for sharing such a helpful guide! Much appreciated.

    Just one question – is it somehow possible to get to extensions data? No matter how I try I just can’t find any table dealing with sitelinks etc.

    My point is to visualize missing sitelinks at the ad group / campaign level in GDS.

    Any ideas?

    Thank you

    1. Hi David,
      no, this table is not available, unfortunately. I cry about that one every day… I don’t have a good solution for this. Maybe a Google Script could do the trick, but it would not be as seamless as the BQ transfer.

      I solve it by exporting sitelink data via Supermetrics to Google Sheets and then I link this sheet to BQ where I can query the data.

      DZ

  6. Hi, I have a question regarding the backfill option.
    I would like to backfill a large amount of data (4 years).
    The thing is that it will take quite a long time to backfill, because BQ schedules the jobs at 30-minute intervals, one per day of data.
    Is there any way to change this so that the next day starts running as soon as the previous day has finished?
    Example of how BQ is scheduling the transfer now

    run date – 3 January 2020
    schedule time – 7 February 2020 at 17:44:54 UTC+1
    run date – 2 January 2020
    schedule time – 7 February 2020 at 18:14:54 UTC+1

    Thanks!

    1. Hi Idan,
      I understand the question but I don’t know the answer.

      It might be possible through the API, but I have never tested it. I just keep waiting when doing backfills.

      DZ

  7. Hi,
    Is there a possibility to get the custom Google Ads script working on an MCC account for changing bids?
    I only got it to work at the account level.
