Google Big Query: Automation Powerhouse for Google Ads

This article describes potential use cases of Google Big Query for automating the management of your Google Ads accounts. It’s based on 4 months of intensive personal experience. I don’t know everything, but I know enough to share some thoughts and basic directions. Before diving deep into Big Query, you have to know SQL. If you don’t know SQL at the moment, you can at least learn what’s possible (and feel the need for learning SQL). So let’s get started.

This article is super long, so let me give you a preview of the covered topics:

  1. Why Should You Use Big Query for PPC?
  2. How to Set Up a Project in Google Cloud Platform?
  3. How to Link Big Query with Your Google Ads Accounts?
  4. What Do You See After Linking?
  5. How to Get Other Data to Big Query?
  6. The First Set of Queries You Should Prepare
  7. A Few Suggestions for Google Ads Automation via Big Query
  8. How to Get Data from Big Query Back to Google Ads
  9. Conclusion

***Tip: Have a question? Ask in the comment section below or in this Facebook group. You can also sign up for our free newsletter.***

***Do you feel like you need more help? We can help you with PPC Automation and/or with your reporting under our new project MythicalReports.com here.***

Why Should You Use Big Query for PPC?

By now, you are probably asking what Big Query actually is. Big Query is “a module” inside Google Cloud Platform. Google’s definition is: “Google BigQuery is an enterprise data warehouse that solves this problem by enabling super-fast SQL queries using the processing power of Google’s infrastructure. Simply move your data into BigQuery and let us handle the hard work.” Ok, so now we know Big Query is a powerful data warehouse against which you can run SQL queries. If you are a frequent reader of this website, you know it has mainly been dealing with Power Query/Power BI so far. Think about Big Query as a more advanced “Power Query” which runs on Google’s servers and is controlled purely by SQL. So basically, you offload all the processing to Google’s servers and operate everything through your web browser.

So now to answer the basic question: Why should you use Big Query for managing your Google Ads account? You can create custom data flows such as:

  1. Your Google Ads campaigns generate data
  2. You load the data to Big Query
  3. You run SQL transformation queries against this data inside Big Query
  4. You take the results of the queries and load them back to Google Ads. The format is just a set of “rows”.
  5. You schedule the entire process so it runs over and over and over

And now some more specific use cases where Big Query comes in handy:

  1. Your standard Google Ads scripts do not finish within 30 minutes because your account is too big
  2. You are not really a fan of Javascript but you know SQL already
  3. You can create custom bidding logic which e.g. takes into account average position AND impression share
  4. You can keep pausing/activating keywords when the landing pages have products in/out of stock
  5. You can create custom bidding algorithms which align bids with current budget pacing so you don’t overspend by the end of the month
  6. You can add new keywords based on search queries
  7. You can add new negative keywords based on poorly performing search terms
  8. You can automate campaign creation based on inputs in Google Sheets
  9. You can enrich Google Ads data with your in-house data – you can create custom bidding mechanisms based on the lifetime value of your customers, or you can e.g. pair Google click IDs (=gclids) with data in your CRM and only upload “good leads” back to Google Ads
  10. You want to build your own database of gclids which lasts more than 30 days
  11. And many more. Basically, you are only limited by your imagination and needs because you import a complete snapshot of your Google Ads accounts with all metrics and all columns
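For instance, the gclid database from point 10 can be as simple as a scheduled query appending yesterday’s click IDs into your own permanent table. A rough sketch – I am assuming here that your transfer’s click stats table exposes GclId, Date and CampaignId columns and that a GclidArchive table already exists; verify the names against your own schema:

```sql
-- Sketch: append yesterday's gclids into a permanent table so they
-- survive past the 30-day window of the source data.
-- Table and column names are assumptions; check your schema first.
INSERT INTO `DatasetName.GclidArchive` (GclId, Date, CampaignId)
SELECT DISTINCT
  GclId,
  Date,
  CampaignId
FROM `DatasetName.ClickStats_ACCOUNTID`
WHERE Date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
```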

Limits

I am using the word “powerhouse” in the title of this article, so let’s explain that. Big Query can hold billions of rows, so if you have huge accounts with millions of items (keywords, ads), you can easily handle this data in Big Query, while you would have a hard time working with it in Power Query in Excel. You would have 2 problems: even getting the data into Power Query and then querying it.

So how long does it take to run a complex SQL query in Big Query against, say, all 1,000,000 keywords in your accounts? You would be looking at 1 minute, maybe 2 minutes. Super heavy queries with multiple joins run for about 5 minutes for me.

Even queries in Big Query can sometimes fail – that usually happens when you use too many ORDER BY clauses in your SQL. Remove them and you will be fine.
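In practice, that means keeping ORDER BY out of subqueries and views; if you need sorted output, sort only once at the outermost level. A small sketch of the pattern (the table name is a placeholder):

```sql
-- ORDER BY inside subqueries/views forces an expensive global sort
-- and can make big queries fail. Sort once, at the very end.
SELECT
  Keyword,
  Clicks
FROM (
  SELECT Keyword, SUM(Clicks) AS Clicks
  FROM `DatasetName.KeywordView`   -- placeholder view name
  GROUP BY Keyword
  -- no ORDER BY here
)
ORDER BY Clicks DESC  -- the only ORDER BY in the whole query
LIMIT 1000
```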

Queries in Big Query can run for a max of 2 hours. I hit this limit once when I had an error in my SQL and all the rows were getting multiplied several times – the end result would have been several billion rows.

Other than ORDER BY and major flaws in your SQL, there are really no extra limits – at least I have not hit any, and some of my tables carry more than 10,000,000 rows. 🙂

Pricing

Now the important question – how much does Big Query cost? It’s not free, but it’s quite affordable when you use it for managing your Google Ads accounts. Here is the breakdown of the costs:

  1. You pay for the “data transfer” of your Google Ads accounts. You pay for every Google Ads customer ID: 1 account = 2 USD per month. If you link an MCC which contains 10 accounts to Big Query, you will pay 20 USD per month.
  2. You pay for running SQL queries; however, you can query 1 TB per month per project for free. Every extra TB costs 5 USD. I do exceed 1 TB in a big MCC which has many automation features pushed through Big Query.
  3. You pay for storage – there is something like “active” and “long-term” storage.

Here is how the cost breakdown looks in one of my projects:

You can actually run SQL against all your operations in Big Query to get the exact cost per “Big Query” SKU. You can see that I am paying almost 40 USD for my AdWords MCC linking (20 accounts in the MCC), then I pay $11 for active storage and then $10.35 for “Analysis” = querying the data. This means that I queried more than 2 TB in October 2018. The rest of the costs are close to 0 USD. Overall, I should be paying around 60 USD per month in this project; however, I paid only 48 USD in October. Why is that? Some active storage also comes for free every month. More details about pricing can be found here.

So say 50 USD per month. Is it too expensive? No, it’s not. When you get to this level of spend in Big Query, it likely means you are spending hundreds of thousands of dollars in Google Ads, so the cost of using Big Query is really negligible.

How to Set Up a Project in Google Cloud Platform?

When you want to start using Big Query, you need to set up your first Google Cloud Platform project. I will refer you to an older article where I describe how to do exactly that. Read more here.

How to Link Big Query with Your Google Ads Accounts?

After you have your project ready and you finally get to Big Query UI, it’s time to connect your Google Ads account. So let’s start.

The first thing you need to do is to create a dataset where the Google Ads data will be stored. Find Create Dataset button:

Fill out the parameters. A tip which can save you many headaches in the future: always create datasets in the same location across your project. If you have table 1 in dataset A sitting in the US and you want to join it with table 2 from dataset B sitting in the EU, you will not be able to do it, and you cannot change the location AFTER you save the dataset. You can only query tables in datasets in the same location. Set “Never” in Default table expiration. Press Create dataset.

Voilà, here is your new dataset:

I already had an “AdWordsData” dataset there before writing this article. Now it’s time to fill your shiny new dataset with data from Google Ads. Find the Transfers button:

You will be taken to the old UI (unless you already started in the old UI) and hit Enable:

Then Add Transfer:

And now it’s time to pay attention:

Steps:

  1. Select “Google Ads” in Source
  2. Enter some user-friendly name for the transfer – you can change it later.
  3. Set your refresh window. The max is 30 days; use 30. It’s the best option. Latent conversions are assigned back to the date of the last click (not the date they happened), and this setting makes sure that even 30-day-old data will be up to date because Big Query will always reload the last 30 days of data.
  4. Set the schedule to 24 hours. You can do it less frequently, but I don’t see any reason to refresh less than once a day.
  5. Select the destination dataset which you created a few minutes ago.
  6. Enter your Google Ads customer ID. You can enter either the 1111111111 or the 111-111-1111 format. You can enter your MCC ID and all accounts in that MCC will get downloaded, OR you can enter the ID of a single account. It’s up to you. If you are running an MCC but you only need 2 accounts from that MCC to be transferred to Big Query, you can either create a separate dataset and a separate transfer for each Google Ads account, OR you can create 2 transfers but load all the data into 1 dataset.
  7. Click “Advanced” and enable “Send email notifications” – you will get an email if a data load fails.
  8. Press the “Add” button and wait 5 minutes. In case you’re prompted for account details in a pop-up window, go ahead and enter them + allow access for Big Query to Google Ads.

Next, what you will see is something like this:

At the bottom you will see all the days of data which are currently being transferred to Big Query. At this point, you are probably wondering which Google Ads reports are being sent to Big Query. All of them + all the columns :). That’s the beauty. You will have all available Google Ads data sitting in Big Query ready for your SQL magic!

If you also want historical Google Ads data, you can schedule a backfill (the max for 1 run is 180 days, so you have to repeat the backfills manually if you have several years of data). 1 day of data can take up to an hour to be transferred, so you have to be patient at the beginning.

As a side note, I should also mention that there is a Google Ads script (it’s called Big Query Exporter) which exports only selected reports to Big Query, unlike the “transfer setup” which exports all of them. The problem with the Big Query Exporter script is that it’s not very reliable (it can fail for no reason), it can run only for 30 minutes like any other script, it does not have a refresh window, and overall it’s just a bad choice. The positive thing is that it’s free no matter how many accounts you have, and it will give you user-friendly reports from the very beginning. If you are interested in using the Big Query Exporter script, read more here.

What Do You See After Linking?

Okay, so you have linked your Google Ads with Big Query and you ran the first sync. After a few minutes, your dataset will start getting filled with various tables. Basically, any piece of data about your campaigns which is accessible via the Google Ads API will be loaded into Big Query tables and views. This is how it looks in the old Big Query UI:

I will now switch to the new beta UI since this is what Google will be pushing from now on. The same view as before in the new UI:

You will see hundreds of views and tables. What’s the difference between a view and a table? A “view” is a pre-saved SQL query over underlying tables. When you want to preview a view, you need to run the query itself. When you want to preview a table, you just click the Preview button and see what’s inside.

How do you tell what’s a view and what’s a table? The leading icons are different:

You can see what the query behind a “view” is:

You can see that the views always add a new column “_LATEST_DATE” with yesterday’s date (the snapshots were taken on 2018-11-04) on top of the underlying table of a similar name. When the underlying table is called “DatasetName.p_Keyword_ACCOUNTID”, the view will be called “DatasetName.Keyword_ACCOUNTID”, without the “p_”.
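To illustrate the pattern, a view like “DatasetName.Keyword_ACCOUNTID” wraps its underlying “p_” table in a query roughly like this sketch (Google generates the actual query for you, so treat the exact SQL as an assumption):

```sql
-- Sketch of what an auto-generated "view" looks like: it selects
-- everything from the underlying "p_" table and adds a
-- _LATEST_DATE column containing yesterday's date.
SELECT
  *,
  DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY) AS _LATEST_DATE
FROM
  `DatasetName.p_Keyword_ACCOUNTID`
```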

You can also preview the table:

(I am not showing the “preview” because my sample account in Google Ads has no data.) You can also see the “Schema” of the table, showing what each column in the table means.

After you get a little more oriented, you will see that there are 2 types of tables/views. The first set of views/tables is the simple “performance” split by date. The second set is the snapshot of “attributes” of the entities in the table. So, for example, the DatasetName.p_Keyword_ACCOUNTID table contains information about the keywords:

So you can see a keyword’s CPC bid, Final URL, Approval Status, etc. What you cannot see is the keywords’ performance, such as Impressions & Clicks. Those are stored in a different table – in DatasetName.p_KeywordBasicStats_ACCOUNTID:

And to make things even messier, performance data is spread across multiple “stats” tables – so you have “BasicStats”, “ConversionStats”, “CrossDeviceStats” and so on. I will not be getting into more detail since this is a topic for 10 articles alone. 🙂 What I am trying to say here is that getting some meaningful data for the very first time can look like an impossible task.
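To give you an idea of what joining the two types of tables involves, here is a hedged sketch of combining keyword attributes with their basic stats – the exact column names in your tables may differ slightly, so check the Schema tab first:

```sql
-- Join keyword attributes (bids, URLs) with daily performance.
-- Column names follow the AdWords transfer schema as I know it;
-- verify them against your own tables.
SELECT
  k.CampaignId,
  k.AdGroupId,
  k.CriterionId,
  k.Criteria             AS Keyword,
  s.Date,
  SUM(s.Impressions)     AS Impressions,
  SUM(s.Clicks)          AS Clicks,
  SUM(s.Cost) / 1000000  AS Cost   -- cost comes in micros
FROM `DatasetName.Keyword_ACCOUNTID` AS k
JOIN `DatasetName.KeywordBasicStats_ACCOUNTID` AS s
  ON  k.CampaignId  = s.CampaignId
  AND k.AdGroupId   = s.AdGroupId
  AND k.CriterionId = s.CriterionId
GROUP BY 1, 2, 3, 4, 5
```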

How to Get Other Data to Big Query?

When it comes to uploading data other than Google Ads data to Big Query, you have a few options: you can create an empty table, you can load data from Google Cloud Storage, you can do a manual upload, or you can use data stored in your Google Drive:

Manual uploads support these file types: CSV, JSON, Avro, Parquet, ORC. (I have no idea what the last 3 options mean)

Things get a little more interesting in the case of Google Drive:

You can use a Google Sheet as a table (only one sheet from the spreadsheet, though) or you can use a CSV stored in your Google Drive. So, for example, if you input some manual data on a daily basis and you want to pass it to Big Query, the best option is definitely Google Sheets.

Big Query also has its own API through which you can upload your custom data (say, data from Facebook, data from Bing Ads, data from your CRM, etc.). These integrations are not as straightforward as the “clickable” options in the UI. You need to write a custom piece of code and run it on your server on a regular basis in order to pass fresh data to Big Query. In case you need help with custom data, we have already developed our own proprietary solutions for a few clients (Facebook, Bing, custom CRM integrations, loading multiple sheets from a Google Sheets spreadsheet). You can contact us for more info.

The First Set of Queries You Should Prepare

When you start looking around at the imported tables and views, you will notice that the imported data is not very user-friendly. Let’s take, for example, our “DatasetName.p_Keyword_ACCOUNTID” table:

So you get CampaignId and AdGroupId columns but no “Campaign” or “AdGroup” column with user-friendly names. That’s the way it is, unfortunately. After seeing this problem, you will quickly figure out that the first thing you need to do is create a few default views listing the items in your accounts, such as:

  1. “Accounts view” – list of all account names and IDs in case you are importing MCC
  2. “Campaign view” – list of all campaign names with campaign IDs and other campaign attributes, along with account name joined from #1. Then similarly:
  3. “AdGroup view”
  4. “Keyword view”
  5. “Ad view”
  6. Any other elements you are interested in

You can write SQL directly in Big Query or in a text editor of your choice (I am using Sublime). Writing code in Big Query looks like this:

The highlighted buttons allow you to save the query as a view OR save the results as a table.

Here is sample SQL to create a list of all campaigns along with user-friendly account names (assume I have already created the “account view”; I write the query in Sublime and then CTRL+C, CTRL+V it into the Big Query window above):
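In case the screenshot is hard to read, a minimal sketch of such a campaign-list query could look like this. I am assuming an “Accounts” view with ExternalCustomerId and AccountDescriptiveName columns already exists; adjust the names to your own schema:

```sql
-- List all campaigns with user-friendly account names.
-- "DatasetName.Accounts" is an assumed helper view (see step #1).
SELECT
  a.AccountDescriptiveName AS Account,
  c.ExternalCustomerId,
  c.CampaignId,
  c.CampaignName           AS Campaign,
  c.CampaignStatus
FROM `DatasetName.Campaign_ACCOUNTID` AS c
JOIN `DatasetName.Accounts` AS a
  ON c.ExternalCustomerId = a.ExternalCustomerId
-- GROUP BY dedupes in case the source holds multiple snapshot rows
GROUP BY 1, 2, 3, 4, 5
```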

After you are done creating the lists of items as views, you can create “performance” views which do have nice user-friendly names of accounts, campaigns, ad groups, keywords… So you will need to create views such as:

  1. Account performance with nice friendly names
  2. Campaign performance with nice friendly names
  3. Ad group performance with nice friendly names
  4. Keyword performance with nice friendly names
  5. Ad performance with nice friendly names

So let’s have a look at how the campaign performance view query looks:

And part 2:
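If the screenshots are hard to follow, the overall shape of such a performance view is roughly this – a sketch under the assumption that the campaign-list view from the previous section exists (I call it “CampaignView” here as a placeholder), with your stats column names possibly differing:

```sql
-- Campaign performance by date with user-friendly names.
-- "DatasetName.CampaignView" is an assumed helper view built earlier.
SELECT
  c.Account,
  c.Campaign,
  s.Date,
  SUM(s.Impressions)     AS Impressions,
  SUM(s.Clicks)          AS Clicks,
  SUM(s.Cost) / 1000000  AS Cost   -- micros to currency units
FROM `DatasetName.CampaignBasicStats_ACCOUNTID` AS s
JOIN `DatasetName.CampaignView` AS c
  ON s.CampaignId = c.CampaignId
GROUP BY 1, 2, 3
```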

You can see that creating even a very basic performance view looks frightening, but don’t worry – once you get past your first project, you will get used to it. What’s more, you can just copy the views between projects and change only the account IDs and dataset names.

So after you create your account, campaign, ad group, keyword and ad performance views, you can move to the next phase – running custom queries – which will help you automate your Google Ads account.

A Few Suggestions for Google Ads Automation via Big Query

If you made it this far, you are probably wondering what you can get for all the pain you spent on creating even just the basic views. So here are some real use cases which we have developed so far:

  1. A complete build-out of a search campaign structure based on a “feed” of inputs stored in Google Sheets – we had to employ SQL and Big Query’s performance since we had to generate around 3M keywords and hundreds of thousands of custom ads, everything based on super custom rules. If new items appear in our feed, we can easily generate campaigns, keywords and ads with a few clicks (we could send all this new content to Google Ads automatically if we wanted to).
  2. Automatically adding good search terms as new keywords (if your campaign structure allows).
  3. Automatically adding bad search terms as new negative keywords.
  4. Pausing/activating keywords based on the content of your website – for this exercise, we built a custom crawler running on a daily basis through the client’s pages, collecting information about their offering. We then pair this crawled data with Final URLs in Google Ads. If certain conditions are met, we pause/activate keywords.
  5. Creating custom bidding strategies by combining Average Position AND Impression Share.
  6. Creating a custom bidding strategy which makes sure your account doesn’t run too hot or too cold against your monthly budget.
  7. Various checks for empty ad groups (no KWs, no Ads).
  8. Check for duplicate items (ads, keywords).
  9. Custom conversion uploads based on the combination of Google Ads data (gclids) and data stored in client’s CRM – handy for clients when you want to upload only good leads and not “all leads”.
  10. Custom adjustments of campaigns’ geo targeting based on the information stored in a feed
  11. Custom adjustments of campaigns’ budgets
  12. … anything else on your mind which is changeable via API or bulk uploads in Big Query

Obviously, showing the exact steps and SQL code for samples 1–12 would result in 12 articles of similar length to this one – I am not going to do that, I just want to give you some inspiration about what’s possible.

On top of performing various automation tasks, Big Query is super useful for very data-heavy custom reports – e.g. when you need to prepare a location analysis involving millions of rows, Big Query will have no trouble processing such a volume of data.

How to Get Data from Big Query Back to Google Ads?

The final step – you have your queries in Big Query ready and you need to get the results into Google Ads. How do you achieve that? You can take multiple routes.

Let’s assume you have a budget pacing query which returns a set of keywords which need to have their bids adjusted. The results look like this:

The query returns action, campaign, ad group, keyword, match type, changed max CPC, campaign ID and ad group ID columns, and now you need to push these results into Google Ads.
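The final SELECT of such a query just needs to emit columns matching the Bulk Upload headers. A hedged sketch – the pacing logic itself is simplified here to a flat bid cut, and “KeywordView” is an assumed helper view; real pacing logic would compare month-to-date spend against the budget:

```sql
-- Emit rows in a Bulk Upload friendly shape. Simplified example:
-- lower bids by 10% when pacing runs hot.
SELECT
  'Edit'                   AS Action,
  k.Campaign,
  k.AdGroup,
  k.Keyword,
  k.MatchType              AS `Match type`,
  ROUND(k.CpcBid * 0.9, 2) AS `Max CPC`,
  k.CampaignId,
  k.AdGroupId
FROM `DatasetName.KeywordView` AS k  -- assumed helper view
WHERE k.CpcBid > 0.05                -- keep a floor under the bids
```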

Scenario 1:

  1. Run queries in Big Query
  2. Load the formatted data to Google Sheets via Supermetrics for Google Sheets. The guys from Supermetrics have built a Big Query connector which allows you to schedule your queries.
  3. Use a Google Ads script to pull the data from Google Sheets and apply it as a “Bulk Upload” (or an upload involving iterators in Google Ads). The sample screen above is suitable to be uploaded via “Bulk Upload”.

Scenario 1 allows you to double-check the results of your queries in Google Sheets whenever you want, but you may hit Google Sheets limits if you try to load too many rows from Big Query into Google Sheets.

Scenario 2:

  1. Run queries in Big Query
  2. Apply the results of your queries immediately as a Bulk Upload in Google Ads with our custom Google Ads script. The Google Ads script for this approach can be downloaded here. When you open the script, I recommend reading through the comments in the code.

Bulk Upload is a quite underused approach to making changes in Google Ads – you can change hundreds of thousands of items with an upload of 1 CSV file, something you cannot do with regular scripts involving iterators. On the other hand, the number of entities which can be changed via Bulk Uploads is limited. Read more about the supported entities here.

If you need a custom script (e.g. to change campaign locations, which are not supported by Bulk Uploads) to upload the results of Big Query queries into Google Ads, contact us for more info.

Conclusion

At the beginning of this article, I mentioned this flow:

  1. Your Google Ads campaigns generate data
  2. You load the data to Big Query
  3. You run SQL transformation queries against this data inside Big Query
  4. You take the results of the queries and load them back to Google Ads. The format is just a set of “rows”.
  5. You schedule the entire process so it runs over and over and over by scheduling your Google Ads scripts & Supermetrics queries.

And that circle is now closed. :)

I do realize this was a very long article. I tried to give you an idea of what Big Query can do, how difficult it is to set up, and what obstacles you will face along the way.

Big Query converted into a PPC automation tool is not for everybody, and not that many people actually know about it. It requires SQL skills, a little knowledge of Google Ads scripts, a quite lengthy initial learning curve and a not-so-easy setup process.

On the other hand, once you master the basics, the extreme processing power of Big Query can deliver results which Google Ads scripts simply cannot achieve on very big accounts due to their limits.

If you found this article helpful, share it with your PPC tribe 🙂

