Tuesday, February 21, 2023
Today we're announcing bulk data export, a new feature that lets you export data from Search Console to Google BigQuery on an ongoing basis. (Note that the rollout will take approximately one week, so you may not have access immediately.)
You can configure an export in Search Console to get a daily data dump into your BigQuery project. The data includes all of your performance data, except for anonymized queries, which are filtered out for privacy reasons; in other words, the bulk data export is not affected by the daily data row limit. This means you can explore your data to its full potential, joining it with other data sources and using advanced analysis and visualization techniques.
This data export could be particularly helpful for large websites with tens of thousands of pages, or those receiving traffic from tens of thousands of queries a day (or both!). Small and medium sites already have access to all of their data through the user interface, the Looker Studio connector (formerly known as Data Studio), or the Search Analytics API.
Setting up a new bulk data export
To configure a new export, you'll need to prepare your BigQuery account to receive the data and fill in your details in the Search Console settings. Check the Help Center for a step-by-step guide; in general, the process is divided into two stages:
- Prepare your Cloud project (within Google Cloud Console): this includes enabling the BigQuery API in your project and granting permission to your Search Console service account.
- Set the export destination (within Search Console): this includes providing your Google Cloud project ID and choosing a dataset location. Note that only property owners can set up a bulk data export.
Once you submit the information to Search Console, it will simulate an export. If the simulation succeeds, we'll notify all property owners via email and your ongoing exports will start within 48 hours. If the simulation fails, you'll receive an immediate alert about the issue detected; here is a list of possible export errors.
Data available in bulk data exports
Once the bulk data export is set up successfully, you can log in to your BigQuery account and start querying the data.
You can find detailed table guidelines and references in the Help Center; also check the explanation of the difference between aggregating data by property and by page, as it will help you understand the data better. Here is a quick description of the three tables that will be available to you:
- searchdata_site_impression: This table contains data aggregated by property, including query, country, type, and device.
- searchdata_url_impression: This table contains data aggregated by URL, which enables a more detailed view of queries and rich results.
- ExportLog: This table is a record of what data was saved for each day. Failed exports are not recorded here.
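As a minimal first query (a sketch, assuming the dataset was named searchconsole during setup; adjust the dataset name to your own export), you could summarize the property-level table by country:

```sql
-- Top countries by clicks over the last 7 days of exported data.
-- "searchconsole" is an assumed dataset name; replace it with yours.
SELECT
  country,
  SUM(clicks) AS clicks,
  SUM(impressions) AS impressions,
  SUM(clicks) / SUM(impressions) AS ctr
FROM searchconsole.searchdata_site_impression
WHERE data_date BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 7 day) AND CURRENT_DATE()
GROUP BY country
ORDER BY clicks DESC
LIMIT 10
```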
If you need a little help getting started with querying the data, check the sample queries published in the Help Center; they can be helpful for getting up and running. Here is one example, where we pull all the query and URL combinations for pages with more than 100 clicks on FAQ rich results over the last two weeks.
SELECT
  url,
  query,
  SUM(impressions) AS impressions,
  SUM(clicks) AS clicks,
  SUM(clicks) / SUM(impressions) AS ctr,
  /* Add one below, because position is zero-based */
  ((SUM(sum_position) / SUM(impressions)) + 1.0) AS avg_position
/* Remember to update the table name to your table */
FROM searchconsole.searchdata_url_impression
WHERE search_type = 'WEB'
  AND is_tpf_faq = true
  AND data_date BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 14 day) AND CURRENT_DATE()
  AND clicks > 100
GROUP BY 1, 2
ORDER BY clicks
LIMIT 1000
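Because the export arrives as a daily dump, it can also be worth confirming which days have actually landed before filtering on a date window. A simple sketch against the URL-level table (again assuming a dataset named searchconsole):

```sql
-- List the dates present in the export, newest first,
-- with total impressions per day as a volume sanity check.
SELECT
  data_date,
  SUM(impressions) AS impressions
FROM searchconsole.searchdata_url_impression
GROUP BY data_date
ORDER BY data_date DESC
```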
We hope that by making more Google Search data available, site owners and SEOs will be able to find more content opportunities by analyzing long-tail queries. This will also make it easier to join page-level information from internal systems to Search results in a more effective and comprehensive way.
And as always, if you have any questions or concerns, please reach out to us via the Google Search Central Community or on Twitter.
Posted by Daniel Waisberg, Gaal Yahas, and Haim Daniel, Search Console team