Understanding Google Crawl Factors.
The last section of Google Search Console is the crawl reports. We have a handful of reports, and you can see them here in the left-hand navigation: Crawl Errors, which is the page I am on, Crawl Stats, Fetch as Google, robots.txt Tester, Sitemaps, and URL Parameters.
First, the Crawl Errors report.
It deals with issues that Google has found with your site. These might be server errors, missing pages, or pages that return bad request error codes. Along the top, you see your site errors. This section shows the main issues over the past 90 days that prevented Google from crawling your entire site. You can select any of the boxes (DNS, Server Connectivity, or robots.txt Fetch) and it displays a relevant chart for the full 90-day window.
The first box is DNS.
It refers to the Domain Name System. This is the relay that translates the name of your site into the address that servers use to find it; your server is where all the files of your website live. If you have a DNS error, it means Google could not sort out where your server was: the machine that handles the name-to-number translation may have broken, and you find out about it here.
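That name-to-number translation is the same lookup your own machine performs. Here is a minimal sketch using Python's standard library; the hostnames are placeholders for illustration:

```python
import socket

def resolve(hostname):
    """Translate a site name into the numeric address servers use to
    find it: the same DNS step that fails when Search Console reports
    a DNS error."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None  # DNS could not sort out where the server is

print(resolve("example.com"))           # an IP string if the lookup succeeds
print(resolve("no-such-host.invalid"))  # None: the name cannot be resolved
```

When `resolve` returns `None`, that is roughly the situation Search Console is flagging as a DNS error.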
Next, Server Connectivity errors.
These refer to any issues with your actual server or hosting provider. This could be a speed problem, downtime, and so on. A timeout, shown in red, indicates that Google waited too long and gave up. A "no response," shown in purple, means Google never heard anything back from the server. Find out the cause of the problem so you can understand how to prevent it from happening in the future.
Next we have robots.txt Fetch. Every time Google visits your site, it accesses your robots.txt file first. If Google goes to grab the robots.txt file and there is an error, you see it here. Google won't know which pages it can and cannot index without that file, so double-check that your robots.txt is not broken.
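For reference, robots.txt is just a plain-text file at the root of your domain. A minimal sketch, with a hypothetical disallowed directory and sitemap location, looks like this:

```
User-agent: *
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
```

The `User-agent` line says which crawlers the rules apply to, and each `Disallow` line names a path they should not crawl.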
Now, below this is the URL Errors section.
I scroll down and you can see the chart that displays the URL errors. This tends to be the most helpful of the crawl reports. Not all of the views you see here, labeled Smartphone, Feature phone, and so on, with error types like Soft 404 and Not found, are going to be available to you; it really depends on whether there are errors for those sections, and on the size and scope of your site. What you see here are the specific errors Google encountered when trying to crawl as either a desktop or a smartphone. When you drill into any of these sections, say Smartphone for example, you are only going to see errors that occur in that view; they are exclusive to smartphones.
So, if an issue only occurs in one view, that is where it is displayed; it is a problem exclusive to that view. You'll notice that the data changes down below based on which view you are within.
I am back on the Desktop view, so let's discuss all of these elements. Along the top of the chart you have the various error types, such as Soft 404 and Not found. A soft 404 means that Google accessed a page that does not exist, but your server is not telling Google that it does not exist. Your server is supposed to return a 404 File Not Found error code, so you need to talk with your web host and developer, or, if you are server savvy, review why your server is not returning the proper 404 error code.
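The distinction comes down to the HTTP status code: a real 404 returns status 404, while a soft 404 returns 200 OK with a body that reads like an error page. A minimal sketch of that classification (the phrase list is a hypothetical heuristic, not Google's actual logic):

```python
def classify_response(status, body):
    """Roughly classify a server response for a missing page.

    A proper missing page returns HTTP 404. A "soft 404" returns
    200 OK, but the page content reads like an error page.
    """
    error_phrases = ("page not found", "does not exist", "404")
    looks_missing = any(p in body.lower() for p in error_phrases)
    if status == 404:
        return "hard 404"   # correct behavior: server reports the miss
    if status == 200 and looks_missing:
        return "soft 404"   # server should be returning 404 instead
    return "ok"

# A server returning 200 for a missing page is the soft-404 case.
print(classify_response(200, "<h1>Page Not Found</h1>"))  # soft 404
print(classify_response(404, "<h1>Page Not Found</h1>"))  # hard 404
```

Fixing a soft 404 means changing the status line the server sends, not the page content.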
So, it is important to remedy the issues found in these sections. You'll notice that as I scroll down the page, Google lists the URLs along with a priority. You can sort the view by priority, by response code, or even by the date the issue was detected. You can download the full list, up to a thousand pages, by selecting Download here on the left-hand side.
Use the graph to identify major blips in any of the areas. Here I can see we had a large number of server errors, but those seem to have dissipated over time.
Get more detail by clicking any link. This will be especially helpful for not-found pages: if you have a Not found or soft 404 error, Google is going to tell you where the link originates from, so you'll want to go to that page and identify why it is broken.
Next up is your Crawl Stats.
I select that from the menu on the left-hand side. This is a pretty basic overview that lets you see the activity on your site for the last 90 days.
You can see how many pages are being crawled per day, the amount of content being downloaded, and how fast Google downloaded those pages. Your goal is to decrease the time Google spends crawling by optimizing your site; by doing that, you are likely to increase the crawl rate, because once you have optimized, Google can get through your site faster.
From there, let's take a look at the Fetch as Google option.
This area is pretty useful. It is a very good tool that allows you to simulate how Google crawls or renders a URL on your site. The idea is that you give Google a URL on your site, and it sends a crawler out to fetch that page; it is really a small-scale simulation of the real deal.
The Fetch as Google tool has two modes of operation for testing crawling and rendering on any URL of your site: one is Fetch, and the other is Fetch and Render.
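Outside Search Console, you can very roughly approximate a fetch yourself by requesting a page with Googlebot's user-agent string. A small sketch with Python's standard library; the URL is a placeholder, and this is only an approximation, since real Googlebot behavior involves much more:

```python
from urllib.request import Request, urlopen

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def googlebot_request(url):
    """Build a request that identifies itself with Googlebot's
    desktop user-agent string."""
    return Request(url, headers={"User-Agent": GOOGLEBOT_UA})

req = googlebot_request("https://example.com/")  # placeholder URL
print(req.get_header("User-agent"))
# To actually fetch the page: urlopen(req, timeout=10).read()
```

Some servers vary their response by user-agent, which is exactly the kind of difference the real Fetch as Google tool helps you spot.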
Let's go ahead and take a look at how they work. I select Fetch, which automatically brings in the home page. Now, you'll notice that I have a drop-down which allows me to choose whether I fetch as a desktop or a smartphone device. I'll leave it on Desktop rather than Smartphone. The fetch happens pretty quickly; here it is down at the bottom.
In this table view, you'll notice a little check mark that indicates the fetch is complete. I select it here to take a look at some of the details, along with the content that was fetched.
Here at the bottom of the page, we can see the total time it took Google to fetch that data. Now I scroll to the top, where I have a Submit to index option, which allows me to tell Google to index this content. You do not need to manually submit pages for indexing, but you can use this as a way to introduce a new piece of content to Google faster than waiting for Google to come around to your site. There are no guarantees, though.
Now I go back, and this time I choose Fetch and Render.
Let me show you the difference when you use Fetch and Render. It normally takes a few minutes, but it allows you to detect differences between how your page looks to you and how Google sees your page. Once it is ready, select the entry, and you'll see on the left how Googlebot crawled your page and on the right how visitors see your page.
Next, I want to show you the robots.txt Tester.
I select that from the menu on the left. Much like Crawl Errors, this section gives you an understanding of whether Google found a problem with your robots.txt file. Maybe you have a typo, or directives that are improperly applied; you can find those problems here.
You can also use this section, at the bottom of the screen, to see if a URL you believe is blocked is actually blocked. You add a URL here and then choose Test, and Google lets you know whether it can crawl it or not.
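You can run the same kind of check locally with Python's standard-library robots.txt parser. A small sketch, using hypothetical rules and URLs; in practice you would point the parser at your live file with `set_url()` and `read()`:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents for illustration.
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Ask whether a given crawler may fetch a given URL.
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

This mirrors what the tester in Search Console does: match a URL against your directives and report blocked or allowed.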
Moving along, we have the Sitemaps section.
I choose that here on the left-hand side. It is a good idea to provide a sitemap.xml file for your site if you do not have one. At a glance, this section shows you how many pages you have submitted to Google and how many they have actually indexed. This helps you identify issues; perhaps you are blocking content with your robots.txt file.
So, look for huge discrepancies between your sitemap and your index. Or perhaps Google can't actually navigate your site properly, so it is not sure the sitemap is actually relevant. You can also drill into a sitemap by selecting it here at the bottom of the screen; this will show the different types of content that are being indexed. Now, in this scenario we are only submitting a web page sitemap, but you could have sitemaps for images and videos, as well as news content, and those would appear here. Check this section for any errors or warnings; these will help you with any issues with your sitemap.
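For reference, a page sitemap is just an XML file listing your URLs. A minimal sketch that generates one with the standard library; the URLs are placeholders:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml document listing the given page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs for illustration.
print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

The real sitemap protocol also allows optional tags like `lastmod` per URL, but a bare list of `loc` entries is already a valid sitemap you could submit here.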
Last but not least, we have the URL Parameters section.
This is the most advanced feature, so you primarily want to check into it when Google is warning you of any errors. Parameters are added to the end of a URL; you might see a question mark followed by category=xyz, and so on. Google is going to try its best to understand those parameters, but you might need to provide more directives here by selecting the Configure URL parameters link. Learning more about this section really goes beyond the course. And this is it for the crawl section; there is really important crawl data in these reports, so use them to your advantage.
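As a quick illustration of what those parameters look like, Python's standard library can split them out of a URL; the URL here is a placeholder:

```python
from urllib.parse import urlparse, parse_qs

# A placeholder URL with query parameters of the ?category=xyz form.
url = "https://example.com/products?category=xyz&sort=price"

query = urlparse(url).query   # "category=xyz&sort=price"
params = parse_qs(query)      # each parameter's values arrive as a list
print(params)  # {'category': ['xyz'], 'sort': ['price']}
```

These are the key-value pairs the URL Parameters tool is asking you to describe, for example whether `category` changes the page content or merely sorts it.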