
FIND TECHNICAL ERRORS THAT BLOCK YOUR SEARCH ENGINE VISIBILITY

We crawl your website like Google does and find all the technical errors that block Google from actually seeing it.

Our clients get 3–5 times more organic traffic on average after implementing our recommendations!

USE PROMOCODE AGT

Crawl up to 20,000 pages during your free trial, and get 10% off an annual plan for the first year when the trial ends!

Start your free trial now!


Crawl Your Website Like Googlebot

The JetOctopus report consists of several basic sections: Problems, Analytics, Data Table and Segments. The Problems section shows the most critical problems and errors found on the website by a range of automatic checks. Each section starts with a dashboard showing the total number of crawled pages, the number of pages with critical errors, and the number of pages with warning issues. Note that we count pages with errors, not the total number of errors on the website: if a page lacks a title or a meta description, or has duplicate content, it is still counted just once. This way you see the real number of problem pages rather than an inflated figure of all errors, which can vary widely from case to case. In this example, most errors are HTML errors, and the list of main issues for the section includes empty meta descriptions, iframe tags (treated as warnings), and pages lacking H1 tags. Continue reading...
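As a rough illustration of this counting logic (a sketch only, not JetOctopus's actual implementation; the issue names and severities below are assumptions), the idea is to deduplicate issues per page so a page with several problems is still counted once:

```python
from collections import defaultdict

# Hypothetical per-page issue lists produced by a crawl; names and
# severities are illustrative assumptions, not JetOctopus internals.
ISSUE_SEVERITY = {
    "empty_meta_description": "critical",
    "missing_h1": "critical",
    "iframe_tag": "warning",
}

page_issues = {
    "/jobs/1": ["empty_meta_description", "missing_h1"],
    "/jobs/2": ["iframe_tag"],
    "/about": [],
}

def dashboard_counts(page_issues):
    """Count pages per severity, counting each page at most once."""
    pages_by_severity = defaultdict(set)
    for url, issues in page_issues.items():
        for issue in issues:
            pages_by_severity[ISSUE_SEVERITY[issue]].add(url)
    return {
        "crawled_pages": len(page_issues),
        "pages_with_critical_errors": len(pages_by_severity["critical"]),
        "pages_with_warnings": len(pages_by_severity["warning"]),
    }

print(dashboard_counts(page_issues))
# {'crawled_pages': 3, 'pages_with_critical_errors': 1, 'pages_with_warnings': 1}
```

Here "/jobs/1" has two critical issues but contributes only one page to the critical count, which is exactly why the dashboard figure reflects problem pages rather than raw error totals.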

How Googlebot Crawls Your Pages - Logs Insights

You probably know that log files are treasure troves. There you can see how bots crawl your pages and which content is frequently visited or, on the contrary, ignored by bots. But SEOs often don't know how to get value out of logs: exploring the sheer volume of data manually is time-consuming, and analytics tools can be costly. Still, you need to find a way to pull the data out of your logs. Today we show how log file analysis helps reveal problems with tags, indexability issues, and crawl budget waste. Continue reading...
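As a minimal sketch of the idea (assuming a standard combined-format access log; the file name and the user-agent check are illustrative), the following counts Googlebot hits per URL so you can spot frequently visited pages and pages bots ignore:

```python
import re
from collections import Counter

# Combined log format: IP - - [timestamp] "METHOD /path HTTP/1.1" status size "referer" "user-agent"
LOG_LINE = re.compile(r'"\w+ (?P<path>\S+) HTTP/[\d.]+" \d+ \S+ "[^"]*" "(?P<agent>[^"]*)"')

def googlebot_hits(log_path):
    """Count how many times Googlebot requested each URL in an access log."""
    hits = Counter()
    with open(log_path) as log:
        for line in log:
            match = LOG_LINE.search(line)
            if match and "Googlebot" in match.group("agent"):
                hits[match.group("path")] += 1
    return hits

hits = googlebot_hits("access.log")  # assumed log file name
for url, count in hits.most_common(20):
    print(f"{count:6d}  {url}")
```

In practice you would also verify that requests claiming to be Googlebot really come from Google (for example via reverse DNS), since the user-agent string alone can be spoofed.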

Are Your Pages Showing Up in the SERPs?

A few years ago, I was trying to increase traffic on our job-aggregator website with 5 million pages. I decided to use an SEO agency's services, expecting traffic to go through the roof. But I was wrong. Instead of a comprehensive audit, I got a tarot card reading. That's why I went back to square one and created a web crawler for comprehensive on-page SEO analysis. I've been spying on Googlebot for more than a year, and now I'm ready to share insights about its behavior. I expect my observations will at least clarify how web crawlers work, and at most will help you conduct on-page optimization efficiently. I gathered the most meaningful data, useful both for a new website and for one with thousands of pages. To know for sure which pages are in the search results, you should check the indexability of the whole website. However, analyzing each URL of a website with 10 million-plus pages costs a fortune, about as much as a new car.

Let's use log file analysis instead. We work with websites in the following way: we crawl the web pages as the search bot does, and then we analyze log files gathered over half a year. Logs show whether bots visit the website, which pages were crawled, and when and how often bots visited them.
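A minimal sketch of that workflow, assuming you already have the crawled URLs and the bot-visited URLs as two lists (the variable names and sample data are illustrative): comparing the two sets immediately shows pages Googlebot never touched and bot hits on URLs your crawl never found.

```python
def compare_crawl_with_logs(crawled_urls, bot_visited_urls):
    """Cross-reference a site crawl with bot visits extracted from logs."""
    crawled = set(crawled_urls)
    visited = set(bot_visited_urls)
    return {
        # Linked on the site, but Googlebot has not requested them:
        # likely missing from the index, or low crawl priority.
        "crawled_not_visited": crawled - visited,
        # Requested by the bot, but not reachable through the site structure:
        # orphan pages or stale URLs still eating crawl budget.
        "visited_not_crawled": visited - crawled,
        "visited_and_crawled": crawled & visited,
    }

# Illustrative data
report = compare_crawl_with_logs(
    crawled_urls=["/", "/jobs/1", "/jobs/2"],
    bot_visited_urls=["/", "/jobs/1", "/old-landing"],
)
print(report["crawled_not_visited"])   # {'/jobs/2'}
print(report["visited_not_crawled"])   # {'/old-landing'}
```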

Crawling is the process of search bots visiting your website, processing all the links on its web pages, and placing those links in line for indexation. During crawling, bots compare just-processed URLs with those already in the index. This way, bots refresh their data and add or delete URLs from the search engine database to provide the most relevant and fresh results for users.

Now, we can easily draw these conclusions:

Unless the search bot was on the URL, that URL probably won't be in the index.

If Googlebot visits a URL several times a day, that URL is high-priority and therefore requires your special attention.

Altogether, this information reveals what prevents the organic growth and development of your website. Now, instead of operating blindly, your team can wisely optimize the website. Continue reading...
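One rough way to act on these conclusions (a sketch that assumes you already have per-URL hit counts, for example from the log parsing above, plus the number of days the logs cover; the daily-visit threshold is an arbitrary illustration) is to bucket URLs by how often Googlebot comes back:

```python
def prioritize_urls(hits_per_url, days_covered):
    """Bucket URLs by average Googlebot visits per day."""
    buckets = {"high_priority": [], "normal": [], "ignored": []}
    for url, hits in hits_per_url.items():
        visits_per_day = hits / days_covered
        if visits_per_day >= 1:   # "visited at least daily" is an illustrative threshold
            buckets["high_priority"].append(url)
        elif hits > 0:
            buckets["normal"].append(url)
        else:
            buckets["ignored"].append(url)
    return buckets

# Illustrative numbers over a 30-day log window
print(prioritize_urls({"/jobs/1": 120, "/jobs/2": 3, "/jobs/3": 0}, days_covered=30))
# {'high_priority': ['/jobs/1'], 'normal': ['/jobs/2'], 'ignored': ['/jobs/3']}
```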

Page Speed: How it Impacts Your SEO and How to Accelerate It

Why Website Speed Matters

Google has included site speed (and as a result, page speed) as a signal in search ranking algorithms. Speeding up websites is crucial — not just to website owners, but to all Internet users.

Surveys show that people really care about the speed of a page. But faster webpages don't just improve UX: improving site speed also reduces operating costs. In one widely cited case, a 5-second speed-up (from roughly 7 seconds to roughly 2 seconds) resulted in a 25% increase in page views, a 7–12% increase in revenue, and a 50% reduction in hardware. This last point shows the win-win of performance improvements, increasing revenue while driving down operating costs.

The JetOctopus team encourages you to start looking at your webpage speed — not only to improve your ranking in Google, but also to improve the user experience and increase profitability.
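If you want a quick first look before reaching for browser-based tools, a minimal sketch in plain Python might look like the following (the URL list is illustrative). It measures only server response plus body download, not full rendering time:

```python
import time
import urllib.request

# Illustrative URLs; replace with your own pages.
URLS = [
    "https://example.com/",
    "https://example.com/jobs",
]

def measure_load_time(url, timeout=10):
    """Time a simple GET request: server response plus body download, in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read()
    return time.perf_counter() - start

for url in URLS:
    print(f"{measure_load_time(url):6.2f}s  {url}")
```

Keep in mind that real user load time also includes rendering, JavaScript execution, and asset downloads, which browser-based tools such as Lighthouse report more faithfully.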

How Fast Should a Webpage Load

To understand what load time is “ideal”, let’s look at the real numbers. Studies show that the average load time for a webpage on the Web is 3.21 seconds. Another study shows that the average bounce rate for pages loading within 2 seconds is 9%. As soon as page load time surpasses 3 seconds, the bounce rate soars, reaching 38% by the time load time hits 5 seconds.

Based on our experience in technical SEO, the JetOctopus crawler team believes the ideal loading time is 0.5–1 second. If your load times beat the 2019 industry standards, you’re going to be faster than most of your competitors. Continue reading...