
7 Technical SEO Mistakes You Must Avoid

Technical SEO refers to optimizing the technical side of a website, as opposed to its content pages. Think of it as the backbone of a website: a strong technical structure gives your content a robust “base” so it can rank better for relevant keywords.

A website that is plagued with technical errors, on the other hand, lowers its chances of ranking on the search engine results page (SERP).

As an SEO strategist, I’ve had the chance to work on a number of technical SEO audits for clients. Along the way, I’ve come across some common technical SEO mistakes that website owners must avoid:

1. Blocking duplicate content using robots.txt

Duplicate pages are considered low-quality content by Google. Google may choose to lower rankings for a website that has duplicate content issues.

(To learn more about low-quality content, check out our previous post.)

People who are new to SEO might think that blocking duplicate pages with the robots.txt file solves the duplicate content issue. It does not work that way.

[Image: robots.txt file]

A robots.txt file is used to tell crawlers that they should not visit certain pages on the site. Sure, it blocks Googlebot from accessing those pages, but it doesn’t effectively remove the duplicate pages from the search index.

These duplicate URLs can still end up indexed if they are linked from other locations on the Internet.

Here are some proper ways to solve duplicate content:

  • Manually remove duplicate content or posts
  • Use a 301 redirect to point an old page to a new URL that doesn’t duplicate other content
  • Use a canonical link to specify the “preferred” version of the duplicate content (see the snippet below)
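For example, a canonical tag is a single line placed inside the head of each duplicate page. This is a minimal sketch; the URL below is a placeholder, so point it at whichever version you want search engines to treat as the original:

  <!-- Goes in the <head> of every duplicate or variant page. -->
  <!-- The href is a placeholder; replace it with your preferred URL. -->
  <link rel="canonical" href="https://example.com/preferred-page/" />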

2. Missing XML sitemap

An XML sitemap allows search engines to crawl a website more effectively. It ensures that Googlebot can easily discover all the available pages on your website and index them quickly.

[Image: XML sitemap]

Therefore, submitting a sitemap is an important part of optimizing a website. Here are a few best practices when creating an XML sitemap:

  • Only include canonical URLs
  • Exclude URLs that are disallowed by robots.txt
  • Specify a last modification time for each URL (see the example below)
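Putting these together, a minimal sitemap is a plain XML file like the sketch below; the URLs and dates are placeholders for illustration:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <!-- One <url> entry per canonical, crawlable page -->
    <url>
      <loc>https://example.com/</loc>
      <lastmod>2018-06-01</lastmod>
    </url>
    <url>
      <loc>https://example.com/about/</loc>
      <lastmod>2018-05-20</lastmod>
    </url>
  </urlset>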

3. Missing structured data

Structured data is markup added to a webpage that describes its content in an organized, machine-readable format.

This markup adds extra details about the content on a page and helps Google better understand that particular webpage.

When done correctly, structured data helps a webpage earn enhanced search results such as rich snippets, knowledge panels and rich cards. These can also increase your click-through rate (CTR) on SERPs.

Here are two different formats of structured data, and how to add each of them to the HTML of your web pages:

Microdata

Microdata uses an in-line method, adding markup attributes directly to the HTML elements on the page.

Let’s start with a website that talks about a local business.

You want to tell search engines what this company is, where it’s located and so on. Your HTML code might look something like this:

[Image: Microdata markup example]
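Here is a rough sketch using schema.org’s LocalBusiness type; the business name, address and phone number are made up for illustration:

  <div itemscope itemtype="https://schema.org/LocalBusiness">
    <!-- Each itemprop attribute attaches one detail to the LocalBusiness item -->
    <h1 itemprop="name">Example Coffee House</h1>
    <div itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
      <span itemprop="streetAddress">123 Main Street</span>,
      <span itemprop="addressLocality">Kuala Lumpur</span>
    </div>
    <span itemprop="telephone">+60-3-1234-5678</span>
  </div>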

JSON-LD

JSON-LD uses JavaScript notation to insert the markup as a script block, usually in the head of the HTML code.

Say you have a recipe webpage and you want to mark it up with details such as ingredients, prep time and calories.

Let’s look at an example:

[Image: JSON-LD markup example]
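A minimal sketch using schema.org’s Recipe type might look like this (the recipe details are placeholders):

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Banana Bread",
    "prepTime": "PT15M",
    "recipeIngredient": ["3 ripe bananas", "2 cups flour", "1 egg"],
    "nutrition": {
      "@type": "NutritionInformation",
      "calories": "240 calories"
    }
  }
  </script>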

4. Deep page depth

Page depth is calculated by the number of clicks needed to reach a specific page from the homepage.

For example, if you are visiting our Homepage (starts at depth 1), it will take you 1 click to reach our About page. Therefore, our About page is at depth 2.

If a webpage takes more than 3 clicks to reach, Google will not crawl it as often as pages closer to the homepage. That also means deep pages are less likely to rank highly on search engines.

Review your site structure

Getting rid of deep pages not only improves SEO performance, it also provides a better user experience. Here are a few ways to get started:

[Image: SEO site hierarchy]

  • Map out your current site structure in the form of a tree graph
  • Identify how these URLs are linked together. Do any of them sit too many clicks deep?
  • Use internal links to give users better access to valuable information on your website
  • Use breadcrumbs to indicate where visitors are on your site, especially on large, complex eCommerce websites (see the sketch below)
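A breadcrumb trail is simply an ordered set of links from the homepage down to the current page. The category and product names below are placeholders:

  <nav>
    <!-- Each link steps one level deeper, mirroring the site hierarchy -->
    <a href="/">Home</a> &gt;
    <a href="/shoes/">Shoes</a> &gt;
    <a href="/shoes/running/">Running Shoes</a>
  </nav>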

5. Using noindex the wrong way

A “noindex” meta robots tag is used when you want to tell Google not to index a specific webpage URL. In other words, you don’t want this URL to appear on the search engine results page (SERP).

[Image: noindex robots meta tag]
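The tag itself is a single line placed in the head of the page you want kept out of the index:

  <meta name="robots" content="noindex">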

In some situations, you might want to “noindex” a page, for example:

  • A page whose content largely overlaps with another article
  • Listing or category pages that are not worth indexing on search engines
  • Pages created purely to support other pages, which don’t need to stand on their own in search results

However, improper implementation of “noindex” can hurt your SEO. It’s recommended to avoid the practices below:

  • Noindexing a large percentage of your pages; Google still has to crawl them, which uses up “crawl budget” that could go to more important pages on your website
  • Noindexing your own blog posts by mistake, which causes Google to remove them from its search results
  • Noindexing pages that most of your internal links point to, such as the top-level navigation pages of the website

6. Redirect failures

Redirect failures on your website are bad for SEO. They block users from accessing the page, interrupt browsing and cause visitors to abandon the site. If your site is well ranked on Google, a redirect failure can cause a significant loss of rankings.

One common redirection issue is the “too many redirects” error. Visitors using the Chrome browser will end up seeing an error message like this:

[Image: “too many redirects” error message in Chrome]

What can cause redirect errors?

A technical misconfiguration in the server, hosting account or content management system (CMS) can cause this error.

It is a particularly common problem on WordPress websites, as WordPress uses redirect functions to create SEO-friendly URLs.

For example, inconsistent settings between the WordPress Address and Site Address URLs can cause a redirect loop. You want to make sure both use the same version of the URL prefix:

  • WordPress Address: http://mysite.com
  • Site Address URL: http://mysite.com

Or if you prefer to use the www version,

  • WordPress Address: http://www.mysite.com
  • Site Address URL: http://www.mysite.com

Once you decide on your URL version, head over to your server and configure the wp-config.php file like this:

  define('WP_HOME', 'http://example.com');    // Site Address (URL)
  define('WP_SITEURL', 'http://example.com'); // WordPress Address (URL); use the same www/non-www version as WP_HOME

7. Slow page loading speed

Users quit browsing when a site is too slow. This sends signals to Google that the website has a high bounce rate and short time-on-site.

As a result, the search engine may choose not to rank the site highly.

There are many possible causes of a slow website. For example:

Poor server performance

A low-cost web host, such as shared hosting, usually gives you limited performance. Shared hosting means your website shares server resources with other websites.

When a sudden spike of traffic hits your website, pages take longer to load and the whole site slows down.

Too many plugins

For many WordPress users, plugins enhance the functionality and features of a website. It’s totally fine to have a few of them, but too many can slow down your site.

[Image: WordPress plugins]

Installed plugins can add weight to your page size, load extra external JavaScript and grow the database stored on the server. All of these affect page loading speed.

If you’re running tons of plugins, it’s time for a cleanup. Take a closer look at whether each plugin is really necessary.

Large images

Is your website image-heavy? Are you using extra-large images? If the answer is “Yes”, you might be experiencing a slow site.

The thing is, large images take more time to download and longer to display.

To resolve large image issues, compress images using a tool like Compressor.io to reduce file size and speed up page loads. Also choose the right format for web images: a format like TIFF (Tag Image File Format) produces much larger files than JPG or PNG.

Want some EXTRA help?

Technical SEO plays a major role in improving search engine rankings. The biggest problem for most people is that you don’t have time to figure it all out. You have other, more important things to do.

That’s why we created MashWebby Full SEO Audit.

Full SEO Audit is the easiest way to get started with a technical SEO audit. We help you identify the SEO issues that keep your website from ranking. Our in-depth analysis covers technical SEO, content, a links audit and a competitors’ ranking audit.

Click here to check out MashWebby Full SEO Audit.

About the Author: Zoe Chew is an SEO training instructor and the Founder of mashwebby.com. She develops high-quality online products with her expertise in digital marketing and SEO to help business owners increase traffic and sales.
