What Is Technical SEO?

The definition of technical SEO, how good practice can impact site performance, and what your team or agency should routinely check.

By Nichola Stott from theMediaFlow.

Without a technically optimized foundation, any time and money you spend on other SEO activities, such as what's commonly referred to as "on-page" optimization and content marketing, is spent inefficiently. This article explains what technical SEO is, how good practice can impact site performance, and what your team or agency should routinely check.

"Do you know what my favorite part of the game is? The opportunity to play." – Mike Singletary

SEO includes a range of activities designed to result in more sales via more relevant, organic search traffic to your website. Activities like:

  • Making sure the keywords that accurately reflect what your page offers are placed in the right locations, used in a way that matches how people use a search engine.
  • Creating useful, original content to support your product and increase the ways that search users can come across your brand.

These kinds of activities are critical once your site is competing in the search engine results. Here's the thing… Have you entered the game? Do you know if all the right pages on your site are in the competition? Do you have any performance inhibitors that are stopping your pages from getting into the race?

Accessible and Unique

The web is quite a challenging place as an information repository. Because a hyperlink can connect two documents in separate locations, the result is an infinite, web-like structure and complexity that we don't get with more traditional information repositories – such as a hierarchical filing structure. Think about a time you might have landed on an interesting Wikipedia page, then found an interesting (hyperlinked) source, which links to another Wikipedia page. It's a never-ending click hole!

This isn't just a time-sink for humans. Robots, like spiders and crawlers, travel by links too; therefore we need to allow them a clear and easy path through our site to ensure that all the URLs we want search engines to index are accessible and unique.

By "accessible" our concern is to ensure that search spiders can access the URLs that you (the brand, business, or site owner) need to get indexed and ranked. On the flip side, we need to ensure that URLs that aren't supposed to be indexed and ranked are adequately blocked. There are a handful of ways of achieving that end result. The solution a technical SEO recommends will consider the totality of your site, your high-level digital marketing goals, and the data they have available before deciding how to prevent any site URLs from being accessed or indexed. Here are the main considerations when it comes to an efficient crawl.

1. Website Architecture

Most popular search engines, such as Google and Bing, place a strong emphasis on link data in their core algorithms. Search engines try to surface the best answer to a human question. While numerous signals contribute towards this qualification, freely given links from other writers on the web can act as a signal that a page is perceived as useful and authoritative. This link data accumulates equity on the target URL, and the stronger the equity, the higher the algorithmic value of this component. For this reason, having a site structure that affords the efficient flow of equity through your site should be a high priority. An ideal structure should be a pyramid and not a waterfall, and, as far as possible, should keep core pages no further than three clicks from home.

The second reason it's important to architect a site coherently is that spiders travel via links; that is how our site pages are discovered before the machine decides to index. A well-architected site will naturally result in a number of internal links pointing to deeper pages, thus increasing the chances of such a page being spidered.

Layers of complexity are added when it comes to larger ecommerce sites that may have numerous categorizations, products, and variations within the same product (e.g., an online clothing store that caters to men and women, with numerous collections, types of item within a collection, and variables such as size, color, and use-case – party dress, work dress, evening dress, etc.). Remembering that, whenever possible, we want the core product page to be no more than three levels from home, this is where a technical SEO will be looking at your use of attributes and directives.
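
To make the "no more than three clicks from home" guideline measurable, click depth can be sampled with a short script. The sketch below is a minimal example rather than a production crawler: it assumes a small site, uses the placeholder start URL https://www.example.com/, and relies on the requests and BeautifulSoup libraries to breadth-first crawl internal links and report how many clicks each discovered URL sits from the homepage.

    # Minimal sketch: report click depth of internal URLs from the homepage.
    # Assumes a small site; https://www.example.com/ is a placeholder start URL.
    from collections import deque
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    START = "https://www.example.com/"
    MAX_DEPTH = 3  # flag anything deeper than three clicks from home

    def crawl_depths(start=START, max_pages=500):
        host = urlparse(start).netloc
        depths = {start: 0}
        queue = deque([start])
        while queue and len(depths) < max_pages:
            url = queue.popleft()
            try:
                resp = requests.get(url, timeout=10)
            except requests.RequestException:
                continue
            if "text/html" not in resp.headers.get("Content-Type", ""):
                continue
            soup = BeautifulSoup(resp.text, "html.parser")
            for a in soup.find_all("a", href=True):
                link = urljoin(url, a["href"]).split("#")[0]
                if urlparse(link).netloc == host and link not in depths:
                    depths[link] = depths[url] + 1
                    queue.append(link)
        return depths

    if __name__ == "__main__":
        for url, depth in sorted(crawl_depths().items(), key=lambda kv: kv[1]):
            marker = "  <-- deeper than target" if depth > MAX_DEPTH else ""
            print(f"{depth}  {url}{marker}")

In practice a dedicated crawling tool will do this at scale, but even a rough report like this quickly surfaces core pages that have slipped too deep into the structure.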

2. Attributes & Directives

Attributes and directives can help solve similar common problems that occur on larger sites, particularly ecommerce sites. One of the most common issues preventing full crawl and indexing of larger sites is that caused by substantially similar pages. A search engine crawler is looking for new content to add to the index, and a page has to be different enough (from other pages on the site) to warrant space in the index. With ecommerce sites, pages often aren't that different from each other (e.g., category landing pages with multiple pages of results). If I navigate to "black shoes" on my favorite online clothing store I will have hundreds if not thousands of results returned. Using attributes correctly, we can describe the relationship between one page and another when, to all intents and purposes, Page 2 of my results is similar to Page 1 or Page 3. Considering rel attributes in mark-up, a technical SEO will assess what kind of attributes you're using and for what type of pages. Here are some that would be commonly considered, with a small check sketched after the list:

  • Rel=prev or rel=next: pagination attributes can be used for category landing pages, and help describe the relationship between multiple pages of results when there's little other point of difference.
  • Rel=canonical: can be used to indicate a preferred URL when a set of URLs are substantially similar.
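
As an illustration of the kind of routine check this involves, the sketch below fetches a few paginated category URLs (placeholders for illustration only) and reports the canonical, prev, and next annotations each page declares, so you can see at a glance whether substantially similar pages describe their relationship consistently.

    # Minimal sketch: report canonical/prev/next annotations for paginated pages.
    # The category URLs are placeholders for illustration only.
    import requests
    from bs4 import BeautifulSoup

    PAGES = [
        "https://www.example-store.com/black-shoes/",
        "https://www.example-store.com/black-shoes/?page=2",
        "https://www.example-store.com/black-shoes/?page=3",
    ]

    def link_href(soup, rel_value):
        # Return the href of the first <link> whose rel contains rel_value.
        for tag in soup.find_all("link", href=True):
            rels = tag.get("rel") or []
            if isinstance(rels, str):
                rels = rels.split()
            if rel_value in rels:
                return tag["href"]
        return None

    for page in PAGES:
        soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
        print(page)
        for rel in ("canonical", "prev", "next"):
            print(f"  {rel}: {link_href(soup, rel)}")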

Another way to consider treating substantially similar pages is with strategic decisions about whether all versions should be indexed. In such situations we have to determine if our substantially similar pages have any point of difference that makes all versions worthy of indexing.

3. Index Or Not?

Setting aside the obvious areas you can't allow to be indexed (site functions, secure processes, login areas and such), there are considerations as to what other pages may be prevented from crawl or index, for the greater good. Search engine crawlers are looking for new, unique content, so it is a waste of crawl budget and index space to allow large areas of a site that add no point of difference to be crawled and indexed. At worst, such issues could result in algorithmic demotions triggered by the Panda algorithm filter. For example, if at the end of a season I move a lot of unsold stock to a "sale" folder, it may be best to keep this content out of the index so as not to compete with my core, full-priced inventory. There are a number of ways to prevent crawl and index. A technical SEO will often need to understand and quantify all of a website's variables before suggesting a way (or even a combination of ways) of preventing URLs from being crawled and indexed. Options include the following (a small verification sketch follows the list):

  • Securing pages behind a login.
  • Using instructions in a robots.txt file to prevent URLs and folders being crawled.
  • Using a "noindex" directive on an individual page or set of pages.
  • Using parameter handling in Search Console for URLs that have functional parameters applied even though the content hardly alters.
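
A small verification sketch, assuming placeholder URLs and a generic crawler user agent: it uses Python's urllib.robotparser to test sample URLs against the live robots.txt, and checks crawlable pages for a meta robots "noindex" directive, so that blocking rules can be confirmed to behave as intended.

    # Minimal sketch: check whether sample URLs are blocked by robots.txt
    # and whether crawlable pages carry a meta robots "noindex" directive.
    # All URLs below are placeholders for illustration.
    import urllib.robotparser

    import requests
    from bs4 import BeautifulSoup

    SITE = "https://www.example-store.com"
    SHOULD_BE_BLOCKED = [SITE + "/sale/", SITE + "/checkout/"]
    SHOULD_BE_CRAWLABLE = [SITE + "/black-shoes/"]

    rp = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
    rp.read()

    for url in SHOULD_BE_BLOCKED:
        if rp.can_fetch("*", url):
            print("WARNING: crawlable but expected blocked:", url)

    for url in SHOULD_BE_CRAWLABLE:
        if not rp.can_fetch("*", url):
            print("WARNING: blocked but expected crawlable:", url)
        else:
            soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
            meta = soup.find("meta", attrs={"name": "robots"})
            if meta and "noindex" in meta.get("content", "").lower():
                print("NOTE: page is crawlable but set to noindex:", url)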

Keeping indexed pages at an optimal level means that we're keeping any equity moving efficiently through the site.

4. Crawl Efficiency

To get an efficient crawl we want to make sure that spiders can get around our site with as few barriers as possible, so that there's a greater chance of discovering the important URLs that we want in the index. Our concerns here as technical SEOs are:

  • Do all URLs return a 200 (success) header status?
  • Do 200 pages all render the correct content?
  • Do we have any 404 (not found/error) pages?
  • Are we linking to any URLs that return a 404?
  • Does our intended 404 page return a customized page that helps get people (and robots) to another page?
  • Can we find any redirected URLs on crawling the site?

It's important to keep a site as tidy as possible, with no links to URLs that return a 404 and no links to URLs that redirect to another URL. Instead, all links on our site should point directly to their destination. While in many situations there may be reasons for having to redirect URLs, using a 301 redirection header response code (not a 302, for example) will also redirect some of the link equity we talked about earlier, as well as human visitors. However, there's no need to link to a URL that 301 redirects to another URL on our own site.
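
Checks like these are straightforward to script. The sketch below (with placeholder URLs; in practice the list would come from a full site crawl) requests each URL without following redirects and then walks any redirect chain, which makes it easy to spot 404s, 302s where a 301 was intended, and internal links that should point directly at their destination.

    # Minimal sketch: report header status and redirect chains for a list of URLs.
    # URLs are placeholders; in practice the list would come from a site crawl.
    import requests

    URLS = [
        "https://www.example.com/",
        "https://www.example.com/old-category/",
        "https://www.example.com/missing-page/",
    ]

    for url in URLS:
        chain = []
        current = url
        for _ in range(10):  # guard against redirect loops
            resp = requests.get(current, allow_redirects=False, timeout=10)
            chain.append((current, resp.status_code))
            if resp.status_code in (301, 302, 307, 308) and "Location" in resp.headers:
                current = requests.compat.urljoin(current, resp.headers["Location"])
            else:
                break
        print(" -> ".join(f"{u} [{code}]" for u, code in chain))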

5. Index Efficiency

Understanding how our site is perceived in search indices is an important part of the benchmarking process. It can help us understand how much of our content is indexed, whether we might have a problem with duplicate content, and whether we're wasting space with old URLs that are still indexed.

Using combinations of advanced queries, a technical SEO will try to understand how many results from your site are indexed; whether these are the right results; and whether there is an issue with a lack of URL, domain, or subdomain canonicalization. A query like [site:thisisanexample.com] may reveal both HTTP and HTTPS results for my website. If that is the case, is this as intended? Is this impacting performance? These are all questions that would need to be assessed as part of a technical audit (a small protocol check is sketched at the end of this section). When problems are found and treatments agreed (such as a disallow rule in the robots.txt file), it's then good practice to clean up the index by submitting a removal request in Search Console for a specific URL or even a folder.

Optimizing crawl and index work together as a cornerstone of technical SEO. We're making efficient use of search robots moving through our site without dead ends and redirections, as well as preventing a build-up of substantially similar or even functionally duplicate pages from getting indexed. Time and again we see some of our most significant gains in visibility, traffic, and rankings resulting from cleaning up crawl and index.
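
The protocol and subdomain question above is one of the easier checks to script. The sketch below uses thisisanexample.com as the placeholder domain from the query example; it requests each common variant of the homepage and reports where it ends up. If the variants do not all resolve to a single preferred URL, that supports what the site: query suggested.

    # Minimal sketch: check whether protocol/subdomain variants resolve to one preferred URL.
    # thisisanexample.com is a placeholder domain from the text above.
    import requests

    VARIANTS = [
        "http://thisisanexample.com/",
        "https://thisisanexample.com/",
        "http://www.thisisanexample.com/",
        "https://www.thisisanexample.com/",
    ]

    final_urls = set()
    for url in VARIANTS:
        resp = requests.get(url, timeout=10)  # follows redirects by default
        final_urls.add(resp.url)
        print(f"{url} -> {resp.url} ({resp.status_code})")

    if len(final_urls) > 1:
        print("WARNING: variants resolve to more than one final URL:", final_urls)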

6. User-Friendly

In recent years, the most significant algorithm updates from Google have put the user experience front and center. In April this year we had the mobile update, and from as early as 2010 Google have confirmed that usability aspects such as page speed are factors included in their ranking algorithm.

Device Strategy

A technical SEO review should take account of how your website performs on different kinds of devices. Does your analytics package show any large disparity in performance? Is there a significant difference in your rankings when comparing mobile to desktop search queries?

How to Cater for Different Devices

Routine checks should include sample device comparisons from a front-end perspective, as well as looking at how you're delivering content to non-desktop users. Choices exist around using one website that responds and scales according to the device information sent in the server request, or using more than one website version, each optimized for a particular kind of screen (e.g. a desktop version and a smartphone version), with automatic redirection according to user agent.

There's no single perfect solution for all websites. The best strategy must take account of your business direction and audience needs. That said, one of the most common technical SEO issues is caused by duplicated websites intended for different audiences. Calling a site or subdomain "the mobile website" doesn't make it a mobile website. A website is available on any device that can connect and receive data over HTTP unless we specifically set it to be treated differently. If we don't, the result will be the kind of duplicate content issues mentioned above, which can cause serious problems.
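
A quick front-end comparison can also be scripted as part of routine checks: fetch the same URL with a desktop and a smartphone user-agent string, then compare where each request ends up and whether the response declares a Vary: User-Agent header. The sketch below uses a placeholder URL and illustrative user-agent strings; it is a rough spot-check, not a substitute for testing on real devices.

    # Minimal sketch: compare how a URL responds to desktop vs. smartphone user agents.
    # The URL and user-agent strings are illustrative placeholders.
    import requests

    URL = "https://www.example.com/"
    USER_AGENTS = {
        "desktop": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        "smartphone": "Mozilla/5.0 (iPhone; CPU iPhone OS 8_0 like Mac OS X)",
    }

    for label, ua in USER_AGENTS.items():
        resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
        print(f"{label}: final URL {resp.url}, status {resp.status_code}, "
              f"Vary: {resp.headers.get('Vary', 'not set')}, "
              f"HTML length {len(resp.text)}")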

Speed Matters

Pages on your site should render quickly irrespective of the end-user device. There are several ways to improve performance. Checks could include the following (a simple timing sketch follows the list):

  • Host performance, security, and server specification.
  • How page content is delivered and the use of any CDN (content delivery network).
  • Reducing any unnecessary scripts and tightening code.
  • Parallelizing simultaneous calls.
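
As a starting point for the speed checks above, response times can be sampled with a short script before reaching for full profiling tools. The sketch below uses placeholder URLs and simply times a handful of requests per URL; it measures server response and download time only, not full page rendering in a browser.

    # Minimal sketch: sample response times for a list of URLs.
    # Measures request/download time only, not full page render; URLs are placeholders.
    import statistics
    import time

    import requests

    URLS = [
        "https://www.example.com/",
        "https://www.example.com/black-shoes/",
    ]
    SAMPLES = 5

    for url in URLS:
        timings = []
        for _ in range(SAMPLES):
            start = time.perf_counter()
            requests.get(url, timeout=30)
            timings.append(time.perf_counter() - start)
        print(f"{url}: median {statistics.median(timings):.2f}s "
              f"over {SAMPLES} requests")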

We've experienced tangible incremental traffic gains that appear to directly correlate with the completion of speed optimization work on a handful of client sites, so it's definitely an area to make sure is on the checklist as part of your technical SEO plan. If you're interested in monitoring or implementing a technical SEO routine with your team, there's a handy checklist on our site that you might want to check out, covering all of the above aspects and more.


Written by Nichola Stott

Managing Director, theMediaFlow

Nichola Stott is managing director of theMediaFlow, a multi-award-winning digital marketing agency combining technical excellence with creativity and a strong focus on SEO. Nichola has worked in digital communications for almost 20 years, with experience in global communications, investor relations, design and, most recently before founding theMediaFlow, as head of search partnerships at Yahoo!
