5 Tips To Conquer The Two-Algorithm World Of 2015

The rise of deep learning means search marketers must optimize less for ranking inputs and more for searcher outputs.

By Lisa Lacy

Like it or not, SEO has become a brave new two-algorithm world in which search marketers must optimize both for Google's algorithm and for human input, which the search engine is starting to care about more. But that's not to say SEOs and digital marketers should shift gears completely and simply make pages for people instead of search engines. Rather, they should optimize for both, says Moz co-founder Rand Fishkin. Here's why.

As Google launched new algorithms over the last three years to fight manipulative links and content – and used fear and uncertainty about penalization to keep sites in line – its actions erased a decade of old-school SEO practices, Fishkin said.

But it isn't necessarily bad news from a consumer perspective. Google has also figured out searcher intent, started examining language instead of just words to provide better results, and identified scenarios in which consumers want recent results, such as a search for "digital marketing conferences."

Meanwhile, Google's search quality team has also undergone an evolution, Fishkin said. That includes incorporating machine learning to predict not only ad click-through rates (CTRs) but also organic results. As machine learning takes over more of Google's algorithm, the underpinnings of the rankings change, Fishkin said.

What's more, with a machine learning system in search – in which potential ranking factors and training data, such as what constitutes good and bad search results, feed a learning process that produces a best-fit algorithm – it's sometimes hard to figure out why something ranks the way it does. That's even more pronounced with deep learning, which takes machine learning a step further: it's essentially an algorithm that builds its own algorithm.

What Does Deep Learning Mean For SEO?

For one thing, it means Google won't know why something ranks the way it does or whether a variable is in the algorithm. That's because query success metrics – such as the long- to short-click ratio, user engagement across the domain, and the sharing/amplification rate versus other results – will be all that matter to the machines.

"We'll be optimizing less for ranking inputs and more for searcher outputs," Fishkin said.

That means the near future is really about optimizing for two algorithms, Fishkin said.

"The best SEOs have always optimized for where we're going," he said. "Today I think we know better than ever where we're going."

That means finding a balance between classic ranking inputs like keyword targeting, quality, and uniqueness and searcher outputs like relative CTR and short versus long clicks.

So how should search marketers do that? Here's Fishkin's advice.

Tip 1: Optimize More For Clicks

Search marketers should optimize the title, meta description, and URL a little for keywords and a lot for clicks.

"If you rank number three but have a higher-than-average CTR for that position, you might get moved up," Fishkin said.
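Fishkin's relative-CTR point can be sketched as a simple comparison against a per-position baseline. The baseline figures below are illustrative placeholders for this sketch, not published Google data:

```python
# A minimal sketch of the relative-CTR idea: compare a result's observed
# CTR with an assumed average CTR for its SERP position.
# The baseline figures are illustrative assumptions, not Google data.

AVG_CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def relative_ctr(clicks: int, impressions: int, position: int) -> float:
    """Return observed CTR divided by the assumed average for that position."""
    observed = clicks / impressions
    baseline = AVG_CTR_BY_POSITION[position]
    return observed / baseline

# A page ranking #3 with a 14% CTR beats the assumed 10% baseline,
# so its relative CTR is above 1.0:
print(relative_ctr(clicks=140, impressions=1000, position=3))
```

A ratio above 1.0 would flag a result that outperforms its position, the situation Fishkin suggests might earn a move up.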

Because Google often tests new results briefly on page one, Fishkin said it may also be worth publishing repeatedly on a topic to earn a high CTR.

In addition, Fishkin said driving up CTR through branded searches may give an extra boost, as the percentage of people who perform branded searches influences how a result ranks for non-branded searches.

Also, in a category like car insurance, brand spend on TV influences the searches consumers perform, which secondarily influences those brands' success metrics – which is why Fishkin said he thinks Trivago will start creeping up in travel searches.

Tip 2: Compel Site Visitors To Stay Awhile

Fishkin said pogo-sticking (when users bounce between several sites because pages rank highly but don't satisfy their queries) and long clicks (when users perform a search, click a result, and remain on that site for a long time) together may determine where a brand ranks and for how long. So SEOs should seriously consider content that fulfills both the searcher's conscious and unconscious needs, and ensure the site is fast, to compel visitors to go deeper.

One example is The New York Times, which published an interactive graphic asking readers to draw their best guess at how income predicts a child's college chances – readers would naturally spend a long time on that page drawing the graph.
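The long-click versus pogo-stick distinction above can be sketched as a dwell-time ratio. The 30-second cutoff and the data shape are assumptions for illustration, not anything Google has published:

```python
# An illustrative way to separate "long clicks" from pogo-stick visits
# using per-visit dwell times. The 30-second threshold is an assumption
# for this sketch, not a known Google parameter.

LONG_CLICK_SECONDS = 30  # assumed cutoff between a short and a long click

def long_to_short_ratio(dwell_times: list) -> float:
    """Ratio of long clicks to short clicks across one result's visits."""
    long_clicks = sum(1 for t in dwell_times if t >= LONG_CLICK_SECONDS)
    short_clicks = len(dwell_times) - long_clicks
    return long_clicks / short_clicks if short_clicks else float("inf")

visits = [4.0, 210.0, 95.0, 12.0, 600.0]  # seconds on site per visit
print(long_to_short_ratio(visits))  # 3 long clicks vs. 2 short → 1.5
```

A rising ratio would suggest the page is satisfying searchers rather than sending them back to the results page.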

Tip 3: Be As Comprehensive As Possible

Google is looking for content signals that a page will fulfill all of a searcher's needs, and machine learning models may note that the presence of certain words, phrases, and topics predicts more successful searches.

In other words, in a search for New York City, a page that mentions each of the five boroughs may rank higher than one that does not.

Tools like AlchemyAPI and MonkeyLearn can help here, Fishkin said.
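A toy version of that comprehensiveness check might score a page by how many expected subtopics its text covers. The borough list mirrors the article's New York City example; the scoring itself is illustrative, not how Google or the named tools work:

```python
# Toy comprehensiveness score: what fraction of expected subtopics does a
# page mention? The topic list follows the article's NYC example; the
# scoring is an illustration, not a real ranking-model feature.

EXPECTED_TOPICS = {"manhattan", "brooklyn", "queens", "bronx", "staten island"}

def topic_coverage(page_text: str) -> float:
    """Fraction of expected subtopics mentioned in the page text."""
    text = page_text.lower()
    covered = {topic for topic in EXPECTED_TOPICS if topic in text}
    return len(covered) / len(EXPECTED_TOPICS)

page = "Our NYC guide covers Manhattan, Brooklyn, Queens and the Bronx."
print(topic_coverage(page))  # 4 of 5 boroughs mentioned → 0.8
```

A real pipeline would use entity or topic extraction (the role tools like AlchemyAPI and MonkeyLearn played) rather than raw substring matching.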

Tip 4: Create Content That Inspires Loyalty

Pages that get lots of social activity and engagement but few links seem to overperform, even for highly competitive keywords like "photos of dogs."

Fishkin said this behavior appears when a URL becomes hot on social media, which could mean Google is using other metrics – such as clickstream data, engagement, and branded queries – to approximate social shares.

Per Fishkin, Google almost certainly classifies search results pages differently and optimizes them toward different goals; a medical query, for example, might be best served by a page that does not have many social shares.

In addition, Google probably wants to see shares that result in loyalty and return visits, Fishkin said. Knowing what the audience and their influencers share is essential, as is knowing what makes them return or prevents them from doing so.

"We don't need better content. We need 10X content," Fishkin said.

Tip 5: Solve The User’s Entire Task, Not Just A Query

Google wants searchers to accomplish their tasks faster.

So if Google sees that many users who perform similar types of queries end up on the same site, it might use clickstream data to rank that site higher, even if that site doesn't have traditional ranking signals.

A page that simply answers the initial query may not be enough, because Google wants to send users to websites that resolve their mission, Fishkin said.

What do you think about optimizing for both Google's algorithm and human input?

Written by Lisa Lacy

Lisa is a senior features writer for Inked. She also previously covered digital marketing for Incisive Media. Her background includes editorial positions at Dow Jones, the Financial Times, the Huffington Post, AOL, Amazon, Hearst, Martha Stewart Living and the Dian Fossey Gorilla Fund.

Inked is published by Linkdex, the SEO platform of choice for professional marketers.