
Quick Reference — 9 Topics · 40+ Questions

SEO
Quick Answers

Fast, accurate answers to the most frequently asked SEO questions — from what an audit actually is, to how backlinks work, to what the latest algorithm updates mean for your site.

Topic 01

SEO Audits — What They Are and How to Run One

An SEO audit is a comprehensive evaluation of a website's technical health, content quality, and backlink profile to identify issues that are preventing the site from ranking well in search engines. A full audit covers crawlability, indexation, page speed, on-page optimisation, content quality, internal linking, and off-page authority. It produces a prioritised list of fixes ordered by impact. Think of it as a doctor's check-up for your website.
Run a full audit quarterly for most sites. After any significant site migration, CMS change, or redesign, run an immediate post-launch audit to catch regressions. After major Google algorithm updates, run a targeted audit focused on the affected signals. New sites should be audited at launch and again at 3 months, once Google has had time to crawl and index the content.
The essential stack: Screaming Frog (site crawl — free up to 500 URLs, roughly £250/year for unlimited), Google Search Console (indexation and performance data — free), Ahrefs or Semrush (backlink analysis, keyword tracking — entry plans from roughly $130/month), and PageSpeed Insights (Core Web Vitals — free). For local businesses, add BrightLocal for citation and review auditing.
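If you want to script part of the audit, the free PageSpeed Insights API exposes the same Core Web Vitals field data the web tool shows. The sketch below is a minimal example, assuming the requests library; the example.com URL is a placeholder, and which metric keys come back depends on whether Google has field data for the page.

```python
"""Minimal sketch: pull Core Web Vitals field data from the free
PageSpeed Insights API (no API key needed for light, occasional use)."""
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def core_web_vitals(url: str, strategy: str = "mobile") -> dict:
    """Return field-data (CrUX) metrics for a URL, where Google has them."""
    resp = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60)
    resp.raise_for_status()
    metrics = resp.json().get("loadingExperience", {}).get("metrics", {})
    # Each metric, when present, carries a percentile value and a
    # FAST / AVERAGE / SLOW category assigned by Google.
    return {
        name: {"percentile": m.get("percentile"), "category": m.get("category")}
        for name, m in metrics.items()
    }

if __name__ == "__main__":
    for name, data in core_web_vitals("https://example.com").items():  # placeholder URL
        print(name, data)
```
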
Check the indexation status of your key pages first. Open Search Console's page indexing (Coverage) report and review the reasons pages are excluded. "Crawled, currently not indexed" means Google crawled your pages but chose not to index them — usually because it deemed the content low quality, thin, or duplicate. This single finding often explains why a site with good content isn't ranking. Everything else matters less if your core pages aren't even in the index.
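Alongside the Search Console report, a quick script can flag the self-inflicted blockers (bad status codes, stray noindex directives) on your most important URLs. This is a rough sketch only, assuming the requests library; it cannot detect quality-based exclusions such as "Crawled, currently not indexed", which are visible only in Search Console.

```python
"""Rough sketch: flag self-inflicted indexability blockers on key URLs.
It checks status codes, the X-Robots-Tag header, and meta robots noindex;
it cannot see quality-based exclusions, which only Search Console reports."""
import re
import requests

def indexability_report(url: str) -> dict:
    resp = requests.get(url, timeout=30, headers={"User-Agent": "audit-check/0.1"})
    # Deliberately loose pattern: any <meta ...> tag containing "noindex".
    meta_noindex = bool(re.search(r"<meta[^>]*noindex", resp.text, re.IGNORECASE))
    return {
        "url": url,
        "status": resp.status_code,
        "header_noindex": "noindex" in resp.headers.get("X-Robots-Tag", "").lower(),
        "meta_noindex": meta_noindex,
    }

if __name__ == "__main__":
    # Placeholder URLs: swap in the pages that matter most to your business.
    for page in ["https://example.com/", "https://example.com/services/"]:
        print(indexability_report(page))
```
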
A technical audit focuses exclusively on the engineering layer: crawlability, rendering, page speed, structured data, redirects, and indexation. An SEO audit is broader — it encompasses technical SEO plus content quality, keyword strategy, on-page optimisation, and link profile analysis. For most sites, you need both, but technical issues should be fixed before investing in content improvement.

Topic 03

Google Algorithm Updates — What They Are and How to Respond

A Google algorithm update is a change to the way Google's search ranking system evaluates and orders web pages. Google runs thousands of small updates per year — most are invisible. Named updates (Core Updates, Helpful Content Updates, Link Spam Updates, etc.) are larger changes that can cause significant ranking movements. Google officially announces major updates on its Search Central Blog, and industry tools like Semrush Sensor or MozCast track unconfirmed volatility.
Google's Helpful Content system (launched 2022, now baked into the core algorithm) targets sites that produce content primarily for search engines rather than for people. It's a site-wide signal — if Google determines that a significant portion of your site is unhelpful "search-engine-first" content, all pages on your site are demoted. Sites with thin, AI-generated-without-editing content, or content that exists only because the keyword has search volume, are most at risk.
If your rankings drop after a Core Update, first confirm the update is the cause and not a coincidental technical issue (check Search Console for crawl errors or index coverage problems). If it is the update, understand that Core Updates don't penalise sites — they re-evaluate all sites against new quality thresholds. The question Google advises asking is: compared to the pages that now outrank you, is your content genuinely more useful and comprehensive? If the honest answer is no, that's your roadmap. Content quality improvements can take 2–3 months (often not until the next core update rolls out) to reflect in rankings.
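One quick way to confirm the timing is to export the Performance report by date from Search Console and compare average clicks in the weeks before and after the announced rollout. A small sketch, assuming pandas and a CSV export named gsc_dates.csv with Date and Clicks columns; the update date shown is a placeholder.

```python
"""Sketch: check whether a traffic drop lines up with a Core Update.
Assumes the Search Console Performance report was exported by date to
'gsc_dates.csv' with 'Date' and 'Clicks' columns; the update date below
is a placeholder, use the date Google announced for the rollout."""
import pandas as pd

UPDATE_DATE = "2025-03-13"   # placeholder: announced rollout start
WINDOW_DAYS = 14

df = pd.read_csv("gsc_dates.csv", parse_dates=["Date"]).sort_values("Date")
update = pd.Timestamp(UPDATE_DATE)

before = df[(df["Date"] >= update - pd.Timedelta(days=WINDOW_DAYS)) & (df["Date"] < update)]
after = df[(df["Date"] >= update) & (df["Date"] < update + pd.Timedelta(days=WINDOW_DAYS))]

change = (after["Clicks"].mean() - before["Clicks"].mean()) / before["Clicks"].mean()
print(f"Avg daily clicks, {WINDOW_DAYS}d before: {before['Clicks'].mean():.0f}")
print(f"Avg daily clicks, {WINDOW_DAYS}d after:  {after['Clicks'].mean():.0f}")
print(f"Change: {change:+.1%}")
```
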
Panda (2011) targeted low-quality, thin, and duplicate content — sites with large amounts of poor content were demoted across the board. Penguin (2012) targeted manipulative link building — sites with unnatural backlink profiles (exact-match anchor text manipulation, link farms) were penalised. Hummingbird (2013) was a core algorithm rewrite that enabled Google to understand natural language queries rather than matching keywords literally. All three are now permanently integrated into the core algorithm, running in real-time rather than as periodic updates.

Topic 04

E-E-A-T — Experience, Expertise, Authoritativeness, Trustworthiness

E-E-A-T stands for Experience, Expertise, Authoritativeness, and Trustworthiness. It's the framework Google's human quality raters use to evaluate website quality, drawn from Google's Search Quality Evaluator Guidelines. Experience (added in 2022) refers to first-hand experience with the topic. Expertise is formal or demonstrated knowledge. Authoritativeness is reputation within your field. Trust is the overarching factor — an untrustworthy site cannot be rated high on E-E-A-T regardless of expertise.
E-E-A-T is not a direct ranking factor — it is not a measurable algorithmic signal like page speed. However, the signals that demonstrate E-E-A-T (expert authorship, authoritative backlinks, accurate information, positive brand signals) are heavily weighted ranking factors. Think of E-E-A-T as the framework Google uses to describe what high-quality content looks like; the algorithm is built to approximate it using hundreds of measurable signals.
Key actions to improve E-E-A-T: (1) Add detailed author bios with credentials, professional profiles, and links to social proof for every writer. (2) Cite authoritative sources and link out to them. (3) Display business trust signals: a clear About page, physical address, phone number, and verified contact information. (4) Earn editorial backlinks from respected publications in your field. (5) Keep all factual claims accurate and updated. (6) For YMYL (Your Money or Your Life) topics — health, finance, legal — hire genuine subject matter experts as authors or reviewers.
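One concrete way to surface the authorship signals from point (1) is Article structured data that names the author and links to their professional profiles. A minimal sketch follows; every name and URL in it is a placeholder, and the markup supports, rather than substitutes for, visible author bios on the page.

```python
"""Sketch: Article structured data that makes authorship explicit, one of
the easier E-E-A-T signals to get right. Every name and URL is a placeholder;
the JSON goes into a <script type="application/ld+json"> tag in the page head."""
import json

article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How to Read Your Blood Test Results",   # placeholder headline
    "author": {
        "@type": "Person",
        "name": "Dr. Jane Example",                      # placeholder author
        "jobTitle": "Consultant Haematologist",
        "url": "https://example.com/authors/jane-example",
        "sameAs": [
            # Links to professional profiles act as verifiable social proof.
            "https://www.linkedin.com/in/jane-example",
        ],
    },
    "datePublished": "2025-01-15",
    "dateModified": "2025-06-01",
}

print(json.dumps(article_schema, indent=2))
```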
YMYL stands for "Your Money or Your Life" — topics where inaccurate information could seriously harm readers' health, financial wellbeing, safety, or civic participation. Google holds YMYL content to higher E-E-A-T standards because the stakes of low-quality content are higher. YMYL categories include medical, legal, and financial advice; news and current events; government and voting information; and shopping and financial transactions. If your site covers any of these areas, demonstrable expertise and trustworthiness are not optional — they are required to rank.

Topic 05

Google Penalties — Manual Actions and How to Recover

A manual action is applied by a human Google reviewer who has determined your site violates Google's spam policies (formerly the Webmaster Guidelines). You will receive a notification in Search Console's Manual Actions report. An algorithmic drop is caused by an algorithm update re-evaluating your site — there is no notification, and there is nothing to appeal. Manual actions require specific remediation and a reconsideration request. Algorithmic drops require content or link profile improvement and waiting for a subsequent algorithm refresh to recover.
The most common manual actions: unnatural links pointing to your site (link buying or link schemes), unnatural links from your site (selling links), thin or low-quality content, cloaking (showing different content to Google than to users), sneaky redirects, hidden text or hidden links, and pure spam. The most frequent is unnatural inbound links — often a legacy issue from historical black-hat link building by previous site owners or SEO agencies.
Step 1: Read the exact manual action description in Search Console to understand what violation was found. Step 2: Fix the root cause — remove or disavow bad links, remove thin content, fix cloaking, etc. Step 3: Document everything you've done, including evidence. Step 4: Submit a reconsideration request via Search Console explaining what the problem was and what you did to fix it. Step 5: Wait — Google typically reviews within a few weeks. The reconsideration request must be specific and honest. Vague submissions are routinely rejected.
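For the disavow part of Step 2, the file you upload to Search Console's disavow tool is plain text with one domain: or URL entry per line and # comments. Below is a small sketch of generating one, with placeholder entries; only disavow links you have manually reviewed and are confident are manipulative.

```python
"""Sketch: build a disavow file from a vetted list of spam sources.
The format (one 'domain:' or URL entry per line, '#' comments, UTF-8 text)
is the one Search Console's disavow tool accepts; the entries below are
placeholders for domains and URLs you have manually reviewed."""

spam_domains = [
    "cheap-links-4u.example",      # placeholder: confirmed link-scheme domain
    "pbn-network.example",
]
spam_urls = [
    "https://blog.example.org/sponsored-post-123",   # single bad URL, domain otherwise fine
]

lines = ["# Disavow file generated after manual link review, 2025-06-01"]
lines += [f"domain:{d}" for d in spam_domains]
lines += spam_urls

with open("disavow.txt", "w", encoding="utf-8") as fh:
    fh.write("\n".join(lines) + "\n")

print(f"Wrote {len(spam_domains)} domains and {len(spam_urls)} URLs to disavow.txt")
```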

Topic 06

SEO Tools — The Full Stack Explained

Tool | Category | Best For | Cost
Google Search Console | Google's own data | Indexation, performance, crawl issues, Core Web Vitals field data | Free
Google Analytics 4 | Analytics | Organic traffic, user behaviour, conversions, landing page performance | Free
PageSpeed Insights | Performance | Core Web Vitals, LCP/INP/CLS, page speed optimisation recommendations | Free
Ahrefs | All-in-one SEO | Backlink analysis, keyword research, rank tracking, site audit, competitor analysis | $129–$449/month
Semrush | All-in-one SEO | Keyword research, competitor intelligence, content gap analysis, local SEO | $140–$500/month
Screaming Frog | Technical crawl | Site crawl, broken links, redirect chains, title/meta analysis, technical issues | Free (500 URLs) / £250/year
Moz Pro | All-in-one SEO | Domain Authority tracking, local SEO, rank tracking | $99–$299/month
BrightLocal | Local SEO | Citation auditing, local rank tracking, review management | $39–$49/month
Surfer SEO | Content optimisation | On-page content scoring, NLP optimisation, content editor | $89–$199/month
Sitebulb | Technical audit | Advanced technical crawls with visual reports — great for agencies | $14–$55/month

Topic 07

AI and SEO — What's Changed and What Matters

Google does not penalise AI-generated content as such — its official position is that it evaluates content quality regardless of how it was produced. AI-generated content that is helpful, accurate, and written for people is treated the same as human-written content that meets the same standards. What Google does penalise is "AI-first" content: content generated at scale with no editorial oversight, produced primarily to occupy search rankings rather than to genuinely help readers. The distinction is quality and intent, not the production method.
Google's AI Overview (formerly Search Generative Experience) generates a summary answer at the top of search results for many informational queries. Early data suggests AI Overviews reduce click-through rates for queries where they appear, particularly for simple informational questions. However, complex, high-intent, and transactional queries are less affected. The strategic response is to optimise for AI Overview citations (which drive branded visibility) while shifting content investment toward deeper, more complex topics where AI summaries are less useful to searchers.
Semantic SEO is the practice of optimising content for topic comprehensiveness and contextual relevance rather than individual keyword matching. Google's natural language models (BERT, MUM) understand meaning and context — they can tell that a page about "running shoes" that also covers "cushioning", "pronation", "trail vs road" is more topically complete than one that just repeats "running shoes" frequently. Build content that covers a topic thoroughly, not one that targets a single keyword.
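As an illustration of the underlying idea (not of Google's actual systems), sentence embeddings can approximate how semantically related a page is to the subtopics searchers expect it to cover. A sketch assuming the sentence-transformers package and a small open model; the page text and subtopic list are made up.

```python
"""Illustration only: sentence embeddings give a rough proxy for the kind of
semantic relatedness modern search systems model. Assumes the
sentence-transformers package; this is not how Google scores pages."""
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")   # small open model

# Made-up page excerpt and the subtopics searchers expect it to cover.
page_text = (
    "Choosing running shoes starts with understanding cushioning, "
    "pronation support, and whether you run on trails or roads."
)
subtopics = ["cushioning", "pronation", "trail vs road running", "marathon training plans"]

page_vec = model.encode(page_text, convert_to_tensor=True)
topic_vecs = model.encode(subtopics, convert_to_tensor=True)

# Higher cosine similarity suggests the page already touches that subtopic;
# low scores flag gaps worth covering to make the page topically complete.
for topic, score in zip(subtopics, util.cos_sim(page_vec, topic_vecs)[0]):
    print(f"{topic:28s} {float(score):.2f}")
```
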
SEO is not dying — but it is being fundamentally transformed. The low-skill parts of SEO (churning out thin keyword-matched articles, buying mediocre guest posts) are being commoditised and devalued. The high-skill parts — original research, genuine E-E-A-T, digital PR, technical architecture, brand building, and deeply useful content — are becoming more valuable as AI-generated noise floods the internet. SEO practitioners who understand the strategic layer will be in higher demand; those doing mechanical content production will be automated out.

Topic 08

Key SEO Metrics — What to Track and What to Ignore

Metric | What It Tells You | Cadence
Organic Sessions | Total visits from organic search — the primary business metric | Weekly
Keyword Rankings | Visibility for target terms — correlates with traffic potential | Weekly
Impressions (Search Console) | How often your pages appear in search results — early ranking indicator | Monthly
Click-Through Rate | % of impressions that generate clicks — title/meta quality signal | Monthly
Referring Domains | Unique sites linking to you — authority and link growth metric | Weekly
Domain Rating / Authority | Overall site authority — moves slowly, meaningful over months | Monthly
Indexed Pages | How many pages Google has in its index — indexation health check | Monthly
Core Web Vitals | Page experience signals that affect ranking eligibility | Quarterly
Bounce Rate | Not a direct ranking factor — but high bounce on landing pages signals poor intent match | Monthly
Page Authority | Third-party estimate of page-level strength — useful for comparison, not absolute truth | When needed
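
Most of the weekly metrics above can be pulled programmatically from the Search Console API rather than exported by hand. A minimal sketch, assuming the google-api-python-client and google-auth packages and a service account that has been added as a user on the property; the key file name and property URL are placeholders.

```python
"""Sketch: pull clicks, impressions and CTR by date from the Search Console API.
Assumes google-api-python-client and google-auth, plus a service account whose
email has been added as a user on the property; the key file name and the
property URL are placeholders."""
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)           # placeholder key file
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://example.com/",                   # placeholder property
    body={
        "startDate": "2025-05-01",
        "endDate": "2025-05-31",
        "dimensions": ["date"],
        "rowLimit": 1000,
    },
).execute()

for row in response.get("rows", []):
    print(f"{row['keys'][0]}  clicks={row['clicks']:.0f}  "
          f"impressions={row['impressions']:.0f}  ctr={row['ctr']:.2%}")
```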

Topic 09

Common SEO Myths — Debunked

SEO has more persistent myths than almost any other marketing discipline. Here are the most damaging ones, corrected.

Myth: SEO is a one-time task

SEO is a continuous process. Competitors constantly publish new content, algorithm updates regularly reassess your rankings, and technical issues accumulate over time. Sites that "do SEO once" and stop see rankings erode within 6–12 months.

Myth: You need to submit your site to Google to rank

Google discovers sites through links and sitemaps automatically. Submission was relevant in the 1990s. Today, a sitemap submission in Search Console speeds up initial crawling for new sites — but Google will find and index your site without it.

Myth: More keywords = better rankings

Keyword density is not a meaningful ranking signal in modern SEO. Google's NLP models understand context — a page doesn't need to mention "running shoes" 15 times to rank for it. Keyword stuffing actively hurts readability and can trigger spam signals.

Myth: Social media activity directly boosts SEO

Social signals (likes, shares, followers) are not direct ranking factors. However, social media indirectly supports SEO by amplifying content that may earn backlinks, building brand search volume, and driving traffic that improves user engagement signals.

Myth: HTTPS is a major ranking factor

HTTPS is a confirmed but extremely light ranking signal — Google described it as a "tiebreaker" in identical-quality scenarios. The real reason to use HTTPS is security and user trust (browsers mark HTTP sites as "Not Secure"), not a significant ranking boost.

Myth: Google Analytics data influences rankings

Google has explicitly confirmed it does not use Google Analytics data (bounce rate, session duration, etc.) as a ranking signal. They are entirely separate systems. For page experience field data Google relies on Chrome usage data (the CrUX dataset), and it measures search result interactions directly, not through your GA4 property.