Technical SEO Audit: The Complete 2026 Checklist for Australian Businesses

A solid SEO strategy rests on three pillars: content, authority, and technical foundation. You can write the best article in Australia, but if your site’s technical infrastructure is broken, Google will struggle to crawl it, index it, and rank it.

A technical SEO audit is the systematic process of finding and fixing the technical issues holding your site back. Unlike content or link-building work, technical SEO is measurable, repeatable, and often the quickest way to improve rankings—because you’re removing barriers rather than building new authority.

This guide walks you through everything you need to audit your own site, or use to brief a technical SEO agency. We’ve organised it by category so you can tackle issues in priority order.


What Is a Technical SEO Audit (And Why It Matters)

A technical SEO audit examines how Google and other search engines interact with your website. It looks at:

  • Crawlability: Can Google’s bots actually reach your pages?
  • Indexation: Are pages being indexed, or blocked?
  • Core Web Vitals: Is your site fast enough to rank well?
  • Mobile friendliness: Does it work well on mobile devices?
  • Structured data: Are search engines understanding your content correctly?
  • Security and canonicalisation: Are you sending the right signals to search engines?

The payoff is real. Fixing technical issues often leads to:

  • 10–30% increase in indexed pages (when you remove blocks)
  • 15–40% faster load times (when you optimise Core Web Vitals)
  • 5–20% uplift in rankings for existing content (when crawl budget is freed up)
  • Cleaner crawl logs and fewer Google Search Console errors

The order matters, though. You don’t optimise images before fixing canonicalisation—you’d be wasting time. That’s why we’ve structured this guide by priority.


Priority Order: What to Fix First

If you only have limited time and budget, fix issues in this order:

  1. Indexation issues (noindex pages, blocked content) — These block your entire site from ranking.
  2. Canonicalisation and redirects (301/302 issues, self-referential canonicals) — These dilute ranking potential.
  3. Mobile usability and Core Web Vitals (LCP, INP, CLS) — Google has made these explicit ranking factors.
  4. Crawlability (robots.txt, blocked resources) — These prevent Google from seeing your content.
  5. Site structure and internal linking (orphaned pages, poor hierarchy) — These affect how authority flows through your site.
  6. Schema markup and structured data (missing, incorrect, or conflicting schemas) — These improve SERP appearance and rich results.
  7. Security and HTTPS (mixed content, SSL issues) — These are ranking factors and trust signals.

Now let’s walk through each category in detail.


1. Indexation: The Foundation

If pages aren’t indexed, they can’t rank. Before anything else, verify what Google actually has indexed.

What to Check

Indexed page count in Google Search Console (GSC)

  1. Log in to Google Search Console
  2. Go to Indexing > Pages
  3. Note the total indexed pages
  4. Compare this to your expected site size (e.g., if you have 500 blog posts, you should see roughly that number indexed)

Sitemaps are submitted and valid

  1. In GSC, go to Sitemaps
  2. Check that your XML sitemap is submitted and showing “Success”
  3. Compare indexed vs submitted; a big gap suggests excluded pages
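If you prefer to script the submitted-side count, the sitemap can be parsed with the Python standard library. This is a minimal sketch; the inline sample stands in for your real sitemap.xml:

```python
import xml.etree.ElementTree as ET

# Namespace used by standard XML sitemaps
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]

# Sample sitemap standing in for https://yoursite.com/sitemap.xml
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yoursite.com/</loc></url>
  <url><loc>https://yoursite.com/blog/post-1/</loc></url>
  <url><loc>https://yoursite.com/blog/post-2/</loc></url>
</urlset>"""

urls = sitemap_urls(sample)
print(len(urls))  # submitted URL count to compare against GSC's indexed count
```

Compare the printed total against the indexed count GSC reports; a large gap is your signal to dig into exclusions.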

No widespread noindex tags

  1. Run a site crawl with Screaming Frog (free version works for up to 500 URLs)
  2. Filter for pages with noindex meta tags
  3. Ask: should these pages be noindex? (Common mistake: accidentally noindexing entire site sections)

How to Check It

Tools:

  • Google Search Console (free, essential)
  • Screaming Frog SEO Spider (free up to 500 URLs; paid: AU$199/year)
  • Ahrefs Site Audit (paid; AU$99/month+) for larger sites

How to interpret:

  • Indexed count is 80–100% of submitted: Good.
  • Indexed count is 50–80% of submitted: You have exclusions (noindex, robots.txt blocks, or poor linking). Investigate.
  • Indexed count is <50% of submitted: Major issue. Pages are either blocked or genuinely orphaned.
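These thresholds are easy to encode if you audit many sites. A small sketch; the cut-offs simply mirror the ranges above:

```python
def indexation_status(indexed, submitted):
    """Bucket the indexed/submitted ratio using the thresholds above."""
    if submitted == 0:
        return "no data"
    ratio = indexed / submitted
    if ratio >= 0.8:
        return "good"
    if ratio >= 0.5:
        return "investigate"   # exclusions: noindex, robots.txt, poor linking
    return "major issue"       # pages blocked or genuinely orphaned

print(indexation_status(450, 500))   # good
print(indexation_status(300, 500))   # investigate
print(indexation_status(200, 1000))  # major issue
```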

What Passing Looks Like

  • Your entire site (except intentional noindex pages like privacy, login, duplicates) is indexed
  • GSC shows indexed count matching your expected page count
  • Sitemap is green and valid
  • No widespread “Discovered – not indexed” errors in GSC

What Failing Looks Like

  • GSC shows only 200 indexed pages but you have 1,000+ on your site
  • Large sections of your site are marked “noindex” unintentionally
  • Sitemap shows red errors (invalid XML, too many URLs, etc.)
  • Pages have noindex tags when they shouldn’t

How to Fix It

If pages are noindex by accident:

  1. Find pages with unintended noindex (use Screaming Frog filter)
  2. Remove the noindex tag from the template or page
  3. Note that the change won’t register until Google re-crawls the page
  4. Go to GSC URL Inspection for one of the affected pages
  5. Click Request indexing
  6. Google will re-crawl within days

If pages are blocked in robots.txt:

  1. Check your /robots.txt file (visit yoursite.com/robots.txt)
  2. Look for broad blocks like:

```
User-agent: *
Disallow: /
```

  3. If unintentional, fix it (or remove the entire block)
  4. Request indexing again in GSC

If pages are orphaned (no internal links):

  1. Use Screaming Frog to find pages with 0 internal links
  2. Add relevant internal links to these pages from your main navigation or content
  3. Once linked, they’ll eventually be discovered and indexed

2. Canonicalisation and Redirects

Canonicalisation tells Google which version of a page is the “main” one when duplicates exist. Bad canonicals dilute your ranking potential by splitting authority across multiple URLs.

What to Check

Self-referential canonicals

Every page should have a self-referential canonical (pointing to itself). Check:

  1. Visit your homepage and a few content pages
  2. View the page source (Ctrl+U or Cmd+U in Chrome)
  3. Look for a line like:

```html
<link rel="canonical" href="https://yoursite.com/current-page/" />
```

  4. Verify the URL matches the page you’re viewing exactly

Redirect chains (bad)

If URL A redirects to URL B, and URL B redirects to URL C, that’s a redirect chain. Google dislikes these.

  1. Use Redirect Checker or Screaming Frog
  2. Enter URLs that have been rewritten or moved
  3. Check the redirect path—it should be ≤2 hops (direct 301 is ideal)

HTTPS vs HTTP consistency

If you’ve migrated to HTTPS, ensure:

  1. HTTP version (non-secure) redirects to HTTPS
  2. Old URLs don’t have canonical tags pointing to different versions
  3. Search Console has both versions added (HTTP and HTTPS)

How to Check It

Tools:

  • Chrome DevTools (right-click > Inspect, search for rel="canonical")
  • Screaming Frog (Crawl → check Canonicalisation column)
  • Ahrefs Site Audit

Steps:

  1. Crawl your site with Screaming Frog (or equivalent)
  2. Export the crawl data
  3. Filter for pages with canonical issues (self-canonical mismatches, pointing to noindex pages, or external canonicals)

What Passing Looks Like

  • Every page has a self-referential canonical
  • No redirect chains (A → B → C)
  • HTTPS and HTTP are consistently redirected or canonicalised
  • Canonicals point to publicly accessible pages (not noindex, not blocked)

What Failing Looks Like

  • Pages missing canonical tags entirely
  • Canonicals pointing to HTTPS when the site is HTTP (or vice versa)
  • Redirect chains: oldurl.com → interim.com → newsite.com
  • Canonicals pointing to pages that return 404 or 410

How to Fix It

Add missing canonicals:

  1. If using WordPress with Yoast or RankMath, they auto-add self-referential canonicals—nothing to do
  2. If custom site: add a `<link rel="canonical">` tag to the `<head>` of every page
  3. Test one page with URL Inspection in GSC to confirm Google sees the canonical

Break redirect chains:

  1. Identify the final destination (e.g., newsite.com)
  2. Update any intermediate redirects to point directly to the destination
  3. Example (bad): A → B → C. Fix: A → C, B → C
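The A → B → C fix is really a small graph operation: follow each redirect to its final destination, then point every source straight there. A sketch with placeholder URLs:

```python
def final_destination(redirects, url, max_hops=10):
    """Follow a redirect map until reaching a URL that redirects nowhere."""
    hops = 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if hops > max_hops:
            raise ValueError("redirect loop detected")
    return url

def collapse_chains(redirects):
    """Point every redirect source directly at its final destination."""
    return {src: final_destination(redirects, dst)
            for src, dst in redirects.items()}

# A -> B -> C becomes A -> C and B -> C
chain = {
    "https://a.example/": "https://b.example/",
    "https://b.example/": "https://c.example/",
}
print(collapse_chains(chain))
```

The hop limit doubles as a redirect-loop detector, which covers the "redirect loops" issue discussed later in this guide.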

Migrate HTTPS without losing rankings:

  1. Set up SSL certificate (most hosts include Let's Encrypt free)
  2. Update all internal links to HTTPS URLs
  3. Set up 301 redirects from HTTP to HTTPS at the server level (.htaccess on Apache, or server config on Nginx)
  4. Update XML sitemap to reference HTTPS URLs
  5. In GSC, add the HTTPS version as a new property
  6. Submit sitemap to HTTPS property
  7. Google will treat the HTTPS version as canonical once the 301 redirects are live (the legacy "Preferred domain" setting no longer exists in Search Console)

3. Mobile Friendliness and Core Web Vitals

Google has explicitly stated that page experience (including mobile usability and Core Web Vitals) is a ranking factor. This is non-negotiable in 2026.

What to Check

Mobile usability

  1. Run a Lighthouse audit in Chrome DevTools using the mobile preset (GSC’s standalone Mobile usability report was retired in late 2023)
  2. Check for errors:
  • Clickable elements too close together (button spacing issues)
  • Text too small to read
  • Viewport not configured (no `<meta name="viewport">` tag)

Core Web Vitals

Three metrics matter:

  • LCP (Largest Contentful Paint): Time until the largest element on the page loads. Target: <2.5 seconds.
  • INP (Interaction to Next Paint): Time from user interaction (click) to the page responding. Target: <200 milliseconds. (This replaced FID in 2024.)
  • CLS (Cumulative Layout Shift): Visual stability as the page loads. Target: <0.1.
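The three targets can be checked mechanically. A minimal sketch using the "good" thresholds above (Google also defines a "needs improvement" band, omitted here for brevity):

```python
def cwv_passes(lcp_s, inp_ms, cls):
    """True only if all three Core Web Vitals meet the 'good' thresholds."""
    return lcp_s < 2.5 and inp_ms < 200 and cls < 0.1

print(cwv_passes(2.1, 150, 0.05))  # passes all three
print(cwv_passes(4.2, 150, 0.05))  # slow LCP fails the check
```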

How to Check It

Tools:

  • Google Search Console > Core Web Vitals (free, real-world data from actual visitors)
  • PageSpeed Insights (free, at pagespeed.web.dev)
  • WebPageTest (free, webpagetest.org)

Steps:

  1. Open PageSpeed Insights for your homepage and a typical blog post
  2. Check the "Core Web Vitals" section
  3. In GSC, go to Core Web Vitals to see real-world data (this is what actually affects rankings)
  4. Look for pages with "Poor" status

What Passing Looks Like

  • LCP: >75% of page views have LCP <2.5s
  • INP: >75% of page views have INP <200ms
  • CLS: >75% of page views have CLS <0.1
  • No mobile usability errors in Lighthouse’s mobile audit
  • PageSpeed Insights shows "Good" (green) for Core Web Vitals

What Failing Looks Like

  • LCP 4–5 seconds (common on image-heavy sites)
  • INP 300–500ms (common on sites with heavy JavaScript)
  • CLS 0.2+ (pages shift as ads or late-loading content appears)
  • Mobile usability errors flagged by Lighthouse
  • PageSpeed shows "Poor" (red) for any metric

How to Fix It

Improve LCP (load time of largest element):

  • Compress images (use WebP format, target <100KB per image)
  • Lazy-load images below the fold (native loading="lazy")
  • Upgrade hosting or use a CDN (Cloudflare free tier helps)
  • Minimise render-blocking CSS/JavaScript (defer non-critical JS)
  • Enable browser caching

Improve INP (response time to user interaction):

  • Defer heavy JavaScript (don't run large scripts on page load)
  • Break up long JavaScript tasks into smaller chunks
  • Remove or replace slow third-party scripts (analytics, ads, chat widgets)
  • Optimise event listeners (don't attach listeners to every element—use event delegation)

Improve CLS (layout stability):

  • Set explicit dimensions on images (width and height attributes)
  • Avoid inserting content above existing content (e.g., sticky ads that push content down)
  • Use transform instead of top/left for animations (GPU-accelerated, doesn't trigger layout shifts)

Fix mobile usability issues:

  • Run Lighthouse’s mobile audit to identify specific issues (Google retired its standalone Mobile-Friendly Test in 2023)
  • Ensure buttons and links are at least 48x48 pixels
  • Use responsive design (same HTML, different CSS for mobile)
  • Test on actual devices, not just Chrome DevTools

4. Crawlability: Letting Google In

If Google can't crawl your site, it can't index it. Crawlability issues are usually about blocking, but sometimes about structure.

What to Check

robots.txt file

  1. Visit yoursite.com/robots.txt
  2. Check for:
  • Broad blocks: Disallow: / (blocks entire site—usually a mistake)
  • Blocking important folders: Disallow: /blog/ (blocks all blog posts)
  • Unnecessary Disallow rules

Blocked resources in GSC

  1. In GSC, go to Settings > Crawl stats
  2. Check "Blocked by robots.txt" and "Other resources blocked"
  3. High numbers here mean Google can't crawl parts of your site

Blocked resources (CSS, JS, images)

  1. Go to GSC Indexing > Pages and open the "Why pages aren’t indexed" section
  2. Look for reasons like "Blocked by robots.txt" or "Crawled – currently not indexed"
  3. Check if critical resources are blocked in robots.txt

How to Check It

Tools:

  • Google Search Console (free)
  • Screaming Frog (crawl with "obey robots.txt" enabled, then disabled, and compare)

Steps:

  1. Visit your robots.txt
  2. Note any Disallow rules
  3. In Screaming Frog, crawl your site twice—once with robots.txt obeyed, once without
  4. Compare the results; the difference shows what's being blocked
  5. Ask: is this intentional?

What Passing Looks Like

  • robots.txt exists and is minimal
  • No broad site-wide blocks (unless you intentionally want a private site)
  • Important resources (CSS, JS, images for public pages) are not blocked
  • GSC Crawl Stats show low "Blocked by robots.txt" count

What Failing Looks Like

  • robots.txt blocks /blog/ or other major sections
  • robots.txt blocks all CSS and JavaScript (prevents Google from seeing modern JavaScript-rendered content)
  • Crawl stats show thousands of blocked URLs

How to Fix It

Update robots.txt:

A typical good robots.txt looks like:

```
User-agent: *
Disallow: /private/
Disallow: /admin/
Disallow: /cart/
Disallow: /account/
Allow: /

Sitemap: https://yoursite.com/sitemap.xml
```

Key rules:

  • Only block pages you don't want indexed (private areas, login pages, cart, etc.)
  • Do NOT block CSS, JS, or images
  • Do NOT use Disallow: / (blocks entire site)

Steps:

  1. Edit your robots.txt (usually in your site root)
  2. Remove unnecessary blocks
  3. Add Sitemap: line pointing to your XML sitemap
  4. Save and verify with the robots.txt report in GSC (Settings > robots.txt); Google’s old standalone Tester has been retired
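You can also sanity-check rules locally before deploying, using Python's standard-library `urllib.robotparser`. The rules below mirror the example file above:

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
Disallow: /admin/
Disallow: /cart/
Disallow: /account/
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Blog content stays crawlable; private areas are blocked
print(rp.can_fetch("*", "https://yoursite.com/blog/some-post/"))  # True
print(rp.can_fetch("*", "https://yoursite.com/admin/login"))      # False
```

Running a quick check like this against a staging copy of robots.txt catches accidental site-wide blocks before they go live.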

In GSC, re-request crawl:

  1. After fixing robots.txt, request Google to re-crawl
  2. Visit Crawl stats again 24–48 hours later
  3. "Blocked by robots.txt" count should drop

5. Site Structure and Internal Linking

Your site's architecture determines how authority flows and how easily Google can navigate your content.

What to Check

Orphaned pages (0 internal links)

  1. Crawl site with Screaming Frog
  2. Filter for pages with 0 inlinks
  3. These pages won't be discovered organically—you must link to them

Hierarchy and category structure

  1. Map your site structure:
  • Root domain (homepage)
  • Top-level categories
  • Subcategories (if applicable)
  • Individual pages
  2. Ask: is it logical? Flat is usually better than deep.

Anchor text in internal links

  1. Audit internal links for anchor text quality
  2. "Click here" is useless; "technical SEO audit" is valuable
  3. Use Screaming Frog to export internal links and review anchor text

How to Check It

Tools:

  • Screaming Frog (inlink analysis, site visualisation)
  • GSC Links report (covers both external and internal links)

Steps:

  1. Crawl with Screaming Frog
  2. Go to Inlinks and sort by inlinks (ascending)
  3. Pages with 0 inlinks are orphaned
  4. Check your navigation and main content areas—most pages should be 1–3 clicks from the homepage
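The inlink check reduces to set arithmetic: any crawled page that never appears as a link target is orphaned. A sketch with placeholder URLs:

```python
def orphaned_pages(all_pages, internal_links):
    """Return pages that no internal link points to (the homepage is exempt)."""
    targets = {dst for _src, dst in internal_links}
    return sorted(p for p in all_pages if p not in targets and p != "/")

pages = {"/", "/about/", "/blog/post-1/", "/blog/post-2/"}
links = [("/", "/about/"), ("/", "/blog/post-1/")]
print(orphaned_pages(pages, links))  # /blog/post-2/ has no inlinks
```

Screaming Frog's crawl export gives you exactly these two inputs: the list of pages and the list of (source, destination) internal links.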

What Passing Looks Like

  • No orphaned pages (every page has at least 1 internal link)
  • Homepage is 0 clicks, main sections are 1–2 clicks, content is 2–3 clicks
  • Internal anchor text includes keywords and is descriptive
  • Top pages (by authority) link to important pages

What Failing Looks Like

  • Hundreds of orphaned pages
  • Deep nested structure (5+ levels to reach a page)
  • Anchor text like "click here" or "more"
  • Low-authority pages don't link to important pages

How to Fix It

Link orphaned pages:

  1. Find pages with 0 inlinks (use Screaming Frog)
  2. For each: add a relevant internal link from your main navigation, category page, or related content
  3. Use descriptive anchor text that includes a relevant keyword

Flatten your hierarchy:

  1. Avoid more than 3 levels of nesting (Homepage > Category > Subcategory > Page is the limit)
  2. Link important pages from the homepage or main nav
  3. Use breadcrumb navigation so users and Google understand hierarchy

Audit and improve anchor text:

  1. Export internal links from Screaming Frog
  2. Replace generic anchor text ("read more", "click here") with descriptive text
  3. Example: "For more on Core Web Vitals, see our Core Web Vitals Australia guide" instead of "Read more here"
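Flagging generic anchors can be automated once you export the link list. A sketch; the phrase list is illustrative, so extend it to taste:

```python
GENERIC_ANCHORS = {"click here", "read more", "more", "here", "learn more"}

def weak_anchors(links):
    """Return (anchor, target) pairs whose anchor text carries no meaning."""
    return [(text, url) for text, url in links
            if text.strip().lower() in GENERIC_ANCHORS]

links = [("click here", "/seo-audit/"),
         ("technical SEO audit", "/seo-audit/"),
         ("Read more", "/blog/")]
print(weak_anchors(links))  # the descriptive anchor is not flagged
```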

6. Schema Markup and Structured Data

Structured data (schema markup) helps Google understand your content and can unlock rich results (star ratings, FAQs, recipes, etc.).

What to Check

Presence of schema markup

  1. Visit homepage and a few blog posts
  2. View page source (Ctrl+U)
  3. Look for a `<script type="application/ld+json">` block in the page source

For your organisation, a typical Organization schema in the homepage `<head>` looks like this (all values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Your Business Name",
  "url": "https://yoursite.com",
  "logo": "https://yoursite.com/logo.png"
}
</script>
```
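Before relying on Google's Rich Results Test, it is worth confirming the markup is even valid JSON, since a stray comma is the most common failure. A minimal check with a placeholder Organization payload:

```python
import json

payload = """{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Your Business Name",
  "url": "https://yoursite.com"
}"""

data = json.loads(payload)  # raises json.JSONDecodeError on malformed markup
print(data["@type"])
```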


    7. HTTPS and Site Security

HTTPS (HTTP secured with SSL/TLS) is an explicit ranking factor and a trust signal. Google has prioritised it since 2014.

    What to Check

    HTTPS enabled on all pages

    1. Visit your homepage—does the URL start with https://?
  2. Check the address bar: is there a padlock (secure connection) indicator?
    3. Visit a few pages and confirm

    No mixed content (HTTP resources on HTTPS page)

    1. Visit your HTTPS homepage
    2. Open Chrome DevTools (F12)
    3. Go to Console tab
    4. If you see errors about "insecure resources" or "mixed content", you have a problem

    SSL certificate validity

    1. Click the lock icon in the address bar
    2. Click "Certificate is valid" (or "This site is not secure" if it's not)
  3. Check the expiry date; it must be in the future, and auto-renewal should be configured (Let's Encrypt certificates, for example, renew every 90 days)

    How to Check It

    Tools:

    • Chrome DevTools (F12 > Console, look for mixed content warnings)
  • Why No Padlock? (whynopadlock.com, identifies mixed content and certificate issues)
    • SSL Labs (ssllabs.com, validates certificate strength)

    Steps:

    1. Visit your site—does it have HTTPS?
    2. Check the Chrome console for mixed content warnings
    3. Click the lock icon to view certificate details
  4. Run through Why No Padlock? for a full report

    What Passing Looks Like

    • All pages are HTTPS
  • Padlock icon in the address bar
    • No mixed content errors in console
    • Certificate is valid and not expiring soon

    What Failing Looks Like

    • Site is HTTP (not secure)
    • Mixed content warnings in DevTools console
    • Certificate is expired or self-signed
    • Browser shows "This site is not secure"

    How to Fix It

    Migrate from HTTP to HTTPS:

    1. Get an SSL certificate:
    • Most hosting providers include free Let's Encrypt certificates
    • No need to pay for expensive certificates in 2026
  2. Enable HTTPS in your hosting control panel (cPanel, Plesk, etc.)
  3. Set up automatic redirects from HTTP to HTTPS (.htaccess for Apache):

```apache
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

  4. Update your WordPress home and site URL to HTTPS (Settings > General)
  5. Update XML sitemap to reference HTTPS URLs
  6. Update internal links to HTTPS
  7. Test with the "Why No Padlock?" tool

    Fix mixed content (resources loaded via HTTP):

    1. Identify which resources are insecure (check console in DevTools)
    2. Update image URLs, scripts, stylesheets to HTTPS
  3. For external resources you don't control, check whether an HTTPS version exists (most CDNs provide one); if not, host the file yourself or replace it. Protocol-relative URLs (//domain.com/resource) are an older workaround and are no longer recommended.
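Finding insecure references can be scripted: scan the HTML for src/href attributes that still use plain http://. A rough regex-based sketch, good enough for a first pass; the HTML sample is illustrative:

```python
import re

# Matches src="http://..." or href='http://...' (not https)
INSECURE = re.compile(r"""(?:src|href)\s*=\s*["'](http://[^"']+)["']""",
                      re.IGNORECASE)

def mixed_content(html):
    """Return HTTP URLs referenced from the page."""
    return INSECURE.findall(html)

html = '''<img src="http://yoursite.com/logo.png">
<link rel="stylesheet" href="http://yoursite.com/old.css">
<script src="https://cdn.example.com/app.js"></script>'''

print(mixed_content(html))
```

Note that this also catches href on plain `<a>` navigation links, which are not mixed content (they are not loaded by the page), so review the matches manually.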

    8. JavaScript and Rendering Issues

    Modern sites use JavaScript heavily. Google can render JavaScript, but it's slower and sometimes buggy.

    What to Check

    JavaScript is not required to load critical content

    1. Open your homepage in an incognito window
    2. Open DevTools (F12)
    3. Go to Network tab
  4. Disable JavaScript: press Ctrl+Shift+P, type "Disable JavaScript", press Enter, then reload the page
    5. Does the page still show your main content (headline, body text)? If yes, good.

    Third-party scripts aren't blocking page load

    1. In DevTools Network tab, reload the page
    2. Look for slow-loading scripts (ads, analytics, chat widgets)
    3. Anything >3 seconds is blocking performance

    How to Check It

    Tools:

    • Chrome DevTools (Network, Lighthouse)
    • Lighthouse report (in Chrome, right-click > Inspect > Lighthouse tab)

    Steps:

    1. Open Lighthouse in DevTools
    2. Run an audit for Performance
    3. Look for "Opportunities" section
    4. "Reduce JavaScript execution time" and "Unused JavaScript" are common issues

    What Passing Looks Like

    • Critical content (headline, body text) loads with JavaScript disabled
    • No render-blocking JavaScript in Network tab
    • Lighthouse Performance score >80

    What Failing Looks Like

    • Page is blank with JavaScript disabled
    • Large third-party scripts taking 5+ seconds
    • Lighthouse Performance score <50

    How to Fix It

    Defer non-critical JavaScript:

  1. In your HTML, change:

```html
<script src="script.js"></script>
```

To:

```html
<script src="script.js" defer></script>
```

The defer attribute tells the browser to download the script in parallel but run it only after the HTML has been parsed.

  2. For independent scripts that can execute as soon as they arrive (e.g. analytics), use async instead

    Lazy-load third-party scripts:

For ads and chat widgets, load them only when needed. One common pattern is to inject the script on the first user interaction; the widget URL below is a placeholder:

```html
<script>
  // Inject the chat widget on first scroll instead of on page load.
  // https://example.com/widget.js stands in for the real widget URL.
  window.addEventListener('scroll', function () {
    var s = document.createElement('script');
    s.src = 'https://example.com/widget.js';
    document.body.appendChild(s);
  }, { once: true });
</script>
```


    9. Common Technical SEO Issues and Fixes

    We've covered the main categories. Here are some other common issues worth monitoring:

    Redirect loops

    • Page A redirects to B, B redirects back to A
    • Fix: trace the redirect chain and break the loop

    4xx and 5xx errors

  • GSC's Indexing > Pages report shows pages returning 404 (not found) or 500 (server error)
    • Fix: either redirect them properly or restore them

    Pagination issues

  • The old rel=prev/next markup is deprecated (Google stopped using it in 2019)
  • Modern approach: link paginated pages together with normal <a> links and give each page a self-referential canonical; the special markup is no longer needed

    Hreflang tags

  • If you have content in multiple languages or regions, use hreflang to tell Google
  • For an Australian site with content for AU, NZ, and the UK, add tags like:

```html
<link rel="alternate" hreflang="en-au" href="https://yoursite.com/au/" />
<link rel="alternate" hreflang="en-nz" href="https://yoursite.com/nz/" />
<link rel="alternate" hreflang="en-gb" href="https://yoursite.com/uk/" />
```

    Crawl budget

    • Large sites (50,000+ pages) need to be mindful of crawl budget
    • Remove low-value pages from indexing (thin pages, outdated content)
    • Fix crawlability issues so Google spends crawl budget on important content

    Creating Your Audit Checklist

    Use this checklist to conduct your own audit:

    • [ ] GSC indexed page count matches expected
    • [ ] Sitemaps submitted and valid
    • [ ] No widespread unintentional noindex tags
    • [ ] Every page has self-referential canonical
    • [ ] No redirect chains (≤2 hops)
    • [ ] HTTPS is enabled site-wide
    • [ ] Mobile usability: no errors in GSC
    • [ ] Core Web Vitals: >75% of traffic meets targets
    • [ ] robots.txt exists and is minimal
    • [ ] CSS, JS, images are not blocked in robots.txt
    • [ ] No orphaned pages (every page has ≥1 internal link)
    • [ ] Site hierarchy is logical (≤3 levels deep)
    • [ ] Internal anchor text is descriptive
    • [ ] Organization and Article schema markup is present and valid
    • [ ] No mixed content (HTTP on HTTPS)
    • [ ] No render-blocking JavaScript

    What to Do Next

    If you've identified issues, prioritise them:

    1. Fix indexation issues first (noindex, robots.txt blocks)
    2. Then canonicalisation and redirects
    3. Then Core Web Vitals and mobile
    4. Then crawlability
    5. Then internal linking and structure
    6. Then schema markup

    For a comprehensive technical SEO audit with a detailed remediation plan tailored to your site, Anitech performs full audits and guides you through the fixes. We'll identify issues, prioritise them, and provide step-by-step remediation instructions.

    Get a technical SEO audit

