
Perfecting Technical SEO

Engineering - 115 Min read

February 22, 2024

As an early-stage startup, we compete with some well-established companies in the heavy equipment sales industry. That means we have less domain authority than our competitors and need to use every tool at our disposal to achieve better rankings in search engine results and more organic visibility to our customers. A few months ago, we invested some of our engineering bandwidth in Technical SEO for our marketplace to improve not only our performance as Google sees it, but our customers’ experience as well.

What is Technical SEO and why is it important?

Technical SEO is the process of optimizing a website so that it meets the technical requirements of modern search engines, such as Google. Great Technical SEO allows a website to achieve better organic rankings in search engines. These optimizations matter because even with great content, a poor Technical SEO score means our site won’t rank well in search engines, reducing our visibility to potential customers. Many factors contribute to a great (or poor) Technical SEO score, but for us, the biggest items to focus on were page speed and load times, which have an outsized impact on our site’s performance. Our secondary goal was to ensure that our sitemap, robots.txt configuration, internationalization, and meta tags were all configured correctly so search engines could crawl our pages.

Goals

Our primary goal was to improve load times on our equipment search page. Ideally, we wanted the page to be as performant as possible but also needed to set realistic, achievable, and slightly challenging goals. We set a slightly lower performance threshold for mobile since we knew it would probably be harder to achieve. Our goals were as follows:

  1. Achieve 75% good URLs on desktop in Google Search Console.

  2. Achieve 50% good URLs on mobile in Google Search Console.

  3. Achieve at least a C rating on the /equipment page in GTMetrix on desktop and mobile.

  4. Achieve at least an A rating on static landing pages and blog pages on desktop.

Implementation

We use Next.js to power the front end of our website. We chose this framework because of its first-class support for SEO through Server Side Rendering and Static Generation, its on-demand image optimization, excellent developer experience, large ecosystem of packages, and our team’s familiarity with the framework.

Statically generate pages

Server Side Rendering (SSR) is great for content SEO, but not as good for performance because we’re rendering a page on every request. Static Site Generation (SSG) combined with Incremental Static Regeneration (ISR), on the other hand, allows us to pre-render all of our pages at build time and incrementally update them as the content changes - either after a set period of time, or on-demand when our team updates content. This means we reap the benefits of great content SEO and a blazingly fast initial load time.
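
For illustration, an ISR-enabled page in Next.js looks roughly like the sketch below. This is not our production code - fetchPopularListingSlugs and fetchListing are hypothetical helpers standing in for our data layer.

# pages/equipment/[slug].js (illustrative sketch)
export async function getStaticPaths() {
  // Pre-render the most popular listings at build time; any other
  // listing is generated on its first request ("blocking" fallback).
  const slugs = await fetchPopularListingSlugs() // hypothetical helper
  return {
    paths: slugs.map((slug) => ({ params: { slug } })),
    fallback: 'blocking',
  }
}

export async function getStaticProps({ params }) {
  const listing = await fetchListing(params.slug) // hypothetical helper
  return {
    props: { listing },
    // ISR: re-generate this page in the background at most once per hour.
    revalidate: 3600,
  }
}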

Optimize media

We use Cloudinary to serve precisely optimized images on our equipment pages. For non-equipment images, we use the next/image component to optimize media on demand. Both of these tools allow us to send high-quality images in smaller file sizes using next-generation formats (like WebP) without increasing load times.
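
As a rough sketch (the component name, asset path, and dimensions are hypothetical), rendering a non-equipment image with next/image looks like this:

# components/HeroImage.js (illustrative sketch)
import Image from 'next/image'

export function HeroImage() {
  return (
    <Image
      src="/images/hero.jpg" // hypothetical asset path
      alt="Used heavy equipment for sale"
      width={1200}  // explicit dimensions prevent layout shift
      height={600}
      priority      // preload above-the-fold images to help LCP
    />
  )
}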

Reduce the application bundle size

We reviewed our client-side bundle using the @next/bundle-analyzer package to see how our code was split and which packages were included. Removing unnecessary imports, replacing larger legacy packages with smaller ones (e.g., using dayjs instead of momentjs), and consolidating code allowed us to reduce our final bundle size by about 25%.
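
Enabling the analyzer is a small change to the Next.js config - a minimal sketch, with the rest of the configuration folded in where the comment sits:

# next.config.js (minimal sketch)
const withBundleAnalyzer = require('@next/bundle-analyzer')({
  // Only generate the report when explicitly requested.
  enabled: process.env.ANALYZE === 'true',
})

module.exports = withBundleAnalyzer({
  // ...the rest of the Next.js configuration
})

Running ANALYZE=true next build then produces an interactive treemap of the client and server bundles.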

Ensure the sitemap is correct

A valid sitemap allows us to explicitly tell search engines the URLs on our site to crawl, along with any language variants.

# sample sitemap.xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset 
    xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:xhtml="http://www.w3.org/1999/xhtml"
    xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9 http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd http://www.w3.org/1999/xhtml http://www.w3.org/2002/08/xhtml/xhtml1-strict.xsd"
>
    <url>
        <loc>https://www.boomandbucket.com/</loc>
        <lastmod>2022-08-03T16:25:53.314Z</lastmod>
        <changefreq>daily</changefreq>
        <priority>0.8</priority>
        <xhtml:link
            rel="alternate"
            hreflang="es"
            href="https://www.boomandbucket.com/es/"/>
    </url>
</urlset>

Modify robots.txt

We modified our robots.txt configuration to allow crawling only on specific folders of our site, so we don’t expend our crawl budget on less important pages, like variations of our equipment search page with URL parameters, or some folders automatically generated by Next.js.

# robots.txt
User-agent: *
Allow: /
Disallow: /bugs
Disallow: /features
Disallow: /blog/category/[slug]
Disallow: /blog/[slug]
Disallow: /*.json$
Disallow: /*_buildManifest.js$
Disallow: /*_middlewareManifest.js$
Disallow: /*_ssgManifest.js$
Sitemap: https://www.boomandbucket.com/sitemap.xml

Configure meta tags for content and alternate languages

Finally, we audited our <head> to ensure we’re using the right meta tags, including title, description, OpenGraph, canonical, and alternate language tags for our pages.

# Head component
<head>
  <title>Boom &amp; Bucket - Buy and Sell Used Heavy Equipment</title>
  <meta name="description" content="Boom &amp; Bucket is the trusted marketplace for buying and selling used heavy equipment. View our inventory online or contact our team today for a free consultation on your fleet disposition needs." />
  <meta charSet="utf-8" />
  <meta property="og:type" content="website" />
  <meta property="og:url" content="https://www.boomandbucket.com/" />
  <meta property="og:title" content="Boom &amp; Bucket - Buy and Sell Used Heavy Equipment" />
  <meta property="og:description" content="Boom &amp; Bucket is the trusted marketplace for buying and selling used heavy equipment. View our inventory online or contact our team today for a free consultation on your fleet disposition needs." />
  <meta property="og:image" content="/images/meta.jpg/m/800x500" />
  <meta name="twitter:card" content="summary_large_image" />
  <meta name="twitter:url" content="https://www.boomandbucket.com/" />
  <meta name="twitter:title" content="Boom &amp; Bucket - Buy and Sell Used Heavy Equipment" />
  <meta name="twitter:description" content="Boom &amp; Bucket is the trusted marketplace for buying and selling used heavy equipment. View our inventory online or contact our team today for a free consultation on your fleet disposition needs." />
  <meta name="twitter:image" content="/images/meta.jpg/m/800x500" />
  <link rel="icon" type="image/x-icon" href="/favicon.ico" />
  <link rel="apple-touch-icon" href="/favicon.ico" />
  <link rel="canonical" href="https://www.boomandbucket.com/" />
  <link rel="alternate" href="https://www.boomandbucket.com/" hrefLang="en" />
  <link rel="alternate" href="https://www.boomandbucket.com/es/" hrefLang="es" />
  <link rel="alternate" href="https://www.boomandbucket.com/equipment" hrefLang="x-default" />
</head> 

Analyzing our results

GTMetrix

GTMetrix was critical in testing our site from real browsers across the country to get an accurate picture of performance, including detailed results of how our pages loaded on mobile and desktop. This also gave us access to historical trends so we could see how page performance improved over time. Specifically, we looked at metrics like Largest Contentful Paint, Total Blocking Time, and Cumulative Layout Shift.

[Image: gtmetrix-initial.webp]

We also implemented a soft quality gate as part of our build pipeline. When a pull request was opened or updated, a GitHub Action would check certain pages on GTMetrix and include the results in our workflow. This ensured that we didn’t introduce changes to the codebase that would cause regressions in our SEO metrics on key pages, like the home and equipment search pages.

Below is a sample script we used for running the GTMetrix quality check as part of our build pipeline:

# test-gtmetrix.yml

name: Technical SEO
on:
  [push]
jobs:
  gtmetrix:
    runs-on: ubuntu-latest
    steps:
      # Wait for all tests to pass before running this check.
      - name: Wait for tests to succeed
        uses: lewagon/wait-on-check-action@v1.0.0
        with:
          ref: ${{ github.ref }}
          running-workflow-name: Technical SEO
          repo-token: ${{ secrets.GITHUB_TOKEN }}
          wait-interval: 10

      - uses: actions/checkout@v2

      # Run the check using the configuration file (see next snippet).
      - name: Home Page
        id: home
        uses: ingeno/gtmetrix-action@v1.0
        with:
          api_key: <gtmetrix api key>
          configuration_file: ./.github/home.yml

      # Update the pull request with a comment displaying the results.
      - name: Update Pull Request
        uses: actions/github-script@v6.0.0
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
          script: |
            const output = `
            ## Technical SEO Quality Gate
            - [Home](${{ steps.home.outputs.report_url }}) -> \`${{ steps.home.outcome }}\`
            _____________

            <details>
              <summary>Home</summary>
              <p>
                - structure_score: needs \`95\` and is \`${{ steps.home.outputs.structure_score }}\`
                - onload_duration: needs \`6\` and is \`${{ steps.home.outputs.onload_duration }}\`
                - onload_time: needs \`6000\` and is \`${{ steps.home.outputs.onload_time }}\`
                - largest_contentful_paint: needs \`6000\` and is \`${{ steps.home.outputs.largest_contentful_paint }}\`
                - first_paint_time: needs \`1800\` and is \`${{ steps.home.outputs.first_paint_time }}\`
                - gtmetrix_grade: needs \`C\` and is \`${{ steps.home.outputs.gtmetrix_grade }}\`
                - page_requests: needs \`58\` and is \`${{ steps.home.outputs.page_requests }}\`
              </p>
            </details>

            *Pusher: @${{ github.actor }}, Action: \`${{ github.event_name }}\`*`;
            const owner = context.repo.owner
            const repo = context.repo.repo
            const commit_sha = context.sha

            // Find the pull request(s) associated with this commit and
            // comment the results on each one.
            const issues = await github.paginate(
              github.rest.repos.listPullRequestsAssociatedWithCommit,
              { owner, repo, commit_sha }
            )

            for (const issue of issues) {
              github.rest.issues.createComment({
                issue_number: issue.number,
                owner: context.repo.owner,
                repo: context.repo.repo,
                body: output
              })
            }

Requirements for the home page:

# home.yml

test_configuration:
  url: <url to check>
  location: 9
requirements:
  structure_score: 95
  onload_duration: 6
  onload_time: 6000
  largest_contentful_paint: 6000
  first_paint_time: 1800
  gtmetrix_grade: C
  page_requests: 58

Google Search Console

Google Search Console allows us to measure our site's Google Search traffic and performance, monitor and fix issues, and see how Google views our site. We used this tool to track the number of good URLs on desktop and mobile as we applied our technical SEO improvements.

Vercel

One of the biggest changes we made was migrating our site hosting from Amazon Web Services to Vercel. Previously, we self-hosted our front end on AWS using Terraform and the tf-next package. However, we ran into several limitations, including errors serving localized versions of statically generated pages, no support for Incremental Static Regeneration, and the cost and complexity of the additional AWS resources.

Vercel designed its hosting platform with Next.js as a first-class citizen. Migrating our front end to Vercel was relatively painless; it greatly decreased our initial server response time, reduced our hosting costs and build times, and improved our overall developer experience. Using Vercel also allowed us to integrate various analytics, such as Vercel’s homegrown analytics service as well as third-party services like Checkly.

Vercel Analytics

Vercel Analytics monitors every deployment to ensure we push quality code that doesn’t negatively impact our performance scores on key metrics like First Contentful Paint, Largest Contentful Paint, and Cumulative Layout Shift.
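
Next.js also exposes the same Core Web Vitals through its reportWebVitals hook, which is useful if you want to log or forward the metrics Vercel Analytics is already collecting - a minimal sketch, assuming the pages router:

# pages/_app.js (illustrative sketch)
export function reportWebVitals(metric) {
  // Vercel Analytics collects these automatically; this hook lets us
  // log or forward the same metrics to another destination.
  if (['FCP', 'LCP', 'CLS'].includes(metric.name)) {
    console.log(metric.name, metric.value)
  }
}

export default function MyApp({ Component, pageProps }) {
  return <Component {...pageProps} />
}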

[Image: vercel-analytics.webp]

Checkly

Checkly not only runs automated tests to ensure pages on our site load correctly but also provides us with additional performance reports. It integrates directly with Vercel, so we can run these tests as part of our build pipeline.

We also use Checkly as part of our continuous delivery approach. Vercel deploys our marketplace to a separate domain name, and Checkly then paginates through the site, verifying that each page looks as it should, listings load, and performance is in line with our thresholds. Once the checks pass, Vercel applies the correct DNS records and the site goes live. In effect, this means we can never ship a rogue or under-performing deployment of the website.
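
For illustration, a Checkly browser check is a small Node.js script (Playwright, in this sketch). The listing selector is hypothetical, and we’re assuming the deployment integration passes the preview URL via the ENVIRONMENT_URL variable:

# Checkly browser check (illustrative sketch)
const { chromium } = require('playwright')

async function run() {
  const browser = await chromium.launch()
  const page = await browser.newPage()

  // Deployment-triggered checks run against the preview deployment.
  const baseUrl = process.env.ENVIRONMENT_URL || 'https://www.boomandbucket.com'
  await page.goto(`${baseUrl}/equipment`)

  // Fail the check if no listings rendered (selector is hypothetical).
  const listings = await page.$$('[data-testid="listing-card"]')
  if (listings.length === 0) throw new Error('No listings rendered')

  await browser.close()
}

run()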

Challenges

Deciding which improvements to tackle first

We used the most obvious approach - implement the updates that would provide the biggest gains first and then focus on any smaller incremental gains. Because Google requires several weeks of sampling to determine if a page is passing, knocking out the big-ticket items would give us a head start. Statically generating pages and optimizing our media translated into massive gains initially. Optimizations like reducing the bundle size came later.

Prioritizing technical SEO against other features and bug fixes

There are multiple features, platform improvements, and bugs in the engineering pipeline at any given time. However, we made a conscious decision to prioritize the technical SEO updates because we understood that the time invested now would provide a much larger return later on. Speed is critical in an early-stage startup. Considering it could take weeks to get meaningful feedback from our analytics software, we tackled this in pieces. Once the results in GTMetrix, Vercel Analytics, and Google Search Console looked satisfactory, we would resume working on other features and bug fixes. As we collected more information, we would then spend another cycle on technical SEO improvements.

Maintaining Technical SEO Performance and preventing regressions

We learned that technical SEO is an ongoing process - as the site changes, whether that’s from code updates or revised content, performance will change too. Code updates, like adding a new third-party script for analytics, can make the page load slower. Technical SEO is not a one-and-done action item - it should be continuously monitored and revised so your platform can best support your customers and business goals.

Results

After spending a few cycles on this, we saw some significant gains.

100% good URLs on desktop

[Image: google-search-console-desktop.webp]

17% good URLs on mobile

At peak, we achieved 17% good URLs on mobile. This proved more challenging than we anticipated, and we fell short of our 50% goal, but we’re working on improving this.

Achieve at least a C rating on the /equipment page in GTMetrix on desktop and mobile.

Mobile:

[Image: GTMetrix results for the /equipment page on mobile]

Desktop:

[Image: gtmetrix_equipment_desktop.webp]

Achieve at least an A rating on static landing pages and blog pages on desktop.

[Image: gtmetrix_landing_page.webp]

[Image: gtemtrix_blog_desktop.webp]

Final thoughts

We achieved most of our goals, but there’s still more work to do. The next and most immediate challenge is improving our mobile performance. Notably, Total Blocking Time is still quite high on many pages. Mobile users are more likely to be impacted because they are on devices with less processing power and slower connections.

We’ll also spend some time examining how we use third-party scripts (e.g., for analytics) on the production site, which can have a death-by-a-thousand-cuts impact on performance.
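
One lever we can pull here is next/script’s loading strategies. A minimal sketch with a hypothetical script URL - the lazyOnload strategy keeps a third-party script off the critical path by loading it during browser idle time:

# components/Analytics.js (illustrative sketch)
import Script from 'next/script'

export function Analytics() {
  return (
    <Script
      src="https://example.com/analytics.js" // hypothetical third-party script
      strategy="lazyOnload" // load during idle time, after the page loads
    />
  )
}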

Migrating our site to Vercel and utilizing many of the features in Next.js offered huge performance gains right out of the box. Integrating Checkly, Vercel Analytics, and GTMetrix into our deployment pipeline allows us to easily monitor performance on our marketplace and continually improve our customers’ experience.

Since we completed these performance optimizations, we’ve rebuilt the search experience on our site to make it easier for our customers to find, buy, and sell used heavy equipment. Check it out!

Visit Boom & Bucket to buy and sell your machinery.

Join our inventory mailing list to get early access to our best deals.