Launching a new website is exciting—you’ve got fresh content, clean design, and high hopes for
organic traffic. But that excitement can quickly turn to frustration when weeks pass and Google
barely acknowledges your site exists. I’ve seen this pattern countless times: thoughtful content
that should rank, sitting invisible because basic technical SEO was overlooked during launch.

The reality is that technical SEO creates the foundation everything else builds upon. You can write
the best content in your niche, but if search engines can’t properly crawl, understand, and index
it, that content never gets the chance to compete. For new domains without established authority or
trust signals, getting the technical basics right is especially critical—you can’t afford to
compound new-site challenges with preventable technical barriers.

Over the years, I’ve launched dozens of sites and consulted on the launches of many more. The
technical SEO checklist I’ll share here comes from that accumulated experience—every item represents
either a problem I’ve encountered or a best practice that consistently produces good results. Think
of this as the checklist I wish someone had given me before my first site launch.

The First Week: Critical Foundation Tasks

Some technical SEO tasks are so fundamental that getting them wrong can prevent your site from
appearing in search results entirely. Before worrying about content optimization or link building,
verify these foundational elements are in place.

Verifying Your Site Is Indexable

The single most important check for a new site: can search engines actually access and index your
content? This sounds obvious, but it’s surprisingly easy to block indexing accidentally, especially
when launching sites that were developed in staging environments.

Start by visiting your robots.txt file directly at yoursite.com/robots.txt. Look for any “Disallow:
/” directive that blocks everything. I’ve seen sites launch with full blocking because developers
used this during staging and nobody removed it. Your robots.txt should generally allow access to
content areas while blocking only administrative sections.
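
If you want to automate this check, a minimal Python sketch does the job (this assumes the requests library is installed and uses yoursite.com as a placeholder):

    import requests

    # Fetch the live robots.txt and flag a blanket "Disallow: /" directive.
    resp = requests.get("https://yoursite.com/robots.txt", timeout=10)
    resp.raise_for_status()

    for line in resp.text.splitlines():
        # Strip comments and whitespace before comparing.
        if line.split("#")[0].strip().lower().replace(" ", "") == "disallow:/":
            print("WARNING: robots.txt blocks the entire site")
            break
    else:
        print("No site-wide Disallow found")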

Next, check your page source for noindex directives. Look for a robots meta tag such as <meta name="robots" content="noindex"> in the HTML head; it can be hard-coded in the template or added dynamically by plugins or themes. Even one unexpected noindex directive on an important page prevents indexing.

For WordPress sites specifically, go to Settings > Reading and verify that the “Discourage search engines from indexing this site” box under Search engine visibility is unchecked. When checked, this option adds a site-wide noindex; it’s intended for development but sometimes persists into production. I’ve troubleshot sites where traffic dropped because someone accidentally ticked the box during unrelated settings changes.

Check HTTP response headers for X-Robots-Tag directives. Some server configurations add noindex
headers that aren’t visible in page source. You can check headers using browser developer tools
(Network tab) or online tools like httpstatus.io.
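
Both the meta tag and header checks can be scripted together. Here is a minimal sketch, assuming the requests and BeautifulSoup libraries and a placeholder URL:

    import requests
    from bs4 import BeautifulSoup

    url = "https://yoursite.com/"
    resp = requests.get(url, timeout=10)

    # 1. X-Robots-Tag header added by the server configuration.
    header = resp.headers.get("X-Robots-Tag", "")
    if "noindex" in header.lower():
        print(f"X-Robots-Tag blocks indexing: {header}")

    # 2. Robots meta tag in the HTML head.
    soup = BeautifulSoup(resp.text, "html.parser")
    for tag in soup.find_all("meta", attrs={"name": "robots"}):
        if "noindex" in tag.get("content", "").lower():
            print(f"Meta robots blocks indexing: {tag}")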

Google Search Console Setup

Search Console is essential for understanding how Google sees your site and diagnosing problems. Set
it up within the first week—ideally before launch if possible.

When adding your property, choose the Domain property type if possible (requires DNS verification).
This covers all variations of your domain—www and non-www, HTTP and HTTPS—under a single property.
URL prefix properties work too but require you to add each variation separately.

Verification can happen through DNS (adding a TXT record), HTML file upload, HTML meta tag, Google
Analytics, or Google Tag Manager. DNS verification is most reliable if you have domain control.
Complete whichever method is easiest given your setup.

After verification, immediately submit your XML sitemap. Navigate to Sitemaps in the left menu, enter
your sitemap URL (typically /sitemap_index.xml or /sitemap.xml), and click Submit. This prompts
Google to discover your pages faster than waiting for natural crawling.

Use the URL Inspection tool to request indexing for your homepage and any other priority pages. This
doesn’t guarantee immediate indexing, but it signals to Google that you want these pages crawled.
For new sites, every acceleration helps.

HTTPS Configuration

HTTPS isn’t optional for new sites—it’s effectively mandatory. Google has confirmed HTTPS as a
ranking signal, and browsers actively warn users about non-HTTPS sites. Launching without HTTPS
creates problems you’ll have to fix later anyway, so do it right from the start.

Verify your SSL certificate is properly installed by visiting your site via HTTPS and checking for
the lock icon in the browser. Click the lock to verify the certificate is valid, issued to your
domain, and not expired. Free certificates from Let’s Encrypt work perfectly—you don’t need
expensive certificates for standard sites.

Configure server-side redirects so that all HTTP requests automatically redirect to HTTPS via 301
redirects. This applies to all URLs, not just the homepage. Test by visiting
http://yoursite.com/random-page and verifying you’re redirected to the HTTPS version.
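
To go beyond spot-checking in a browser, a short script can confirm the first hop is a 301 to HTTPS. A minimal sketch, assuming the requests library and placeholder URLs:

    import requests

    # Request an HTTP URL without following redirects and inspect the response.
    resp = requests.get("http://yoursite.com/random-page",
                        allow_redirects=False, timeout=10)

    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and location.startswith("https://"):
        print(f"OK: 301 redirect to {location}")
    else:
        print(f"Check needed: status {resp.status_code}, Location: {location or 'none'}")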

Check for mixed content—HTTP resources loaded on HTTPS pages. Open browser developer tools console
and look for mixed content warnings. These occur when your HTTPS pages reference images, scripts, or
stylesheets via HTTP. Update those references to HTTPS or protocol-relative URLs.
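
Beyond the browser console, you can scan a page's HTML for hard-coded HTTP references. A minimal sketch, assuming requests and BeautifulSoup and a placeholder URL:

    import requests
    from bs4 import BeautifulSoup

    resp = requests.get("https://yoursite.com/", timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    # Tags that commonly load subresources.
    for tag in soup.find_all(["img", "script", "link", "iframe", "source"]):
        src = tag.get("src") or tag.get("href") or ""
        if src.startswith("http://"):
            print(f"Mixed content: <{tag.name}> loads {src}")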

Review internal links throughout your site to ensure they use HTTPS URLs. Content editors sometimes
copy-paste HTTP links, especially from older resources. While redirects will handle the navigation,
direct HTTPS links are cleaner and faster.

URL Structure and Site Architecture

How your URLs are structured and how pages relate to each other affects both search engine
understanding and user experience. Set these patterns correctly from the start—changing them later
requires redirects and risks losing established rankings.

Consistent URL Format

Choose a URL format and stick with it. The decisions to make: www vs. non-www (I prefer non-www for simplicity, but either works), trailing slash or no trailing slash (be consistent; don’t mix the two), and lowercase versus mixed case (use lowercase only; mixed case creates potential duplicate content).

Once you’ve decided, configure redirects so that non-canonical variations redirect to canonical
versions. If you choose non-www without trailing slashes, then www.yoursite.com/page/ should 301
redirect to yoursite.com/page.
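
You can verify the variants in one pass. The sketch below (requests library, placeholder domain, assuming non-www without trailing slashes as the canonical choice) prints where each variation actually goes; each should answer with a 301 pointing at the canonical URL:

    import requests

    variants = [
        "http://yoursite.com/page",
        "https://www.yoursite.com/page",
        "https://yoursite.com/page/",
    ]

    for url in variants:
        resp = requests.get(url, allow_redirects=False, timeout=10)
        print(f"{url} -> {resp.status_code} "
              f"{resp.headers.get('Location', '(no redirect)')}")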

For WordPress, permalink settings control URL structure. Navigate to Settings > Permalinks and choose
a readable structure. I typically recommend Post name (yoursite.com/post-title/) for most blogs, or
a structure including category for sites with clear topical organization
(yoursite.com/category/post-title/).

Avoid URL parameters where possible for primary content. URLs with ?id=123 or similar parameters are
harder to remember, harder to share, and potentially create duplicate content issues. WordPress’s
default pretty permalinks handle this, but custom implementations sometimes fall back to
parameter-based URLs.

Logical Site Hierarchy

Your site’s hierarchy should be logical and shallow. Important pages should be reachable within three
clicks from the homepage. Deeply buried content—requiring four, five, or more clicks to
reach—receives less crawl priority and less internal link equity.
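
Click depth is easy to measure with a small breadth-first crawl from the homepage. The sketch below (requests and BeautifulSoup, placeholder domain, capped at a modest number of pages for politeness) reports anything deeper than three clicks:

    from collections import deque
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    start = "https://yoursite.com/"
    host = urlparse(start).netloc
    depths = {start: 0}
    queue = deque([start])

    while queue and len(depths) < 200:  # cap the crawl so it stays polite
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == host and link not in depths:
                depths[link] = depths[url] + 1
                queue.append(link)

    for page, depth in sorted(depths.items(), key=lambda item: item[1]):
        if depth > 3:
            print(f"{depth} clicks deep: {page}")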

Think about your site structure before creating content. Categories should represent your main topic
areas. Each category page serves as a hub linking to related content. The homepage links to
categories and perhaps featured content. Individual pages link to related pages within their
category and back up to category and homepage.

This hierarchical structure helps search engines understand topical relationships. When Google sees
that five articles all link to a category page about “WordPress Security,” it understands that
category page represents a key topic and those five articles are related content on that topic.

Navigation That Crawlers Can Follow

Your site navigation must use standard HTML anchor tags that search engine crawlers can follow.
Navigation built entirely in JavaScript without server-rendered HTML links may be invisible to
crawlers, leaving large portions of your site undiscoverable.

Modern JavaScript frameworks can create this problem if not configured for server-side rendering or
static generation. If you’re using React, Vue, Next.js, or similar, ensure that navigation links
exist in the initial HTML response, not just in JavaScript-generated DOM.
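
A simple smoke test: fetch the page without executing any JavaScript and count the anchor tags in the raw response. If the count is near zero while the rendered page clearly has navigation, the links probably exist only in the client-side DOM. A minimal sketch, assuming requests and BeautifulSoup:

    import requests
    from bs4 import BeautifulSoup

    # Plain HTTP fetch: no JavaScript runs, so only server-rendered links appear.
    html = requests.get("https://yoursite.com/", timeout=10).text
    links = BeautifulSoup(html, "html.parser").find_all("a", href=True)

    print(f"Anchor tags in initial HTML: {len(links)}")
    for a in links[:20]:
        print(" ", a["href"])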

Breadcrumb navigation provides additional crawlable links and helps users understand their location
in your site hierarchy. Breadcrumbs like “Home > Category > Article” provide clear paths that both
users and crawlers can follow.

Internal search results pages present a special consideration. These pages are dynamically generated
based on user queries and shouldn’t be indexed (they’d create infinite variations of thin content
pages). Configure search results pages with noindex meta tags, and consider blocking the search URL
pattern in robots.txt to save crawl budget.

On-Page Technical Elements

Every page needs certain technical elements properly configured. These on-page factors tell search
engines what each page is about and how it should be presented in search results.

Title Tags: Your First Impression

The title tag is what appears as the clickable headline in search results. It’s your first impression
for potential visitors and a significant ranking factor. Get it right for every page.

Every page must have a unique title tag. Duplicate titles across pages create confusion about which
page should rank and dilute the effectiveness of all affected pages. If Search Console reports
duplicate title tags, address them immediately.

Keep titles under 60 characters to avoid truncation in search results. Truncated titles with “…”
look unprofessional and may cut off important information. I aim for 50-55 characters to provide
margin against slightly different display widths.
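
Overlong and duplicate titles can both be caught with a quick script across a handful of URLs. A minimal sketch, assuming requests and BeautifulSoup; the URL list is a placeholder:

    import requests
    from bs4 import BeautifulSoup

    urls = [
        "https://yoursite.com/",
        "https://yoursite.com/about/",
        "https://yoursite.com/blog/first-post/",
    ]

    seen = {}
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        title = soup.title.string.strip() if soup.title and soup.title.string else ""
        if len(title) > 60:
            print(f"Too long ({len(title)} chars): {url}")
        if title in seen:
            print(f"Duplicate title: {url} matches {seen[title]}")
        seen.setdefault(title, url)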

Include your primary keyword naturally, preferably early in the title. “WordPress Security Guide:
Protect Your Site in 2024” puts the key topic first. “Complete Guide to Everything You Need to Know
About WordPress Security” buries the keyword and will likely be truncated anyway.

For blog posts and articles, I typically structure titles as “Primary Topic: Secondary Detail – Site
Name” or “Primary Topic – Site Name.” The site name branding is optional depending on your brand
recognition goals.

Meta Descriptions: Selling the Click

Meta descriptions don’t directly affect rankings, but they significantly impact click-through rates.
A compelling description can mean the difference between a user clicking your result versus a
competitor’s.

Like titles, meta descriptions should be unique per page. Duplicate descriptions suggest similar or
duplicate content and waste the opportunity to craft targeted messaging for each page.

Keep descriptions to roughly 150-155 characters. Longer descriptions get truncated, and shorter ones may not make full use of the available space. Include your target keyword; Google often bolds matching terms in descriptions, which catches the eye.

Write descriptions as sales copy, not just summaries. “Learn WordPress security” is informational.
“Discover the exact security steps I use to protect 50+ client sites from hackers and malware”
creates curiosity and establishes expertise. Think about what would make someone click.

Note that Google sometimes rewrites meta descriptions to better match specific queries. You can’t
prevent this, but a well-written description gives Google good material to work with.

Heading Structure: Organizing Content

HTML heading tags (H1-H6) provide semantic structure to your content. Proper heading usage helps
search engines understand content hierarchy and assists screen readers for accessibility.

Each page should have exactly one H1 tag containing the main topic. This is typically your article
title or page heading. Having multiple H1s or no H1 creates ambiguity about the page’s primary
topic.

Use H2 tags for main sections within the content, H3 for subsections within those sections, and so
on. Don’t skip heading levels—going from H2 directly to H4 breaks the logical hierarchy. This
structure should reflect how the content is actually organized, not be manipulated for perceived SEO
benefit.
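
Both rules, exactly one H1 and no skipped levels, are straightforward to check programmatically. A minimal sketch, assuming requests and BeautifulSoup and a placeholder URL:

    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://yoursite.com/sample-post/", timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    headings = [int(h.name[1]) for h in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"])]

    if headings.count(1) != 1:
        print(f"Expected exactly one H1, found {headings.count(1)}")

    # A jump of more than one level (e.g. H2 straight to H4) breaks the hierarchy.
    for prev, cur in zip(headings, headings[1:]):
        if cur > prev + 1:
            print(f"Skipped heading level: H{prev} followed by H{cur}")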

Include relevant keywords in headings naturally. If your article is about WordPress security plugins,
having H2 sections like “Best Free Security Plugins” and “Premium Security Plugin Options” naturally
incorporates relevant terms while accurately describing section content.

Image Technical Optimization

Images require specific technical attention beyond just looking good. Properly optimized images load
faster, rank in image search, and are accessible to users with visual impairments.

Every image must have an alt attribute. Alt text serves as alternative text when images can’t load
and provides descriptions for screen readers. Write alt text that describes the image content:
“WordPress dashboard showing security plugin settings” rather than just “screenshot” or leaving it
empty.

File names should be descriptive. Upload “wordpress-security-dashboard.webp” not “IMG_3847.jpg.”
Search engines read file names as signals about image content, and descriptive names also help you
organize your media library.

Specify width and height attributes on image elements. This allows browsers to reserve the correct
space before images load, preventing layout shifts that hurt Core Web Vitals scores. For responsive
images, use CSS to control display size while still specifying intrinsic dimensions.
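
Missing alt text and missing dimensions can be caught in the same pass. A minimal sketch, assuming requests and BeautifulSoup and a placeholder URL:

    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://yoursite.com/", timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    for img in soup.find_all("img"):
        src = img.get("src", "(inline image)")
        if not img.get("alt"):  # missing or empty alt attribute
            print(f"Missing alt text: {src}")
        if not (img.get("width") and img.get("height")):
            print(f"Missing width/height: {src}")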

Compress images appropriately for web delivery. Uncompressed images waste bandwidth and slow page
loads. Modern formats like WebP provide better compression than JPEG for most use cases. Tools like
Squoosh or ImageOptim help with compression. WordPress plugins like ShortPixel or Smush can automate
this.

Sitemaps and Search Engine Communication

Sitemaps and robots.txt form your direct communication channel with search engines. Configure them
correctly to ensure efficient crawling of your important content.

XML Sitemap Essentials

Your XML sitemap tells search engines which pages exist and helps them discover content they might
miss through normal crawling. For new sites without many inbound links, sitemaps are especially
important for discovery.

Generate your sitemap automatically using an SEO plugin (Yoast, Rank Math) or a dedicated sitemap
plugin. Manual sitemap maintenance is error-prone and tedious. Automatic generation ensures new
content is added and removed content is excluded.

Include pages you actually want indexed: published posts, pages, and appropriate taxonomies (usually
categories, sometimes tags). Exclude content marked noindex, admin pages, search results, and other
content not meant for search users.

Verify your sitemap is accessible by visiting its URL directly. Check that it contains the pages you
expect and nothing you don’t want indexed. Submit it through Search Console for explicit
notification to Google.
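
You can also confirm that every URL in the sitemap actually resolves. A minimal sketch using requests and the standard library's XML parser; the sitemap URL is a placeholder, and this handles a single sitemap file rather than a sitemap index:

    import requests
    import xml.etree.ElementTree as ET

    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    xml = requests.get("https://yoursite.com/sitemap.xml", timeout=10).text
    urls = [loc.text for loc in ET.fromstring(xml).findall(".//sm:loc", ns)]

    print(f"{len(urls)} URLs in sitemap")
    for url in urls:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
        if status != 200:
            print(f"{status}: {url}")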

Robots.txt Configuration

Robots.txt provides instructions to search engine crawlers about which parts of your site to access.
For most sites, a simple configuration works well.

The recommended WordPress robots.txt includes: User-agent: * (applying rules to all crawlers),
Disallow: /wp-admin/ (blocking admin area), Allow: /wp-admin/admin-ajax.php (allowing AJAX
functionality needed for proper page rendering), and Sitemap: https://yoursite.com/sitemap_index.xml
(pointing to your sitemap).
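
Assembled into an actual file (substitute your own domain), that recommendation looks like this:

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://yoursite.com/sitemap_index.xml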

Avoid over-blocking. Common mistakes include blocking /wp-content/ (which includes your images and
theme files) or blocking all query strings (which can affect legitimate URLs). When uncertain, err
toward allowing crawling—it’s safer to let crawlers access non-essential pages than to accidentally
block important ones.

Remember the critical distinction: robots.txt controls crawling, not indexing. Disallowing a URL
doesn’t prevent it from appearing in search results—it prevents Google from crawling to see the
content. For pages you want excluded from search results entirely, use noindex meta tags or
X-Robots-Tag headers.

Mobile Experience and Performance

With mobile-first indexing, Google primarily uses the mobile version of your site for ranking.
Performance metrics like Core Web Vitals directly affect rankings. These aren’t nice-to-haves;
they’re requirements for competitive search performance.

Mobile-Friendliness Requirements

Your site must work well on mobile devices. This means responsive design that adapts to screen sizes,
adequately sized tap targets, readable text without zooming, and no horizontal scrolling.

Use the viewport meta tag to control how your site renders on mobile: <meta name="viewport" content="width=device-width, initial-scale=1.0">. Without it, mobile browsers may render your site as a scaled-down desktop version, making it difficult to use.

Test on actual mobile devices, not just desktop browser emulation. Emulation is useful for
development but doesn’t catch all issues. Touch behavior, performance on mobile networks, and
real-world usability require real device testing.

Verify mobile-friendliness in Search Console’s Mobile Usability report. This shows pages with mobile
issues like text too small, content wider than screen, or clickable elements too close together.
Address any reported issues systematically.

Core Web Vitals: The Performance Metrics That Matter

Google measures specific performance metrics as part of its ranking algorithm. Core Web Vitals
include Largest Contentful Paint (LCP), Cumulative Layout Shift (CLS), and Interaction to Next Paint
(INP, replacing First Input Delay).

LCP measures how quickly the largest visible content element loads. For most pages, this is the hero
image or main heading. Target under 2.5 seconds. Improve LCP by optimizing server response time,
optimizing images (compression, appropriate formats, lazy loading for below-fold images), and
ensuring fonts load efficiently.

CLS measures visual stability—how much elements shift around during page load. Unexpected layout
shifts are frustrating for users who click on something only to have it move. Target under 0.1.
Improve CLS by setting explicit dimensions on images and media, reserving space for ads, and
avoiding dynamically injected content above visible content.

INP measures responsiveness to user interactions like clicks and key presses. Target under 200ms.
Improve INP by optimizing JavaScript execution, breaking up long tasks, and reducing main thread
blocking.

Use PageSpeed Insights to measure Core Web Vitals for specific pages. Search Console provides
site-wide Core Web Vitals data over time. Address any pages marked as needing improvement.
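
If you'd rather pull these numbers programmatically than through the web interface, the PageSpeed Insights API exposes the same data. A rough sketch using the requests library; the exact response structure shown here is an assumption based on the v5 API, so verify it against the current documentation:

    import requests

    api = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    resp = requests.get(api, params={"url": "https://yoursite.com/",
                                     "strategy": "mobile"}, timeout=60).json()

    # Lab metrics from the Lighthouse run bundled in the response.
    audits = resp["lighthouseResult"]["audits"]
    for metric in ("largest-contentful-paint", "cumulative-layout-shift"):
        print(metric, audits[metric]["displayValue"])

    # Field data (including INP) appears under resp["loadingExperience"]
    # when Google has enough real-user samples for the URL.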

Practical Page Speed Optimization

Beyond Core Web Vitals specifically, overall page speed affects user experience and indirectly affects SEO through engagement signals. Faster pages provide better experiences, and users stay on them longer.

Enable browser caching so returning visitors don’t re-download unchanged resources. Set appropriate
cache headers for static assets like images, CSS, and JavaScript. WordPress caching plugins handle
this automatically.

Minify CSS and JavaScript to remove unnecessary characters without changing functionality. Combine
files where possible to reduce HTTP requests. Again, plugins and build tools handle this
automatically for most sites.

Compress images appropriately for their display size. Don’t upload 4000px wide images to display at
800px. Resize before upload and compress for web delivery. Use modern formats like WebP where
browser support is sufficient.
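
If you'd rather handle this outside a plugin, the Pillow library can resize and convert in a few lines. A minimal sketch, assuming Pillow is installed; the filenames are placeholders:

    from PIL import Image

    img = Image.open("original-photo.jpg")
    # Cap dimensions at 1600px, preserving aspect ratio, before converting.
    img.thumbnail((1600, 1600))
    img.save("original-photo.webp", "WEBP", quality=80)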

Consider using a Content Delivery Network (CDN) to serve static assets from servers closer to your
visitors. Cloudflare offers a free tier that provides CDN along with other performance and security
benefits.

Structured Data Implementation

Structured data helps search engines understand your content and can enable rich results—enhanced
search listings with additional information like ratings, images, or step-by-step instructions.

Essential Schema Types for New Sites

For a typical content site, several schema types provide value. Organization schema identifies your brand, including name, logo, and social profiles. This helps establish brand identity in search results and the Knowledge Graph.

Article schema applies to blog posts, news articles, and similar content. It identifies the headline,
author, publication date, and image—useful for news or blog-focused sites. Article schema is a
foundation for other rich result opportunities.

Breadcrumb schema mirrors your visible breadcrumb navigation, showing search engines your site
hierarchy. Breadcrumb-rich results display the path in place of the URL in search results, making
listings more informative.

Depending on your content, additional schema becomes relevant: HowTo for step-by-step tutorials, FAQ
for question-and-answer content, Recipe for cooking sites, Product for e-commerce, and many others
for specific use cases.

Implementation and Validation

Implement structured data using JSON-LD format embedded in your HTML. SEO plugins like Yoast and Rank
Math generate common schema types automatically. For custom needs, you may need to add JSON-LD
manually or through theme customization.
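
If you do need to emit schema yourself, JSON-LD is just a script tag containing JSON, so it can be generated from a template or a few lines of code. Here is a sketch of Article markup built in Python; the values are placeholders, and you should check schema.org and Google's documentation for the properties your content actually needs:

    import json

    article = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": "WordPress Security Guide",
        "author": {"@type": "Person", "name": "Jane Author"},
        "datePublished": "2024-01-15",
        "image": "https://yoursite.com/images/security-guide.webp",
    }

    # Embed this block inside the page's <head> or near the end of <body>.
    print(f'<script type="application/ld+json">{json.dumps(article)}</script>')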

Validate your structured data using Google’s Rich Results Test tool. Enter a page URL and see what
structured data Google detects, whether it’s valid, and whether the page is eligible for rich
results. Address any errors or warnings the tool identifies.

Monitor structured data health in Search Console’s Enhancements reports. These show valid pages,
pages with warnings, and pages with errors for each schema type. Track these over time to catch
issues before they impact visibility.

Post-Launch Monitoring and Maintenance

Technical SEO doesn’t end at launch. Ongoing monitoring catches problems early, and maintenance keeps
your technical foundation solid as your site evolves.

First Month Monitoring Schedule

During the first week after launch, verify through Search Console that key pages are being indexed.
Use URL Inspection to check specific URLs. Indexing may take days to weeks for new sites, but check
that the process is progressing.

In week two, review the Index Coverage report for any crawl errors, server errors, or unexpected
exclusions. Some exclusions are normal (noindexed pages, canonicalized pages), but anything
unexpected warrants investigation.

Week three, check the Core Web Vitals report. This data takes time to accumulate, but by now you
should have some indication of performance. Address any pages marked as needing improvement.

By month end, the Performance report should show initial impressions and potentially some clicks.
Verify that key pages are appearing in search for intended queries. If pages aren’t appearing for
any queries after four weeks, investigate potential issues.

Ongoing Monitoring Practices

Monthly, review Search Console for any new errors, warnings, or degradations. Check that the sitemap remains accessible and error-free. Review any security issues or manual actions (hopefully none, but catch them quickly if they occur).

Quarterly, run a comprehensive site audit using tools like Screaming Frog or Sitebulb. These crawl
your site like a search engine and report technical issues: broken links, missing meta tags, slow
pages, redirect chains, duplicate content, and more. Address the most important findings and track
improvement over time.

After major changes—redesigns, CMS updates, hosting migrations, plugin updates—verify that technical
SEO elements remain intact. These events commonly cause regressions like broken redirects, changed
robots.txt, SSL issues, or performance degradation.

Common Problems and How to Catch Them

Some problems recur frequently and are worth specifically monitoring. Broken links develop naturally
as external sites change or as internal content is deleted without proper redirects. Regular crawls
catch these. Broken internal links waste link equity and frustrate users; fix them promptly.

Page speed degradation happens as sites grow, plugins accumulate, and content becomes heavier.
Monitor PageSpeed Insights scores over time. If scores drop, investigate what changed—often it’s a
new plugin, unoptimized images, or accumulated performance debt.

Redirect chains grow over time if not managed. Page A redirects to Page B, and later Page B redirects
to Page C. This creates a chain that slows navigation and dilutes link equity. Periodically audit
redirects to collapse chains into direct redirects.
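
Chains are easy to spot because an HTTP client records every hop. A minimal sketch, using the requests library and a placeholder URL, that flags anything with more than one redirect before the final page:

    import requests

    resp = requests.get("https://yoursite.com/old-page", allow_redirects=True, timeout=10)

    # response.history holds one entry per redirect hop.
    if len(resp.history) > 1:
        print(f"Redirect chain ({len(resp.history)} hops):")
        for hop in resp.history:
            print(f"  {hop.status_code} {hop.url}")
        print(f"  final: {resp.status_code} {resp.url}")
    else:
        print("No chain: at most one redirect.")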

Index bloat occurs when low-value pages (parameter variations, paginated archives, tag pages) get
indexed, diluting your site’s perceived quality. Monitor indexed page counts in Search Console. If
indexed pages greatly exceed your intentional content pages, investigate what’s being indexed that
shouldn’t be.

The Complete Launch Checklist

To pull this together into an actionable checklist, here are the essential tasks organized by timing:

Before launch, verify development noindex settings will be removed. Confirm HTTPS is configured with
valid certificate. Set up 301 redirects from HTTP to HTTPS. Test on mobile devices. Verify page
speed is acceptable.

At launch, remove any development noindex settings. Verify robots.txt is correct. Confirm sitemap is
accessible and accurate. Set up Google Search Console. Submit sitemap. Request indexing for key
pages.

First week, monitor for indexing in Search Console. Check for crawl errors. Review the robots.txt report in Search Console to confirm your file is being read as expected. Verify all pages load correctly on mobile and desktop.

First month, review Index Coverage for issues. Check Core Web Vitals data. Verify that pages are beginning to appear in search results. Address any Search Console notifications.

Ongoing, monthly Search Console review. Quarterly comprehensive technical audit. Post-update
verification after any major changes. Continuous monitoring for security and performance.

Conclusion

Technical SEO for new domains requires attention to fundamentals that more established sites might
take for granted. Without existing authority or established crawl patterns, new sites depend on
correct technical configuration to even get discovered, let alone to rank.

The checklist I’ve outlined here represents the essential elements: ensuring indexability, setting up
proper tracking, configuring secure and consistent URLs, optimizing on-page technical elements,
establishing search engine communication through sitemaps and robots.txt, delivering solid mobile
and performance experiences, implementing useful structured data, and maintaining ongoing
monitoring.

None of these individual elements is particularly complex, but missing any one of them can undermine
your entire SEO strategy. The sites that succeed are often simply the ones that got the basics right
consistently, creating solid foundations that support everything else they build.

Take the time to work through this checklist systematically. Verify each element rather than assuming
it’s configured correctly. The initial investment pays dividends through faster indexing, better
crawl efficiency, and rankings opportunity that would otherwise be blocked by preventable technical
problems.