How I Made My Website SEO Optimized

When I started building my website for Jai Maa Bhawani Dal Bati Churma, I knew I wanted it to be fast, user-friendly, and most importantly, SEO-optimized. After all, what's the point of creating a website if no one can find it on Google?

In this post, I'll walk through the steps I took to optimize my site for search engines: practical changes that boost visibility and ranking.


1. Setting Up Metadata for Better Search Rankings

One of the first things I tackled was meta tags—they tell search engines what your website is about. I created a reusable function to generate metadata dynamically for each page.

What I Did:

  • Used title and description for better indexing.

  • Implemented Open Graph (OG) tags for rich previews on social media.

  • Added Twitter meta tags for improved sharing.

Example Code:

// siteMetadata is the site-wide config (title, description, siteUrl,
// socialBanner), imported from the site's config module.
const getPageMetadata = ({ title, description, image, path = '' }) => {
  const metaTitle = `${title} | ${siteMetadata.title}`;
  const metaDescription = description ?? siteMetadata.description;
  const metaImage = image ?? siteMetadata.socialBanner;
  const sitePath = `${siteMetadata.siteUrl}/${path}`;

  return {
    title: metaTitle,
    description: metaDescription,
    openGraph: {
      title: metaTitle,
      description: metaDescription,
      url: sitePath,
      images: [{ url: metaImage, width: 1200, height: 630 }],
    },
    twitter: {
      card: 'summary_large_image',
      images: [{ url: metaImage }],
    },
  };
};
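To see how a page consumes this helper, here's a minimal, self-contained sketch of a hypothetical `app/blog/page.tsx`. In the real project `getPageMetadata` would be imported from a shared module; a trimmed stub is inlined here, and the site values are placeholders, not my actual config:

```typescript
// Trimmed stand-in for the site config (values are placeholders).
const siteMetadata = {
  title: 'Jai Maa Bhawani Dal Bati Churma',
  description: 'Authentic Rajasthani dal bati churma.',
  siteUrl: 'https://example.com',
};

// Trimmed stub of the helper above, so this sketch runs on its own.
const getPageMetadata = ({ title, description, path = '' }: {
  title: string;
  description?: string;
  path?: string;
}) => ({
  title: `${title} | ${siteMetadata.title}`,
  description: description ?? siteMetadata.description,
  openGraph: { url: `${siteMetadata.siteUrl}/${path}` },
});

// In the real page file this is exported so the Next.js App Router
// can call it when building the page's <head>.
function generateMetadata() {
  return getPageMetadata({ title: 'Blog', path: 'blog' });
}
```

Every page gets a consistent `Page | Site` title and a canonical URL without repeating the boilerplate.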

Why It Matters:

  • Helps search engines understand my content.

  • Improves social media link previews.

  • Boosts CTR (Click-Through Rate) in search results.

📸 Lighthouse SEO Audit Screenshot:


2. Creating a Sitemap for Better Indexing

A sitemap.xml file helps search engines discover and crawl all pages on my website.

What I Did:

  • Created a dynamic sitemap.ts file in Next.js.

  • Listed all important routes (blog, about-us, contact-us, etc.).

Example Code:

export default function sitemap() {
  const siteUrl = siteMetadata.siteUrl;
  const routes = ['', 'blog', 'about-us', 'contact-us'].map(route => ({
    url: `${siteUrl}/${route}`,
    lastModified: new Date().toISOString().split('T')[0],
  }));
  return routes;
}
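The static list can also be extended with one entry per blog post. Here's a sketch of that idea; `getAllPosts`, the `Post` shape, and the example data are hypothetical stand-ins for however your content layer lists posts (a CMS query, a filesystem scan, etc.):

```typescript
// Hypothetical post shape and content helper, for illustration only.
type Post = { slug: string; updatedAt: string };

const siteUrl = 'https://example.com'; // placeholder for siteMetadata.siteUrl

const getAllPosts = (): Post[] => [
  { slug: 'how-i-made-my-website-seo-optimized', updatedAt: '2024-01-15' },
];

function sitemap() {
  // Static routes, as in the version above.
  const staticRoutes = ['', 'blog', 'about-us', 'contact-us'].map((route) => ({
    url: `${siteUrl}/${route}`,
    lastModified: new Date().toISOString().split('T')[0],
  }));

  // One entry per post, using each post's own last-modified date.
  const postRoutes = getAllPosts().map((post) => ({
    url: `${siteUrl}/blog/${post.slug}`,
    lastModified: post.updatedAt,
  }));

  return [...staticRoutes, ...postRoutes];
}
```

That way new posts show up in the sitemap automatically instead of needing a manual edit.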

Why It Matters:

  • Ensures search engines find all pages.

  • Improves SEO ranking by helping Google crawl the site efficiently.

📸 Rich Results Test Screenshot:


3. Configuring Robots.txt to Guide Crawlers

To control how search engines crawl my site, I added a robots.txt file.

What I Did:

  • Allowed all pages to be indexed (allow: '/')

  • Added my sitemap link for better discovery.

Example Code:

export default function robots() {
  return {
    rules: [{ userAgent: '*', allow: '/' }],
    sitemap: `${siteMetadata.siteUrl}/sitemap.xml`,
  };
}
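If I later want to keep utility routes out of crawlers' reach, the same `rules` array accepts `disallow` paths. A sketch with hypothetical paths (my site doesn't necessarily have these routes):

```typescript
const siteUrl = 'https://example.com'; // placeholder for siteMetadata.siteUrl

function robots() {
  return {
    rules: [
      // Crawl everything except (hypothetical) API endpoints and draft previews.
      { userAgent: '*', allow: '/', disallow: ['/api/', '/drafts/'] },
    ],
    sitemap: `${siteUrl}/sitemap.xml`,
  };
}
```

Note that `disallow` only stops crawling; pages that must never appear in results also need a `noindex` meta tag.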

Why It Matters:

  • Ensures search engines index the right content.

  • Gives me a place to block duplicate or unnecessary pages from being crawled, by adding disallow rules.

📸 Structured Data Linter Screenshot:


Final Thoughts

SEO is not just about adding keywords; it’s about making your website easier to discover, crawl, and rank. These optimizations helped me make my website faster, more structured, and search-engine friendly.

If you’re building your website, start with these essentials and see the difference it makes! 🚀


What’s Next?

I plan to improve SEO further by:

  ✅ Adding lazy loading for images.

  ✅ Improving page speed with better caching.

  ✅ Writing high-quality blog content to boost organic traffic.

What strategies do you use for SEO? Let’s discuss this in the comments! 👇