
SEO Tools - robots.txt, Sitemap, RSS

SEO provides additional tools that help you present your website to search engines effectively.

In this lesson, we will cover three commonly used SEO tools: robots.txt, sitemaps, and RSS.


robots.txt

robots.txt is a text file placed at the root of a website that tells search engine crawlers which pages they may or may not crawl.

robots.txt Example
User-agent: *
Disallow: /private/
Allow: /public/

The robots.txt above applies to all crawlers (User-agent: *), denying them access to the /private/ directory while allowing access to the /public/ directory.
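You can check these rules programmatically. As a sketch, Python's standard-library urllib.robotparser can parse the example rules and answer whether a given URL may be crawled (the URLs below are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Parse the example robots.txt rules directly (no network fetch needed).
rules = """
User-agent: *
Disallow: /private/
Allow: /public/
""".strip().splitlines()

parser = RobotFileParser()
parser.parse(rules)

# can_fetch(user_agent, url) returns True if crawling is permitted.
print(parser.can_fetch("*", "https://example.com/public/page"))   # True
print(parser.can_fetch("*", "https://example.com/private/page"))  # False
```

Search engines perform an equivalent check before crawling each URL on your site.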


Sitemap

A sitemap is an XML file that lists the pages of a website and indicates their relationships.

The sitemap informs search engines about the structure of a website, acting as a map that helps crawlers discover pages they might otherwise miss.

Sitemap Example
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/page1</loc>
    <lastmod>2023-10-15</lastmod>
    <changefreq>daily</changefreq>
  </url>
  <url>
    <loc>https://example.com/page2</loc>
  </url>
</urlset>

This sitemap lists two web pages; page1 additionally records its last modification date (lastmod) and expected change frequency (changefreq).
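Sitemaps are often generated rather than written by hand. Here is a minimal sketch, using Python's standard-library xml.etree.ElementTree, that builds the same two-entry sitemap (the page URLs and date are the illustrative values from the example):

```python
import xml.etree.ElementTree as ET

# Sitemap namespace required by the protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

# Each page becomes a <url> entry; lastmod is optional.
pages = [
    ("https://example.com/page1", "2023-10-15"),
    ("https://example.com/page2", None),
]
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    if lastmod:
        ET.SubElement(url, "lastmod").text = lastmod

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```

The resulting string would typically be saved as sitemap.xml at the site root and referenced from robots.txt or submitted to search engines directly.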


RSS

RSS is an XML feed format that automatically notifies subscribers about content updates on a website. It is commonly used by blogs, newsletters, and web services.

RSS Example
<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>My Website News</title>
    <link>https://example.com</link>
    <description>Latest updates from My Website</description>
    <item>
      <title>New Article</title>
      <link>https://example.com/new-article</link>
      <description>This is a new article.</description>
    </item>
  </channel>
</rss>

In the example above, the feed's channel is titled 'My Website News' and contains one item, 'New Article', with its own link and description.
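Feed readers parse this XML to list new items. As a sketch, Python's standard-library xml.etree.ElementTree can extract the channel title and each item from the example feed:

```python
import xml.etree.ElementTree as ET

# The RSS example feed from above, embedded as a string.
rss_text = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>My Website News</title>
    <link>https://example.com</link>
    <description>Latest updates from My Website</description>
    <item>
      <title>New Article</title>
      <link>https://example.com/new-article</link>
      <description>This is a new article.</description>
    </item>
  </channel>
</rss>"""

root = ET.fromstring(rss_text)
channel = root.find("channel")

# findtext returns the text content of the first matching child element.
print(channel.findtext("title"))  # My Website News
for item in channel.findall("item"):
    print(item.findtext("title"), "->", item.findtext("link"))
```

A real feed reader would fetch the feed URL periodically and compare items against those it has already seen.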
