To help crawlers index your website, it’s good practice to add both a sitemap.xml and a robots.txt file. Although you could create them manually, it’s quite nice to let Jekyll handle this too.

## Sitemap.xml

To create a valid sitemap, I’m using the Jekyll Sitemap plugin. You just need to add `gem 'jekyll-sitemap'` to your Gemfile and register the plugin in your `_config.yml`:
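The two changes look like this (a minimal sketch following the jekyll-sitemap README; if your `_config.yml` already has a `plugins` list, just append the entry to it):

```ruby
# Gemfile
gem 'jekyll-sitemap'
```

```yaml
# _config.yml
plugins:
  - jekyll-sitemap
```

On older Jekyll versions (before 3.5.0), the config key is `gems` instead of `plugins`.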

After your next build, you’ll get a nice shiny `sitemap.xml` at the root of your site.

## Robots.txt

The robots.txt is even easier. All you need to do is add a new file `robots.txt` with the following content to your Jekyll root folder. The empty front matter block at the top tells Jekyll to run the file through Liquid, so `{{ site.url }}` gets replaced with your site’s configured URL:

```
---
---
User-agent: *
Sitemap: {{ site.url }}/sitemap.xml
```
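Assuming `url` is set to `https://example.com` in your `_config.yml` (substitute your own domain), the generated robots.txt in the build output would read:

```
User-agent: *
Sitemap: https://example.com/sitemap.xml
```

This allows all crawlers and points them at the sitemap; add `Disallow` rules to the same file if you want to exclude specific paths.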