
The Nginx configuration that I created to serve only robots.txt and sitemap-index.xml over http

I had initially configured a set of Nginx rules that redirect every single request made over http to https. However, doing so breaks my reference on how to send an HTTP GET request with Java without using any external libraries.

Since robots.txt does not contain sensitive data, I decided to allow it to be retrieved via http as well. And since my robots.txt points to sitemap-index.xml via http, I also allowed sitemap-index.xml to be retrieved via http.
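For illustration, a robots.txt that references the sitemap over http might look like the fragment below (a hypothetical example; the actual contents of my robots.txt are not shown in this post). The Sitemap directive is what makes the http URL of sitemap-index.xml visible to crawlers:

```
User-agent: *
Disallow:

Sitemap: http://www.techcoil.com/sitemap-index.xml
```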

This post documents the changes that I made to my Nginx configurations to serve only robots.txt and sitemap-index.xml over the http protocol.

The Nginx configuration that redirects all http traffic to https

My initial Nginx configuration for redirecting all http traffic to https was as follows:

server {

    listen 80;
    server_name   techcoil.com www.techcoil.com;
    return 301 https://www.techcoil.com$request_uri;    

}

What this set of rules does is straightforward: it tells Nginx to return a response with the 301 Moved Permanently status code whenever an HTTP request is received on port 80 for the domains techcoil.com or www.techcoil.com.
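The effect of the return line can be sketched in Python (a hypothetical helper for illustration, not part of the Nginx configuration): Nginx substitutes $request_uri, the full original URI including the query string, into the https URL, so the path and parameters survive the redirect unchanged.

```python
def redirect_location(request_uri: str) -> str:
    """Mimic Nginx's `return 301 https://www.techcoil.com$request_uri;`.

    $request_uri holds the full original URI, including the query
    string, so everything after the host name is preserved.
    """
    return "https://www.techcoil.com" + request_uri

# Any http request keeps its path and query string after the redirect.
print(redirect_location("/robots.txt"))     # https://www.techcoil.com/robots.txt
print(redirect_location("/posts/?page=2"))  # https://www.techcoil.com/posts/?page=2
```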

The resultant Nginx rules for allowing HTTP access to only robots.txt and sitemap-index.xml

The resultant Nginx rules for allowing HTTP access to only robots.txt and sitemap-index.xml are as follows:

server {

        listen 80;
        server_name   techcoil.com www.techcoil.com;
        root    /path/to/directory/that/contains/robots.txt/and/sitemap-index.xml;

        location /robots.txt {

        }

        location /sitemap-index.xml {

        }

        location / {
                return 301 https://www.techcoil.com$request_uri;
        }
}

I included the root directive to point Nginx at the directory that contains robots.txt and sitemap-index.xml.
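How the root directive maps a request URI to a file on disk can be illustrated with a small Python sketch (the directory name below is a placeholder, standing in for the actual path): Nginx simply appends the request URI to the root directory.

```python
def resolve_file(root: str, uri: str) -> str:
    """Mimic how Nginx's `root` directive forms the file path:
    the request URI is appended to the configured root directory."""
    return root.rstrip("/") + uri

root = "/var/www/techcoil"  # placeholder for the actual directory
print(resolve_file(root, "/robots.txt"))         # /var/www/techcoil/robots.txt
print(resolve_file(root, "/sitemap-index.xml"))  # /var/www/techcoil/sitemap-index.xml
```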

I then added three new location blocks:

  • The location /robots.txt block matches HTTP requests for http://techcoil.com/robots.txt or http://www.techcoil.com/robots.txt.
  • The location /sitemap-index.xml block matches HTTP requests for http://techcoil.com/sitemap-index.xml or http://www.techcoil.com/sitemap-index.xml.
  • The location / block matches all other HTTP requests and returns a response with the 301 Moved Permanently status code to tell the browser to switch to the https protocol.
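Taken together, the three location blocks behave like the Python sketch below (a simplified model for illustration; Nginx picks the longest matching prefix location, so the two specific blocks win over location / for their URIs):

```python
def handle_http_request(uri: str):
    """Model of the server block above: serve the two whitelisted
    files over http, redirect every other request to https."""
    path = uri.split("?", 1)[0]  # location matching ignores the query string
    # Prefix locations: any path beginning with these strings matches.
    if path.startswith("/robots.txt") or path.startswith("/sitemap-index.xml"):
        return (200, path)  # served from the configured root directory
    return (301, "https://www.techcoil.com" + uri)

print(handle_http_request("/robots.txt"))  # (200, '/robots.txt')
print(handle_http_request("/about"))       # (301, 'https://www.techcoil.com/about')
```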

The first two location blocks are deliberately left empty: they inherit the root directive from the server block, so Nginx serves robots.txt and sitemap-index.xml from that directory instead of redirecting those requests.


About Clivant

Clivant, a.k.a. Chai Heng, enjoys composing software and building systems to serve people. He owns techcoil.com and hopes that whatever he has written and built so far has benefited people.
