How To Prevent Robots.txt Passing From Staging Env To Production?
In the past, one of our IT specialists accidentally moved the robots.txt from staging to production, blocking Google and other crawlers from indexing our customer's site in production. Is there a good way of managing this situation?
Thanks in advance.
Ask your IT guys to change the file permissions on robots.txt to "read-only" for all users, so that overwriting it takes the extra steps of:
- becoming Administrator/root
- changing the permissions to allow writes
- overwriting robots.txt with the new file
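On a Unix-like server, a minimal sketch of making the file read-only might look like this (the path is just an example; adjust it to your production web root):

```shell
# Work in a scratch directory for the example;
# in practice this would be your production web root
tmpdir=$(mktemp -d)

# A typical production robots.txt
printf 'User-agent: *\nAllow: /\n' > "$tmpdir/robots.txt"

# Make it read-only for everyone, including the owner,
# so an accidental copy from staging fails without extra steps
chmod 444 "$tmpdir/robots.txt"

# Confirm the permissions
ls -l "$tmpdir/robots.txt"

# To deliberately update it later, an admin would first run:
#   chmod u+w robots.txt
```

Note that root can still bypass ordinary file permissions, so this is a speed bump rather than a hard block; the point is to force a deliberate, conscious action before the file can be replaced.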