Disallow is a directive used in robots.txt files to tell search engine bots which parts of a website they should not crawl. It is commonly used to preserve crawl budget and to prevent duplicate content issues. When using it, make sure you're not blocking page-rendering resources such as CSS or JavaScript files. If you're unsure whether your rules block those resources, check them with the robots.txt tester in Google Search Console.
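To see how Disallow rules are matched, here is a minimal sketch using Python's standard-library robots.txt parser. The robots.txt content and the `/cart/` and `/tmp/` paths are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block all bots from /cart/ and /tmp/.
robots_txt = """\
User-agent: *
Disallow: /cart/
Disallow: /tmp/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# URLs whose path starts with a disallowed prefix are blocked.
print(rp.can_fetch("*", "https://example.com/cart/checkout"))   # False: blocked
print(rp.can_fetch("*", "https://example.com/products/shoes"))  # True: crawlable
```

Note that a Disallow rule matches by path prefix, which is why every URL under `/cart/` is excluded. Well-behaved crawlers honor these rules voluntarily; Disallow is not an access control mechanism.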