What is the robots.txt file and how can it be extended?

The robots.txt file provides instructions for the web crawlers and bots that index your store. Aero automatically registers rules for core parts of the store that should not be indexed or visited by bots.

To extend this file with module- or store-specific restrictions, the robots.txt content is processed as a pipeline, where the content (the text) is made available and can be manipulated before it is served.

For ease of access, the $content passed through the pipeline is an Illuminate\Support\Collection, grouped by the user agent.

\Aero\Store\Pipelines\RobotsTxt::extend(static function ($content) {
    // Append a rule to the default wildcard user-agent group, if it exists.
    if ($agent = $content->get('User-agent: *')) {
        $agent->push('Disallow: /my_module/');
    }
});
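Because the content is a standard Collection keyed by user-agent group, rules for a specific crawler can be added the same way. This is a minimal sketch assuming new groups can be added with the Collection's put method; the group name (User-agent: Googlebot) and the disallowed path are illustrative, not part of Aero's defaults:

```php
\Aero\Store\Pipelines\RobotsTxt::extend(static function ($content) {
    // Hypothetical example: register a dedicated group for Googlebot
    // with its own disallow rule.
    $content->put('User-agent: Googlebot', collect([
        'Disallow: /checkout/',
    ]));
});
```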
