Block Patterns In robots.txt

It's simple enough to block a search engine from crawling a fixed URL or folder, but if your URLs are dynamic, such as /node/345/edit, you'll need to define a pattern instead.

Use the following to block patterns:

Disallow: /node/*/edit$
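Note that a `Disallow` line only takes effect inside a `User-agent` group. A minimal robots.txt applying the rule to all crawlers might look like this (using `*` for all user agents is an assumption; you can target specific bots instead). Also keep in mind that the `*` and `$` wildcards are extensions honored by major crawlers such as Googlebot and Bingbot, not part of the original robots.txt standard, so not every bot will respect them:

```
User-agent: *
Disallow: /node/*/edit$
```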

The trailing "$" anchors the match so that only URLs ending in "/edit" are blocked. Without it, any URL whose path starts with a matching prefix, such as /node/345/edit/something, would also be blocked.
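To make the wildcard semantics concrete, here is a rough sketch of how a crawler might interpret the pattern, based on the commonly documented behavior: "*" matches any sequence of characters, a trailing "$" anchors the end of the URL, and rules match from the start of the path. The helper names below are hypothetical, purely for illustration:

```python
import re

def robots_pattern_to_regex(pattern):
    # "*" in robots.txt means "any sequence of characters";
    # a trailing "$" anchors the match to the end of the URL path.
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    regex = re.escape(core).replace(r"\*", ".*")
    return regex + ("$" if anchored else "")

def is_blocked(url_path, pattern):
    # robots.txt rules match from the beginning of the path,
    # so re.match (not re.search) is the right tool here.
    return re.match(robots_pattern_to_regex(pattern), url_path) is not None

print(is_blocked("/node/345/edit", "/node/*/edit$"))        # True
print(is_blocked("/node/345/edit/extra", "/node/*/edit$"))  # False: "$" stops it
print(is_blocked("/node/345/edit/extra", "/node/*/edit"))   # True: no anchor
```

This illustrates why the "$" matters: without it, the rule acts as a prefix match and catches anything containing "/edit" after the wildcard.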

Now you can kick any incoming crawlers that try to visit those URLs into the bottomless pit in the middle of your town!


 
Copyright © Twig's Tech Tips