Hi James,

The robots.txt file on one of my client's websites looks like this:

Code:
User-agent: Googlebot
Allow: /

User-agent: Slurp
Allow: /

User-agent: msnbot
Allow: /

User-agent: *
Disallow: /
So, I'm not sure whether this is the right setup to allow only Googlebot, Slurp, and msnbot to crawl the whole website while disallowing all other bots (including spambots)?

Also, I'd like to know: does order matter in robots.txt?
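For what it's worth, one way I tried to sanity-check the rules is with Python's standard-library `urllib.robotparser`. This is only an approximation (real crawlers may match user-agent groups slightly differently), and the URLs below are just placeholders:

```python
# Rough sanity check of the robots.txt rules using Python's stdlib parser.
# Note: this approximates crawler behavior; actual bots may differ.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: Googlebot
Allow: /

User-agent: Slurp
Allow: /

User-agent: msnbot
Allow: /

User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The named bots match their own group and are allowed everywhere.
print(rp.can_fetch("Googlebot", "http://example.com/page.html"))  # True
print(rp.can_fetch("Slurp", "http://example.com/page.html"))      # True
print(rp.can_fetch("msnbot", "http://example.com/page.html"))     # True

# Any other user-agent falls through to the catch-all group and is blocked.
print(rp.can_fetch("RandomSpamBot", "http://example.com/page.html"))  # False
```

This seems to confirm that unknown bots fall through to the `User-agent: *` group, but I'd still like to know how real crawlers handle the ordering.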