Re: [syndication] site-wide metadata discovery
> One concern is that search engines, at least traditionally, have implemented
> size limits for robots.txt and excluded sites with overly large robots.txt
> files. Not sure if that restriction still applies (most robots requesting
> robots.txt will not use the additional information, though, so robots.txt is
> not an ideal place for this).
Good point. I think we're talking about adding something no larger than:
Site-index: default
href: http://www.syndic8.com/feedlist.ocs
type: http://InternetAlchemy.org/ocs/directory#
title: Syndic8 feed list in OCS format
So it's not like that should be an unreasonably large addition.
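As a sketch of how a robot might consume such a record (the field names come from the example above, but the parsing rules here are just my assumptions, not any agreed spec):

```python
# Hypothetical sketch: pull the proposed Site-index record, along with its
# href/type/title continuation lines, out of a robots.txt body. The record
# is assumed to end at the next unrelated directive or blank line's end.

def parse_site_index(robots_txt):
    """Return a list of dicts, one per Site-index record found."""
    records = []
    current = None
    for line in robots_txt.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition(":")
        key = key.strip().lower()
        value = value.strip()
        if key == "site-index":
            current = {"site-index": value}
            records.append(current)
        elif current is not None and key in ("href", "type", "title"):
            current[key] = value
        else:
            current = None  # any other directive ends the record

    return records

example = """User-agent: *
Disallow: /private/

Site-index: default
href: http://www.syndic8.com/feedlist.ocs
type: http://InternetAlchemy.org/ocs/directory#
title: Syndic8 feed list in OCS format
"""

print(parse_site_index(example))
```

Existing crawlers that only understand User-agent/Disallow records would simply skip these lines, which is part of the appeal of piggybacking on robots.txt.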
Likewise, I think the plan is to give the default URL semantics that allow for
delegation and alternatives. That default document could well contain links to
many, many other documents related to the site, which would support delegation
while avoiding clutter in the robots.txt file itself.
-Bill Kearney