RE: [syndication] Aggregating and displaying feeds
>Where I'm going with this is of course, "The Daily Me". I'm curious to
>know just how close we can get to this, using the data that's available
>right now and without introducing new elements in the standards or a
>huge push to get people to produce richer feeds.
I'm not as deep into the theory and capabilities as the people on this list
seem to be, but I'm definitely trying to learn -- I love XML and RSS, very
interesting stuff!
On your question: while not nearly as ambitious as what you are looking for,
I tinkered around in my spare time some 8-9 months back with a "personal
portal" app in ASP. The whole thing would run on my machine only (since it
was only for my use) so I had a lot of leeway in how I handled it.
Basically, I took Userland as the basic concept. I built a "poor-man's
aggregator" that ran through a list of feeds every hour and compiled the
feeds into one big XML document (bad idea in hindsight) broken down into
sections. You could point your browser to the ASP script which would read in
the compiled XML data file and pipe it through a stylesheet to produce a web
page. The original feed list lived in an XML file, organized by section, e.g.
one section was "Tech News", one was "International News", etc., and the
compiled data file was broken down by the same sections. The resulting web page
had tabs across the top of the page, one for each section specified in the
original XML file. This was all IE-specific (since it was meant for my
desktop) with a little DHTML to handle the tab switches. So I had a "Tech
News" tab containing only feeds specific to that section, etc.
It was all manually built, i.e. I had to find the feed and add it to the XML
file in whichever section I wanted it to appear.
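For anyone curious, the compile step could be sketched something like this in
Python (a hypothetical modern stand-in for the ASP/WSH version -- the element
names and section layout here are my invention, not what the original used).
Feed contents are passed in as strings; the real thing fetched them over HTTP
on a schedule:

```python
# Sketch of a "poor-man's aggregator": combine several RSS feeds into one
# big XML data file, broken down by section, ready to be piped through a
# stylesheet for display.
import xml.etree.ElementTree as ET

def compile_feeds(feeds_by_section):
    """feeds_by_section maps a section name ("Tech News", ...) to a list
    of RSS documents (as strings) belonging to that section."""
    root = ET.Element("portal")
    for section, feed_docs in feeds_by_section.items():
        sec_el = ET.SubElement(root, "section", name=section)
        for doc in feed_docs:
            channel = ET.fromstring(doc).find("channel")
            feed_el = ET.SubElement(sec_el, "feed",
                                    title=channel.findtext("title", ""))
            for item in channel.findall("item"):
                feed_el.append(item)  # copy each <item> under its feed
    return ET.tostring(root, encoding="unicode")

rss = """<rss version="0.91"><channel><title>Example</title>
<item><title>Hello</title><link>http://example.com/1</link></item>
</channel></rss>"""
compiled = compile_feeds({"Tech News": [rss]})
print(compiled)
```

The browser-facing script then only has to load that one compiled file and
transform it -- the fetching and the displaying stay completely separate.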
The aggregator was supposed to run on a schedule, so I had a Windows Script
Host file set up in the Windows Task Scheduler, and set it to read in the
feed list and pull in the data to compile the XML data file. However, for
some stupid reason (can't remember now) I couldn't get it to run on a
schedule, so it kind of defeated the whole purpose of having the
most-current news at all times...
Anyway, it was a play-in-my-spare-time project at work, mainly as an
exercise to play with XML/XSL and RSS (and the XMLHTTP objects in the MSXML
parser), and while I would consider it definitely beta code, it worked. I
was working on expanding it to handle RSS 1.0 and scriptingNews formats
(only because Tomalak's Realm still uses that) but ran out of steam and
wound up working on other projects. It's been so long now I don't think I
even have any of it left anymore. :(
In hindsight, I probably should have simply downloaded the RSS files into
their own folder and just parsed them, or maybe a separate folder for each
section, or something similar. For something more scalable, a database
solution would be ideal.
Question: in your database, how would you store RSS data? That would be a
new task to me, and I'd like to know what the "accepted best practice" is.
Sounds like you are storing individual items as unique records tied to title
and url as primary keys? Assuming you have a separate table (also keyed off
title and url) that holds the "meta" information on the feed, such as
author, date, etc? And why not just key off the url, since it has to be
unique?
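For what it's worth, here is one way the schema you seem to be describing
might look, sketched with SQLite (all table and column names are my guesses,
not what you actually have): a feeds table holding the per-feed "meta"
information, and an items table keyed off the item URL alone, on the
assumption that each item's link is unique:

```python
# Hypothetical schema for storing aggregated RSS data: one table for feed
# metadata, one for individual items keyed off the item's URL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE feeds (
    feed_url  TEXT PRIMARY KEY,   -- the RSS file's own URL
    title     TEXT,
    author    TEXT,
    last_seen TEXT                -- when the aggregator last fetched it
);
CREATE TABLE items (
    url       TEXT PRIMARY KEY,   -- keyed off url alone, assuming it's unique
    feed_url  TEXT REFERENCES feeds(feed_url),
    title     TEXT,
    pub_date  TEXT
);
""")
conn.execute("INSERT INTO feeds (feed_url, title) VALUES (?, ?)",
             ("http://example.com/rss.xml", "Example Feed"))
# Seeing the same item again on the next fetch is a no-op thanks to the key:
for _ in range(2):
    conn.execute(
        "INSERT OR IGNORE INTO items (url, feed_url, title) VALUES (?, ?, ?)",
        ("http://example.com/1", "http://example.com/rss.xml", "Hello"))
print(conn.execute("SELECT COUNT(*) FROM items").fetchone()[0])
```

The appeal of keying off the url alone is exactly that dedup-on-refetch
behavior; the title+url compound key would only matter if the same link could
legitimately appear twice with different titles.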
Thanks,
-dave
________________________________________________________
SSgt Dave Cantrell, USAF
Web Developer, Logistics Information Systems
[DSN] 596.6277 [COM] 334.416.6277
dave.cantrell@gunter.af.mil
https://web2.ssg.gunter.af.mil/IL (.mil/.gov only)
--------------------------------------------------------
If it's not useful or necessary, free yourself
from imagining that you need to make it.
-- http://c2.com/cgi/wiki?ShakerQuote
--------------------------------------------------------
This e-mail does not constitute endorsement of any
product by the U.S. Air Force, nor can it be used to
obligate the U.S. Air Force in any legal, financial,
or contractual arrangement.