__/ [ Larry ] on Thursday 16 March 2006 13:18 \__
> I have just started using Debian but have been around computing for 20 years.
> What I want to do is capture the market data my broker feeds to me via
> their trading page. The data is not streaming, but changes fairly often.
> The web page containing the data is constantly updating, and I want to be
> able to take a snapshot about 20 times per hour of the "put/call" prices
> at specified times so I can have a historical timeline.
> The broker does not provide historical/daily/minute updates in a file
> format I can use so it is up to me to capture this data and massage it into
> a format I can use.
Scrape it, then interpret and process it.
,----[ Command ]
| man wget
`----
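A minimal sketch of the capture side, assuming a placeholder URL (the broker's
page may require cookies or a login session, which wget can also handle; see
the manual):

```shell
#!/bin/sh
# Save one timestamped snapshot per run; schedule it from cron every
# three minutes for the ~20 captures per hour you want:
#   */3 * * * * /home/larry/snap-quotes.sh
URL="http://example.com/trading/quotes.html"   # placeholder, not the real page
OUTDIR="$HOME/quotes"

stamp_name() {
    # e.g. quotes-20060316-131800.html -- names sort chronologically
    echo "quotes-$(date +%Y%m%d-%H%M%S).html"
}

snap_quotes() {
    mkdir -p "$OUTDIR"
    wget -q -O "$OUTDIR/$(stamp_name)" "$URL" ||
        echo "wget failed; check the URL or your connection" >&2
}

snap_quotes
```

Each run leaves one dated HTML file behind, so the timeline is simply the
directory listing.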
If the data is provided in the form of feeds, you might also wish to parse it
using tools from the RSS 'family' rather than scraping the raw HTML.
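For instance, a rough shell sketch that lists the <title> of each <item> in a
feed (the feed URL is hypothetical, and sed is used only for illustration; a
proper XML parser copes better with odd markup):

```shell
# Hypothetical feed URL; substitute whatever the broker actually publishes.
FEED="http://example.com/quotes.rss"

feed_titles() {
    # Read an RSS document on stdin, print one <item> <title> per line.
    tr -d '\n' |
        sed 's/<item>/\
<item>/g' |
        sed -n 's/.*<item>.*<title>\([^<]*\)<\/title>.*/\1/p'
}

# Usage: wget -qO- "$FEED" | feed_titles
```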
> Several years ago I wrote some "c" file utilities that would parse out
> bad data from downloads but I don't think the above described application
> would be the same as this was kinda pre internet....still using Bulletin
> Boards back then with a 300 baud modem. Now, we have html,xml, etc and I
> was wondering how to get started capturing this information.
> I wish I could find a macro, as you see I don't know what is available but
> I have googled to death.
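As a starting point for the capturing/massaging step, here is a sketch that
pulls the two prices out of a saved snapshot and emits a CSV line. The
<td class="put"> / <td class="call"> markup is purely hypothetical -- inspect
the real page source and adjust the patterns:

```shell
# Turn one saved snapshot into a CSV record: timestamp,put,call
# Assumes (hypothetically) the page marks prices up as
#   <td class="put">1.25</td> <td class="call">1.40</td>
to_csv() {
    # $1 = snapshot file, $2 = timestamp to record
    put=$(sed -n 's/.*<td class="put">\([0-9.]*\)<\/td>.*/\1/p' "$1")
    call=$(sed -n 's/.*<td class="call">\([0-9.]*\)<\/td>.*/\1/p' "$1")
    echo "$2,$put,$call"
}

# Usage: to_csv quotes-20060316-131800.html 20060316-131800 >> prices.csv
```

Run it over each snapshot and prices.csv becomes the historical timeline,
ready for a spreadsheet or your own C utilities.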
Let me know if you need further help; I have some useful scripts.
> Thanks in advance for any suggestions.
> Birmingham, Al.
Hope it helps,
Roy S. Schestowitz | Download Othello: http://othellomaster.com
http://Schestowitz.com | SuSE Linux ¦ PGP-Key: 0x74572E8E
6:35pm up 9 days 11:12, 7 users, load average: 0.22, 0.48, 0.43
http://iuron.com - Open Source knowledge engine project