"Roy Schestowitz" <newsgroups@xxxxxxxxxxxxxxx> wrote in message
> __/ [Paul H] on Monday 09 January 2006 10:38 \__
>> "Roy Schestowitz" <newsgroups@xxxxxxxxxxxxxxx> wrote in message
>>> __/ [Paul H] on Friday 06 January 2006 16:02 \__
>>>> I have over 100 external links on my website, and I need to check which
>>>> of them link back to me. I need to do this regularly.
>>>> Is there a software tool that does this?
>>> * Technorati.com enables you to do this almost in real time. Output comes
>>> in the form of Web pages or RSS feeds.
>> Couldn't get my head round that. On the about page it
>> says.."Technorati is the authority on what's going on in the world of
>> weblogs." ??
> If the world of Weblogs is all about /links/, then yes. Tagging is also
> something that they dominate; they are the best, bar none, in that area.
> Since a new blog is created every second nowadays, their capacity is
> stretched, though, and they can't cope with the more obscure sites that
> link heavily for SEO purposes.
>>> * Try 'link:<your_site_address>' in Yahoo search or Google search. You
>>> can pull results in the form of RSS feeds from both.
>>> * Use one of a variety of meta search engines at http://gada.be/ to keep
>>> track of items that identify your site. Many links are included and
>>> delivered in RSS form.
>>> * Look at your referral logs and see what comes up. Valuable links
>>> tend to lead actual visitors (traffic) to your site.
>>> I mentioned RSS quite often because you sought a software tool. Use an
>>> RSS reader (e.g. RSSOwl, Thunderbird, or the Web-based FeedLounge or
>>> Google Reader). This means that you will have a comfortable environment
>>> for keeping track of all this any time, anywhere. You can also aggregate
>>> results from a variety of distinct sources.
>>> Hope this helps,
>>> Roy S. Schestowitz | make install -not war
>>> http://Schestowitz.com | SuSE Linux | PGP-Key: 0x74572E8E
>>> 4:35pm up 26 days 23:46, 12 users, load average: 0.63, 0.43, 0.20
>>> http://iuron.com - next generation of search paradigms
>> After looking at some of the solutions I now know *exactly* what I need.
>> I currently have a links page with hundreds of links on it that has never
>> been maintained; links have been added and never checked for a reciprocal
>> link. I want to be able to scan every domain that is linked to from my
>> links page and check if they link back to me.
>> So, for example, if I have a link to http://www.bananas.com on my site, I
>> want to scan the entire bananas.com website for a reciprocal link. Is
>> this possible?
>> Thanks Roy,
> There are commercial tools for doing that, if I remember correctly. But
> how deep need you go in this voyage for that reciprocal link? People tend
> to move links around, if not remove them altogether. You don't want to
> break 'link pacts' in vain.
> Will you be willing to crawl an entire site and, if so, how would your
> 'link partner' feel about this? How often will you run such link checks?
> In principle, one could do this rather simply, without any shrink-wrapped
> bloatware.
> Firstly, to check that all outgoing links are 'alive', run your links
> page through a link checker and look at the summary. Then, what you need
> to do is descend into each linked site -- the external links, that is.
> One tool (among others) for the job is wget. You could download just the
> referred page or even fetch the entire site by following links
> recursively.
> You could then run a scanner like fgrep or grep on the files (similar
> front-end tools are available for Windows, albeit at a cost). You should
> then attempt to find your domain name anywhere in the site, which is
> now mirrored locally. Something like:
> fgrep -R "mysite.com" *
> The scanner would tell you where the links reside, if anywhere. You can
> run this in batch mode, automatically going through your full list of
> links, and then schedule it as a cyclic job (e.g. under UNIX cron).
> Hope it helps,
> Roy S. Schestowitz
> http://Schestowitz.com | SuSE Linux | PGP-Key: 0x74572E8E
> 3:35pm up 29 days 22:46, 14 users, load average: 0.79, 0.95, 0.80
> http://iuron.com - next generation of search paradigms
Thanks again for your time and wisdom. I have opted to go with Bloatware ;O)
I have been using the SEO Studio demo (http://www.trendmx.com/) and it
appears to be able to manage links pretty well. However, taking this route
does mean I will trash all my existing links and start from scratch.
The alternative appears to be going through them all manually or taking a
crash course in Linux or Perl, neither of which appeals.
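For anyone else reading this thread, Roy's wget-and-grep recipe can be
sketched as a short shell script. The file name links.txt (one URL per
line) and the crawl depth of 2 are my own assumptions, not from the
thread:

```shell
#!/bin/sh
# Sketch of the approach described above: mirror each partner site
# locally with wget, then grep the mirror for our own domain name.
# Assumptions: links.txt holds one URL per line; MYDOMAIN is the
# string we expect to appear in a reciprocal link.
MYDOMAIN="mysite.com"
LINKS="links.txt"

if [ -f "$LINKS" ]; then
    while read -r url; do
        [ -n "$url" ] || continue
        # Derive a per-site directory name from the host part of the URL
        host=$(printf '%s\n' "$url" | sed -e 's|^[a-z]*://||' -e 's|/.*$||')
        # Mirror the partner site; --level=2 keeps the crawl shallow/polite
        wget --quiet --recursive --level=2 --directory-prefix="$host" "$url"
        # Scan the local mirror for any mention of our domain
        if grep -Rq "$MYDOMAIN" "$host"; then
            echo "RECIPROCAL: $url"
        else
            echo "MISSING:    $url"
        fi
    done < "$LINKS"
fi
```

To make it the cyclic job Roy mentions, a crontab entry along the lines
of `0 4 * * 0 /home/me/check-links.sh > backlink-report.txt` (path
hypothetical) would run it weekly.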