Qbasicnews.com

QbasicNews.Com => Challenges => Topic started by: anarky on May 16, 2005, 10:01:47 PM



Title: Multi-domain search spider.
Post by: anarky on May 16, 2005, 10:01:47 PM
Here's an idea inspired by the need to house a million files while only having a small space to do it.

A search engine which meets the following requirements:

- The search interface can be used on any site and styled with CSS to suit that site, or modified manually if no CSS exists.
- Have one central point (in this case www.qbasicnetwork.com because I want it on my site) like any search engine.
- The ability to allow users to submit a folder of their site to the search engine.
- The ability to crawl the registered folders of each site in the list and update a MySQL database at the central hub every 24/48 hours (a rough sketch of this step follows this list).
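
To make that last point concrete, here is a minimal sketch of the crawl-and-sync step, assuming each member site exposes its registered folder as a plain HTTP directory listing and the hub accepts index updates at a hypothetical /update endpoint. Python is used only for illustration, and every name here (the hub URL, the endpoint, the site id) is an assumption, not part of the proposal.

import json
import urllib.request
from html.parser import HTMLParser

HUB_UPDATE_URL = "http://www.qbasicnetwork.com/search/update"  # hypothetical endpoint

class LinkCollector(HTMLParser):
    """Pull href values out of a server-generated directory listing page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and not value.startswith("?") and value != "../":
                    self.links.append(value)

def crawl_folder(folder_url):
    """Return the file names visible in a registered folder's directory listing."""
    with urllib.request.urlopen(folder_url, timeout=30) as resp:
        page = resp.read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(page)
    return collector.links

def push_to_hub(site_id, folder_url, files):
    """Send the crawled file list to the central hub as JSON."""
    payload = json.dumps({"site": site_id, "folder": folder_url, "files": files}).encode()
    req = urllib.request.Request(HUB_UPDATE_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.status == 200

if __name__ == "__main__":
    # One registered site shown here; the hub would loop over every registered
    # folder on its 24/48 hour schedule.
    folder = "http://example-member-site.com/qb-files/"
    push_to_hub("example-member-site", folder, crawl_folder(folder))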

When a search query is sent, the query goes back to the hub and the databases are searched. The results would be displayed on the site the query was sent from, styled to match that site or not, depending on the site's admin.
- If a site goes offline, its results and links are still displayed, but marked as offline, with a reason.
- The reason for being offline would be supplied by the file host when they conduct maintenance (a rough sketch of this query flow follows below).

Any changes will be updated within 24/48 hours.
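
Here is a minimal sketch of how the hub might answer a query and keep offline sites visible. sqlite3 stands in for the MySQL database mentioned above so the example runs self-contained; the table layout and the offline/reason columns are assumptions for illustration only.

import sqlite3

def open_demo_index():
    """Build a tiny in-memory index: one online site, one offline with a reason."""
    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE sites (id TEXT PRIMARY KEY, base_url TEXT,
                            offline INTEGER DEFAULT 0, reason TEXT);
        CREATE TABLE files (site_id TEXT, name TEXT, path TEXT);
    """)
    db.executemany("INSERT INTO sites VALUES (?, ?, ?, ?)", [
        ("siteA", "http://site-a.example/qb/", 0, None),
        ("siteB", "http://site-b.example/files/", 1, "server maintenance"),
    ])
    db.executemany("INSERT INTO files VALUES (?, ?, ?)", [
        ("siteA", "gfxlib.bas", "gfxlib.bas"),
        ("siteB", "gfxdemo.zip", "demos/gfxdemo.zip"),
    ])
    return db

def search(db, term):
    """Return matches; offline sites still appear, flagged with their reason."""
    rows = db.execute("""
        SELECT f.name, s.base_url || f.path, s.offline, s.reason
        FROM files f JOIN sites s ON s.id = f.site_id
        WHERE f.name LIKE ?
    """, ("%" + term + "%",)).fetchall()
    results = []
    for name, url, offline, reason in rows:
        status = "OFFLINE ({})".format(reason) if offline else "online"
        results.append({"file": name, "link": url, "status": status})
    return results

if __name__ == "__main__":
    db = open_demo_index()
    for hit in search(db, "gfx"):
        print(hit)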

Pros:
- Eliminates storage problems
- The database is never completely offline for maintenance
- Everyone contributes

Cons:
- Possible permission problems
- May not be possible due to a number of factors, but perhaps these can be worked around?

Don't ask me about a prize, I have nothing to offer. I'd just like to see it done, since I have all these files and nowhere to put them.

If I am not being clear enough, let me know.

>anarky


Title: Multi-domain search spider.
Post by: anarky on May 23, 2005, 08:44:01 AM
No suggestions?

>anarky


Title: Multi-domain search spider.
Post by: MystikShadows on May 23, 2005, 09:08:07 AM
Oh, I think you're clear enough, yes. Because of the nature of multi-domain spidering, it probably could be done, as long as you know what the "securities" are on each domain where the files are hosted and code things accordingly.


Title: Multi-domain search spider.
Post by: anarky on May 25, 2005, 09:43:04 AM
Well, that's all included, but for the network to crawl a site, the admin must register a folder and set permissions on their server accordingly.
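
A minimal sketch of that registration step, assuming the hub simply checks that the folder is publicly readable over HTTP before adding it to a hypothetical registry file; the file name and the check itself are illustrative only.

import json
import urllib.error
import urllib.request

REGISTRY_FILE = "registered_folders.json"  # hypothetical hub-side registry

def folder_is_readable(folder_url):
    """True if the folder answers with HTTP 200, i.e. permissions allow crawling."""
    try:
        with urllib.request.urlopen(folder_url, timeout=15) as resp:
            return resp.status == 200
    except urllib.error.URLError:
        return False

def register_folder(site_id, folder_url):
    """Add a folder to the crawl registry once it passes the permission check."""
    if not folder_is_readable(folder_url):
        raise ValueError("Folder is not publicly readable; check server permissions.")
    try:
        with open(REGISTRY_FILE) as fh:
            registry = json.load(fh)
    except FileNotFoundError:
        registry = {}
    registry[site_id] = folder_url
    with open(REGISTRY_FILE, "w") as fh:
        json.dump(registry, fh, indent=2)

if __name__ == "__main__":
    register_folder("example-member-site", "http://example-member-site.com/qb-files/")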

>anarky