Poll
Question: Good idea?  (Voting closed: May 23, 2005, 10:01:47 PM)
Yes - 4 (66.7%)
No - 2 (33.3%)
Total Voters: 6

Author Topic: Multi-domain search spider.  (Read 3405 times)
anarky
Been there, done that
*****
Posts: 1231


The Blobworld Comics King


« on: May 16, 2005, 10:01:47 PM »

Here's an idea inspired by the need to house a million files while only having a small space to do it.

A search engine that meets the following requirements:

- The search interface can be used on any site and formatted with CSS to suit that site, or styled manually if no CSS exists.
- It has one central point (in this case www.qbasicnetwork.com, because I want it on my site), like any search engine.
- Users can submit a folder of their site to the search engine.
- Each site in the list indexes those folders into a MySQL database and pushes the updates to the central hub every 24-48 hours (a sketch of that job is below).
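
To give an idea of the sync step, here's a minimal sketch of the job a member site might run from cron. Python is used purely for illustration, and the hub URL, folder path, site name, and record fields are all made up:

Code:
# Hypothetical nightly sync job a member site would run (all names illustrative).
import hashlib
import json
import os
import urllib.request

HUB_URL = "http://www.qbasicnetwork.com/hub/update"  # assumed hub endpoint
SHARED_FOLDER = "/var/www/files"                     # the folder the admin registered

def build_index(folder):
    """Walk the registered folder and collect basic metadata for each file."""
    records = []
    for root, _dirs, names in os.walk(folder):
        for name in names:
            path = os.path.join(root, name)
            with open(path, "rb") as f:
                digest = hashlib.md5(f.read()).hexdigest()
            records.append({
                "path": os.path.relpath(path, folder),
                "size": os.path.getsize(path),
                "md5": digest,  # lets the hub spot files that changed
            })
    return records

def push_to_hub(records):
    """POST the index to the central hub; run this every 24-48 hours via cron."""
    body = json.dumps({"site": "example.qbasic.site", "files": records}).encode()
    req = urllib.request.Request(HUB_URL, data=body,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

if __name__ == "__main__":
    push_to_hub(build_index(SHARED_FOLDER))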

When a search query is sent, the query goes back to the hub and the databases are searched. The results are displayed on the site the query was sent from, in that site's style or not, depending on the site's admin.
- If a site goes offline, its results and links are still displayed, but marked as offline, with a reason.
- The reason for being offline is supplied by the file host when they conduct maintenance.

Any changes are propagated within 24-48 hours.
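
And a minimal sketch of the hub side answering a query. The post calls for MySQL; sqlite3 stands in here only so the snippet is self-contained, and the files table, its columns, and the URL layout are invented:

Code:
# Hypothetical hub-side query handler (table and column names are invented;
# sqlite3 stands in for the MySQL database described above).
import sqlite3

def search(query, db_path="hub.db"):
    """Match the query against the merged index and flag offline sites."""
    con = sqlite3.connect(db_path)
    rows = con.execute(
        "SELECT site, path, online, offline_reason FROM files WHERE path LIKE ?",
        ("%" + query + "%",))
    results = []
    for site, path, online, reason in rows:
        results.append({
            "url": "http://%s/%s" % (site, path),
            "online": bool(online),
            # offline entries stay in the results, with the host's reason
            "note": None if online else "offline: %s" % reason,
        })
    con.close()
    return results

The site that sent the query would then render these results in its own CSS, which covers the styling point above.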

Pros:
- Eliminates storage problems
- The database is never completely offline for maintenance
- Everyone contributes

Cons:
- Possible permission problems
- May not be feasible for a number of reasons, but perhaps those can be worked around?

Don't ask me about a prize; I have nothing to offer. I'd just like to see it done, since I have all these files and nowhere to put them.

If I am not being clear enough, let me know.

>anarky
Logged

Screwing with your reality since 1998.
anarky
Been there, done that
*****
Posts: 1231


The Blobworld Comics King


« Reply #1 on: May 23, 2005, 08:44:01 AM »

No suggestions?

>anarky
Logged

Screwing with your reality since 1998.
MystikShadows
Ancient Guru
****
Posts: 542



« Reply #2 on: May 23, 2005, 09:08:07 AM »

Oh, I think you're clear enough, yes. Given the nature of multi-domain spidering, it probably could be done, as long as you know what the security restrictions are on each domain where the files are hosted and code things accordingly.
Logged

When they say it can't be done, THAT's when they call me ;-).




Need hosting: http://www.jc-hosting.net
All about ASCII: http://www.ascii-world.com
anarky
Been there, done that
*****
Posts: 1231


The Blobworld Comics King


« Reply #3 on: May 25, 2005, 09:43:04 AM »

Well, that's all included, but for the network to crawl a site, the admin must register a folder and set permissions on their server accordingly. Something like this, say:
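
Code:
# Hypothetical registration record the hub keeps per site (fields invented).
REGISTRATIONS = {
    "example.qbasic.site": {
        "folder": "/files/",        # only this subtree may be crawled
        "last_sync": "2005-05-25",
    },
}

def may_crawl(site, path):
    """The spider only touches paths under the folder the admin registered."""
    reg = REGISTRATIONS.get(site)
    return reg is not None and path.startswith(reg["folder"])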

>anarky
Logged

Screwing with your reality since 1998.