Web Hosting
the main problem

Yesterday my site was down for an entire day because of a server failure at Verio (my web host). I've had the same service for about 7 years now (two with Highway and, from then on, Verio, which bought them out). All was well till Hurricane Wilma in 2005 killed my site (hosted in Boca Raton, FL). I believe it was transferred to Chicago afterward. Anyway, it's been offline now and then for hours at a time, and the stats have often been difficult or impossible to access. Recently, my stats were completely lost for a week and weren't recovered. Now, my entire site has gone down for more than a day. I plan to move to 1&1 as a cheaper alternative (though I've been reading their "sucks" pages on the net). I figure I might as well pay $4.99 for bad service instead of $24.95 for the same bad service.

Is there a solution to this basic problem?

You bet your sweet bippy there is. But it needs a fix at the internet router level ... some new rules.

Here's how it goes. I get a second site ... same as the first site but on another company's server. When one goes down (even temporarily), the router sends the page request to the other site till the primary site is up again ... then resumes taking pages from the primary site.
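In rough Python, the rule I want routers to follow looks something like this (just a sketch ... the addresses are placeholders I made up, and nothing here reflects what routers actually do today):

```python
# A sketch of the fallback rule described above; the host names are
# placeholders, not real sites, and this is the behavior the article is
# asking for, not anything that exists today.
import urllib.request
import urllib.error

PRIMARY   = "http://primary.example"     # the designated primary copy
SECONDARY = "http://secondary.example"   # the identical copy on another company's server

def fetch(path):
    """Ask the primary first; if it never answers, quietly ask the secondary."""
    try:
        with urllib.request.urlopen(PRIMARY + path, timeout=5) as resp:
            return resp.read()
    except urllib.error.HTTPError:
        raise   # the primary answered, even if with an error page ... no fallback
    except (urllib.error.URLError, TimeoutError):
        # The primary never answered ... retry the same page on the secondary.
        with urllib.request.urlopen(SECONDARY + path, timeout=5) as resp:
            return resp.read()
```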

Web sites are cheap ... like 5 dollars per month. It would be a no-brainer to make a contract with two of them. One would be designated "Primary" and the other "Secondary" or ... P and S. My domain name would be something like http://ebtx.p.com and my secondary site would be ... what?? ... http://ebtx.s.com ... wow, my head spins ... such difficult-to-understand concepts.

The router, for its part, would have to know if a site were down (you're sent a message to that effect anyway) and, upon discovering that a primary site was down, would resend the request to the secondary site if one were available. Any such available secondary would have to be in the router's database.

Remember, the two sites would be identical except for the "p" and the "s" designation ... and ... any pages that the webmaster failed to duplicate on the secondary site ... which will, of course, happen all the time due to the lethargy of all webmasters. When the router can't find the page on the secondary site ... it sends the error message that you normally get when your requested page is unavailable.
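Put together, a router following these rules would do something along these lines (again a sketch ... the "database" is just a table of P-to-S pairs I'm assuming, and "down" here simply means the host won't even accept a connection):

```python
# Sketch of the router-side rule: look the primary up in the database of
# registered P/S pairs, probe whether it's alive, and if not, re-send the
# request to the secondary.  A page missing from the secondary comes back
# as the ordinary "not found" error, like any other missing page.
# (The table contents and the health-check method are assumptions of mine.)
import socket
import urllib.request
import urllib.error

# Hypothetical router database: each registered primary and its secondary.
SECONDARY_FOR = {
    "ebtx.p.com": "ebtx.s.com",
}

def host_is_up(host, port=80, timeout=3):
    """Crude health check: will the host even accept a connection?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def route(host, path):
    """Serve from the primary; fall back to its registered secondary if it's down."""
    target = host
    if not host_is_up(host) and host in SECONDARY_FOR:
        target = SECONDARY_FOR[host]
    try:
        with urllib.request.urlopen(f"http://{target}{path}") as resp:
            return resp.status, resp.read()
    except urllib.error.HTTPError as err:
        # The webmaster never copied this page over (or it's just plain missing):
        # hand the usual "page not found" error straight back to the requester.
        return err.code, err.read()
```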

If such a system were in place, just about every serious site would be duplicated, and the downtimes and screwups of hosting services would be covered ... for the most part. It would be quite rare for both to be down. You might end up with 99.9999% uptime ... that is ... down one day in a million.
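The arithmetic behind that figure, assuming the two hosts fail independently and each one is down about 0.1% of the time (a figure I'm assuming just for the sake of argument):

```python
# Back-of-the-envelope availability: assumes independent failures and an
# assumed 0.1% downtime per host (roughly nine hours a year each).
p_one_down  = 0.001               # either host, on its own
p_both_down = p_one_down ** 2     # 0.000001 ... one part in a million
print(f"uptime: {1 - p_both_down:.4%}")   # uptime: 99.9999%
```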

Why can't the internet allow for automatic redundancy of this sort?