List: General Discussion
From: Sasha Pachev
Date: June 7, 1999, 5:20pm
Subject: Re: Is MySql strong enough ?
Luigi Giacobbe wrote:
> 
> Hello,
> 
> I am new to MySQL, so ... apologies if my question is stupid!
> 600 LANs with 8 PCs each (4800 computers) must be connected to the Internet
> through our infrastructure.
> They must use a proxy with a banned-sites list. The list currently contains
> about 64,000 entries and grows each month.
> To solve our problem, we use the SQUID proxy with a redirector to do the
> filtering.
> We would like to put the list in a MySQL database and have the redirector
> query the DB to see whether a site is banned.
> From our experience, 10,000 requests per minute are expected.
> With a good network design, it is possible to reduce the number of requests
> to 7,500 (caching).
> To optimize the DB, we translate the URL to a long with the C strtol function,
> but ... what kind of machine do we need, given that this means 7,500
> connections per minute, each with a select request?
> 
> Any advice?
> 
> Luigi Giacobbe
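(A side note on the strtol idea: strtol only parses leading digits, so most URLs would map to the same value. A checksum such as CRC-32 gives a usable integer key for an indexed column instead. A minimal sketch in Python; the function name url_key and the collision-handling advice in the comment are my own, not something the poster described:)

```python
import zlib

def url_key(url):
    """Map a URL to a 32-bit integer key, suitable for an indexed
    integer column.

    Different URLs can collide on a 32-bit checksum, so the table
    should still store the full URL and the redirector should compare
    it after the indexed lookup narrows the candidates down.
    """
    return zlib.crc32(url.encode("utf-8")) & 0xFFFFFFFF

# The key is deterministic, so it can be computed once at insert time
# and again at lookup time.
print(url_key("http://example.com/banned"))
```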

The question here is really not so much about MySQL being fast enough,
because it is faster for simple selects than any database I know of, but
more about proper database design, MySQL configuration, and a smart
choice of hardware. Here is how I would approach the problem:

- build a modest system, maybe use something you already have or
something that will not cost you a lot
- install MySQL on it
- set up your filtering software with hooks to MySQL
- benchmark it and see the maximum number of requests it can handle
- if it is not good enough, optimize your tables/queries, then fiddle
with MySQL buffer sizes and other parameters
- if it is still not good enough, upgrade the RAM, possibly to the size
of the entire database of URLs, or as much as you can get
- if it is still not good enough because you are out of RAM and cannot
get any more, upgrade your disks to the fastest you can get. I doubt
this will be the bottleneck, since you need only about 100 bytes per
URL, so 100,000 URLs will fit into 128 MB of RAM without much difficulty
- if you see very little disk activity at peak times, but things are
still slow, upgrade the CPU
- if you still have too much load to deal with, consider a
duplicate-server solution; it would be easy to implement since the
database is mostly static, but I doubt you would have to go that far
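To get a rough feel for the benchmark step, here is a sketch that times a burst of lookups, using an in-memory dict as a stand-in for the indexed SELECT (the real test would of course issue queries through the MySQL client library; the names and numbers here are illustrative, sized to the ~100,000-URL estimate above):

```python
import time

# Stand-in for the banned-URL table: roughly 100,000 entries.
banned = {"http://example%d.com/" % i: 1 for i in range(100000)}

def is_banned(url):
    # In the real setup this would be an indexed lookup, e.g.
    #   SELECT 1 FROM banned_urls WHERE url_key = ... AND url = ...
    # Here we just probe the dict.
    return url in banned

# Time a burst of lookups (about half hits, half misses) and report
# the rate in requests per minute, the unit used in the question.
n = 100000
start = time.time()
for i in range(n):
    is_banned("http://example%d.com/" % (i % 200000))
elapsed = time.time() - start
print("%.0f lookups/minute" % (n / elapsed * 60))
```

The point of the exercise is the shape of the measurement, not the number: run the same loop against the real server under realistic concurrency and compare the result to the 7,500 requests/minute target.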

And as a personal opinion, here is how I would solve the problem: if a
web server contains objectionable content, drop all IP traffic to and
from it on the router/firewall. Some people will complain, of course,
but here is what I think: if a web admin tolerates one piece of crap on
his server, he will probably feel OK about more of it, and it just gets
too expensive to keep track of everything, so tell him that he either
needs to get his act together or you just will not forward any traffic
to him. Rather radical, but I like radical solutions. 
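For what it is worth, the drop amounts to a pair of filter rules; a sketch assuming a Linux router running ipchains (the address is a placeholder, not a real offender):

```shell
# Drop all forwarded traffic to and from the offending server.
# 198.51.100.7 is a placeholder address.
ipchains -A forward -s 198.51.100.7 -j DENY
ipchains -A forward -d 198.51.100.7 -j DENY
```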

-- 
Sasha Pachev
http://www.sashanet.com/ (home)
http://www.direct1.com/ (work)