[Date Prev][Date Next]   [Thread Prev][Thread Next]   [Thread Index] [Date Index] [Author Index]

Re: firewall url filter



Bruno Wolff III wrote:
On Fri, Jan 23, 2009 at 00:08:28 +1030,
  Tim <ignored_mailbox yahoo com au> wrote:
On Thu, 2009-01-22 at 09:38 +0100, roland wrote:
The client wants to prevent users from connecting to sex sites.

Can I use the fedora-box as a firewall, filtering several URLs or filtering several keywords?
You can do that sort of thing.  A simplistic overview of how is:

Use the firewall to block browsers from connecting directly to any
website (i.e. all outgoing connections to port 80).  That'll stop nearly
all web browsing, other than sites on unusual ports.  It's not a
100% catch-all, but probably 99%.
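A minimal sketch of that kind of port-80 block with iptables, assuming the Fedora box routes for the LAN and eth0 is the LAN-facing interface (interface names here are assumptions, adjust to your setup):

```shell
#!/bin/sh
# Refuse to forward any LAN traffic headed for TCP port 80,
# so client browsers cannot reach web servers directly.
iptables -A FORWARD -i eth0 -p tcp --dport 80 -j REJECT
# Port 443 can be blocked the same way if https must also be stopped:
iptables -A FORWARD -i eth0 -p tcp --dport 443 -j REJECT
```

This only blocks the well-known ports; as noted above, sites served on unusual ports slip through unless you default-deny.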

That doesn't catch https connections. Of course, the firewall wouldn't
be able to check URLs in that case anyway.

Depending on the requirements it may be best to block all direct access
to the outside from the clients machines and only allow access through
a proxy.
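One common way to force everything through a proxy running on the same box is an iptables REDIRECT rule; a sketch, assuming Squid listens on its default port 3128 and eth0 faces the LAN:

```shell
#!/bin/sh
# Intercept web traffic arriving from the LAN and hand it to the
# local Squid instance instead of letting it out directly.
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3128
# Belt and braces: refuse anything that still tries to leave on port 80.
iptables -A FORWARD -p tcp --dport 80 -j REJECT
```

Squid then needs to be told it is intercepting (the `transparent` option on its `http_port` line in the versions current at the time).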

If there is a known set of web pages they should have access to, then a
whitelist can be used to allow connections only to those web sites. If not,
trying to block undesirable sites isn't an easy problem to solve in
general.
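In Squid, that whitelist approach is only a couple of ACL lines; a sketch (the file name /etc/squid/whitelist.txt is an assumption, one allowed domain per line):

```
# /etc/squid/squid.conf (fragment)
acl allowed_sites dstdomain "/etc/squid/whitelist.txt"
http_access allow allowed_sites
http_access deny all
```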

If said firewall is the mentioned Fedora server, that shouldn't be a problem. Squid itself has a lot of tools (redirectors, in Squid-speak) that can handle content filtering, which is what's wanted from the reading. If the Fedora server isn't the firewall, run Squid on it anyway and redirect all http/https traffic to it, so it acts as a transparent proxy on the network.
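Hooking a redirector into Squid is essentially one line in squid.conf; a sketch using squidGuard (paths are typical Fedora ones but an assumption here, and older Squid versions call the directive `redirect_program`):

```
# /etc/squid/squid.conf (fragment)
url_rewrite_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
url_rewrite_children 5
```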

Squidguard is what I use; it offers some good blacklists, though keeping them updated can be a chore (it is amazing how big the 'adult' database is), and finding scripts to keep them updated is a little hard at times. I have a rather poor shell script I hacked from a different site (a how-to had a decent but ancient update script, which I brought sort of up to speed) that I am porting to perl. I don't know perl well, so I am sure it could be done even better; besides, I am stuck on how I want to diff the text files before compiling them into the binary files that the content filter uses. These scripts do a good job of keeping things updated. I can provide that script if asked.
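For reference, the squidGuard side is a config file pointing at the compiled blacklist databases; a minimal sketch (the paths and the 'adult' category name follow the common blacklist layout, but yours may differ):

```
# /etc/squid/squidGuard.conf (fragment)
dbhome /var/squidGuard/blacklists
logdir /var/log/squidGuard

dest adult {
    domainlist adult/domains
    urllist    adult/urls
}

acl {
    default {
        pass !adult all
        redirect http://localhost/blocked.html
    }
}
```

After updating the text lists, `squidGuard -C all` rebuilds the binary databases, which is the compile step mentioned above.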

About https, that can be a little trickier. The only good HTTPS content filtering I have ever seen was done either by blocking access to https entirely (which is a bad thing) or with some commercial proxies that can do it (free versus 20-35k is hard to justify).
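Short of one of those commercial boxes, Squid can at least filter https by the hostname in the browser's CONNECT request, since that part isn't encrypted; a sketch (the blocklist file name is an assumption, and this only works when browsers are configured to use the proxy explicitly, not for transparently intercepted https):

```
# /etc/squid/squid.conf (fragment)
acl SSL_ports port 443
acl CONNECT method CONNECT
acl bad_ssl_sites dstdomain "/etc/squid/blocked-ssl.txt"
http_access deny CONNECT bad_ssl_sites
http_access deny CONNECT !SSL_ports
```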


