The most common means of implementing a Web-filtering system is to set up a server at the customer's site that contains the categorized URL database and the user policies, and to configure the customer's firewall to divert all URL requests to this server for processing. While seemingly straightforward, this approach has a number of disadvantages.
First, the URL database is often large, as it needs to represent millions of domains and billions of individual pages, and may require 50 to 100 megabytes of storage space. In addition, Web pages are continually being added, deleted and changed, requiring regular updates to the Web-filtering server. Failing to recognize a particular URL can have serious consequences, as the choices are either to allow an uncategorized site through (and risk damage from malicious or inappropriate content) or to block all uncategorized sites, which can greatly annoy users and reduce their ability to do their jobs.
Another disadvantage of the "on-site" database approach is the latency that can be introduced as a result of the time required for the firewall to redirect every Web request, and for the Web-filtering server to process the request and make an allow-or-deny decision. In high-traffic environments, the Web-filtering server can become a bottleneck and reduce the performance of all Web applications.
An alternative method for providing Web-filtering services employs an intelligent, next-generation content-processing firewall in conjunction with a URL database maintained "in the cloud." Under this deployment scenario, the content-filtering policies for each user are configured in the firewall (or in an associated directory server) and are executed in the firewall itself.
The URL database is maintained in a series of publicly reachable sites distributed across the Internet, and no local copy needs to be maintained at each customer's site. When a URL is requested, the content-processing firewall sends a copy of the requested URL to the "nearest" (or least busy) URL database server, which responds with a list of the categories matched by the requested URL. The content-processing firewall checks the policy for the user and then allows or denies the requested page accordingly.
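The lookup-and-decide flow just described can be sketched in a few lines of Python. This is an illustrative model only, assuming hypothetical server names and a stubbed-in database; the function and variable names are not part of any real product's API.

```python
# Sketch of the "in-the-cloud" URL-filtering flow: the firewall asks a
# remote database server for a URL's categories, then applies the local
# user policy. Server names and the sample database are assumptions.

URL_DB_SERVERS = ["urldb1.example.net", "urldb2.example.net"]

def choose_server(servers):
    # A real firewall might pick the nearest or least-busy server via
    # latency probes; this sketch simply takes the first one.
    return servers[0]

def lookup_categories(server, url):
    # Stand-in for the network query to the remote URL database, which
    # responds with the list of categories matched by the requested URL.
    sample_db = {
        "http://news.example.com/story": ["news"],
        "http://games.example.com/play": ["games", "entertainment"],
    }
    return sample_db.get(url, ["uncategorized"])

def is_allowed(user_policy, url):
    server = choose_server(URL_DB_SERVERS)
    categories = lookup_categories(server, url)
    # Deny the page if any matched category is blocked for this user.
    return not any(c in user_policy["blocked"] for c in categories)

policy = {"blocked": {"games", "gambling"}}
print(is_allowed(policy, "http://news.example.com/story"))  # True (allowed)
print(is_allowed(policy, "http://games.example.com/play"))  # False (denied)
```

Note that the policy check runs entirely in the firewall; only the category lookup leaves the site.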
With this approach, the database is updated continually in real time, making it current and accurate, unlike a local copy placed on a server, which can quickly get out of date. The firewall does not need to query the remote URL database for every URL request; instead, responses from the remote database can be locally cached in the firewall. This reduces the number of remote lookups required and speeds performance for often-requested URLs. In addition, purchasing and maintaining a separate server to host a local URL database and policy-enforcement engine is not necessary, leading to improved ROI.
"In-the-cloud" deployment lends itself to a per-site vs. per-seat pricing model, which can dramatically reduce the ongoing subscription costs for URL database updates.
A content-processing firewall enables the actual HTML text contained within Web pages to be screened for user-configurable words and phrases. This provides an additional layer of protection that enables both categorized and uncategorized pages to be screened more effectively. In addition, content scanning can be used to detect and block pages that contain viruses, worms, malicious script, spyware, adware, pop-ups and other threats that prevent the network and employees from working to their potential.
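The word-and-phrase screening described above amounts to scanning a page's visible text against a configurable block list. The sketch below illustrates the idea; the phrase list is hypothetical, and the crude tag-stripping regex stands in for the proper HTML parsing a production scanner would use.

```python
import re

# Hypothetical user-configurable block list; in a real deployment this
# would come from the administrator's policy settings.
BLOCKED_PHRASES = ["free casino", "confidential"]

def page_blocked(html_text):
    # Strip tags so only the visible text is scanned. A simple regex is
    # enough for this sketch, though not for real-world HTML.
    text = re.sub(r"<[^>]+>", " ", html_text).lower()
    return any(phrase in text for phrase in BLOCKED_PHRASES)

print(page_blocked("<html><body>Play at our free casino!</body></html>"))  # True
print(page_blocked("<html><body>Quarterly sales report</body></html>"))    # False
```

Because the match runs on page text rather than the URL, this layer catches objectionable content even on pages the category database has never seen.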
For more information from Fortinet: www.rsleads.com/410cn-256
This article was provided by Rick Kagan, vice president of marketing for Fortinet, Sunnyvale, Calif.
Topic: usage of firewalls for Internet security
Date: Oct. 1, 2004