Content Filtering

The key to content filtering technology is its ability to monitor and filter content from the Internet, chat rooms, instant messaging, e-mail, e-mail attachments, Word, PowerPoint and all other Windows applications; alternatively, it can be configured to report only on violations identified in those applications. Content filtering is accomplished using a library of terminology, words and phrases that are compared against the content emanating from the Internet browser and Windows applications. When content is accessed, received or sent, the data is analysed against this library; if a match occurs, the data can be filtered, captured or blocked, the application can be closed, or any combination of these actions can be applied.
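
As a rough illustration of the matching step described above, the sketch below compares a piece of content against a small term library and decides what to do with it. The library contents, function names and the simple substring check are assumptions for illustration only; commercial products ship categorised libraries and configurable combinations of actions.

    # Minimal sketch of library-based content matching; the terms and the
    # substring check are placeholders, not the product's actual library.
    TERM_LIBRARY = {"example banned phrase", "another banned term"}

    def find_matches(text):
        """Return any library terms that appear in the content."""
        lowered = text.lower()
        return [term for term in TERM_LIBRARY if term in lowered]

    def handle_content(text, application):
        """Filter, capture, block or close (any combination) on a match."""
        matches = find_matches(text)
        if not matches:
            return "allow"
        print(f"Violation in {application}: matched {matches}")
        return "block"   # could equally be "filter", "capture" or "close"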

Content filtering requires an agent on each workstation that checks content data to determine whether it violates the organisation's Acceptable Use Policy. If captured content data violates the Acceptable Use Policy, a capture of the violating screen is stored on the server with a user, time, date, application and violation stamp for reporting and review purposes.
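
The snippet below sketches the kind of violation record such an agent might assemble before storing it on the server, with user, time, date, application and violation stamps plus a reference to the captured screen. The field names and the JSON format are assumptions, since the product's actual storage format is not documented here.

    import getpass
    import json
    from datetime import datetime, timezone

    def build_violation_record(application, matched_term, screenshot_path):
        """Assemble a hypothetical violation record for the reporting server."""
        return json.dumps({
            "user": getpass.getuser(),                             # user stamp
            "timestamp": datetime.now(timezone.utc).isoformat(),   # time and date stamp
            "application": application,                            # application stamp
            "violation": matched_term,                             # violation stamp
            "screenshot": screenshot_path,                         # captured screen image
        })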

The utilisation of a library of explicit terminology allows an organisation to focus on specific content that violates policy. For example, the pornographic and sexually explicit library contains terminology specific to that industry, yet content filtering technology allows words used within a scientific or medical context to pass through the filter without a violation being reported or logged. The same library approach also enables an organisation to monitor for the unexpected or unauthorised flow of confidential information.
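
The toy example below illustrates the idea of context-sensitive matching: a flagged word is allowed to pass when it appears near recognised medical or scientific vocabulary. The word lists and the fixed five-word window are illustrative assumptions; real libraries use far richer phrase-level rules.

    FLAGGED_TERMS = {"breast"}                       # hypothetical flagged term
    MEDICAL_CONTEXT = {"cancer", "screening", "clinical", "oncology"}

    def is_violation(text):
        """Flag a term only when no medical or scientific context words are nearby."""
        words = text.lower().split()
        for i, word in enumerate(words):
            if word in FLAGGED_TERMS:
                window = set(words[max(0, i - 5): i + 6])
                if not (window & MEDICAL_CONTEXT):
                    return True                      # no mitigating context found
        return False

    # is_violation("breast cancer screening guidelines")  -> False (medical context)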

Advantages:

  • Content filtering allows filtering in all applications, including the Internet, chat, instant messaging, e-mail, e-mail attachments, Word, PowerPoint and all other Windows applications. The software is integrated at the operating-system level, so the content of any Windows event, such as opening files in Explorer, is compared against the libraries to determine whether it is inappropriate.
  • Content filtering fills the largest security hole in a company's network. Statistics (CSI) state that 70 to 80% of all security breaches originate from within the organisation. Content monitoring can monitor for, and stop, the accidental or intentional disclosure of a company's intellectual property, confidential information or other non-public content that can be accessed or disclosed electronically.
  • Working in conjunction with Human Resources training and Acceptable Use Policy deployment, the Acceptable Use Policy informs employees of what is expected of them as computer users, while content monitoring/filtering monitors and reports on compliance, changing a computer user's behaviour by making them responsible for adhering to the organisation's Acceptable Use Policy in their computing activities.
  • Content filtering takes a screen capture of each violation, with a user name, date, time, application and violation stamp, providing part of the forensic data needed to protect the company.
  • Content monitoring is ideal for establishing an employee awareness program. When inappropriate data is discovered, organisations can choose simply to make users aware of the policy, or notify employees by blocking the offensive content.
  • Utilisation of the Policy Central application results in full disclosure of the organisation's policy to all employees, supporting non-repudiation.
  • Does not require daily updates to keep the database effective and current.
  • Content filtering does not filter out the good content with the bad. Content filtering libraries have been developed to distinguish pornographic and sexually explicit material from material that is scientific or medical in nature, which eliminates the need to block out vast amounts of educational material to stop small amounts of pornographic material found on a particular site.

 
