Keeping kids safe -
Keeping your kids safe on the
Internet is important; here is another safe Internet tip from Clear Traffic.
Filters are a great tool, but remember that no filter works 100% of the time.
BESS - manufactured
by N2H2, provides its Internet-filtering services in one of 2 ways: either as a
proxy server, whereby each Web request is passed through a server located at
N2H2 itself, or in the form of a dedicated server called the "Internet Filtering
Manager," installed on a local computer or system. Dedicated-server
administrators can enable or disable any of BESS's blocking categories, as well
as BESS's keyword filtering features; users on BESS proxy servers cannot. In
both scenarios, BESS provides 29 categories of blocked content under its
"Typical School Filtering" setting, ranging from "Adults Only" and "Alcohol" to
"Gambling," "Lingerie," "Personals," and "Tasteless/Gross." N2H2 states that 4
of the 29 classifications—"History," "Medical," "Moderated," and "Text/Spoken
Only"—are designed to distinguish between sites falling squarely into BESS's
blocking categories and those that may contain sexually oriented, violent, or
other questionable content but also some educational merit, such as the Starr
report to Congress on President Clinton's sexual transgressions.
Under the "Maximum
Filtering" setting, all 29 categories, as well as employment sites, message and
bulletin boards, investment-related sites, images of individuals wearing
swimsuits, and all Web searches are blocked. Configured for "Minimal Filtering,"
N2H2's Internet Filtering Manager blocks sites falling into the categories of
"Adults Only," "Hate/Discrimination," "Illegal," "Pornography," and "Sex."
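The category-plus-setting scheme BESS describes reduces to a simple set lookup. A minimal sketch, with hypothetical site assignments and abbreviated category sets standing in for N2H2's actual lists:

```python
# Sketch of BESS-style category blocking. The site-to-category
# assignments and the (abbreviated) setting definitions below are
# invented examples, not N2H2's actual database.

# Each filtering setting enables some subset of the 29 categories.
SETTINGS = {
    "minimal": {"Adults Only", "Hate/Discrimination", "Illegal",
                "Pornography", "Sex"},
    "typical_school": {"Adults Only", "Alcohol", "Gambling",
                       "Lingerie", "Personals", "Tasteless/Gross"},
}

# A reviewed-site database maps hostnames to assigned categories.
SITE_CATEGORIES = {
    "example-casino.test": {"Gambling"},
    "example-news.test": set(),
}

def is_blocked(host: str, setting: str) -> bool:
    """Block the request if the site carries any enabled category."""
    enabled = SETTINGS[setting]
    return bool(SITE_CATEGORIES.get(host, set()) & enabled)
```

Whether the check runs on a remote proxy (as with BESS's hosted service) or on a local dedicated server, the decision logic is the same; only where the lookup happens differs.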
ClickSafe - rather
than relying on lists of objectionable URLs, ClickSafe is designed to review
each requested page in real time. According to company cofounder Richard
Schwartz's outline for testimony submitted to the commission created by the 1998
Child Online Protection Act (the COPA Commission) in 2000, ClickSafe "uses
state-of-the-art, content-based filtering software that combines cutting edge
graphic, word and phrase-recognition technology to achieve extraordinarily high
rates of accuracy in filtering pornographic content," and "can precisely
distinguish between appropriate and inappropriate sites."
Cyber Patrol -
currently owned by SurfControl, operates with 12 default blocking categories,
including "Partial Nudity," "Intolerance," "Drugs/Drug Culture," and "Sex
Education." (See appendix B.) According to the manufacturer's Web site, "Cyber
Patrol employs a team of professional researchers at least 21 years of age
including parents and teachers" to determine whether sites are to be blocked.
Any page that "contains more than 3 instances in 100 messages or any easily
accessible pages with graphics, text or audio that fall within the definition"
of any of the 12 categories "will be considered sufficient to place the source
in that category." As with most filtering products, Cyber Patrol's list of
prohibited sites is not made public, but SurfControl offers the CyberNOT search
engine, a feature on its Web site through which users can enter URLs and receive
immediate responses as to whether or not those pages are on the filter's block
list. SurfControl adds, "Internet sites that contain information or software
programs designed to hack into filtering software, including Cyber Patrol, are
added to the CyberNOT list in ALL categories as a measure of protection for the
parents, educators and businesses that rely on Cyber Patrol to screen Internet
[content]."
CYBERsitter - before
1999, CYBERsitter, in addition to blocking entire sites and searches for
terms on its block list, would excise terms it deemed objectionable or leave
blank spaces where they would otherwise appear. This procedure led to some early
notoriety for the product, such as the instance in which it deleted the word
"homosexual" from the sentence, "The Catholic Church opposes homosexual
marriage"—and left Web users reading "The Catholic Church opposes marriage."
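The excision behavior described above is easy to reproduce: the filter cuts matched terms out of the page text with no awareness of the surrounding sentence. A minimal sketch, with an illustrative one-word block list:

```python
# Sketch of naive keyword excision of the kind early CYBERsitter
# performed: matched terms are simply cut out of the text, leaving
# the rest of the sentence to read straight through the gap.
BLOCKED_TERMS = ["homosexual"]  # illustrative, not the real list

def excise(text: str) -> str:
    for term in BLOCKED_TERMS:
        # Remove the term and one trailing space so the sentence
        # closes up around the deletion.
        text = text.replace(term + " ", "").replace(term, "")
    return text

print(excise("The Catholic Church opposes homosexual marriage"))
# -> The Catholic Church opposes marriage
```

The sketch shows exactly why the approach earned the product its notoriety: the output is a fluent sentence with the opposite of the intended meaning.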
In 1999, CYBERsitter
modified its system and established 7 default settings, including "PICS Rating
adult topics," which "[c]overs all topics not suitable for children under the
age of 13," "sites promoting the gay and lesbian life style," and "[s]ites
advocating illegal/radical activities." Its total list of blocking categories
grew to 22. Users could, as they can with the most recent versions of the
software, enable or disable any specific category.
FamilyClick - (whose spokesperson is Donna Rice Hughes) allows users to choose from a variety
of filtering configurations. Its least restrictive "Full FamilyClick access"
setting, "recommended for ages 18+," blocks sites falling into any of 7
categories, including "Crime," "Gambling," and "Chat." Its "Teen access"
setting, for ages 15–17, blocks the previous 7 categories as well as
"Personals," "Illegal Drug Promotion," "Chat/Message Boards," and "Non-FamilyClick
Email Services." "Pre-Teen access," for ages 12–14, bars 4 additional
categories; these include "Advanced Sex Education" and "Weapons." "Kids access,"
geared toward ages 8–11, blocks "Basic Sex Education," defined as "[s]ites
providing information at the elementary level about puberty and reproduction."
Finally, the "Children's Playroom," for ages 7 and under, "is 100% safe. It
contains activities, games and content that have been pre-selected and
pre-approved by FamilyClick."
I-Gear -
manufactured by Symantec, combines a set of predefined URL databases with
"Dynamic Document Review." The site databases are divided into
22 categories. Dynamic Document Review further reviews the content of a
requested page: If the URL is not in any of the databases, I-Gear scans the page
for trigger words from the corresponding "DDR Dictionaries." Each matching word
on the site receives a numerical score; if the total score for the page exceeds
50 (which is the default maximum score; it can be adjusted to anywhere between
1 and 200), the site is blocked. According to the product literature, "In
addition to unconditionally vulgar words, I-Gear looks for words that are
conditionally appropriate. I-Gear reviews each word on a page and examines the
surrounding words to determine the context" of such terms. The example given in
I-Gear's manual is the word "sexual": While the string "hot sexual pictures" may
be included in the "Sex/Acts" dictionary and thus earn a page a few points, the
string "sexual harassment" will not.
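The scoring scheme I-Gear describes can be sketched as a weighted phrase count against a threshold. The dictionary entries, weights, and safe-context list below are invented for illustration; only the threshold-of-50 default comes from the product literature:

```python
# Sketch of I-Gear-style "Dynamic Document Review": each trigger
# phrase found on a page adds its weight to a score, and the page
# is blocked once the total passes a threshold (default 50).
DDR_DICTIONARY = {
    "hot sexual pictures": 25,  # phrase in an objectionable context
    "sexual": 5,                # conditionally appropriate word
}
SAFE_CONTEXTS = {"sexual harassment"}  # contexts that score nothing

def page_score(text: str) -> int:
    lowered = text.lower()
    # Neutralize phrases whose surrounding context marks them
    # appropriate before counting trigger words.
    for ctx in SAFE_CONTEXTS:
        lowered = lowered.replace(ctx, "")
    return sum(weight * lowered.count(phrase)
               for phrase, weight in DDR_DICTIONARY.items())

def blocks_page(text: str, threshold: int = 50) -> bool:
    return page_score(text) > threshold
```

Note how the context rule works in this sketch: "sexual harassment" is erased before counting, so the bare word "sexual" inside that phrase contributes nothing, which matches the manual's example.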
Internet Guard Dog -
manufactured by McAfee, announces that it "allows children to surf and chat
safely" through "a comprehensive objectionable content database" which prevents
"messages deemed inappropriate ... from reaching your child." "[O]ffensive
words" as well as sites are blocked. A June 9, 2000 review in PC Magazine
noted that Guard Dog allows the user to filter by category (e.g., drugs,
gambling, the occult) from levels 0 through 4, and that "[w]hen a line contains
a disallowed word, Guard Dog replaces the entire line with asterisks."
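The line-masking behavior PC Magazine describes can be sketched in a few lines. The disallowed-word list here is an invented example:

```python
# Sketch of Guard Dog-style masking as PC Magazine described it:
# any line containing a disallowed word is replaced wholesale with
# asterisks rather than having just the word removed.
DISALLOWED = {"casino"}  # illustrative word list

def mask_lines(text: str) -> str:
    out = []
    for line in text.splitlines():
        # Compare words case-insensitively, ignoring punctuation.
        words = {w.strip(".,!?").lower() for w in line.split()}
        if words & DISALLOWED:
            out.append("*" * len(line))  # mask the whole line
        else:
            out.append(line)
    return "\n".join(out)
```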
Net Nanny - relies
nearly exclusively on a published keyword list, accompanied by a short list
of actual prohibited sites. While generally commended for its willingness to
disclose its blocking list, Net Nanny has nonetheless been charged with
inappropriate and, more frequently, ineffective filtering.
Net Shepherd - in
October 1997, AltaVista and the filtering software Net Shepherd launched an
AltaVista-based "Family Search" engine designed to screen the results of
AltaVista searches and furnish users with pre-filtered results consisting only
of sites deemed appropriate according to Net Shepherd's database of site ratings.
Net Shepherd claimed to have rated more than 300,000 sites based on "quality"
and "maturity," relying on "demographically appropriate internet users'"
judgments of what would be "superfluous and/or objectionable to the average
[user]."
Norton Internet Security 2001 Family Edition - manufactured by Symantec, which also produces
I-Gear. The methodology and blocking categories are the same as I-Gear's.
Safeserver - relies
on so-called Intelligent Content Recognition Technology (iCRT), "a leading-edge
technology based on artificial intelligence and pattern recognition technologies
. . . trained to detect English-language pornography" and to screen requested
Web pages in real time. It has 7 categories of objectionable content: "Hate,"
"Pornography," "Gambling," "Weapons," "Drugs," "Job Search," and "Stock
Trading." It does not state its criteria for determining that a Web site fits
into any of these categories.
Safesurf - operates
a voluntary self-rating system whereby authors of Web pages can evaluate their
sites according to 10 content categories, including "Profanity," "Nudity,"
"Glorifying Drug Use," and "Other Adult Themes." In addition, each page is
assigned a numerical rating, or a "SafeSurf Identification Standard" (indicated
by the "SafeSurf Wave": SS~~) between 1 and 9 to indicate its age range.
Web-page authors may assign ratings in other categories as necessary. For
instance, an author may assign his or her material a "Nudity" rating of one if
it includes "Subtle Innuendo; [nudity] subtly implied through the use of
composition, lighting, shaping, revealing clothing, etc." or a rating of 7 if it
presents "[e]rotic nudity." From 1996–97, SafeSurf offered a remote server-based
Internet Filtering Solution for schools, libraries, ISPs, and businesses.
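A SafeSurf-style check reduces to comparing a page's self-assigned levels against per-category ceilings chosen by the parent. The labels and ceilings below are invented examples; only the 1–9 scale comes from the scheme described above:

```python
# Sketch of a SafeSurf-style admission check: a page carries
# self-assigned levels (1-9) per content category, and the filter
# admits it only if every level is at or below the configured
# ceiling for that category.
def admit(page_labels: dict, ceilings: dict) -> bool:
    # An unconfigured category defaults to the most permissive
    # ceiling (9), since the scheme is voluntary and opt-in.
    return all(level <= ceilings.get(category, 9)
               for category, level in page_labels.items())

# Example: a page self-rated "Nudity" level 7 ("erotic nudity")
# against a household ceiling of 3 for that category.
page = {"Profanity": 1, "Nudity": 7}
```

Because the ratings are self-assigned, the scheme's accuracy depends entirely on authors labeling honestly; an unrated or mislabeled page sails through.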
SmartFilter -
manufactured by Secure Computing, was originally intended for employers seeking
to limit employees' non-work-related Internet usage. By 1999, it was also
targeting schools. The filter's control list has undergone slight modifications,
but on the whole, prior to 2001 SmartFilter divided objectionable sites into 27
categories, which could be enabled according to each customer's needs. When
SmartFilter 3.0 was unveiled in January 2001, 3 of the categories ("Alternative
Journals," "Non-Essential," and "Worthless") had been removed, and 6 others
added, including "Mature" and "Nudity." In addition, the "Sex" category was
adjusted to encompass not only sites containing pornographic images or sexually
oriented material, but also "text of sex acts." But educational information on
sex ("sexually transmitted diseases, safe sex, teen pregnancy") previously
included in the "Sex" category, was now excluded. This latest version of
SmartFilter did not deem a page a "Sex" site on the basis of "nudity" alone.
SurfWatch - owned
by SurfControl, which also manufactures Cyber Patrol. It blocks Web sites
falling into any of 5 "Core" categories: "Sexually Explicit," "Drugs/Alcohol,"
"Gambling," "Violence," and "Hate Speech." According to the "Filtering Facts"
page of SurfWatch's Web site, "Before adding any site to our database, each site
'candidate' is reviewed by a SurfWatch Content Specialist. Deciphering the gray
areas is not something that we trust to technology; it requires thought and
sometimes discussion." The statement continues, "We use technology to help find
site candidates, but rely on thoughtful analysis for the final decision. Before
any site or word pattern is added to the database, it is reviewed for context of
use on the Web—how it is being used, what other types of content might be
restricted if this is blocked. We review the impact that each word or site block
will have once implemented in our filters." Yet complaints of SurfWatch's
inaccurate filtering have continually arisen in recent years.
We-Blocker - a free
Internet filtering service that blocks sites falling into any of 7
classifications. We-Blocker finds potentially objectionable sites through
recommendations from users. Then, according to the product information page of
the Web site (www.we-blocker.com/webmstr/wm_dbq.shtml), "A We-Blocker agent
reviews the site—if it is CLEARLY objectionable, it is automatically entered
into the database. . . . If the site submitted is not clearly objectionable, it
is passed to the We-Blocker site review committee."
WebSENSE - states
that it is designed almost solely for office and library use. It originally
operated with 30 blocking categories, including "Shopping," "Sports," and
"Tasteless," which could be enabled according to each administrator's needs. The
filtering categories were revised with the December 2000 release of WebSENSE
Enterprise 4.0, which extended the number of filtering categories from 30 to 53
and supplied greater specificity in some of the individual category definitions.
WebSENSE's "Alcohol/ Tobacco," "Gay/Lesbian Lifestyles," and "Personals/Dating"
categories were brought together, along with the new classifications
"Restaurants and Dining" and "Hobbies," under an umbrella category, "Society and
Lifestyle." Its "Hacking" category was incorporated into the larger "Information
Technology," which also encompassed the previously unaccounted-for "Proxy
Avoidance Systems," "Search Engines & Portals," "Web Hosting," and "URL
Translation Sites." The "Activist" and "Politics" categories were combined into
one, as were "Cults" and "Religion," while "Alternative Journals" were absorbed
into a "News & Media" category. Separate subcategories were created for "Sex
Education" and "Lingerie & Swimsuit."
X-Stop - its claim to
fame is its "Felony Load," later redubbed the "Librarian" edition, through which
the product's manufacturers, the Log-On Data Corporation, originally claimed
that "only sites qualifying under the Miller standard are blocked,"
referring to Miller v. California's three-part test for constitutionally
unprotected obscenity. Log-On also asserted that "[l]egitimate art or education
sites are not blocked by the library edition, nor are so-called 'soft porn' or
'R'-rated sites." Subsequently, X-Stop's manufacturer, which changed its name to
"8e6 Technologies," only maintained that "Nobody blocks more pornographic sites
than X-Stop. We also search out and block sources containing dangerous
information like drugs and alcohol, hate crimes and bomb-making instructions."
The software relies on an
automated "MudCrawler" that locates potentially objectionable sites using 44
criteria that are not made public. Borderline cases are reviewed by "'MudCrawler'
technicians." X-Stop also comes equipped with a "Foul Word Library," whereby
users are prohibited from typing (in e-mails or search forms, for instance) any
of the listed terms.
These filter descriptions come from the NCAC. This is not an
organization we support, but it had great information specifically about
filters, and we wanted to acknowledge that our information came from it.