How
will computer technology affect future politics? This short
survey is not meant to be a soapbox for any particular political
viewpoint. Instead, it stumps for the application of Artificial
Intelligence (AI) technology to expand public involvement
with information-driven politics, the politics of knowledge,
not necessarily the politics of winning elections. I will
point to some potential AI contributions: political models,
tools to search for and assess political facts, tools to
frame political concepts, and also tools to expand electronic
discussion.
All Models Are Local
Computer
models and simulation are needed to track even the roughest
outlines of the increasingly complex political landscapes
and to understand the dynamics of the underlying power realities.
Political models achieve two goals: They locate candidates
in what R. Joslyn calls "issue space" by analyzing
the content of candidate appeals and making informed guesses
about candidates' programmatic behavior once in office.
They also attempt to understand the role of partisanship;
for example, the primary win by former Illinois Representative
Dan Rostenkowski (even though he later lost his seat) was
not influenced by issues as much as by perceived steadfastness
and party loyalty.
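
To make the notion of issue space concrete, here is a minimal sketch in Python, not drawn from Joslyn's own method: candidate appeals are scored against keyword lists for each issue, giving every candidate a coordinate per issue. The issues, keywords, candidates, and appeals are all invented for illustration.

# Minimal sketch: placing candidates in a small "issue space" by counting
# issue-related keywords in their appeals. Keyword lists, candidate names,
# and sample appeals are hypothetical.
ISSUE_KEYWORDS = {
    "economy": {"jobs", "taxes", "deficit", "growth", "wages"},
    "crime":   {"police", "crime", "prison", "safety", "drugs"},
}

def issue_position(appeal_text):
    """Return the share of an appeal's words devoted to each issue."""
    words = [w.strip(".,:;") for w in appeal_text.lower().split()]
    total = len(words) or 1
    return {issue: sum(w in keywords for w in words) / total
            for issue, keywords in ISSUE_KEYWORDS.items()}

appeals = {
    "Candidate A": "More jobs, lower taxes, and steady growth in wages.",
    "Candidate B": "Safer streets: more police, tougher prison terms, fight drugs.",
}

for candidate, text in appeals.items():
    print(candidate, issue_position(text))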
One
approach to modeling the behavior of political parties uses
the artificial adaptive agent structures developed by John
Holland and John Miller in the Echo class of models for
complex adaptive systems. Echo models let researchers explore
the relationship between optimization and adaptation and
test hypotheses about the underlying environment. Echo's
ability to represent the "unconscious internal models"
might be useful for modeling the political thought processes
of citizens. Likewise, Echo's ability to represent "aggregate
behavior" might be useful for modeling the organizational
evolution of a political party itself. Echo is available
via anonymous ftp to ftp.santafe.edu (file /pub/Users/terry/echo/Echo-1.0.tar.Z).
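
Echo itself is a substantial system; the toy simulation below is only a schematic sketch in the spirit of adaptive agents, not Echo's actual mechanics. Hypothetical party agents carry a platform position on a single left-right axis, simulated voters back the nearest platform, and losing parties imitate the winner with a little random mutation.

import random

# Schematic adaptive-agent sketch (not an implementation of Echo): party
# agents hold platform positions, voters choose the nearest platform, and
# unsuccessful parties drift toward the winner with random mutation.
random.seed(1)
voters = [random.gauss(0.0, 1.0) for _ in range(1000)]   # voter ideal points
platforms = [random.uniform(-2, 2) for _ in range(4)]    # four party agents

def vote_shares(platforms, voters):
    """Count how many voters are closest to each platform."""
    shares = [0] * len(platforms)
    for v in voters:
        shares[min(range(len(platforms)), key=lambda i: abs(platforms[i] - v))] += 1
    return shares

for generation in range(20):
    shares = vote_shares(platforms, voters)
    best = max(range(len(platforms)), key=lambda i: shares[i])
    # Adaptation step: losing parties move toward the winner, plus noise.
    platforms = [p if i == best
                 else p + 0.3 * (platforms[best] - p) + random.gauss(0, 0.05)
                 for i, p in enumerate(platforms)]

print("final platforms:", [round(p, 2) for p in platforms])
print("final vote shares:", vote_shares(platforms, voters))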
Smart Whistles and Watchdogs
AI tools
for knowledge discovery are used to detect patterns of fraud
in credit card and business applications. Can similar approaches
be used in the political and governmental domains?
Taxpayers
Against Fraud (TAF), a Washington, D.C.-based nonprofit
organization, has recovered more than $588 million for the
U.S. government since 1986. TAF uses the whistleblower
law to uncover fraud. This law originated when Abraham Lincoln
cracked down on war profiteers who filled musket crates
with sawdust and sold the same horses to the cavalry time
after time.
Lisa
Hovelson, executive director of TAF, says that computers
have been used only to calculate damages after fraud details
are known, not at the front end for data discovery or analysis,
for which TAF has essentially relied on insiders.
"We have discussed and support the need for such AI
capability, but it is still in the future for us,"
says Hovelson. One example of U.S. government interagency
information exchange that requires data correlation, she
suggests, is matching IRS records against Department of Education
records on defaulted student loans. Another is comparing the
duties paid to Customs on products coming into the United
States with the prices charged to the government for those products.
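
A minimal sketch of that kind of data correlation, with all field names, identifiers, and amounts invented for the example, might simply join two record sets on a shared identifier:

# Sketch of interagency data correlation: match hypothetical defaulted-loan
# records against hypothetical tax-refund records by a shared identifier,
# flagging refunds about to be paid to borrowers in default.
defaulted_loans = [
    {"taxpayer_id": "111-22-3333", "balance": 4200.00},
    {"taxpayer_id": "444-55-6666", "balance": 9100.00},
]
tax_refunds = [
    {"taxpayer_id": "111-22-3333", "refund": 850.00},
    {"taxpayer_id": "777-88-9999", "refund": 1200.00},
]

refunds_by_id = {r["taxpayer_id"]: r["refund"] for r in tax_refunds}

for loan in defaulted_loans:
    refund = refunds_by_id.get(loan["taxpayer_id"])
    if refund is not None:
        print(f"Offset candidate: {loan['taxpayer_id']} owes "
              f"{loan['balance']:.2f}, due a refund of {refund:.2f}")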
Yet
another potential application involves watchdogs for vote
fraud. A recent case involving a close election loss for
the Pennsylvania State Senate by Republican Bruce Marks
kept the Philadelphia news media humming for months. The
election had slipped past the watch of the nonpartisan group
that manually inspects ballots and allegations of election
impropriety. A pattern of ballot fraud and forgery was detected
after citizens protested that their names were on erroneous
absentee ballots. A Philadelphia Inquirer editorial called
for "modernizing voter registration information by
computerization including digitizing signatures."
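
A computerized watchdog might begin with something far simpler than signature analysis: flagging suspicious clusters among absentee-ballot records. The records and threshold below are entirely invented, and verifying digitized signatures would require image analysis well beyond this sketch.

from collections import Counter

# Toy watchdog sketch: flag absentee-ballot records that cluster suspiciously
# around a single address or "assistance" agent. Real review would pair such
# flags with signature checks and follow-up with the voters involved.
ballots = [
    {"voter": "J. Smith", "address": "12 Oak St",  "assisted_by": "agent-7"},
    {"voter": "M. Jones", "address": "12 Oak St",  "assisted_by": "agent-7"},
    {"voter": "R. Lee",   "address": "12 Oak St",  "assisted_by": "agent-7"},
    {"voter": "A. Diaz",  "address": "98 Elm Ave", "assisted_by": None},
]

THRESHOLD = 3  # flag any address or agent tied to this many ballots

for field in ("address", "assisted_by"):
    counts = Counter(b[field] for b in ballots if b[field] is not None)
    for value, n in counts.items():
        if n >= THRESHOLD:
            print(f"Review: {n} absentee ballots share {field} = {value!r}")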
Forensic Linguistics
Reliable information is essential for a free-thinking
public to arrive at opinions. New computer applications
can assist in the related functions of news understanding,
text retrieval, and the detection of bias or intentional
ambiguity. Such applications could assist journalists, as
well as citizens.
The
Arlington, Va.-based Advanced Research Projects Agency (ARPA)
has sponsored a series of Message Understanding Conference
(MUC) competitions. The goal in MUC 3 was to extract information
from news articles about the topic of terrorism.
MUC solutions have ranged from in-depth natural-language
understanding capabilities to skimming techniques that aim
to avoid the knowledge-engineering bottleneck associated
with many text-processing systems.
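
The skimming end of that spectrum can be suggested with a few regular expressions. The sketch below is far simpler than any actual MUC entry: a handful of hand-written patterns pull a perpetrator, a target, and a location from an invented news sentence into a MUC-style template.

import re

# Skimming-style extraction sketch: hand-written patterns fill a small
# template from one invented sentence. Real MUC systems handled whole
# articles, many event types, and far messier language.
sentence = ("Unidentified guerrillas bombed a power station "
            "near San Salvador on Tuesday.")

patterns = {
    "perpetrator": r"^(?P<x>[\w\s]+?)\s+(?:bombed|attacked|kidnapped)",
    "target":      r"(?:bombed|attacked|kidnapped)\s+(?P<x>[\w\s]+?)\s+(?:near|in)\s",
    "location":    r"(?:near|in)\s+(?P<x>[A-Z][\w\s]*?)(?:\s+on\s|\.)",
}

template = {}
for slot, pattern in patterns.items():
    match = re.search(pattern, sentence)
    template[slot] = match.group("x").strip() if match else None

print(template)   # {'perpetrator': 'Unidentified guerrillas', ...}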
Mainstream journalism in the wire services, the primary source
for most of the 1,800 daily newspapers, 11,000 radio stations,
and 2,000 TV stations in North America, is generally characterized
by neutrality and balance. Exceptions exist, and the detection
of linguistic bias in the news media is important.
A few of the news services that focus on the exceptions
include FAIR (Fairness & Accuracy In Reporting), LOOT
(Lies Of Our Times, Institute of Media Analysis), and Critical
Intelligence (Boardroom Inc.), all based in New York.
Fuzzy Detective Tools
L. Bennett
suggests that implicit handling of policy information by
the news media would not be a problem for democracy if members
of the public approached the news as detectives, looking
for hidden clues upon which to build their understanding
of a situation. Libraries already use electronic-search
capabilities for information filtering, document location,
and fact extraction. Software tools that achieve these tasks
include Gopher, Wide Area Information Servers, Archie, and
AppleSearch. While these first-generation tools have been
limited by keyword requirements, the commercial development
of 'fuzzy search' capabilities in a few expensive tools is
a harbinger.
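
Excalibur's pattern-recognition internals are proprietary, but one generic way past exact keywords is approximate string matching. The sketch below uses a similarity ratio from Python's standard library so that a misspelled query still retrieves the relevant document; the document snippets are invented.

import difflib

# Generic fuzzy-matching sketch (not Excalibur's method): score each document
# word against the query, so misspelled or inflected queries still retrieve
# relevant lines. The sample documents are invented.
documents = [
    "Committee report on defense appropriations for fiscal 1995",
    "Hearing transcript: agricultural subsidies and price supports",
    "Floor debate on the crime bill and prison construction",
]

def fuzzy_hits(query, docs, threshold=0.8):
    """Return (score, doc) pairs where some word resembles the query."""
    hits = []
    for doc in docs:
        best = max(difflib.SequenceMatcher(None, query.lower(), w.lower()).ratio()
                   for w in doc.split())
        if best >= threshold:
            hits.append((round(best, 2), doc))
    return sorted(hits, reverse=True)

# An exact keyword search for this misspelling would return nothing;
# the fuzzy match still finds the appropriations report.
print(fuzzy_hits("apropriations", documents))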
One
'fuzzy search' tool vendor is Excalibur Technologies Inc.
(San Diego, Calif.). Excalibur's document-retrieval products
have migrated to client/server architectures and will be
offered by late 1994 as an unbundled set of advanced programming
tools for embedded applications. Metrics given by Excalibur
include searching 200,000 pages of text in ten seconds, learning
new input data at a rate of five megabytes per 160 seconds,
and creating index memories one-third the size of the original
text. While Excalibur's pattern-recognition tools have been
applied to text and picture images, multimedia applications
with digital voice or video data have yet to be explored
in this domain.
Unlike
many traditional search-and-retrieval systems that discard
certain words such as "the," Excalibur's approach
can search on concepts or every single word. For example,
"The" is a common Vietnamese name and is featured
prominently in many Defense documents of the Vietnam War
era. The Library of Congress uses Excalibur's tool to scan
in Spanish-language law journals from around the world.
The Defense Intelligence Agency's Counter-Drug Directorate
uses this tool to scan in articles from Spanish newspapers
and search for words and images. The U.S. Department of
Defense's Decision Systems Management Agency uses this tool
to process records from the former Soviet Union, searching
for clues related to U.S. prisoners of war.
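
The cost of discarding "noise" words can be shown with a toy inverted index. In the sketch below every token is indexed, so a query for the name "The" still finds the right document, whereas a conventional stopword filter would silently drop it; the document snippets are invented.

from collections import defaultdict

# Toy inverted index with no stopword list, illustrating why discarding
# words like "the" loses information when "The" is also a personal name.
# The document snippets are invented for the example.
documents = {
    1: "Interview transcript with Nguyen Van The, 1968",
    2: "Logistics report on the supply route, 1969",
}

index = defaultdict(set)
for doc_id, text in documents.items():
    for token in text.lower().replace(",", " ").split():
        index[token].add(doc_id)      # every token is kept, even "the"

print(sorted(index["the"]))           # -> [1, 2]
print(sorted(index["nguyen"]))        # -> [1]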