May 2003

Middle East

A classic story tells of blind men describing an elephant, each with his own ideas based on which part of the elephant he was touching. In my work providing software for educational institutions I get to touch the Middle East.

I've recently touched, via the Internet, students in Iran and Israel. They all seem alike to this blind observer--bright students enjoying the challenges of implementing AI projects. It was Iranian students that sparked my interest in RoboCup soccer, and Israeli students that sparked my interest in the complexities of parsing bi-directional language.

The world is not so grim when touching the side of education.

Cathaia - Bayesian and Neural Networks at Work

Cathaia is a small company using AI technologies to provide a competitive edge in services for telecommunications and other industries. Headquartered in Dublin, Ireland, they have a branch in Budapest, Hungary, and customers in Germany and Mexico.

Their specialty is network management, workflow, and other business support and decision support applications, all areas well suited to AI techniques. They use both neural networks and Bayesian belief networks (BBNs) in their products. Populating a BBN is always a tricky problem, and they have tools that help with it, including the ability to port rules from more rigid formats into the more flexible belief network nodes.

While they examined various commercial offerings for BBNs, they decided to write their own reasoning engine and knowledge representation in pure Java, which also gave them the integration they wanted with the rest of their application suite.

AI Languages

Prolog is one of the oldest AI languages, right after Lisp. The history of the two languages is a tale of the power of belief among supposedly technical, objective individuals. The Lisp people in the major AI centers in the United States were, for some reason, very threatened by Prolog and spent a lot of time saying bad things about it. As a result, it was never that popular in the U.S., but it got a large following in the AI communities of Europe and Asia.

That all changed in the 1980s, when Patrick Winston, one of the greats of MIT's AI Lab, wrote the foreword for Ivan Bratko's excellent Prolog book, Prolog Programming for Artificial Intelligence. (I highly recommend any of Patrick Winston's books on AI. They are extremely well written, and the ones with Lisp code ground AI concepts firmly in working code.)

Why are Prolog and Lisp good languages for AI? They are both symbolic languages, which means it is very easy to manipulate symbols without having to declare them. What's a symbol? Just a character string that represents something, like a piece on a chess board, a word in a natural language sentence, or a condition in a rule. Last month we looked at symbols representing probabilities in some code for Bayesian belief networks.
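
To make this concrete, here is a tiny made-up illustration (the facts are invented for this newsletter). Each lowercase name is a symbol, used directly with no declarations:

piece(white, knight, g1).            % a piece on a chess board
word(sentence1, 3, elephant).        % the third word of a sentence
condition(rule7, temperature_high).  % a condition in a rule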

Why not C++ or Java for AI? There is no good reason, really; any language can be used to program anything. The issue is one of productivity and expressiveness. In a non-symbolic language, variables need to be allocated for the various entities being reasoned over, and comparison functions written to test for similar entities.

An AI program often involves pattern-matching and search of symbols (text strings) related to a problem domain. A language like Lisp or Prolog makes it easy to code and express that sort of logic.
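
As a small sketch with made-up facts, the rule below finds grandparents purely by pattern-matching; Prolog's built-in search tries the parent facts on its own, with no loops or comparison functions for the programmer to write:

parent(tom, bob).
parent(bob, ann).
grandparent(GP, GC) :- parent(GP, P), parent(P, GC).

% ?- grandparent(tom, Who).
% Who = ann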

The savings can be quite dramatic for the right types of program. Pricing rules in commercial applications are a good example. eoTek, a company offering services to the mortgage loan industry, replaced 5000 lines of pricing code written in Java with 500 lines of Prolog code. The smaller code size, and the declarative nature of the code, had the additional benefit of greatly reducing the quality assurance effort needed to provide a service such as theirs.

There are many variations on Lisp and Prolog available as well, as individuals tinker with the aspects of the languages. Scheme is my favorite of the Lisp variations. The concepts in Prolog show up in languages like Mozart, covered in an earlier newsletter, and Mercury. These last two languages are powerful tools offering multiple ways of expressing concepts, allowing the programmer a variety of options for different portions of an application.

AI Performance

In the past, performance has always been an issue with AI languages. They had trouble keeping up with hard-core programming languages like C/C++. So developers of AI languages put a lot of effort into making them as efficient as possible. Still there was a large performance gap. Prolog, in particular, had a bad reputation for performance.

In the mid-1980s I wrote a Prolog program that solved the Rubik's Cube puzzle. It took four to five minutes on my old 286 computer. In other words, if you wanted to do a Rubik's Cube worth of AI processing, you would need to get a cup of coffee while you waited.

Today, that same program solves 10 cubes per second on this four year old computer. In other words, you can do a Rubik's Cube worth of AI processing with every keystroke on today's machines.

Couple that with the tremendous overhead introduced into other software, such as Windows, Java, and .NET, and the overhead of network communications, and suddenly the performance aspects of an AI component of an application become negligible.

And that's for a Prolog implementation of average performance. There are faster Prologs, and there is also Mercury, a Prolog derivative that claims to be faster than the fastest Prologs.

Moore's law is making AI more and more practical every year. This was one of Kurzweil's themes in his book, The Age of Spiritual Machines (reviewed in an earlier newsletter). It is just a matter of time, he claims, before full human intelligence is realized in software.

Code Corner - The Two Faces of Prolog

Prolog is a confusing language because it is really two languages in one. It is said Prolog is very simple and easy to use. This is true. It is said that Prolog is obtuse and difficult to learn. This is also true. It is said Prolog has only limited applicability. This is true. It is said Prolog can be used for all sorts of wondrous applications. This is also true.

One face of Prolog is as a simple, straightforward rule language that can be used to easily implement efficient logic bases of formal rules. Pricing, tax code, and automated form-filling are some examples of this type of application. In these cases, the business rules are mapped directly to Prolog rules.

For example, consider these Prolog rules for a simple phone pricing system. The price is ten cents a minute before 7:00AM and after 8:00PM, and 25 cents a minute in between.

% Off-peak: before 7:00AM or after 8:00PM, ten cents a minute.
price(StartHour, DurationMinutes, PriceCents) :-
    ( StartHour < 7 ; StartHour > 20 ),
    PriceCents is DurationMinutes * 10.
% Peak: between 7:00AM and 8:00PM, 25 cents a minute.
price(StartHour, DurationMinutes, PriceCents) :-
    StartHour >= 7,
    StartHour =< 20,
    PriceCents is DurationMinutes * 25.

It doesn't take much training to learn that the :- symbol (called the neck) means "if", or to see how these declarative rules directly encode the business logic for pricing. Also, anything beginning with an upper-case letter is a logical variable, so this predicate expects StartHour and DurationMinutes as input and outputs PriceCents.
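
For example, assuming the two rules above have been consulted, an off-peak call and a peak call could be priced in a Prolog listener like this (a hypothetical session):

?- price(6, 10, Cents).
Cents = 100
yes
?- price(9, 10, Cents).
Cents = 250
yes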

The rules can refer to other rules and encode a complexity that matches the complexity of the business rules, for example the nightmare of pricing air fares.
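
As a small, invented illustration of rules referring to other rules, a weekend discount could be layered on top of the price rules above (the discount and the weekend facts are made up for this example):

% Hypothetical rule: weekend calls cost half the normal price.
discounted_price(StartHour, DurationMinutes, Day, PriceCents) :-
    weekend(Day),
    price(StartHour, DurationMinutes, FullPrice),
    PriceCents is FullPrice // 2.
discounted_price(StartHour, DurationMinutes, Day, PriceCents) :-
    \+ weekend(Day),
    price(StartHour, DurationMinutes, PriceCents).

weekend(saturday).
weekend(sunday).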

It is this use of Prolog that is easy, but limited. It's great for crisp business rules, but you cannot readily use Prolog like this for anything involving uncertainty, or requiring the construction of solutions, as in an application that creates product configurations for customers. Pricing configurations, and bill-of-materials, on the other hand, fit nicely into this simple subset of Prolog.

The other face of Prolog is the use of rules as meta-rules. That is, the rules are used to describe ways of representing knowledge and ways of reasoning with that knowledge. This use of Prolog often requires using recursion, unification (pattern-matching) of complex structures, and list manipulation, all concepts that require more work to master. But once mastered, they enable remarkable expressiveness of complex programming concepts.

For example, here are a few lines of Prolog code that implement an object-oriented system with full polymorphism (the ability to send the same message to different object types with appropriate responses).

First, Prolog structures are used to represent the knowledge of the objects. These are class definitions with the signature of the type of object, and the methods that can be called for that object. This is the knowledge representation for our system.

oo_class( rectangle(H, W),
    methods([ ( area(A) :- A is H * W ),
              ( perimeter(P) :- P is 2 * (H + W) )
            ]) ).
oo_class( circle(R),
    methods([ ( area(A) :- A is pi * R * R ),
              ( perimeter(P) :- P is 2 * pi * R )
            ]) ).

Next, a single Prolog meta-rule implements a polymorphic "send", the heart of any OO system. This is the reasoning engine for the system. It is a logical specification of the program. What is the logical specification of "send" in an OO system?

  • find the class definition for the object, and its list of methods;
  • find the method that matches the message being sent to the object;
  • call that method with the object's data.

Here it is in Prolog:

oo_send(Object, Message) :-
    oo_class(Object, methods(Ms)),     % find the class definition and its methods
    member((Message :- Method), Ms),   % find the method matching the message
    call(Method).                      % call that method with the object's data

An additional meta-rule that finds a member of a list is also needed.

member(X, [X|_]).                  % X is the first element of the list
member(X, [_|Z]) :- member(X, Z).  % or X is in the rest of the list
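
To see how the pieces fit together, here is a hypothetical walk-through of the query oo_send(circle(2), area(A)):

% oo_class(circle(2), methods(Ms))  matches the circle class, binding R to 2
%                                   and Ms to the circle's list of methods.
% member((area(A) :- Method), Ms)   unifies the message with the stored area
%                                   method, binding Method to (A is pi * 2 * 2).
% call(Method)                      evaluates the arithmetic, binding A to 12.566...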

While the code for pricing was probably pretty clear, this code might look obtuse if you don't know Prolog. This is the face of Prolog that is harder to learn, but is extremely powerful for implementing all sorts of applications that are, as in our definition of AI, not in the nature of computers.

If the code above were in a file called oo.pro, it could be tested in a Prolog listener like this:

?- consult(oo).
yes
?- oo_send(circle(1), area(A)).
A = 3.14159
yes
?- oo_send(rectangle(2,4), area(A)).
A = 8.0
yes

It is only one's imagination that limits the breadth of custom systems that can be built using these concepts.

Links

The Best of AI Expert

Using C++ for Backpropagation [July 1994] - David Cox outlines a C++ approach to implementing backpropagation neural networks (backpropagation is how neural nets learn). He illustrates the advantages of using object-oriented thinking in modeling the problem of neurons and synapses. Who says you need AI languages for AI?

Fuzzy Cognitive Maps Model Social Systems [July 1994] - Rod Taber describes a tool called the Fuzzy Cognitive Map (FCM) that is designed to model complex social systems. It allows for the specification of fuzzy causal links, such as: increased drug prices lead to increased consumer costs, which lead to increased political pressure for reform, which leads to increased bureaucracy, and so on. Such a model can be used to predict changes in complex systems.

Fuzzy Virtual Worlds [July 1994] - Bart Kosko and Julie Dickerson describe another use of fuzzy cognitive maps (FCMs), for modeling in virtual reality systems. Instead of causal links between social systems, they talk of causal links describing the behaviors of sharks and other fish. Fun stuff.

AI Languages

The home page for the Mercury project, with full documentation, a tutorial, and tools for developers as well as users.

http://www.swiss.ai.mit.edu/projects/scheme/ - a home page for Scheme resources.

A tutorial for Scheme.

An excellent compendium of books and resources about Lisp. I second the author's endorsement of Abelson and Sussman's book as one of the great computer science books.

http://www.amzi.com/customers/eotek.htm - an Amzi! customer story describing eoTek's use of Prolog for pricing logic.