The first question and answer to consider from this story is:
The ISAAC question understander can handle several basic types of questions, including the classic set of where, why, how, who, and when. For why and how questions, the system strips off the question word and performs a simple transformation to turn the question into a declarative sentence, in this case ``the last man did not start running again after being worked on by the robot.'' This sentence is then given to the sentence processor to be turned into a conceptual entity, just as any other sentence is handled by the system. The conceptual entity created by this supertask looks as follows (I show two concepts here since all negations in the system are stored in this double-concept format):
NEGATE-8111
    :IS-A                 NEGATE
    :ACTION               FUNCTION-ACTION-8101

FUNCTION-ACTION-8101
    :IS-A                 FUNCTION-ACTION
    :ACTOR                MAN-8092
    :RELATIVE-TIME-INDEX  AFTER-8095
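To make the slot-filler notation concrete, the following is a minimal sketch of how such a double concept could be written down as plain data records. This is purely illustrative; the field names and the use of Python dictionaries are my assumptions and do not reflect ISAAC's actual implementation.

    # A minimal sketch of the double-concept form as plain slot-filler records.
    # (Illustrative only; ISAAC's actual data structures are not shown here.)

    function_action = {
        "name": "FUNCTION-ACTION-8101",
        "is_a": "FUNCTION-ACTION",
        "slots": {":ACTOR": "MAN-8092", ":RELATIVE-TIME-INDEX": "AFTER-8095"},
    }

    negation = {
        "name": "NEGATE-8111",
        "is_a": "NEGATE",
        "slots": {":ACTION": function_action},   # the NEGATE concept wraps the action
    }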
These two concepts indicate that the man did not function at a time after the robot had attempted to ``fix'' him. The fact that this man is the last man is stored in the concept man-8092; similarly, the after-8095 concept records that the time in question is after the robot had made its attempt. This conceptual entity is then used as a general memory probe. The retrieved concepts are checked until one is found which ISAAC views as an explanation, and that concept is returned as the answer.
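The sketch below shows the general shape of this probe-and-filter step, assuming concepts are stored as the simple records shown above. The retrieval heuristic and the names retrieve, lookup, and answer_why are illustrative assumptions; the matching test itself is sketched a little further on, where the retrieved answer is discussed.

    # Sketch of answering a why-question: probe memory with the question
    # concept, then return the first retrieved concept that claims to cause
    # something matching the probe.  (Illustrative only; in the real system,
    # man-8092 and man-209 would be recognized as the same man, a piece of
    # bookkeeping omitted here.)

    def string_fillers(concept):
        """Collect the atomic fillers a concept mentions, descending into
        any embedded concepts."""
        found = set()
        for value in concept.get("slots", {}).values():
            if isinstance(value, dict):
                found |= string_fillers(value)
            else:
                found.add(value)
        return found

    def retrieve(probe, memory):
        """Stand-in for general memory retrieval: return stored concepts
        that mention at least one filler the probe mentions."""
        cue = string_fillers(probe)
        return [c for c in memory if cue & string_fillers(c)]

    def lookup(name, memory):
        """Find a stored concept by its identifier (e.g. 'FAIL-4678')."""
        return next((c for c in memory if c.get("name") == name), None)

    def answer_why(probe, memory, matches):
        for candidate in retrieve(probe, memory):
            effect = lookup(candidate.get("slots", {}).get(":CAUSES"), memory)
            if effect is not None and matches(effect, probe):
                return candidate      # first candidate judged an explanation
        return None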
In this question, the answer retrieved by ISAAC looks, in part, like:
KILL-4524
    :IS-A     KILL
    :ACTOR    ROBOT-325
    :OBJECT   MAN-209
    :CAUSES   FAIL-4678
The fail-4678 concept is matched to the negate-8111 concept, and the respective man and robot concepts in each structure match as well. Thus, ISAAC takes this concept as its answer to the question.
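A rough sketch of this matching step follows. The compatibility table and the trick of comparing participants by their concept type (so that man-209 and man-8092 line up) are simplifications assumed for illustration; they are not ISAAC's actual matching rules, and the internal structure shown for fail-4678 is likewise assumed.

    # Sketch of matching a retrieved effect (fail-4678) against the probe
    # (negate-8111).  The compatibility rules are illustrative only.

    COMPATIBLE = {("FAIL", "NEGATE")}     # pairs of types treated as equivalent claims

    def concept_type(filler):
        """Treat MAN-8092 and MAN-209 as the same kind of participant by
        stripping the instance number (a crude stand-in for co-reference)."""
        return filler.rsplit("-", 1)[0]

    def participant_types(concept):
        types = set()
        for value in concept.get("slots", {}).values():
            if isinstance(value, dict):
                types |= participant_types(value)
            else:
                types.add(concept_type(value))
        return types

    def matches(effect, probe):
        """The effect explains the probe if the two make equivalent claims
        (a FAIL versus a NEGATEd FUNCTION-ACTION) about shared participants."""
        types_ok = (effect["is_a"] == probe["is_a"]
                    or (effect["is_a"], probe["is_a"]) in COMPATIBLE)
        return types_ok and bool(participant_types(effect) & participant_types(probe))

    # Assuming, for illustration, that FAIL-4678 records the man as the one who failed:
    fail_4678 = {"name": "FAIL-4678", "is_a": "FAIL", "slots": {":ACTOR": "MAN-209"}}
    negate_8111 = {"name": "NEGATE-8111", "is_a": "NEGATE",
                   "slots": {":ACTION": {"name": "FUNCTION-ACTION-8101",
                                         "is_a": "FUNCTION-ACTION",
                                         "slots": {":ACTOR": "MAN-8092"}}}}
    print(matches(fail_4678, negate_8111))   # -> True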
To turn this into an English sentence, two primary steps were necessary: first, information not relevant to the question had to be dropped from the candidate answer; second, a proper tense had to be selected.
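A minimal sketch of that generation step is given below, assuming a tiny hand-built lexicon and a fixed actor-verb-object template; none of this is ISAAC's actual generator, and the printed sentence is not the system's real output.

    # Sketch of turning an answer concept into an English sentence by dropping
    # slots irrelevant to the question and choosing a tense (illustrative only).

    PAST_TENSE = {"KILL": "killed"}       # assumed verb lexicon fragment
    DESCRIPTIONS = {"ROBOT-325": "the robot", "MAN-209": "the man"}

    def generate_answer(answer, relevant_slots=(":ACTOR", ":OBJECT")):
        slots = {s: f for s, f in answer["slots"].items() if s in relevant_slots}
        verb = PAST_TENSE[answer["is_a"]]          # story events are reported in past tense
        actor = DESCRIPTIONS.get(slots[":ACTOR"], slots[":ACTOR"])
        obj = DESCRIPTIONS.get(slots[":OBJECT"], slots[":OBJECT"])
        return f"{actor.capitalize()} {verb} {obj}."

    kill = {"name": "KILL-4524", "is_a": "KILL",
            "slots": {":ACTOR": "ROBOT-325", ":OBJECT": "MAN-209",
                      ":CAUSES": "FAIL-4678"}}
    print(generate_answer(kill))          # -> "The robot killed the man."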
The next question to consider is:
This is a simpler style of question to answer, as it asks for a straight factual response rather than an explanation. The transformation performed on the original question results in ``according to the narrator, the main difference between a man's skeleton and a robot's is what.'' This is turned into an internal representation captured by a different-state concept:
DIFFERENT-STATE-4780
    :IS-A                     DIFFERENT-STATE
    :OBJECT1                  SKELETON-4699
    :OBJECT2                  SKELETON-4724
    :DIFFERENCE-DESCRIPTION   UNKNOWN-4730
The concept skeleton-4699 represents the man's skeleton, while skeleton-4724 is the internal representation of the robot's skeleton. Using this structure as a memory cue, retrieval returns the state description which explained that there are differences between men and robots, one of which is skeletal composition. Two concepts are actually retrieved to answer this question: one stating that a man's skeleton is made of a calcium compound, and one stating that a robot's skeleton is made of titanium:
SKELETON-1235
    :IS-A          SKELETON
    :SUBJECT       MAN-1221
    :COMPOSED-OF   CHEMICAL-COMPOUND-1229

SKELETON-1289
    :IS-A          SKELETON
    :SUBJECT       ROBOT-1239
    :COMPOSED-OF   TITANIUM-1242
These are then translated into the single response which ISAAC gives. Notice that I dropped the ``compound'' portion of the answer and simply went with the fact that a man's skeleton is made of calcium.
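The sketch below illustrates the overall move: retrieve a fact of the same kind about each compared object and merge the two into one contrastive sentence. The retrieval test and the phrasing, including the decision to render chemical-compound-1229 simply as ``calcium,'' are assumptions made for illustration rather than ISAAC's actual behavior.

    # Sketch of answering a difference question: find a fact of the same kind
    # about each compared object and merge them into one contrastive response.

    MEMORY = [
        {"name": "SKELETON-1235", "is_a": "SKELETON",
         "slots": {":SUBJECT": "MAN-1221", ":COMPOSED-OF": "CHEMICAL-COMPOUND-1229"}},
        {"name": "SKELETON-1289", "is_a": "SKELETON",
         "slots": {":SUBJECT": "ROBOT-1239", ":COMPOSED-OF": "TITANIUM-1242"}},
    ]

    DESCRIPTIONS = {"CHEMICAL-COMPOUND-1229": "calcium",   # the "compound" portion is dropped
                    "TITANIUM-1242": "titanium"}

    def fact_about(subject_type, kind):
        """Retrieve the first remembered concept of the given kind whose
        subject is an instance of the given type (e.g. a SKELETON fact about a MAN)."""
        for concept in MEMORY:
            if concept["is_a"] == kind and \
               concept["slots"][":SUBJECT"].startswith(subject_type):
                return concept
        return None

    def answer_difference(kind="SKELETON"):
        man_fact = fact_about("MAN", kind)
        robot_fact = fact_about("ROBOT", kind)
        man_material = DESCRIPTIONS[man_fact["slots"][":COMPOSED-OF"]]
        robot_material = DESCRIPTIONS[robot_fact["slots"][":COMPOSED-OF"]]
        return (f"A man's skeleton is made of {man_material}, "
                f"while a robot's skeleton is made of {robot_material}.")

    print(answer_difference())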
Finally:
This is another inference question which the system is asked to answer. In the course of reading the story, the model is told two pieces of seemingly contradictory information: the man had forgotten how to talk, yet he later complains of the heat. ISAAC chose to explain this anomaly by hypothesizing that the man had learned how to speak again, since speaking is the main way people complain to other agents near them. Since an archaeologist is, by definition, a scholar, it is not too difficult to come to this conclusion. At the time the question is asked, then, it is another straightforward retrieval to arrive at the answer, which is an instance of the taught concept with the robot as the actor and the man as the subject. Interestingly, several students chose different inferences. Some indicated that the man communicated his discomfort through gestures; a few said that he could not have complained at all (although this would leave certain elements of the story unresolved, these students were apparently content with it). A large number did select the same inference that the system made; however, this is an inference question, and there is no strictly right or wrong answer. The evaluator for this question accepted a wide range of inference answers as ``correct.''
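The following sketch captures the general shape of this anomaly resolution: when a new assertion contradicts a stored belief, a hypothesized explanatory event is added to memory so that a later question can retrieve it. The repair rule, the predicate names, and the flat belief records are all assumptions for illustration; ISAAC's actual anomaly handling is considerably richer.

    # Sketch of resolving a contradiction by hypothesizing an explanatory event
    # and storing it for later retrieval (illustrative only).

    # Assumed repair rule: if a lost ability appears to be exercised again,
    # hypothesize that it was re-taught by a capable nearby agent.
    REPAIR_RULES = {
        ("FORGOT", "SPEAK"): {"is_a": "TAUGHT", "skill": "SPEAK"},
    }

    def contradicts(new_fact, beliefs):
        """A complaint implies speaking; it contradicts a belief that the
        speaker had forgotten how to speak."""
        if new_fact["is_a"] != "COMPLAIN":
            return None
        for belief in beliefs:
            if belief["is_a"] == "FORGOT" and belief["skill"] == "SPEAK" \
               and belief["agent"] == new_fact["agent"]:
                return belief
        return None

    def resolve(new_fact, beliefs, teacher):
        belief = contradicts(new_fact, beliefs)
        if belief is None:
            return None
        rule = REPAIR_RULES[(belief["is_a"], belief["skill"])]
        hypothesis = {"is_a": rule["is_a"], "actor": teacher,
                      "subject": new_fact["agent"], "skill": rule["skill"]}
        beliefs.append(hypothesis)        # stored so a later question can retrieve it
        return hypothesis

    beliefs = [{"is_a": "FORGOT", "agent": "MAN-209", "skill": "SPEAK"}]
    complaint = {"is_a": "COMPLAIN", "agent": "MAN-209", "about": "HEAT"}
    print(resolve(complaint, beliefs, teacher="ROBOT-325"))
    # -> the hypothesized TAUGHT event, with the robot as actor and the man as subject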