Vol 12, No. 4 (Summer 1995)

The Adolescence of CALL

Donald Loritz

Pioneering mainframe systems like PLATO brought CALL through infancy and childhood, but the Apple II defined the field's adolescence. Adolescence is a time of exploration, a time of energy and exuberance, a time when old ways are discarded, a time when new identities are born and born again. As the child is father of the man, adolescence presages maturity, and it was the Apple II that pioneered the present and future course of CALL. Certainly, it shaped my own growth and development as a researcher.

Childhood

My generation grew up together with computers and programming, but ours was not a wholly happy childhood. The modern computer was conceived and born during World War II, where it was used to encode and decode encrypted war plans. This created a large, government-subsidized industry whose success in war ensured its funding and success in the following decades. Even before these top-secret computers became publicly available, their theory had a pervasive influence on cognitive science, which was then simply called "psychology." It was psychology that prescribed "programmed instruction" and "teaching machines," a kind of computer-assisted-instruction-as-if-one-had-computers of the very earliest, "drill and kill" variety (Lane 1964).

My earliest programming memories are of English 3200, a grammar textbook with some 3200 questions and answers about English grammar and absolutely no plot. I was a failure. Try though I did to be compliant, I could not be programmed.

Even when I grew up a little and switched from being programmed to doing the programming, my luck was bad. I got to Harvard just after Artificial Intelligence flunked out.

It is said that you can always tell the pioneers of computer science: they're the ones with the arrows in their chests. AI had been admitted in the mid-1950s to study machine translation under Professor Anthony Oettinger (1969). In 1966 the Automatic Language Processing Advisory Committee (ALPAC) gave MT a failing grade and closed the door on further funding.1 In retrospect, when we envision legions of WWII secretaries transcribing Cyrillic onto keypunch cards, we now consider it something of a miracle that the MT effort of the 1950s and early 1960s made as much progress as it did.

Younger readers should be assured that there was, in fact, a time when punch cards (known to the cognoscenti as "Hollerith cards") were used to input data into computers. Punch cards were about the size of standard envelopes and about the weight of baseball trading cards. They were fed into a keypunch machine, which resembled a fender press in an automobile factory, except that it had a keyboard. Each key press on the keypunch would kerchunk several holes in successive columns of the 80-column card. One completely punched card corresponded to the primordial "record." In theory, human-readable letters were supposed to be typed across the top of the card, but after 10,000 cards the keypunch's nonreplaceable typewriter ribbon dried out, and a generation of computer science graduate students learned to read Hollerith cards like a kind of inverse Braille.

By the time I got to Harvard, the embarrassment of the MT failure had evolved into a rather unenthusiastic and inconspicuous attempt at computer assisted instruction (Oettinger 1969). I enrolled in a crash course in FORTRAN, but in this dispirited context no one told me that computers might be used for anything except mathematics. I remember well my first program, a deck of ten Hollerith cards that aspired to convert the day's subfreezing temperature from Fahrenheit to Centigrade (and which deck took me a whole morning to punch without error). And I remember walking through a snowstorm to the batch window to retrieve my output:

0F0013DE

Program Error

Job Time 0.01 sec.
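
The original, of course, was ten punched cards of FORTRAN; what follows is only a minimal modern sketch, in C, of the standard conversion the deck aspired to perform. The input value is illustrative, not the actual temperature of that Cambridge morning.

    #include <stdio.h>

    int main(void)
    {
        /* Illustrative input: a subfreezing day, in Fahrenheit. */
        double fahrenheit = 10.0;

        /* The standard conversion: C = (F - 32) * 5/9. */
        double centigrade = (fahrenheit - 32.0) * 5.0 / 9.0;

        printf("%.1f F = %.1f C\n", fahrenheit, centigrade);
        return 0;
    }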

Like ALPAC, I gave up on computers. Of course, no sooner did I graduate than LISP and Artificial Intelligence were admitted to grad school at Harvard. In 1970 Woods implemented the first "augmented transition network,"2 which successfully parsed natural language, and his subsequent LUNAR system3 went on to demonstrate a natural English interface for querying a database on the Apollo Mission's just retrieved moon rocks. Along with Winograd's SHRDLU program,4 in which a robot in a simulated microworld of variously shaped and colored blocks followed natural language commands, Woods's work established the viability of natural language processing (NLP) for a generation of researchers. Where MT had failed, NLP graduated summa cum laude. But I had already gone on to Boston University to study language learning, that poor cousin of psychology.

While I was at Harvard majoring in war resistance, Roger Brown began, also unbeknownst to me, an intensive study of three children who were just learning to speak. Brown focused on "grammatical morphemes" like English -ing or -ed, whose incidence might be quantified and analyzed to explain language learning. Brown's influential study5 set an entire generation of graduate students to work imitating him and counting the grammatical morphemes of learners' language.

Oblivious to Woods and Winograd, as I had been to Brown, I now found myself locked in my study recording and transcribing hours upon hours of broken English, counting therein all of the contracted copulas, -ing verbs, and a dozen other such wordlets, pinning them into notebooks like so many resplendent variegated butterflies. I had surely made my own little graduate hell, squinting at my accounts of morphemes, adding their sums, standardizing their deviations, and correlating their coefficients, all by hand and by slide rule.6

I despaired and contemplated law school until I discovered SPSS and interactive terminals. With the enthusiasm of an adolescent, I knew that I had been born again. I didn't have to be a lawyer. I could be a computational linguist! I could develop systems for computer assisted language learning!

SPSS, which liberated me from the calculator and the slide rule, ran on an IBM 360. The IBM 360 itself first emerged in the early Cenozoic and became the first commercially successful, meat-eating mainframe. It was capable of performing multiple tasks at once, so long as none of them was very interesting. Counting morphemes satisfied this last criterion, but the IBM 360 still spoke only FORTRAN. Once again I took a crash course in FORTRAN, but this time my learning took place on an interactive terminal, and this time I passed. Then I traveled abroad to the Computer Center and lived there at night for a year. By the time I returned home, I had become fluent at automatically correlating and deviating all my morphemes.

Speaking the language, I discovered there were other tricks one could perform with language and numbers. Ken Stevens' lab at MIT had just completed DARPA-SUR, the Defense Advanced Research Projects Agency's Speech Understanding Research project. Begun in 1965, DARPA-SUR sought to perform speech understanding by computer. Yes, the idea was to talk to the computer and have it understand you.

Like machine translation before it, DARPA-SUR was widely pronounced a failure ten years after its inception (Lea 1980); even in 1965 speech understanding was the technology of mañana. But CALL was a special case. And in Ken Stevens' lab there were tools for digitally recording speech, playing it back, and analyzing it. One could actually even make real-time spectrograms on the lab's new, unexpectedly powerful PDP-11.

The PDP-11 was Digital Equipment Corporation's most successful evolutionary challenge to IBM's mainframes. Capitalizing on Very Large Scale Integration, a manufacturing process that packed thousands of transistors onto a single silicon chip, DEC fit the computing power of a room-sized IBM 360 dinosaur into a refrigerator-sized PDP-11. The furry, warm and cuddly (if not user-friendly) PDP-11 successfully exploited environmental niches like personal and real-time speech computing that the impersonal, time-sharing IBM 360 dinosaur could not fit into its busy schedule.

Since my advisors, Paula Menyuk and Bruce Fraser, were both graduates of MIT, they secured for me the occasional privilege of studying in Ken Stevens' lab. But I was still a child in this candy shop; I could look, but not touch.

Puberty

When your hardware changes, your life changes. I (and about a million other IBM mainframe users) learned this lesson the hard way when Boston University's Computing Center (and computing centers worldwide) upgraded the IBM 360 to the IBM 370. The IBM 370's newer, bigger, faster hardware brought with it a new operating system, a "virtual system" which gave the new, big hardware the appearance of being even bigger by regularly "swapping out to disk" dormant sections of large programs. But whereas the old system was stable and reliable, the new system was unstable and crashed frequently, and many programs had to be completely rewritten for it. While the new version of SPSS worked, none of my programs did. Were all my morphemes counted in vain? Did I need to be born yet once again, this time of the Virtual System?

With daily crashes affecting the university's financial records, the staff at the computer center had no time for linguistics grad students, so I had to start over: learn a new version of FORTRAN, rewrite all my I/O, and recompile everything. I was distraught. In my anguish I began once again to contemplate law school.

First Love

In this dark moment, Annie entered my life. Annie was an Apple computer who winked at me cheerily with her little green cursor whenever I turned her on. I cast myself at her feet. I would be her knight. I would rewrite everything for Annie. Using a $200 modem Annie could communicate with the largest mainframes almost as quickly and easily as a computer center terminal. And I didn't have to change Annie's operating system until I wanted to.

It was by then the late 1970s, and the 8080 series of Intel microprocessors had succeeded in cramming a prodigious four dozen instructions onto a single chip! This was so technologically impressive that even weapons designers began to take notice of microcomputers, and in 1976 researchers at Bolt Beranek and Newman organized some of the nation's first microcomputer-based CALL projects, like Project ILIAD.7 Project ILIAD used some of the same AI technology as the LUNAR system, but on a Cromemco 8080-based microcomputer. It wasn't felt that microcomputers were fast or powerful enough to parse learner language, so Project ILIAD instead proposed only to generate grammar exercises for deaf learners of English.

I wanted to do similar research with Annie, but the BBN and MIT researchers only laughed. Their Cromemcos had the latest in VLSI technology: 48 instructions packed onto an 8080 microprocessor chip, and twin 8-inch disks for a combined storage capacity of 512K. Annie was only 24, 5 1/4, and 256. Their Cromemcos came with UCSD Pascal installed; Annie only came with lower-class BASIC. Still, when they called Annie "cheap" the insult could not go unanswered. I had to defend Annie's honor.

She was a poor man's computer, but there was nothing cheap about my Annie. The soul of her machine was a 6502 microprocessor. Although the 6502 economized by implementing only about half as many instructions as the 8080, it compensated by "pipelining" its smaller instruction set. By being able to execute two instructions at once, Annie was twice as fast as the more expensive Cromemcos. Today such Reduced Instruction Set Computers (RISC) are advertised as the newest thing in computers, but in the early 1980s, they were considered obsolete. Complex Instruction Set Computers (CISC) were the wave of the future.
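
The arithmetic behind that claim is simple, at least as a back-of-the-envelope sketch using round numbers rather than measured 6502 cycle counts: if throughput is clock rate divided by cycles per instruction, a pipelined chip averaging one cycle per instruction at 1 MHz keeps pace with a non-pipelined chip needing two cycles per instruction at 2 MHz.

    \[ \text{throughput} = \frac{f_{\mathrm{clock}}}{\mathrm{CPI}}, \qquad
       \frac{1\ \text{MHz}}{1\ \text{cycle/instr}} =
       \frac{2\ \text{MHz}}{2\ \text{cycles/instr}} =
       10^{6}\ \text{instructions/second}. \]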

It was in August 1981 that IBM brought out its own personal CISC computer. Personal computers suddenly became respectable, and my special times with Annie were over. Soon, when I took her with me to conferences, everybody wanted to talk with her. At the TESOL Convention, Annie was a founder of the CALL Interest Section (Loritz 1983). She was there when CALICO was founded.

With a $300 analog-digital-analog converter board, Annie could do something I could never do with a time-shared mainframe. On a dare from Dennis Klatt, I programmed Annie to record and analyze digitized speech just like MIT (Loritz 1983).

Being unencumbered by useless math instructions (which could always be delegated to a coprocessor), the 6502 was also optimally efficient for processing the languages of computational linguistics, such as LISP and PROLOG, so by 1984 (at the 18th Annual TESOL Convention in Houston) Annie could parse learners' English, just like at MIT.

Well, not quite like MIT. The 6502 had some virtues, but it only ran at 1 MHz. Even though this made it equivalent to a 2 MHz CISC chip, it was still slow. And the 6502 was only an 8-bit processor: it could not readily sample the faint but spectrally significant 9th-12th bits of digitized speech, and it could only address 64K of RAM.
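
Why those 9th through 12th bits matter: by the standard signal-processing rule of thumb (a textbook approximation, not a specification of any particular converter board), each bit of sample resolution adds about 6 dB of dynamic range, so the step from 8-bit to 12-bit sampling buys roughly 24 dB, and the quiet spectral detail of speech lives in that difference.

    \[ \mathrm{SQNR} \approx 6.02\,N + 1.76\ \mathrm{dB}: \qquad
       N = 8 \Rightarrow \approx 50\ \mathrm{dB}, \qquad
       N = 12 \Rightarrow \approx 74\ \mathrm{dB}. \]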

On the other hand, Annie had Potential. For example, one add-on board made the Apple II capable of addressing 16 megabytes of bank-switched RAM (equivalent to "EMS" memory on IBM PCs). And by 1982 a 16-bit 65816 was in the works, so in the early 1980s, it seemed it would be only a short time before an 8 MHz, 16 M Apple IV would be produced which could compete with the IBM AT. This would certainly make nearly all of MIT's PDP-11 technology accessible and affordable for the classroom.
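
Bank switching works around a small address bus by mapping one "bank" of a much larger memory into a fixed window of the processor's 64K address space at a time. Here is a minimal sketch of the idea in C; the 16K window, the bank count, and the bank-select latch are illustrative assumptions, not the actual board's layout.

    #include <stdint.h>
    #include <stdio.h>

    #define WINDOW_SIZE 0x4000u   /* 16K window in the 64K space (illustrative)  */
    #define NUM_BANKS   1024u     /* 1024 banks x 16K = 16 megabytes             */

    static uint8_t  big_ram[NUM_BANKS][WINDOW_SIZE]; /* RAM on the add-on board  */
    static uint32_t bank_select;                     /* hypothetical bank latch  */

    /* The 8-bit CPU can only form a 16-bit address; the latch supplies the
       missing high-order bits by choosing which bank the window shows. */
    static uint8_t read_byte(uint16_t offset)
    {
        return big_ram[bank_select][offset % WINDOW_SIZE];
    }

    static void write_byte(uint16_t offset, uint8_t value)
    {
        big_ram[bank_select][offset % WINDOW_SIZE] = value;
    }

    int main(void)
    {
        bank_select = 512;               /* swap bank 512 into the window       */
        write_byte(0x0100, 42);
        bank_select = 513;               /* same offset, different bank         */
        write_byte(0x0100, 99);
        bank_select = 512;
        printf("%u\n", read_byte(0x0100)); /* prints 42                         */
        return 0;
    }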

Sturm und Drang

But Annie was busy growing up herself. She got a color monitor, and immediately she was surrounded by thousands of suitors. Still I was not jealous. They were only attracted to her painted face and new hard disks, but I loved Annie for her soul.

Then one day Annie's father, Steve Wozniak, gave Annie's hand to Steve Jobs! Poor me! Poor Annie! She still made all the money for Apple Corporation, but Jobs only had eyes for Lisa, Apple's first Mac-like computer.

Because of its large, installed base in schools (and because it still produced almost all of Apple's revenues), the Apple II was allowed to survive, but only as an "educational" computer. Indeed, after wresting sole control of Apple Computer Corporation from Wozniak, Jobs at one point canceled every Apple II/III research project! And when Apple Corporation finally brought out the Apple IV (as the Apple IIgs) in 1985, it was released with only 1 M of RAM running at 2 MHz. I shall never forgive Jobs for so humiliating Annie and for foisting such a dumbed-down computer on the education market. But he was worldly wise and rich and powerful, and I was young and naive.

Teen Angel

I am older and wiser now. After Annie left, I settled down with an IBM PC clone. I have lived with her faithfully now for nearly ten years. She has been upgraded several times, with newer, bigger, faster hardware, accompanied by new virtual operating systems, each giving the newer, bigger hardware the appearance of being even bigger by regularly swapping out to disk the dormant sections of large programs.

Of course, these new systems are unstable and crash frequently, and many programs have had to be completely rewritten for them. While commercial programs work under these systems, few of my programs do. The analysts at PC Magazine used to be helpful, but now that PC system crashes affect all of American business, I find they have little time for the marital problems of a linguistics professor and his not so stylish PC. So I still like my PC best when she wears black and white and DOS 3.3, the same as when we had our first children. In those days, the children did what they were supposed to do. At times I don't understand this younger generation; it seems they never want to work.

About a year ago, I noticed a small story in the back pages of the Wall Street Journal announcing that "Apple Computer Corporation has canceled production of the Apple II line of computers." Although the Wall Street Journal may remember Annie for bringing color to computing, I will always remember her as the shy little processor from the wrong side of the tracks.

Eulogy

As Thorstein Veblen observed in 1899 in The Theory of the Leisure Class, technology progresses so that the affluent can indulge in conspicuous consumption. Since displays are the most conspicuous and most expensive hardware components in today's basic computers, and since graphics software is correspondingly conspicuous and expensive, it was fated that Annie should have been seduced by color and graphics. The fact that color is also 95% superfluous to 95% of all computer applications only makes color displays, as Veblen would say, "more honorific": computers, too, evolve by natural selection, and when it comes to courting development money, display is everything.

Engineers and technology have given society the capacity to feed all the hungry, clothe all the naked, and shelter all the homeless. In his 1921 book The Engineers and the Price System, Veblen explained how these developments would be threats to the social order, since they imply that the prices of food, clothing, and shelter can become too low to return an adequate profit to financiers. Society now also has the capacity to inform all the ignorant, but to maintain an adequate profit, computer engineers now research and develop expensive packaging for information. By my estimate color displays and graphics programming add 200% to the cost of every workstation. Such workstations are like manuscripts illuminated for the glory of God and His Priests rather than disseminated for the illumination of the masses.

When I first met Annie, I dreamed that the nation — and even the world — could soon well afford a computer for every language student. Fifteen years later, we have hardly approached that goal. I live in what is, by some measures (and no thanks to me), one of the ten wealthiest counties in the United States. Yet in my county schools there is not a single school that regularly and systematically teaches language using computers. Surrounding counties are not significantly further advanced. I can't help but think that if Annie had been allowed to develop with intelligence and modesty, that dream could be real today.

Those of us who knew Annie in her youth will never forget her innocent simplicity. Such innocence rarely survives adolescence. Once she put on a color display, her simple virtue could never again be reclaimed. In the end, Annie was caught between two worlds: the information industry, which sells information at the highest price the market will bear, and education, which gives it away free. We will all remember Annie with sadness. She had a good soul, and then she became Prom Queen.

Notes

1 ALPAC: Automatic Language Processing Advisory Committee of the Division of Behavioral Science, National Academy of Science. Language and machines: computers in translation and linguistics. National Research Council Publication 1416. Washington, DC: National Academy of Science/National Research Council, 1966.

2 Woods, William A. "Transition network grammars for natural language analysis." Communications of the ACM 13: 591-606, 1970.

3 Woods, William A., Ronald Kaplan, and Bonnie Nash-Webber. The lunar sciences natural language information system: Final report, BBN Report 2378. Cambridge, MA: Bolt Beranek and Newman, 1972.

4 Winograd, Terry. Understanding Natural Language. New York: Academic Press, 1972.

5 Brown, Roger. A First Language. Cambridge, MA: Harvard University Press, 1973.

6 Younger readers will be amused by the slide rule, exhibits of which are still to be found in museums desperate for a collection. By sliding one of its logarithmic rules along the other, one could use a slide rule to quickly (albeit inaccurately) multiply two numbers by adding their logarithms. In 1965 I remember having a brief public debate with Bruno Bettelheim about slide rules. He maintained that some learning must be simple drudgery and adduced the multiplication tables as evidence. I objected that the slide rule could take the drudgery out of that task. Thirty years later I still find myself objecting to the drudgery model of learning, but my sword is no longer the slide rule.
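
The trick note 6 describes is just the additive law of logarithms: the slide adds lengths proportional to log a and log b, and the product is read off where the sum falls. A worked example with rounded table values:

    \[ \log(2 \times 3) = \log 2 + \log 3 \approx 0.301 + 0.477 = 0.778 \approx \log 6. \]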

7 Bates, Madeleine, and Kirk Wilson. Project ILIAD: Final report. Cambridge, MA: Bolt Beranek and Newman, 1980.

AUTHOR'S BIODATA

Donald Loritz studied English and Chinese at Harvard and took his Ed.D. in Applied Psycholinguistics at Boston University. He is currently Associate Professor of Computational Linguistics at Georgetown University. His CALL research includes artificially intelligent morpheme counters for English, Chinese, Russian, and various other languages. When he becomes frustrated debugging program logic, he studies neural networks and language processors made of meat.
