
The 'C' Man

PCQ Bureau

Much before the world had even heard the term open systems, two computer scientists at the then AT&T Bell Labs decided to try something completely unheard of. They tried to see if they could 'port' an operating system, developed by them along with a few colleagues a couple of years earlier, to different machines. One of them, Dennis Ritchie, thought the best way to do that was to create a 'high-level' programming language and recode the entire operating system, UNIX, in that language. The operating system, originally written in assembly language in 1969, was almost entirely rewritten by 1973 in C, the new language that Ritchie developed.


If you are reading this, you do not need to be told how much UNIX and C have contributed to the computing world in particular, and to the world in general. Yet, when Ritchie was found dead in his house on 12th October 2011, there was little coverage in the traditional media or on the web. While the serious computing community was shaken by his death, the world at large, which was still mourning Steve Jobs, who had passed away a week earlier, did not even notice it. In any case, few outside the tech community knew, or know today, who he was.






“In reality, anyone who uses the Internet or anyone who uses a computing device today is doing so because of the contribution made by dmr,” wrote a blogger. Ritchie was popularly known as dmr, his user name.

“Pretty much everything on the web uses those two things: C and UNIX,” Wired.com said, quoting Rob Pike, Ritchie's former colleague, now a Google employee and co-author of one of the most popular books on UNIX systems programming, The Unix Programming Environment. “The browsers are written in C. The UNIX kernel, that pretty much the entire Internet runs on, is written in C,” Pike was further quoted by the site. That gives an idea of the direct impact C has had on the computing world, and today, through the Internet, on pretty much everybody's lives.

The indirect impact is much greater. C was the first step towards what are today called portability, openness, and interoperability. It allowed programmers to write code without bothering much about what machine the programs would run on. That was a small step for those who wanted to port UNIX to machines other than the PDP-11, the DEC machine on which it was created. But it was a giant leap for the computing world.
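
To see what that portability means in practice, here is a minimal sketch of a complete C program, written in today's standard dialect rather than the 1973 one and purely illustrative, not code from UNIX itself. The same source compiles unchanged with any standard C compiler, whatever machine it targets; in assembly language, it would have to be rewritten for every processor.

    #include <stdio.h>

    /* A complete, standard C program. Because the language hides the
       machine's details, this exact source can be compiled for very
       different processors without modification. */
    int main(void)
    {
        printf("hello, world\n");
        return 0;
    }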


That is not all. C 'influenced' almost all the languages and environments that followed, be it C++ or Java. Most of these borrowed heavily from the syntax of C; many even call them C derivatives.
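
As a rough illustration of that syntactic debt (a sketch, not anything taken from those languages' specifications), the loop in the small C program below would read almost token for token the same if it were written in C++ or Java.

    #include <stdio.h>

    int main(void)
    {
        int sum = 0;
        /* The for-loop, the braces, and the ++ and += operators are C
           syntax that C++, Java and many later languages adopted almost
           verbatim. */
        for (int i = 0; i < 10; i++) {
            sum += i;
        }
        printf("sum = %d\n", sum);
        return 0;
    }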

UNIX, to whose development Ritchie contributed greatly, and which C made possible to port to other machines, is, even today, in its different avatars, the de facto OS for anything that is mission critical. Solaris, AIX, HP-UX and Linux all trace their lineage to UNIX.

While Ritchie, along with Ken Thompson, won the Turing Award in 1983 and received the National Medal of Technology from the US president in 1999, his contributions are still not recognized widely enough.


PCQuest and Dataquest join the programming community in mourning the death of this computing titan.

The Man of (Artificial) Intelligence


Just 12 days after Dennis Ritchie was found dead in his house, on 24th October, the computing world lost another luminary, John McCarthy. Unlike Ritchie's creation, which is everywhere today, McCarthy's contribution has still not materialized into something real. Yet, as an idea, it remains one of the biggest in the history of computer science.






McCarthy was a pioneer of, and coined the phrase, Artificial Intelligence. In one of the best books ever written on the history of any branch of computer science, Into The Heart of the Mind, author Frank Rose traced how AI originated. He unequivocally credits McCarthy with the concept of AI.

"AI officially began with theDartmouth Conference, called in 1956 by John McCarthy, then aDarmouth mathematics professor, and Marvin Minsky, then a junior

fellow at Harvard. Minsky and McCarthy, together, with Nathaniel Rochester, an information specialist at IBM, and Claude Shannon, an information specialist as AT&T, got a $7,500 grant from the Rockfeller Foundation that year for a ten-man summer conference to discuss the almost unhear-of-idea "that every aspect of learning or any other feature of intelligence can in principle be so precisely

described that a machine can be made to simulate it" It was at Dartmouth, at McCarthy's urging, that the term artificial intelligence was adopted, and it was at Dartmouth that the first genuinely intelligent computer program was demonstrated...," the book notes. The book is not readily available these days, though.

McCarthy, who coined the term in 1956, defined AI as the science and engineering of making intelligent machines. Going by that definition, it is still a hot idea. But theoretical research on Artificial Intelligence (which was partly philosophical, asking, for instance, whether a silico sapiens is possible) faded out in the mid-90s, with no big breakthrough happening. The pursuit of this area, however, gave rise to concepts such as natural language understanding and machine learning that are still being researched.


Such was the interest in this area at one time that its researchers bagged as many as four Turing Awards, a feat comparable only to two other areas, numerical methods and databases. McCarthy himself won it in 1971.

McCarthy also invented the LISP programming language.

About a decade back, he wrote a short story, The Robot and the Baby, which is still there on his site, where he asks readers if he should publish it. It lightly explored whether robots should have (or simulate having) emotions.

PCQuest mourns the passing away of the last of the philosopher-computer scientists, who went far beyond technicalities to question and conceptualize philosophical ideas while simultaneously contributing to hard-core computer science.
