ROBERT FLOYD (Roll No: 34)




Computer pioneer Robert W. Floyd was a professor emeritus and former chair of Stanford's Department of Computer Science. Born in New York on June 8, 1936, Floyd was recognized as a child prodigy at age 6; he skipped three grades and finished high school at 14. A scholarship allowed him to study at the University of Chicago, where he received a bachelor’s degree in liberal arts in 1953 at age 17. After that he supported himself and earned another bachelor’s degree in physics in 1958. “His entire career in computing was based on autodidactic efforts,” recalls his second ex-wife, Christiane Floyd, a computer scientist at the University of Hamburg in Germany. “He started as an operator in the night shift, taught himself how to program, started to publish in the early 1960s and re-entered academia as an associate professor at Carnegie Mellon at the age of 27. He was only 32 when he got his full professorship at Stanford.”

“In the old days, programmers would just twiddle with programs till they seemed to work,” says Professor Emeritus of The Art of Computer Programming Donald Knuth. “Floyd showed that there was a way to prove programs would work.” His approach of marrying math with computer science was “a revelation to the field,” Knuth says. “Floyd’s 1960s method of invariants, in which assertions are attached to points in a computer program, is still the basis of much work in proving that computer programs meet their specifications,” says John McCarthy, professor emeritus of computer science.

His research included design and analysis of algorithms for finding the shortest paths in a network, parsing (decomposing) programming languages, calculating quantiles, printing shades of gray on a dot printer, sorting information and selecting random permutations and combinations.

His most important scientific achievement, however, was pioneering systematic methods of program verification. His seminal 1967 paper, “Assigning Meanings to Programs,” opened the field of program verification. His basic idea was to attach so-called “tags” in the form of logical assertions to individual program statements or branches that would define the effects of the program based on a formal semantic definition of the programming language. Many researchers in formal methods of computing worldwide adopted this method. One of the most important influences was on C. A. R. Hoare, who in 1969, starting from Floyd’s work, developed his calculus of pre- and post-condition semantics for computer programs.
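As a rough illustration of the idea (this is not Floyd's own notation, and the function and its specification are made up for the example), the "tags" can be written as runtime assertions in Python:

```python
def sum_to(n):
    # Tag at entry (precondition): n is a non-negative integer.
    assert n >= 0
    i, total = 0, 0
    while i < n:
        # Tag at the loop head (invariant): total is the sum 0 + 1 + ... + i.
        assert total == i * (i + 1) // 2 and i <= n
        i += 1
        total += i
    # Tag at exit (postcondition): total is the sum of the first n integers.
    assert total == n * (n + 1) // 2
    return total

sum_to(10)  # returns 55, checking every tag along the way
```

In Floyd's method the assertions are not merely checked at run time; one proves, once and for all, that each assertion implies the next across the intervening statements, so the postcondition holds for every input.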

Knuth’s The Art of Computer Programming eventually was transformed into a series of seven volumes and is still a work in progress. Floyd may have been the first advocate of refactoring — the rewriting of working programs from scratch, re-using only the essential ideas. Refactoring is now standard practice among computer programmers. By continuously looking for simpler ways to do the same thing, Floyd aimed to improve not only programs but also programmers’ abilities and understanding.

At Stanford, he taught algorithmic courses, including “Sorting and Searching.” With his former graduate student Richard Beigel, he wrote a textbook titled The Language of Machines: An Introduction to Computability and Formal Languages (Computer Science Press, 1994).

He enjoyed backgammon — pitting his skill against computer opponents at home and human opponents in major tournaments, say friends and colleagues.

He also was a passionate hiker and mountain climber who loved the wilderness country in the High Sierra, recalls Christiane Floyd.

Floyd is survived by four children — Susan Barnett of Butte, Mont.; Michael Floyd of Urbana, Ill.; Sean Floyd of Berlin, Germany; and Erik Schorr of Roseville, Calif. — as well as by his mother, Mary Floyd of Alexandria, Va., and siblings Shelby Floyd of Kamuela, Hawaii, Jeff Floyd of Alexandria, Va., and Fred Floyd of Elko, Nev.

In 1991, the Institute of Electrical and Electronics Engineers (IEEE) Computer Society awarded Floyd its Computer Pioneer Award for his work on early compilers. (A compiler is software that translates a computer program as a whole into machine code that is saved for subsequent execution at a desired time.)

In 1994, Floyd retired from Stanford. He was a fellow of the American Academy of Arts and Sciences, the American Association for the Advancement of Science and the ACM.

In 1978, Floyd won the Association for Computing Machinery (ACM) Turing Award — the highest honor in computer science — “for having a clear influence on methodologies for the creation of efficient and reliable software, and for helping to found the following important subfields of computer science: the theory of parsing, the semantics of programming languages, automatic program verification, automatic program synthesis, and analysis of algorithms.”

He died at Stanford University Medical Center on Sept. 25, 2001, after a long illness. He was 65.

Floyd’s Cycle Detection Algorithm (The Tortoise and the Hare)


def find_loop(top):
    tortoise = top
    hare = top
    while hare is not None and hare.next is not None:
        hare = hare.next.next      # the hare takes two steps
        tortoise = tortoise.next   # the tortoise takes one step
        if hare is tortoise:
            return 'Loop Found'
    return 'No Loop Found'

How do you determine if your singly-linked list has a cycle? In the late 1960s, Robert W. Floyd invented an algorithm that solves the problem in linear (O(N)) time; it is known as Floyd’s cycle detection algorithm.

The easiest solution to the cycle detection problem is to run through the list, keeping track of which nodes you visit, and at each node check whether it is the same as any of the previously visited nodes. This runs in quadratic (O(N^2)) time: not very efficient, and actually more complicated than Floyd's algorithm.
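For comparison, here is a sketch of that naive approach, with a hypothetical minimal Node class; note that nodes are compared by identity, not by value:

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def has_loop_naive(head):
    # O(N^2): for each node, scan the entire list of nodes visited so far.
    visited = []
    node = head
    while node is not None:
        for seen in visited:
            if seen is node:          # identity comparison: same node object
                return 'Loop Found'
        visited.append(node)
        node = node.next
    return 'No Loop Found'
```

It also uses O(N) extra space for the visited list, whereas the Tortoise and the Hare needs only two pointers.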

The Tortoise and the Hare is possibly the most famous cycle detection algorithm, and is surprisingly straightforward. The Tortoise and the Hare are both pointers, and both start at the top of the list. For each iteration, the Tortoise takes one step and the Hare takes two. If there is a loop, the Hare will go around that loop (possibly more than once) and eventually meet up with the Tortoise when the Tortoise gets into the loop. If there is no loop, the Hare will get to the end of the list without meeting up with the Tortoise.

How do you find the length of the loop?
Once the Hare and the Tortoise have met inside the loop, hold the Hare in place and move the Tortoise one step forward at a time, counting nodes, until it reaches the Hare again; the count is the length of the loop.
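That counting step can be sketched as follows (reusing a hypothetical minimal Node class; phase 1 is the tortoise-and-hare search described above):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def loop_length(head):
    # Phase 1: Floyd's Tortoise and Hare, to find a meeting point in the loop.
    tortoise = hare = head
    while hare is not None and hare.next is not None:
        tortoise = tortoise.next
        hare = hare.next.next
        if tortoise is hare:
            # Phase 2: hold the Hare still; walk the Tortoise around the
            # loop once, counting nodes, until it reaches the Hare again.
            count = 1
            tortoise = tortoise.next
            while tortoise is not hare:
                tortoise = tortoise.next
                count += 1
            return count
    return 0  # no loop

# Build 1 -> 2 -> 3 -> 4 -> 5 -> 6 -> 7, with node 7 pointing back to node 3.
nodes = [Node(i) for i in range(1, 8)]
for a, b in zip(nodes, nodes[1:]):
    a.next = b
nodes[-1].next = nodes[2]

loop_length(nodes[0])  # the loop 3 -> 4 -> 5 -> 6 -> 7 -> 3 has 5 nodes
```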


Singly Linked List with a Loop

Both the Tortoise and the Hare start at the first node of the list; the Tortoise moves 1 step forward while the Hare moves 2 steps forward.


The Hare and the Tortoise start at the first node of the list

Hare    Tortoise
  1        1
  3        2
  5        3
  7        4
  3        5
  5        6
  7        7


The Hare and the Tortoise meet at node 7, which proves there is a loop in the list.


♣  Time Complexity: O(n)

♣  Space Complexity: O(1)

Floyd-Warshall All-Pairs Shortest Paths Algorithm

A  Brief Introduction to Graphs

Graphs are a very important data structure in computer science. Many otherwise very difficult problems can be expressed in terms of a graph and solved using standard graph searching/manipulation algorithms.

Graphs can be simply viewed as a set of vertices/nodes and edges connecting these vertices. Mathematically, G = (V,E); if u and v are elements of V, then an edge can be denoted as (u,v).

Graphs can be broadly classified as either undirected or directed. In the former, the edges do not have a direction, i.e. (u,v) and (v,u) denote the same edge. In the latter, the edges have a direction.

Another important aspect of graphs is the identification of cycles. A cycle is a path such that the first vertex and the last vertex are the same.

The following example is a directed graph that contains cycles and negatively weighted edges. The graph consists of 5 vertices and 9 edges.


The adjacency matrix that represents the above graph is:

  0    3    8   inf   -4
 inf   0   inf   1     7
 inf   4    0   inf   inf
  2   inf   -5   0    inf
 inf  inf  inf   6     0

N.B.: inf represents a distance of infinity, i.e. no (direct) connection.

Floyd-Warshall All-Pairs-Shortest-Path Algorithm

The Floyd-Warshall All-Pairs-Shortest-Path algorithm uses a dynamic-programming methodology to solve the All-Pairs-Shortest-Path problem. The algorithm runs in O(|V|^3) time, and negatively weighted edges may be present; however, negatively weighted cycles cause problems for the algorithm.

The main recursive definition of the algorithm is:

    d(0)[i][j] = w[i][j]
    d(k)[i][j] = min( d(k-1)[i][j], d(k-1)[i][k] + d(k-1)[k][j] )   for k >= 1

where d(k)[i][j] is the length of a shortest path from i to j whose intermediate vertices are all drawn from {1, ..., k}.

This leads to the following bottom-up procedure to compute the all-pairs-shortest-path table.
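A minimal Python sketch of that bottom-up procedure, run on the adjacency matrix above; this in-place version keeps a single distance table that is relaxed once per intermediate vertex k, rather than a separate table for each k:

```python
INF = float('inf')

def floyd_warshall(w):
    # w is an n x n matrix of edge weights; dist starts as a copy of w
    # and is relaxed once for each choice of intermediate vertex k.
    n = len(w)
    dist = [row[:] for row in w]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

# The adjacency matrix of the 5-vertex example graph.
w = [
    [0,   3,   8,   INF, -4],
    [INF, 0,   INF, 1,   7],
    [INF, 4,   0,   INF, INF],
    [2,   INF, -5,  0,   INF],
    [INF, INF, INF, 6,   0],
]
dist = floyd_warshall(w)
dist[0][2]  # shortest 1 -> 3 distance: 8 directly, but -3 via 1 -> 5 -> 4 -> 3
```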


SHA1CRK is a distributed brute force attack on the cryptographic hashing function SHA-1, using the micro-blogging site Twitter as a medium of communication. You can follow the initiative on Twitter using the tag #SHA1CRK. If you’re confused by the messages that are being tweeted, check out the  SHA1CRK protocol for a short explanation of their meaning.

If you want to learn more about the project, you can watch a presentation held at NDC 2010.

How Will SHA1CRK Find Collisions?

SHA1CRK uses a technique called Floyd’s cycle finding algorithm to find collisions in SHA-1. This is the same technique that MD5CRK used a few years ago.


The algorithm can be described by analogy with a random walk. By its definition, SHA-1 is a function with a finite number of possible outputs, namely 2^160. Using the principle that any such function placed in a feedback loop will cycle, one can use a relatively small amount of memory to store outputs with particular structures and use them as “markers”. These markers can then be used to detect when they have been “passed” before. The project calls these markers distinguished points, and the point where two inputs produce the same output is called a collision point. Just like MD5CRK, SHA1CRK considers any point whose first 32 bits are zeroes to be a distinguished point.
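A toy version of the idea, with SHA-1 truncated to 16 bits so a collision appears after a few hundred steps instead of roughly 2^80 (the seed and truncation length are arbitrary choices for the illustration, and the real project distributes the search across many machines rather than running one Floyd walk):

```python
import hashlib

def f(x: bytes) -> bytes:
    # Toy stand-in for the real function: SHA-1 truncated to 16 bits.
    return hashlib.sha1(x).digest()[:2]

def find_collision(seed: bytes):
    # Phase 1: Floyd's cycle finding detects that iterating f has cycled.
    tortoise, hare = f(seed), f(f(seed))
    while tortoise != hare:
        tortoise, hare = f(tortoise), f(f(hare))
    # Phase 2: walk from the seed and from the meeting point in lockstep;
    # they first coincide at the entry to the cycle, i.e. the collision point.
    tortoise = seed
    while True:
        a, b = tortoise, hare
        tortoise, hare = f(tortoise), f(hare)
        if tortoise == hare:
            return a, b  # two distinct inputs with f(a) == f(b)

a, b = find_collision(b"SHA1CRK")
# a != b, yet sha1(a) and sha1(b) agree in their first 16 bits
```

Because the seed is longer than f's 2-byte outputs, the walk has a non-trivial "tail" before the cycle, which guarantees the two returned inputs are distinct.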

Before a collision will be found, a lot of near-collisions will be found. A near-collision with Hamming distance d is defined as two points that produce outputs that differ only in d bits. As d becomes smaller, these pairs may start to give hints about the structure of a collision point. The near-collision with the smallest Hamming distance will also be a good indicator of the progress of the project.


♠  ” It is, therefore, possible to extend a partially specified interpretation to a complete interpretation, without loss of verifiability, […] This fact offers the possibility of automatic verification of programs, the programmer merely tagging entrances and one edge in each innermost loop. ”

 ♠ “If the advancement of the general art of programming requires the continuing invention and elaboration of paradigms, advancement of the art of the individual programmer requires that he expand his repertory of paradigms. ”

 ♠ ” The establishment of formal standards for proofs about programs […] and the proposal that the semantics of a programming language may be defined independently of all processors for that language, by establishing standards of rigor for proofs about programs in the language, appears to be novel. ”




Robert W. Floyd, Richard Beigel: Die Sprache der Maschinen. International Thomson 1996


Robert W. Floyd, Donald E. Knuth: Addition Machines. SIAM J. Comput. (SIAMCOMP) 19(2):329-340 (1990)


Robert W. Floyd, Jeffrey D. Ullman: The Compilation of Regular Expressions into Integrated Circuits. J. ACM (JACM) 29(3):603-622 (1982)


Robert W. Floyd, Jeffrey D. Ullman: The Compilation of Regular Expressions into Integrated Circuits (Extended Abstract) FOCS 1980:260-269


Robert W. Floyd: The Paradigms of Programming. Commun. ACM (CACM) 22(8):455-460 (1979)


Larry Carter, Robert W. Floyd, John Gill, George Markowsky, Mark N. Wegman: Exact and Approximate Membership Testers STOC 1978:59-65


Robert W. Floyd, Ronald L. Rivest: Expected Time Bounds for Selection. Commun. ACM (CACM) 18(3):165-172 (1975)
Robert W. Floyd, Ronald L. Rivest: The Algorithm SELECT – for Finding the ith Smallest of n Elements (Algorithm 489). Commun. ACM (CACM) 18(3):173 (1975)
Robert W. Floyd: The Exact Time Required to Perform Generalized Addition FOCS 1975:3-5


Robert W. Floyd, Alan Jay Smith: A Linear Time Two Tape Merge. Inf. Process. Lett. (IPL) 2(5):123-125 (1973)
Manuel Blum, Robert W. Floyd, Vaughan R. Pratt, Ronald L. Rivest, Robert Endre Tarjan: Time Bounds for Selection. J. Comput. Syst. Sci. (JCSS) 7(4):448-461 (1973)
Alfred V. Aho, Allan Borodin, Robert L. Constable, Robert W. Floyd, Michael A. Harrison, Richard M. Karp, H. Raymond Strong: Proceedings of the 5th Annual ACM Symposium on Theory of Computing, April 30 – May 2, 1973, Austin, Texas, USA STOC 1973


Robert W. Floyd: Permuting Information in Idealized Two-Level Storage. Complexity of Computer Computations 1972:105-109
Donald E. Knuth, Robert W. Floyd: Errata: Notes on Avoiding “go to” Statements. Inf. Process. Lett. (IPL) 1(4):177 (1972)
James C. King, Robert W. Floyd: An Interpretation-Oriented Theorem Prover over Integers. J. Comput. Syst. Sci. (JCSS) 6(4):305-323 (1972)
Manuel Blum, Robert W. Floyd, Vaughan R. Pratt, Ronald L. Rivest, Robert Endre Tarjan: Linear Time Bounds for Median Computations STOC 1972:119-124


Robert W. Floyd: Toward Interactive Design of Correct Programs. IFIP Congress 1971:7-10
Donald E. Knuth, Robert W. Floyd: Notes on Avoiding “go to” Statements. Inf. Process. Lett. (IPL) 1(1):23-31 (1971)


James C. King, Robert W. Floyd: An Interpretation Oriented Theorem Prover over Integers STOC 1970:169-179


Charles Stark “Doc” Draper (Roll No: 21)


Charles Stark “Doc” Draper 




Born: October 2, 1901, Windsor

Died: July 25, 1987, Cambridge

Education: Massachusetts Institute of Technology (1928–1938), Massachusetts Institute of Technology (1926–1928), Massachusetts Institute of Technology (1922–1926), Stanford University, University of Missouri–Columbia

Awards: National Medal of Science for Behavioral and Social Science


Known as the “father of inertial navigation,” Dr. Charles Stark Draper evolved the theory, invented and developed the technology, and led the effort that brought inertial navigation to operational use in aircraft, space vehicles, and submarines.


Born in Windsor, Mo., on Oct. 2, 1901, “Doc” Draper began his college work in arts and sciences at the University of Missouri in 1917, continuing on to graduate from Stanford University in 1922 with a B.A. in psychology. He entered MIT the same year, earning an S.B. in electrochemical engineering in 1926, an S.M. in 1928, and a Sc.D. in physics in 1938.


Doc began as an assistant professor in the Department of Aeronautics and Astronautics at MIT in 1935, and ultimately advanced to the post of Institute Professor in 1966. Doc’s Instrumentation Laboratory eventually divested from MIT in 1973 to become an independent nonprofit research and development laboratory—Draper Laboratory.


Draper Laboratory is an American not-for-profit research and development organization in Cambridge, Massachusetts. Draper focuses on the design, development, and deployment of advanced technology solutions to problems in national security, space exploration, health care and energy.

The lab was created in the early 1930s by Charles Stark Draper at MIT as the Instrumentation Lab. It was renamed for its founder in 1970 and separated from MIT in 1973 to become an independent, non-profit organization.

Draper’s expertise includes the areas of guidance, navigation, and control technologies and systems; fault-tolerant computing; advanced algorithms and software solutions; modeling and simulation; and MEMS and multichip module technology.




First sponsored by the Sperry Gyroscope Co., Dr. Draper’s engineering work led to the development of the Mark 14 gunsight during World War II. In 1942 the USS South Dakota, using Dr. Draper’s gunsights, shot down 32 attacking Japanese aircraft, an unprecedented antiaircraft score.


He continued work with gun pointing and firing control developments until the late 1950s. His focus and expertise in the use of gyros in inertial guidance systems led to such monumental achievements as the Apollo landing on the moon and development of guidance systems or components for all U.S. deployed strategic missiles. Nationally, Doc’s work has created a multibillion-dollar industry. The foundations of his work have since resulted in the development of a complete inertial navigation system for manned and unmanned vehicles and other autonomous applications for undersea, land, and air applications.


Among his more than 70 honors and awards, were the prestigious Langley Medal of the Smithsonian Institution, the NASA Public Service Award, the Dr. Robert H. Goddard Trophy of the National Space Club, and the National Medal of Science from President Lyndon Johnson.

In 1978, MIT established the Charles Stark Draper Professorship of Aeronautics and Astronautics in his honor. Dr. Draper received the “Engineering for Gold Award” from the National Society of Professional Engineers in 1984. The society cited his work in inertial guidance systems as one of the 10 outstanding engineering achievements of the past 50 years. Many of Dr. Draper’s former students are leaders in government, industry, the military, and academia.


He died in the Mount Auburn Hospital in Cambridge, Massachusetts, at age 85. He was eulogized as “one of the foremost engineers of our time”, and Howard Wesley Johnson, Chairman of the MIT Corporation, credited him for creating a “whole new industry in inertial instruments and systems for airplanes, ships, submarines, missiles, satellites and space vehicles”.


One of the world’s preeminent awards for engineering achievement, the Charles Stark Draper Prize was established by the National Academy of Engineering and endowed by Draper Laboratory in 1988 to recognize innovative engineering achievements and their reduction to practice in ways that have led to important benefits and significant improvements in the well-being and freedom of humanity.

The Prize recognizes achievement in all engineering disciplines, and engineers worldwide are eligible to receive it. The Prize is awarded annually during National Engineers Week in Washington, D.C.


Edsger W. Dijkstra by Ahalya M.S (Roll No: 1)





BORN: 11 May 1930, Rotterdam, The Netherlands

DIED: 6 August 2002, Nuenen, The Netherlands

EDUCATION:

Gymnasium Erasmianum in Rotterdam (1948); undergraduate degree, physics, University of Leyden, (1956); PhD computer science, University of Amsterdam (1959). Honorary Degrees: Queen’s University Belfast, Athens University of Economics and Business.


Programmer, Computation Department of the Mathematical Centre in Amsterdam (1952–1962); Professor of Mathematics, Eindhoven University of Technology (1962–1973); Research Fellow, Burroughs Corporation (1973–1984); Schlumberger Centennial Chair in the Computer Science Department at the University of Texas at Austin (1984–2000).


Member of the Royal Netherlands Academy of Arts and Sciences (1971); Distinguished Fellow of the British Computer Society (1971); AFIPS Harry Goode Memorial Award (1974); Foreign Honorary Member of the American Academy of Arts and Sciences (1975); IEEE Computer Society Computer Pioneer Award (1982); ACM/SIGCSE Award for outstanding contributions to computer science education (1989); ACM Fellow (1994); ACM Influential Paper Award (2002) for “Self-stabilizing systems in spite of distributed control”, Communications of the ACM, Vol. 17, No. 11 (1974), pp. 643–644 (in 2003 this annual award was renamed the Dijkstra Prize).

Edsger W. Dijkstra was born in 1930 in Rotterdam, The Netherlands. His father, a high-school chemistry teacher, served as president of the Dutch Chemical Society. His mother, who never held a formal job, had a lasting influence on his approach to mathematics and his emphasis on elegance.

Graduating from high school in 1948 and intending to become a theoretical physicist, Dijkstra thought the ability to use an electronic computer might be advantageous. Three years of programming at the Mathematical Center in Amsterdam convinced him that the intellectual challenge of programming exceeded that of theoretical physics, but where was the sound body of knowledge that could support programming as an intellectually respectable discipline? His boss, A. van Wijngaarden, persuaded him that in the years to come he could be one of the people to make programming a respectable discipline. Completing his study of physics as quickly as he could, Dijkstra forsook physics for programming.

At the Mathematical Centre a major project was building the ARMAC computer. For its official inauguration in 1956, Dijkstra devised a program to solve a problem interesting to a nontechnical audience: Given a network of roads connecting cities, what is the shortest route between two designated cities? The best known algorithms had running times which grew as the cube of the network’s size; the running time of Dijkstra’s algorithm grew only as the square. Developed in 20 minutes while Dijkstra was relaxing on a café terrace with his fiancée, Maria (Ria) C. Debets, his Shortest Path algorithm is still used in such applications as packet-switching software for computer communications.
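In modern form the Shortest Path algorithm can be sketched as follows (the road network here is made up for the example, and the sketch uses Python's heapq priority queue, a later refinement of the simple array scan whose running time grew as the square of the network's size):

```python
import heapq

def dijkstra(graph, source):
    # graph: {city: [(neighbor, road_length), ...]} with non-negative lengths.
    dist = {source: 0}
    pq = [(0, source)]          # priority queue of (distance so far, city)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float('inf')):
            continue            # stale queue entry; a shorter route was found
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float('inf')):
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return dist

# A hypothetical road network: each city maps to (neighbor, distance) pairs.
roads = {
    'A': [('B', 4), ('C', 2)],
    'B': [('D', 5)],
    'C': [('B', 1), ('D', 8)],
    'D': [],
}
dist = dijkstra(roads, 'A')
dist['D']  # shortest A -> D route: A -> C -> B -> D = 2 + 1 + 5 = 8
```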

Around the same time, Dijkstra invented another very efficient network algorithm for use in designing the X1 computer. Known as the Minimum Spanning Tree algorithm, it finds the shortest length of wire needed to connect a given set of points on a wiring panel. He published both network algorithms in a single paper in 1959.
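The Minimum Spanning Tree algorithm can be sketched in the same style; this is the greedy grow-a-tree method also credited to Jarník and Prim, and the wiring panel here is a made-up example:

```python
import heapq

def minimum_spanning_tree(graph, start):
    # Greedily grow a tree: always take the cheapest wire leaving the tree.
    visited = {start}
    frontier = [(w, start, v) for v, w in graph[start]]
    heapq.heapify(frontier)
    total, tree = 0, []
    while frontier and len(visited) < len(graph):
        w, u, v = heapq.heappop(frontier)
        if v in visited:
            continue            # both endpoints already connected
        visited.add(v)
        total += w
        tree.append((u, v, w))
        for x, wx in graph[v]:
            if x not in visited:
                heapq.heappush(frontier, (wx, v, x))
    return total, tree

# A hypothetical wiring panel: each undirected wire listed in both directions.
points = {
    'A': [('B', 1), ('C', 4)],
    'B': [('A', 1), ('C', 2), ('D', 6)],
    'C': [('A', 4), ('B', 2), ('D', 3)],
    'D': [('B', 6), ('C', 3)],
}
total, tree = minimum_spanning_tree(points, 'A')
total  # minimum total wire length: A-B (1) + B-C (2) + C-D (3) = 6
```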

When Dijkstra and Maria Debets married in 1957, the marriage rites required him to state his profession. When he stated that he was a programmer, the authorities objected that there was no such profession, and the marriage certificate instead identifies him as a theoretical physicist.

While at the Mathematical Center, Dijkstra worked on the very important “real-time interrupt” problem, which became the topic of his Ph.D. thesis.  Several computer manufacturers of the day were facing the same problem, but they had not approached the problem with the rigor that Dijkstra applied to it.


At the Mathematical Center, Dijkstra and J. A. Zonneveld developed the first compiler for Algol-60, a high-level programming language designed by an international committee. Completed in August 1960, their compiler predates the second Algol-60 compiler by more than a year. One of Algol-60’s great innovations, for which Dijkstra was instrumental, was the explicit introduction of recursion. He was probably the first to introduce the notion of a “stack” for translating recursive programs, reporting this seminal work in a short article. In the Oxford English Dictionary, the terms “vector” and “stack” in a computing context are attributed to Dijkstra.

In 1962 Dijkstra was appointed Professor of Mathematics at the Eindhoven University of Technology. There he built the THE operating system (named for the university, then known as Technische Hogeschool te Eindhoven), which has influenced the design of many subsequent operating systems. It introduced a number of design principles which have become part of the working vocabulary of every professional programmer. Introducing the reprint of Dijkstra’s article on the THE operating system in the 25th Anniversary issue of Communications of the ACM, the Editor-in-Chief wrote, “This project initiated a long line of research in multilevel systems architecture—a line that continues to the present day because hierarchical modularity is a powerful approach to organizing large systems.”

In 1968, Dijkstra published a brief letter to the editor in Communications of the ACM, titled “Go To statement considered harmful”, arguing that the GO TO statement, found in many high-level programming languages, is a major source of errors and should therefore be eliminated. There ensued a giant commotion in the computing community, with combatants taking positions on all sides of the issue. The debate has long since subsided; programming languages now provide alternatives to the GO TO. Few programmers today use it liberally, and most never use it at all.

Around this time, Dijkstra was beginning to formulate some of his early ideas about programming as a mathematical discipline. He pointed out that software productivity and reliability is closely related to rigor in design, which eliminates software flaws at an early stage. He was particularly impressed by the immensity of the so-called “software crisis” when he attended the famous 1968 NATO Conference on Software Engineering, the first conference devoted to the growing epidemic of software delivered late, over budget, and full of flaws. Convinced that programming methodology should become a scientific discipline, he decided to study how to avoid complexity in software designs.

Dijkstra’s “Notes on Structured Programming” was circulated to a few friends for their comments and soon became a sensation; major corporations initiated programs based on his ideas to integrate rigorous practices into their programming projects. Subsequently published and still in print after nearly 40 years, this work has had far-reaching impact on all areas of computer science, from the teaching of the first courses in programming to the design of complex software. Mathematical analyses of program design and specifications have become central activities in computer science research.

Dijkstra’s acceptance speech for the 1972 ACM Turing Award, titled “The humble programmer”, includes a vast number of observations on the evolution of programming as a discipline and prescriptions for its continued growth. It is required reading for any aspiring computer scientist.

In August 1973 Dijkstra joined Burroughs Corporation as a Research Fellow. His duties consisted of consulting at some of the company’s research centers a few times a year and pursuing his own research. Among his significant contributions from this period is the development of a theory of nondeterminacy, a concept outside traditional mathematics. Dijkstra was the first to observe not only that nondeterminacy is central in computations whose components interact asynchronously, but also that even when no asynchrony is involved, nondeterminacy is an effective tool for reasoning about programs and simplifying program design.

His other major contribution during this period was the development of “predicate transformers” as a basis for defining program semantics and as a tool for deriving programs. His ideas refined the earlier ideas of C. A. R. Hoare for an axiomatic basis of computer programming. He expounded these ideas along with nondeterminacy in A Discipline of Programming, which has been identified by the Science Citation Index as a “Citations Classic”.

The Burroughs years were Dijkstra’s most prolific in terms of research articles. He wrote nearly 500 documents in the EWD series, most of them technical reports, for private circulation within a select group.

As a frequent visitor to the Burroughs Research Center in Austin, Texas starting in the late 1970s, Dijkstra had become familiar with the Department of Computer Science of the University of Texas at Austin. In 1984 Dijkstra accepted an appointment as the Department’s Schlumberger Centennial Chair.

During his eighteen years in Austin, Dijkstra continued as a prolific researcher. Having earlier embarked on a long-term project for “Streamlining Mathematical Arguments”, in Austin he co-authored a book on predicate calculus advocating a “calculational proof style” for mathematical arguments. He continued to apply his method in a number of diverse areas: coordinate geometry, linear algebra, graph theory, designs of sequential and distributed programs, and many others.

The years in Austin saw Dijkstra at his best as a teacher and mentor for a generation of undergraduate and graduate students. From his days at the Eindhoven University of Technology he had thought deeply about how computer science should be taught, and Austin provided him the opportunity for trying out his ideas. He enjoyed the experience, appreciating “… brilliant students who made it a challenge and a privilege to lecture for them”. He urged universities not to shrink from the challenge of teaching radical novelties.

On the occasion of Dijkstra’s 60th birthday in 1990, the Department of Computer Sciences organized a two-day seminar in his honor. Speakers came from all over the US and Europe, and a group of computer scientists contributed research articles which were edited into a book.

Dijkstra retired from active teaching in November 1999. To mark the occasion and to celebrate his forty-plus years of seminal contributions to computing science, the Department of Computer Sciences organized a symposium, which took place on his 70th birthday in May 2000. The symposium was attended by a large number of prominent computer scientists as well as current and former students.

Returning to The Netherlands in February 2002, Dijkstra died in Nuenen on 6 August 2002.

Dijkstra’s Aphorisms and Epigrams



Dijkstra was famous for his wit and eloquence, such as in his remark:


The question of whether computers can think is like the question of whether submarines can swim;
or his advice to a promising researcher, who asked how to select a topic for research:


Do only what only you can do;


and his remark in his Turing Award acceptance speech:


In their capacity as a tool, computers will be but a ripple on the surface of our culture. In their capacity as intellectual challenge, they are without precedent in the cultural history of mankind.


Compiled below are a few more of Dijkstra’s other memorable epigrams.


The tools we use have a profound and devious influence on our thinking habits, and therefore on our thinking abilities.


Brainpower is by far our scarcest resource.

Program testing can at best show the presence of errors but never their absence.


The competent programmer is fully aware of the strictly limited size of his own skull; therefore he approaches the programming task in full humility, and among other things he avoids clever tricks like the plague.

So-called natural language is wonderful for the purposes it was created for, such as to be rude in, to tell jokes in, to cheat or to make love in (and Theorists of Literary Criticism can even be content-free in it), but it is hopelessly inadequate when we have to deal unambiguously with situations of great intricacy, situations which unavoidably arise in such activities as legislation, arbitration, mathematics or programming. (foreword to Teaching and Learning Formal Methods, edited by C. N. Dean and M. G. Hinchey, Academic Press, 1996)

Computer Science is no more about computers than astronomy is about telescopes.


Being abstract is something profoundly different from being vague.


Nothing is as expensive as making mistakes.


In the practice of computing, where we have so much latitude for making a mess of it, mathematical elegance is not a dispensable luxury, but a matter of life and death.


If 10 years from now, when you are doing something quick and dirty, you suddenly visualize that I am looking over your shoulders and say to yourself, Dijkstra would not have liked this, well that would be enough immortality for me.


Martin Campbell-Kelly – Ammu Archa P (Roll No: 04)

Leave a comment

Martin Campbell Kelly
Martin Campbell-Kelly is an English computer scientist based at the University of Warwick who has specialised in the history of computing. Campbell-Kelly is professor emeritus in the Department of Computer Science at the University of Warwick. He is on the editorial board of the IEEE Annals of the History of Computing journal. He is a committee member of the Computer Conservation Society, a Specialist Group of the British Computer Society.
His books include Computer: A History of the Information Machine, co-authored with William Aspray, From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry, and ICL: A Business and Technical History. He is editor of the Collected Works of Charles Babbage. Professor Campbell-Kelly is a Fellow of the British Computer Society, visiting professor at Portsmouth University, and a columnist for the Communications of the ACM. He is a member of the ACM History Committee, a council member of the British Society for the History of Mathematics, and a committee member of the BCS Computer Conservation Society. He is a member of the editorial boards of the IEEE Annals of the History of Computing, the International Journal for the History of Engineering and Technology, the Rutherford Journal, and editor-in-chief of the Springer Series in the History of Computing.

John Von Neumann – (Roll No : 15)

Leave a comment

John Von Neumann


Born 28 December 1903, Budapest, Hungary.

Died 8 February 1957, Washington DC.

Brilliant mathematician, synthesizer, and promoter of the stored program concept, whose logical design of the IAS computer became the prototype of most of its successors – the von Neumann Architecture.



John von Neumann was a famous Hungarian-American mathematician who is still revered for his unparalleled contributions to mathematics, science, economics and many other disciplines. Born into a wealthy Jewish family, he was a child prodigy with remarkable analytical and computing skills: at a very early age he could divide eight-digit numbers in his head. A passionate learner, he was interested in many subjects beyond mathematics, including physics, economics, statistics and history. He was a prominent member of the Manhattan Project and of the Institute for Advanced Study in Princeton. His unusual abilities astonished great figures such as the mathematicians Jean Dieudonné and Peter Lax and the physicist Hans Bethe. Bethe once said, “I have sometimes wondered whether a brain like von Neumann’s does not indicate a species superior to that of man.” Read through this biography to know more about the career and life of this genius.

The Institute of Electrical and Electronics Engineers (IEEE) continues to honor John von Neumann through the presentation of an annual award in his name. The IEEE John von Neumann Medal was established by the Board of Directors in 1990 and may be presented annually “for outstanding achievements in computer-related science and technology.” The achievements may be theoretical, technological, or entrepreneurial, and  need not have been made immediately prior to the date of the award.


Von Neumann’s interest in computers differed from that of his peers in that he quickly perceived the application of computers to applied mathematics for specific problems, rather than their mere application to the development of tables. During the war, von Neumann’s expertise in hydrodynamics, ballistics, meteorology, game theory, and statistics was put to good use in several projects. This work led him to consider the use of mechanical devices for computation, and although the stories about von Neumann imply that his first computer encounter was with the ENIAC, in fact it was with Howard Aiken’s Harvard Mark I (ASCC) calculator.


Quantum Mechanics

After receiving his degrees in 1926, von Neumann attended the University of Göttingen in Germany, where he worked on quantum mechanics. He was a creative and original thinker who came up with complete and logical concepts. In 1926 he worked to reconcile and improve the theories of quantum mechanics, seeking out the similarities between wave mechanics and matrix mechanics. He also worked on the rules of abstract Hilbert space and devised a mathematical structure for quantum theory.

Notable Contributions

Neumann contributed to the Los Alamos project, the American government’s programme to develop nuclear weapons, and helped devise the concept and design of explosive lenses. The mathematical modelling he used at Los Alamos helped him in the development of the modern computer. Along with gathering resources, he also funded the project for the development of the modern computer, and he worked on the architecture of the computer as well. His efforts made other scientists realize that the computer is not just a ‘bigger calculator’. Quantum logic, game theory, linear programming and mathematical statistics are some of the many contributions he made to the field of science.




Leave a comment

David A. Huffman


David Albert Huffman (August 9, 1925 – October 7, 1999) was a pioneer in the computer science field.

Throughout his life, Huffman made significant contributions to the study of finite state machines, switching circuits, synthesis procedures, and signal designs. However, David Huffman is best known for the invention of the Huffman code, a highly important compression scheme for lossless variable-length encoding. It was the result of a term paper he wrote while a graduate student at the Massachusetts Institute of Technology (MIT), where he earned a D.Sc. degree with a thesis titled The Synthesis of Sequential Switching Circuits, advised by Samuel H. Caldwell (1953).

“Huffman Codes” are used in nearly every application that involves the compression and transmission of digital data, such as fax machines, modems, computer networks, and high-definition television (HDTV), to name a few.
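The idea behind the code is a greedy one: repeatedly merge the two lowest-frequency subtrees, so that rarer symbols end up deeper in the tree and receive longer codewords. Here is a minimal illustrative sketch in Python (my own, not Huffman’s original presentation):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table for the symbols in `text`.

    Repeatedly merges the two lowest-frequency subtrees; rarer
    symbols end up deeper in the tree with longer codewords.
    """
    # Heap entries: (frequency, tiebreaker, tree). A tree is either
    # a symbol (leaf) or a (left, right) pair of subtrees.
    heap = [(f, i, s) for i, (s, f) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    # Walk the final tree, assigning 0 to left branches, 1 to right.
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix or "0"  # single-symbol edge case
    walk(heap[0][2], "")
    return codes
```

For the text "aaaabbc", the frequent symbol 'a' gets a one-bit code while 'b' and 'c' get two bits each, so the whole text compresses to 10 bits instead of the 14 a fixed two-bit code would need.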


A native of Ohio, Huffman earned his B.S. in electrical engineering from Ohio State University at the age of 18 in 1944. He then served in the U.S. Navy as a radar maintenance officer on a destroyer that helped to clear mines in Japanese and Chinese waters after World War II. He subsequently earned his M.S. degree from Ohio State in 1949 and his Ph.D. from MIT in 1953, also in electrical engineering.

Huffman joined the faculty at MIT in 1953. In 1967, he went to University of California, Santa Cruz as the founding faculty member of the Computer Science Department. He played a major role in the development of the department’s academic programs and the hiring of its faculty, and served as chair from 1970 to 1973. He retired in 1994, but remained active as an emeritus professor, teaching information theory and signal analysis courses.

Huffman made important contributions in many other areas, including information theory and coding, signal designs for radar and communications applications, and design procedures for asynchronous logical circuits. As an outgrowth of his work on the mathematical properties of “zero curvature Gaussian” surfaces, Huffman developed his own techniques for folding paper into unusual sculptured shapes (which gave rise to the field of computational origami).



Huffman’s accomplishments earned him numerous awards and honors. Most recently, he received the 1999 Richard Hamming Medal from the Institute of Electrical and Electronics Engineers (IEEE) in recognition of his exceptional contributions to information sciences. He also received the Louis E. Levy Medal of the Franklin Institute for his doctoral thesis on sequential switching circuits, a Distinguished Alumnus Award from Ohio State University, and the W. Wallace McDowell Award. He was a charter recipient of the Computer Pioneer Award from the IEEE Computer Society, and he received a Golden Jubilee Award for Technological Innovation from the IEEE Information Theory Society in 1998.

David Huffman died in 1999 after a 10-month battle with cancer.

Huffman never tried to patent an invention from his work. Instead, he concentrated his efforts on education. In Huffman’s words, “My products are my students.” 


Brian Wilson Kernighan (Roll No:34)

Leave a comment




Brian Wilson Kernighan was born in Toronto, Ontario, Canada in 1942. He is currently employed by Princeton University. He wrote The C Programming Language and The Practice of Programming, and has worked as a computer scientist and programmer. Kernighan coined the term Unix in the 1970s: the original term he coined was Unics (for Uniplexed Information and Computing Service, a play on Multics), which was later changed to Unix. He worked at Bell Labs alongside Unix creators Ken Thompson and Dennis Ritchie and contributed to the development of Unix. He is also coauthor of the AWK and AMPL programming languages; the ‘K’ of K&R C and the ‘K’ in AWK both stand for ‘Kernighan’. Brian Kernighan is currently a Professor in the Computer Science Department of Princeton University, where he is also the Undergraduate Department Representative.
Brian W. Kernighan is head of the Computing Structures Research Department, Bell Laboratories, Murray Hill, New Jersey. He received a B.A.Sc. in engineering physics from the University of Toronto in 1964, and a Ph.D. in electrical engineering from Princeton University in 1969. Since joining Bell Labs in 1969, he has worked in combinatorial optimization, document preparation systems, programming languages, and software tools. His current research interests are in application-oriented programming languages, programming methodology, and user interfaces.
Dr. Kernighan is the co-author of several books, including “The C Programming Language” and“The UNIX Programming Environment”.
Brian Kernighan is in the Computing Science Research Center at Murray Hill, where he has been in the same office since 1969. (It has been painted once.) He writes programs and occasionally books. The latter are better than the former, and certainly need less maintenance.
Kernighan was the software editor for Prentice Hall International. His “Software Tools” series spread the essence of ‘C/Unix thinking’ with makeovers for BASIC, FORTRAN, and Pascal – and most notably his ‘Ratfor’ (rational FORTRAN) was put in the public domain.


Here is a summary of the very important programming style tips from Brian Kernighan’s 1994 guest CS50 lecture.
• Say what you mean, simply and directly.
• Use the “telephone test” for readability.
• Write clearly – don’t be too clever.
• Don’t use conditional expressions as a substitute for a logical expression.
• Parenthesize to avoid ambiguity.
• Each time you make a test, do something.
• Follow each decision as closely as possible with its associated action.
• Use the good features of a language; avoid the bad ones.
• Capture regularity in control flow, irregularity in data.
• Each module should do one thing well.
• Make sure comments and code agree.
• Don’t just echo the code with comments – make every comment count.
• Don’t comment bad code – rewrite it.
• Use symbolic constants for magic numbers.
• Watch out for side effects and order of evaluation.
• Macros are not functions.
• Watch out for off-by-one errors.
• Test programs at their boundaries.
• Program defensively.
• Make sure input cannot violate the limits of the program.
• Make it right before you make it faster.
• Keep it right when you make it faster.
• Don’t sacrifice clarity for small gains in “efficiency.”
• Don’t stop with your first draft.
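Two of these tips – “use symbolic constants for magic numbers” and “don’t use conditional expressions as a substitute for a logical expression” – are easy to illustrate. Here is a tiny before/after sketch of my own (not from the lecture), in Python:

```python
# Before: magic numbers, and a conditional expression that merely
# restates a boolean.
def is_leap_before(y):
    return True if (y % 4 == 0 and y % 100 != 0) or y % 400 == 0 else False

# After: named constants say what the numbers mean, and the
# condition itself *is* the boolean result.
LEAP_CYCLE = 4        # a leap year every 4 years...
CENTURY = 100         # ...except century years...
GREGORIAN_CYCLE = 400 # ...except every 400 years.

def is_leap(year):
    return (year % LEAP_CYCLE == 0 and year % CENTURY != 0) or (
        year % GREGORIAN_CYCLE == 0
    )
```

Both versions compute the same answer; the second says what it means, simply and directly.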


Key terms and concepts: a cost matrix plus connectivity matrix models the system • the measure is the cut cost, or cut weight • be careful to distinguish external edge cost and internal edge cost • net-cut partitioning and edge-cut partitioning • hypergraphs with stars and hyperedges model connections better than edges • the Fiduccia–Mattheyses algorithm uses linked lists to reduce the complexity of the K–L algorithm and is very widely used • base logic cell • balance • critical net



 (a) The network contains a net y with three terminals.
 (b) In the network hypergraph we can model net y by a single hyperedge (B, C, D) and a star node.
 Now there is a direct correspondence between wires or nets in the network and hyperedges in the graph.



(a) Shows how swapping node 1 of partition A with node 6 of partition B results in a gain of g = 1.
(b) A graph of the gain resulting from swapping pairs of nodes.
(c) The total gain is equal to the sum of the gains obtained at each step.



(a) An example network graph.

(b) The connectivity matrix, C; the columns and rows are labeled to help you see how the matrix entries correspond to the node numbers in the graph. For example, C17 (column 1, row 7) equals 1 because nodes 1 and 7 are connected. In this example all edges have an equal weight of 1, but in general the edges may have different weights.
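The swap gain in the Kernighan–Lin algorithm can be computed directly from such a connectivity matrix: for a node v, let D(v) be its external cost minus its internal cost; the gain of swapping a in partition A with b in partition B is then g = D(a) + D(b) − 2·C[a][b]. A minimal illustrative sketch (mine, not the text’s code):

```python
def swap_gain(C, A, B, a, b):
    """Kernighan-Lin gain for swapping node a (in partition A)
    with node b (in partition B), given connectivity matrix C.

    D(v) = external cost - internal cost of v;
    gain = D(a) + D(b) - 2*C[a][b].
    """
    def D(v, own, other):
        external = sum(C[v][u] for u in other)      # edges crossing the cut
        internal = sum(C[v][u] for u in own if u != v)  # edges inside own partition
        return external - internal
    return D(a, A, B) + D(b, B, A) - 2 * C[a][b]

# Example: nodes 0,1 in A and 2,3 in B; node 0 is connected to both
# 2 and 3, so moving it across the cut reduces the cut cost.
C = [[0, 0, 1, 1],
     [0, 0, 0, 0],
     [1, 0, 0, 0],
     [1, 0, 0, 0]]
g = swap_gain(C, {0, 1}, {2, 3}, 0, 2)  # cut drops from 2 to 1, so g = 1
```

The −2·C[a][b] term accounts for any edge between a and b themselves, which crosses the cut both before and after the swap.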


  • The C Programming Language home page.
  • The Practice of Programming home page.
  • The Unix Programming Environment home page.
  • The AWK Programming Language home page.
  • AMPL: A Modeling Language for Mathematical Programming home page.
  • Scanned pages of An Efficient Heuristic Procedure for Partitioning Graphs (Bell System Technical Journal, February, 1970); 750KB pdf.
  • Scanned pages of An Effective Heuristic Algorithm for the Travelling-Salesman Problem (Operations Research, March, 1973); the format needs work.
  • Why Pascal is Not My Favorite Programming Language (April, 1981).
  • WiSE – A Wireless System Engineering Tool, an application of computational geometry, optimization and visualization to wireless.
  • Experience with Tcl/Tk for Scientific and Engineering Visualization, programming issues in the wireless work above; a version appeared in the Tcl/Tk Workshop, Toronto, 1995. (Careful: 11Mb of Postscript when unzipped.)
  • Extracting Geometric Information from Architectural Drawings, with Chris Van Wyk, from the Workshop on Applications of Computational Geometry, Philadelphia, May, 1996. (630K)
  • Timing Trials, or, the Trials of Timing: Experiments with Scripting and User-Interface Languages, with Chris Van Wyk, describes experiments to see how fast various scripting and user interface languages, from Awk to Visual Basic, run on a spectrum of representative tasks. Postscript version (250KB). The tests themselves are also available, as is the input data (1.7Mb). Updated 11/30/97.
  • “An AWK to C++ Translator” (Postscript) describing an early experiment; published in the Usenix C++ conference in 1991 but hard to find.
  • Bibliography in bibtex format or in html format.


Information about the AMPL modeling language for mathematical optimization.

 I have had trouble updating this code, so the current version (October 23, 2007) is found at Princeton: newest Awk source

✔ Source for the one true awk, updated May 1, 2007. Shell archive; gzipped tar file; zip file. There is also a Windows executable.

  Examples from The AWK Programming Language by Aho, Kernighan, and Weinberger as text (120KB) or zipped (30KB).

All the example code from The Unix Programming Environment by Kernighan and Pike: gzipped tar file; zip file.

  Source for the hoc calculator from The Unix Programming Environment by Kernighan and Pike.

 Software Tools programs

Software Tools in Pascal programs

✔  Source for learn, the original Unix computer-aided instruction program. This works but is mostly of historical interest; it dates from about 1979. (If your version of the ar command gives you trouble, this version has the files already extracted and does not use ar.)

✔  A very dusty version of code for algorithm animation (with Jon Bentley).

✔  Code for timing tests (with Jon Bentley and Chris Van Wyk, updated 7/96).

 Miscellaneous ancient typesetting code, including chem and indexing tools (with Jon Bentley).


The most effective debugging tool is still careful thought, coupled with judiciously placed print statements.

Brian W. Kernighan, in the paper Unix for Beginners (1979)

Controlling complexity is the essence of computer programming.

Brian W. Kernighan

“Everyone knows that debugging is twice as hard as writing a program in the first place. So if you’re as clever as you can be when you write it, how will you ever debug it?”

Brian W. Kernighan

“I’d give my right arm to be ambidextrous.”

Brian W. Kernighan
