Lesson 10: A Brief History of Computer Languages

This course would not exist without there first being computer languages. However, those languages first had to be invented, and in the early days of electronic computers, there was only one kind of language a computer understood: its own machine code. These are raw instructions, originally stored on punched cards much like the ones FORTRAN programmers would later use. However, these cards were punched with arbitrary-seeming binary symbols that stood for instructions the computer would perform, such as adding, moving data from place to place, or making a comparison and jumping elsewhere in the program if that comparison came out true. To a human, these were illegible without training.


[Commodore (Admiral) Grace M. Hopper, Photo by James S. Davis, United States Navy, 1984]

And so enters Grace Brewster Murray Hopper. Born Grace Brewster Murray on December 9, 1906, she was a brilliant mathematician, graduating from Vassar College with a BA in both mathematics and physics, and going on to Yale to complete her MA and PhD. That mathematics PhD alone made her a rarity at the time, but she would go on to show that her creativity and persistence could pay off in spades.

Entering Harvard's Cruft Laboratories in the late years of WWII, she helped in the construction and programming of that lab's Mark I, II, and III computers. Under her leadership, code written by programmers was shared and distributed to the whole staff, to be re-used in future programs (similar to modern libraries and modules), and by 1949, mnemonics for the common instructions were in use to make code more human-readable.

The Admiral and her team would then work on the first ever compiler, named simply A-0, which would translate entirely human-readable instructions into machine code. Combined with the various program libraries collected on magnetic tape data storage, this allowed a program to read the necessary subroutines (much like module functions) from the tape as the program requested them. She published the description of A-0 in 1952 as "The Education of a Computer". But this compiler was not enough for Admiral Hopper; she wanted programming to be available to anyone, not just the highly skilled technicians she worked with.

The journey ahead led to the UNIVAC computers, and the creation of a new compiler, initially named B-0. Likely for obvious reasons, the name didn't stick, and was rapidly replaced with FLOW-MATIC, a language and compiler intended for billing, accounting, and payroll systems. FLOW-MATIC was not a complete programming language, but it allowed the UNIVAC I and II computers to understand twenty statement types and produce correct results. The Admiral pushed for a full compiler using English-like statements, but was told she couldn't do this, because computers don't understand English. Over the next few years, she proved that you could, if you first figured out how.


Admiral Grace Hopper's presence in the meetings of CODASYL, which worked to create a common business language, was vital to the creation of COBOL, the first computer language to see full use in commerce. Her work with FLOW-MATIC created the template for COBOL as it became a full language, with the first COBOL specification appearing in 1959. Her presentation and marketing skills proved invaluable in convincing businesses to take up the new language, and it soon spread to all sectors of American business. She then went on to work with the Navy to standardize the compiler and to create validation systems to ensure others' implementations would be correct. This set the stage for compiler standards in all later languages.

COBOL code is not particularly nice to write, as full statements are spelled out very verbosely. COBOL is intended to be highly readable and nearly self-documenting, but it can be tedious to write at times. Additionally, data had to be set up in advance, similarly to machine code, rather than declared as variables when needed, the way we do today.

IDENTIFICATION DIVISION.
PROGRAM-ID. Multiplier.
AUTHOR. Michael Coughlan.
* Example program using ACCEPT, DISPLAY and MULTIPLY to
* get two single digit numbers from the user and multiply them together

DATA DIVISION.
WORKING-STORAGE SECTION.
01 Num1   PIC 9  VALUE ZEROS.
01 Num2   PIC 9  VALUE ZEROS.
01 Result PIC 99 VALUE ZEROS.

PROCEDURE DIVISION.
 DISPLAY "Enter first number (1 digit) : " WITH NO ADVANCING.
 ACCEPT Num1.
 DISPLAY "Enter second number (1 digit) : " WITH NO ADVANCING.
 ACCEPT Num2.
 MULTIPLY Num1 BY Num2 GIVING Result.
 DISPLAY "Result is = ", Result.
 STOP RUN.

[Example COBOL multiplier script from http://www.csis.ul.ie/cobol/examples/ last updated 2002]


Not long after Grace Hopper's compiler, IBM computer programmers led by John Backus put together their own language and compiler, for a different audience. Their language, intended for mathematics and physical simulations, was based on FORmulas that would be TRANslated into machine code for the computer's use, thus, FORTRAN.

This language, from 1957, introduced greater portability of code between computer architectures, as well as being simple to learn and less rigidly defined. Variables could be declared in a more familiar modern way, rather than COBOL's machine-language-like structure. FORTRAN also introduced the concept of subroutines that return values as functions, a name that has stuck for most languages since.

FUNCTION func_name(a, b)
    INTEGER :: func_name
    INTEGER :: a
    REAL    :: b
    func_name = (2*a)+b
END FUNCTION

PROGRAM calling_func
    INTEGER :: func_name
    PRINT *,func_name(2, 1.3)
END PROGRAM

[An example of FORTRAN designed to be F77 compatible, from https://en.wikibooks.org/wiki/Fortran/Fortran_procedures_and_functions last updated 2016]


Meanwhile, in the field of Artificial Intelligence, John McCarthy and colleagues at Dartmouth worked on a new language, intended to make AI construction easier. Their work was intended for use on the IBM 704 computer, a choice that greatly influenced the language's design. The List Processing language, Lisp, thus uses the rather cryptic commands car and cdr to refer to parts of a given "word" in the 704's data store, which correspond to the data and next-address portions of the linked list structures it uses for nearly everything. Lisp's first implementations saw use in 1958.

Lisp uses a prefix notation for all its operations. Instructions are wrapped in parentheses () and the parts separated by spaces. The first item inside the parentheses is always the "function" part of the instruction, whether it's a user-defined function, a built-in function, or one that you'd normally think of as an operator. For instance, (+ 2 3) is an instruction to add 2 to 3, and the return value is 5.

Lisp also has function recursion baked in as a safe way of handling data, so a function can return a value that is the result of calling itself with different input. If you try this in most languages, the result can be a "stack overflow", as the program keeps adding function information to the computer's memory with each recursive call until it runs out of stack space. In "tail-call optimized" languages, including Lisp dialects such as Scheme, when a function's last action is to call another function, it's understood that the old function information is no longer needed, so the computer discards it and inserts the information for the new call in its place, saving the stack from destruction.

The example below demonstrates how compact this can make the code, while also showing how difficult it can sometimes be to keep track of all the nested parentheses in this language.

(defun factorial (N)
  "Compute the factorial of N."
  (if (= N 1)
      1
    (* N (factorial (- N 1)))))

[Example Lisp program that defines a function for calculating factorials recursively. Dr. Philip Fong, from https://www.cs.sfu.ca/CourseCentral/310/pwfong/Lisp/1/tutorial1.html]

Lisp also introduced a concept familiar to any Python programmer: the Read-Eval-Print Loop, or REPL. Entering Lisp statements into the REPL allowed the statements to be immediately evaluated, and their results printed, making the otherwise somewhat obtuse language much easier to understand.


In the 1960s, new methods for arranging computer memory and programs were being considered. Most languages used the "imperative" model that was closest to the machine code, while Lisp held a functional structure (one where every statement in the language could be considered a kind of function). But all data in these languages was handled either with global variables, or with local variables within functions and subroutines. So, how would they handle more complex kinds of data, with built-in functionality attached? As you'll see in Lesson 11, the answer to this exists in Python: Objects. But where did the concept start?

Here, Alan C. Kay comes into play. He would go on to invent Smalltalk, but first he had to find the inspiration for objects. An early system with this structure was the Sketchpad interface, which let a user draw something on a terminal and send that graphics data to other users of the system on their own terminals. The graphics data was stored in "instances" of "masters" that described what those instances would do. The idea enthralled Kay when he encountered the documentation in 1966, despite how hard it was to understand. He realized it had far greater possibilities than just this one device.

He would go on to work with colleagues on the FLEX language and its corresponding devices, intended as a move towards personal computers. They even managed to create a kind of flat-panel display, back in 1968, a variant of what we now call plasma displays. There was no way, yet, to make a FLEX machine capable of fitting onto the back of the display, but the dream of a portable computer was already with them: the concept of the Dynabook, a computer the size of a college notebook that might well have wireless networking (since ARPA was experimenting with the concept at the time). At the same time, children in another colleague's classes were learning an educational language called Logo, and getting deep into a discipline most would have assumed only adults could grasp. These dreams, of personal, perhaps portable computers, and of even children being able to understand programming, were what drove the creation of Smalltalk.

Putting these concepts together, Kay and his team at Xerox built just such a small, personal computer, the miniCOM, and, to run it, a programming language he called Smalltalk. Due to the year it was made, this version is now called Smalltalk-71. Its entire concept was based on objects and the passing of "messages" to those objects; all statements in Smalltalk were messages of this kind, written after the names of the objects (and sometimes nesting other messages inside them).

Kay's work would continue through multiple other computers, expanding the Smalltalk concept into a complete language that compiled directly to machine code, and attaching it to the first modern-style graphical user interface (GUI), with a "window" model for handling application tasks. Yes, Windows was far from the first system to have "windows", and the Macintosh system would be directly inspired by Smalltalk's demonstrations.

The "release version" of Smalltalk, Smalltalk-80, would appear in 1980, around the same time that Kay's colleague Larry Tesler decided Xerox would never realize the true potential of their work. Apple had offered to hire him, and Tesler took the offer. Kay, meanwhile, took a sabbatical to think over the future of the project. He would later also be hired at Apple Computer.

factorial "a method understood by integers"
    (self = 0)
        ifTrue: [ ^ 1 ].
    (self < 0)
        ifTrue: [ self error: 'Factorial is not defined' ]
        ifFalse: [ ^ self * (self - 1) factorial ]

[Factorial function example in Smalltalk-80, by Loren K. Rhodes, PhD, from http://jcsites.juniata.edu/faculty/rhodes/lt/smalltalk.htm last updated 2015]


Almost every modern computer system contains some amount of C code, if nowhere else, then deep in its kernel. The core operating system, most hardware drivers, and a huge amount of supplemental programs are written in a combination of machine code and C. So, with it being essentially everywhere, how did C come about?

In Bell Labs, a small group led by Ken Thompson worked to create a new operating system, which would one day be known as Unix. This work drew on a systems programming language called BCPL, designed by Martin Richards in the mid-1960s and already used for operating systems development. In 1969, Thompson crunched the structure of BCPL's compiler down to 8K of memory so it would run better on the PDP-7 machines they were using, creating the B language. Thompson then wrote B's compiler in B itself, creating a "bootstrapped" version.

Later, in 1971, Dennis M. Ritchie would create a "New B" with types, extending the original B system. Rapidly thereafter, as NB was further refined, the resulting C language came into existence in 1973. It was during this time that one particular quirk of the C language came about: the separation of the bitwise operators & and | for "and" and "or" from their logical counterparts && and ||, which had originally been the same symbols. Because of that history, & and | kept their old precedence, binding more loosely than the comparison operators, rather than more tightly than comparisons as they do in the later Python language.

The primary advantage C had over most other non-assembly languages at the time was its direct connection to memory. Libraries could be dynamically linked to the program, allowing access to their positions in memory, and memory "pointers" could touch positions in RAM to read and write data, either simply for the program's own use, or mapped to files or devices to allow more complicated input/output operations. This made C more or less a "portable assembly", usable on as many different computer systems as possible.

C would go on to be bootstrapped like B, standardized, and put to use building Unix, Linux, BSD, OSX, and the various libraries used in those OSes.


#include <stdio.h>

int factorial(int n);

int main() {
    int n;

    printf("Enter a number to find its factorial: ");
    scanf("%d", &n);

    printf("%d! = %d\n", n, factorial(n));

    return 0;
}

int factorial(int n) {
    int i;
    int r = 1;
    if (n == 0) {
        return 1;  /* 0! is 1 */
    }
    for (i = 1; i <= n; i++) {
        r *= i;
    }
    return r;
}

[Example of C code that generates a factorial from user input, Aubrey Smith, 2017]

Python's Lineage

Parts of Python can be seen in all of the above languages. Without A-0, there would be no FORTRAN; without FORTRAN, much of the function and mathematical operation sets common to Python and C would be very different. Without Lisp, the familiar REPL might not exist. Without Smalltalk, modern Object-Oriented Programming, particularly Python's kind, would not exist. And without C, neither would many of the system libraries Python relies on, nor, likely, the operator precedence rules we rely on (with Python's improvements).

That said, Python is hardly the end-game for language design. There are many other languages which are still in use, including all of the languages given here. There are also new languages being designed today, some of which might become as popular as any of these have been.

Regardless of what language you use, algorithms are algorithms. You're asking the computer to do certain things with its input, and produce an output accordingly. The language you program in won't stop you from implementing an algorithm, it just changes the symbols you use to do it.

Since all languages have some amount of specialization, there will never be one best language to use for all possible programs. Your best option in your career as a programmer is to learn many languages, and get good at learning new ones. You never know what you might need to use in a future project.


Author Unknown, Grace Hopper Celebration of Women in Computing 1994 conference proceedings, 1994, Retrieved from http://www.cs.yale.edu/homes/tap/Files/hopper-story.html on May 20, 2017.

Author Unknown, The FORTRAN Programming Language, 1999, Retrieved from http://groups.engin.umd.umich.edu/CIS/course.des/cis400/fortran/fortran.html on May 20, 2017.

John McCarthy, History of Lisp, Stanford University, 1979, Retrieved from http://www-formal.stanford.edu/jmc/history/lisp/lisp.html on May 20, 2017.

Alan C. Kay, The Early History Of Smalltalk, Apple Computer, 1993, Retrieved from http://worrydream.com/EarlyHistoryOfSmalltalk/ on May 20, 2017.

Dennis M. Ritchie, The Development of the C Language, Bell Labs/Lucent Technologies, 1993, Retrieved from https://www.bell-labs.com/usr/dmr/www/chist.html on May 20, 2017.

Exercise: Quiz Yourself (Computer Languages)

After reading the above history (or reviewing it again now), ask yourself the following questions about what you read, and make sure you know how to answer them correctly.

  1. When Grace Hopper suggested the concept of a fully functional computer language compiler, what was the reason given by superiors for why it supposedly wasn't worth doing?

  2. What was the name of the first ever compiler, as created by Grace Hopper?

  3. What was the purpose of the FLOW-MATIC language Grace Hopper made after that compiler?

  4. What language did the CODASYL meetings eventually produce, with Admiral Hopper's guidance?

  5. What does FORTRAN stand for?

  6. What was the intended purpose of FORTRAN?

  7. What modern features first appeared in FORTRAN?

  8. Why does Lisp have the car and cdr functions, and other similarly named things?

  9. What did Lisp give to Python's design?

  10. What does the name Lisp mean?

  11. What was Smalltalk's main contribution to language design?

  12. When demonstrating Smalltalk, what major innovation did Kay's team demonstrate that all personal computer operating systems would later mimic?

  13. What was the purpose of the C language?

  14. What model of machine was the precursor to C, known as B, originally made for?

  15. What makes C more useful for programming operating systems than, say, Smalltalk or Python?

  16. What is the best language a programmer should use for every future program?