Today there are hundreds of programming languages in existence, and a thorough examination of them all, or even a full list, would fill a book rather than a single article. But we can survey the evolution of languages to get a general sense of the past, present, and future of computer programming.
Where to begin?
While the obvious answer to the question is “At the beginning,” where exactly that is may be open to question. We’re used to thinking of computers as a late 20th-century phenomenon, because that was when they took off at the consumer level, which would mean starting with Autocode.
But major language innovations emerged earlier than that. In fact, the very beginning predates the last century altogether.
The first programmer was a woman
We’d have to go way back to 1883 to find what some consider the first computer program. That’s the year in which Ada Lovelace (the only legitimate child of Lord Byron, the Romantic poet known to be “mad, bad and dangerous to know”) set out an algorithm for Charles Babbage’s Analytical Engine with the goal of calculating Bernoulli numbers.
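Lovelace’s notes walked through, step by step, how the Engine could produce these numbers. As a point of comparison, here is a minimal modern sketch of the same goal in Python, using the Akiyama–Tanigawa algorithm rather than her original method:

```python
from fractions import Fraction

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions,
    using the Akiyama-Tanigawa algorithm (B_1 = +1/2 convention)."""
    a = [Fraction(0)] * (n + 1)
    result = []
    for m in range(n + 1):
        a[m] = Fraction(1, m + 1)
        # Update the working row in place, right to left.
        for j in range(m, 0, -1):
            a[j - 1] = j * (a[j - 1] - a[j])
        result.append(a[0])  # a[0] is now B_m
    return result

# bernoulli(4) -> [1, 1/2, 1/6, 0, -1/30]
```

A call like `bernoulli(4)` yields the exact values B₀ through B₄, the kind of table Lovelace’s program was designed to produce on mechanical hardware.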
It would take nearly 70 more years to arrive at the next major development, during which time another woman would have a major impact on the industry.
The middle of the century
Autocode was developed by Alick Glennie for the Mark 1 computer at the University of Manchester in the U.K. in 1952. It is considered to be the first compiled computer programming language.
A few years later John Backus created the programming language FORTRAN, which stands for Formula Translation, to work through complex scientific, mathematical, and statistical problems. It is still used today. Which year to assign to it is not altogether clear.
Generally the year 1957 appears. However, the article “Why physicists still use Fortran” points out that 1954 was the year in which Backus wrote the “original specification” for the language.
If you have actually read Margot Lee Shetterly’s book Hidden Figures and not just relied on the film version, you’d know that FORTRAN was actually taught to employees, and that the on-site classes were open to all races. Consequently, there would have been no reason for Dorothy Vaughan to steal a library book on the subject in order to learn the language.
In 1958, a committee created ALGOL, which stands for Algorithmic Language. While the language is not familiar to us today, it is considered a foundation for ones that are, such as C, Java, and Pascal.
In 1959, a programming language based on the design work of Grace Hopper emerged. To learn more about the woman known as “the grandmother of COBOL,” see the video below:
COBOL, which stands for common business-oriented language, was “created as part of a US Department of Defense effort to create a portable programming language for data processing.” At the time, its creators thought it would only be used as a temporary measure, but the language proved to have amazing endurance.
While few people study COBOL anymore, it is still used to maintain legacy infrastructure on mainframe computers. For that reason, skills in this language were very much in demand just before the Y2K panic.
For those who have no memory of that time, see the video below:
In 1964 BASIC, which stands for Beginner’s All-purpose Symbolic Instruction Code, was developed by John G. Kemeny and Thomas E. Kurtz at Dartmouth College. The goal was to enable students without strong technical skills to make use of computers. BASIC was later adapted by Microsoft for personal computers, bringing the language to students of the next generation.
Getting to C level
C++ is an extension of C, the language Dennis Ritchie developed at Bell Labs in the early 1970s, and was created in 1983 by Bjarne Stroustrup. As Guilherme Torres Castro explained in a Medium post, “Large portions of Mac OS/X, all major Adobe applications and Google all employ C++ language.”
The next iteration of that letter is C#, pronounced C Sharp, a kind of progression from the other two C languages that Microsoft designed in the early part of this century. It’s no surprise, then, that it is the language of choice for Microsoft applications, though it is also used in “a wide range of enterprise applications that run on the .NET” framework.
Other familiar languages from the last decade of the 20th century
Everyone currently up on computer science is familiar with Python, which ranks among the top three languages on most lists, largely due to its adoption for data science projects. Guido van Rossum developed Python in 1991 and named it for the British comedy group Monty Python.
Java was born around the same time, and it became very popular early on. Oracle provides this history of the language, which is now incorporated into its brand:
In 1991, a small group of Sun engineers called the “Green Team” believed that the next wave in computing was the union of digital consumer devices and computers. Led by James Gosling, the team worked around the clock and created the programming language that would revolutionize our world – Java.
The Green Team demonstrated their new language with an interactive, handheld home-entertainment controller that was originally targeted at the digital cable television industry. Unfortunately, the concept was much too advanced for the team at the time. But it was just right for the Internet, which was just starting to take off. In 1995, the team announced that the Netscape Navigator Internet browser would incorporate Java technology.
The first version of Ruby was released at the end of 1995. There have been several iterations since. The origin story is that Yukihiro Matsumoto (“Matz”) wanted to develop an object-oriented scripting language that was better than what was already available. Ruby is used to build websites and mobile apps. To expand its reach beyond its native Japan, Matz set up an English homepage for Ruby in 1998. You often hear Ruby paired with Rails, its add-on framework for rapid development, which requires less coding to build web apps.
In the 21st century
Go emerged at Google and became an open-source project in November 2009. It was intended to improve the working environment for programmers so they could write, read, and maintain large software systems more efficiently. The project first started in 2007 and was developed by a number of people before advancing to something usable.
In 2014 Apple introduced Swift, which makes it a fairly recent addition to the roster of computer languages. In Apple’s words: “Swift is a powerful and intuitive programming language for macOS, iOS, watchOS, tvOS and beyond. Writing Swift code is interactive and fun, the syntax is concise yet expressive, and Swift includes modern features developers love.”
Which languages will be in use in the future?
While it may be possible to extrapolate from the current top-ranked languages on the TIOBE Index which are likely to remain in vogue, the larger context provides a warning: what is tops one year may rank near the bottom some years later.
Such is the object lesson of Pascal. The language, which is named after the French mathematician Blaise Pascal, was developed by Niklaus Wirth in 1970. Despite its virtues of reliability and efficiency, it is now rarely used.
In fact, on the TIOBE Index, Pascal holds the unenviable rank of #220 for 2019. That’s a huge drop from its rank of 16th place just five years ago. Even more dramatic is its decline from its high rank as the third most popular language in 1994. It seems computer languages are somewhat like celebrities: some linger in the limelight for decades, while others fade away into obscurity.
Castro offers some of his own takes, with a prediction of increasing attention for languages that target LLVM (Low Level Virtual Machine). He lists some of the relevant languages: ActionScript, Ada, C#, Common Lisp, Crystal, CUDA, D, Delphi, Fortran, Graphical G Programming Language, Halide, Haskell, Java bytecode, Julia, Kotlin, Lua, Objective-C, OpenGL Shading Language, Pony, Python, R, Ruby, Rust, Scala, Swift, and Xojo.
His advice then is to remember that success in development is not a function of “specific technological skills.” Rather, it’s about mastering “solid principles that transcend any particular technology.”