Programming language
A programming language is a formal system of notation used to write computer programs, enabling humans to communicate structured instructions to machines. These languages are defined by their syntax, which governs their form, and semantics, which determines their meaning. Programming languages offer features such as type systems, variables and mechanisms for handling errors or exceptional conditions. To execute a program, an implementation of the language is required—either an interpreter, which directly runs the source code, or a compiler, which translates it into an executable form. Over time, programming languages have evolved alongside computer architecture, gradually increasing in abstraction to make programming more accessible and widely applicable.
Definitions and Theoretical Context
Programming languages differ fundamentally from natural languages. While natural languages are used for communication between people, programming languages are designed specifically to convey instructions to computers. The broader term computer language is sometimes used interchangeably, though some authors use it to include specification languages, markup languages and other formats used in computing that do not necessarily encode algorithms.
Most practical programming languages are Turing complete, meaning they can compute any algorithm that a Turing machine can, given sufficient resources. Theoretical distinctions sometimes define programming languages as idealised constructs for abstract machines, while computer languages represent the subset implemented on finite hardware. Formal specification languages and even input formats may be considered programming languages when they influence program behaviour, reflecting the flexible boundaries of the concept.
Programming language theory is the academic discipline that studies language design, implementation and classification, analysing the trade-offs between expressiveness, safety, portability and performance.
Early Developments
The earliest programmable computers, appearing in the late 1940s, were controlled using machine language, a first-generation programming language consisting of numeric instructions directly executed by the processor. Machine code was difficult to debug, hardware-specific and cumbersome to work with.
To improve usability, assembly languages—second-generation languages—were created. They replaced numeric operation codes with symbolic mnemonics, making programs easier to write and understand while retaining close ties to hardware. However, these languages still lacked portability between systems.
The major transformation came with the introduction of high-level languages in the 1950s. These third-generation languages abstracted away hardware details, enabling programmers to express algorithms in symbolic, human-readable form. In 1957, Fortran (FORmula TRANslation) became the first widely used high-level language; its optimising compiler demonstrated that automatically translated code could approach the efficiency of hand-written machine code. Arithmetic expressions could now be written directly, with translation into machine code handled automatically.
Developments in the 1960s and 1970s
From the 1960s, the emergence of mainframe computers and the continued scarcity of computing resources influenced language design. Programs were typically input via punched cards, limiting interaction during execution.
Two major paradigms emerged during this period:
- Functional programming, exemplified by Lisp (1958), introduced recursion, conditional expressions, dynamic memory allocation and automatic garbage collection. Lisp dominated early artificial intelligence research.
- Imperative programming, influenced by the release of ALGOL (1958 and 1960), became the basis for many later languages. ALGOL introduced block structure, lexical scope and the use of formal grammars such as Backus–Naur form. Although it achieved limited commercial success, its design shaped subsequent imperative languages including C, Pascal, Ada, Java and C#.
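The two innovations above can be sketched briefly in modern terms. The following Python fragment (illustrative only; neither Lisp nor ALGOL syntax) shows recursion with a conditional expression, as introduced by Lisp, and lexical scope, as popularised by ALGOL:

```python
# Recursion with a conditional expression, in the style Lisp introduced:
# the function is defined in terms of itself, choosing a branch by condition.
def factorial(n):
    return 1 if n <= 1 else n * factorial(n - 1)

# Lexical scope, as popularised by ALGOL: the inner function resolves
# `base` in the enclosing block where it was defined, not where it is called.
def make_adder(base):
    def add(x):
        return base + x  # `base` comes from the enclosing lexical scope
    return add

add_five = make_adder(5)
print(factorial(5))   # 120
print(add_five(3))    # 8
```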
Simula, developed in the 1960s, extended ALGOL with notions such as classes, inheritance and dynamic dispatch, becoming the first object-oriented programming language. Prolog, created in 1972, pioneered logic programming, allowing programmers to state desired outcomes and delegate the method of achieving them to the system.
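The three object-oriented notions Simula introduced can be sketched in Python (the class names here are illustrative, not drawn from Simula itself):

```python
# Classes encapsulate data with the operations that act on it.
class Shape:
    def area(self):
        raise NotImplementedError

# Inheritance: Square and Circle extend Shape and refine its behaviour.
class Square(Shape):
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side ** 2

class Circle(Shape):
    def __init__(self, radius):
        self.radius = radius
    def area(self):
        return 3.14159 * self.radius ** 2

# Dynamic dispatch: the same call, s.area(), selects the method belonging
# to each object's actual class at run time.
shapes = [Square(2), Circle(1)]
print([round(s.area(), 2) for s in shapes])  # [4, 3.14]
```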
The invention of the microprocessor in the 1970s made computers far cheaper, encouraging interactive use and the development of languages oriented towards individual users rather than batch processing.
Programming Languages from the 1980s to the Early 2000s
The arrival of the personal computer in the 1980s broadened the purposes for which programming languages were used. New languages introduced during this period included:
- C++, extending C with classes and object-oriented features while retaining compatibility with C.
- Ada, designed with strong typing and built-in support for concurrency.
- Fourth-generation languages associated with database manipulation and application generation.
The Japanese Fifth Generation Computer Systems project sought to advance logic programming languages with enhanced concurrency features, though these efforts were eventually eclipsed by languages with more practical concurrency mechanisms.
During the 1990s, the rapid expansion of the Internet prompted the creation of languages suited to distributed systems and web development. Java, intended to be portable, secure and platform-independent, achieved significant success, reflecting the needs of emerging networked applications.
Scripting languages such as Python, JavaScript, PHP and Ruby gained popularity for rapidly developing small programs and coordinating existing applications. With their integration into web technologies, they became central tools for building dynamic web pages and server applications.
Language Design, Trade-offs and Classification
Programming languages are commonly classified into broad paradigms based on their operational style:
- Imperative languages, where statements modify program state in a designated order.
- Functional languages, which emphasise pure functions and immutable data.
- Logic languages, based on declarative specification and automated inference.
- Object-oriented languages, organised around encapsulated data structures known as objects.
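The contrast between the first two paradigms can be sketched in Python, which supports both styles; the function names are illustrative:

```python
# Imperative style: compute the sum of squares by mutating state step by step.
def sum_squares_imperative(values):
    total = 0
    for v in values:
        total += v * v   # each statement updates program state
    return total

# Functional style: express the same result as a composition of pure
# functions over an immutable view of the input, with no state updates.
def sum_squares_functional(values):
    return sum(v * v for v in values)

data = [1, 2, 3, 4]
print(sum_squares_imperative(data))  # 30
print(sum_squares_functional(data))  # 30
```

Both functions compute the same value; they differ only in whether the computation is described as a sequence of state changes or as an expression to be evaluated.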
Designing a programming language involves balancing competing considerations. Features such as exception handling improve safety and reduce code complexity but may introduce performance overhead. Similarly, strong type systems can prevent errors but sometimes reduce flexibility. The influence of processor architecture historically favoured imperative and low-level languages, particularly on machines based on the von Neumann model.
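A small Python sketch of the exception-handling trade-off mentioned above: the mechanism separates error recovery from the normal path, at the cost of some run-time machinery. The `parse_ratio` helper is hypothetical, invented for illustration:

```python
def parse_ratio(text):
    """Parse a string like '3/4' into a float, signalling bad input uniformly."""
    numerator, _, denominator = text.partition("/")
    try:
        return int(numerator) / int(denominator)
    except (ValueError, ZeroDivisionError) as exc:
        # The handler centralises recovery: callers see one error type
        # instead of scattering explicit checks through the main logic.
        raise ValueError(f"malformed ratio: {text!r}") from exc

print(parse_ratio("3/4"))  # 0.75
try:
    parse_ratio("3/0")
except ValueError as e:
    print(e)               # malformed ratio: '3/0'
```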
Evolution and Influence
Over several decades, thousands of programming languages have been developed, tailored for specialised domains such as scientific computing, artificial intelligence, systems programming, web development and embedded systems. Many languages fall into multiple paradigms, reflecting hybrid designs that combine imperative, functional and object-oriented features.
The continued diversification of hardware—from multi-core processors to distributed cloud environments—has sustained interest in new languages and execution models. Meanwhile, older languages such as Fortran, C and Lisp have demonstrated remarkable longevity, illustrating the durability of certain core concepts.