APL stands for A Programming Language. However, originally APL was not invented as a programming language at all, but as a method to describe algorithms.

IBM used the notation in the early sixties to describe their mainframe hardware architecture. That worked out well. So well that implementing the notation as a programming language more or less recommended itself.

Not only is APL different from other programming languages, it is completely different.

That is probably the reason why people, when they are introduced to APL, either love it or hate it.

APL is an interpreter

Because APL is interpreted, one can work interactively.

One enters a statement, and as soon as the <enter> key is pressed, APL calculates the result and prints it to what is called the session manager. If one makes a mistake an error message is printed instead.

That ability to respond immediately is probably the major reason why APL is used by a number of statisticians for data analysis purposes.

If one makes a mistake or wants to manipulate the data in some way one can do this instantly.

This is also a reason why APL programmers don’t break into a sweat when they hear the word “debugging”: they do not spend 90% of their time on it.

APL manages the memory itself

No memory allocation is required, and no deallocation either, so neither can be forgotten.

There is no need to program any garbage collection routines – that’s all done by the interpreter.

For a long time this concept was criticised, but when .NET emerged it became mainstream.

APL manages pointers itself

APL does offer pointers, but it manages them itself. There are no buffer overflows or other dangers posed by C-style pointers.

One of the major problems of C simply does not exist in APL.

APL manages data types

APL is not type-oriented; it is left to the interpreter to take care of that.

Adding an integer to a float? No problem: APL performs any necessary conversion itself.

Purists are horrified, practitioners are delighted.

APL does not have key words or commands

The built-in primitives are represented by symbols: mostly mathematical ones, some invented, and a few taken from the Greek alphabet.

In APL speech we talk about operators and functions: while functions process data, operators take functions as operands in order to define a derived function, which can then be used to process data.

Because APL functions as well as operators are extremely powerful, this way of representing operations results in extremely terse programs that can achieve a lot.

APL is a non-scalar programming language

That’s why APL is sometimes called an “Array processing language”, and rightly so.

Ordinary programming languages like C (with or without ++), Pascal or Basic are scalar languages by nature. They can process just one piece of data at a time, although in recent years we have seen add-ons.

If you need the total of a vector with, say, ten numbers in it, you have to write a loop and add all the values together.
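Spelled out in a scalar style, that looks like the following minimal Python sketch (standing in here for C, Pascal or Basic; the vector and its values are made up for illustration):

```python
# A scalar-style loop: visit every item explicitly and
# accumulate the running total by hand.
a = [4, 7, 1, 9, 3, 8, 2, 6, 5, 10]

total = 0
for item in a:      # one item at a time
    total += item

print(total)        # 55
```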

In APL one takes a different approach because APL can process arrays in one go, no matter whether they are vectors, matrices or high-dimensional arrays.

How does this work? Well, let’s look at an example. Let’s assume that A is a vector of numbers. To get the total we of course need the function +.

In order to process all numbers in the vector A, we need to introduce a concept which does not exist elsewhere: in APL there are operators. Operators take functions as operands. This means that an APL operator has nothing to do with operators in other programming languages or in mathematics.

To calculate the total of A we need the operator “reduce”, which is represented by the slash symbol: /

We process A with this:

      +/A
In APL speech + is an operand of /. This defines a derived function “totalise” which then processes the data.

In effect it is as if / put a + between all the items of A, as in:

A[1] + A[2] + A[3] + … + A[n]

Therefore we don’t need a loop, and because we don’t need a loop we don’t even need to know the number of items in A.

Whether A has just one element or thousands or none at all, we always get the total.
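For comparison, Python’s built-in sum behaves much like the derived function described above: no loop, no item count, and an empty vector yields 0, just as a plus-reduction of an empty numeric vector does in APL. A small sketch (the values are made up):

```python
a = [4, 7, 1, 9, 3]

# sum() plays the role of the derived function "totalise":
# it works for any number of items without an explicit loop.
print(sum(a))     # 24  -- many items
print(sum([5]))   # 5   -- a single item
print(sum([]))    # 0   -- no items at all
```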

If we are interested in the product of all the numbers in A rather than the total, we replace “plus” by “times”:

      ×/A
Imagine A is a matrix with 34 rows, representing branches, and 12 columns, representing months. If we want to compute the total turnover we can say:

      +/,A
In APL the comma is a function. Fed with a left and a right argument it is actually “concatenate”.

Without a left argument it converts its right argument to a vector, no matter what it is. And +/ then computes the total of that vector.
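The ravel-then-total idiom can be mimicked in plain Python: itertools.chain flattens the nested rows into one sequence, and sum then plays the part of the plus-reduction. A sketch with a small made-up 3-by-4 table (in place of the 34-by-12 one):

```python
from itertools import chain

# A 3-row, 4-column table as a list of lists, standing in for
# a small branches-by-months turnover matrix (made-up numbers).
a = [[1,  2,  3,  4],
     [5,  6,  7,  8],
     [9, 10, 11, 12]]

# chain.from_iterable flattens the rows into one vector,
# playing the part of APL's monadic comma (ravel) ...
flat = list(chain.from_iterable(a))

# ... and sum then computes the grand total of that vector.
print(sum(flat))   # 78
```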

But what if we are interested in how much turnover was generated by every single branch?

In that case we have to specify an axis, so that APL knows whether it should sum up all the branches or all the months. The first axis specifies the branches:

      +/[1]A
The second axis specifies the months:

      +/[2]A
Even these simple examples should already give you an idea of why APL is so famous (some would say: infamous) for its terseness and power.

An average project can indeed be done significantly faster in APL than, for example, in C; depending on the type of application, we are talking about a factor of between five and fifteen compared with C++. A fact that should make one think if one is in charge of an IT budget, but these guys seem to be busy travelling to Ukraine, India or China. (Update 2014: that seems to be changing now. Jobs in IT are coming back.)

During its most difficult period between 1980 and 1990, APL only survived because of this enormous advantage.

Why has this been such a difficult time, and what has changed since?

Because APL is an interpreted language it needs a fast CPU. On a PC APL was not really of any use at all before the introduction of the 80486 processor.

APL also needs plenty of memory. Due to its ability to process arrays in one go, all the data must be held in memory.

On a mainframe system with the MVS operating system and 16 MB of memory or on a Z80 PC with 64 KB memory this was a show-stopper.

In the 1980s IBM was the leading APL vendor. However, since the outdated VSAPL was replaced by APL2 in the eighties, IBM has not managed to add a single major feature to the language.

In the 1990s APL2000 and Dyalog took over as the leading vendors. Both interpreters have improved ever since, although Dyalog is clearly the current leader.

Since the early 1990s APL on a mainframe has been in decline while usage on PCs has been increasing.