## Saturday, 31 January 2009

### Permutations

This last week I needed to generate all the possible permutations of an array. This seems like a fairly easy task, but many programmers won't come up with a simple solution to it very fast. There is at least one intuitive, recursive solution to this problem, several more complex but *slightly* more efficient iterative solutions, and no really time-efficient solution: after all, the problem is to generate all n! orderings of an array, so we have to visit each one at least once.

#### The well-known solution

The most widespread solution to this problem is Dijkstra's algorithm for generating the next permutation. It assumes that there is a total order between the elements of the array and that the elements are unique (there are no repetitions). If these conditions are met, the algorithm always generates the next permutation in lexicographical order.

The algorithm is very straightforward and goes as follows:
```python
def get_next(a):
    N = len(a)
    # find the rightmost i such that a[i-1] < a[i]
    i = N - 1
    while a[i-1] >= a[i]:
        i -= 1
    # find the rightmost element greater than a[i-1]
    j = N
    while a[j-1] <= a[i-1]:
        j -= 1
    # swap them, then reverse the suffix
    a[i-1], a[j-1] = a[j-1], a[i-1]
    i += 1
    j = N
    while i < j:
        a[i-1], a[j-1] = a[j-1], a[i-1]
        i += 1
        j -= 1
    return a
```
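To see the lexicographical order in action, here is a small self-contained check. The function is a condensed restatement of `get_next` (the suffix reversal is written as a slice so the snippet stands on its own); the behavior is the same:

```python
def get_next(a):
    # successor in lexicographic order (Dijkstra's method)
    N = len(a)
    i = N - 1
    while a[i-1] >= a[i]:          # find the rightmost ascent
        i -= 1
    j = N
    while a[j-1] <= a[i-1]:        # find the swap partner
        j -= 1
    a[i-1], a[j-1] = a[j-1], a[i-1]
    a[i:] = a[i:][::-1]            # reverse the suffix
    return a

a = [1, 2, 3]
seen = [list(a)]
for _ in range(5):                 # 3! - 1 = 5 successor steps
    seen.append(list(get_next(a)))
print(seen)
# → [[1, 2, 3], [1, 3, 2], [2, 1, 3], [2, 3, 1], [3, 1, 2], [3, 2, 1]]
```

Note that the output comes out already sorted, which is exactly the lexicographical-order guarantee.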

To use this algorithm to generate every possible permutation, we start from the sorted array and call it n! − 1 times:
```python
def dijkstra(a):
    import operator
    fact = reduce(operator.mul, range(1, len(a)+1))
    the_list = [copy(a)]           # copy comes from the copy module
    for i in range(fact - 1):
        a = get_next(a)
        the_list.append(copy(a))
    return the_list
```

The only possibly uncommon thing here is line 3, which is a functional-style factorial (`reduce` is a builtin in Python 2; in Python 3 it lives in `functools`).

This works very well for generating the permutations of n (the permutations of the array [0..n-1]), but it won't work for arrays of arbitrary elements (unordered or repeated ones). So we explore other options.

#### The known recursive solution

There's a (not-so-well-)known recursive solution to the permutation problem: the HeapPermute algorithm, or Heap's algorithm, for permuting an array.
My Python implementation of the algorithm goes as follows:
```python
from copy import copy

def heap_aux(a, n, the_list):
    if n == 1:
        the_list.append(copy(a))
    else:
        for i in range(n):
            heap_aux(a, n-1, the_list)
            if n % 2 == 1:
                a[0], a[n-1] = a[n-1], a[0]
            else:
                a[i], a[n-1] = a[n-1], a[i]

def heap(a):
    the_list = []
    heap_aux(a, len(a), the_list)
    return the_list
```

This algorithm works very well and is very efficient too, but it generates the permutations in an order that I find particularly disturbing.
Also, it is not as easy to understand as I would like.
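To illustrate, here is a self-contained Python 3 sketch of the same Heap's-algorithm logic, run on [1, 2, 3] to show the order it produces:

```python
def heap_permutations(a):
    """Heap's algorithm; collects every permutation of list a."""
    out = []
    def aux(n):
        if n == 1:
            out.append(a.copy())
        else:
            for i in range(n):
                aux(n - 1)
                if n % 2 == 1:
                    a[0], a[n-1] = a[n-1], a[0]   # n odd: swap first and last
                else:
                    a[i], a[n-1] = a[n-1], a[i]   # n even: swap i-th and last
    aux(len(a))
    return out

print(heap_permutations([1, 2, 3]))
# → [[1, 2, 3], [2, 1, 3], [3, 1, 2], [1, 3, 2], [2, 3, 1], [3, 2, 1]]
```

The jump from [2, 1, 3] straight to [3, 1, 2] is typical of this order: all n! permutations appear exactly once, but following the sequence by eye is hard.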

#### Enter my recursive solution

This is probably the most straightforward solution I could come up with. It is a classical recursive design.
Want to permute an array of n elements? How would you do it if you already had a function to solve the problem for a smaller array?
Simple. Select each possible element of the array to be "the first element of the permutation" and then permute the rest of the array:
A permutation of 1,2,3,4 would be:
1 + permutation_of(2,3,4)
2 + permutation_of(1,3,4)
3 + permutation_of(2,1,4)
4 + permutation_of(2,3,1)

Simple, right?

Let's code it:

```python
for j in range(i, len(a)):
    a[i], a[j] = a[j], a[i]      # pick a[j] as the element at position i
    permutation(a, i+1, the_list)
    a[i], a[j] = a[j], a[i]      # put it back for the next iteration
```

This loop captures the essence of the procedure described above. We have to remember to put the elements back where they were after the recursive call to allow the next iteration to work. Not doing so would result in swapping the wrong element in the next iteration.
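To see why the restoring swap matters, here is a deliberately broken variant (a Python 3 sketch for illustration, not code from the post) with the second swap removed. It still makes n! recursive calls, but produces duplicates and misses permutations:

```python
def broken_perms(a, i, out):
    # same recursion as above, but WITHOUT the restoring swap
    if i == len(a):
        out.append(a.copy())
    else:
        for j in range(i, len(a)):
            a[i], a[j] = a[j], a[i]
            broken_perms(a, i + 1, out)
            # missing: a[i], a[j] = a[j], a[i]

out = []
broken_perms([1, 2, 3], 0, out)
print(out)
# → 6 results but only 4 distinct; [2, 1, 3] and [2, 3, 1] never appear
```

Because the array is left scrambled after each recursive call, later iterations start from the wrong arrangement and revisit orderings already produced.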

Putting it all together (and adding a small optimization):

```python
from copy import copy

def perm_aux(a, i, the_list):
    if i == len(a):
        the_list.append(copy(a))
    else:
        # small optimization: the j == i case needs no swap
        perm_aux(a, i+1, the_list)
        for j in range(i+1, len(a)):
            a[i], a[j] = a[j], a[i]
            perm_aux(a, i+1, the_list)
            a[i], a[j] = a[j], a[i]

def permutate(a):
    the_list = []
    perm_aux(a, 0, the_list)
    return the_list
```

This algorithm generates the list in a nicer order (though it's not lexicographical). And, like its predecessor, it is quite efficient (taking into account the complexity of the problem).
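The "nicer" structure is easy to see on a small input. Here is a Python 3 rendering of the same algorithm (repeated so the snippet runs on its own):

```python
from copy import copy

def perm_aux(a, i, out):
    if i == len(a):
        out.append(copy(a))
    else:
        perm_aux(a, i + 1, out)          # keep a[i] in place first
        for j in range(i + 1, len(a)):
            a[i], a[j] = a[j], a[i]
            perm_aux(a, i + 1, out)
            a[i], a[j] = a[j], a[i]

def permutate(a):
    out = []
    perm_aux(a, 0, out)
    return out

print(permutate([1, 2, 3]))
# → [[1, 2, 3], [1, 3, 2], [2, 1, 3], [2, 3, 1], [3, 2, 1], [3, 1, 2]]
```

Notice how the output comes in blocks: the first element stays fixed while the rest of the array is permuted, changing only every (n−1)! entries.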

#### Performance

Normally recursive algorithms are put down for performance reasons: they keep all local variables stored on the stack while the recursive calls are made, and they incur function-call overhead. The good thing about these two algorithms is that the recursion depth is not proportional to the size of the problem's output but to the size of the array (actually it is exactly the array's length). Also, if the list variables are passed by reference (this is programming-language specific), only the reference needs to be stored on the stack and there is no overhead of copying the variables on every call.
They will probably never be as efficient as their iterative counterparts, but for me readability (with small performance penalties) definitely pays off.

To measure the times I used the following Python decorator:

```python
import time

def timing(func):
    def inside(*args):
        t1 = time.time()
        res = func(*args)
        t2 = time.time()
        print '%s took %0.3f ms' % (func.func_name, (t2-t1)*1000.0)
        return res
    return inside
```

Given the complexity of the problem, the graph grows rapidly out of scale, so a graph in logarithmic scale is in order.

As you can see, the algorithms behave very similarly. The largest difference (for an array of size 9) between the fastest (heap) and the slowest (permutate) is 0.6 seconds. This of course will grow with the array size, but in a linear instead of exponential manner.
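Using the decorator is just a matter of wrapping the function under test. Here is a Python 3 variant of the same decorator (using `perf_counter` and `__name__`, since `func_name` and the bare `print` statement are Python 2 only), applied to a hypothetical stand-in workload:

```python
import math
import time

def timing(func):
    def inside(*args):
        t1 = time.perf_counter()
        res = func(*args)
        t2 = time.perf_counter()
        print('%s took %0.3f ms' % (func.__name__, (t2 - t1) * 1000.0))
        return res
    return inside

@timing
def count_orderings(n):
    # stand-in workload: how many orderings an n-element array has
    return math.factorial(n)

count_orderings(9)   # prints a line like: count_orderings took 0.004 ms
```

The decorated function behaves exactly like the original (same arguments, same return value); the timing line is just a side effect.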

Have another solution to this problem? comment on! =)

#### 8 comments:

1. A niggle, but the phrase "the exponential nature of the problem" is technically incorrect - n! is not Θ(1)^n

2. Since you are using python:

#!/usr/bin/env python2.6

from itertools import permutations

for permutation in permutations([1, 2, 3, 4]):
    print(permutation)

3. Took 0m0.433s to generate permutations of nine elements and 0m2.351s to generate and print.

(MBP 2.16 python 2.6)

4. Default, it's unfair to compare Python's implementation to these since it's written in C (and uses "yield" which saves some time and memory).

The algorithm itself can return permutations of sizes smaller than the length of the array.

Besides that, it is similar to the third recursive algorithm, with a nice approach to the permutation on the smaller array. (Rotating them before replacing elements)

It also uses a nice trick that can be used to overcome the "total order, unique elements" constraint of Dijkstra's algorithm.
range(len(a)) is permuted and the indices are used to "order" the elements in a:

for indices in permute(a):
print [a[i] for i in indices]

Due to that trick, though, lists with repeated elements will produce duplicate permutations.

The code is here:
http://svn.python.org/view/python/trunk/Modules/itertoolsmodule.c?view=markup

5. salty-horse I know ;) I keep a copy of python trunk updated in my desktop.

(my test program called list over permutation to generate the list)

6. @Default:
your computer is probably faster than mine.

I ran python's implementation on my machine and got 0m3.591s for generating and printing, which compares to 0m5.615s for heap (the fastest) and 0m6.233s for permutate (the slowest), considering those are written in pure python. For the generation only, python's implementation ran in 0m0.492s while heap ran in 0m2.192s and permutate in 0m2.515s. Clearly most of the time in python's run is spent in the printing.

A fairer comparison would be against the pure-python implementation of python's permutation algorithm: on my machine it took 0m8.867s to generate and print, and 0m5.088s just to generate. Still, python's is a superior algorithm because it can permutate arrays of arbitrary elements without swapping them around.

7. @Anonymous:
Thanks for the comment! I fixed it.

8. I wrote a series of related posts last year that you may find interesting.

http://antimatroid.wordpress.com/2008/06/29/one-tree-many-possibilities/

http://antimatroid.wordpress.com/2008/07/06/one-tree-many-possibilities-part-2/