
Inaugural Lectures

Algorithms and Science

Bernard Chazelle

Editors' note

Excerpts from the Inaugural Lecture delivered on 18 October 2012

Source: La lettre, no. 35, December 2012

Full text

The following anecdote, perhaps apocryphal, is told about the great Danish physicist Niels Bohr:
– Professor Bohr, I see you have a horseshoe hanging on the wall. Don’t tell me you believe in this kind of thing!
– Don’t worry, I don’t believe in it at all, but I was told that it works even if you don’t believe in it.

So it goes for the algorithmic revolution. Beyond the scepticism or infatuation of the day regarding the latest IT novelty hides one of those paradigm shifts dear to Thomas Kuhn. The algorithm is a subversive conceptual tool, which affords the possibility of looking at science and technology from a new perspective extending beyond its practical applications. These lectures will explain the constituent elements of this intellectual revolution now underway.


The algorithm owes its name to Abū ‘Abdallāh Muhammad ibn Mūsā al-Khwārizmī, who spearheaded the Abbasid renaissance in Baghdad in the ninth century: a pedigree filled with revolutionary zeal. You are hardly likely to find this fervour in a passerby asked to define the word algorithm. An algorithm, they will say, is a formula painfully learnt in school, which allows you to multiply two numbers by aligning the digits in rows. Everyone knows how to add, subtract and multiply, which brings the number of universally known algorithms to three – or four, if we count the survivors of the digital era who still know how to divide. An algorithm is a sequence of instructions to follow in order to arrive at the desired result through a series of simple and boring steps. Don’t let your calculator fool you. It has no more knowledge of arithmetic than a horse at the Longchamp racecourse has of the PMU [1]. Its algorithms read, write and delete without understanding a thing.
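To make this concrete, here is a minimal sketch (added for illustration, not part of the lecture) of the most familiar of those school algorithms: adding two numbers column by column with a carry, each step simple, mechanical and entirely free of understanding.

```python
def school_add(a: str, b: str) -> str:
    """Add two non-negative decimal numbers given as digit strings,
    working right to left with a carry, exactly as taught in school."""
    width = max(len(a), len(b))
    a, b = a.zfill(width), b.zfill(width)   # align the digits in columns
    carry, digits = 0, []
    for x, y in zip(reversed(a), reversed(b)):
        carry, d = divmod(int(x) + int(y) + carry, 10)
        digits.append(str(d))
    if carry:
        digits.append(str(carry))
    return "".join(reversed(digits))

print(school_add("479", "86"))   # prints 565
```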

An algorithm is defined independently of the size of the data. Multiplying 10-digit or 10-billion-digit numbers follows not only the same principle but the same word-for-word instructions. Only the execution time differs. The multiplication of two 10-digit numbers produces a grid with 10 rows of 10 or 11 digits each, and requires a number of basic steps at most proportional to 10 × 11. In general, multiplying two numbers of n digits each requires a number of steps at most proportional to n(n + 1) ≈ n²: we then say that the complexity of the algorithm is on the order of n². It matters little whether the number of steps equals n², 3n², or 17n² + n: all that counts is the order of magnitude, n². These constant factors are ignored not out of laziness but out of a sharp sense of priority. They often reflect implementation details extrinsic to the algorithm in question (such as the number base used for the multiplication). Another reason to set them aside is to give credit where credit is due. Does your calculator multiply so quickly because of an outstanding algorithm or because of a turbocharged processor? Generally speaking, constant factors are owed to computing power, and orders of magnitude (such as n²) to the complexity of the algorithm.
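To illustrate the counting argument above, the sketch below (an illustrative example, not code from the lecture) runs the schoolbook method on digit strings and tallies the elementary digit steps; for two n-digit inputs the tally grows roughly as n², whatever constant factor the machine hides.

```python
def school_multiply(a: str, b: str):
    """Schoolbook multiplication of two decimal digit strings.
    Returns the product and a count of elementary digit steps,
    which grows roughly as n * n for two n-digit inputs."""
    n, m = len(a), len(b)
    result = [0] * (n + m)              # enough cells for the full product
    steps = 0
    for i, x in enumerate(reversed(a)):
        carry = 0
        for j, y in enumerate(reversed(b)):
            total = result[i + j] + int(x) * int(y) + carry
            carry, result[i + j] = divmod(total, 10)
            steps += 1                  # one cell of the multiplication grid
        result[i + m] += carry
    product = "".join(map(str, reversed(result))).lstrip("0") or "0"
    return product, steps

# Two 10-digit numbers take about 10 x 10 elementary steps.
print(school_multiply("1234567890", "9876543210"))
# ('12193263111263526900', 100)
```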


School multiplication has a complexity on the order of n². Can we do better? The answer, which is yes, may come as a surprise. Schönhage and Strassen’s algorithm achieves a complexity of an order only slightly higher than n (to be precise, on the order of n log n log log n). In practice it is useful for the factorisation of large integers. The method, recently improved by Martin Fürer [2], is based on an old algorithm, the Fast Fourier Transform (FFT), a cornerstone of the signal processing found in all of your electronic gadgets. Strange as it may seem, the same algorithm that lets you listen to music and interpret your fMRI scan will help you multiply large numbers very fast. An algorithm owes its greatness to its versatility, and Gauss’s FFT, rediscovered by Cooley and Tukey in the 1960s, is one of the greatest. In collaboration with Nir Ailon, I have shown how to randomize the FFT in order to exploit the Uncertainty Principle, a central concept in quantum mechanics, and to search for neighbours in very high dimensions [3].
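The link between the FFT and fast multiplication can be sketched as follows (an illustrative toy added here, not the Schönhage-Strassen or Fürer algorithm itself, which works over exact modular arithmetic rather than floating point): treat the digits as coefficients of two polynomials, use the FFT to turn their convolution into a pointwise product, then propagate the carries.

```python
import numpy as np

def fft_multiply(a: str, b: str) -> str:
    """Multiply two decimal digit strings via an FFT-based convolution."""
    x = np.array([int(d) for d in reversed(a)], dtype=float)
    y = np.array([int(d) for d in reversed(b)], dtype=float)
    size = len(x) + len(y)                       # room for the full product
    fx, fy = np.fft.rfft(x, size), np.fft.rfft(y, size)
    coeffs = np.rint(np.fft.irfft(fx * fy, size)).astype(int)
    carry, digits = 0, []
    for c in coeffs:                             # propagate carries in base 10
        carry, d = divmod(int(c) + carry, 10)
        digits.append(str(d))
    return "".join(reversed(digits)).lstrip("0") or "0"

print(fft_multiply("1234567890", "9876543210"))  # 12193263111263526900
```

Roughly speaking, the two FFTs and the pointwise product account for the n log n flavour of the running time; the exact arithmetic needed to avoid rounding errors is what the Schönhage-Strassen and Fürer constructions handle with care.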

The rest of this lecture could inventory the algorithms which, perhaps without you realising it, have changed your life. But I have a greater ambition. It is to convince you that the algorithm is not so much a useful object as a different way of thinking.


Notes

[1] French horse-betting agency.

[2] Fürer, M., “Faster Integer Multiplication”, SIAM J. Comput. 39 (2009), 979–1005.

[3] Ailon, N., Chazelle, B., “The Fast Johnson-Lindenstrauss Transform and Approximate Nearest Neighbors”, SIAM J. Comput. 39 (2009), 302–322.


To cite this article

Print reference

Bernard Chazelle, « Algorithms and Science », La lettre du Collège de France, 7 | 2015, 32.

Electronic reference

Bernard Chazelle, « Algorithms and Science », La lettre du Collège de France [Online], 7 | 2015, published online on 2 November 2015, accessed on 23 May 2025. URL: http://journals.openedition.org/lettre-cdf/2671 ; DOI: https://doi.org/10.4000/lettre-cdf.2671


Author

Bernard Chazelle

Eugene Higgins Chair of Computer Science, Princeton University


Copyright

The text and other elements (illustrations, imported annex files) are “All rights reserved”, unless otherwise stated.
