In a manuscript entitled "A note on normal numbers", presumably written in
1938, Alan Turing gave an algorithm that produces real numbers normal to
every integer base. This established, for the first time, the existence of
computable normal numbers, and it remains the best solution to date to Borel's
problem of giving explicit examples of normality.
Furthermore, with this work Turing pioneered
the theory of randomness and showed an insight ahead of his time:
that traditional mathematical concepts, such as measure and continuity, could be made
computational. In this talk I will highlight the ideas behind these achievements
of Turing, which are largely unknown because his manuscript remained
unpublished until it appeared in his Collected Works in 1992.
