The uniformity conjecture postulates a relationship between the syntactic length of expressions built up from the natural numbers using field operations, exponentials, and logarithms, and how small a nonzero complex number defined by such an expression can be. The uniformity conjecture claims that if the expressions are written in an expanded form in which all the arguments of the exponential function have absolute value bounded by 1, then a small multiple of the syntactic length gives a bound for the number of decimal places needed to distinguish the defined number from zero. Richardson has systematically searched for counterexamples but has not found any.
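The conjectured bound suggests a simple zero-testing procedure: evaluate the expression to a number of decimal places proportional to its syntactic length, and declare it zero if the result falls below the corresponding threshold. The sketch below illustrates this idea using Python's standard `decimal` module; the multiplier `c`, the guard-digit count, and the example syntactic lengths are illustrative assumptions, not values taken from the conjecture itself.

```python
from decimal import Decimal, getcontext

def conjectural_zero_test(value_fn, length, c=4):
    """Hypothetical zero test in the spirit of the uniformity conjecture.

    Evaluates the expression to c * length decimal places (plus guard
    digits) and declares it nonzero only if its absolute value exceeds
    10 ** -(c * length). The multiplier c is an assumed stand-in for the
    conjecture's "small multiple of the syntactic length".
    """
    digits = c * length
    getcontext().prec = digits + 10  # guard digits against rounding error
    value = value_fn()
    return abs(value) > Decimal(10) ** (-digits)

# An identically zero expression: exp(log(2)) - 2.
expr_zero = lambda: Decimal(2).ln().exp() - 2

# A nonzero but small expression: e minus a truncation of its decimal
# expansion, roughly 4.6e-10.
expr_tiny = lambda: Decimal(1).exp() - Decimal("2.718281828")
```

With an assumed syntactic length of 10 for both examples, the first is declared zero and the second nonzero: `conjectural_zero_test(expr_zero, 10)` is `False`, while `conjectural_zero_test(expr_tiny, 10)` is `True`. The point of the conjecture is that, for expressions in the expanded form it requires, a precision linear in the expression's length would always suffice for such a test to be correct.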