In physics, there have been four revolutions: Newton’s classical mechanics, together with his invention of calculus; Maxwell’s unification of electricity and magnetism through his equations; Einstein’s general relativity, which drew on Riemannian geometry; and quantum mechanics, which may yet require new mathematics.
I wonder whether math also has distinct stages/areas/revolutions in its development. What are they?
Terence Tao replied:
I think you would have to consult a historian of mathematics to get a satisfactory answer. My personal feeling is that mathematics is not easily divided up into distinct stages, but there have been fundamental paradigm shifts throughout mathematical history, e.g., Euclid’s introduction of rigorous proof; Descartes’ unification of analysis and geometry; the discovery of non-Euclidean geometries; Cantor’s “paradise” of infinite cardinals in set theory; Klein’s rethinking of geometry through the Erlangen program; Hilbert’s consistency program and its relatives, and its failure (Gödel, Russell, Turing, etc.); the Bourbaki era and its emphasis on abstraction (cf. Grothendieck); the current rise of computer-assisted proofs; etc. One could see e.g. https://en.wikipedia.org/wiki/History_of_mathematics for many more such important turning points in mathematical history.
William Wade replied:
That’s a big question that might be answered in different ways by different people.
Here’s my take on milestones in mathematics:
- Babylonia, 2000 BC or earlier: Invention of Geometric Algebra, i.e., doing algebra without symbols, using rectangles for products, squares for squares, etc. The ghost of this approach survives in our terminology: a quadratic equation was represented as a large square with a small chunk taken out, and “completing the square” meant adding a small square to complete the bigger one. With these methods the Babylonians could solve quadratics and generate Pythagorean triples.
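The geometric recipe corresponds to a familiar algebraic identity; as a modern illustration (in our symbolic notation, not the Babylonians’ rhetorical and geometric one):

```latex
x^2 + bx = \left(x + \frac{b}{2}\right)^2 - \frac{b^2}{4},
\qquad\text{so, e.g.,}\qquad
x^2 + 6x = 7 \;\Longrightarrow\; (x+3)^2 = 16 \;\Longrightarrow\; x = 1 \text{ or } x = -7.
```

(Working with lengths and areas, the Babylonians would keep only the positive root.)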
- Greece, 300 BC: Invention of Axiomatic Geometry. All early cultures did geometry, but it was the Greeks who began to identify axioms, basic assumptions to base their geometry on.
- Islamic world, 800-1200 AD: Decimal expansions. Decimal integers were introduced in India as early as 200 BC (around the time of Ashoka), but decimal fractions were first used in the Middle East. The resulting decimal expansions provided a unified theory of numbers that is still used for classification purposes, e.g., an irrational number is one whose decimal expansion never repeats (proved by the English mathematician Wallis in the 1680s).
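The rational side of that classification can be seen with a short long-division sketch (a modern illustration, not a historical method): a fraction p/q has at most q possible remainders, so its remainders, and hence its digits, must eventually cycle.

```python
def decimal_digits(p, q, n):
    """First n digits after the decimal point of p/q (0 <= p < q), by long division."""
    digits, r = [], p
    for _ in range(n):
        r *= 10
        digits.append(r // q)  # next decimal digit
        r %= q                 # remainder carried to the next step
    return digits

def repetend_length(p, q):
    """Length of the repeating block of p/q (0 <= p < q): run the long division
    until a remainder recurs (repeating expansion) or hits 0 (terminating)."""
    seen, r, step = {}, p, 0
    while r != 0 and r not in seen:
        seen[r] = step
        r = (r * 10) % q
        step += 1
    return 0 if r == 0 else step - seen[r]

print(decimal_digits(1, 7, 12))  # 1/7 = 0.142857142857...
print(repetend_length(1, 7))     # the block 142857 has length 6
print(repetend_length(1, 4))     # 1/4 = 0.25 terminates, so length 0
```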
- Vieta, 1591: Symbolic Algebra. One by one, during the hundred years before François Viète published his influential book, symbols were introduced that streamlined the geometric algebra of the Babylonians and the rhetorical algebra of the Indians, symbols we still use today. Calculus and many later developments could not have taken place without this step.
- Newton and Leibniz, 1680s: Calculus. Using decimal expansions of numbers as a model, Newton guessed that every function had a power series expansion. Several earlier mathematicians had discovered the power series expansions of arctan x, ln(1+x), (1+x^2)^-1, sin x, and cos x. Newton fleshed this out by discovering the binomial series and emphasizing the series approach; he discovered derivatives using series. We still teach some of this material under the heading of Taylor series. Leibniz, on the other hand, emphasized the formula approach to calculus; hence the product and chain rules are his. This is the way we introduce derivatives today.
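As an illustration of the series approach (standard modern statements, not Newton’s notation): the binomial series extends the binomial theorem to arbitrary exponents, and the arctangent series follows from the geometric series for (1+x^2)^-1 by integrating term by term.

```latex
(1+x)^{\alpha} = \sum_{k=0}^{\infty} \binom{\alpha}{k} x^k
= 1 + \alpha x + \frac{\alpha(\alpha-1)}{2!}\,x^2 + \cdots, \qquad |x| < 1,
```
```latex
\frac{1}{1+x^2} = 1 - x^2 + x^4 - \cdots
\;\Longrightarrow\;
\arctan x = x - \frac{x^3}{3} + \frac{x^5}{5} - \cdots, \qquad |x| \le 1.
```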
- Lobachevsky and Bolyai, 1830s: Non-Euclidean Geometry. The realization that Euclidean geometry was only one of several viable geometries led mathematicians to rethink almost all of mathematics with an eye toward greater generality and more axiomatic development. This resulted in axioms for algebra (by Dedekind in the 1870s, generating rings, fields, etc.), analysis (by Weierstrass and his students in the 1880s, especially the completeness axiom), and probability (by Kolmogorov in the 1930s).
- Weierstrass, 1880s: Rigor in analysis through epsilons and deltas. This was a tremendous break with the past. Newton, Leibniz, Cauchy, and even Euler believed in infinitesimals, i.e., that the number line was made up of individual molecules like a string of beads, not the uncountable continuum of points that we use today to generate strange sets like the Cantor set.
- Cantor, 1890s: Set Theory. The discovery that there is more than one infinity still affects research today; we continue to classify sets as countable or uncountable.