Our deep conviction of the ontological reality and epistemological validity of visual-sensory experience underlies our heavy reliance on geometric intuition in mathematical thinking. As mathematics grew more sophisticated, however, mathematicians began to notice that "visualization" was inadequate as a method of rigorous proof, and that arithmetical analysis often yielded conclusions that conflicted with what seemed geometrically obvious "truths." Should we believe, then, that our world is primarily a world of appearances, yielding its essentially illusory nature to the assault of arithmetical logic?
A major shift in mathematical thought, one which led to a growing realization of the limitations of geometric intuition, began with the introduction of the "method of fluxions" by Isaac Newton (1642-1727). Newton's method was built on the idea of "flowing," or variable, quantities and their rates of flow. At its core lay the fundamental concept of the limit, a rather fuzzy concept which would keep generations of able mathematicians busy; for even though Newton's prototype method worked for the simple types of curves to which he applied it, more advanced methods and theories had to be developed over time as mathematicians encountered more complex problems.
Newton's method was applied to finding the rate of change of a variable at a point x. Since the rate of change at the single point x cannot be computed directly (it would amount to dividing zero by zero), Newton considered a nearby point x+h and the average rate of change over the interval from x to x+h. Then, as x+h converged on the point x, the estimate of the rate of change at x became more accurate, and one could, in theory, derive estimates of the rate of change at x as accurate as one wished by letting x+h converge on x as closely as possible. The idea of the limit, fundamental to the method of fluxions (which later came to be called "calculus"), therefore involved the notion of an interval of width h > 0 shrinking toward a point (h = 0).
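Newton's limiting procedure can be illustrated numerically. The short Python sketch below is purely illustrative (the curve f(x) = x² and the step sizes are our own choices, not Newton's): it computes average rates of change of f at x = 3 over shrinking intervals and watches them approach the instantaneous rate of change, which is 6.

```python
# Average rate of change of f over the interval [x, x+h].
def average_rate(f, x, h):
    return (f(x + h) - f(x)) / h

f = lambda x: x * x  # an illustrative curve, f(x) = x^2

# As h shrinks, the average rate over [3, 3+h] converges on the
# instantaneous rate of change at x = 3, which is 6.
for h in [1.0, 0.1, 0.01, 0.001]:
    print(h, average_rate(f, 3.0, h))  # 7.0, then roughly 6.1, 6.01, 6.001
```

Each tenfold shrinkage of the interval brings the estimate roughly ten times closer to 6, which is exactly the convergence the limit concept captures.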
The German mathematician Gottfried Leibniz (1646-1716) appears to have discovered calculus independently of Newton, and an unpleasant priority dispute developed over who discovered the method first.
Following Newton's discovery, other mathematicians applied themselves to developing his ideas and extending them to more complex problems. The pioneers in this field after Newton were the Bernoulli brothers, James (1654-1705) and John (1667-1748), and their contemporary L'Hospital (1661-1704).
It was John Bernoulli who posed, as a challenge to his contemporaries, the famous brachistochrone problem: to find the curve between two points along which a ball rolls, under gravity, in the least time. (The solution turned out to be an arc of a cycloid, not the straight line that intuition might suggest.)
Leonhard Euler (1707-1783), a Swiss, has the reputation of being the most prolific writer of all mathematicians, and his work in calculus laid a foundation for further developments. Euler's work is considered a watershed in the evolution of the concept of the function.
By the mid-1800s, however, the Bernoulli and Euler ideas of functions had become inadequate for dealing with the newer problems that had arisen. Joseph Fourier (1768-1830) developed advanced new methods in calculus based on a more sophisticated concept of the function, and it was left to Augustin-Louis Cauchy (1789-1857), Bernard Bolzano (1781-1848), Niels Abel (1802-1829) and Peter Dirichlet (1805-1859) to further clarify the concepts of function, integral, continuity, limit and convergence, all of which were the basis of the mathematical method of calculus.
Bolzano may be considered to have pioneered, in modern times, the critical revolution in mathematical thinking from the age-old naive reliance on geometric intuition to arithmetical definition in the conceptualization of limits and continuity of magnitudes. Fourier's work, in particular, had so complicated the concepts of calculus that visual intuition could no longer be relied upon.
It should be noticed that the basic notions in calculus were never quite clear to either Leibniz or Newton, even though each was able to apply the prototype methods he had developed successfully to the particular problems he encountered. By the mid-1800s, it was taken for granted that a continuous function could have a finite or even an infinite number of "corners," but that between these "corners" the function was smooth. This view seemed obvious to mathematicians from the geometric analogy they employed. But Karl Weierstrass (1815-1897) showed conclusively that what seemed obvious from geometric intuition, that a continuous function has a derivative at nearly every point, was wrong. He presented his colleagues with an example of a function which is continuous but nowhere differentiable: that is, a function which is all corners, with no smooth regions between them!
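Weierstrass's example can be written as W(x) = Σ aⁿ cos(bⁿπx) with 0 < a < 1 and ab > 1 + 3π/2. A rough numerical sketch (our own illustration, using a finite partial sum and the sample parameters a = 1/2, b = 13, not Weierstrass's original presentation) shows what "all corners" means: the average rates of change at a point do not settle down as the interval shrinks; they blow up.

```python
import math

# Partial sum of the Weierstrass function W(x) = sum a^n cos(b^n pi x),
# with a = 1/2, b = 13, so that a*b = 6.5 > 1 + 3*pi/2.
A, B, TERMS = 0.5, 13, 12

def W(x):
    return sum(A**n * math.cos(B**n * math.pi * x) for n in range(TERMS))

# For a smooth curve these average rates would converge to a limit;
# here their magnitudes grow roughly like (A*B)^m = 6.5^m as h shrinks.
for m in range(1, 6):
    h = B ** -m
    print(m, (W(h) - W(0.0)) / h)
```

Because the true function has infinitely many terms, no matter how small an interval one takes, there are wiggles on a still smaller scale, which is why no tangent line (derivative) exists anywhere.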
The growing complexity of mathematical concepts soon stretched geometric analogy to the breaking point. Our world of visual experience was not merely proving inadequate for rigorous logic; it was actually showing itself in conflict with rigorous logical analysis. Soon mathematicians were forced to realize that the "appearances" of dimensional shapes, figures and solid objects in what we have always assumed to be the "real world" could not be relied upon in the quest for mathematical truth. Mathematicians finally abandoned the method of arriving at truth by geometric intuition and began reasoning, instead, in terms of the principles of arithmetic, regardless of the geometric paradoxes such an approach threw up.
It would appear, from the experience of mathematicians, in which "pure" arithmetical logic conflicts with geometric intuition, that the "stuff" of the world of our visual experience might be, after all, "illusion."
The writer John Thomas Didymus is the author of "Confessions of God: The Gospel According to St. John Thomas Didymus"(http://www.resurrectionconspiracy.com/). If you have found this article informative you are invited to read the article: "Bohr's Complementarity Principle: The Physical Universe as Virtual Reality Projection" on his blog: http://johnthomasdidymus.blogspot.com/2010/08/physical-universe-as-virtual-reality.html