The main point is a little buried in a modern treatment. The point is that it is consistent to imagine little itty-bitty numbers, infinitesimals, adjoined to your conception of the real numbers, and these infinitesimals contain the idea of limits and asymptotics. For example, $(3 + dx)^2 = 9 + 6\,dx$, where $dx$ is an infinitesimal; I dropped the $dx^2$, because the square of an infinitesimal is twice more infinitesimal than the infinitesimal and can be ignored. By definition, then, $6$ is the derivative of squaring at $3$. You can use this to do arithmetic well, after you internalize the idea.

You can also do calculations with trigonometry. Once you know enough, you see that, for infinitesimal $dx$ (in radians), $\sin dx = dx$ and $\cos dx = 1$, so that $\sin(x + dx) = \sin x + \cos x\,dx$. When $dt$ is not infinitesimal, there are all these higher orders, and keeping track of them gives the Taylor series. It allows you to identify certain functions as infinite polynomials, and treat them as polynomials. This idea is due to Newton; it was greatly developed by Euler, and it was made to stick by Cauchy and others in the 19th century, in the development of complex analysis and analytic function theory.

The next idea is that areas and derivatives are related. If you look at the area under a curve from $0$ to $x$, $A(x)$, then $A(x + dx) = A(x) + f(x)\,dx$ (you can see this by drawing rectangles), and therefore $f(x)$ is the derivative of $A(x)$. This allows you to give a systematic calculus for areas. This theorem is due to Isaac Barrow, Newton's advisor. It was what led Newton and Leibniz both to run with the idea.

The next idea is that of differential equations: you can express algorithms whose steps are infinitesimals as equations, $df = g(x, f)\,dx$, where $df$ means $f(x+dx) - f(x)$; then you can compute $f$ given an initial value. This allows you to speak about algorithms: a differential equation plus a little stepsize defines an algorithm to compute $f$, and if you iterate it, you do physics. This idea was developed by Newton, Euler, and a million people each focusing on a different differential equation, and today there is an industry for understanding these equations.

The next idea is that of partial derivatives: if you have a function of several variables, then $dF = F_x\,dx + F_y\,dy$. One set of ideas here is the Legendre transform, swapping out $y$ for $F_y$, which is ultimately explained by statistics and Gaussian integrals. Then there is the idea of vector spaces, and linear tangent spaces, and differential geometry, which leads to General Relativity.

In another generalization, these linear spaces extend to infinite-dimensional linear spaces, the Taylor polynomial series can be swapped out for the better-behaved Fourier series and other polynomial series, like those of Tschebycheff, the function classes expand to include random walks and the non-smooth monsters constructed in the 19th century, and the notion of integration becomes universal in the 20th century due to Lebesgue, Cohen, and Solovay.
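The squaring example above can be sketched in code with dual numbers, a standard way to make "drop the $dx^2$" mechanical. The `Dual` class here is my own illustration, not anything from the original text:

```python
# A minimal sketch of infinitesimal arithmetic using dual numbers:
# a Dual carries a real part and an infinitesimal part, and the dx^2
# term is dropped automatically by the multiplication rule.
class Dual:
    def __init__(self, real, inf=0.0):
        self.real = real  # ordinary real part
        self.inf = inf    # coefficient of the infinitesimal dx

    def __add__(self, other):
        return Dual(self.real + other.real, self.inf + other.inf)

    def __mul__(self, other):
        # (a + b dx)(c + d dx) = ac + (ad + bc) dx, ignoring the dx^2 term
        return Dual(self.real * other.real,
                    self.real * other.inf + self.inf * other.real)

x = Dual(3.0, 1.0)      # 3 + dx
sq = x * x              # (3 + dx)^2 = 9 + 6 dx
print(sq.real, sq.inf)  # 9.0 6.0
```

The infinitesimal coefficient of the result, $6$, is exactly the derivative of squaring at $3$.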
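The claim that certain functions behave as infinite polynomials can be checked numerically: partial sums of the Taylor series of $\sin$ converge to the function itself. The helper name `sin_taylor` is mine, for illustration:

```python
# A sketch of "functions as infinite polynomials": partial sums of
# the Taylor series of sin converge to the function itself.
import math

def sin_taylor(x, terms=10):
    # sin x = x - x^3/3! + x^5/5! - ...
    total = 0.0
    for k in range(terms):
        total += (-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
    return total

print(sin_taylor(1.0), math.sin(1.0))  # the two agree to many digits
```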
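Barrow's observation about areas, $A(x + dx) = A(x) + f(x)\,dx$, can be seen directly by building the area out of rectangles, as the text suggests. The choice $f(t) = t^2$ and the grid constants are hypothetical, for illustration:

```python
# A sketch of A(x + dx) = A(x) + f(x) dx: build the area under f
# from 0 to x out of rectangles of width dx, then check that adding
# one more rectangle changes A at rate f(x).
f = lambda t: t * t
dx = 1e-4
n = 20000                                    # n * dx = 2.0, so x = 2
A_x = sum(f(i * dx) * dx for i in range(n))  # A(2) as a sum of rectangles
A_x_dx = A_x + f(n * dx) * dx                # A(2 + dx) = A(2) + f(2) dx
print((A_x_dx - A_x) / dx)                   # recovers f(2) = 4
```

So the rate of change of the accumulated area is the height of the curve, which is the fundamental theorem.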
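The remark that a differential equation plus a little stepsize defines an algorithm is, concretely, Euler's method. The equation $df = f\,dx$ with $f(0) = 1$ is a hypothetical example chosen because its exact solution is $e^x$:

```python
# A sketch of "a differential equation plus a stepsize is an algorithm"
# (Euler's method): repeatedly apply df = g(x, f) dx with a small
# finite step dx, starting from an initial value.
def euler(g, x0, f0, x_end, dx):
    x, f = x0, f0
    while x < x_end:
        f += g(x, f) * dx   # df = g(x, f) dx
        x += dx
    return f

# df = f dx with f(0) = 1; the exact solution is e^x.
approx = euler(lambda x, f: f, 0.0, 1.0, 1.0, 1e-4)
print(approx)  # close to e = 2.71828...
```

Iterating a step like this is exactly what a physics simulation does.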
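The total differential of a function of several variables can also be checked numerically. The function $F(x, y) = x^2 y$ and the evaluation point are hypothetical choices for illustration:

```python
# A sketch of the total differential dF = F_x dx + F_y dy, checked
# numerically for F(x, y) = x^2 y at the point (x, y) = (1, 2).
F = lambda x, y: x * x * y
x, y = 1.0, 2.0
dx, dy = 1e-6, 2e-6
F_x = 2 * x * y      # partial derivative in x (here: 4)
F_y = x * x          # partial derivative in y (here: 1)
exact_change = F(x + dx, y + dy) - F(x, y)
linear_change = F_x * dx + F_y * dy
print(exact_change, linear_change)  # agree to leading order
```

The two changes differ only by terms quadratic in the small displacements, which is the content of the formula.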