In calculus, the derivative of a function $f(x)$ is the function $f'(x)$ given by
$$f'(x)=\lim_{h\rightarrow 0}\frac{f(x+h)-f(x)}{h}.$$
Let's test this definition here. Think of a function
$f(x)$ and its derivative, $g(x)$. For example, if $f(x)=\sin(x)$, then $g(x)=\cos(x)$.
Let's compute $f'(x)$ according to the definition above, letting $h$ get smaller and smaller,
and compare the result at some $x$ with the analytical derivative $g(x)$ at the same $x$, to
see what we get.
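As a quick sanity check before writing the loop, here is a minimal sketch in Python of a single difference quotient for $f(x)=\sin(x)$; the test point $x=1.0$ and the step $h=0.001$ are arbitrary choices for illustration, not part of the exercise:

```python
import math

x = 1.0     # an arbitrary test point
h = 0.001   # one small (but not yet limiting) step
approx = (math.sin(x + h) - math.sin(x)) / h  # difference quotient for f(x) = sin(x)
exact = math.cos(x)                           # analytical derivative g(x) = cos(x)
print(approx, exact)  # approx ~ 0.5399, exact ~ 0.5403
```

Even at this modest $h$, the difference quotient already agrees with $\cos(1)$ to about three decimal places.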
Now you try. Fix the return statements in both the f(x) and g(x) functions, then choose the maximum value of i you want the for-loop to run through.
This code will not run as given! You have to put a function in the return statement of the f(x) definition,
and its derivative in the return statement of the g(x) definition. Lastly, how many passes through the
for-loop do you want to run? This is a big question: how many iterations until you are convinced
that the definition of the derivative (above) is valid? (Note: to make $h\rightarrow 0$ as needed, we cut it by $1/4$ each
time through the loop. This is about the best a computer can do at taking a limit.)
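For reference, here is one way the completed exercise might look, as a sketch assuming that "cut it by $1/4$" means multiplying $h$ by $1/4$ each pass; the test point x = 1.0, the starting step h = 1.0, and the choice of 20 iterations are illustrative values, not the starter code's:

```python
import math

def f(x):
    return math.sin(x)   # the function to differentiate

def g(x):
    return math.cos(x)   # its analytical derivative

x = 1.0   # point at which to compare
h = 1.0   # starting step size
for i in range(20):                    # run 20 iterations of the limit
    approx = (f(x + h) - f(x)) / h     # difference quotient
    print(i, h, approx, g(x))
    h = h / 4                          # cut h by 1/4 each time through the loop
```

One caution when picking the number of iterations: past a certain point, shrinking $h$ stops helping, because floating-point round-off in $f(x+h)-f(x)$ eventually dominates the difference quotient. That is the sense in which this is the best a computer will get at taking a limit.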
Share your code
Show a friend, family member, or teacher what you've done!