We're on a bit of a roll implementing advanced maths in Sniff, so let's keep going. So far we've looked at ways of calculating sines and square roots, but perhaps we can write some code to solve any equation. Well, maybe not any - sometimes we talk about "well behaved" functions, which means there are some gotchas that we're going to pretend don't happen (they're a bit like "smooth" surfaces in physics) - but generally it will work for a lot of things we'd like to solve. This would also work just as well in Scratch, because Scratch is actually a lot more powerful than it looks.
Let's say we want to calculate the square root of 2. We can rewrite that a bit to get x*x=2, or x*x-2=0. We can generally rearrange any equation to make it something on the left, equal to zero on the right. We're going to find f(x)=0, and to use another handy phrase, we can do this "without loss of generality" (this means it looks like we're simplifying things, but actually we're not - any equation can be rewritten in this form).
Here's some unknown, typical function. It crosses the X axis, so there's a value x where f(x)=0, and we'd like to find it. We've no idea where it crosses, so let's guess! What should we guess? Well, for complex functions we may have to make a good guess, but for now let's just pick anything, and call our guess X0. We can find f(X0) and then (surprise!) we realise that f(X0)!=0. D'oh!
If we replace f(x) with a straight line y=mx+c based around f(X0), then we'd get a triangle like the one in the diagram. Now instead of solving our tricky function, we can solve our dumb approximation (because it's easier!): where does mx+c=0?
Well, if we can find the gradient of the line, m, then we can fairly easily show that the straight line will hit the X axis at X1 = X0 - f(X0)/m.
To find a gradient for the line that's a good approximation to the curve at X0, we need to consider a tiny triangle: we know y at X0. Now move a little to the right (a distance we call delta x, or simply dx) and evaluate f(X0+dx). The gradient of the line we need is just (f(X0+dx)-f(X0))/dx. In other words, how much we've changed in y, divided by how far we've moved in x. The exact value of dx doesn't really matter, but ideally we'd like it to be as small as possible.
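If you want to play with that rise-over-run estimate outside Sniff, here it is as a quick Python sketch (the function and the value of dx are just my example choices):

```python
# Estimate the gradient m at x0 by rise over run on a tiny triangle:
# move a small distance dx to the right and see how much y changes.
def gradient(f, x0, dx=0.0001):
    return (f(x0 + dx) - f(x0)) / dx

# For f(x) = x*x the true gradient at x0 = 3 is 6,
# and the estimate comes out very close to that.
m = gradient(lambda x: x * x, 3.0)
```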
If you've done calculus you should have spotted that m is an approximation to f'(x), and we can often calculate f'(x) directly instead of this triangle/gradient stuff [strictly speaking using this approximation makes this the secant method rather than Newton's method, but that's probably not important]. However if you've not done calculus then just know that mathematicians sometimes write m as f'(x).
So X1 is the solution? Well, sort of - except it's the solution to the straight line, which is only an approximation to f(x). But just like in our sqrt code from the previous post, X1 is (probably) a better solution than X0. In that case, all we had to do was use each guess to calculate a better guess, and after a few goes round we got a pretty good answer. This gives us the final equation: Xn+1 = Xn - f(Xn)/m
Given a guess Xn, we can find a better guess by plugging it into this equation. Intuitively we can see that if we get to the right answer then f(Xn)=0, so the value stops changing. This is called Newton's method, or Newton-Raphson.
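To see one round of that update rule in action, here's a small Python sketch. I've picked a function where we happen to know f'(x) exactly (f(x)=x*x-2, so f'(x)=2x) - that's my example, not the Sniff code below:

```python
# One Newton-Raphson step: a guess xn becomes xn - f(xn)/f'(xn).
def newton_step(f, fprime, xn):
    return xn - f(xn) / fprime(xn)

# A rough guess of 1 improves to 1.5 in a single step,
# noticeably closer to the real root at 1.41421...
x1 = newton_step(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
```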
That was a lot of maths and no code! Let's write that in Sniff:
make x number
make f number
make dx number
make fx number
make m number
when start
.set x to 45
.set dx to 0.0001
.repeat 10
..broadcast calcF and wait
..set fx to f
..say join join [x] ":" [fx]
..change x by dx
..broadcast calcF and wait
..change x by -dx
..set m to (f-fx)/dx
..set x to x-fx/m
To get a better guess, we increase x by dx and find f at this point (then we change x back, so we're back in the right place!). Now we can use the new value of f, along with the previous fx, to find the gradient m. That gives us everything we need to apply the Newton-Raphson formula, get a better guess, and go back around the loop again.
So let's use that code to find the square root of 2. Remember the approach finds f(x)=0, so we write the problem as x*x-2=0. We can just drop that into a script to calculate F, and let the rest of the code run:
when calcF
.set f to x*x-2
Well, the first thing we learn is that 45 is a pretty terrible guess for the square root of 2, but after about 5 times around, replacing our current guess with a better one each time, things start to look promising - we've figured out it's 1 and a bit! After nine times around we get the answer 1.41421, which is correct to 5 decimal places. If you look at the 10th time around, the answer is the same when it's printed out, but the error is even smaller.
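For readers without Sniff to hand, here's a rough Python translation of the same loop - the same terrible first guess of 45, ten times around:

```python
# Mirror of the Sniff loop: nudge x by dx to estimate the gradient m,
# then apply the Newton-Raphson update, ten times around.
def solve(f, x, dx=0.0001, rounds=10):
    for _ in range(rounds):
        fx = f(x)
        m = (f(x + dx) - fx) / dx
        x = x - fx / m
    return x

root = solve(lambda x: x * x - 2, 45.0)  # the same terrible first guess
```

Printing x each time around shows the same pattern as the Sniff run: wild for the first few rounds, then homing in on 1.41421.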
But we didn't really think about what was going to be in calcF when we wrote the code, so we can drop in anything we want. What about cos:
when calcF
.set f to cos of x
Well 45 is still a pretty terrible guess, but after only 4 times around the loop we've got the right answer and the error is effectively zero.
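The same loop in Python with cos dropped in. Exactly which zero-crossing it lands on depends on the starting guess, but wherever it settles, cos of the answer is zero:

```python
import math

# The same Newton-Raphson loop with f(x) = cos(x). Starting from 45
# it settles on one of the many places where cos crosses zero.
x, dx = 45.0, 0.0001
for _ in range(10):
    fx = math.cos(x)
    m = (math.cos(x + dx) - fx) / dx
    x = x - fx / m
```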
But we knew those answers already, or could calculate them easily. Let's try something we don't know:
This is a cubic, so it's not so easy to solve. In fact equations like this are exactly the sort of thing Newton-Raphson is used for. Let's just throw it at the code and see what happens:
Apparently it has a value of zero at about 0.8. Let's plot it and see (OS X has a neat graph plotting app, Grapher, in the Applications/Utilities folder):
Well so it does!
But wait a minute - that graph crosses the X axis THREE times. There are three possible correct answers, but we've just found one. It all depends on that first guess (it's also possible to make a really bad first guess, but we'll ignore that for now). It looks like there are crossings around -0.5 and -2.5, so let's try starting points of -1 and -2:
If we pick a value for our first guess that is close to a correct answer, then it converges pretty quickly. If we plot a graph then we can usually use that to make a good first guess, and from there the code does the rest!
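The post's actual cubic isn't reproduced here, so here's a Python sketch of the same multiple-roots effect using a made-up stand-in, f(x) = x*x*x - x, which crosses the X axis at -1, 0 and 1. Three different starting guesses find three different roots:

```python
# A made-up cubic with three roots (-1, 0 and 1). Which root the
# Newton-Raphson loop finds depends entirely on the starting guess.
def cubic(x):
    return x * x * x - x

def solve(f, x, dx=0.0001, rounds=20):
    for _ in range(rounds):
        fx = f(x)
        m = (f(x + dx) - fx) / dx
        x = x - fx / m
    return x

# Guesses near each crossing converge to that crossing.
roots = [solve(cubic, guess) for guess in (-2.0, 0.2, 2.0)]
```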
Sniff (or Scratch) isn't my first choice of language for this kind of work - if you're serious about maths then you should be looking at something like Mathematica (which is FREE on the Raspberry Pi, and justifies the cost of a Pi by itself) - but it's up to the job, and may be appropriate if you're teaching someone with strong maths who hasn't learnt a more advanced programming language yet.
I think one of the main takeaways from this series of posts is how coding can really help in maths classes. Newton-Raphson and Taylor approximations are pretty abstract concepts, but they're ideally suited to being coded. Once implemented in code they become real things, rather than just equations on a page.