Next we'll take a look at various types of recurrences that are more complicated than the linear first-order one we just saw, just to get some idea of what can arise when we're faced with a recurrence relation to solve. The simplest one we just looked at is the so-called linear first-order recurrence. Linear means that we don't multiply a_n by a_{n-1} or anything like that; the coefficients don't involve our unknowns. And first order means that on the right-hand side we just have one term, a_{n-1}. Here's a nonlinear one, where we're dividing a constant by a_{n-1}; that's not a linear combination, it's dividing. So it's first-order nonlinear, and actually we'll talk about that one. Here's second-order linear: a linear combination of a_{n-1} and a_{n-2}. The coefficients can involve n and constants, but no multiplying or dividing by our unknowns. And this is an example of a nonlinear second-order recurrence, where it's a function of a_{n-1} and a_{n-2}, but we're multiplying terms and taking square roots or whatever else. Now these types of recurrences can be very easy to write down, and it's even easy to write computer programs to compute their values, but doing the math behind some of them can be quite a challenge. Then there are variable coefficients, the kind we started with, where the coefficients aren't constants but functions of n. Those arise a lot in the analysis of algorithms. And then there are higher-order recurrences, which go back further than just the previous two terms. The quicksort recurrence is an example of that: it's full history, meaning every term in the sequence depends on all the previous terms in the sequence. And then there's a special case that arises quite often in the analysis of algorithms, the so-called divide-and-conquer recurrences, where a_n depends on terms about halfway back in the sequence.
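The lecture slides with the actual recurrences aren't visible in the transcript, so as a sketch, here are two of the types named above in their standard forms (an assumption on my part, not taken from the slides): the full-history quicksort recurrence for average comparisons, C_n = n + 1 + (2/n) * sum of C_0..C_{n-1} with C_0 = 0, and a divide-and-conquer recurrence of the mergesort kind, a_n = a_{floor(n/2)} + a_{ceil(n/2)} + n with a_1 = 0. Both are easy to compute, just as the lecture says:

```python
def quicksort_avg(n):
    """Full-history recurrence: C_n = n + 1 + (2/n) * sum(C_0..C_{n-1})."""
    c = [0.0]        # C_0 = 0
    total = 0.0      # running sum of all previous terms (the "full history")
    for i in range(1, n + 1):
        c.append(i + 1 + 2 * total / i)
        total += c[i]
    return c[n]

def mergesort_count(n):
    """Divide-and-conquer recurrence: terms about halfway back."""
    if n <= 1:
        return 0
    return mergesort_count(n // 2) + mergesort_count(n - n // 2) + n
```

Note that computing C_n naively by recomputing the sum each time would take quadratic time; keeping the running sum makes it linear.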
And we'll have quite a bit to say about that in just a bit; those are very important in the analysis of algorithms. So, what about nonlinear first-order recurrences? Those are actually familiar to people who do numerical calculations. A very famous example is Newton's method for computing roots of functions. I'll just do an example: computing the square root of 2. Newton's method says that to compute the square root of 2, you start with some initial estimate, like 2, and then take the average of your current value and 2 divided by your current value, and use that as your next estimate. That defines a sequence, and you can see from the formula that if you get exactly the square root of 2, you stay there: 2 over sqrt(2) is sqrt(2), and sqrt(2) plus sqrt(2), divided by 2, equals sqrt(2). So you should converge on the value of the square root of 2. That's Newton's method. And I can write code to compute the square root of 2 this way, just as with the code I gave for the quicksort and Fibonacci recurrences. If you print out values, you see it gets very close to the square root of 2, and very quickly: it actually has quadratic convergence, meaning the number of significant digits doubles on each iteration. It's a very efficient way to compute roots, and it applies to lots of other functions, not just square roots. So that's a nonlinear first-order recurrence. And again, doing the math to prove that this happens is certainly like analyzing an algorithm, but these recurrences are of a quite different nature than the kind we're going to be largely considering in this course. So, higher-order linear recurrences: those are something that come up a lot.
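The original code from the lecture isn't shown in the transcript, but the iteration described above, a_0 = 2 and a_n = (a_{n-1} + 2/a_{n-1}) / 2, is a few lines (function name and iteration count are my own choices):

```python
def newton_sqrt2(iterations=6):
    """Newton's method for sqrt(2): repeatedly average a with 2/a."""
    a = 2.0  # initial estimate
    for _ in range(iterations):
        a = (a + 2.0 / a) / 2.0
    return a
```

Printing the successive estimates (2, 1.5, 1.41666..., 1.4142157..., ...) shows the quadratic convergence mentioned above: each iteration roughly doubles the number of correct digits, so a handful of iterations reaches machine precision.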
In the next lecture, we're going to show a systematic solution that uses generating functions for solving recurrences like this, and then later on we're going to examine them from an even more general point of view, but here's an example. This is a second-order linear recurrence with constant coefficients, like the Fibonacci numbers: a_n = 5a_{n-1} - 6a_{n-2}. It's hard to telescope this one. You could try, but very soon you'd have a lot of terms and maybe not so much of an idea of where it's going. One way to see how such an equation can be solved (and again, this is kind of a magic step, but I'll show next lecture how it becomes systematic) is to guess that the solution grows like some number to the nth power. So let's say a_n = x^n. If a_n is equal to x^n, then it would have to be the case that x^n = 5x^{n-1} - 6x^{n-2}. That kind of equation is a good thing for us, because it means we can get rid of the n just by dividing by x^{n-2}. Now we've got a quadratic equation, x^2 = 5x - 6, and we've known since middle school how to solve quadratic equations: it factors as (x - 2)(x - 3) = 0. That means both 2^n and 3^n are solutions, and actually it's not too much of a step to see that the final solution must be of the form a_n = c_0 * 3^n + c_1 * 2^n for some constants c_0 and c_1. Again, I'm not going to go through the proof of these assertions, because I'll give a very systematic way of doing it in just the next lecture, but you can believe from these manipulations that that's what has to happen. So now what we have to do is find those two constants. Well, they're easy to find, because of the initial conditions: we need two constants, and we have two initial conditions, so we can just plug in for a_0 and a_1.
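You can check the claim numerically without any proof: 2^n, 3^n, and any linear combination of them all satisfy a_n = 5a_{n-1} - 6a_{n-2}. A quick sketch (the particular constants 7 and -4 are arbitrary, chosen just for the check):

```python
def satisfies(seq):
    """Does the sequence satisfy a_n = 5*a_{n-1} - 6*a_{n-2}?"""
    return all(seq[n] == 5 * seq[n-1] - 6 * seq[n-2]
               for n in range(2, len(seq)))

powers2 = [2 ** n for n in range(10)]
powers3 = [3 ** n for n in range(10)]
combo   = [7 * 3 ** n - 4 * 2 ** n for n in range(10)]  # arbitrary c_0, c_1
```

All three pass the check, which is what makes the two-parameter family c_0 * 3^n + c_1 * 2^n plausible before the systematic proof arrives next lecture.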
And it says that we must have 0, which is a_0, equal to c_0 + c_1, and 1, which is a_1, equal to 3c_0 + 2c_1. Those are simultaneous equations in c_0 and c_1, and the solution is c_0 = 1, c_1 = -1: 1 plus -1 is 0, and 3 times 1 plus 2 times -1, that's 3 minus 2, is 1. So a_n = 3^n - 2^n is the solution to that recurrence, and you can plug it in to check that it works. And that kind of technique is going to work for any second-order linear recurrence with constant coefficients, like the Fibonacci numbers. For the Fibonacci numbers it's the same setup, but the coefficients are both 1, so we get x^n = x^{n-1} + x^{n-2}, which gives us the quadratic equation x^2 - x - 1 = 0. The solution to that, using the quadratic formula, -b plus or minus the square root of b^2 - 4ac, over 2a, is going to have a square root of 5 in it: b^2 is 1, 4ac is -4, so -4ac is +4, and 1 plus 4 is 5. So it works out from the quadratic formula that the equation factors in terms of the two roots phi = (1 + sqrt(5))/2 and phihat = (1 - sqrt(5))/2. The first one, (1 + sqrt(5))/2, is a very famous number known as the golden ratio, and maybe we'll come back to it at some point; it has all kinds of interesting properties. But in the current context, it's just a root of that quadratic equation. So, again, just as before, the solution must be a linear combination of the two: a_n = c_0 * phi^n + c_1 * phihat^n. And how do we find those coefficients? Same way, we just plug in: a_0 = 0 has to be c_0 + c_1, and a_1 = 1 has to be phi * c_0 + phihat * c_1. Solve those two equations, and you find that c_0 is 1 over sqrt(5) and c_1 is minus 1 over sqrt(5). And that's the solution. So that's what we're after: given the recurrence, we have a simple formula for the nth term.
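Both closed forms are easy to verify numerically: a_n = 3^n - 2^n against its recurrence and initial conditions, and the Fibonacci formula F_n = (phi^n - phihat^n) / sqrt(5) against the Fibonacci numbers themselves (floating-point roundoff means we round the result to the nearest integer):

```python
import math

def a(n):
    """Closed form for a_n = 5a_{n-1} - 6a_{n-2}, a_0 = 0, a_1 = 1."""
    return 3 ** n - 2 ** n

phi    = (1 + math.sqrt(5)) / 2   # golden ratio
phihat = (1 - math.sqrt(5)) / 2

def fib_closed(n):
    """Closed form: (phi^n - phihat^n) / sqrt(5)."""
    return (phi ** n - phihat ** n) / math.sqrt(5)

def fib(n):
    """Fibonacci by the recurrence itself, F_0 = 0, F_1 = 1."""
    x, y = 0, 1
    for _ in range(n):
        x, y = y, x + y
    return x
```

Since |phihat| < 1, the phihat^n term vanishes as n grows, so F_n is simply phi^n / sqrt(5) rounded to the nearest integer.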
That's solving the recurrence. And again, we can do that systematically for any such recurrence, and this will come up another couple of times in the next few lectures. So this procedure, in a sense, amounts to an algorithm: given any such recurrence, we can go through these steps and come up with a solution, and that's ideal. Actually, there's one case that we didn't consider, and that is what if the roots are the same. If the roots are the same, then we get an n times the root to the nth power as a term. That's discussed in the text, but I'm not going to talk about it now because, again, we're going to do this with generating functions next time. So what you wind up needing is to compute roots of equations quickly, and this extends to higher-order recurrences. Nowadays you'd use a symbolic math package to compute the roots. This example uses Sage, but you can use Mathematica or Maple or other packages; that's how you'd do it nowadays. So we have an algorithm for solving higher-order recurrences with constant coefficients, and we'll develop it more fully later on. The other point that I didn't mention is that the roots might be complex. Again, rather than go through it in this context, we'll see what happens when we talk about it in the context of generating functions. Okay, the last thing I want to mention is that a lot of times we get recursive programs that map directly to recurrences of the divide-and-conquer style. Those are not traditionally found in studies of recurrences in mathematics; that's something that's really brought to the table by computer algorithms. We'll look at those in some detail later in this lecture. Those are the general types of recurrences that come up in the analysis of algorithms.
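The Sage session from the lecture isn't reproduced in the transcript, but for the second-order case the root computation is just the quadratic formula, and using complex square roots handles the complex-root case mentioned above. A minimal stand-in sketch (not the lecture's actual Sage code):

```python
import cmath

def quadratic_roots(c2, c1, c0):
    """Roots of c2*x^2 + c1*x + c0 = 0; complex arithmetic
    covers the case where the discriminant is negative."""
    d = cmath.sqrt(c1 * c1 - 4 * c2 * c0)
    return (-c1 + d) / (2 * c2), (-c1 - d) / (2 * c2)

# characteristic equation of a_n = 5a_{n-1} - 6a_{n-2}: x^2 - 5x + 6 = 0
# characteristic equation of Fibonacci: x^2 - x - 1 = 0
```

For higher-order characteristic polynomials there's no such simple formula in general, which is exactly why a symbolic or numerical package (Sage, Mathematica, Maple) is the practical tool.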