
Number sequence help
Say I have the following recursive number sequence: a(n) = a(n-1) + 2a(n-2).
I know that a(1) = 2 and I know that a(2) = 3. Now I need to find an explicit formula for that.
I know it has something to do with the Fibonacci sequence, since if a(1) = 1 and a(2) = 1 the explicit formula would have been a(n) = (2^n - (-1)^n)/3. The problem is that this does not work when a(1) = 2 and a(2) = 3.
Can anybody explain how I should come up with an explicit formula for this problem?
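Just to pin down what the sequence looks like, here is a quick sketch (my own illustration, not from the original post) that iterates the recurrence directly, so any proposed closed form can be checked against it:

```python
def a_terms(n, a1=2, a2=3):
    """Return the first n terms of a(k) = a(k-1) + 2*a(k-2)."""
    terms = [a1, a2]
    while len(terms) < n:
        terms.append(terms[-1] + 2 * terms[-2])
    return terms[:n]

print(a_terms(8))  # -> [2, 3, 7, 13, 27, 53, 107, 213]
```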


herrrrrrrrrrrm.......ummmmmmmmm.......no :D

It can be solved with matrices. I'll see if I can produce some images for you.

That would be very nice!!

OK, the board is semi-down right now, so we'll see if this gets through...
I'm going to assume that you know how to handle matrices. First, we'll define a matrix A like this:

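(The image with the matrix seems to be lost. A standard choice for this kind of recurrence, which may or may not be the exact matrix in the missing image, is the companion matrix that advances the state vector one step:)

```python
import numpy as np

# Companion matrix of a(n) = a(n-1) + 2*a(n-2):
#   [a(n+1)]   [1 2] [a(n)  ]
#   [a(n)  ] = [1 0] [a(n-1)]
X = np.array([[1, 2],
              [1, 0]])

state = np.array([3, 2])  # [a(2), a(1)]
state = X @ state         # one step of the recurrence
print(state)              # -> [7 3], i.e. [a(3), a(2)]
```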
Ahh, the boards are back!
Now, we add your recursive formula:

OK... yeah, I know how to handle matrices (at least to the point where I can multiply them).

We want to diagonalize X, to make it easy to calculate X^n.
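(The diagonalization image is also missing. Assuming X is the companion matrix above, it works out cleanly because the characteristic polynomial x^2 - x - 2 = (x - 2)(x + 1) gives eigenvalues 2 and -1. A sketch, my reconstruction rather than the poster's exact steps, of computing X^n as P D^n P^(-1):)

```python
import numpy as np

X = np.array([[1.0, 2.0],
              [1.0, 0.0]])

# Eigendecomposition: X = P D P^(-1), so X^n = P D^n P^(-1)
eigvals, P = np.linalg.eig(X)  # eigenvalues are 2 and -1

def X_power(n):
    """Compute X^n via the diagonalization, instead of n matrix products."""
    D_n = np.diag(eigvals ** n)
    return P @ D_n @ np.linalg.inv(P)

# Advancing [a(2), a(1)] = [3, 2] by X^3 should give [a(5), a(4)] = [27, 13]
print(np.rint(X_power(3) @ np.array([3.0, 2.0])).astype(int))
```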

When we perform the last multiplication with A2, the first row of the result will be a(n+2).
Here's the final formula:

Numbers generated from this formula:
3 7 13 27 53 107 213 427 853 1707 3413 6827 13653 27307 54613 109227 218453 436907 873813 1747627 3495253
Seems correct for the first three, at least. HTH

That definitely seems to be correct! Thanks a bunch; now I just have to get a good grip on everything, but that shouldn't be too hard... :D! Thanks.
It is correct for the first 18 numbers.