I was doing a test of the game I am making, and while testing some new things I got a dramatic decrease in my frame rate.

First I changed the resolution to 640x480, which I guessed would decrease my frame rate by a little, but not a lot (I initially had it at 320x240).

I also changed the background map that I blit onto the screen. For all my testing in the past, I have used a very simple background BMP that I threw together in Paintbrush, and with that background at 320x240 I was getting 70 fps.

This time, however, I took the time to get a nice background map so the game would look pretty. Using this nice background at 640x480, I got 27 fps.

This was a dramatic decrease in fps. I guessed it was solely because of the resolution change and had nothing to do with the change in bitmaps, because I am simply blitting the BMPs onto the screen. A blit is a blit, right? It is just a memcpy from one memory location to another, so I figured the amount of detail in the BMP wouldn't matter.
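
Here is roughly the mental model I have of what a blit boils down to (just my own sketch, not the actual code my library runs underneath):

```cpp
#include <cstdint>
#include <cstring>

// My mental model of a plain software blit: copy a rectangle row by row.
// Nothing here depends on what the pixels contain -- only on the rectangle
// size, the surface pitches, and the bytes per pixel.
void blit(const std::uint8_t* src, int srcPitch,
          std::uint8_t*       dst, int dstPitch,
          int width, int height, int bytesPerPixel)
{
    for (int y = 0; y < height; ++y)
    {
        std::memcpy(dst + y * dstPitch,
                    src + y * srcPitch,
                    static_cast<std::size_t>(width) * bytesPerPixel);
    }
}
```

If that model is right, the only things that should change the cost are the rectangle size and the bytes per pixel: 640x480 has four times as many pixels as 320x240, so the resolution change alone could plausibly hurt the frame rate a lot, but swapping one same-sized background BMP for another shouldn't.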

So then I changed my resolution back to 320x240 to see if I would get my nice 70 fps back even with the new background, but it stayed in the 20-30 fps range every time I tested it.
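
To figure out where the time is actually going, I am planning to time the background blit by itself each frame. This is the kind of helper I mean (just a sketch using std::chrono; the lambda is a stand-in for my real blit call):

```cpp
#include <chrono>
#include <cstdio>

// Times a single call to whatever blit routine is passed in and returns the
// cost in milliseconds. "blitFn" is a placeholder for the real blit call.
template <typename BlitFn>
double timeBlitMs(BlitFn&& blitFn)
{
    using clock = std::chrono::steady_clock;
    const auto start = clock::now();
    blitFn();                                // the call being measured
    const auto stop  = clock::now();
    return std::chrono::duration<double, std::milli>(stop - start).count();
}

int main()
{
    // Stand-in workload so the sketch compiles on its own; in the game this
    // lambda would wrap the actual background blit.
    const double ms = timeBlitMs([] { /* blit background here */ });
    std::printf("background blit took %.3f ms\n", ms);
}
```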

So there are a couple of things that might be the factor:

A. I forgot to change something back to its original setting for 320x240 mode after changing it for 640x480 mode, and that something is affecting my frame rate in a seriously negative way.

OR

B. The detail in a blit DOES matter, so the big increase in detail in my new background has dramatically decreased my frame rate.

What do you think it is? Does the detail in a BMP really matter when you are blitting? What's my problem?
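
One thing I am going to check, in case B is sort of right in a roundabout way, is whether the new background BMP has a different color depth than my old Paintbrush one (say 24-bit instead of 8-bit), since a format that doesn't match the screen could mean a conversion on every blit. Here is a quick standalone check I threw together; printBmpInfo is just my own throwaway helper, not anything from my game or library:

```cpp
#include <cstdint>
#include <cstdio>

// Reads the width, height, and bits-per-pixel fields out of a BMP file so I
// can compare the old and new backgrounds. Offsets assume the standard
// 14-byte BITMAPFILEHEADER followed by a BITMAPINFOHEADER.
bool printBmpInfo(const char* path)
{
    std::FILE* f = std::fopen(path, "rb");
    if (!f)
        return false;

    unsigned char header[54] = {0};
    const std::size_t read = std::fread(header, 1, sizeof(header), f);
    std::fclose(f);
    if (read < sizeof(header) || header[0] != 'B' || header[1] != 'M')
        return false;

    // Little-endian fields inside the BITMAPINFOHEADER.
    const std::uint32_t width  = static_cast<std::uint32_t>(header[18])       |
                                 static_cast<std::uint32_t>(header[19]) << 8  |
                                 static_cast<std::uint32_t>(header[20]) << 16 |
                                 static_cast<std::uint32_t>(header[21]) << 24;
    const std::uint32_t height = static_cast<std::uint32_t>(header[22])       |
                                 static_cast<std::uint32_t>(header[23]) << 8  |
                                 static_cast<std::uint32_t>(header[24]) << 16 |
                                 static_cast<std::uint32_t>(header[25]) << 24;
    const int bitCount = header[28] | (header[29] << 8);

    std::printf("%s: %dx%d, %d bits per pixel\n",
                path, static_cast<int>(width), static_cast<int>(height), bitCount);
    return true;
}
```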