Ok after talking with Shakti for a couple of hours on Teamspeak I think we finally have a solution or close to one.
Whether I use my tree class or std::map all depends on the speed of my functions vs. the speed of the STL. I sorta have a feeling I will lose... but that's not the issue.
But I will use this structure:
Code:
struct ResNode
{
    CResource *pResource;   // the resource this node owns
    ResNode   *pNext;       // toward the top (more recently used)
    ResNode   *pPrev;       // toward the bottom (least recently used)
};

class CResMgr
{
protected:
    std::map<DWORD, ResNode *> m_mapResources;  // ID -> node, O(log n) lookup
    ...
    ...
};
CResource is a simple class that, like the code posted previously, just stores an ID in the base class and other resource classes then derive from this very simple base.
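To make that concrete, here's a minimal sketch of what I mean by the base class. The DWORD typedef is a stand-in for the Windows one, and CTexture is just a made-up example of a derived resource, not something from my actual code:

```cpp
typedef unsigned long DWORD;   // stand-in for the Windows typedef

class CResource
{
public:
    explicit CResource(DWORD dwID) : m_dwID(dwID) {}
    virtual ~CResource() {}                 // deleted through base pointers
    DWORD GetID() const { return m_dwID; }

private:
    DWORD m_dwID;                           // the ID the tree is keyed on
};

// Hypothetical derived type -- concrete resources just extend the base.
class CTexture : public CResource
{
public:
    explicit CTexture(DWORD dwID) : CResource(dwID) {}
    // texture-specific members would go here
};
```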
Cannot use std::list and cannot store vector indices in the nodes
The biggest problem with using a map and a list of IDs or vector of IDs is this. When a resource is found to be in the tree, I must move that ID to the top of the list. This means that the list or cache holds the most recently used resources. There are several problems here.
This operation is basically tree to list. I find the ID in the tree and then somehow I must find the ID in the list to move it to the top. Searching the list (std::find, since std::list has no find() member) is most certainly not an option since it is linear in time. Storing indices in the tree is also not an option because at some point the indices into the list will become invalid and/or will point to the wrong objects.
Because of these two limiting factors, neither std::list nor stored indices into a vector in the tree nodes is an option.
Solution: Linked list inside of the tree
So take the following object IDs representing objects, and a cache representing the order in which they were created. Instead of calling the list a list, I will refer to it as the cache from here on, since that is its function.
How the cache works
The cache grows at the top. This requires the class to track the top of the cache, which is very simple: when there are no objects in the tree yet, we know the first object added is going to be the top. When I add a new object to the tree, the new node becomes the top; its prev member points at the former top, and the former top's next member points back at the new node. In this fashion the cache grows with top always changing. I will also have to track bottom so I know which object to remove from memory. Instead of actually shifting the object IDs in the cache, I will track the total size of the cache. Once the cache reaches a set limit, I automatically remove the bottommost cache entry from the tree. This does NOT guarantee an optimum removal for the tree... but it does manage the memory used by the game.
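The add-to-top step could look something like this. I've stripped it down to bare nodes holding plain IDs instead of CResource pointers, just to show the top/bottom bookkeeping on its own; the Cache struct and member names are placeholders, not my real CResMgr:

```cpp
#include <cstddef>

struct Node
{
    unsigned long id;
    Node *pNext;    // toward the top (more recently used)
    Node *pPrev;    // toward the bottom (least recently used)
};

struct Cache
{
    Node *pTop;
    Node *pBottom;
    Cache() : pTop(NULL), pBottom(NULL) {}

    // New node always becomes the top of the cache.
    Node *PushTop(unsigned long id)
    {
        Node *pNode  = new Node;
        pNode->id    = id;
        pNode->pNext = NULL;     // nothing is newer than the new node
        pNode->pPrev = pTop;     // the old top now sits right below it
        if (pTop)
            pTop->pNext = pNode;
        else
            pBottom = pNode;     // very first node is also the bottom
        pTop = pNode;
        return pNode;
    }
};
```

In the real thing each push also inserts the node into the map, and a size counter decides when to evict the bottom.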
Sample tree and cache
Top of the cache is 5 and bottom is 7. I could not draw the tree here but it is not important to the discussion anyways.
Tree - 5 10 3 4 8 1 7
Cache - 7 1 8 4 3 10 5
Object 7 was created first, so it is the least recently used; object 5 was created last, so it is the most recent. Now let's say something in the game is requesting object 4.
All I really need to do is stick object 4 in front of 5 and then close up the gap created by 4 being moved. This is a simple operation. 4's prev member is 8 and its next member is 3. Since 4 is moving, 8's next member becomes 4's next member (3), and 3's prev member becomes 4's prev member (8). A simple linked list removal. That takes care of the cache gap.
Now to take care of the top of the cache. 5's next member is NULL since it's the top of the cache. If we set 5's next member to 4, 4's prev member to 5, 4's next member to NULL, and point top at 4, then we have moved 4 from its spot in the cache to the top of the cache. We have changed the order of the cache and changed the top with a handful of simple pointer assignments.
The beauty of this system is that I do not have to 'find' the object ID in the cache, since I'm always storing its prev and next members in the tree nodes. As long as next and prev are correct this system will work: I can use the tree to find the objects, which is very fast, and use the linked list approach to reorder and remove items. I don't have to start at the top of the cache and work through it in linear fashion because I already know the prev and next members of the current object.
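The unlink-and-relink walkthrough above can be sketched like this, again with bare ID nodes and my own placeholder names (the real nodes would hold CResource pointers and come out of the map, not a return value):

```cpp
#include <cstddef>

struct Node
{
    unsigned long id;
    Node *pNext;    // toward the top (more recently used)
    Node *pPrev;    // toward the bottom (least recently used)
};

struct Cache
{
    Node *pTop;
    Node *pBottom;
    Cache() : pTop(NULL), pBottom(NULL) {}

    Node *PushTop(unsigned long id)
    {
        Node *pNode  = new Node;
        pNode->id    = id;
        pNode->pNext = NULL;
        pNode->pPrev = pTop;
        if (pTop) pTop->pNext = pNode; else pBottom = pNode;
        pTop = pNode;
        return pNode;
    }

    void MoveToTop(Node *pNode)
    {
        if (pNode == pTop)
            return;                            // already most recent
        // Close the gap: in the example, 8's next becomes 3 and
        // 3's prev becomes 8.
        if (pNode->pPrev)
            pNode->pPrev->pNext = pNode->pNext;
        else
            pBottom = pNode->pNext;            // pNode was the bottom
        pNode->pNext->pPrev = pNode->pPrev;    // pNode != top, so pNext != NULL
        // Hook it in above the old top: 5's next becomes 4, 4's prev
        // becomes 5, 4's next becomes NULL, and top now points at 4.
        pNode->pPrev = pTop;
        pNode->pNext = NULL;
        pTop->pNext  = pNode;
        pTop = pNode;
    }
};
```

The two special cases worth noting: moving the node that is already the top is a no-op, and moving the bottom node has to update the bottom pointer.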
Confusing? It was for me until I thought it out.
In short the cache represents the most recently used (top) to least recently used (bottom) object IDs in the tree. The objects are in the tree and as long as the ID assignment scheme is decent it should be a quick find operation to access them and return the resource to the caller.
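Putting the pieces together, the whole manager might look something like this. This is a sketch, not my final code: everything beyond CResource, ResNode, CResMgr and m_mapResources from the post above (the limit, member names, eviction helper) is assumed.

```cpp
#include <cstddef>
#include <map>

typedef unsigned long DWORD;   // stand-in for the Windows typedef

class CResource {
public:
    explicit CResource(DWORD id) : m_dwID(id) {}
    virtual ~CResource() {}
    DWORD GetID() const { return m_dwID; }
private:
    DWORD m_dwID;
};

struct ResNode {
    CResource *pResource;
    ResNode *pNext;   // toward the top (more recently used)
    ResNode *pPrev;   // toward the bottom (least recently used)
};

class CResMgr {
public:
    explicit CResMgr(size_t nLimit)
        : m_pTop(NULL), m_pBottom(NULL), m_nLimit(nLimit) {}
    ~CResMgr() { while (m_pBottom) EvictBottom(); }

    // Takes ownership of pRes; new resources always enter at the top.
    void Add(CResource *pRes) {
        ResNode *pNode  = new ResNode;
        pNode->pResource = pRes;
        pNode->pNext = NULL;
        pNode->pPrev = m_pTop;
        if (m_pTop) m_pTop->pNext = pNode; else m_pBottom = pNode;
        m_pTop = pNode;
        m_mapResources[pRes->GetID()] = pNode;
        if (m_mapResources.size() > m_nLimit)
            EvictBottom();                    // drop least recently used
    }

    // The lookup path: O(log n) map find, then the O(1) relink.
    CResource *Get(DWORD id) {
        std::map<DWORD, ResNode *>::iterator it = m_mapResources.find(id);
        if (it == m_mapResources.end()) return NULL;
        MoveToTop(it->second);
        return it->second->pResource;
    }

    bool Contains(DWORD id) const { return m_mapResources.count(id) != 0; }

private:
    void MoveToTop(ResNode *pNode) {
        if (pNode == m_pTop) return;
        if (pNode->pPrev) pNode->pPrev->pNext = pNode->pNext;
        else m_pBottom = pNode->pNext;        // pNode was the bottom
        pNode->pNext->pPrev = pNode->pPrev;   // safe: pNode != top
        pNode->pPrev = m_pTop;
        pNode->pNext = NULL;
        m_pTop->pNext = pNode;
        m_pTop = pNode;
    }

    void EvictBottom() {
        ResNode *pDead = m_pBottom;
        m_pBottom = pDead->pNext;
        if (m_pBottom) m_pBottom->pPrev = NULL; else m_pTop = NULL;
        m_mapResources.erase(pDead->pResource->GetID());
        delete pDead->pResource;
        delete pDead;
    }

    std::map<DWORD, ResNode *> m_mapResources;
    ResNode *m_pTop;
    ResNode *m_pBottom;
    size_t m_nLimit;
};
```

With a limit of 3, adding a fourth resource evicts the bottom one, and a Get() saves a resource from eviction by moving it back to the top.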
So whaddya think? I think this is the same as a double ended queue. I have not had much exposure to that container so I don't know.
ID assignment algorithm
The next task is correct assignment of IDs to keep the tree somewhat balanced. In some of the examples that Shakti and I worked through, the tree quickly degenerated into a linked list, which is not what I want. (std::map is typically implemented as a self-balancing red-black tree, so this only matters if I end up using my own tree class.) Keeping the tree balanced is going to require some ingenuity. Hopefully I have enough left.