I'm working on a pathfinding AI for a strategy game. The idea I have is to take the initial position, look at the four surrounding tiles, and, if they're OK (valid spaces that haven't been used yet), use them as the starting positions for the next pass of the loop. Then I can test each new spot to see whether it's the destination, and use the number of iterations through the main loop to tell how many steps it took to get there. This seems like a reasonable approach to me, but when I wrote it out as a simple console test I got a weird system error. I traced it in MSVC and found out that this function:
Code:
// add a new path that extends path[oldpath] by one coordinate
void addpath(path_t* path, int& numpaths, int oldpath, coord_t newcoord) {
  // copy the existing paths into a temporary buffer
  path_t* temppath = new path_t[numpaths];
  int i;
  for (i = 0; i < numpaths; ++i)
    temppath[i] = path[i];
  delete [] path;
  // reallocate path with room for one more entry and copy everything back
  ++numpaths;
  path = new path_t[numpaths];
  for (i = 0; i < numpaths-1; ++i)
    path[i] = temppath[i];
  delete [] temppath;
  // the new path is path[oldpath] with newcoord appended to the end
  path[numpaths-1].coord = new coord_t[path[oldpath].numcoords+1];
  for (i = 0; i < path[oldpath].numcoords; ++i)
    path[numpaths-1].coord[i] = path[oldpath].coord[i];
  path[numpaths-1].coord[path[oldpath].numcoords] = newcoord;
  path[numpaths-1].numcoords = path[oldpath].numcoords+1;
}
was returning uninitialized values for path[pathi].numcoords. This doesn't make any sense to me, and I was wondering if anyone out there could help me out.
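In case the overall idea isn't clear from the description above, here's a rough sketch of the search I'm going for, written with std::vector instead of the raw new[]/delete[] stuff. The coord_t, grid, WIDTH, and HEIGHT here are made up just for the example; my real map and coordinate types aren't exactly like this.

Code:
#include <vector>

// made-up types/sizes just for this sketch; the real map and coord types differ
struct coord_t { int x, y; };

const int WIDTH  = 10;
const int HEIGHT = 10;
int grid[HEIGHT][WIDTH];                    // placeholder map: 1 = walkable, 0 = blocked

// breadth-first flood fill: returns the number of steps from start to dest, or -1
int countsteps(coord_t start, coord_t dest) {
  bool used[HEIGHT][WIDTH] = { false };     // tiles some path has already claimed
  std::vector<coord_t> current, next;
  current.push_back(start);
  used[start.y][start.x] = true;

  for (int steps = 0; !current.empty(); ++steps) {
    next.clear();
    for (int i = 0; i < (int)current.size(); ++i) {
      coord_t c = current[i];
      if (c.x == dest.x && c.y == dest.y)
        return steps;                       // reached the destination in 'steps' moves
      static const int dx[4] = { 1, -1, 0, 0 };
      static const int dy[4] = { 0, 0, 1, -1 };
      for (int d = 0; d < 4; ++d) {         // look at the four surrounding tiles
        int nx = c.x + dx[d], ny = c.y + dy[d];
        if (nx < 0 || nx >= WIDTH || ny < 0 || ny >= HEIGHT) continue;
        if (!grid[ny][nx] || used[ny][nx]) continue;
        used[ny][nx] = true;
        coord_t n = { nx, ny };
        next.push_back(n);                  // becomes a starting tile on the next pass
      }
    }
    current.swap(next);                     // tiles found this pass seed the next pass
  }
  return -1;                                // destination can't be reached
}

The point is just that each pass collects the tiles reached that round, those become the starting tiles for the next pass, and the pass counter is the step count.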