I am writing a program for a class that calls for Huffman coding. The coding itself is not the issue. The problem is that I can't figure out why a chunk of my code works for the first 90% of the iterations, but fails at the very end, causing most of the codes to be correct but some to be wrong. This is probably not the best way to do it (I don't actually build a tree), but it should work.
When symbols are combined, I keep track of the originals in an array. All symbols in the array are affected whenever the combined symbol is affected: e.g., when symbol "ab" has a 1 added to it, so do "a" and "b".
The included code is the part where these indices are updated when symbols are combined. This is what is failing near the end of execution.
The whole program is available here.

Code:
// This code is at the end of the huffman() function
// -1 is a dummy value for an unused array element
// this code executes 12 times with the sample input, but that shouldn't matter
// DUMMY = 999999
// CODES = 256
// arrays are declared with CODES (256) elements

// add in the new dependencies
for(i=0; i<CODES && lower.indicies[i]!=-1; i++){
    if(_codes[_order[i]].total != DUMMY)
        new.indicies[i] = lower.indicies[i];
}

// add in the new dependencies
for(j=0; j<CODES && low.indicies[j]!=-1; j++){
    if(_codes[_order[j]].total != DUMMY)
        new.indicies[i+j] = low.indicies[j];
}

// replace the old entries
_codes[_order[0]].total = DUMMY;
_codes[_order[1]] = new;

// reset all dependencies for used symbols
for(i=0; i<CODES; i++){
    _codes[_order[0]].indicies[i] = -1;
}
Thanks in advance