# Calculating Entropy of a file and coding it using Hamming

• 06-19-2009
kordellas
Calculating Entropy of a file and coding it using Hamming
Hello there!

I am trying to solve a problem for a project I took on, and I am in the final part of it (see my code so far below).

The aim is to calculate the entropy of a file and then to encode it using a Hamming code. I managed to write both parts based on what I learned in class and read in the book. (If you are familiar with the subject, please tell me if it is correct :) )

My problem is that I am not able to merge the two parts below (you don't need to read the theory for it), i.e. to compute the entropy and then continue with the coding of that file.

Any help or advice is appreciated.
Thank you!

Entropy Calculation

Code:

```
#include "BufferedNode.h"
#include "Buffer.h"
#include "Vector.h"
#include <strstream>
#include <math.h>
#ifdef HAVE_VALUES_H
#include <values.h>
#endif
#ifdef HAVE_FLOAT_H
#include <float.h>
#endif

class Entropy;

DECLARE_NODE(Entropy)
/*Node
 *
 * @name Entropy
 * @category DSP:Misc
 * @description Calculates the entropy of a vector
 *
 * @input_name INPUT
 * @input_type Vector<float>
 * @input_description Input vector
 *
 * @output_name OUTPUT
 * @output_type Vector<float>
 * @output_description Entropy value (vector of 1)
 *
END*/

class Entropy : public BufferedNode {
   int inputID;
   int outputID;

public:
   Entropy(string nodeName, ParameterSet params)
      : BufferedNode(nodeName, params)
   {
      inputID = addInput("INPUT");
      outputID = addOutput("OUTPUT");
   }

   void calculate(int output_id, int count, Buffer &out)
   {
      ObjectRef inputValue = getInput(inputID, count);
      const Vector<float> &in = object_cast<Vector<float> >(inputValue);
      int inputLength = in.size();

      Vector<float> &output = *Vector<float>::alloc(1);
      out[count] = &output;

      float s2 = 0;
      float entr = 0;
      for (int i = 0; i < inputLength; i++)
      {
         s2 += in[i] * in[i];
      }
      s2 = 1 / s2;
      for (int i = 0; i < inputLength; i++)
      {
         if (in[i] != 0)
            entr -= s2 * in[i] * in[i] * log(s2 * in[i] * in[i]);
      }
      //cout << entr << endl;
      output[0] = entr;
   }
};
```
Hamming Coding

Code:

```
#include <iostream>
#include <cmath>
using namespace std;

// Encode: read n data bits, insert k parity bits at positions 1, 2, 4, 8
void hanming()
{
    int i, n, k = 2;
    int h[20];
    for (i = 0; i < 20; i++) h[i] = 0;

    cout << "bla bla" << endl;
    cin >> n;
    while (pow(2.0, k) < n + k + 1) k++;   // number of parity bits needed

    cout << "bla bla" << endl;
    for (i = 1; i <= n + k; i++) {
        if (i != 1 && i != 2 && i != 4 && i != 8) cin >> h[i];
    }
    h[1] = (h[3] + h[5] + h[7] + h[9] + h[11] + h[13] + h[15]) % 2;
    h[2] = (h[3] + h[6] + h[7] + h[10] + h[11] + h[14] + h[15]) % 2;
    h[4] = (h[5] + h[6] + h[7] + h[12] + h[13] + h[14] + h[15]) % 2;
    h[8] = (h[9] + h[10] + h[11] + h[12] + h[13] + h[14] + h[15]) % 2;
    for (i = 1; i <= n + k; i++) cout << h[i];
    cout << endl;
}

// Check a received word of length n: the return value is the position
// of the erroneous bit (0 means no error detected)
int jiaoyan(int a[], int n)
{
    int i, p1, p2, p4, p8, m;
    int h[20];
    for (i = 0; i < n; i++) h[i + 1] = a[i];

    if (n == 3) {
        // k = 2;
        p1 = (h[1] + h[3]) % 2;
        p2 = (h[2] + h[3]) % 2;
        m = 2 * p2 + p1;
        return m;
    }
    if (n >= 5 && n <= 7) {
        // k = 3;
        p1 = (h[1] + h[3] + h[5] + h[7]) % 2;
        p2 = (h[2] + h[3] + h[6] + h[7]) % 2;
        p4 = (h[4] + h[5] + h[6] + h[7]) % 2;
        m = 4 * p4 + 2 * p2 + p1;
        return m;
    }
    if (n >= 9 && n <= 15) {
        // k = 4;
        p1 = (h[1] + h[3] + h[5] + h[7] + h[9] + h[11] + h[13] + h[15]) % 2;
        p2 = (h[2] + h[3] + h[6] + h[7] + h[10] + h[11] + h[14] + h[15]) % 2;
        p4 = (h[4] + h[5] + h[6] + h[7] + h[12] + h[13] + h[14] + h[15]) % 2;
        p8 = (h[8] + h[9] + h[10] + h[11] + h[12] + h[13] + h[14] + h[15]) % 2;
        m = 8 * p8 + 4 * p4 + 2 * p2 + p1;
        return m;
    }
    else {
        cout << "bla bla" << endl;
        return -1;
    }
}

int main()
{
    int coco;
    int i, n, m, h[20];
    hanming();
    cout << endl;
    cout << "bla bla" << endl;
    cin >> n;
    cout << "bla bla" << endl;
    for (i = 0; i < n; i++) cin >> h[i];
    m = jiaoyan(h, n);
    if (m == 0) cout << "bla bla" << endl;
    if (m != 0) cout << "bla bla" << m;
    cout << endl;
    cin >> coco;
}
```
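[Editor's note: the same parity equations can be wrapped in plain functions with no console I/O, which makes the encoder callable from other code rather than only interactively. This is a sketch added for illustration; the names `hammingEncode74` and `hammingCheck74` are not from the original post.]

```cpp
#include <array>

// Encode 4 data bits d[0..3] into a Hamming(7,4) code word, with
// positions numbered 1..7 and parity bits at positions 1, 2 and 4
// (the same parity equations as the interactive code above, for n = 4).
std::array<int, 8> hammingEncode74(const std::array<int, 4> &d)
{
    std::array<int, 8> h{};          // h[0] unused; positions 1..7
    h[3] = d[0]; h[5] = d[1]; h[6] = d[2]; h[7] = d[3];
    h[1] = (h[3] + h[5] + h[7]) % 2;
    h[2] = (h[3] + h[6] + h[7]) % 2;
    h[4] = (h[5] + h[6] + h[7]) % 2;
    return h;
}

// Recompute the parities of a received word; the return value is the
// position of a flipped bit, or 0 if no single-bit error is detected.
int hammingCheck74(const std::array<int, 8> &h)
{
    int p1 = (h[1] + h[3] + h[5] + h[7]) % 2;
    int p2 = (h[2] + h[3] + h[6] + h[7]) % 2;
    int p4 = (h[4] + h[5] + h[6] + h[7]) % 2;
    return 4 * p4 + 2 * p2 + p1;
}
```

Encoding `{1,0,1,1}` gives the word `0110011` (positions 1..7); flipping any single bit makes `hammingCheck74` return that bit's position.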
• 06-19-2009
legit
I'm assuming that both files are .cpp files, as they both define functions? If so, they don't need to be "put together" so to speak; they just need to be in the same project. But if the Entropy class is in a header file, that's a different story, though judging by your #include statements you already know how that one ends ;)
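[Editor's note: to make the "same project" idea concrete, here is a minimal sketch of how the two stages could live in one program: compute the Shannon entropy of a file's byte frequencies, then hand the bytes to an encoder. The function names and the commented-out `main` are illustrative assumptions, not the OP's code.]

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Shannon entropy in bits per byte of a byte sequence, computed from
// its empirical byte-frequency distribution.
double shannonEntropyBits(const std::vector<unsigned char> &data)
{
    if (data.empty()) return 0.0;
    std::size_t count[256] = {0};
    for (unsigned char b : data) count[b]++;
    double entropy = 0.0;
    for (std::size_t c : count) {
        if (c == 0) continue;
        double p = static_cast<double>(c) / data.size();
        entropy -= p * std::log2(p);
    }
    return entropy;
}

// One possible main(): read the file, report its entropy, then pass
// each byte (split into 4-bit nibbles) to a Hamming(7,4) encoder.
// readWholeFile and encodeNibble are hypothetical helpers.
//
// int main() {
//     std::vector<unsigned char> data = readWholeFile("input.bin");
//     std::cout << "Entropy: " << shannonEntropyBits(data) << " bits/byte\n";
//     for (unsigned char b : data) {
//         encodeNibble(b >> 4);
//         encodeNibble(b & 0x0F);
//     }
// }
```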
• 06-19-2009
kordellas

What I have here is what I read from the book and stuff...

However, how can I put these two in one project?
And what happens if I must migrate them?
• 06-19-2009
Salem
• 06-19-2009
legit
Quote:

Originally Posted by kordellas
However, how can i put these two in one project?

Well what compiler are you using to compile your code? You need to help us in order for us to help you. ;)

Quote:

Originally Posted by kordellas
And what happens if I must migrate them ?

I'm not quite sure what you mean when you say "migrate", do you mean move them between projects or something to that extent?

--
Legit
• 06-19-2009
kordellas
Quote:

Originally Posted by legit

I'm not quite sure what you mean when you say "migrate", do you mean move them between projects or something to that extent?

--
Legit

I mean to make those two in one...
• 06-19-2009
legit
Quote:

Originally Posted by kordellas
I mean to make those two in one...

Dude... you need to try to make your statements a little less vague, "make those two in one" could mean put them in a project or copy and paste the code into one file. Like I said before, help us to help you.

--
Legit
• 06-19-2009
Kudose
I think he is trying to figure out how to implement that algorithm in his class.
• 06-19-2009
legit
Quote:

Originally Posted by Kudose
I think he is trying to figure out how to implement that algorithm in his class.

To be honest, and I'm sorry if I'm overstepping the bounds here, but I don't think it's his/her code; they seem to know about classes, inheritance and container classes... and yet he/she doesn't know how to combine two files so they work with each other. Again, sorry if I'm overstepping, but that's just the way I read it.

--
Legit
• 06-19-2009
Kudose
It's cool. No one can seem to figure out what the OP wants and the OP can't express it well enough, so I figured I'd give it a shot. :)
• 06-19-2009
brewbuck
I don't understand your entropy calculation. Why are you taking the entropy over the SQUARES of the values in the input array? I'm assuming the input array lists the prior probabilities (marginal or joint?) of some random variable. In which case it is improper to square these values when computing the entropy. The code does normalize the values, so the entropy equation doesn't explode, but I don't get the squaring. Are you using some entropy measure other than the standard Shannon entropy? Or do the values in the input array constitute something other than prior probabilities?
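[Editor's note: to make brewbuck's point concrete, here is a small sketch contrasting the standard Shannon entropy of a probability vector with the quantity the posted node actually computes, namely the entropy of the energy-normalized squared values. For a non-uniform input the two differ.]

```cpp
#include <cmath>
#include <vector>

// Standard Shannon entropy (natural log) of a probability vector.
double shannonEntropy(const std::vector<double> &p)
{
    double h = 0.0;
    for (double pi : p)
        if (pi > 0) h -= pi * std::log(pi);
    return h;
}

// What the posted node computes: the entropy of the squared values
// normalized by total energy, i.e. of q_i = x_i^2 / sum_j x_j^2.
double squaredEntropy(const std::vector<double> &x)
{
    double s2 = 0.0;
    for (double xi : x) s2 += xi * xi;
    double h = 0.0;
    for (double xi : x)
        if (xi != 0) h -= (xi * xi / s2) * std::log(xi * xi / s2);
    return h;
}
```

For x = {0.25, 0.75}, already a probability vector, `shannonEntropy` gives about 0.562 nats while `squaredEntropy` gives about 0.325, since squaring sharpens the distribution to {0.1, 0.9}; the two agree only when the input is uniform.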

This first bit of code looks very much like a piece of a much larger info-theoretic engine. Either a data mining application, an AI of some kind, or a compression engine.

The second piece of code differs vastly in style from the first. I'm sorry if I'm wrong, but I have serious doubts that you wrote either of these two pieces of code. Anybody who wrote the first piece of code would not be asking the question you are asking.
• 06-19-2009
legit
Quote:

Originally Posted by brewbuck
The second piece of code differs vastly in style from the first. I'm sorry if I'm wrong, but I have serious doubts that you wrote either of these two pieces of code. Anybody who wrote the first piece of code would not be asking the question you are asking.

That's what I thought ;)

--
Legit