Ok, so I have an application where I first build a tree and then search it. The tree has 130,000 elements and I perform 1400 queries.
How should I measure time and memory? The second one is the part I really don't know how to do. I am using the terminal on Linux.
Take a look at this:
The "Search for ... wall clock time" line below comes from my code, where I use Nominal Animal's approach from here (a user who was once a member of this forum :/).
samaras@samaras-A15:~/code/dD_spatial_drarch_NN$ /usr/bin/time -f "\n%E elapsed,\n%U user,\n%S system,\n%M memory\n%x status" ./nn
Search for 1 NN of 1400 queries in the tree,with approximation 0.0 took 1.456464554seconds wall clock time.
However, in my code I time only the queries.
What does the time command actually tell me? Googling confused me a lot!
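Roughly, the timing around the query loop looks like this (a minimal sketch in the spirit of that approach, assuming clock_gettime() with CLOCK_MONOTONIC; run_queries() and build_tree() are made-up names, not my actual functions):

    #include <time.h>      // clock_gettime, CLOCK_MONOTONIC (older glibc may need -lrt)
    #include <iostream>

    int main() {
        struct timespec start, stop;

        // build_tree();                      // building the tree is NOT timed

        clock_gettime(CLOCK_MONOTONIC, &start);
        // run_queries();                     // the 1400 NN queries, the only part I time
        clock_gettime(CLOCK_MONOTONIC, &stop);

        double seconds = (stop.tv_sec - start.tv_sec)
                       + (stop.tv_nsec - start.tv_nsec) / 1e9;
        std::cout << "took " << seconds << " seconds wall clock time.\n";
        return 0;
    }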
Every element of the tree contains 17 doubles.
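(As a rough lower bound, assuming 8-byte doubles and ignoring any node/pointer overhead of the tree, that is 130,000 × 17 × 8 bytes ≈ 17.7 MB of raw data.)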
Is the memory value shown there (694304 memory) telling me how much memory the application takes? I remember that valgrind can help with this, but I do not remember/know how!
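One thing I thought of trying is asking the process itself for its peak memory, so I can compare with what time reports. A sketch, assuming getrusage() from <sys/resource.h> (I have not actually added this to my program):

    #include <sys/resource.h>
    #include <iostream>

    int main() {
        // ... build the tree and run the queries ...

        struct rusage usage;
        if (getrusage(RUSAGE_SELF, &usage) == 0) {
            // On Linux, ru_maxrss is the peak resident set size in kilobytes.
            std::cout << "Peak RSS: " << usage.ru_maxrss << " KB\n";
        }
        return 0;
    }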
I also thought of sizeof(), but I think that's not good, since a container before and after it is filled with data will yield the same result (sizeof does not see dynamically allocated memory).
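For example, this toy snippet prints the same number twice, even though the second vector owns far more memory (std::vector here is just a stand-in for whatever my tree actually allocates):

    #include <iostream>
    #include <vector>

    int main() {
        std::vector<double> empty;
        std::vector<double> full(130000 * 17);   // roughly the size of my data set

        // sizeof reports only the size of the vector object itself (a few pointers),
        // not the heap storage it owns, so both lines print the same value.
        std::cout << sizeof(empty) << "\n";
        std::cout << sizeof(full)  << "\n";
        return 0;
    }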
EDIT: I also found this: valgrind --tool=massif program
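If that's the right direction, I guess the invocation for my case would be something like the following (assuming my binary is ./nn; ms_print is the viewer that ships with valgrind, and <pid> is whatever process id massif appends to its output file):

    valgrind --tool=massif ./nn
    ms_print massif.out.<pid>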
So, what exactly does the output of time give me? Is it enough for what I need?