# Heap?

• 01-09-2009
audinue
Heap?
Is the heap memory that people talk about in programming discussions this:
http://en.wikipedia.org/wiki/Heap_(data_structure)

And is the stack that _alloca uses this one?
http://en.wikipedia.org/wiki/Stack_(data_structure)
• 01-09-2009
matsp
No, that is the heap data structure, and it has nothing (directly) to do with heap memory.

Heap memory is just a chunk (or even multiple chunks) of memory that is given to the application - malloc or new then carves this up into smaller sections that are handed back to the application for its use. You can make a "simple heap" by declaring a large array and letting your own version of alloc return some part of that array; free then allows the passed-in part of the array to be reused.

And the stack is not the same either.

--
Mats
• 01-09-2009
audinue
Do you know what "Θ(lg n)" or "Θ(1)" means?

I don't understand what they wrote on the page with those symbols...
• 01-09-2009
laserlight
Quote:

Originally Posted by audinue
I don't understand what they wrote on the page with those symbols...

Before presenting the table that uses symbols which you do not understand, that article links to another article on computational complexity. Read that other article.

EDIT:
Okay, but you will have to follow yet another link to the article on "Big O" notation.
• 01-09-2009
audinue
In the "Comparison of theoretic bounds for variants" table in Heap (data structure):

Θ(lg n) or Θ(1)?

Θ == speed/memory/something?
lg == log?
n == ?
• 01-09-2009
matsp
Quote:

Originally Posted by audinue
In the "Comparison of theoretic bounds for variants" table in Heap (data structure):

Θ(lg n) or Θ(1)?

Θ == speed/memory/something?
lg == log?
n == ?

Big-O notation indicates "the number of operations". n is "the number of items". lg n means log2(n). O(1) means "a constant number of operations" - for example, adding an element to the end of a linked list is O(1) if you already have a pointer to the end you are adding at. If you have to walk the linked list to find the end, it becomes O(n). A binary search in a sorted array is O(log2(n)).
Simple forms of sorting (which consist of two nested for-loops) are O(n^2).

Note however that this is just an "overall efficiency" number - so you can't directly compare two algorithms with it alone. For example, if we store hash-values of strings in an array (sorted in hash-order), and compare that to a binary search in a straightforward array of strings, the O number will be the same. But comparing (long) strings is much slower than comparing a single integer.

So O numbers are really only "an indication of what happens when n grows" - so if you double the number of items, does it take twice as long, a tiny bit longer, or MUCH longer?

--
Mats
• 01-09-2009
tabstop
And yes, lg == log2. It looks like you've got Θ, which is Big-Theta notation. Big O is just an upper bound; Big Theta is both an upper bound and a lower bound (meaning it's right on).
• 01-12-2009
matsp
Quote:

Originally Posted by tabstop
And yes, lg == log2. It looks like you've got Θ, which is Big-Theta notation. Big O is just an upper bound; Big Theta is both an upper bound and a lower bound (meaning it's right on).

So Big-Theta says "It always takes this long", but Big O is "It could take this long, but also less time" - I learned something new today too!

--
Mats
• 01-12-2009
whiteflags
As I understand it, if we can say "It always takes this long," then we have a procedure that takes place in constant time. You are correct in that a stricter notation will establish guarantees, but wikipedia says this about the matter:

Quote:

Originally Posted by wiki
Infinite asymptotics

Big O notation is useful when analyzing algorithms for efficiency. For example, the time (or the number of steps) it takes to complete a problem of size n might be found to be T(n) = 4n² − 2n + 2.

As n grows large, the n² term will come to dominate, so that all other terms can be neglected — for instance when n = 500, the term 4n² is 1000 times as large as the 2n term. Ignoring the latter would have negligible effect on the expression's value for most purposes.

There is also no real guarantee in "it could take this long" - that is just an upper bound. Perhaps "we've never seen anything perform worse before". That's probably not right either, but Big O really offers nary a guarantee except how the work will scale with n.