Thread: 'the produkt' link!?

  1. #1

    Join Date
    May 2005
    Posts
    1,042

    'the produkt' link!?

    Okay, there was a first-person shooter game posted on these forums (and premiered on NeHe some time ago) called something like 'the product' or 'the produkt' (by some German company). The entire game is stored in 6KB (all data, instructions, etc.).

    To make a long story short, I have a friend who is trying to program an amazing "revolutionary" compression algorithm... I don't want to break his heart and tell him that his goal (as it stands) is completely unattainable. (There's more to this story: I got nominated to go to a tech conference, along with another guy who wasn't good at programming, because of grades, and this friend was not nominated even though he's a better programmer.) So I am trying to find a link to 'the produkt' to show him some neat ways of utilizing compression. He was very interested when I described it to him, but I cannot find a link to that game for the life of me, I swear.

    I *did* try a Google search and a search on these boards, and I looked through NeHe's downloads... I'm very frustrated right now.

    ok bye.
    I'm not immature, I'm refined in the opposite direction.

  2. #2
    Crazy Fool Perspective
    Join Date
    Jan 2003
    Location
    Canada
    Posts
    2,640
    The impressive bit of that game wasn't compression. It was procedural algorithms. It's not small because it's so compressed. It's small because nothing is stored except algorithm implementations to generate everything.

    I don't remember the name though...
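
    To illustrate the difference (a toy sketch only, nothing to do with how that game actually works): the program below "stores" a whole 256x256 grayscale image as three float parameters, because every pixel is recomputed from them on demand.

        #include <cmath>
        #include <cstdio>

        // Toy procedural texture: the whole 256x256 image is "stored" as the
        // three parameters below; every pixel is recomputed from them on demand.
        unsigned char pixelAt(int x, int y, float fx, float fy, float swirl)
        {
            float v = std::sin(x * fx) + std::sin(y * fy) + std::sin((x + y) * swirl);
            return static_cast<unsigned char>((v + 3.0f) / 6.0f * 255.0f); // map [-3,3] to [0,255]
        }

        int main()
        {
            const float fx = 0.07f, fy = 0.11f, swirl = 0.03f; // 12 bytes of "data"
            std::printf("P5\n256 256\n255\n");                 // binary PGM header
            for (int y = 0; y < 256; ++y)                      // 65536 bytes of output
                for (int x = 0; x < 256; ++x)
                    std::putchar(pixelAt(x, y, fx, fy, swirl));
            return 0;
        }

    Twelve bytes of parameters stand in for 64KB of pixels, but only because the generating code is shipped with them. That's generation, not compression of arbitrary data.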

  3. #3
    carry on JaWiB
    Join Date
    Feb 2003
    Location
    Seattle, WA
    Posts
    1,972
    "Think not but that I know these things; or think
    I know them not: not therefore am I short
    Of knowing what I ought."
    -John Milton, Paradise Regained (1671)

    "Work hard and it might happen."
    -XSquared

  4. #4
    Supermassive black hole cboard_member
    Join Date
    Jul 2005
    Posts
    1,709
    That was pretty cool, except I got stuck in a doorway fighting that huge thing... I stood there like a tit, it closed on me, and then I couldn't move.
    Good class architecture is not like a Swiss Army Knife; it should be more like a well balanced throwing knife.

    - Mike McShaffry

  5. #5

    Join Date
    May 2005
    Posts
    1,042
    Quote Originally Posted by Perspective
    The impressive bit of that game wasn't compression. It was procedural algorithms. It's not small because it's so compressed. It's small because nothing is stored except algorithm implementations to generate everything.

    I don't remember the name though...
    I tried explaining that to him (it is, ultimately, a form of 'compression' when you can have a game of that complexity stored entirely in <100KB of instructions), and he claims that's similar to what he is trying to do. The problem is that his aims are not particularly realistic: he wants to be able to take any file (completely arbitrary) and 'compress' it into less than 100 bytes.
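
    For the record, there's a simple counting argument against that goal (nothing specific to any algorithm, just the pigeonhole principle). The number of distinct files shorter than 100 bytes is

        256^0 + 256^1 + ... + 256^99, which is less than 256^100

    while there are 256^100 distinct files of exactly 100 bytes alone, never mind larger ones. So at least two different inputs would have to 'compress' to the same sub-100-byte output, and the decompressor could not know which one to reconstruct. Any scheme claiming otherwise must be losing information.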

    Thanks for the link, JaWiB! Turns out it was 96KB, not 6. Hmm.
    Last edited by BobMcGee123; 07-27-2006 at 07:53 PM.
    I'm not immature, I'm refined in the opposite direction.

  6. #6
    Slave MadCow257
    Join Date
    Jan 2005
    Posts
    735
    Quote Originally Posted by BobMcGee123
    To make a long story short, I have a friend who is trying to program an amazing "revolutionary" compression algorithm... I don't want to break his heart and tell him that his goal (as it stands) is completely unattainable
    Compression is quite disappointing. I spent the time to learn all the techniques only to realize there's very little demand. Revolutionary compression is most likely impossible for image or sound. Even setting the mathematics aside, a new file format would be ignored by the market (think of mp3 and what it's taken to get even a small percentage of people to use other formats; I admit it, all my music is 128kbps mp3s). I think the exception is real-world images. Cameras are improving, and many-megapixel photos could/will be a reality. Fractal compression has potential; it has just resisted efforts so far.

    The other market that would respond is video. I saw the most beautiful two-minute piece of footage of Egypt with nice 5-speaker sound... but the file was 160 megs.

    Lossless data compression is saturated as well. There are mathematical limits that current formats like 7z nearly meet. Not to mention there are thousands of home-grown formats and 5-10 very popular ones.
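
    (The mathematical limits mentioned here are presumably Shannon's source coding bound: for a source emitting symbols with probabilities p_i, no lossless code can average fewer than

        H = -(sum over i of p_i * log2(p_i))

    bits per symbol. Encoders like 7z sit close to that bound for the statistical models they use, which is why the remaining gains are so small.)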

  7. #7

    Join Date
    May 2005
    Posts
    1,042
    It is indeed an extremely interesting field, but I think many aspects of it are better suited to being solved by mathematicians and later implemented by programmers (which I'm guessing is largely what has happened so far). This did not become apparent to me until I tried to implement the LZW algorithm. It's ultimately basic enough, and it's not *too* difficult to get reasonable compression with a simple approach, but finding the most efficient repeated sequences of data, the ones that minimize the memory footprint, seems to admit an endless number of mathematical approaches (although most get tossed by the wayside when you factor in computation time).

    I can't find my implementation, but I do remember I used this site to get me started:

    http://www.cs.cf.ac.uk/Dave/Multimedia/node214.html
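
    For anyone curious, here is a minimal sketch of the compression side in C++ (a string-keyed dictionary for clarity; a real implementation packs the codes into a variable-width bitstream and handles a full dictionary):

        #include <map>
        #include <string>
        #include <vector>

        // Minimal LZW compressor: emit the code of the longest dictionary match,
        // then add that match plus the next byte as a new dictionary entry.
        std::vector<int> lzwCompress(const std::string& input)
        {
            std::map<std::string, int> dict;
            for (int i = 0; i < 256; ++i)
                dict[std::string(1, static_cast<char>(i))] = i; // single-byte seeds

            std::vector<int> codes;
            std::string current;
            int nextCode = 256;

            for (char c : input)
            {
                std::string extended = current + c;
                if (dict.count(extended))
                    current = extended;              // keep growing the match
                else
                {
                    codes.push_back(dict[current]);  // emit longest match found
                    dict[extended] = nextCode++;     // learn the new sequence
                    current = std::string(1, c);
                }
            }
            if (!current.empty())
                codes.push_back(dict[current]);      // flush the final match
            return codes;
        }

    The decompressor rebuilds the same dictionary as it reads the codes, so no table has to be stored alongside the data.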

    I also recently implemented the functionality to find and store only the unique vertices in a polygon soup (a 3D model), in order to convert a Milkshape3D model to a more tightly packed, renderer-friendly internal format. Again, this is an endeavor that sounds more complex than it actually is.
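
    The vertex-welding pass is roughly the following (a position-only sketch; a real Milkshape3D converter would also compare normals and texture coordinates, and floats may want an epsilon-based comparison):

        #include <cstdint>
        #include <map>
        #include <vector>

        struct Vertex
        {
            float x, y, z;
            bool operator<(const Vertex& o) const // ordering so std::map can key on vertices
            {
                if (x != o.x) return x < o.x;
                if (y != o.y) return y < o.y;
                return z < o.z;
            }
        };

        // Collapse a polygon soup (one Vertex per triangle corner, duplicates
        // included) into a unique vertex buffer plus an index buffer.
        void weldVertices(const std::vector<Vertex>& soup,
                          std::vector<Vertex>& vertices,
                          std::vector<std::uint32_t>& indices)
        {
            std::map<Vertex, std::uint32_t> seen;
            for (const Vertex& v : soup)
            {
                auto it = seen.find(v);
                if (it == seen.end())
                {
                    std::uint32_t idx = static_cast<std::uint32_t>(vertices.size());
                    seen[v] = idx;                 // first time we meet this position
                    vertices.push_back(v);
                    indices.push_back(idx);
                }
                else
                    indices.push_back(it->second); // reuse the existing index
            }
        }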

    Do you have any compression projects of your own that you care to share?
    I'm not immature, I'm refined in the opposite direction.

  8. #8
    (?<!re)tired Mario F.
    Join Date
    May 2006
    Location
    Ireland
    Posts
    8,446
    Quote Originally Posted by BobMcGee123
    he wants to be able to take any file (completely arbitrary) and 'compress' it into less than 100 bytes.
    Yep. That is unrealistic. More than that, it's most probably impossible, unless we find some other form of storing data (maybe quantum hard drives).

    The problem with his theory is that there are always tradeoffs in compression, no matter how we choose to do it.

    Lossless compression costs computational power. If he wanted to achieve extreme ratios never achieved before, he would have to construct a mammoth dictionary and be a very clever man indeed to build the algorithms around it; computational cost would be seriously affected by this model. Moreover, lossless compression is directly affected by the size of the original file. Information costs space, and 100 bytes would only cover originals up to some hypothetical size threshold. A file even 1 bit longer than that threshold would produce a compressed file larger than 100 bytes. So lossless compression is not the answer for him.

    Lossy compression will enable him to do any amount of compression. He can certainly achieve 100 bytes no matter the size of the original file, no matter the type of file. What he will certainly not get is the original file back when he decompresses. Lossy compression is not the answer for human-readable files, so this already overthrows his "any kind of file" demand. But there is more. This model has a very simple formula: x% compression = x% loss in data. So it is highly dependent on the compression ratio to be usable. One could imagine algorithms that recover some of the missing data by mixing this model with some dictionary-based compression method, but then we would have the same problem as with the lossless method; i.e., it would be directly affected by the size of the original file.

    So, no. He can't do it. Not that any of us believes otherwise. I'm just rambling here, hoping to help you give him some good arguments against it.

    If he finds some other way of storing data, in which 1 bit means more than 0 or 1, then he has found the answer. He will also be a very rich man, having just invented the first quantum hard drive. Until then...

    (And even so, he will still be limited, although certainly by a far smaller margin than we are today. Physics imposes limitations on everything. At least, that's our understanding of the science so far.)
    Last edited by Mario F.; 07-27-2006 at 09:07 PM.
    Originally Posted by brewbuck:
    Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.

  9. #9
    Slave MadCow257
    Join Date
    Jan 2005
    Posts
    735
    Quote Originally Posted by Mario F.
    If he wanted to achieve extreme ratios never achieved before, he would have to construct a mammoth dictionary and be a very clever man indeed to build the algorithms around it.
    I've seen projects where encoding time isn't a concern and the gains are still only in the 5% range. That is the nature of lossless generic data compression: reduction in file size follows a curve of diminishing returns, and normal encoders already sit deep on that curve, where further increases in encode time produce very little benefit.

    Quote Originally Posted by Mario F.
    This model has a very simple formula: x% compression = x% loss in data.
    Not really true in practice. It comes down to cleverness, because people aren't simply cranking up quantization ratios. For example, AAC is both smaller and better quality than mp3.

    Quote Originally Posted by BobMcGee123
    Do you have any compression projects of your own that you care to share?
    I throw away everything digital but keep everything physical. So no, I don't have any projects to share, just the experience from doing them. (I value that, because a random hard drive crash won't cause me a problem. Things happen; just two days ago something inside my power supply exploded. It smoked.)

  10. #10

    Join Date
    May 2005
    Posts
    1,042
    Quote Originally Posted by Mario F.
    Yep. That is unrealistic. More than that, it's most probably impossible, unless we find some other form of storing data (maybe quantum hard drives).
    To be fair to him, it could be implemented if he were trying to 'compress' something not particularly complex (as you said: "And 100 bytes would only answer some hypothetical original size threshold") and/or something *not* totally arbitrary (the makers of 'the produkkt' were able to store instructions *specific to their game*; they did not write software that could compress *any* arbitrary file). He has a procedural approach: instead of building a dictionary/table of terms/codes/sequences, he is trying to write a program that can generate mathematical expressions/equations which, when evaluated, produce the sequences that represent the original data. This is somewhat described on 'the produkkt' page (link above):

    "fiver2 noticed that he only needs a few simple primitives and filters to create very realistic looking textures. he defined a set of operations and asked chaos to write a user interface where the artist can specify, modify and store these operations. in the 64k program, these stored operations are executed and the image is generated."

    The problem, however, is that this does not seem possible for an arbitrary file; the makers of theprodukkt stored instructions specific to what they were making.
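
    A toy sketch of that stored-operations idea (the ops and names here are invented for illustration, not the real tool's format). The 'asset' is just a short recipe of opcodes, and executing the recipe regenerates the full image:

        #include <cstddef>
        #include <vector>

        // Hypothetical texture operators: the stored "asset" is the short
        // recipe of steps below, not the pixels it produces.
        enum class Op { Noise, Blur, Tint };
        struct Step { Op op; float arg; };

        struct Image { std::vector<float> px = std::vector<float>(256 * 256, 0.0f); };

        // Executing the stored steps regenerates the full image at load time.
        Image generate(const std::vector<Step>& steps)
        {
            Image img;
            unsigned seed = 1;
            for (const Step& s : steps)
            {
                switch (s.op)
                {
                case Op::Noise:  // fill with deterministic pseudo-random values
                    for (float& p : img.px)
                        p = (seed = seed * 1664525u + 1013904223u) / 4294967296.0f;
                    break;
                case Op::Blur:   // crude one-dimensional box blur
                    for (std::size_t i = 1; i + 1 < img.px.size(); ++i)
                        img.px[i] = (img.px[i - 1] + img.px[i] + img.px[i + 1]) / 3.0f;
                    break;
                case Op::Tint:   // scale brightness by the step's argument
                    for (float& p : img.px)
                        p *= s.arg;
                    break;
                }
            }
            return img;
        }

        int main()
        {
            // A handful of bytes of "stored operations" describing 256KB of pixels.
            std::vector<Step> recipe = { {Op::Noise, 0.0f}, {Op::Blur, 0.0f}, {Op::Tint, 0.8f} };
            Image img = generate(recipe);
            (void)img;
            return 0;
        }

    The recipe is a few bytes while the output is hundreds of kilobytes, but it only works because the ops were designed for the content the artists wanted; it is not a compressor for arbitrary input.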

    So, no. He can't do it. Not that anyone of us doubts the contrary. I'm just rambling here hoping I help you give him some good arguments against.
    Yeah. The person I am dealing with is sort of a unique story. He is actually quite bright and can grasp abstract ideas as well as you can, but I also perceive him to have an unrealistic/irrational belief that just because he understands something, he can definitely implement it. Granted, you might feel like I'm describing everyone on cprogramming.com, but with this person that quality is especially pronounced, and you'd simply have to know him to agree with me. I also perceive him to be lazy, and he's also kind of a douchebag (he badmouths me often behind my back, saying my skills were 'overrated' in high school and that I therefore did not deserve to be nominated for the tech conference), but I choose to try helping him nonetheless, because I suck.


    Thanks for the replies.
    I'm not immature, I'm refined in the opposite direction.
