Holes in terrain

This is a discussion on Holes in terrain within the Game Programming forums, part of the General Programming Boards category; One issue with your cloud shader: the clouds should be dark in the innermost parts and white and fading on ...

  1. #16
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,893
    One issue with your cloud shader: the clouds should be dark in the innermost parts and white and fading toward the outside. That would make them look a lot more realistic.
    All the buzzt!
    CornedBee

    "There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
    - Flon's Law

  2. #17
    Super Moderator VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,598
    Agreed. Here is a render of the changes.
    Last edited by VirtualAce; 03-12-2011 at 11:41 AM.

  3. #18
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,893
    Oh, yes. Much nicer.

  4. #19
    l'Anziano DavidP's Avatar
    Join Date
    Aug 2001
    Location
    Plano, Texas, United States
    Posts
    2,738
    Sorry Bubba, I don't want to hijack the thread, so I'll keep this brief. I noticed in your code that you did this:

    Code:
    float2 cloud_scroll1;
    float2 cloud_scroll2;
    
    float4x4 matWorldViewProj;
    Are those typedefs or classes? I am thinking about reworking my own vector and matrix headers; right now I am just using typedefs such as:

    Code:
    typedef float vec2[2];   // (name illustrative) .....etc....
    I am thinking about making classes instead, but I am unsure whether I want to take the time to do so, and whether there would really be any benefit.
    My Website

    "Circular logic is good because it is."

  5. #20
    Super Moderator VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,598
    Those are HLSL intrinsic data types. The shader code is not C++.


    I've optimized the system to create 1 index buffer and then do a quick memcpy from that buffer to the actual Direct3D index buffer. I've also optimized the normal creation so that a normal map is generated once at load time and then re-used when needed.

    I have a huge problem now: I need to page in terrain data and somehow maintain my quad-tree structure. Right now, if I set the quad-tree range to twice the size of one grid, 2 terrains are displayed as expected. Likewise, when I set the range to 4.0f, 4 terrains are displayed, and so on. So the code is in place to begin loading terrain and creating patches in world space. I just don't know how to place them into the quad tree at run time without rebuilding the entire tree.

  6. #21
    Super Moderator VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,598
    UPDATE:

    I finally have endlessly repeating terrain. After trying several algorithms, I decided on this one because it fits nicely into my quad tree.

    • Each terrain 'patch' has a world position.
    • All vertices in the patch are in model space.
    • The height for the vertices is grabbed from the heightmap as if the vertices were already in world space.
    • The size of the quad-tree is constant so the max view distance must be specified at program start.
    • Each patch is rendered separately using an ID3DXEffect that can be specified to alter the appearance of the terrain.
    • Terrain::Update() calls Terrain::UpdatePatches(), which subtracts the camera position from each patch position. If the x or z difference is greater than the positive world-map width (or less than the negative), the patch position is translated by one world-map width back toward the camera. This causes the terrain pieces to wrap around when they get too far from the camera. At that point, new terrain data could be loaded in, which really just amounts to changing the y position of the current mesh's vertices. In this way terrain can be paged in at run time.


    Major issues so far:
    • View distance kills framerate. The best view distance is -worldwidth * 4.0f, worlddepth * 4.0f, worldwidth * 4.0f, -worlddepth * 4.0f. One world grid is 256 * cellsize by 256 * cellsize, and cellsize looks best at around 32 or 64 units. As you can tell, this is a ton of data. The good news is that the patch size is only 32x32, so I'm only sending 32x32 patches to the card at any one time.
    • The frustum culling is still wacky. In certain directions it works fine but in others patches pop in/out which should not happen. I'm still unsure as to why this is happening.
    • Terrain pieces 'pop' in on the horizon, which doesn't look the best.
    • There is currently no CLOD, but this scheme is nearly ready for geo-mipmapping, so it should not be too much work to implement.


    I can either try to fix the AABB frustum culling or just use spheres. Spheres do not suffer from the same issues as the AABBs. Very odd. Next on the list would be to implement the geo-mip mapping. I believe what this amounts to is having various levels of detail in static vertex buffers. The index buffers do not change. At run time the algo would decide which vertex buffer to use based on the total screen space error or whatever decision metric I use to decide the LOD. After this the next step is to page in new data. Since I already know which 'patches' are leaving the viewing area I can cache in new data based on the patches new world position and pull that data from a file or heightmap.
    Last edited by VirtualAce; 04-24-2008 at 11:17 PM.

  7. #22
    Super Moderator VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,598
    Had to stop development and clean my computer. I started feeling hot air while coding and realized my CPU was at 76 C. Rendering the terrain takes my card from 64 C to around 80 C. The code is running 3 vertex and pixel shaders and pushing just under 1 million tris, which shouldn't really tax it too much; my card is supposed to have 7 pixel shader units. This was using a view area of 65,536 units on x and z. I tried to push the viewing area to 262,144, but the vertex buffer creation failed. I'm assuming I ran out of video memory.

    Obviously some LOD techniques are needed before proceeding any further. Arctic Silver for my poor CPU is also in order. After cleaning the system a bit I removed the CPU, power supply, etc. I put it all back together and the computer wouldn't stay on for more than 10 seconds; the thermocouple under my CPU was shutting the system down. I finally got it working, but I think a new system is a definite must now.
    Last edited by VirtualAce; 04-26-2008 at 12:28 AM.

  8. #23
    Supermassive black hole cboard_member's Avatar
    Join Date
    Jul 2005
    Posts
    1,709
    I recently bought an entirely new system. My GPU at the time (7800GS) was on its way out, overheating like you wouldn't believe. T'was packed with dust, the horrible sticks-to-everything-electronic dust, so I actually dismantled it, which probably wasn't a good idea, even less so since I didn't have any Arctic Silver.

    So yeah I murdered my last GPU. Good times.
    Good class architecture is not like a Swiss Army Knife; it should be more like a well balanced throwing knife.

    - Mike McShaffry

  9. #24
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,893
    Hehe, my last GPU burned out because I didn't clean it in time. All the dust killed the fan, and then it was a matter of minutes until I smelled burnt dust and the screen went completely insane.

    If your system is at 64 C at idle, you've already got a problem.

  10. #25
    Registered User
    Join Date
    Sep 2006
    Posts
    8,868
    Quote Originally Posted by CornedBee View Post
    Hehe, my last GPU burned out because I didn't clean it in time. All the dust killed the fan, and then it was a matter of minutes until I smelled burnt dust and the screen went completely insane.

    If your system is at 64c in idle, you've already got a problem.
    My last dust kill was just the opposite. It had the same sticky brown dust on everything electronic, so I went to vacuum it out with a crevice tool on the end of the vacuum hose.

    I was at a friend's home, and she has a big Dyson vacuum, all yellow and such, which wouldn't suck a postage stamp off the ground on a July day.

    I was SO disappointed with Dyson; he looks so clever in his commercials.

    So I blew the dust off the boards, and then vacuumed up what the pitiful Dyson would pick up with its $400 suction.

    Next week the thing overheated! The dust had been blown around enough that it had managed to settle smartly on the thermal grease of the CPU, making it stop working.

    Of course, my friend had tried to restart it and actually ran it for short periods throughout the week. The mobo was cooked.

    I went there to back up her data and install a new HD, but I should have brought the thermal compound along with me.

  11. #26
    Super Moderator VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,598
    After cleaning everything including the power supply internals my temps are now:

    Idle
    CPU: 48-50 C
    Mobo: 30 C
    GPU: 47 C

    Under load
    CPU: 52 - 55 C
    Mobo: 32 C
    GPU: 52 C

    Before cleaning:

    Idle
    CPU: 64 - 67 C
    Mobo: 47 C
    GPU: 64 C

    Under load
    CPU: 72 - 76 C
    Mobo: 49 C
    GPU: 90 C (wow!)
    Last edited by VirtualAce; 04-26-2008 at 05:30 PM.

  12. #27
    (?<!re)tired Mario F.'s Avatar
    Join Date
    May 2006
    Location
    Portugal
    Posts
    7,577
    Quote Originally Posted by Bubba View Post
    GPU: 90 C (wow!)
    Holy!

    The fact that you can still use it makes that the sexiest graphics card I've ever seen. Brand and model, please.
    The programmer’s wife tells him: “Run to the store and pick up a loaf of bread. If they have eggs, get a dozen.”
    The programmer comes home with 12 loaves of bread.


    Originally Posted by brewbuck:
    Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.

  13. #28
    Super Moderator VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,598
    BFG Tech GeForce 7800 GS 256 MB AGP OC

    I'm surprised it still works. Max temps for NVIDIA GPUs are actually around 120 C before the chip starts to fry itself.

    Max temps for AMD CPUs are rumored to be around the 115 C to 125 C mark. I have a feeling you would know it was getting that hot long before it failed. I saw an old 1 GHz Athlon run at 205 to 215 F way back in the day. The customer requested a new chip, so my friend who owned the business took it and put it into one of his office systems. It ran just fine after that and probably would still run today.
    Last edited by VirtualAce; 04-26-2008 at 05:29 PM.

  14. #29
    (?<!re)tired Mario F.'s Avatar
    Join Date
    May 2006
    Location
    Portugal
    Posts
    7,577
    You betcha. If the memory is still fully usable, it certainly qualifies as a "good job, guys!" email to BFG.

    It would probably pay for you to install some monitoring software, though I'm not confident enough to suggest any particular one. What I seem to recall is that there is no way to gauge GPU temperatures through an AGP slot, which, if true, is a bummer: unless the card comes with a probe and the needed software, you are out of luck.

  15. #30
    Super Moderator VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,598
    Started to implement geo-mip mapping but I'm still running out of video memory. I'm thinking I may have to store some vertices in system RAM and use DrawPrimitiveUP() on those. It will be the most distant patches so it should not be too much of a hit. Right now I do not have any 'transitory' patches that connect the different LODs and with a view distance of 262,144 units I'm completely out of video memory.

    With a view distance of 262,144 and 6 levels of LOD spaced every 1600 units I'm getting just over 50 FPS which is terrible.

    I'm running out of algos and options here. There is some trick to making this thing really fly that nobody seems to share on the internet. I have some ideas for updating the patches: I will only check patches that are on the edges or on the horizon. Patches elsewhere will never move out of view, so it is a waste of time to check them for wraparound.

    More details later.

    Early screenshot in wireframe:
    Last edited by VirtualAce; 03-12-2011 at 11:41 AM.


