WOAAAAA very nice!!!!
Last screenshot. Anyone who wants to help with this by contributing models, effects, or code please let me know. This should probably go on the recruitment board, but I wanted to show that a lot of code has already been written.
I need 3D models of ships, objects, space stations, etc., as well as textures for lasers, lens flares, explosions, etc. Animation is just around the corner.
Some things that need to be implemented later are a small scripting engine for creating missions and a system for controlling AI ships. There is a lot to be done here... but I need serious people only.
This is a shot just below the first planet in the system with the sun near the bottom of the pic. The sun needs a flare around it - right now it is just extremely bright. The light for the universe is the sun as is evidenced by the shadow on the planet. It is just a huge, very bright, point light source.
Do you create your own textures, and what is your algo for wrapping a texture to a sphere? And I guess while I am at it, what do you use for a sphere algo?
Yes I create my own textures. I'm using spherical texture mapping. There are two types, one based on the normals of the vertices and one based on the position of the vertices. I use both depending on the sphere type. The sphere algo was simply this:
newx=centerx+(cos(alpha)*sin(beta)*radius);
newy=centery+(sin(alpha)*sin(beta)*radius);
newz=centerz+(cos(beta)*radius);
0<alpha<360
0<beta<180
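The parametric equations above can be turned into a vertex loop directly. This is a minimal sketch using a plain Vec3 struct instead of D3DXVECTOR3; the slice/stack counts are assumptions, and beta is swept from 0 to 180 degrees so that cos(beta) reaches both poles.

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Generate sphere vertices from the parametric equations above.
// alpha sweeps 0..360 degrees around the axis; beta sweeps 0..180
// (pole to pole), both converted to radians here.
std::vector<Vec3> MakeSphere(Vec3 c, float radius, int slices, int stacks)
{
    const float PI = 3.14159265f;
    std::vector<Vec3> verts;
    for (int j = 0; j <= stacks; j++) {
        float beta = PI * j / stacks;          // 0..PI
        for (int i = 0; i < slices; i++) {
            float alpha = 2.0f * PI * i / slices; // 0..2*PI
            Vec3 v;
            v.x = c.x + cosf(alpha) * sinf(beta) * radius;
            v.y = c.y + sinf(alpha) * sinf(beta) * radius;
            v.z = c.z + cosf(beta) * radius;
            verts.push_back(v);
        }
    }
    return verts;
}
```

Every generated vertex should sit exactly one radius away from the center, which is an easy sanity check on the loop.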
But the D3DX library already has a D3DXCreateSphere function. So I scrapped the math code and used the D3DXCreateSphere(), then cloned the mesh with D3DXCloneMeshFVF() - the new FVF has normal and texture coordinates information. Here are the two algos I'm using for texturing. Direct3D already wraps the texture by default so I don't have to worry about creating an extra vertex per row of the sphere just so the thing wraps correctly.
Normal-based spherical texture mapping.
Code:
D3DXVECTOR3 vMin,vMax;
D3DXComputeBoundingBox(&Verts[0].pos,numVerts,sizeof(SphereVertex),&vMin,&vMax);
// calculate center
D3DXVECTOR3 vCent;
vCent=(vMax+vMin)*0.5f;
// loop through the vertices
for (int i=0;i<numVerts;i++) {
// calculate normalized offset from center
D3DXVECTOR3 v;
v=Verts[i].pos-vCent;
D3DXVec3Normalize(&v,&v);
// calculate texture coordinates
Verts[i].u=texmag*(asin(Verts[i].normal.x)/D3DX_PI+0.5f);
Verts[i].v=texmag*(asin(Verts[i].normal.y)/D3DX_PI+0.5f);
}
Position-based spherical texture mapping. I found these algos by searching the web.
Code:
D3DXVECTOR3 vMin,vMax;
D3DXComputeBoundingBox(&Verts[0].pos,numVerts,sizeof(SphereVertex),&vMin,&vMax);
// calculate center
D3DXVECTOR3 vCent;
vCent=(vMax+vMin)*0.5f;
// loop through the vertices
for (int i=0;i<numVerts;i++) {
// calculate normalized offset from center
D3DXVECTOR3 v;
v=Verts[i].pos-vCent;
D3DXVec3Normalize(&v,&v);
// calculate texture coordinates
Verts[i].u=texmag*(asin(v.x)/D3DX_PI+0.5f);
Verts[i].v=texmag*(asin(v.y)/D3DX_PI+0.5f);
}
http://www.mvps.org/directx/articles/spheremap.htm
Nothing wrong with using the code as long as you understand it first. So I give credit where credit is due. But I put it all together in my code and used it. Knowing about an algorithm and knowing how to use the algorithm are two very different beasts.
Naturally in the first snippet I don't need the bounding box and vCent stuff... but I left it in.
Ok I lied...another screenie. This one with a non-textured atmosphere around planet 1. Backdrop has been changed and if I can't get rid of the distortion at 180 degrees on the sphere (both sides) I'm going to change to a skybox for the backdrop. This has a lot of the light characteristics significantly altered from the previous screenshots.
Ok I've found about 20 different spherical texture mapping equations/algos. I've even tried to reverse the Cartesian-to-spherical coordinate transformation - but it produces some weird stuff.
Where newx, newy, and newz are the Cartesian coordinates of the vertex and Radius is the radius of the sphere. Doesn't work well at all.
Code:
float PI2=D3DX_PI*2.0f;
cur_v=acos(newz/Radius)/D3DX_PI;
cur_u=acos(newx/(Radius*sin(D3DX_PI*cur_v)))/(PI2);
So then I thought about it and tried some spherical environment mapping.
Code:
float max_u2=tex_width/2.0f;
float max_v2=tex_height/2.0f;
Verts[i].u=tex_mag*((max_u2+Verts[i].normal.x*max_u2)/tex_width);
Verts[i].v=tex_mag*((max_v2+Verts[i].normal.y*max_v2)/tex_height);
Where:
max_u2 = half the width of the texture
max_v2 = half the height of the texture
tex_width = width of the texture
tex_height = height of the texture
tex_mag = tiling factor for the texture (2.0f means the texture tiles twice around the sphere)
Verts[] = array of SphereVertex
This method still has distortion in it.
Code:
struct SphereVertex
{
D3DXVECTOR3 position;
D3DXVECTOR3 normal;
float u,v;
static const DWORD FVF;
SphereVertex(D3DXVECTOR3 _pos,float _u,float _v):position(_pos),normal(0.0f,0.0f,0.0f),u(_u),v(_v) {}
};
...
...
const DWORD SphereVertex::FVF=(D3DFVF_XYZ | D3DFVF_NORMAL | D3DFVF_TEX1);
Then I thought about the structure of a sphere and realized that D3DXComputeNormals() was computing the normals incorrectly for my sphere. I do not want face normals, because even though the faces of the sphere are flat, the vertices that make up a face actually point in different directions and have different normals. If those normals are averaged with their neighbors' normals, you get a better approximation of the curvature of the face than you would with plain face normals. Then I read an article on Gamasutra concerning water effects and refractive mapping.
Basically you can figure out the normal for any point on the sphere by subtracting the sphere's center vector from the vertex vector; for non-unit spheres you then normalize the result. This is the true normal for that point on the sphere. No need to use 3 vertices to compute a normal. We are not talking about a flat surface: it is curved, and every curve wraps around the center of the sphere. All points are equidistant from the centroid of the sphere, so all normals point outward from the center.
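That center-to-vertex trick is only a few lines of code. A minimal sketch with a plain Vec3 struct standing in for D3DXVECTOR3:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// True sphere normal at a vertex: the normalized offset of the vertex
// from the sphere's center. No face averaging needed, because every
// point on a sphere faces directly away from the center.
Vec3 SphereNormal(const Vec3& center, const Vec3& pos)
{
    Vec3 n{ pos.x - center.x, pos.y - center.y, pos.z - center.z };
    float len = std::sqrt(n.x*n.x + n.y*n.y + n.z*n.z);
    if (len > 0.0f) { n.x /= len; n.y /= len; n.z /= len; }
    return n;
}
```

With D3DX types this is just `D3DXVec3Normalize` applied to `pos - center`, so it drops straight into the vertex loops above.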
In refractive mapping you do the same, except that you subtract the position of the vertex in question from the camera position, giving a ray from the vertex toward the camera. Using the dot product you can then compute the reflected ray, which in turn gives the u and v coordinates of the texture. Here is the sample code from Gamasutra.
All of these seem to have some distortion in them. There has to be a simpler way to do this.
Code:
//main loop
vertex_current = (VERTEX_TEXTURE_LIGHT *)water->mesh_info.vertex_list;
for (t0=0; t0<water->mesh_info.vertex_list_count; t0++,
vertex_current++){
camera_ray.x= camera.camera_pos.x - vertex_current->x;
camera_ray.y= camera.camera_pos.y - vertex_current->y;
camera_ray.z= camera.camera_pos.z - vertex_current->z;//avoid round off errors
math2_normalize_vector (&camera_ray);
//let's be more clear (the compiler will optimize this)
vertex_normal.x= vertex_current->nx;
vertex_normal.y= vertex_current->ny;
vertex_normal.z= vertex_current->nz;
//reflected ray
dot= math2_dot_product (&camera_ray, &vertex_normal);
reflected_ray.x= 2.0f*dot*vertex_normal.x - camera_ray.x;
reflected_ray.y= 2.0f*dot*vertex_normal.y - camera_ray.y;
reflected_ray.z= 2.0f*dot*vertex_normal.z - camera_ray.z;
math2_normalize_vector (&reflected_ray);
//interpolate and assign as uv's
vertex_current->u= (reflected_ray.x+1.0f)/2.0f;
vertex_current->v= (reflected_ray.y+1.0f)/2.0f;
}//end main loop
Anyone have any ideas?
Perhaps I need to write a pixel shader that computes the cosine of the angle between the surface normal at the pixel and the incoming light ray, and then shades the pixel - or the texel - accordingly.
I may also have to write a pixel shader that computes the reflected ray as well as the texel at the current pixel in question.
I'm really lost as to where to go.
Finally found a PDF file that actually texture maps a sphere correctly. To fix the distortion I must manually distort the horizontal component of the image so that when wrapped it looks correct.
Here is the formula:
Code:
cur_u=(DEGTORAD(alpha)+D3DX_PI)/(PI2);
cur_v=DEGTORAD(beta)/D3DX_PI;
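If you only have the vertex position rather than the original alpha/beta angles, you can recover them with atan2/acos and apply the same formula. A small sketch (the -PI..PI longitude range and the helper name are assumptions of this sketch, not from the PDF):

```cpp
#include <cmath>

const float PI = 3.14159265f;

// Recover the generating angles from a vertex position on a sphere of
// the given radius (centered at the origin), then map them to 0..1 UVs
// exactly as in the formula above.
void SphereUV(float x, float y, float z, float radius, float& u, float& v)
{
    float alpha = std::atan2(y, x);        // longitude, -PI..PI
    float beta  = std::acos(z / radius);   // colatitude, 0..PI
    u = (alpha + PI) / (2.0f * PI);        // 0..1 around the equator
    v = beta / PI;                         // 0 at one pole, 1 at the other
}
```

The remaining stretch near the poles is exactly the distortion the PDF fixes by pre-distorting the horizontal component of the image itself.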
Dude........that is very impressive. :eek:
that was a good idea, we should have thought of that.
Not nearly as impressive as when I get my pixel shader up and running. I'm sick of the default Direct3D vertex lighting. The only way to get good lighting effects is to use light maps and blend the texture in with your existing texture.
And by the way none of these textures have the distortion filter applied to them yet. I just thought they looked good this way. Unless you sit there and stare at the poles of the planet....you will never see it. I will probably only distort the space backdrops because if you are heading towards a target or waypoint - it is possible you will be heading towards one of the poles of the environment sphere. That would look ugly as you have to sit there and stare at the deformities while you travel. Not good.
But believe me you won't have time to sit and stare at the poles of the planet w/o getting jumped by some pirates or some baddies wanting to kick your butt, steal your cargo, etc.
I'm also thinking of creating a faction that has sleek steel ships - essentially refractive environment mapping. Problem is I would have to create an environment map for every single star system - especially if the backdrop changes color significantly. It would look stupid to be in a red-hued star system and use a default black environment map for the ships. It will add a lot of maps to the program. It won't increase the code at all - just change the texture pointer - but it will increase the storage footprint on the hard drive.
I haven't even begun to do animation, models, or sound effects and I'm already getting a large storage footprint for this beast. Really need to examine a compressed texture scheme, but I still want to utilize D3DXCreateTexture...() functions. Probably will be decompressing textures at load time. Dunno yet.
Ooooh be sure to post screenies. My GFX card won't support pixel shading. Stupid GeForce2.. :p
Well the game will still run on your card, but it won't use the pixel shading. I will check the caps of the card before enabling the pixel shaders. Unfortunately I don't think I could emulate it in software, even in assembly language, so you won't get the cool special effects.
I will look into it though.