PDA

View Full Version : Found out why Oblivion slowed down



VirtualAce
05-13-2008, 11:30 PM
When I first bought Oblivion it ran very well on my machine. Later I installed Shivering Isles and another expansion pack, and since then I had noticed huge slowdowns.

However, I must also have done some Direct3D programming between installing the game and installing the expansion packs. I found out that all of my Direct3D settings were set to use the debug runtime, with shader debugging and validation enabled. The debug output slider was set to about half, and debug runtimes were being used for every component of DirectX.

I guess using your gaming computer for game dev is not such a good idea. :D :D
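For DX9-era systems, the retail/debug runtime switch that the DirectX Control Panel flips is, as far as I can tell, just a registry value. The key and value names below are from memory and may differ between SDK versions, so treat this purely as an illustrative sketch of where to look, not a verified recipe:

```reg
; Hypothetical sketch: switch Direct3D 9 back to the retail runtime.
; Key and value names are from memory and may vary by SDK version --
; the DirectX Control Panel applet is the supported way to change this.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Direct3D]
"LoadDebugRuntime"=dword:00000000
```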

cboard_member
05-14-2008, 12:50 AM
I had something similar happen the last time I installed the DirectX dev stuff. Sucks. Are these DirectX debug settings you speak of in dxdiag or somewhere else?

Elysia
05-16-2008, 06:44 AM
This is something I've actually been looking for - debug runtimes - especially for DirectShow. But I couldn't find them.
I guess you got the debug runtimes from the latest DX SDK?

VirtualAce
05-16-2008, 04:29 PM
DirectShow used to be available in the old Platform SDK. However, I believe it was nixed in the new Windows SDK, which is just Microsoft's way of saying their new flaming-pile-of-poo SDK for their latest and worst-ever operating system: Vista.

Elysia
05-16-2008, 07:05 PM
Just our luck with Microsoft.
Old, useless things stay, useful things disappear, and a slew of other crap shows up; out of that, 70% fails, and what remains may not be what one wants.

VirtualAce
05-16-2008, 09:14 PM
I should be nicer, because Microsoft is by no means full of idiots. It just seems to me they took a giant leap backwards this time. I'm also not keen on them trying to hype their C# baby by attempting to force game designers to move to C# via XNA. If C# can really do retail-level games, then great, but did they really have to invent C# to gain cross-platform ability with the Xbox? Somehow I think not.

I'm just dissatisfied with the new directions they are taking. Probably doesn't matter since my opinion does not count for much in the grand scheme of things.

Elysia
05-17-2008, 04:00 AM
I'm just dissatisfied with the new directions they are taking. Probably doesn't matter since my opinion does not count for much in the grand scheme of things.
I can't help but agree...

Mario F.
05-17-2008, 05:30 AM
If C# can really do retail level games then great but did they really have to invent C# to gain cross-platform ability with the XBox? Somehow I think not.

In a way, yes, because .NET was introduced as a cross-platform framework that could span any number of platforms; that is how it was sold to us some eight years ago, during the beta stages. Ah!

What I doubt, however, is that XNA can compete on platforms where there are already alternatives and a long-standing game development culture, like Windows. I also sincerely doubt XNA/C#'s ability to compete on what really matters: performance.

You know what really, really annoyed me about XNA? When I learned it could only support C#. It may not seem like much at first glance, but this puts an end to yet another marketing ploy about .NET: that VB.NET and C# differed only in syntax, both targeting the same runtime and usable without any gain or loss of functionality. Obviously, this is not true.

And I also doubt .NET in general when it comes down to... why? Why exactly do we need it? In my view, we really don't. There is absolutely nothing in .NET that can't be achieved by other means without having to install a monolithic framework. But that's just my opinion.

zacs7
05-17-2008, 07:15 AM
And I also doubt .NET in general when it comes down to... why? Why exactly do we need it? In my view, we really don't. There is absolutely nothing in .NET that can't be achieved by other means without having to install a monolithic framework. But that's just my opinion.

I agree 110%. I think Microsoft are really trying to pioneer the way -- through brute force, hoping to get a break sooner or later. I've got nothing against them (they provide millions of jobs, often indirectly); it just seems far less "black and white" than it used to be.

Mario F.
05-17-2008, 08:48 AM
I agree 110%. I think Microsoft are really trying to pioneer the way -- through brute force, hoping to get a break sooner or later.

I have no doubt the .NET framework springs from Microsoft's willingness to stop supporting Java. And I think it is exactly Java that Microsoft is answering. After all, what is .NET that Java isn't? Nothing! Please, someone tell me otherwise.

This framework has been under development for years now, and Java predates it. So it isn't even a matter of Microsoft pioneering anything.

The framework alone is a good thing; I cannot deny that. Offering an alternative to the old and badly designed MFC is a good move in my book. But that's a framework like so many others (Java's included). What I hold a grudge about is that they turned it into a purely business decision, diluting the world of Windows programming even further with new, unnecessary programming languages while barring access to the framework from outside languages. It really annoyed me :mad:

Microsoft has an uncanny ability to make incredibly good business decisions (for them and them only) that only a blind shareholder wouldn't throw a party over. .NET is exactly one of those decisions. I have to tip my hat. I can only imagine the gloating look on their faces as they realized they had a credible Java alternative and could, at the same time, draw C++ companies and their programmers into the platform with the completely irrelevant and redundant C# programming language.

Why is this successful? Because, quite frankly, many companies can't tell a monitor from a computer, the media loves hype, and there's a breed of sharks called consultants that breed like rabbits.

[/rant]

Elysia
05-17-2008, 08:56 AM
Indeed. One thing where .NET may be better than Java is interoperability, I guess.
Several languages can communicate via the framework, but I guess that point is kind of moot, because the only thing that differs between the languages is the syntax; they all share the same functionality in the framework.

maxorator
05-17-2008, 09:20 AM
Java can be compiled into machine code. Can C# do that too?

Elysia
05-17-2008, 09:21 AM
No, any managed language is strictly interpreted, but compiled into machine code on-the-fly.

maxorator
05-17-2008, 09:22 AM
No, any managed language is strictly interpreted, but compiled into machine code on-the-fly.
On-the-fly means it's not compiled then. It simply means it is basically a processor emulator.

Elysia
05-17-2008, 09:24 AM
Yep.
Microsoft likes to claim that managed code is faster than native because it's compiled into machine code dynamically as it's executed. They claim that since it's dynamic, it can generate the code that is most efficient for the client machine it runs on.
However, I don't agree. It's interpreted and will therefore never be as fast as native languages. Plus there's the overhead of the library, which isn't small.

maxorator
05-17-2008, 09:28 AM
Also, it is not faster than Java bytecode or ActionScript. Microsoft just wanted to have control over every aspect of programming. They couldn't stand 3rd-party tools being that popular.

And how much faster are bytecode languages than scripting languages? The only difference between them is the parser...

Elysia
05-17-2008, 09:30 AM
Microsoft just wanted to have control over every aspect of programming. They couldn't stand 3rd-party tools being that popular.

Hoho, now who can argue with that? :D
Microsoft wants to take over the world!

VirtualAce
05-17-2008, 11:25 AM
Did some 'pruning' in this thread to keep it on track.

zacs7
05-18-2008, 10:03 PM
Bubba, does that mean most of your personal work is in D3D rather than OpenGL?

VirtualAce
05-18-2008, 11:28 PM
All of my 3D work is in Direct3D. I picked D3D and DX for many reasons but I won't get into that here lest we start a D3D/OGL war.

I could pick up OGL easily enough since the 3D principles are exactly the same.

Mad_guy
05-23-2008, 08:50 AM
On-the-fly means it's not compiled then. It simply means it is basically a processor emulator.

First off, in the case of .NET, the language is still compiled, because it is compiled to a binary that runs on an abstract machine. That is what a compiler does. A compiler doesn't have to compile down to raw assembly to be considered a compiler. It only has to go from source -> target.

Second, it compiles the intermediate bytecode into native code as it is executed. .NET and the JVM will do advanced optimizations on the bytecode you give them and convert it to native code, and for the rest of the time execute that instead. (In the case of .NET, I believe the native code is actually stored inside the .exe after being run; when a method is invoked, native code is generated for it if it hasn't been before. Java's HotSpot will do performance/profiling analysis as the program is running, compile it to very highly optimized native code, then execute that.)
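That "compile on first invocation, reuse thereafter" strategy can be sketched in a few lines. This is a toy illustration, not .NET's or HotSpot's actual machinery: Python's built-in compile() stands in for a real native-code backend, and all the class and method names are made up for the example.

```python
# Toy sketch of JIT-style lazy compilation: each "method" (a source
# string) is compiled on its first invocation and the result cached,
# so later calls skip straight to the compiled form. Python's
# compile() stands in for a real native-code backend.

class LazyJit:
    def __init__(self):
        self.sources = {}        # method name -> source string
        self.compiled = {}       # method name -> code object (the "native" cache)
        self.compile_count = 0   # how many times we actually compiled

    def register(self, name, source):
        self.sources[name] = source

    def invoke(self, name, **args):
        if name not in self.compiled:   # first call: compile now
            self.compiled[name] = compile(self.sources[name], name, "eval")
            self.compile_count += 1
        return eval(self.compiled[name], {}, args)  # later calls reuse the cache

jit = LazyJit()
jit.register("square", "x * x")
print(jit.invoke("square", x=4))   # 16 -- compiles "square", then runs it
print(jit.invoke("square", x=5))   # 25 -- cache hit, no recompilation
print(jit.compile_count)           # 1
```

The point of contention in the thread maps onto the two branches: the first call looks like interpretation (compile, then run), while every later call runs only the cached compiled artifact.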

I fail to see how this makes it not a compiler, the only definition of which is that it transforms a program from a source language into a target language. Now, our definitions of compiler may vary (which is totally fine; I might have missed your point somewhat), but I can't agree that it's a "processor emulator."


And how much faster are bytecode languages from scripting languages? The only difference between them is the parser...
What? No it isn't; the difference is in the actual execution of the source program. There is no difference in the parser. If you have a parser for Ruby, whether or not your backend generates bytecode for a VM or walks the AST and interprets it, the parser will return the same AST regardless; it's purely a backend implementation issue.

As for a performance comparison, generally speaking virtual-machine based languages will execute faster than ones that purely generate an AST and then walk it; this is primarily because the virtual machine will be written to be fast and highly tuned to the language semantics, and the bytecode will be low level and tuned to the semantics as well.
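The parser-vs-backend distinction above can be made concrete with a toy example: one AST, two execution strategies. Everything here is illustrative (the AST shape and opcode names are invented for the sketch), but it shows why the parser is identical regardless of whether the backend walks the tree or emits bytecode for a small stack VM.

```python
# One parse, two backends: the same AST can be walked directly or
# compiled to bytecode for a tiny stack VM. Only the execution
# strategy differs; the parser's output (the AST) is shared.

# AST nodes: ("num", n) | ("add", left, right) | ("mul", left, right)

def walk(node):
    """Backend 1: interpret by walking the AST."""
    tag = node[0]
    if tag == "num":
        return node[1]
    left, right = walk(node[1]), walk(node[2])
    return left + right if tag == "add" else left * right

def to_bytecode(node, out):
    """Backend 2a: compile the AST to stack-machine bytecode."""
    tag = node[0]
    if tag == "num":
        out.append(("PUSH", node[1]))
    else:
        to_bytecode(node[1], out)
        to_bytecode(node[2], out)
        out.append(("ADD",) if tag == "add" else ("MUL",))
    return out

def run_vm(code):
    """Backend 2b: execute the bytecode on a stack VM."""
    stack = []
    for op in code:
        if op[0] == "PUSH":
            stack.append(op[1])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if op[0] == "ADD" else a * b)
    return stack[0]

# (2 + 3) * 4 -- the same AST is fed to both backends
ast = ("mul", ("add", ("num", 2), ("num", 3)), ("num", 4))
print(walk(ast))                      # 20
print(run_vm(to_bytecode(ast, [])))   # 20
```

The VM path tends to be faster in practice because the flat opcode loop is cheaper to execute repeatedly than re-traversing the tree, which is the performance claim made above.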

Elysia
05-23-2008, 09:17 AM
I would call .NET a dynamic compiler, as opposed to C/C++ and other languages, which use a static compiler.

abachler
05-23-2008, 09:45 AM
I fail to see how this makes it not a compiler, of which the only definition is to transform a program from a source language into a target language.

No, a compiler transforms it from a programming language to executable code. What you described is a translator, not a compiler. These have been around for a while: it used to be that all C++ code was translated into C code and then compiled, until compilers could compile C++ code directly. BASIC also transforms its language into bytecode, so .NET is really just a glorified interpreter. Unless the bytecode is being sent directly to the processor as the instruction stream, it's being interpreted. Interpreted languages are inherently slower even if there is a one-to-one relationship between the bytecode and the machine language: the bytecode must still be loaded by the interpreter and the equivalent machine instruction performed. This means that, at most, an interpreted application will be half as fast as native code, even with a perfectly optimized interpreter. I/O will generally be almost as fast, for technical reasons I won't go into here. Because computers are so fast, and because most .NET applications are heavily I/O bound (internet bandwidth limitations), the perceived speed difference will be negligible.

Mad_guy
05-23-2008, 03:21 PM
No, a compiler transforms it from a programming language to executable code. What you described is a translator, not a compiler.
You do realize this is a circular definition, right? Compilers go from source -> target, end of story. Check Wikipedia if you wish; check anywhere. The idea is all that matters, and what you're talking about are useless details. Whether the target is actual assembly code or the compiler has a backend that compiles to another language like C (many high-level languages, such as Chicken Scheme, compile Scheme to C), it's still, strictly speaking, a translation. Therefore, all compilers are inherently translators, and all 'programming language translators' are inherently compilers.

You're doing nothing but nitpicking on words, and you haven't in any way made the definition I gave of a compiler false. How do the semantics and definition of this 'black box' program change based on what the target output is? It's always going from source -> target, and there is nothing that strictly specifies that the target must be machine code.


Used to be all C++ code was translated into C code then compiled, until compiler could directly compile C++ code.
Do you have something to back this up (honestly)? I've never heard of this approach, and I'm pretty skeptical, unless you're just generalizing a bit, in which case I think I might get what you mean.


Interpreted languages are inherently slower even if there is a one-to-one relationship between the bytecode and the machine language.
Bringing up the speed of the approach completely misses the point I was trying to make. I was only pointing out that the JVM will take that Java bytecode and, on the fly, compile it into adequate machine code. I never mentioned speed, although I might have said the word 'optimized'. It's irrelevant either way.


This means that at most, and interpreted application will be half as fast as native code even with a perfectly optimized interpreter.
Unfounded, useless statement.


I/O will generally be almost as fast, for technical reasons I won't go into here. Because computers are so fast, and because most .NET applications are heavily I/O bound (internet bandwidth limitations), the perceived speed difference will be negligible.
This - again - is completely orthogonal and utterly irrelevant to the point I was making.

Mario F.
05-23-2008, 05:00 PM
You do realize this is a circular definition, right? Compilers go from source -> target, end of story. Check wikipedia if you wish; check anywhere, the idea is all that matters and what you're talking about are useless details.

Very well then. An image converter is a compiler then. And I'm sure you agree the details are useless.

laserlight
05-23-2008, 10:50 PM
Do you have something to back this up (honestly)? I've never heard of this approach, and I'm pretty skeptical unless you're just generalizing it a bit in which case I think I might get what you mean..
A certain fellow by the name of Bjarne Stroustrup, who might know something about how C++ was invented, makes a similar claim (http://www.research.att.com/~bs/bs_faq.html#bootstrapping).

The same fellow calls his C++-to-C translator "a traditional compiler", contradicting abachler's statement. Perhaps we could defend abachler's statement by pointing out that, at the end of the process, we still get machine code, much like how modern C++ compilers invoke an assembler. In other words, "source -> target" is too vague: we need to define what the source is, and what the target is.
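The Cfront-style idea, a compiler whose target is another source language rather than machine code, can be sketched in miniature. This toy (with an invented prefix-notation "source language") translates expressions into C expression syntax; it does the source -> target transformation under discussion without ever touching assembly.

```python
# A compiler need not emit machine code: like Cfront, this toy
# compiler's target is another source language. It translates a tiny
# prefix-notation source, e.g. "+ 2 * 3 4", into a C infix expression.

def compile_to_c(tokens):
    """Recursively translate prefix tokens into a C expression string."""
    tok = tokens.pop(0)
    if tok in ("+", "*"):
        left = compile_to_c(tokens)    # operands follow the operator
        right = compile_to_c(tokens)
        return "(%s %s %s)" % (left, tok, right)
    return tok  # a numeric literal passes through unchanged

source = "+ 2 * 3 4".split()
print(compile_to_c(source))   # (2 + (3 * 4))
```

Whether one calls this a "compiler" or a "translator" is exactly the terminological fence the thread is sitting on; structurally it does the same job either way.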

Mario F.
05-24-2008, 06:14 AM
I believe we can agree on a few general concepts that, while not aiming to encompass all scenarios, at least have the benefit of separating the waters. The last thing we need is to end this by admitting, against all that is natural, that interpreted languages are the same as compiled languages.

Their translation, compilation, linking and execution plans are completely different.

abachler
05-24-2008, 09:20 AM
To build that, I first used C to write a "C with Classes"-to-C preprocessor. "C with Classes" was a C dialect that became the immediate ancestor to C++. That preprocessor translated "C with Classes" constructs (such as classes and constructors) into C. It was a traditional preprocessor that didn't understand all of the language, left most of the type checking for the C compiler to do, and translated individual constructs without complete knowledge.

He calls it a preprocessor, and then goes on to say that it leaves the type checking to the compiler, obviously making a distinction between the preprocessor, which did the translating, and the compiler, implying that the preprocessor/translator is not a compiler.


I was only pointing out that the JVM will take that java bytecode, and on the fly, compile it into adequate machine code.

Yes, this is called interpreting the code. Some distinction may be drawn if it only does this the first time the application is run, but last I checked the JVM does this every single time; otherwise it would be the Java Runtime Engine or something similar, which would completely defeat one of the purposes of using a virtual machine, namely preventing malicious code from having direct access to the execution stream.

I'm not saying Java is bad because it's interpreted; this is in fact one of its strengths. But that still doesn't make it compiled code. I personally don't use Java because it is ill suited to my applications, but for applications that are heavily I/O bound, there may be little point in optimizing execution speed. Again, this doesn't make the JVM just as fast as, say, C/C++; it just mitigates the effect of that weakness in the final analysis.

laserlight
05-24-2008, 10:28 AM
He calls its a preprocessor, and then goes on to say that it leaves the type checkign to the compiler, obviously making a distinction between the preprocessor which did the translating, and the compiler, implying that the preprocessor/translator is not a compiler.
I think you misread. The "preprocessor" refers to "C with Classes"-to-C, and this was used to write Cfront. Cfront itself was what he called a "a traditional compiler", and this compiled C++ to C.

Mad_guy
05-26-2008, 08:18 PM
Very well then. An image converter is a compiler then. And I'm sure you agree the details are useless.
Taken out of context? Sure.


I'm not saying Java is bad because it's interpreted; this is in fact one of its strengths. But that still doesn't make it compiled code.
Whoa, look, the last thing I was talking about was the word 'translation' vs. the word 'compiler'; have I missed something here, or did we just skip a beat? In any case, it is arguable whether the approach taken by the JVM is a traditional 'compilation' strategy (which I would still classify as 'yes') or not (the fence you're on), and that is left up to the interpretation of the reader. It's arguable what qualifies as 'compiled code'. I think it's best to leave it at that. On the note of compiler vs. translator, though, I find there is no ambiguity or noticeable difference; it's a clear-cut issue.

We're shifting points, though. The statement of mine you're quoting was a response to a statement you made earlier, when we were on the topic of 'what's a compiler', and it was only clarifying what my point was: that it wasn't about speed, or I/O bounds, or any of that. Now we've moved on to what counts as compiled code, strengths, weaknesses, and other stuff. If we're going to make a point, we need to stay on topic, because it's getting tough to remember what the original topic even was.