This is a discussion on Character Movement within the C Programming forums, part of the General Programming Boards category. Originally Posted by mike65535: OK, I'll have to think about this one a bit (not familiar with moving cursors around) ...
I didn't have it before, but I added it because I was getting confused. It only made me more confused.
Another thing I thought I should mention: getch() isn't really used anymore.
Also, you have includes to determine whether this is a Windows platform or DOS? I thought you were looking for some cross-platform capability... but you're relying on a (deprecated) DOS-specific library.
getch() is part of the console I/O library (conio.h), which is now pretty much deprecated. Its one good point is non-buffered I/O, which I'm guessing is why you're using it. Most people use some other approach if they're after cross-platform code. Perhaps you should create the logic in two different objects and link against the right one depending on the platform, because this will only work in DOS/Windows.
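A rough sketch of that two-implementations idea, selected with the preprocessor rather than separate objects (hedged: portable_getch is a name I made up, the Windows half assumes conio's _getch, and the POSIX half assumes termios is available):

```c
#include <stdio.h>

#ifdef _WIN32
#include <conio.h>
/* Windows/DOS half: conio's unbuffered, no-echo read. */
int portable_getch(void) { return _getch(); }
#else
#include <termios.h>
#include <unistd.h>
/* POSIX half: temporarily drop canonical mode and echo around one read. */
int portable_getch(void) {
    struct termios old, raw;
    if (tcgetattr(STDIN_FILENO, &old) != 0)
        return getchar();            /* stdin isn't a terminal; plain buffered read */
    raw = old;
    raw.c_lflag &= ~(tcflag_t)(ICANON | ECHO);
    tcsetattr(STDIN_FILENO, TCSANOW, &raw);
    int c = getchar();
    tcsetattr(STDIN_FILENO, TCSANOW, &old);
    return c;
}
#endif
```

Same idea as the two-objects approach, just resolved at compile time instead of link time.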
Just some food for thought. I'm not sure what people use nowadays for non-buffered I/O; most people don't need it. You can obviously find some of it in ncurses, though I'm not sure how good the ncurses library's compatibility with Windows is.
The normal forms (which use buffered input) in the C standard library are getc() and getchar(). One is typically implemented differently than the other (I think getc() is faster, or some such), but for all intents and purposes they're the same thing. You would have to tell the terminal to disable echoing, though; you might be able to do that with a VT100 escape sequence (I think you can, but I don't recall).
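For what it's worth, on POSIX systems the echo toggling can be done through termios rather than a raw VT100 sequence; a minimal sketch (set_echo is a name I made up for illustration):

```c
#include <termios.h>
#include <unistd.h>

/* Toggle terminal echo; returns 0 on success, -1 if stdin isn't a tty. */
int set_echo(int enable) {
    struct termios t;
    if (tcgetattr(STDIN_FILENO, &t) != 0)
        return -1;                   /* e.g. stdin is a pipe, not a terminal */
    if (enable)
        t.c_lflag |= ECHO;
    else
        t.c_lflag &= ~(tcflag_t)ECHO;
    return tcsetattr(STDIN_FILENO, TCSANOW, &t);
}
```

After set_echo(0), getchar()/getc() keep working as usual; the typed characters just aren't echoed back.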
getch() actually works for me with just <stdio.h>; conio.h isn't even needed.
You're in all likelihood using a compiler that auto-links against that library because it recognizes the getch() function. If you're using a M$ compiler (i.e. Visual Studio), seeing as they're the ones who made the console I/O library, it isn't farfetched that they would automatically link against it.
Sun's compiler, CC, auto-links against the stdio library, stdlib, and a few others if it recognizes one of their functions in your code. That doesn't mean the code will compile on another OS that has the same standardized libraries.
Regardless, I'm sure that if you pull stdio out of your includes it will still work. If it DOESN'T, then M$ stuck the function into the stdio package, which is a standard, and they're even dumber and less developer-friendly than I originally thought. stdio is an established standard, and people shouldn't be bolting additional functionality onto it. My guess is that stdio has nothing to do with it working; the compiler is probably just seeing getch() and linking against the console I/O lib.
This raises two points:
A) Just because your compiler auto-includes it, doesn't make it a standard
B) People shouldn't be bolting stuff onto standardized libraries - I don't even think M$ is that arrogant
Last edited by Syndacate; 04-26-2011 at 06:25 PM. Reason: typo
Well, I'm on Dev-C++. This project is merely for my own learning benefit, and to show off to a few friends. Cross-platform compatibility wasn't really a big deal for me. If it works, it works.
You should take a look at the library structure of some of these runtime libraries... lots of headers... one library with everything in it.
Pelles C, for example, has one crt.lib (and crt64.lib) from which it extracts all standard library functions, so it only has to link with one of two libraries to assemble an executable.
The header structure is a different matter. Headers are for the compiler (as you know), and often the same function is prototyped in more than one header because other library functions call it.
It's a fluke that it works without conio.h ... and should not be trusted as you say...
I never would have thought Dev-C++ would auto-link, since it's a beta... Some things bug the crap out of me about it, but I'm too lazy to uninstall it for another one, haha.
Yes, well... get un-lazy... Dev-C++ has been abandoned for nearly a decade now.
You can do so much better.
It ticks me off to no end when I'm working on a Sun machine and CC auto-includes stdio that I didn't think to include; then I move the code somewhere, give it to somebody (another dev, another machine, whatever) and it breaks in like 40,000 places. I'd rather deal with the error I create, when I create it.
I'm sure there's a flag you can pass it, but I guess it hasn't bothered me enough to check, lol.
Though I suppose, in the end, if this code's never even leaving the dev-c++ ide, then it really doesn't matter.
With the C standard library in a single .lib file the linker can easily be set up to always (invisibly) link against that file. Which is a good thing...
This should not, however, allow the compiler to succeed without the necessary .h files.
Unless you modify the compiler, though, there's no way for it to determine that a problem with the linked libraries exists; that will just cause the linker to puke.
There must be something in the included header that declares the functions (just the prototypes), even if they're extern'ed, or the compiler won't be satisfied; it can't just be left out, or the compiler will error. Unmet definitions can then be resolved by the linker after the objects are compiled; if they aren't, the linker will throw an error. If there were nothing in the header, the compiler wouldn't know it needs to link against anything, even if it adds the link path to the output or sets up code for dynamic linking.
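A single-file illustration of that point (square and use_it are names I made up): the compiler is satisfied by the prototype alone, and the definition gets resolved afterwards. In real code the definition would live in a separate object file or library; here it just sits further down the file to stand in for one.

```c
/* What a header gives the compiler: a declaration only.  The extern is */
/* optional for functions; the point is that there's no body here.      */
extern int square(int x);

/* This compiles against the prototype alone...                         */
int use_it(void) { return square(7); }

/* ...and the definition is resolved later, as if pulled from a .lib.   */
int square(int x) { return x * x; }
```

Delete the definition and the compile still succeeds; it's the link step that fails with an undefined reference.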
Also, linking doesn't have to happen at runtime; unless the library is dynamically linked (i.e. DLLs or .so's), it typically doesn't. Most linking is done at build time, which yields the best performance and is much more reliable. Even when a library is dynamically linked, there still has to be something pulled in via the included header for the program to know to do so; that's what contains the code to load the lib at runtime. C/C++ was meant to do everything at compile time, hence static linking. DLLs and the like were shoved in later and aren't part of C or C++ in and of themselves, the way preprocessor macro expansions are.
Syndacate ... You don't appear to be understanding my point...
For example ...
Pelles C has one library file ... crt.lib which is about 800k in size and contains a few hundred function objects.
The same package also has 63 different header files that prototype functions from this one library.
It's not a one to one relationship between header and library... never has been as far as I know and it certainly isn't so in Windows.
So in your source you #include <stdio.h> ... there is no stdio.lib... the actual function objects are in crt.lib ... and there is nothing in stdio.h that refers to crt.lib.
stdio.h informs the *compiler* of the functions it prototypes from crt.lib so that your program can compile.
At this point it's just another library... an object file, not a program.
The linker automatically loads crt.lib and links in the required function objects --and only the required objects-- to complete the executable program.
There neither is nor does there need to be a one to one relationship here... In fact, having a one to one relationship would slow the linker down to a painful crawl.
Yes, they can all exist in one object, that doesn't mean anything. The point is it has multiple headers to deal with it. I know what the include macro expansions do.
I'm well aware of how the compilation process works. It doesn't change my point; it's not dynamic linking.
I never said anything about there needing to be a 1-to-1 correlation between functions and headers. You made that up on your own.
I simply don't agree with the idea that everything is set with one header inclusion, like Windows has with its windows.h header. I also find it pointless to make multiple headers that all reference the same object file to link against. If you pack everything into one object, you might as well make one header to go with it.
In my ideal world, there would be multiple headers, each header prototypes functions which are in its corresponding library to link against. That's the way a good portion of C libs are setup, it's the way I'm used to, and it's a way that makes sense.
If I want to replace an I/O library, I'd rather not have ALL the C lib stuff become an issue (name collisions galore). It's nice when you can just tell the compiler not to search the default include paths, then specify where to find things, or temporarily swap out the library for static linking.
It goes under the simple philosophy of not putting all your eggs in one basket. Except this time the eggs are functions.
Ok... now I get what you're saying... I don't necessarily agree but I get the point now.
Replacing parts of libraries is no big deal... for example my setup has string.h with functions like strlen(), strcat() etc. that we're all familiar with. I also have a private library doing the same things with dynamically allocated strings (dstring.h and dstring.lib)... no conflicts result because I simply prefixed the function names... dstrlen(), dstrcat() etc. This of course also leaves the original functions available to me when I need them too... I just don't see where this is much of a problem.
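A hypothetical sketch of what such prefixed functions might look like (dstrlen/dstrcat here are my own guesses at the idea, not CommonTater's actual dstring.lib code): the "d" prefix keeps the names from colliding with the standard strlen()/strcat(), which remain available alongside.

```c
#include <stdlib.h>
#include <string.h>

/* Prefixed length: like strlen(), but tolerates NULL.  Hypothetical    */
/* stand-in for a private dstring library function.                     */
size_t dstrlen(const char *s) {
    return s ? strlen(s) : 0;
}

/* Prefixed concatenation: returns a freshly malloc'd string instead of */
/* appending in place like strcat().  Caller frees the result.          */
char *dstrcat(const char *a, const char *b) {
    size_t la = dstrlen(a), lb = dstrlen(b);
    char *out = malloc(la + lb + 1);
    if (!out)
        return NULL;
    memcpy(out, a ? a : "", la);
    memcpy(out + la, b ? b : "", lb + 1);   /* copies the terminator too */
    return out;
}
```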
For Windows includes... You do realize there are nearly 2000 header files provided in the Windows SDK, don't you? Somehow I'd prefer using windows.h to having to list the 50 or 60 headers it pulls in for me... And windows.h only includes the most basic Windows functions (mostly contained in kernel32.lib)... there's a ton of other stuff (commctrl.h, shlwapi.h, winsock.h, winsock2.h, etc.) that goes into making a Windows program which is not included by windows.h.
I don't really see a problem with this and neither does my compiler...
Last edited by CommonTater; 04-27-2011 at 01:11 PM.