CPU Bottleneck

This is a discussion on CPU Bottleneck within the Tech Board forums, part of the Community Boards category.

  1. #1
    The Right Honourable psychopath's Avatar
    Join Date
    Mar 2004
    Location
    Where circles begin.
    Posts
    1,070

    CPU Bottleneck

    I'm thinking about ordering a Radeon HD 3850, to replace my craptastic Geforce 7600GS.

    The thing is, I have a Pentium Dual-Core e2160 running at 1.8Ghz. It's decent for a budget CPU, but it really isn't as high up on the CPU scale as the 3850 is on the GPU scale.

    Will the CPU bottleneck the graphics? I don't have any means to OC the processor either (P5VD2-MX SE - very basic mobo).
    Memorial University of Newfoundland
    Computer Science

    Mac and OpenGL evangelist.

  2. #2
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
Most graphics cards don't require much from the processor these days [because so much of the work is done by the GPU], so I wouldn't worry too much about the performance of the processor - obviously in games that do a heap of CPU-intensive stuff, you may still lag behind the "hottest" processors on the market. But the CPU won't hold the graphics card back as such.

    --
    Mats
    Compilers can produce warnings - make the compiler programmers happy: Use them!
    Please don't PM me for help - and no, I don't do help over instant messengers.

  3. #3
    Registered User AloneInTheDark's Avatar
    Join Date
    Feb 2008
    Posts
    74
In case you are running Vista x64: if you can wait a bit, I'm going to buy an Nvidia card and will tell you whether I still have those blackscreen problems.

My last ATI card, which I ran with XP/2k3, had some driver issues too; I'm becoming doubtful regarding ATI.

    My brother is a gamer and he got Nvidia this time. Seems like a better choice.

    I don't think there will be any CPU bottlenecks, besides your CPU isn't that bad. You should be fine.

  4. #4
    Super Moderator VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,598
Quote Originally Posted by matsp
But the CPU won't hold the graphics card back as such.
    The CPU can hold the graphics processor back. CPU limited is still a problem and it still happens.

  5. #5
    The Right Honourable psychopath's Avatar
    Join Date
    Mar 2004
    Location
    Where circles begin.
    Posts
    1,070
    Quote Originally Posted by AloneIntheDark
In case you are running Vista x64: if you can wait a bit, I'm going to buy an Nvidia card and will tell you whether I still have those blackscreen problems.
I'd get an NVidia card again, if it weren't for the fact that the 8600 (their only card that I can afford atm) blows. And I can't afford the 8800. The GeForce 9s are supposed to come out this month, and the projected specs for the 9600 match the 3850. Except I don't feel like waiting around for NV to get its act together. The mid-range cards never come out at the same time as the high-end ones, so I don't expect the 9600 to surface for a while yet.

    Drivers can be fixed. Subpar hardware can't. Although I'm back on 32-bit XP so I don't expect the drivers to be too bad there. And I've put 2600 Pro's in a few machines at work lately, and the drivers weren't too painful.

    Quote Originally Posted by Bubba
    The CPU can hold the graphics processor back. CPU limited is still a problem and it still happens.
Like I thought. Hopefully mine won't hold it back too much. It's basically a crippled Core2, which isn't that bad, I guess.
    Memorial University of Newfoundland
    Computer Science

    Mac and OpenGL evangelist.

  6. #6
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
    Quote Originally Posted by Bubba View Post
    The CPU can hold the graphics processor back. CPU limited is still a problem and it still happens.
    Sure, it is still possible to come up with such problems - but it's much less of a problem than it used to be.

Very little of modern graphics work is done by the processor [although, as I stated, some (or even many) games do a lot of math to form the scene, move objects around and such]. The actual graphics are formed by writing the list of triangles that make up the frame, and by storing texture maps that give the triangles the right colour. For the most part these things are stored in the system's main memory, and then pulled into graphics memory by the GPU as and when they're needed.

On a scene with MANY triangles, it's possible that the CPU can't write triangle data fast enough to keep the GPU busy all the time, and that would reduce the effectiveness of the graphics card. Or if the application (a game, for example) modifies its textures frequently, it will need to re-issue the texture maps, which also requires a CPU-side copy of that data. But for most things, the CPU can produce more data than the GPU can process - the difficulty comes in actually producing something that looks nice and realistic while still having time to spare.

    --
    Mats
    Compilers can produce warnings - make the compiler programmers happy: Use them!
    Please don't PM me for help - and no, I don't do help over instant messengers.

  7. #7
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,893
    And of course, the CPU can simply be a bottleneck for other parts of the game, like physics or AI. Games that are highly demanding in graphics typically are highly demanding in other areas too.
    All the buzzt!
    CornedBee

    "There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
    - Flon's Law

  8. #8
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Posts
    22,920
    I haven't encountered any specific game that can bottleneck an Athlon 64 X2 3800+, though. Not even Oblivion's CPU-heavy AI & physics. Dunno about newer games, though.
    Quote Originally Posted by Adak View Post
    io.h certainly IS included in some modern compilers. It is no longer part of the standard for C, but it is nevertheless, included in the very latest Pelles C versions.
    Quote Originally Posted by Salem View Post
    You mean it's included as a crutch to help ancient programmers limp along without them having to relearn too much.

    Outside of your DOS world, your header file is meaningless.

  9. #9
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
    Quote Originally Posted by CornedBee View Post
    And of course, the CPU can simply be a bottleneck for other parts of the game, like physics or AI. Games that are highly demanding in graphics typically are highly demanding in other areas too.
Yes - and whilst I may have buried it inside some of my babbles, I think I did mention something along those lines. It's very difficult, unless you have profiling options inside the graphics driver, to tell the difference between a game that draws 80fps instead of 100fps because the graphics card is held up by the CPU's ability to write the vertex buffer, and one that is held up by the AI and physics calculations and thus doesn't write the vertex buffer until those calculations are done.

But I still think it's very rare that the CPU in itself limits the throughput of the graphics - the graphics may be limited by other calculations, yes.

    --
    Mats
    Compilers can produce warnings - make the compiler programmers happy: Use them!
    Please don't PM me for help - and no, I don't do help over instant messengers.

  10. #10
    pwns nooblars
    Join Date
    Oct 2005
    Location
    Portland, Or
    Posts
    1,094
It's overall gameplay that you have to worry about, though. I know that Supreme Commander bottlenecked at my CPU, and you can see my specs in my sig.

  11. #11
    Registered User AloneInTheDark's Avatar
    Join Date
    Feb 2008
    Posts
    74
    Quote Originally Posted by psychopath View Post
I'd get an NVidia card again, if it weren't for the fact that the 8600 (their only card that I can afford atm) blows. And I can't afford the 8800. The GeForce 9s are supposed to come out this month, and the projected specs for the 9600 match the 3850. Except I don't feel like waiting around for NV to get its act together. The mid-range cards never come out at the same time as the high-end ones, so I don't expect the 9600 to surface for a while yet.
Oh OK, I think I'll be getting one of those 8800s, same as my brother has. It's fanless, which is such bliss!

But hm... since the 9600 is about to come out, perhaps I should wait - the 8800 will probably drop in price a little.

  12. #12
    Malum in se abachler's Avatar
    Join Date
    Apr 2007
    Posts
    3,189
    Be careful, the fanless ones are 'come without a fan' not 'run without a fan'. They are intended for water cooled systems.
    Until you can build a working general purpose reprogrammable computer out of basic components from radio shack, you are not fit to call yourself a programmer in my presence. This is cwhizard, signing off.

  13. #13
    Super Moderator VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,598
Even though all that fancy graphics hardware is sitting in the machine, there are thousands and thousands of lines of code in a game that don't use it. For one, algorithms like quad-tree frustum culling, BSP portals, etc., are not done on the GPU. There is an area of research attempting to change that, and NVidia now claims their 8800 GPUs can be used for physics instead of an Ageia PhysX card.

But the main bottleneck is getting the data to the card. Another huge bottleneck is deciding what to send to the card and what not to. The CPU plays a HUGE part in the performance of a game. That fancy graphics card is probably called from the software only a minimal number of times if the game is coded right. Deciding what not to send to the card is a huge task that is still, for the most part, done in software.

  14. #14
    Fountain of knowledge.
    Join Date
    May 2006
    Posts
    794
I have always struggled a bit with what a graphics card actually does, apart from converting a digital number into an analogue voltage - which I guess a basic graphics card like my onboard one could do. Would someone care to have a stab at explaining it to me, in terms which a layman, or idiot, could understand?

  15. #15
    Registered User
    Join Date
    Oct 2001
    Posts
    2,129
    The video card takes information in the form of bus signals over ISA, PCI, AGP, or PCI Express. It gets information and commands from the CPU to do things like change video modes, give back video information (like modes available), and other things. Also, it can be sent information, which can correspond to pixel information, vertex information, textures, lighting, or whatever else the driver supports to send to the card. It then processes it with its onboard processor and stores it in the video RAM (VRAM). Then another part of the card reads the VRAM to output a signal to the monitor.

    http://en.wikipedia.org/wiki/Graphics_card
    Last edited by robwhit; 02-14-2008 at 12:57 AM.
