Database connections/actions a sec?

This is a discussion on Database connections/actions a sec? within the Tech Board forums, part of the Community Boards category; Heey, I am about to make a program that does about 100.000 read/write actions a sec to a database. My ...

  1. #1
    Registered User
    Join Date
    Jan 2011
    Posts
    2

    Database connections/actions a sec?

    Hey,

    I am about to write a program that performs about 100,000 read/write actions a second against a database.
    My question:
    Can a database handle that many connections in such a short time?

    specs:

    2.4 GHz quad core
    4 GB DDR2 RAM

    Greetings,
    Niels

  2. #2
    Registered User
    Join Date
    Nov 2010
    Location
    Long Beach, CA
    Posts
    5,512
    This isn't really a C question, and I'm certainly not a database expert, but I can tell you that your answer depends on a lot of things, including:

    - OS and whatever other services may be running besides your database
    - What database you're using
    - Query complexity
    - Is this a "burst" or "sustained" rate? I can't imagine you're asking such a question here if you are really doing 100,000 queries/second all day, every day. Bursts of queries will be processed eventually, hopefully before the next burst.
    - Can you use a single/persistent connection? Make all queries on that one connection, since creating/destroying connections is reasonably expensive.
    - Will your reads and writes be interspersed?
    - Will your writes be updates of existing records or additions of new records?
    - What kind of locks will the writes require?
    - How many columns and tables are involved?
    - What key(s) do you have defined on the table(s)?
    - Are there any complex joins?
    - Can you parallelize any queries, or do they need to happen in a specific order?
    - ...

    Just try it out and see how the system performs.
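    To put a rough number on "just try it out": here is a minimal benchmark sketch, assuming SQLite (via Python's stdlib sqlite3) stands in for whatever database you pick. The table name and payloads are made up for illustration. It follows the two points above: one persistent connection, and one transaction for the whole burst instead of a commit per row.

```python
import sqlite3
import time

# One persistent connection; opening/closing a connection per query
# would dominate the cost at 100,000 operations a second.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")

N = 100_000
start = time.perf_counter()
with conn:  # one transaction for the whole burst; per-row commits are far slower
    conn.executemany(
        "INSERT INTO events (payload) VALUES (?)",
        ((f"event-{i}",) for i in range(N)),
    )
elapsed = time.perf_counter() - start
print(f"{N} inserts in {elapsed:.3f}s ({N / elapsed:,.0f}/s)")
```

    An in-memory SQLite database on ordinary hardware typically clears this comfortably; a disk-backed or networked database will be slower, which is exactly what a quick benchmark like this would tell you.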

  3. #3
    Registered User
    Join Date
    Jan 2011
    Posts
    2
    It is an algorithm that reacts to the user's behavior.
    Every time the user does something, it analyzes that action, which creates about 10,000+ strings/functions.
    All those 10,000+ strings must be added to the database if they don't exist yet.
    It also counts how often each string is generated.

    It then analyzes something else that also yields about 10,000+ values.
    All those values need to be added to the database, and their unique IDs must be linked to all the strings made in the first step.

    The size of the database would probably be a few gigabytes.

    No need to connect to other databases, just one connection.
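    The "insert if it doesn't exist yet, otherwise count it" step you describe maps directly onto an upsert. A minimal sketch, again assuming SQLite (3.24+ for the ON CONFLICT syntax); the table and column names here are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE strings (
        id   INTEGER PRIMARY KEY,
        text TEXT UNIQUE NOT NULL,
        seen INTEGER NOT NULL DEFAULT 1
    );
    -- value IDs linked back to the strings from the first pass
    CREATE TABLE links (
        string_id INTEGER NOT NULL REFERENCES strings(id),
        value_id  INTEGER NOT NULL
    );
""")

def record(batch):
    """Insert each string if new, otherwise bump its counter; return the ids."""
    ids = []
    with conn:  # one transaction per 10,000+ batch keeps the writes cheap
        for s in batch:
            conn.execute(
                "INSERT INTO strings (text) VALUES (?) "
                "ON CONFLICT(text) DO UPDATE SET seen = seen + 1",
                (s,),
            )
            row = conn.execute(
                "SELECT id FROM strings WHERE text = ?", (s,)
            ).fetchone()
            ids.append(row[0])
    return ids

ids = record(["alpha", "beta", "alpha"])  # "alpha" stored once, counted twice
```

    The UNIQUE constraint gives you the "don't exist yet" check and the stable ID you need for the second pass in a single indexed lookup, instead of a separate SELECT-then-INSERT round trip per string.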

    //edit

    About just trying it out... I first want to know whether it is possible before writing that huge algorithm xD
    The algorithm I described is not everything; those are just the database read/write actions. It also needs to generate all the strings/values to put in the DB, and that is quite some code, I can tell you :P
    Last edited by nielskool; 01-03-2011 at 04:29 PM.

  4. #4
    Registered User
    Join Date
    Nov 2010
    Location
    Long Beach, CA
    Posts
    5,512
    Again, this is not really the best forum for such a question...

    Your question is far too vague to answer. I would need fairly intimate knowledge of your system, beyond just proc and RAM, including whatever algorithms you're using (their time complexity), how the user is interacting, whether it's over a network, etc. Even then, I could only wager a very inaccurate guess at best.

    This sounds more like a case of premature optimization (which is the root of all evil), or just worrying about maybe having to optimize prematurely. Maybe the database won't even be the problem. Maybe you will spend the bulk of your time waiting for user input (which is common). Perhaps the complicated string rendering/generation algorithms you're using will be the bottleneck. Who knows?

    Presumably you have to solve this problem (i.e. it's for work, school, etc), and thus you need to write the code. So write it flexibly so that you can change your underlying storage engine without having to change too much of your code.
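    One sketch of that flexibility (all names here are made up for illustration): hide the storage behind a small interface, so an in-memory test double and a real database-backed class are interchangeable and the analysis code never changes.

```python
from abc import ABC, abstractmethod

class Store(ABC):
    """Minimal storage interface; the analysis code talks only to this."""
    @abstractmethod
    def add_string(self, text: str) -> int:
        """Insert the string if new, bump its count, return its stable id."""
    @abstractmethod
    def link(self, string_id: int, value_id: int) -> None:
        """Attach a value id to a previously stored string."""

class DictStore(Store):
    """In-memory backend for testing; a SQL-backed class could replace it."""
    def __init__(self):
        self._ids, self._counts, self._links = {}, {}, []

    def add_string(self, text):
        sid = self._ids.setdefault(text, len(self._ids))
        self._counts[sid] = self._counts.get(sid, 0) + 1
        return sid

    def link(self, string_id, value_id):
        self._links.append((string_id, value_id))

store = DictStore()
sid = store.add_string("alpha")
store.link(sid, 42)
```

    With this split you can benchmark the real backend later and swap it out if it turns out to be the bottleneck, without touching the (much larger) string-generation code.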

    Now that that's out of the way, I would reckon that computer could hold up, at least if a few things were true:
    1. It wasn't bogged down by too many other processes
    2. You had well-designed databases and optimized queries
    3. You had periodic breaks in the 100,000+ queries generated from some user input, i.e. the user stopped typing/clicking from time to time.

    Maybe it could hold up under only two of those conditions, or one, or maybe even zero. There just isn't enough information to say for sure. Once again, try it out.
