This is a discussion on Designing a program to deal out seven card stud poker hands within the C++ Programming forums, part of the General Programming Boards category.
Here's a good example of what I'm talking about that you need to get a grasp of in order to build your calculator:
http://www.absolutepoker.com/poker-a...te-poker-odds/
Ubuntu Desktop
GCC/G++
Geany (for quick projects)
Anjuta (for larger things)
Doyle doesn't teach you how to know the odds. He and the other writers in that book give you rules based on their experience and on computer simulations they ran to find the probability of something occurring. Sure, there are ways to know the odds of getting a pair of aces in hold 'em without dealing millions of hands. But how about working out why a pair of aces plays better against a small number of hands, as suggested in this book? That is something poker authors don't explain. I want to find out how they work that out mathematically.
Last edited by killsthehorse; 12-07-2008 at 11:45 PM.
And I'm suggesting that a computer simulation doesn't give you that answer either.
If a particular probability depends on how many hands are involved, there's still some mathematical reason behind it, not some strange statistical distribution with a great deal of uncertainty as you're suggesting.
Poker is an uncertain game. I know the intuitive reasoning behind the theory I posted above, but I don't know how to work it out mathematically, and I thought maybe designing a program to deal out tons of hands in different situations could tell me whether the theory (that large pairs play better against a small number of opponents) is correct.
Poker is not so uncertain to a computer, however. Our only uncertainty lies in the cards whose values we don't know. In the case of seven card stud, we don't know the values of the cards left in the deck, the burnt cards, or the face-down cards.
So, let's say we have four players, each at the point where they have two down and one up. What do we know about the "uncertain" cards? If there are 52 cards in the deck, and we know for sure what 4 of them are, then all the "uncertain" cards share a 1/48 chance of being any particular card other than the face-up cards, right?
After the next face-up cards are dealt, the uncertain cards are 1/44 likely not to have the value of the face-up cards.
Next round, 1/40.
Next, 1/36.
So what if we had 6 players? Let's do another go round...
1st round: 1/46
2nd round: 1/40
3rd round: 1/34
4th round: 1/28
How about 8 players?
1st round: 1/44
2nd round: 1/36
3rd round: 1/28
4th round: 1/20
So the uncertainty is definitely chipped away as we add players to the table. This is pretty obvious. What's not obvious is why a pair of aces stands up better with fewer players....
Well, as far as poker hands go, you have the best odds up front of getting a pair of anything as opposed to a straight or flush, right?
When you have fewer players, and consequently more cards left in the deck, it's less likely someone will get a hand like a straight or a flush, simply because fewer combinations of cards are occurring. Make sense?
Think about it: if one hand is dealt, what are the odds that hand contains a pair (regardless of what type of poker we're talking about)? But if 7 hands are dealt, what are the odds that at least one of those hands has a pair? The chances are better now.
So I would say, that holding a pair of aces is good no matter how many people are playing, but are even better when fewer people are playing because statistically speaking, it's less likely a hand exists that can beat them.
Think of it this way:
Chances I'm dealt a pair: 1/220 (0.45%)
Chances 1 player is dealt a pair among 7 players: 7/220 (3.2%)
100 players: 5/11 (45%)
So the more players there are at a table, the more likely better poker hands exist than my pair of Aces.
NOTE: The above odds are for getting a pocket pair in Texas hold 'em. If you're considering a poker hand of 5 or 7 cards, a pair is much more likely, though this idea holds true for any type of poker hand.
Last edited by dudeomanodude; 12-08-2008 at 12:47 AM.
> 100 players: 5/11 (45%)
How do 100 players get dealt 2 cards each from a pack of 52 cards?
> Then take a course in probability; there's nothing that you can get out of a simulation.
Agreed.
The odds of these games were worked out long before there were computers, never mind before now, when it might be reasonable to produce a meaningful sample size within a reasonable time-frame.
If you dance barefoot on the broken glass of undefined behaviour, you've got to expect the occasional cut.
If at first you don't succeed, try writing your phone number on the exam paper.
I support http://www.ukip.org/ as the first necessary step to a free Europe.
What's with all this elitism? Why can't someone set up a Monte Carlo simulation if they want to?
I might be wrong.
Quoted more than 1000 times (I hope).Thank you, anon. You sure know how to recognize different types of trees from quite a long way away.
And with me it would be the other way round...
Running a simulation would be fine, just don't use the standard library rand() to do it.
http://www.eternallyconfuzzled.com/a..._art_rand.aspx
No elitism concerning Monte Carlo. Personally, I think for calculating poker odds, it's more difficult to set up a simulation and extract meaningful data from it than to figure out the odds of whatever situation you're trying to get a handle on.
The OP stated he wanted to figure out why certain cards play better against fewer opponents. I would rather go about it setting up a small iterative calculation that would produce the odds based on "i" number of players. Setting up a simulation involves: designing a deck of cards, creating the situation (involves partially designing the poker game itself), storing the output of the data, extracting meaning from the data... it gets complicated quickly.
So seriously, no snobbery or elitism on my part, I just thought the better advice was to offer a simpler solution.
For some of the most complicated situations, the math can be difficult and easy to get wrong. (I have two kids and at least one is a girl; what are the odds the other is a girl?)
Once you have the simulation working accurately then you can be reasonably confident in the numbers you receive, whereas doing it through math means that you must make an accurate calculation every time.
Chances are somebody has already done the math for seven card stud, but it might be fun and good learning to try to create the simulation.
>> How would I go about doing this? I know nothing about programming at all.
Here's the sticky part, though. Without any programming knowledge, it will be very difficult to get this right. You have to start by learning a programming language. If C++ is your choice, spend a few months with Accelerated C++ and then you might be ready to start towards the simulation.