CPA
If that's a Smart car, I'd rather have the goat.
As before, you are totally ignoring initial conditions. What you just stated is not the same as the Monty Hall problem at all.
You have to be trolling this. It's exactly the same logical problem.
^Idiots who think they're right.
But it's not smart to call people idiots when they ARE right. Really, no kidding, no trolling, in all seriousness - it's better to switch. It is.
That hasn't been sufficiently explained. The counterintuitive nature of the problem resists the simple explanation.
And no matter how right you think you are, you're still wrong if you can't articulate your position.
Funny how this time the Monty Hall problem spawned the war instead of 0.999... = 1. It's also funny how adamantly people insist on being wrong. OK guys, put it to the test then:
Monty Hall Simulator
I wish I could find a simulator that supports more than 3 doors, because that would make it incredibly obvious that switching is better.
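For what it's worth, a simulator like that is a few lines of plain Python (this is just a sketch, not any particular site's simulator; `play` and its parameters are my own names). The host opens every losing door except one, so with 100 doors the "switch" door is the prize 99% of the time:

```python
import random

def play(num_doors, switch, trials=100_000):
    """Simulate the Monty Hall game with num_doors doors.

    The host opens every losing door except one, leaving the player's
    door and one other door. Returns the observed win fraction.
    """
    wins = 0
    for _ in range(trials):
        prize = random.randrange(num_doors)
        pick = random.randrange(num_doors)
        if switch:
            # After the host eliminates all other losing doors, the one
            # remaining door holds the prize unless your first pick did.
            wins += pick != prize
        else:
            wins += pick == prize
    return wins / trials

print(play(3, switch=False))    # hovers around 1/3
print(play(3, switch=True))     # hovers around 2/3
print(play(100, switch=True))   # hovers around 0.99
```

Crank `num_doors` up to 100 (or 175,000,000 conceptually) and the "it's 50/50" position falls apart on its face.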
lol when you're disregarding everything that happens prior to the final pick. You're right: it's a 50/50 chance if you drop the ticket you started with (after the other 174,999,998 are eliminated) onto the winning ticket, suddenly experience amnesia, and have to pick up one of the two. But that's not the situation here.
No, the 'game' was *always* going to end up with your ticket and another ticket, and one of them the winner. It's irrelevant what happens before the final pick because we're left standing with two tickets.
There's a 100% chance that either my ticket or the other is the winner, so explain how mine is one in a million and the other is a sure thing. Sounds like you don't understand probability.
With regard to #2,
0.999... = 1 > 0.999
The 'one ticket' represents the chance that the entire group of 'tickets you did not pick' contained the winning ticket. If any of those (N-1) tickets was the winner, then switch and you win.
You only win by sticking with your original ticket if you beat the original odds and picked the winner. This is not 50%.
To get back to 50% you have to choose randomly between your two remaining options.
At this point, he is either intentionally trolling or just incredibly dense. In either case, further argument is a waste of time.
To me the issue with the original version of the problem has always been one of psychology. There's actually a fairly large chance that you picked the right door (1/3) and you're going to feel 'dumb' if you switch and lose, even though it was the correct way to play.
The problem is much simpler. You aren't 'trying to pick the right door' in the first step.
You are dividing the total into two groups:
One is a group of 1 and the other is a group of (N-1).
Now decide whether you want 1/N or (N-1)/N chance of winning, and act accordingly.
Here's another explanation.
How does a "switcher" win? He needs to pick a wrong ticket, and then the ticket remaining in the bowl will be the winning ticket. So the probability of a "switcher" winning is 174,999,999/175,000,000 (the probability of your first pick being the wrong ticket).
How does a "non-switcher" win? He needs to pick the right ticket, because then the ticket remaining in the bowl will be a losing ticket. So the probability of a "non-switcher" winning is 1/175,000,000 (the probability of your first pick being the right ticket).
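The arithmetic above can be checked with exact fractions, no simulation needed (a quick sketch; `N` is the ticket count from the Powerball version of the problem):

```python
from fractions import Fraction

N = 175_000_000  # tickets in the Powerball version of the problem

# A non-switcher wins only if the very first pick was the winner.
p_stick = Fraction(1, N)

# A switcher wins whenever the first pick was a loser, because the
# eliminations leave the winner as the only other remaining ticket.
p_switch = Fraction(N - 1, N)

print(p_stick)             # 1/175000000
print(p_switch)            # 174999999/175000000
print(p_stick + p_switch)  # 1 -- the two outcomes are exhaustive
```

The two probabilities sum to exactly 1 because "stick wins" and "switch wins" are the only possible outcomes; neither one is 50%.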
To get back to 50% you have to choose randomly between your two remaining options.
I think the three quotes above explain it quite nicely.
Right, that's why I recast it into the Powerball problem, since everyone knows your chance of buying a winning Powerball ticket is pretty damn low.