i think i figured out why i'm right about the monty hall problem:
at work, we sell lottery scratch tickets. by law, they have to have the odds of winning printed on them. the only one i remember is that one of the $1 tickets has a 1 in 4.72 chance of winning (keep in mind that this is your chance of winning your dollar back. if you want to make a profit, it's slimmer pickings).
now of course, if you buy a bunch of tickets, the chance of at least one of them being a winner goes up. for fun, let's say that if you buy 5 tickets, you have a 1 in 3 chance of having at least one winner. now, say you scratch four of those tickets and they're losers. is the chance of the last ticket being a winner 1 in 3? no, it's still 1 in 4.72, just like always. as you scratch the tickets, each time you get a loser, the chance of the remaining tickets containing a winner gets smaller, because there are fewer unscratched tickets left. you can't count the ones you already scratched, because you already know they're not winners: they're out of contention.
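here's a quick sketch of the ticket logic, assuming each ticket is an independent draw at the printed 1-in-4.72 odds (the 5-ticket batch and the trial count are made up for illustration). it checks two things: the chance the last ticket wins given the first four lost, and the overall chance of at least one winner in five.

```python
import random

random.seed(1)

P_WIN = 1 / 4.72  # printed odds on the hypothetical $1 ticket
TRIALS = 200_000

# count trials where the first four tickets all lose,
# and among those, how often the fifth is a winner
four_losers = 0
fifth_wins = 0
for _ in range(TRIALS):
    tickets = [random.random() < P_WIN for _ in range(5)]
    if not any(tickets[:4]):
        four_losers += 1
        if tickets[4]:
            fifth_wins += 1

print(f"P(win) per ticket:           {P_WIN:.4f}")
# stays near the per-ticket odds, not 1 in 3
print(f"P(5th wins | first 4 lose):  {fifth_wins / four_losers:.4f}")
# exact formula for at least one winner: 1 - (1 - p)^5, about 0.70
print(f"P(at least one winner in 5): {1 - (1 - P_WIN)**5:.2f}")
```

note that the "1 in 3 for five tickets" figure above really was just for fun: at these odds the true chance of at least one winner in five independent tickets is about 70%.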
similarly, once the wrong door is revealed, the door you first picked is no more or less likely to be right than the last door. if this weren't true, he could show both doors you didn't pick to be wrong, and your door would still have only a 1/3 chance of being right.
therefore: dan wins!
this only stops being true if you have bought a significant percentage of the total production run, assuming they don't have some needlessly complex method of assuring each card's chances are independent of the total run.
(yes, this is essentially the same argument i was making all along. the difference is that back then i was saying both answers seemed to be valid, and i was defending mine cuz john said only the other one was true. now i think mine is in fact correct.)