Since, for obvious reasons, I can't examine their source code, look at their hardware schematics, or talk to the technical people who did it, I have to make educated guesses.
It would have been something like...
unsigned int payout; // Oops: we are about to forget to initialise this variable, either here or elsewhere
...
...
Then later in the code, instead of the correct payout = 20; // Pay the winner 20 cents in winnings
..
..
// Mistake in the next line: a '-' minus sign was accidentally typed in front of the equals sign, which unfortunately WON'T generate an error message
payout -= 20; // Meant to pay the winner $0.20, but instead subtracts 20 from the uninitialised value
That roughly gives the payout she received (I think it is still out by one, but there are various explanations for how that could happen).
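A minimal compilable sketch of that guess (everything here is my reconstruction; the 0xDEADBEEF seed just stands in for whatever stale memory held, since reading a truly uninitialised local is undefined behaviour in C):

#include <stdio.h>

int main(void)
{
    /* Stand-in for the forgotten initialisation: seeded with an
       arbitrary "stale memory" value for the sake of the demo. */
    unsigned int payout = 0xDEADBEEFu;

    /* Intended: payout = 20;   (pay the winner 20 cents) */
    payout -= 20;   /* the typo: perfectly legal C, compiles with no diagnostic */

    printf("payout in cents: %u\n", payout);   /* garbage minus 20: some huge nonsense number */
    return 0;
}

The point is that the stray '-' turns an assignment into a compound subtraction, and the compiler has no reason to complain.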
EDIT:
Actually I think I cracked it.
payout, uninitialised, starts at ZERO = 0 (plausible if it is a static/global, which C zero-initialises, or if the memory just happened to contain zero)
then instead of payout = 20; // Player wins $0.20
the mistake is ....
payout -= 20; // meant to pay the winner $0.20, but the mistaken '-' makes it 0 - 20 = -20, which UNDERFLOWS to the amount she apparently won on the screen
i.e. 2^32 - 20 = what the screen says. See the pictures in earlier posts.
2^32 - 20 = 4294967276
which in cents =
$42,949,672.76
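Putting the EDIT version together as a runnable C sketch (a sketch only; it assumes a 32-bit unsigned int, which the 2^32 wraparound already implies, and a payout that genuinely starts at zero, e.g. a file-scope variable that C zero-initialises):

#include <stdio.h>

unsigned int payout;   /* file scope, so C guarantees it starts at 0 */

int main(void)
{
    /* Intended: payout = 20;   (pay the winner $0.20) */
    payout -= 20;   /* 0 - 20 wraps around to 2^32 - 20 */

    printf("raw counter: %u cents\n", payout);                      /* 4294967276 */
    printf("displayed:   $%u.%02u\n", payout / 100, payout % 100);  /* $42949672.76 */
    return 0;
}

Run it and the second line prints exactly the figure on the screen.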
See picture: