This page from the PCGuide, created in the mid-'90s, says otherwise:
http://www.pcguide.com/intro/fun/bindec-c.html
"What's worse is that the percentage discrepancy between the decimal and binary measures increases as the numbers get larger: there is only a 2.4% difference between a decimal and a binary kilobyte, which isn't that big of a deal. However, this increases to around a 5% difference for megabytes, and around 7.5% for gigabytes, which is actually fairly significant. This is why with today's larger hard disks, more people are starting to notice the difference between the two measures.
"Hard disk capacities are always stated in decimal gigabytes, while most software uses binary. So, someone will buy a "30 GB hard disk", partition and format it, and then be told by Windows that the disk is "27.94 gigabytes" and wonder "where the other 2 gigabytes went". Well, the disk is 27.94 gigabytes--27.94 binary gigabytes. The 2 gigabytes didn't go anywhere."
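If you want to check his arithmetic yourself, here's a quick Python sketch (my own, not anything from the PCGuide) that reproduces both the percentage discrepancies and the 27.94 figure:

```python
# Sanity-check the numbers quoted above. The unit names and variable
# names here are just my own choices for illustration.

UNITS = ["kilobyte", "megabyte", "gigabyte"]

for power, name in enumerate(UNITS, start=1):
    binary = 1024 ** power    # e.g. 1 binary gigabyte = 2^30 bytes
    decimal = 1000 ** power   # e.g. 1 decimal gigabyte = 10^9 bytes
    discrepancy = (binary / decimal - 1) * 100
    print(f"{name}: {discrepancy:.1f}% discrepancy")
    # kilobyte: 2.4%, megabyte: 4.9%, gigabyte: 7.4%

# The "30 GB hard disk" example: 30 decimal gigabytes of actual bytes,
# re-expressed in binary gigabytes the way Windows reports it.
bytes_on_disk = 30 * 10**9
binary_gb = bytes_on_disk / 2**30
print(f"30 decimal GB = {binary_gb:.2f} binary GB")  # 27.94
```

Run it and you get exactly the numbers from the quote, so this isn't some marketing trick Windows is exposing; it's just two different units.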
If you aren't familiar with Charles Kozierok, the guy who wrote the PC Guide: he has master's degrees in electrical engineering and computer science from MIT. He likely has a pretty good handle on what he's talking about here.
The only change that happened in the hard drive industry in the late '90s was that manufacturers had to start stating that they were basing their capacities on 1 MB being equal to 1 million bytes. Sort of like when CRT manufacturers had to start listing the viewable screen diagonal alongside whatever size they were claiming the monitor was.