Certain units are always understood as decimal even in computing contexts. Examples include hertz (Hz), used to measure the clock rates of electronic components, and bit/s, used to measure bit rate. Thus a 1 GHz processor performs 1,000,000,000 clock ticks per second, a 128 kbit/s MP3 stream consumes 128,000 bits (15.625 KiB) per second, and a 1 Mbit/s internet connection can transfer 1,000,000 bits (approx. 122 KiB) per second.
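The conversions above can be checked with a short calculation. The sketch below (function name is illustrative, not from any standard library) divides a decimal bit rate by 8 to get bytes, then by 1024 to get binary-prefixed KiB:

```python
def bitrate_to_kib_per_s(bitrate_bps: int) -> float:
    """Convert a decimal bit rate (bit/s) to KiB/s (1 KiB = 1024 bytes)."""
    return bitrate_bps / 8 / 1024

# A 128 kbit/s MP3 stream is 128,000 bit/s:
print(bitrate_to_kib_per_s(128_000))                # 15.625 KiB/s

# A 1 Mbit/s connection is 1,000,000 bit/s:
print(round(bitrate_to_kib_per_s(1_000_000), 2))    # approx. 122.07 KiB/s
```

This matches the figures in the text: 128,000 / 8 = 16,000 bytes, and 16,000 / 1024 = 15.625 KiB.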
Measurements of electronic memory such as RAM and ROM are given in binary units, because the physical structure of the device naturally yields capacities that are powers of two. This is the case whether the capacity is given in bits or bytes.
Hard disk drive manufacturers state capacity in decimal units, so what is advertised as a "30 GB" hard drive will hold 30 × 10⁹ bytes, roughly equal to 28 × 2³⁰ bytes (i.e. 28 GiB). This usage has a long engineering tradition, and was probably not influenced by marketing. It arose because nothing about the physical structure of a disk drive makes power-of-two capacities natural: the number of platters, tracks, and sectors per track are all continuously variable.
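The decimal-to-binary discrepancy for the "30 GB" drive can be verified directly; the value rounds to the "28 GiB" cited in the text:

```python
GB = 10**9    # decimal gigabyte, as used by drive manufacturers
GiB = 2**30   # binary gibibyte, as used by most operating systems

advertised_bytes = 30 * GB
print(advertised_bytes / GiB)   # approx. 27.94, i.e. roughly 28 GiB
```

The gap widens with each prefix step, since 10⁹/2³⁰ ≈ 0.931 but 10¹²/2⁴⁰ ≈ 0.909.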
Modern-day PC users, of course, regard both RAM and disk as kinds of storage and expect their capacities to be measured in the same way. Operating systems, however, usually report hard drive space using binary units. Thus, rather than reporting either "30 GB" or "28 GiB", Microsoft Windows describes the purchaser's "30 GB" hard drive as "28 GB". This discrepancy creates hard feelings, sometimes made worse by other technical issues such as the failure to distinguish between unformatted and formatted capacities.