Say what you will, but a Megabyte is universally accepted to mean 1024 Kilobytes, each of which is 1024 Bytes. Which means there are 1,048,576 Bytes in a Megabyte. You can keep your Mebibytes, and shove it.

Agreed. Just because marketing really cocked everything up doesn't mean binary bases should be changed to decimal. 1024 Bytes in a Kilobyte, 1024 Kilobytes in a Megabyte, 1024 Megabytes in a Gigabyte, and 1024 Gigabytes in a Terabyte.

You can say this, but it is confusing and error-prone to have different values for the same prefix. That's why the IEC introduced the binary prefixes.

An example: the PCI bus runs at 33 MHz and is 32 bits wide. What is the theoretical maximum bandwidth? Well, 33 MHz * 4 bytes makes 132 MB/sec. True? Yes, but this is a decimal Mega, because the Mega in MHz is decimal. The answer that makes more sense is 125.9 MiB/sec. A difference of 4.8% (see the sketch below).

Another point: how much time will it take to download a 2 GB ISO over a 4 Mbit line? Well, that is... wait, the 4 Mbit, is that 4*10^6 or 4*2^20? How should I know?
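Here's the PCI arithmetic spelled out as a minimal Python sketch (the variable names are just for illustration); the only thing that changes between the two answers is whether you divide by 10^6 or 2^20:

# PCI bus: 33 MHz clock (decimal Mega) x 32-bit (4-byte) wide bus
clock_hz = 33 * 10**6              # the Mega in MHz is always decimal
bytes_per_transfer = 4             # 32 bits = 4 bytes

bandwidth = clock_hz * bytes_per_transfer    # 132,000,000 bytes/sec

print(f"{bandwidth / 10**6:.1f} MB/sec")     # 132.0  (decimal Megabyte)
print(f"{bandwidth / 2**20:.1f} MiB/sec")    # 125.9  (binary Mebibyte)
# the two prefixes differ by a factor of 2**20 / 10**6 = 1.048576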
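And the download question, trying both readings of "4 Mbit" side by side. This assumes the "2 GB" ISO is meant the binary way (2 GiB), as the posters above would insist; the point is only how far apart the two interpretations of the line speed land:

iso_bytes = 2 * 2**30                        # "2 GB" read the binary way: 2 GiB

for label, bits_per_sec in (("4 * 10^6 bit/sec", 4 * 10**6),
                            ("4 * 2^20 bit/sec", 4 * 2**20)):
    seconds = iso_bytes / (bits_per_sec / 8)  # 8 bits per byte
    print(f"{label}: {seconds / 60:.1f} minutes")

That prints roughly 71.6 minutes for the decimal reading versus 68.3 minutes for the binary one, which is exactly the kind of ambiguity the IEC prefixes were meant to kill.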