Megabyte (MB)
The megabyte, symbolized as MB, equals 1,000,000 bytes in the decimal (SI) system, although in computing it is often treated as 1,048,576 bytes (1,024 × 1,024, the value the IEC standard designates a mebibyte, MiB). The term emerged in the 1960s as computers and storage capacities grew, making kilobytes insufficient for describing larger files. Megabytes are used for text, images, and small software applications, providing a human-readable scale for digital data capacity while bridging the metric and binary conventions.
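Because the two conventions give different values for the same byte count, a short sketch can make the arithmetic concrete. The snippet below converts a raw byte count into decimal megabytes and binary mebibytes; the function names and the sample file size are illustrative assumptions, not drawn from any particular tool.

```python
# A minimal sketch: converting a byte count to decimal megabytes (MB)
# and binary mebibytes (MiB). Function names are illustrative only.

def to_megabytes(num_bytes: int) -> float:
    """Decimal (SI) convention: 1 MB = 1,000,000 bytes."""
    return num_bytes / 1_000_000

def to_mebibytes(num_bytes: int) -> float:
    """Binary convention: 1 MiB = 1,048,576 bytes (1,024 * 1,024)."""
    return num_bytes / 1_048_576

if __name__ == "__main__":
    size = 5_242_880  # hypothetical file of 5 * 1,048,576 bytes
    print(f"{size} bytes = {to_megabytes(size):.2f} MB (decimal)")
    print(f"{size} bytes = {to_mebibytes(size):.2f} MiB (binary)")
```

For this 5,242,880-byte file the sketch reports 5.24 MB but exactly 5.00 MiB, which is why capacities quoted in decimal and binary units never quite agree.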
Bit (b)
The bit, symbol b, is the fundamental unit of digital information. Short for binary digit, the term first appeared in print in 1948 in Claude Shannon's foundational work on information theory. A bit represents either 0 or 1, forming the basis of modern computing and digital communication. Bits measure data-transfer rates, storage capacity, and the information content of computational processes. Although a single bit carries only a minimal amount of information, billions of bits combine to store files, images, and programs, making the bit essential in the digital era.
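To see how bits combine into larger units, the sketch below unpacks a single byte into its eight constituent bits; the helper name bits_of and the choice of the character 'A' are illustrative assumptions, not part of any standard API.

```python
# A minimal sketch: a bit is a single 0 or 1, and eight bits make one byte.
# Here the byte value for the character 'A' (65) is split into its bits.

def bits_of(byte_value: int) -> list[int]:
    """Return the 8 bits of a byte, most significant bit first."""
    return [(byte_value >> i) & 1 for i in range(7, -1, -1)]

if __name__ == "__main__":
    value = ord("A")  # the byte value 65
    print(f"'A' = {value} = bits {bits_of(value)}")
    # 65 = 0b01000001 -> [0, 1, 0, 0, 0, 0, 0, 1]
```

Eight such bits form one byte, so the 65 stored for 'A' is really the pattern 01000001, and every larger unit, from the kilobyte to the megabyte and beyond, is built by scaling this grouping up.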