Terabyte (TB)
The terabyte, symbolized as TB, equals 1,000,000,000,000 bytes (10^12) under the decimal (SI) convention; its binary counterpart, the tebibyte (TiB), equals 2^40 bytes, or exactly 1,024 GiB. The unit came into common use in the 1990s as storage systems, databases, and servers grew exponentially. Terabytes measure hard drives, enterprise storage, data centers, and cloud services. With the rise of big data, video streaming, and high-resolution imaging, the terabyte has become a practical unit for both consumers and professionals, expressing massive digital storage in manageable terms.
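The decimal/binary distinction also explains why a drive marketed as 1 TB appears smaller when an operating system reports its capacity in binary units. A minimal Python sketch, for illustration only; the unit values themselves are exact:

    TB = 10**12   # terabyte: 1,000,000,000,000 bytes (decimal, SI)
    TiB = 2**40   # tebibyte: 1,099,511,627,776 bytes (binary, IEC)

    # A drive sold as "1 TB" holds about 0.909 TiB, which is why
    # systems that count in binary units report less free space.
    print(f"1 TB  = {TB:,} bytes")
    print(f"1 TiB = {TiB:,} bytes")
    print(f"1 TB  = {TB / TiB:.3f} TiB")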
Bit (b)
The bit, symbol b, is the fundamental unit of digital information. Short for binary digit, the term was coined by John W. Tukey and first appeared in print in 1948 in Claude Shannon's foundational paper on information theory. A bit represents either 0 or 1, forming the basis of modern computing and digital communication. Bits measure data transfer rates, storage efficiency, and computational processes. Though individually tiny, bits combine by the billions to store files, images, and programs, making the bit essential in the digital era.
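Because every higher-level value is built from bits, eight of them suffice to encode one ASCII character. A minimal Python sketch; the binary literal and character chosen here are arbitrary examples:

    byte = 0b01000001   # eight bits: one byte (decimal 65)

    # Extract each bit, most significant first.
    bits = [(byte >> i) & 1 for i in range(7, -1, -1)]
    print(bits)         # [0, 1, 0, 0, 0, 0, 0, 1]

    # Eight bits are enough to encode one ASCII character.
    print(chr(byte))    # 'A'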