4.1. BITS AND BYTES
Computers speak in a "code" called machine language, which uses only two numerals: 0 and 1. Different combinations of 0s and 1s form what are called binary numbers. These binary numbers form the instructions for the chips and microprocessors that drive computing devices such as computers, printers, hard disk drives, and so on.

You may have heard the terms "bit" and "byte." Both are units of information that are important to computing. The term bit is short for "binary digit." As the name suggests, a bit represents a single digit in a binary number; a bit is the smallest unit of information used in computing and can have a value of either 1 or 0. A byte consists of 8 bits. Almost all specifications of your computer's capabilities are expressed in bytes. For example, memory capacity, data-transfer rates, and data-storage capacity are all measured in bytes or multiples thereof (such as kilobytes, megabytes, or gigabytes).
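To make these relationships concrete, the short Python sketch below is an illustrative example only: the 4 GB module is hypothetical, and it assumes the common convention of 1,024 (2 to the 10th power) bytes per kilobyte, kilobytes per megabyte, and so on. It shows a single byte as a pattern of 8 bits and how capacity multiples build up from bytes.

    # Illustrative sketch: bits, bytes, and common capacity multiples.

    BITS_PER_BYTE = 8

    # A single byte holds 8 bits; each bit is either 0 or 1.
    value = 0b10110101            # binary literal: the bit pattern 1011 0101
    print(value)                  # 181 -- the same number written in decimal
    print(format(value, "08b"))   # '10110101' -- the byte shown bit by bit

    # Capacity multiples, assuming 1,024 (2**10) per step.
    KILOBYTE = 1024
    MEGABYTE = 1024 * KILOBYTE
    GIGABYTE = 1024 * MEGABYTE

    capacity_bytes = 4 * GIGABYTE              # e.g., a hypothetical 4 GB memory module
    print(capacity_bytes)                      # 4294967296 bytes
    print(capacity_bytes * BITS_PER_BYTE)      # 34359738368 bits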
This discussion of bits and bytes becomes especially relevant when computing devices and components work together. Here, we address specifically how bits and bytes form the basis for measuring the performance of memory components and their interaction with other devices, such as the CPU.