A Math Reminder
- The letter “G” (for “giga”) before a unit of measure means the value is multiplied by 10⁹ (one billion).
- The letter “T” (for “tera”) before a unit of measure means the value is multiplied by 10¹² (one million million).
- Therefore, 1 Tera = 1,000 Giga and 1 Giga = 1,000 Mega.
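The relationships in the reminder above can be checked with a short sketch (the variable names MEGA, GIGA, and TERA are illustrative, not standard identifiers):

```python
# Metric prefixes as plain multipliers
MEGA = 10**6   # 1 Mega = one million
GIGA = 10**9   # 1 Giga = one billion
TERA = 10**12  # 1 Tera = one million million

# 1 Tera = 1,000 Giga, and 1 Giga = 1,000 Mega
print(TERA // GIGA)  # → 1000
print(GIGA // MEGA)  # → 1000
```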
This is the speed at which a computer can run applications.
It is measured in hertz (abbreviation: Hz). As technology has advanced, processors have become so fast that their speeds are now given in MHz (megahertz) or GHz (gigahertz).
The higher its processing speed, the faster a computer will be able to run word processing applications (Word), music players, imaging software, etc.
For example, programs will respond faster on a computer with a 2 GHz processor than on one with a 1.66 GHz processor.
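To make the comparison concrete, here is a small sketch that converts the two example speeds from GHz to plain hertz, that is, clock cycles per second (the values are the ones from the example above):

```python
GIGA = 10**9  # 1 GHz = one billion cycles per second

speed_fast_hz = 2.0 * GIGA    # the 2 GHz processor
speed_slow_hz = 1.66 * GIGA   # the 1.66 GHz processor

# The faster processor completes more cycles every second
print(speed_fast_hz > speed_slow_hz)  # → True
```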