  • Computers are complex systems, so a given machine's speed is difficult to capture in a single, easily compared value. As a result, several different measurements and methods are used to gauge a machine's speed and compare it with other machines.

    Megahertz / Gigahertz

    The most common description of a computer's speed is the clock speed of the processor: the number of clock cycles the processor completes per second. This figure is expressed in megahertz, meaning millions of cycles per second, or, on newer personal computers, gigahertz, meaning billions of cycles per second.
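
    To make the units concrete, here is a small Python sketch (an illustration only) that converts a gigahertz figure into raw clock cycles per second; the 3.2 GHz value is an invented example, not a number from the answer.

      # Rough illustration: converting clock-speed units to cycles per second.
      # The 3.2 GHz figure is an invented example value.
      clock_ghz = 3.2
      cycles_per_second = clock_ghz * 1_000_000_000   # 1 GHz = one billion cycles/s

      print(f"{clock_ghz} GHz = {cycles_per_second:,.0f} clock cycles per second")
      # 3.2 GHz = 3,200,000,000 clock cycles per second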

    The Megahertz Myth

    Despite the prevalence of this term, there are problems with the "speed equals megahertz" definition. Other aspects of central processing unit (CPU) design, including cache size, the instruction set, and how many instructions the processor completes per clock cycle, also affect overall speed, so two processors with the same clock rate can perform quite differently.
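
    As a rough, hypothetical illustration of why clock speed alone can mislead, the Python sketch below multiplies clock rate by instructions completed per clock cycle (IPC); both processors and their IPC figures are invented for the example.

      # Hypothetical comparison: instructions retired per second =
      # clock rate * instructions per cycle (IPC). All numbers are made up.
      cpu_a = {"clock_hz": 3.8e9, "ipc": 1.5}   # higher clock, lower IPC
      cpu_b = {"clock_hz": 3.0e9, "ipc": 2.5}   # lower clock, higher IPC

      for name, cpu in (("CPU A", cpu_a), ("CPU B", cpu_b)):
          throughput = cpu["clock_hz"] * cpu["ipc"]
          print(f"{name}: {throughput:.2e} instructions per second")
      # CPU B finishes more work per second despite its slower clock.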

    Benchmarks

    For this reason, the most accurate way to determine a computer's speed is to run a benchmark: measure the time the machine takes to complete a given task, often using a real application, and compare machines on that basis.
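
    A minimal Python sketch of that idea, assuming time.perf_counter is precise enough for the workload: it times a stand-in task several times and keeps the best run. The sample_task function is only a placeholder for whatever real application work you would actually benchmark.

      import time

      def benchmark(task, repeats=5):
          """Run `task` several times and return the best wall-clock time."""
          best = float("inf")
          for _ in range(repeats):
              start = time.perf_counter()
              task()
              best = min(best, time.perf_counter() - start)
          return best

      def sample_task():
          # Placeholder workload: sort a million integers in reverse order.
          sorted(range(1_000_000, 0, -1))

      print(f"Fastest run: {benchmark(sample_task):.3f} seconds")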

    FLOPS

    A common CPU benchmark is floating-point operations per second, or FLOPS. Rather than quoting a nominal hardware speed, this benchmark times how long the hardware takes to perform a set number of mathematical operations and reports the resulting rate.
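
    The Python sketch below shows the idea in a deliberately crude way: it times a known number of floating-point operations and divides to get an operations-per-second figure. Because the loop runs in interpreted Python, the result will be far below the hardware's true capability; it only illustrates how the metric is derived.

      import time

      # Crude FLOPS estimate: time a known number of floating-point operations.
      n = 5_000_000
      start = time.perf_counter()
      acc = 0.0
      for _ in range(n):
          acc = acc * 1.0000001 + 0.5   # one multiply and one add per iteration
      elapsed = time.perf_counter() - start

      flops = (2 * n) / elapsed   # two floating-point operations per iteration
      print(f"~{flops:.2e} floating-point operations per second")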

    Frames Per Second

    In gaming, a common benchmark is to measure the number of frames per second (FPS) a computer draws to the monitor in a given game, with every machine tested using the same game settings and video resolution.
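
    A simplified Python sketch of how such a frame counter might work is shown below; render_frame is a placeholder that merely sleeps, standing in for the real drawing work a game would perform.

      import time

      def render_frame():
          # Placeholder for real rendering work; pretend each frame takes ~4 ms.
          time.sleep(0.004)

      # Count how many frames are drawn inside a one-second window.
      frames, window_start = 0, time.perf_counter()
      while time.perf_counter() - window_start < 1.0:
          render_frame()
          frames += 1

      print(f"{frames} FPS")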

