
Hardware

INTRODUCTION

Hardware refers to all of the physical parts of a computer system. For a traditional desktop computer this comprises the main system unit, a display screen, a keyboard, a mouse, and sometimes a printer. Speakers, a webcam and an external hard drive for back-up storage are often also included.
The following gives a basic overview of personal computer (PC) hardware, with the focus being on desktop computers. Inevitably, other sections of this website -- most notably those covering storage, mobile computing and networking -- also discuss particular areas of computer hardware and its application and specification. For a more technical hardware guide, see the excellent Introduction to Computer Hardware written by Howard Gilbert of Yale University. And if you are interested in the evolution of computing, you may like to read The History of the Microcomputer Revolution by Frank Delaney or this brief history of computing.

A DECREASING CONCERN

When the first microcomputers were introduced in the late 1970s, and in particular when the IBM PC was launched (in 1981 in the USA and 1983 in the UK), the computer industry was dominated by hardware. This was because most of the money spent on a computer system went on hardware, with a direct trade-off existing between processing power and overall system cost. The exact hardware specification was usually also critical. Today, however, neither of these points remains the case.
Since the turn of the century the cost of a typical desktop PC has fallen in both real and monetary terms. Almost all new computers are now also capable of performing most of the tasks that can be demanded of them, with the exact hardware specification being largely irrelevant for all but the most demanding or specialist users. Indeed, on the 23rd of August 2005, Intel declared the "clock frequency war" to be over, with the new computing mantra being performance per watt. Or to put it another way, no longer would the speed of a computer's processor be the primary measure of its capability in terms of either consumer expectation or the market dominance of its microprocessor manufacturer.
To a large extent, time was called on the clock frequency war because of the difficulties encountered in cooling microprocessors as they became faster and faster. However, another driver was simply that raw processing power was starting to become a secondary concern for many purchasers. By 2005, factors such as how much noise a computer makes, case style and size, and a computer's green credentials were starting to be perceived as important. And such non-processing-power measures are increasingly driving both consumer and business computer purchase decisions today.

THE BASIC TECHNICAL SPECIFICATIONS

Having said that the technical specification of a computer matters far less than it did even a few years ago, some understanding of a little hardware technobabble will still inevitably prove useful. Most obviously such knowledge is handy when purchasing or upgrading a computer and/or related peripherals to ensure that everything will connect together and work OK.
Decisions on hardware specification are often driven by the minimum hardware required to run specific software (such as a specific application program or operating system). Indeed, it remains most sensible for many users to decide on the software they want or need to run, and to choose or upgrade their hardware accordingly.
In broad terms, the performance of a computer depends on four factors: the speed and architecture of its processor or "central processing unit" (CPU), how much random access memory (RAM) it has, its graphics system, and its internal hard drive speed and capacity. Also of importance to most users will be the specification of its Internet connection. Most computer users -- and in particular those working with a lot of photographs, music files or videos -- should also think about the most suitable storage devices they will need in order to keep and back-up all of their valuable data.

PROCESSOR SPEED AND ARCHITECTURE

The speed of a computer's processor chip (technically known as its "clock speed") is measured in gigahertz (GHz), with the fastest modern processors currently running at up to 4.7GHz. However, for most computing tasks -- including web browsing, sending e-mails, word processing and spreadsheet work -- any processor running at 1GHz or more remains perfectly sufficient. (No really guys, it does!)
Where higher processor speeds become more important is for applications such as video editing, 3D graphics work and (for the majority of "power users") playing computer games! For any of these applications, within reason the faster the processor the better. This said, people in need of a very powerful computer have to be aware that CPU performance is now determined by far more than raw speed alone. Intel made this very clear when it introduced its system of processor numbers. These provide an indication of a processor's "architecture", "cache" and "front side bus (FSB) speed" in addition to its clock speed.
Alongside clock speed, the architecture of a processor is the most important factor in determining its performance, and refers to its basic design and complexity. Some processors are simply more sophisticated than others, with Intel (for example) producing "basic" processors called Celerons and Pentiums, as well as more powerful processors under its "Core" processor family. The latter include the Core 2, Core i3, Core i5 and Core i7, with the last of these being the most powerful.
All Intel Core processors feature more than one "core" -- or in other words more than one physical processor -- manufactured as a single component. Intel's "Core 2 Duo" chips, for example, feature two processor cores on a single chip, whilst "Core 2 Quad" processors have four processor cores. In most situations multi-core processors are far more powerful than traditional single core processors. Quite literally this is because they can do several things at the same time (something single core processors can only achieve by constantly switching back and forth between doing one thing and doing another). In turn this means that multi-core processors can run at lower speeds than single-core processors and yet be far more powerful. A 2.4GHz Core 2 processor, for example, usually proves far more productive than a single core 3GHz Pentium processor. All of this hopefully makes it clear why clock speed by itself is no longer a straightforward indicator of processor power, with the architecture of the processor -- and most notably its number of cores -- now being at least as significant.
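As a rough illustration of why multiple cores help, the short Python sketch below (not from the original text, and only indicative) runs the same CPU-bound workload first on a single core and then spread across every available core using the standard multiprocessing module. On a quad-core machine the parallel run typically completes around three to four times faster, which is exactly the effect that lets a lower-clocked multi-core chip outperform a faster single-core one.

import os
import time
from multiprocessing import Pool

def busy_work(n):
    # A deliberately CPU-bound task: sum the squares of the first n integers.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    tasks = [5_000_000] * 8                                  # eight identical chunks of work

    start = time.perf_counter()
    serial_results = [busy_work(n) for n in tasks]           # one core, one task at a time
    serial_time = time.perf_counter() - start

    start = time.perf_counter()
    with Pool(os.cpu_count()) as pool:                       # one worker process per core
        parallel_results = pool.map(busy_work, tasks)        # tasks run simultaneously
    parallel_time = time.perf_counter() - start

    print(f"Cores available: {os.cpu_count()}")
    print(f"Serial run:   {serial_time:.2f} seconds")
    print(f"Parallel run: {parallel_time:.2f} seconds")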
Intel Celeron, Pentium and Core processors are today all to be found at the heart of new desktop and laptop PCs. Intel additionally manufactures very-high-specification chips named Xeons and Itaniums to drive the most powerful business workstations and servers. If this range of choice all sounds a bit confusing then to be honest it is -- with Intel itself having resorted to a range of processor selection wizards in an effort to explain its processor ranges on its own website.
To add further to Intel's abundance of processor choice and complexity, the company also offers a range of low-power processors called Atoms. These are highly energy efficient, and were first intended primarily for use in mobile computers such as netbooks. However, today the latest dual core Atom processors are increasingly finding their way into highly energy-efficient desktop computers. For many people a computer with the latest 1.66GHz or 1.8GHz dual core Atom processor will be capable of undertaking any computing task they require, and probably at least four times more energy efficiently than a Celeron, Pentium or Intel Core based computer. You can watch me construct a dual core Atom-based computer in the following video:






In addition to clock speed and architecture, a processor's cache and front side bus (FSB) speed also determine a computer's overall power. In brief, cache is a form of very fast memory integrated into the processor chip, and used to store up instructions (work for the processor) so that it has to slow down as little as possible between tasks. Cache is measured in megabytes (MB), with (for example) low-end Celeron processors having as little as 0.25MB of cache (256KB), and high-end Itaniums having up to 24MB. The simple message is, the more cache the better -- though high levels of cache still come at a very significant price.
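The benefit of cache comes from the fact that data already sitting close to the processor can be read far more quickly than data fetched from main memory. The Python sketch below (an illustration using the NumPy library, which is not mentioned in the text above, and whose exact timings will vary from machine to machine) sums one million numbers stored next to each other in memory and then one million numbers spaced 512 bytes apart; the second, cache-unfriendly pattern is usually noticeably slower even though the amount of arithmetic is identical.

import time
import numpy as np

# An array far larger than any CPU cache (~512 MB of 64-bit floats).
a = np.ones(64_000_000, dtype=np.float64)

def timed_sum(label, view):
    start = time.perf_counter()
    total = view.sum()
    print(f"{label}: {time.perf_counter() - start:.4f} s (sum = {total:.0f})")

# Both views contain exactly 1,000,000 elements.
timed_sum("Cache-friendly (contiguous)", a[:1_000_000])   # neighbouring values
timed_sum("Cache-unfriendly (strided)  ", a[::64])        # values 512 bytes apart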
Front side bus (FSB) speed is a measure of how fast a microprocessor communicates with the computer's main circuit board (or "motherboard") into which it is physically connected. Again, the higher the measure the better for overall performance, with FSB speeds currently ranging from 533MHz (still perfectly sufficient for the vast majority of applications) up to 1600MHz.
NOTE: Whilst the examples in the above section all refer to Intel microprocessors, it should be noted that the PC processor market is dominated by both Intel (with about 80 per cent market share) and its main rival AMD. AMD's low specification processors are called Semprons, its mid-range chips called Athlons, and its high-end chips called Phenoms and A-Series.

RAM

RAM -- or "random access memory" -- is the temporary storage space that a computer loads software applications and user data into when it is running. All current RAM technologies are "volatile", which means that everything held in RAM is lost when a computer's power is removed. To a large extent, the more RAM a computer has the faster and more effectively it will operate. Computers with little RAM have to keep moving data to and from their hard disks in order to keep running. This tends to make them not just slow in general but, more annoyingly, intermittently sluggish.
[The above all said, those hoping to speed up their PC by installing more RAM need to note that any PC with a 32 bit operating system can only access a maximum of 4GB of RAM. Add more, and the PC simply will not recognise it. In practice this means that the vast majority of PCs in use and being sold today cannot benefit from more than 4GB of RAM -- and this includes many PCs running Windows 7 (which is very widely sold in its 32 rather than 64 bit format to maximise compatibility with older software and peripherals).]
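The 4GB ceiling is not an arbitrary figure: a 32 bit processor can only form 2 to the power of 32 distinct memory addresses, and with one byte stored per address that works out at exactly 4GB. The quick Python calculation below shows the arithmetic, along with the vastly larger limit of a 64 bit address space.

# A 32 bit CPU can form 2**32 distinct addresses, each identifying one byte.
addressable_32 = 2 ** 32
print(f"32-bit address space: {addressable_32:,} bytes")     # 4,294,967,296 bytes
print(f"                    = {addressable_32 // 2**30} GB")  # 4 GB

# A 64 bit address space is 2**64 bytes -- around 17 billion GB.
print(f"64-bit address space: {2**64 // 2**30:,} GB")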
RAM is measured in megabytes (MB) and gigabytes (GB), as detailed on the storage page. Just how much RAM a computer needs depends on the software it is required to run effectively. A computer running Windows XP will usually function quite happily with 1GB of RAM, whereas twice this amount (ie 2GB) is the realistic minimum for computers running Windows 7. Most mobile computers feature far less RAM, and indeed even desktop computers running smaller operating systems (such as some versions of Linux or Windows 98) can run very effectively with as little as 128MB of RAM in certain situations.

GRAPHICS SYSTEM

A computer's graphics system determines how well it can work with visual output. Graphics systems can either be integrated into a computer's motherboard, or plugged into the motherboard as a separate "video card". Graphics systems integrated into the motherboard (also known as "onboard graphics") are now quite powerful, and sufficient for handling the requirements of most software applications aside from games playing, 3D modelling, and some forms of video editing.
Any form of modern computer graphics system can now display high-resolution colour images on a standard-sized display screen (ie any monitor up to about 19" in size). What the more sophisticated graphics cards now determine is how well a computer can handle the playback of high definition video, as well as the speed and quality at which 3D scenes (including games!) can be rendered. Another key feature of separate graphics cards is that most of them now allow more than one display screen to be connected to a computer. Others also permit the recording of video.
In effect, modern graphics cards have become dedicated computers in their own right, with their own processor chips and RAM dedicated to video decoding and 3D rendering. Hardly surprisingly, when it comes to final performance, the more RAM and the faster and more sophisticated the processor available on a graphics card the better. This said, top-end graphics cards can cost up to a few thousand dollars or pounds.
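To give a feel for why graphics cards carry their own RAM, the sketch below works out how much memory a single uncompressed frame occupies at a few common screen resolutions (assuming 32 bit colour, i.e. 4 bytes per pixel), and the data rate needed to redraw that frame 60 times a second. The resolutions and refresh rate are simply illustrative assumptions.

# Memory for one uncompressed frame at 32-bit colour, and the bandwidth
# needed to redraw it 60 times per second.
BYTES_PER_PIXEL = 4
FRAMES_PER_SECOND = 60

for width, height in [(1024, 768), (1920, 1080), (2560, 1600)]:
    frame_mb = width * height * BYTES_PER_PIXEL / 2**20
    bandwidth_mb_s = frame_mb * FRAMES_PER_SECOND
    print(f"{width}x{height}: {frame_mb:5.1f} MB per frame, "
          f"{bandwidth_mb_s:7.1f} MB/s at {FRAMES_PER_SECOND} fps")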
As a basic rule, unless a computer is going to be used to handle 3D graphics or to undertake a significant volume of video editing or recording, today there is little point in opting for anything other than onboard graphics (not least because separate graphics cards consume quite a lot of electricity and create quite a lot of heat and noise). Adding a new graphics card to a computer with onboard graphics is also a very easy upgrade if required in the future.
Graphics cards connect to what is known as either a "PCI Express" or an "AGP" slot on a computer's motherboard. PCI Express is the more powerful and modern standard, with the best graphics cards requiring the use of two PCI Express slots. A PC being upgraded from onboard graphics sometimes also requires an upgraded power supply if it is to continue to run in a stable fashion.

HARD DRIVE SPEED AND CAPACITY

Hard disk drives are the high capacity storage devices inside a computer from which software and user data are loaded. As with most other modern storage devices, the capacity of the one or more internal hard disks inside a computer is measured in gigabytes (GB), as detailed on the storage page. Today 40GB is an absolute minimum hard drive size for a new computer running Windows 7, with a far larger capacity being recommended in any situation where more than office software is going to be installed. Where a computer will frequently be used to edit video, a second internal hard disk dedicated only to video storage is highly recommended for stable operation. Indeed, for professional video editing using a program like Premiere Pro CS5, Adobe now recommend that a PC has at least three internal hard disks (one for the operating system and programs, one for video project files, and one for video media). This is also not advice to be lightly ignored if you want your computer to actually work!
Most computers are configured to use a proportion of their internal hard disk to store temporary files. Such a "swap file" enables the computer to operate effectively, and means that some free hard disk space always needs to be available for a computer to run properly. However, providing that a hard disk is large enough to store the required software and user data without getting beyond about 80 per cent full, hard disk capacity will have no impact on overall system performance. What does impact significantly on overall system performance is the speed of a computer's main internal hard disk. This is simply because the longer it takes to read software and data from the disk, and to access temporary files, the slower the computer will run.
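Because performance suffers as a disk approaches being full, it is worth keeping an eye on how much free space remains. Below is a minimal Python sketch using the standard shutil module; the 80 per cent threshold simply mirrors the rule of thumb mentioned above.

import shutil

# Check how full the drive holding the root directory (or C:\ on Windows) is.
usage = shutil.disk_usage("/")
percent_used = usage.used / usage.total * 100

print(f"Total: {usage.total / 2**30:.1f} GB")
print(f"Used:  {usage.used / 2**30:.1f} GB ({percent_used:.0f}%)")
print(f"Free:  {usage.free / 2**30:.1f} GB")

if percent_used > 80:
    print("Warning: disk is over 80% full -- performance may start to suffer.")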
Two key factors determine the speed of traditional, spinning hard disks. The first is the rotational velocity of the physical disk itself. This can currently be 4200, 5400, 7200, 10000 or 15000 rpm (revolutions per minute). The faster the disk spins, the quicker data can be read from or written to it, hence the faster the disk the better (although faster disks consume more power, make more noise, and generate more heat). Most desktop hard disks run at either 5400 or 7200 rpm, whilst most laptop hard disks run at 4200 or 5400 rpm. However, upgrading to a 10000 or 15000 rpm disk -- such as a Velociraptor from Western Digital -- can prove one of the most cost-effective upgrades for increasing the performance and responsiveness of a desktop computer.
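The link between spindle speed and responsiveness is easy to quantify: on average the drive must wait half a revolution for the required sector to pass under the read/write head, so average rotational latency is simply half the time of one revolution. The short calculation below (a worked example, not a figure from the text above) shows the result for each of the speeds just mentioned.

# Average rotational latency = time taken for half a revolution.
for rpm in (4200, 5400, 7200, 10000, 15000):
    revolutions_per_second = rpm / 60
    latency_ms = 0.5 / revolutions_per_second * 1000
    print(f"{rpm:6d} rpm: {latency_ms:.1f} ms average rotational latency")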
The second key factor that determines the performance of a traditional, internal hard disk is the interface used to connect it to the computer's motherboard. Three types of interface exist: SATA, which is the most modern and now pretty much the norm on new PCs; IDE (also known as UDMA), which is a slower and older form of interface; and finally SCSI, which happens to be the oldest but in its most modern variant is still the fastest disk interface standard. This said, SCSI is now all but redundant in desktop computing since the introduction of SATA, as SATA provides a fairly high speed interface at much lower cost and complexity than SCSI.
The above points all noted, for users seeking ultimate performance, there is now the option of installing a computer's operating system, programs and data on a solid state drive (SSD), rather than a traditional, spinning hard disk. SSDs are far faster and more energy efficient than traditional, spinning hard disks, which in time they will largely replace. This said, at present SSDs are still a lot more expensive than traditional spinning hard disks in terms of cost-per-gigabyte. You can learn more about SSDs on the storage page and/or in the following video:





INPUT DEVICES

Whilst the specification of the components within a computer's system case does matter, today of far more importance to most users is the range of computer peripherals they have available -- or in other words the input and output hardware that allows them to interface with the digital world. Over the past five years in particular, what has mattered most for the majority of the population has been the quite staggering changes that have taken place in the ways in which individuals can now create, output and work with computer data. This section and the next therefore provide a very brief summary of computing input and output devices. You can also find a more conceptual overview of the development and integration of computers into the physical world in the Second Digital Revolution section of ExplainingTheFuture.com.
Keyboards remain the dominant means of getting most textual and numeric data into a computer. Computer keyboards have also changed relatively little over the past couple of decades. Those developments that have taken place tend to have involved the inclusion of more and more special function keys, wireless technologies, and improvements to assist with display screen equipment health and safety regulations. Early IBM PC keyboards, for example, whilst being extremely robust, had such solid keyboard switches that many people who typed on them all day soon developed repetitive strain injury problems. In contrast, modern keyboards (designed for typists rather than engineers who do not spend all day typing) require a far lighter touch.
Alongside keyboards, mice and pointing devices are the other dominant form of computer input device. The first mouse was made out of wood at the Stanford Research Institute in the 1960s, with its history now detailed on their MouseSite. The basic principle of moving around a small, buttoned device to in turn position and select with a pointer on a computer screen also remains unchanged to this day. What has changed is the variety of "rodents" now available. Many are now wireless (and hence I guess technically "hamsters"), whilst others have evolved into pads or trackballs built into laptop computers.
For accurate graphics work such as photo retouching, graphics tablets are now the choice of many, with a pen or other tool being used on a special surface (the absolute market leader in this area being Wacom). Many mobile computing devices now also feature a touchscreen, which allows a computer to be controlled with a pen or a finger placed directly on the display. Touchscreens are now included on almost all smartphones and tablet computers, as well as many point of sale systems, and Wacom's lovely Cintiq.
Webcams and digital cameras have over the past ten years also significantly expanded the way that a great many people work with and think about computers. Digital photography is now commonplace, with the uploading of images onto a PC for e-mailing, sharing over the web, or printout, now the norm. I remember in the late 1990s the manager of the largest chain of photo processing shops in the UK telling me that digital photography would have no real impact on their business. Oh how wrong he was!
All forms of digital camera continue to converge. Webcams remain devices primarily designed for capturing movies directly into a PC (perhaps for upload to YouTube), or to enable desktop videoconferencing. However, many digital stills cameras can also be used as webcams. Many digital stills cameras can in addition capture video clips, whilst many digital video cameras can take still photos. Mobile phones of course also share these capabilities. Developments like Microsoft's surface computer will also make it easier and easier to share both still and moving images between computers and all kinds of mobile devices in the future.
Alongside cameras, low cost scanners have also allowed millions of us to easily capture documents and images directly into a computer. In turn, scanners are now converging with printers -- with multi-function devices (MFDs) now commonly including a printer, scanner, photocopier and sometimes fax machine. Used with optical character recognition (OCR) software, scanners also permit the capture not just of images, but of editable text.
Finally on the input side, microphones and audio recorders are now commonly used for digital data capture. Microphones are obviously needed to permit online audio and video conferencing. However, the use of portable audio recorders is now also on the increase. These can capture sound in a range of formats, including MP3, WAV, and (for broadcast film and video purposes) BWF. Fostex is widely regarded as the leading manufacturer of high-end digital audio recording hardware, although I personally prefer hardware from Tascam!

OUTPUT DEVICES

Display screens remain the dominant form of computing output peripheral, with most new modern desktop displays being flat panels of between 15" and 19" in diagonal. However, far more bulky cathode ray tube (CRT) monitors are still favoured by some in high-end graphics work where absolute colour control is required. For other types of visual output (especially in education and training) video projectors are now also in widespread use. Whilst most flat computer displays are currently based on TFT (thin-film transistor) LCD (liquid crystal display) technology, over the next decade these are likely to be replaced with OLED (organic light emitting diode) screens as already used on some mobile phones and media players. [Note that OLED screens should not be confused with LED-backlit LCD screens, however much some manufacturers try to confuse potential buyers otherwise. LED-backlit screens are very nice indeed. OLED screens are amazing, if currently incredibly expensive.]
With the dream of the paperless office still just that, printers are of course the other dominant type of computer output hardware. Most printers these days are either laser (where toner is fused to the paper via a heated drum) or inkjet (where ink is sprayed onto the paper). Both of these technologies now offer high quality colour output, if with inkjet technologies still just having the edge for photo printing. Inkjet printers are more expensive to run than laser printers, but cheaper to buy (largely because most inkjet manufacturers discount the selling price of the hardware to make their money back on selling ink cartridges and photo paper).
Today, many printers have an integrated scanner and fax machine, and are hence sometimes referred to as multi-function devices (MFDs) capable of printing, scanning, photocopying and sending a fax.
Printing is also no longer just a flat, 2D process. Whilst yet to become a domestic desktop technology, 3D printers are now available from companies such as 3D Systems, Solidscape and ZCorp. These allow a computer to output a physical 3D object in various plastics, resins or other materials, or even to print organic tissue! For more information on 3D printing, please see the 3D printing and bioprinting sections on ExplainingTheFuture.com.






Other computer output hardware includes devices such as speakers (which can cost from a few pounds to several hundred) as well as iPods and other music players that millions of people now use to extract music from their PC to listen to elsewhere. As with digital cameras (some of which are also music players!), in terms of a paradigm shift this is highly significant, in that a personal computer is rapidly becoming a "digital hub" into which many of our most used hardware devices are only ever temporarily connected. In turn, one could argue that our computers are increasingly with us all of the time in the form of those hardware devices that travel with us, but which functionally depend on at least an occasional interaction with a PC and often a website.

CONNECTION TECHNOLOGIES

USB Ports are now nearly universal. Devices including printers, modems, scanners, digital cameras and a great many storage devices now connect via USB (the "universal serial bus", first introduced in 1996). USB currently comes in three standards -- USB 1.1, USB 2.0 and USB 3.0.
USB 1.1 ports are now only found on older computers, and can transfer data at 12 Mbps (megabits per second). USB 2.0 ports are by far the most common, and are ten times faster at 480 Mbps. However, recently USB 3.0 has been introduced, with a theoretical maximum data transfer speed of 4,800 Mbps, or 4.8 Gbps (gigabits per second).
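Those raw figures translate into very different real-world waits. The sketch below estimates the theoretical minimum time needed to move a 1GB file at each standard's maximum signalling rate; actual transfers are slower because of protocol overheads, so treat the numbers as best-case illustrations only.

# Theoretical minimum time to transfer a 1 GB file (here taken as
# 1024 megabytes = 8192 megabits) at each USB standard's maximum rate.
FILE_MEGABITS = 1024 * 8

for standard, mbps in [("USB 1.1", 12), ("USB 2.0", 480), ("USB 3.0", 4800)]:
    seconds = FILE_MEGABITS / mbps
    print(f"{standard}: {seconds:7.1f} seconds")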
Physically USB 1.1, USB 2.0 and USB 3.0 ports appear pretty much identical. However, USB 3.0 connectors have an extra set of pins to accommodate higher speed data transfers. These are positioned in the back of a standard "Type A" plug (and hence in the front of a standard Type A socket). All other USB 3.0 connectors have been redesigned with larger plugs and sockets to accommodate the extra set of pins. This means that USB 3.0 cables cannot be used with USB 2.0 and USB 1.1 peripherals, although USB 2.0 cables can be used with USB 3.0 devices, if at USB 2.0 speeds. For ease of identification, USB 3.0 cables and connectors are colour-coded bright blue. You can learn more about USB 3.0 in the following video:





VGA or DVI ports are used to connect display screens to computers. DVI is the more modern standard. However, adapters exist to allow VGA screens to connect to DVI ports and vice-versa, so in practical terms when purchasing a screen all that really matters is that you get the right cable. This said, a DVI connection should be used if possible when running a high resolution display (a 19" or greater monitor at anything above 1024x768 resolution) for the sharpest and most stable picture. This is because the VGA port is an old analogue standard and was never intended for today's high display resolutions (even though it often works reasonably well!).
Some printers still connect to a computer's parallel port. This is starting to cause problems as many new computers do not have a parallel port! So this is something to check if you are purchasing a new PC and don't want to change your printer.
Finally in common use are FireWire and eSATA ports. FireWire ports (also known as i.LINK or IEEE 1394 ports for various legal reasons) are most commonly used to connect digital cameras and external hard disks to computers. eSATA ports are used to connect external hard disks or other external storage devices, such as DVD writers.

ONLINE HARDWARE

High-speed Internet connectivity and cloud computing have recently given rise to a new phenomenon in which hardware resources are delivered over the Internet. This development was initially known as "Hardware as a Service" or HaaS. However, this term has now been sub-divided in common usage into Platform as a Service or PaaS, and Infrastructure as a Service or IaaS.
Several key cloud computing vendors now offer computing processing capacity and data storage online. Amazon, for example, now have an IaaS offering called Elastic Compute Cloud or EC2. This allows users to purchase computer processing power online from Amazon. Such online hardware capacity is purchased in "instances", with each instance having its own defined amount of processing power, memory and storage. For example, an EC2 "small instance" currently comprises 1.7 GB of memory, 1 EC2 Compute Unit (1 virtual core with 1 EC2 Compute Unit), and 160 GB of storage. Computing instances are charged by the instance hour consumed, with data transfer charged by the GB.
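Because instances are billed by the hour and data transfer by the GB, estimating a monthly bill is straightforward arithmetic. The Python sketch below uses purely hypothetical rates (a made-up US$0.09 per small-instance hour and US$0.10 per GB transferred); Amazon's actual pricing changes over time and should always be checked on its website.

# Hypothetical EC2-style billing estimate -- the rates below are illustrative
# assumptions only, not Amazon's actual prices.
RATE_PER_INSTANCE_HOUR = 0.09    # assumed US$ per small-instance hour
RATE_PER_GB_TRANSFER = 0.10      # assumed US$ per GB transferred out

instances = 2                    # two small instances running continuously
hours_in_month = 24 * 30
gb_transferred = 150             # outbound data for the month

compute_cost = instances * hours_in_month * RATE_PER_INSTANCE_HOUR
transfer_cost = gb_transferred * RATE_PER_GB_TRANSFER

print(f"Compute:  ${compute_cost:8.2f}")    # 2 x 720 x 0.09 = $129.60
print(f"Transfer: ${transfer_cost:8.2f}")   # 150 x 0.10 = $15.00
print(f"Total:    ${compute_cost + transfer_cost:8.2f}")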
Renting online hardware can offer many advantages. Amazon, for example, highlight how EC2 is elastic (it allows users to increase or decrease their requirements within minutes), flexible (users can choose the specification of each individual instance of computing power), inexpensive (no dedicated capital investment is required) and reliable (EC2 makes use of Amazon's proven data centres and network infrastructure).
For many businesses, online hardware is likely to be the future. For more information, please see the cloud computing page and/or the Cloud Computing Directory for a list of PaaS and IaaS providers.

SUMMARY

Unless you are a games player, 3D graphics artist or professional video editor, you will probably find that any modern personal computer will be adequate for your requirements. The input and output devices you require, as well as the software you wish and need to run, should therefore primarily drive your hardware needs. So do try to be wary of sales people trying to flog you hardware of a specification you will not use (such as a PC with a Core i7 processor for accessing the web and running office applications).
Finally, please note that some key hardware issues have not been covered here, as they are included on the pages for storage (which details back-up devices), networking (which includes information on wired and wireless networking), the Internet (which details broadband and how to get online) and green computing (which includes coverage of lower power hardware). In fact, every other section of this website aside from the Web 2.0 pages contains some additional information on hardware use or specification. As stated at the start of this section, computer hardware may in many ways no longer matter as much as it did in the past. However, equally it has come to provide the infrastructural backbone for so many human activities that it can now also no longer be completely ignored.
