ENIAC, the world's first general-purpose electronic digital computer, unveiled in 1946.

The computer industry is a broad field, owing to its numerous inventions and rapid changes. Many scientists and inventors have contributed to the development of computer manufacturing, and many still do. As a result, the technology of this complex machine, which consists of many parts, each of which represents a special invention, advances very quickly.

The automatic calculation machine invented in the early 1800s by Charles Babbage, a mathematics professor, is widely regarded as the inspirational idea. The machine was steam-powered and could store up to 1,000 50-digit numbers; it was programmed by, and stored data on, cards with holes punched in them, known as punched cards. The first generation of computers, which loosely followed the operation of this calculation machine, used vacuum tubes for circuitry and magnetic drums for memory, and relied on machine language to perform operations. Input was based on punched cards and paper tape, and jobs for the computer were gathered up and processed in batch mode. The results were collected, in some cases, only after many hours or even days.

The first computer to use vacuum tubes (electronic valves) is generally held to be ENIAC (Electronic Numerical Integrator and Computer), designed in 1946, while in 1949 EDSAC (Electronic Delay Storage Automatic Calculator), one of the first stored-program computers, was constructed.

But these early computers were often enormous, taking up entire rooms. They were also very expensive to operate, consuming a great deal of electricity and generating a lot of heat, which often caused malfunctions. These computers were usually owned by large corporations, universities, and other institutions of similar size.

A more interactive use of computers developed with the invention of the microprocessor in the early 1970s (the world's first commercial microprocessor was the 4004, which Intel released in November 1971). Since then, multiple computer terminals have been developed that allow more than one user to work on the same machine.

Later came the "minicomputer", the precursor of the personal computer (PC), on which a single user had exclusive use of the processor. The name minicomputer came from the fact that manufacturers managed to shrink the machine: from a device that took up a whole room to one the size of a refrigerator. Minicomputers were, of course, still too expensive for individuals, but unlike the earlier machines they were accessible to smaller laboratories and research projects.

The rapid progress of technology played a major role in the evolution of minicomputers. From the mid-1970s, the addition of a high-resolution screen, a graphical user interface, large internal and external memory, and special software opened the way for the construction of personal computers that individuals could afford.

The microprocessor was invented in an attempt to shrink computers further: all the functions of a computer's processor became available on a single chip, which contributed to a dramatic decrease in the cost and size of the machine. Among the first machines built in this new form was the "Altair 8800" of 1974, followed by the "IMSAI 8800".

In 1976, the Kooro Manufacturing and Electronics Cooperative built, for government officials, a system with an integrated keyboard, a monochrome monitor, an 8-inch floppy disk drive, and 16 KB of RAM.

In the late 1970s, Steve Wozniak designed the Apple II, and the term "home computer" came into use. These computers were at first designed only for entertainment and educational purposes, and their technology was not as capable as that of the large machines. Gradually, "personal computers" appeared whose graphics and sound capabilities matched those of machines intended for business, and by the late 1980s they had largely replaced home computers. These computers were also pre-assembled, often pre-configured with bundled software, and required little technical knowledge to operate.

In 1981 the term "personal computer" was a trademark of IBM; a few years later, after the first "conflict" between IBM and Apple, the courts declared "personal computer" to be a generic term. Thus the IBM PC was born, as was the Macintosh (from Apple).

In the following years, microcomputers became a business tool, while capable word-processing programs such as Microsoft Word, which appeared for the Apple Macintosh in 1985, also became available for many home computers. During the 1990s, the power of personal computers increased dramatically.

Modern personal computers are normally operated by one user at a time to perform general tasks such as word processing, Internet browsing, e-mail and other digital messaging, multimedia playback, and computer programming; but, owing to the growth of the Internet and of networks, they are no longer the exclusive tools of their users in business.

Most modern personal computers are also upgradeable, especially desktop- and workstation-class machines. Components such as main memory, mass storage, and even the motherboard and central processing unit can be easily replaced. This upgradeability is not indefinite, however, owing to rapid changes in the personal computer industry: a PC considered top-of-the-line a few years ago may now be impractical to upgrade because of changes in industry standards.

Personal computers can be categorized by size and portability into desktop computers, laptops (notebooks), personal digital assistants (PDAs), portable computers, tablet computers, and wearable computers.