Monday, May 4, 2009

History of the Microprocessor

A microprocessor is a single chip that integrates all the functions of a computer's central processing unit (CPU). It handles the logical functions, data storage, timing functions and interaction with peripheral devices. In some cases, the terms 'CPU' and 'microprocessor' are used interchangeably to denote the same device. Like every genuine engineering marvel, the microprocessor has evolved through a series of improvements over the 20th century. A brief history of the device, along with its functioning, is described below.

Working of a Microprocessor

The central processing unit coordinates all the functions of a computer. It generates timing signals and sends and receives data to and from every peripheral used inside or outside the computer. The instructions required to do this are fed into the device as electrical signals, which are decoded into meaningful operations using Boolean logic. Its work falls into two categories: arithmetic and logical operations, handled by the arithmetic and logic unit (ALU), and control and timing operations, handled by the control unit. Information travels over groups of wires called buses. The address bus carries the 'address' of the location with which communication is desired, while the data bus carries the data being exchanged.
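
To make the bus picture concrete, here is a minimal sketch in C of how an address bus selects a destination while the data bus carries the value. This is an illustration only, not how any particular chip works; the memory map, the uart_status register and the 16-bit/8-bit bus widths are assumptions invented for this example.

```c
/* Toy model of address/data bus traffic: the "CPU" places an address on
 * the address bus; an address decoder routes the access either to RAM or
 * to a memory-mapped peripheral, and the result returns on the data bus. */
#include <stdio.h>
#include <stdint.h>

static uint8_t ram[0x8000];         /* addresses 0x0000-0x7FFF          */
static uint8_t uart_status = 0x01;  /* a fake peripheral register       */

/* One bus read cycle: 16-bit address in, 8-bit data out. */
static uint8_t bus_read(uint16_t addr) {
    if (addr < 0x8000)
        return ram[addr];           /* decoder selects RAM              */
    return uart_status;             /* decoder selects the peripheral   */
}

static void bus_write(uint16_t addr, uint8_t data) {
    if (addr < 0x8000)
        ram[addr] = data;
}

int main(void) {
    bus_write(0x0042, 0x7F);        /* store a byte in RAM              */
    printf("RAM[0x0042] = 0x%02X\n", bus_read(0x0042));
    printf("UART status = 0x%02X\n", bus_read(0x8000));
    return 0;
}
```

The branch inside bus_read plays the role of the address decoder: the same read cycle reaches RAM or a peripheral depending purely on the address placed on the bus.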

Types of Microprocessors

There are different ways in which microprocessors are categorized. Based on their architecture, they are:
  • CISC (Complex Instruction Set Computers)
  • RISC (Reduced Instruction Set Computers)
  • VLIW (Very Long Instruction Word Computers)
  • Superscalar processors
Other types of specialized processors are:
  • General Purpose Processor (GPP)
  • Special Purpose Processor (SPP)
  • Application-Specific Integrated Circuit (ASIC)
  • Digital Signal Processor (DSP)
History and Evolution of Microprocessors

The invention of the transistor in 1947 was a significant development in the world of technology. It could perform the function of the bulky vacuum tubes used in early computers. Shockley, Brattain and Bardeen are credited with this invention and were awarded the Nobel Prize for it. Soon it was found that the function of a large circuit could easily be performed by a group of transistors arranged on a single platform. This platform, known as the integrated circuit (IC), turned out to be a crucial achievement and brought along a revolution in the use of computers. Jack Kilby of Texas Instruments was honored with the Nobel Prize for the invention of the IC, which laid the foundation on which microprocessors were developed. At the same time, Robert Noyce of Fairchild made a parallel development in IC technology, for which he was awarded the patent.

ICs proved beyond doubt that complex functions could be integrated on a single chip with high speed and storage capacity. Both Fairchild and Texas Instruments began manufacturing commercial ICs in 1961. Later developments led to the integration of more and more complex functions on a single chip. The stage was set for a single controlling circuit for all the computer functions. Finally, Intel Corporation's Ted Hoff and Federico Faggin were credited with the design of the first microprocessor.

The work on this project began with an order to Intel from Busicom, a Japanese calculator company, for a set of chips. Hoff felt that the design could integrate a number of functions on a single chip and still provide the required functionality. This led to the Intel 4004, the world's first microprocessor. The next in line was the 8-bit 8008 microprocessor, which Intel introduced in 1972.

This was the beginning of a new era in computer applications. The use of mainframes and huge computers was scaled down to a much smaller device that was affordable to many. Earlier, their use was limited to large organizations and universities. With the advent of microprocessors, the use of computers trickled down to the common man. The next processor in line was Intel's 8080, with an 8-bit data bus and a 16-bit address bus. It was amongst the most popular microprocessors of all time.

Very soon, the Motorola Corporation developed its own 6800 in competition with Intel's 8080. Faggin left Intel and formed his own firm, Zilog, which launched the Z80 microprocessor in 1976, a chip far superior to the previous two. Similarly, a breakaway group from Motorola designed the 6502, a derivative of the 6800. Such attempts continued, with various modifications to the base structure.

At first, the use of microprocessors was limited to task-specific operations in industrial projects such as the automobile sector. The concept of a 'personal computer' was still a distant dream, and microprocessors were yet to come into personal use. 16-bit microprocessors began selling commercially in the late 1970s, one of the first popular ones being the TMS9900 from Texas Instruments.

Intel then developed the 8086, which still serves as the base model for the x86 family that dominates computing today. It was largely a complete processor, integrating all the required features. Motorola's 68000 was among the first microprocessors to make heavy use of microcode in implementing its instruction set. These designs were later extended to 32-bit architectures. Many other players, like Zilog, IBM and Apple, also succeeded in getting their own products into the market. However, Intel held a commanding position in the market right through the microprocessor era.

The 1990s saw large-scale application of microprocessors in personal computers from companies such as Apple, IBM and Microsoft. The decade witnessed a revolution in the use of computers, which by then had become household items.

This growth was complemented by highly sophisticated development in the commercial use of microprocessors. In 1993, Intel brought out its 'Pentium' processor, one of the most popular processor lines in use till date. It was followed by a series of excellent processors of the Pentium family, leading into the 21st century. The latest ones in commercial use are the Pentium Dual-Core and Xeon processors. They have opened up a whole new world of diverse applications, and even supercomputing has become commonplace owing to this amazing development in microprocessors.

Certainly, these little chips will go down in history, and they will continue to reign in the future as an ingenious creation of the human mind.

Discovery of Gunpowder

Among chemical explosives, gunpowder was the only known 'recipe' for many centuries. Discovered in ancient China, gunpowder and its ingredients were briefly mentioned in the Taoist text Zhenyuan miaodao yaolue, though its properties and its usage as an explosive agent were not yet explored or experimented with. The first written procedure for the manufacture of gunpowder appears in the Chinese military guide Wujing Zongyao. Though many incendiary agents, such as Greek fire, had been used previously, gunpowder was the first true explosive, and its roots can be traced back to Chinese alchemical experiments. It is believed that Chinese alchemists of the 9th century discovered gunpowder accidentally when an experiment in the search for an elixir of life went haywire. It is also sometimes argued that gunpowder might have been discovered or invented earlier, since Chinese alchemists were already familiar with substances like saltpeter and sulfur.

Gunpowder and Wujing Zongyao

In ancient China, gunpowder was initially used as a propellant in firecrackers. The invention of firearms and the discovery of gunpowder and related explosive recipes led to a drastic change on the battlefield. Crude bombs and firearms started appearing in Asia in the 9th and 10th centuries. The first standardized and most successful procedures were laid down in the Wujing Zongyao, a Chinese military guide written by the prominent scholars Zeng Gongliang, Yang Weide and Ding Du, who collaborated in 1044 AD to pen a "collection of the most important military techniques". Chinese alchemists had, by this time, discovered highly explosive gunpowder consisting of saltpeter, sulfur, charcoal and some other ingredients. New discoveries and variants kept appearing until the modern era, when substances like nitrocellulose, nitroglycerin, smokeless powder and TNT were developed.

One of the most famous gunpowder variants mentioned in the Wujing Zongyao consists of 48.5% saltpeter, 25.5% sulfur and 21.5% other ingredients. This combination was used to manufacture incendiary bombs that were hurled by siege engines.

Another mixture contained 38.5% saltpeter, 19% sulfur, 6.4% charcoal and 35.85% other ingredients. This mixture was used as a fuel for poisonous smoke bombs. Arsenic and mercury were also often added to make the gunpowder poisonous.

Variants of Gunpowder

There are several variants and different uses of gunpowder. Its discovery drastically changed warfare: it led not only to the use of firearms on the battlefield but also to weapons such as poisonous bombs, grenades, fire arrows and even land mines.

A 14th-century Chinese text known as the Huolongjing depicts multi-stage rockets, fire arrows, different types of fireworks, and military as well as naval explosives and mines. The book also describes Chinese musketeers.

During the siege of Pyongyang in the year 1593, about 40,000 Chinese soldiers used a variety of cannons and firearms such as muskets.

The Chinese empire tried very hard to keep the recipe of gunpowder a secret. However, it soon leaked out. Kingdoms in Mongolia and India began to make use of gunpowder, especially on the battlefield, and for purposes like fireworks, sinking mine shafts, tunneling, and the construction of canals.

Dispute over Further Development of Gunpowder

Countries in Asia like China, India and Mongolia, as well as the Islamic states, came up with their own variants and innovations in gunpowder mixtures.

The variant of gunpowder used by the Europeans was black powder. The discovery of black powder is, however, disputed, as two people claim the credit. Some believe that the innovator was Roger Bacon, a Franciscan friar and alchemist. Another Franciscan friar said to have innovated black powder is Berthold der Schwarze, also known as 'Berthold the Black', who is said to have invented the first gun. However, facts about Berthold the Black are not clearly known, and the dates of his birth, his death and the time when he invented the gun or black powder are all disputed. His epitaph reads:

"Here lies Berthold the Black,
the most abominable of humans,
who by his invention has brought misery,
to the rest of humanity."

One could argue with the writer of Berthold the Black's epitaph. The Chinese discovery of gunpowder did not just change the battlefield. More innovative and creative uses of gunpowder have helped man move mountains in mining and tunneling, turn deserts into lush green fields by building canals, and make civilization more comfortable and safer than before. It is certain that the discovery of gunpowder has dictated the course of many events in history, in war and in peace.

Timeline and History of NASA

NASA was founded on July 29th, 1958 under the patronage of the US government. The statute that defined its role read, "An act to pioneer research in space exploration, scientific discovery and aeronautical fields".

The idea of setting up a scientifically and technologically advanced institute was conceived in the conditions prevailing in the Cold War period. The USA and the USSR emerged as the two superpowers in the aftermath of World War II and began a race to establish their influence over the world. The resulting battle for intellectual and political supremacy led to a series of developments, especially in military and space research. Space exploration became an important area of competition, and each nation tried to outsmart the other to gain a stronghold in the 'space war'. The US pursued a policy of extensive work in astronomy and related space sciences to accentuate its technological supremacy.

However, the ingenious creation that is NASA was not a sudden fallout of this rivalry. A sequence of events finally led to the creation of a super project called NASA. Space and aeronautics were subjects of great interest from the beginning of the 20th century. On March 3rd, 1915, the National Advisory Committee for Aeronautics (NACA) was formed in the USA. This period witnessed a series of scientific developments, such as liquid-fueled rocket launches by Dr. Goddard in the US, rocket planes in Germany, ballistic missiles in the erstwhile USSR, and so on. Finally, October 4th, 1957 marked the dawn of the 'Space Age', when the Soviet Union launched Sputnik, the first man-made satellite. It was soon followed by Sputnik 2, which carried a dog named Laika, the first animal aboard a space flight. The first successful US launch was Explorer 1, which discovered the 'Van Allen belts' around the Earth. This was followed by Vanguard 1 and Explorer 3. The US had arrived in a big way on the space research scene, with Russia challenging its dominion.

In the wake of all these developments, the National Advisory Committee for Aeronautics (NACA) was absorbed into NASA, which was formally inaugurated on October 1st, 1958 as a dedicated body for advanced research. It began full-scale operations with a staff of around 8,000 people and three advanced labs: the Langley Aeronautical Laboratory, the Ames Aeronautical Laboratory and the Lewis Flight Propulsion Laboratory. Gradually, the number of centers increased; today there are 10 different centers across the country. Several programs were undertaken during the initial years of NASA. Wernher von Braun, a German who later became a US citizen, was the father of the US space program. He contributed heavily to the new setup through breakthrough research in rocketry and aviation technology.

The year 1958 saw some of the first efforts to test human survival in space. The earliest NASA programs were devoted to launching a manned space flight as soon as possible. Trained officers from the US Army, Navy and Air Force worked in tandem with NASA's specially formed task group to test new inventions. In Project Mercury, a special team was dedicated to working out the environment aboard a spacecraft. These efforts bore fruit on May 5th, 1961, when Alan Shepard became the first American to pilot a space vehicle, 'Freedom 7'. John Glenn became the first American to successfully orbit the Earth on February 20th, 1962.

This endeavor was succeeded by Project Gemini, which developed the rendezvous, docking and endurance techniques needed for lunar flight, and by Project Apollo, which explored the Moon in detail between 1968 and 1972. NASA also conducted landmark research in the study of space adaptability. Humans learned more about dealing with weightlessness, ways to safely return to the Earth's atmosphere, stationing a spacecraft in space and other such vital techniques. Edward White is credited as the first US astronaut to perform a 'spacewalk'.

A determined president, John F. Kennedy, had instructed his nation's best minds to leave no stone unturned in their quest to reach the Moon. The Apollo series of space flights were missions to make this dream a reality. In the process, a crew of 3 astronauts died in a fire that swept through their Apollo capsule during a ground test on January 27th, 1967. On July 20th, 1969, Neil Armstrong and Edwin Aldrin landed on the Moon and were immortalized in human history as the first humans to do so. The famous words uttered by Neil Armstrong when he first stepped on the lunar surface were, "That's one small step for man, one giant leap for mankind." The rigorous work and money put in by the NASA staff were lauded throughout the world. The landing marked the beginning of a new phase of the Space Age.

Five more such missions landed on the Moon, and our knowledge of the lunar environment and of survival strategies in space grew steadily. 1972 was a year of friendship and mutual cooperation in space technology, as the leaders of the US and the USSR joined hands for collaborative space projects. The next phase of human travel began in 1981, aboard the space shuttle, formally known as the Space Transportation System (STS). Sally Ride became the first American woman in space, aboard the NASA shuttle mission STS-7, on June 18th, 1983.

NASA's journey of space exploration hasn't always been a pleasant experience. Tragedy struck on January 28th, 1986, when the 'Challenger' orbiter broke apart after a booster failure ruptured its external fuel tank, resulting in the death of the 7-member crew, and again on February 1st, 2003, when the 'Columbia' orbiter was lost on its re-entry into the Earth's atmosphere. The silver lining amidst these tragic losses is that NASA has been able to achieve many milestones in its never-ending quest for technological advancement. The various communication and weather satellites that orbit in space, the latest gadgetry in airplanes and jets, and the scramjet technology designed to fly at nearly ten times the speed of sound: every little innovation of NASA is a testimony to the brilliance and dedication of its work culture.

NASA is still holding onto its place of prominence in science and technology and is definitely a great asset for the future of human innovation.

How are Crystals Formed

Crystals are solids formed by the orderly, repeated arrangement of constituent particles such as atoms, ions or molecules. The word 'crystal' is derived from the Greek word krustallos, which originally referred to ice and later also to quartz, or rock crystal.

Types of Crystals
Crystals can be classified into different types depending on their shape and properties. Based on their shape, they are divided into seven types, namely cubic or isometric, tetragonal, orthorhombic, hexagonal, rhombohedral, monoclinic and triclinic crystals.
On the basis of their physical and chemical properties, crystals are classified into the following four types: covalent, metallic, molecular and ionic.

How are Crystals Formed
The process of crystal formation is known as crystallization. Crystallization is the formation of crystals from solutions, molten substances and even gases. The process can be divided into two main stages, namely nucleation and crystal growth.

Nucleation involves the accumulation of the solute dissolved in the solvent into clusters. However, the clusters must be stable to ensure the formation of crystals; otherwise they redissolve in the solution. The stable clusters form the nuclei. To form stable nuclei, the clusters have to attain a critical size determined by the operating conditions, like temperature and supersaturation (a supersaturated solution contains more dissolved material than could be dissolved under normal conditions). At this stage, atoms become arranged in a periodic, repeating geometric pattern, which plays a significant role in determining the structure of the crystal.
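
Classical nucleation theory, which is not named in the text but formalizes exactly this picture, makes the 'critical size' precise. In the sketch below, the symbols are assumptions introduced here for illustration: γ is the surface energy of the cluster, v the volume per molecule, k_B Boltzmann's constant, T the temperature and S the supersaturation ratio.

```latex
% Free energy of a spherical cluster of radius r:
% surface cost minus the bulk gain driven by supersaturation S
\Delta G(r) = 4\pi r^{2}\gamma \;-\; \frac{4}{3}\pi r^{3}\,\frac{k_{B}T\ln S}{v}
% Setting d(\Delta G)/dr = 0 gives the critical cluster radius:
r^{*} = \frac{2\gamma v}{k_{B}T\ln S}
```

Clusters smaller than r* lower their free energy by dissolving, while larger ones lower it by growing, which is why only clusters past the critical size survive as nuclei. Note also that a larger supersaturation S shrinks r*, making nucleation easier.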

The next stage of crystallization is crystal growth, in which the stable nuclei grow larger. Once a cluster has exceeded the critical size, it can no longer redissolve in the solution. Nucleation and crystal growth take place simultaneously as long as supersaturation exists. So, the most important condition for crystallization is the existence of supersaturation, as it determines the rates of both nucleation and crystal growth. When supersaturation ceases to exist, the solid-liquid system attains equilibrium and the process of crystallization comes to an end.
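
That self-terminating behavior is easy to see in a toy simulation. The sketch below is a rough illustration in C, not a physical model from the text: the linear rate law and the constants k_nuc and k_grow are invented assumptions, chosen only so that both processes run on supersaturation and stop when it is exhausted.

```c
/* Toy crystallization loop: supersaturation S = C / C_sat drives both
 * nucleation and growth; each step moves solute from the solution to the
 * crystal mass until S falls to 1 (equilibrium). Units are arbitrary. */
#include <stdio.h>

int main(void) {
    double conc  = 2.0;   /* solute concentration in solution           */
    double c_sat = 1.0;   /* saturation concentration                   */
    double grown = 0.0;   /* total mass deposited on crystals           */
    const double k_nuc = 0.02, k_grow = 0.08;  /* invented rate constants */

    for (int step = 0; step < 200 && conc / c_sat > 1.0001; step++) {
        double S  = conc / c_sat;
        /* both rates vanish as S -> 1, so the process self-terminates  */
        double dm = (k_nuc + k_grow) * (S - 1.0);
        conc  -= dm;
        grown += dm;
        if (step % 20 == 0)
            printf("step %3d  S = %.4f  crystal mass = %.4f\n",
                   step, S, grown);
    }
    printf("final supersaturation ~ %.4f\n", conc / c_sat);
    return 0;
}
```

Run as written, the loop exits not because of the step limit but because S has decayed to 1, mirroring the statement above that crystallization ends when the solid-liquid system reaches equilibrium.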

Crystals are generally formed when magma, or molten rock, cools and solidifies. Rapid cooling of the molten rock generally results in the formation of small crystals; if it cools down slowly, large crystals are formed. Some crystals, like diamonds, are formed deep in the earth from the carbon atoms present in molten rock. The high pressure and intense heat cause the carbon atoms to come together to form small diamond crystals that are held in the molten rock.

Crystals can also be formed by evaporation. When you dissolve a soluble substance, or solute, in a solvent, the crystal structure of the substance breaks down into individual atoms, ions or molecules, which dissolve in the solution. When evaporation takes place, the amount of solvent available for dissolution is reduced. This in turn causes the excess solute to gather into clusters and crystallize.

Crystallization can also be promoted by changing the temperature of the solvent. Generally, solubility is reduced by lowering the temperature of the solvent, which helps in the formation of crystals. The rate of crystallization can also be increased by changing the nature of the solvent: adding a non-solvent to the solution reduces solubility and hence ensures rapid crystallization.

These shining, glittering crystals have a wide range of applications. Crystals like diamonds, emeralds, rubies and other gemstones are known for their dazzling beauty, while others, like sugar and salt, are an indispensable part of the human diet. They are also used in crystal healing, a practice associated with astrology whose adherents believe that crystals can hold electric charge and enhance the energy fields of the body by emitting uniform vibrations. Quartz is another important crystal; it is nowadays used in computers, watches and radio equipment for its remarkably stable oscillations.

They Were Predicted to Fail - But Thank Goodness They Didn’t!

Experts in virtually every field of research and science have made predictions, both bad and good, in response to learning about some new innovation, design, or invention. Many have scoffed through the years, and many inventions never made it past the drawing-board stage, but a few items did survive beyond their initial concept - and everyone has come to depend on them as a part of daily life.

Television
In the United States there are about 220 million "boob tubes" that Americans sit around for hours a day. Televisions have been the main source of news, entertainment, alerts, information, and water-cooler topics for more than half a century. A few decades ago, designers were focused on making TVs smaller and more portable; today they are focused on making them thinner and larger. But no matter what the current design trend, televisions are firmly fixed in society the world over. Yet when pioneers of television technology first came on the scene in the early 1900s, people turned up their noses. Scientists said that although the basic idea of television was probably feasible, it was financially and commercially impossible to realize, and developers need not waste their time dreaming about it. Can you imagine what the world would be like today if those early developers had abandoned those dreams?

Alternating current
George Westinghouse bought from Nikola Tesla the original patents for the transmission of alternating current (AC), and that’s what started it all. Thomas Edison had a good time taunting Westinghouse about the foolishness of the idea, but thank goodness the taunts didn’t keep Westinghouse from perfecting it. The truth today is that distributing power with alternating current is even easier and more efficient than with the direct current championed by Edison!

Automobiles
Over a century ago, people thought the idea of a "horseless carriage" was just a luxury that only wealthy people would ever be able to indulge in. In fact, popular opinion held that although automobiles would cost less as time went by, they would never be as commonly used as the bicycle. Boy, were those soothsayers wrong. People today are rediscovering the joys and health benefits of cycling, but even still - more than 50 million new cars hit the road every year. It would be hard to take the family to DisneyWorld on a bike, wouldn’t it?

Personal computers
A few decades ago, pundits liked to scoff at designers, saying that the limits of what was possible with computers had already been reached, and there would certainly never be any way, or need, for regular people to use them at home. And then came the integrated circuit (known now as the microchip). Once that tiny gem was developed, the sky was the limit, and that limit keeps besting itself. Computers allowed fantastic advances in research, academia, astronomy, and numerous other disciplines. And once they were built in convenient desktop models, they allowed human beings all around the globe to connect in ways that were never even considered before. Talk about having the world at your front door! The real-time news and communication we enjoy today would never have been possible without the personal computer.

These inventions and many others are clear evidence that if you have what you think is a good design for a useful product, the worst thing you can do is pay attention to those who say it can’t be done. There are plenty of excellent, groundbreaking inventions that so-called "experts" were quick to discount. And if the designers had listened to them, where would society be today? Actor Peter Ustinov had it right when he said, "If the world should blow itself up, the last audible voice would be that of an expert saying it can’t be done."