Monday, May 4, 2009

History of Microprocessor

A microprocessor is a single chip integrating all the functions of a central processing unit (CPU) of a computer. It includes all the logical functions, data storage, timing functions and interaction with other peripheral devices. In some cases, the terms 'CPU' and 'microprocessor' are used interchangeably to denote the same device. Like every genuine engineering marvel, the microprocessor too has evolved through a series of improvements throughout the 20th century. A brief history of the device along with its functioning is described below.


Working of a Microprocessor

The microprocessor is the central processing unit that coordinates all the functions of a computer. It generates timing signals and sends and receives data to and from every peripheral used inside or outside the computer. The commands required to do this are fed into the device in the form of electrical signals, which are converted into meaningful instructions by Boolean logic circuits. Its functions fall into two categories, logical and processing, handled by the arithmetic and logic unit (ALU) and the control unit respectively. Information is communicated over bundles of wires called buses: the address bus carries the 'address' of the location with which communication is desired, while the data bus carries the data being exchanged.
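The address-bus/data-bus interplay described above can be sketched in code. This is a toy model, not any real processor's interface; the class and method names are invented purely for illustration:

```python
# Toy model of a memory read over an address bus and a data bus.
# All names here are illustrative; real bus protocols are far more involved.

class Memory:
    def __init__(self, size):
        self.cells = [0] * size

    def read(self, address):
        # The address bus carries 'address'; the data bus carries back the value.
        return self.cells[address]

    def write(self, address, value):
        self.cells[address] = value

class CPU:
    def __init__(self, memory):
        self.memory = memory

    def fetch(self, address):
        # The control unit places the address on the address bus,
        # then latches whatever appears on the data bus.
        return self.memory.read(address)

ram = Memory(256)
ram.write(0x10, 42)
cpu = CPU(ram)
print(cpu.fetch(0x10))  # → 42
```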

Types of Microprocessors

There are different ways in which microprocessors are categorized. They are:
  • CISC (Complex Instruction Set Computers)
  • RISC (Reduced Instruction Set Computers)
  • VLIW (Very Long Instruction Word Computers)
  • Superscalar processors
Other types of specialized processors are:
  • General Purpose Processor (GPP)
  • Special Purpose Processor (SPP)
  • Application-Specific Integrated Circuit (ASIC)
  • Digital Signal Processor (DSP)
History and Evolution of Microprocessors

The invention of the transistor in 1947 was a significant development in the world of technology. It could perform the function of the bulky vacuum tubes used in early computers. Shockley, Brattain and Bardeen are credited with this invention and were awarded the Nobel Prize for it. Soon it was found that the functions of many such components could be performed by a group of transistors arranged on a single platform. This platform, known as the integrated circuit (IC), turned out to be a crucial achievement and brought along a revolution in the use of computers. Jack Kilby of Texas Instruments was honored with the Nobel Prize for the invention of the IC, which laid the foundation on which microprocessors were developed. At about the same time, Robert Noyce of Fairchild made a parallel development in IC technology, for which he was awarded the patent.

ICs proved beyond doubt that complex functions could be integrated on a single chip with high speed and storage capacity. Both Fairchild and Texas Instruments began manufacturing commercial ICs in 1961. Later developments led to the integration of more complex functions on a single chip; the stage was set for a single controlling circuit for all computer functions. Finally, Intel Corporation's Ted Hoff and Federico Faggin were credited with the design of the first microprocessor.

The work on this project began with an order to Intel from Busicom, a Japanese calculator company, for a set of chips. Hoff felt the design could integrate a number of functions on a single chip, making it feasible to provide the required functionality. This led to the design of the Intel 4004, the world's first microprocessor. Next in line was the 8-bit 8008 microprocessor, developed by Intel in 1972 to perform complex functions in harmony with the 4004.

This was the beginning of a new era in computer applications. The use of mainframes and huge computers was scaled down to a much smaller device that was affordable to many. Earlier, their use was limited to large organizations and universities; with the advent of microprocessors, the use of computers trickled down to the common man. The next processor in line was Intel's 8080, with an 8-bit data bus and a 16-bit address bus. It was amongst the most popular microprocessors of all time.

Very soon, the Motorola corporation developed its own 6800 in competition with Intel's 8080. Faggin left Intel and formed his own firm, Zilog, which launched a new microprocessor, the Z80, in 1976 that was far superior to the previous two. Similarly, a breakaway group from Motorola designed the 6502, a derivative of the 6800. Such attempts continued with modifications to the base structure.

The use of microprocessors was initially limited to task-based operations in sectors such as the automobile industry. The concept of a 'personal computer' was still a distant dream, and microprocessors were yet to come into personal use. 16-bit microprocessors became a commercial success in the 1980s, the first popular one being the TMS9900 of Texas Instruments.

Intel developed the 8086, which still serves as the base model for later advancements in the microprocessor family. It was largely a complete processor, integrating all the required features. The 68000 by Motorola was one of the first microprocessors to make extensive use of microcoding in its instruction set. These designs were further developed into 32-bit architectures. Similarly, players like Zilog, IBM and Apple succeeded in getting their own products to market. However, Intel held a commanding position right through the microprocessor era.

The 1990s saw large-scale application of microprocessors in personal computers from companies such as Apple, IBM and Microsoft. The decade witnessed a revolution in the use of computers, which by then had become a household entity.

This growth was complemented by highly sophisticated development in the commercial use of microprocessors. In 1993, Intel brought out its Pentium processor, one of the most popular processor lines to date. It was followed by a series of excellent processors in the Pentium family, leading into the 21st century. Among the latest in commercial use are the Pentium Dual-Core and Xeon processors. They have opened up a whole new world of diverse applications, and supercomputers have become common owing to this amazing development in microprocessors.

Certainly, these little chips will go down in history, and they will continue to reign in the future as an ingenious creation of the human mind.

Discovery of Gunpowder

Among chemical explosives, gunpowder was the only known 'recipe' for many centuries. Discovered by the ancient Chinese civilization, gunpowder and its ingredients were briefly mentioned in the Taoist text Zhenyuan miaodao yaolue, though its properties and its use as an explosive agent were not yet explored or experimented with. The first written procedure for the manufacture of gunpowder appears in the Chinese military guide Wujing Zongyao. Though many incendiary agents, such as Greek fire, had been used previously, gunpowder was the first true explosive, and its roots can be traced back to Chinese alchemical experiments. It is believed that Chinese alchemists of the 9th century discovered gunpowder accidentally when an experiment in search of the elixir of life went haywire. It is sometimes argued that gunpowder might have been discovered or invented even earlier, since Chinese alchemists were already familiar with substances like saltpeter and sulfur.


Gunpowder and Wujing Zongyao

In ancient China, gunpowder was initially used as a propellant in firecrackers. The invention of firearms and the discovery of gunpowder and related explosive recipes led to a drastic change on the battlefield. Crude bombs and firearms started appearing on the Asian continent in the 9th and 10th centuries. The first standardized and most successful procedures were laid down in the Wujing Zongyao, a Chinese military guide written by the prominent scholars Zeng Gongliang, Yang Weide and Ding Du, who collaborated in 1044 AD to pen a "collection of the most important military techniques". Chinese alchemists had, by this time, discovered highly explosive gunpowder consisting of saltpeter, sulfur, charcoal and other ingredients. New discoveries and variants kept appearing until the modern era, when substances like nitrocellulose, nitroglycerin, smokeless powder and TNT were developed.

One of the most famous gunpowder variants mentioned in the Wujing Zongyao consists of 48.5% saltpeter, 25.5% sulfur and 21.5% other ingredients. This combination was used to manufacture incendiary bombs hurled by siege engines.

Another mixture contained 38.5% saltpeter, 19% sulfur, 6.4% charcoal and 35.85% other ingredients. This mixture was used as a fuel for poisonous smoke bombs. Arsenic and mercury were also, at times, added to make the gunpowder poisonous.

Variants of Gunpowder

There are several variants and different uses of gunpowder. Its discovery drastically changed warfare: it not only led to the use of firearms on the battlefield, but many more weapons, such as poisonous bombs, grenades, fire arrows and even land mines, were developed.

A Chinese text known as the Hu Long Jing, from the 14th century, depicts multi-stage rockets, fire arrows, different types of fireworks, and naval and military explosives and mines. The book also describes Chinese musketeers.

During the siege of Pyongyang in the year 1593, about 40,000 Chinese soldiers used a variety of cannons and firearms such as muskets.

The Chinese empire tried very hard to keep the recipe of gunpowder a secret. However, it soon leaked out. Kingdoms in Mongolia and India began to make use of gunpowder, especially on the battlefield, and for purposes like fireworks, sinking mine shafts, tunneling and the construction of canals.

Dispute over Further Development of Gunpowder

Countries in Asia like China, India and Mongolia, as well as the Islamic states, came up with their own variants, innovations and discoveries of gunpowder mixtures.

The variant of gunpowder used by the Europeans was black powder. The discovery of black powder is, however, disputed, as two people claim the credit. Some believe the innovator was Roger Bacon, a Franciscan monk and alchemist. Another Franciscan monk said to have innovated black powder is Berthold der Schwarze, also known as 'Berthold the Black', who is said to have invented the first gun. However, facts about Berthold the Black are not clearly known, and the dates of his birth, death and inventions are disputed. His epitaph reads:

"Here lies Berthold the Black,
the most abominable of humans,
who by his invention has brought misery,
to the rest of humanity."

One could argue with the writer of the epitaph of Berthold the Black. The contribution of Chinese civilization in the discovery of gunpowder did not just change the battlefield scenario. In fact, more innovative and creative uses of gunpowder have helped man move mountains during mining and tunneling, turn deserts into lush green fields by building canals and make the civilization more comfortable and safer than before. It is certain, that the discovery of gunpowder has dictated the course of many events in history, in war and in peace.

Timeline and History of NASA

NASA was founded on July 29th, 1958 under the patronage of the US government. The statute that defined its role read, "An act to pioneer research in space exploration, scientific discovery and aeronautical fields".

Timeline and History of NASA

The idea of setting up a scientific and technologically advanced institute was conceived due to the conditions prevailing in the Cold War period. The USA and USSR emerged as the two superpowers in the aftermath of World War II and began a race to establish their influence over the world. The resulting battle for intellectual and political supremacy led to a series of developments, especially in military and space research. Space exploration became an important area of competition, and each nation tried to outsmart the other to gain a stronghold in the 'space war'. The US pursued a policy of extensive work in astronomy and related space sciences to accentuate its technological supremacy.

However, the ingenious creation that is NASA was not a sudden fallout of this rivalry. A sequence of events finally led to the creation of a super project called NASA. Space and aeronautics had been subjects of great interest since the beginning of the 20th century. On March 3rd, 1915, the National Advisory Committee for Aeronautics was formed in the USA. The following decades witnessed a series of scientific developments: liquid-fuel rocket launches by Dr. Goddard in the US, rocket planes in Germany, ballistic missiles in the erstwhile USSR and so on. Finally, October 4th, 1957 marked the dawn of the 'Space Age', when the Soviet Union launched Sputnik, the first man-made satellite. It was soon followed by Sputnik 2, which carried a dog named Laika, the first animal aboard a space flight. The first successful US launch was Explorer 1, which discovered the Van Allen belts around the Earth. It was followed by Vanguard 1 and Explorer 3. The US had arrived big time on the space research scene, with Russia challenging its dominion.

In the wake of all these developments, the National Advisory Committee for Aeronautics (NACA) was absorbed into NASA, which was formally inaugurated on October 1st, 1958, as a dedicated body for advanced research. It began full-scale operations with a staff of around 8,000 people and three advanced laboratories: the Langley Aeronautical Laboratory, the Ames Aeronautical Laboratory and the Lewis Flight Propulsion Laboratory. Gradually, the number of centers increased; today there are 10 across the country. Several programs were undertaken during NASA's initial years. Wernher von Braun, a German who later became a US citizen, is regarded as the father of the US space program. He contributed heavily to the new setup through breakthrough research in rocketry and aviation technology.

The year 1958 saw some of the first efforts to test human survival in space. The earliest NASA programs were devoted to launching a manned space flight as soon as possible. Trained officers from the US Army, Navy and Air Force worked in tandem with a specially formed NASA task group to test new designs. In Project Mercury, a special team was dedicated to working out the environment aboard a spacecraft. These efforts bore fruit on May 5, 1961, when Alan Shepard became the first American in space, piloting the space vehicle 'Freedom 7'. John Glenn became the first American to successfully orbit the Earth on February 20th, 1962.

This endeavor was succeeded by Project Gemini, which developed the techniques needed for lunar flight, and by Project Apollo (1968-72), which explored the Moon in detail. NASA also conducted landmark research into space adaptability. Humans learned more about dealing with weightlessness, safely re-entering the Earth's atmosphere, stationing a spacecraft in space and other vital techniques. Edward White is credited as the first US astronaut to perform a 'spacewalk'.

A determined president, John F. Kennedy, had instructed the nation's best minds to leave no stone unturned in the quest to reach the Moon. The Apollo series of space flights were missions to make this dream a reality. In the process, a crew of three astronauts burned to death when fire swept through their Apollo capsule during a ground test on January 27th, 1967. On July 20th, 1969, Neil Armstrong and Edwin Aldrin landed on the Moon and were immortalized in human history as the first humans to do so. The famous words uttered by Neil Armstrong when he first stepped onto the lunar surface were, "That's one small step for man, one giant leap for mankind." The rigorous work and money put in by the NASA staff were lauded throughout the world and marked the beginning of a new phase in human exploration.

Further, five more such missions were sent to the Moon, and our knowledge of the lunar environment and of survival strategies in space grew ever more commendable. 1972 was a year of friendship and mutual cooperation in space technology, as the leaders of the US and the USSR joined hands for collaborative space projects. The next phase of human travel began in 1981, aboard the STS, or Space Transportation System, better known as the space shuttle. Sally Ride became the first American woman in space on the NASA shuttle mission STS-7, on June 18th, 1983.

NASA's journey of space exploration hasn't always been a pleasant experience. Tragedy struck on January 28th, 1986, when the 'Challenger' orbiter broke apart shortly after launch, resulting in the death of its seven-member crew, and again on February 1st, 2003, when the 'Columbia' mission failed on re-entry into the Earth's atmosphere. The silver lining amidst these tragic losses is that NASA has achieved many milestones in its never-ending quest for technological advancement. The communication and weather satellites that orbit in space, the super-fast gadgetry in airplanes and jets, the scramjet technology to fly nearly ten times the speed of sound: every innovation of NASA is a testimony to the brilliance and dedication of its work culture.

NASA is still holding onto its place of prominence in science and technology and is definitely a great asset for the future of human innovation.

How are Crystals Formed

Crystals are basically solids formed by the orderly, repeated arrangement of constituent particles such as atoms, ions or molecules. The word 'crystal' is derived from the Greek krustallos, which once referred to quartz and rock crystal.

Types of Crystals
Crystals can be classified into different types depending on their shape and properties. Based on their shape, they are divided into seven types, namely cubic or isometric, tetragonal, orthorhombic, hexagonal, rhombohedral, monoclinic and triclinic crystals.
On the basis of their physical and chemical properties, crystals are classified into the following four types: covalent, metallic, molecular and ionic.

How are Crystals Formed
The process of crystal formation is known as crystallization. Crystallization is the process of formation of crystals from solutions, molten substances and even gas. The process can be divided into two main stages, namely nucleation and crystal growth.

Nucleation involves the accumulation of solute dissolved in the solvent into clusters. These clusters must be stable to ensure the formation of crystals; otherwise they redissolve in the solution. The stable clusters form the nuclei. To form stable nuclei, the clusters have to attain a critical size determined by operating conditions like temperature and supersaturation (a supersaturated solution contains more dissolved material than could be dissolved under normal conditions). At this stage, atoms arrange themselves in a periodic, repeating geometric pattern, which plays a significant role in determining the structure of the crystals.

The next stage of crystallization is crystal growth, the growth of the stable nuclei. This lets the crystals exceed the critical cluster size, after which they can no longer dissolve back into the solution. Nucleation and crystal growth take place simultaneously as long as supersaturation exists. So the most important condition for crystallization is the existence of supersaturation, as it determines the rates of nucleation and crystal growth. When supersaturation ceases to exist, the solid-liquid system attains equilibrium and the process of crystallization comes to an end.
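The role of supersaturation described above can be sketched numerically. This is a minimal illustration assuming a simple ratio definition of supersaturation (S = C / C_sat); the concentrations and the S > 1 threshold are example values, not measured data:

```python
# Sketch: the supersaturation ratio S = C / C_sat decides whether
# nucleation can proceed. All numbers below are illustrative only.

def supersaturation(concentration, saturation_concentration):
    """Ratio of actual solute concentration to the equilibrium limit."""
    return concentration / saturation_concentration

def can_nucleate(concentration, saturation_concentration):
    # Stable nuclei can only form when the solution holds more solute
    # than it could at equilibrium, i.e. when S > 1.
    return supersaturation(concentration, saturation_concentration) > 1.0

# A solution holding 130 g/L where only 100 g/L dissolves at equilibrium
# is supersaturated (S = 1.3), so stable nuclei can form.
print(can_nucleate(130.0, 100.0))  # True
print(can_nucleate(80.0, 100.0))   # False: clusters redissolve
```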

Crystals are generally formed when magma or molten rock cools and solidifies. Rapid cooling of the molten rock generally results in the formation of small crystals; if it cools down slowly, large crystals are formed. Some crystals, like diamonds, are formed deep in the earth from carbon atoms present in the molten rock. The high pressure and intense heat cause the carbon atoms to come together to form small diamond crystals, which are held in the molten rock.

Crystals can also be formed by evaporation. When you dissolve a soluble substance, or solute, in a solvent, the crystal structure of the substance breaks down into individual atoms, ions or molecules, which disperse in the solution. As evaporation proceeds, the amount of solvent available for dissolution is reduced. This in turn causes the excess solute to gather into clusters and crystallize.

Crystallization can also be encouraged by changing the temperature of the solvent. Generally, solubility falls as the temperature of the solvent is lowered, which promotes the formation of crystals. The rate of crystallization can also be increased by changing the nature of the solvent, for instance by adding a non-solvent to the solution, which reduces solubility and thus ensures rapid crystallization.
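The effect of cooling on solubility described above can be sketched with a toy calculation. The linear solubility curve and its coefficients are invented for illustration; real solubility curves are measured and usually nonlinear:

```python
# Sketch: mass of crystals obtained by cooling a saturated solution.
# The linear solubility curve below is invented for illustration.

def solubility(temp_c):
    """Illustrative solubility (g per 100 g water) rising with temperature."""
    return 20.0 + 0.5 * temp_c

def crystal_yield(temp_hot, temp_cold):
    # A solution saturated at temp_hot is cooled to temp_cold; the solute
    # that can no longer stay dissolved crystallizes out.
    return max(0.0, solubility(temp_hot) - solubility(temp_cold))

# Saturated at 60 °C (50 g dissolved), cooled to 20 °C (30 g soluble):
# 20 g of solute crystallizes per 100 g of water.
print(crystal_yield(60.0, 20.0))  # 20.0
```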

These shining, glittering crystals have a wide range of applications. Crystals like diamonds, emeralds, rubies and other gemstones are known for their dazzling beauty, while others, like sugar and salt, are an indispensable part of the human diet. Some people also use crystals for 'crystal healing', a practice associated with astrology, in the belief that crystals hold electric charge and enhance the energy fields of the body through uniform vibrations. Quartz, for its part, has a very practical role: its precise, stable vibrations make it invaluable in computers, watches and radio equipment.

They Were Predicted to Fail - But Thank Goodness They Didn’t!

Experts in virtually every field of research and science have made predictions, both bad and good, in response to learning about some new innovation, design, or invention. Many have scoffed through the years, and many inventions never made it past the drawing-board stage, but there are a few items that didn't fail beyond their initial concept, and everyone has come to depend on them as a part of daily life.

Television
In the United States there are about 220 million "boob tubes" that Americans sit around for hours a day. Televisions have been the main source of news, entertainment, alerts, information, and water-cooler topics for more than half a century. A few decades ago designers were focused on making TVs smaller and more portable; today they are focused on making them thinner and larger. But no matter what the current design trend, televisions are firmly fixed in society the world over. Yet when pioneers of television technology first came on the scene in the early 1900s, people turned up their noses. Skeptics said that although the basic idea of television was probably feasible, it was financially and commercially impracticable, and developers need not waste their time dreaming about it. Can you imagine what the world would be like today if those early developers had decided to abandon those dreams?

Alternating current
George Westinghouse bought from Nikola Tesla the original patents for the transmission of alternating current, and that's what started it all. Thomas Edison had a good time taunting Westinghouse about the foolishness of the idea, but thank goodness his taunts didn't keep Westinghouse from perfecting it. The truth today is that distributing power with alternating current is even easier and more efficient than with the direct current championed by Edison!

Automobiles
Over a century ago, people thought the idea of a "horseless carriage" was just a luxury that only wealthy people would ever be able to indulge in. In fact, popular opinion held that although automobiles would cost less as time went by, they would never be as commonly used as the bicycle. Boy, were those soothsayers wrong. People today are rediscovering the joys and health benefits of cycling, but even so, more than 50 million new cars hit the road every year. It would be hard to take the family to Disney World on a bike, wouldn't it?

Personal computers
A few decades ago pundits liked to scoff at designers, saying that the limits of what computers could do had already been reached, and that there would certainly never be any way, or need, for regular people to use them at home. And then came the integrated circuit (known now as the microchip). Once that tiny gem was developed, the sky was the limit, and that limit keeps besting itself. Computers enabled fantastic advances in research, academia, astronomy, and numerous other disciplines. And once they arrived in convenient desktop models, they allowed human beings all around the globe to connect in ways that were never even considered before. Talk about having the world at your front door! The real-time news and communication we enjoy today would not have been possible without the personal computer.

These inventions and many others are clear evidence that if you have what you think is a good design for a useful product, the worst thing you can do is pay attention to those who say it can’t be done. There are plenty of excellent, groundbreaking inventions that so-called "experts" were quick to discount. And if the designers had listened to them, where would society be today? Actor Peter Ustinov had it right when he said, "If the world should blow itself up, the last audible voice would be that of an expert saying it can’t be done."

Monday, April 27, 2009

What are Computers Going to be Like in the Future

What are computers going to be like in the future? Have you thought about this? Computers of the future might supersede human intellect; some believe they might even acquire the ability to replace human brains. Some researchers have proposed that computers of the future will have built-in artificial intelligence and may be able to implement robotics. Computer networking is sure to result in the death of distance, and the world will become a very small place to live.

Looking at the history and timeline of computers, we realize that computers have evolved from simple electronic calculators. Great mathematicians and logicians of the olden times have brought brilliant advances to computing. They have given computers the potential of having a bright future.

What does the future hold for computer technology? Future generations might experience interactions with robots. Robots may replace servants and be employed for laborious, repetitive and life-risking tasks. Computer researchers also picture the advent of fully developed artificial intelligence in the world of computing, with all its pros and cons. Robotics and AI may make it possible for all daily tasks at the household and workplace to be performed without human intervention.

Nanotechnology is one of the most eagerly anticipated fields. Researchers see it as one of the most promising fields to merge with computing technology in the future.

Looking at the different possibilities of what computers will be like in the forthcoming years, one thing is for sure: computers are bound to have a great future.

Spy Cell Phones

Imagine a cell phone secretly watching your behavior. Think of your mobile phone eyeing your actions and recording them in its memory. Sounds impossible? I am afraid, it isn’t. In fact, cell phones can be used to track user behavior and record conversations and text messages that are exchanged through them.

A spy phone is a mobile phone or a spy device that allows a user to monitor and hear or record conversations and other activities taking place over the phone. Spy phones can function in different ways. They can be used as listening devices whereby secretive conversations can be tracked. They are popularly used by secret agencies to track criminal activities that are carried out over cellular networks. They can be used for tracking periodic calls and recording the frequency of calls from certain suspicious numbers. Spy phones can be used for monitoring business and household activities while the cell phone user is away.

Previously, simple wiretapping techniques were used to spy on telephone conversations. Secret agencies and security authorities commonly used phone tapping as a tool to track suspects' behavior; it enabled security officials to listen to conversations taking place over the phones they tapped. With advancements in technology, however, spying became easier.

Today, installable software can be used to record cell phone calls and messages. Also, there are cell phone spying systems, wherein a spy is automatically alerted when the cell phone user dials a certain number. There are certain software applications, which can be loaded onto the cell phones and be used for maintaining call logs, recording text messages and monitoring Internet activity over the mobile phone. Some of these applications also let a user call the target cell phone number from a preset number and track the target user’s activities. ‘Phone Dead’ is a relatively recent technology that enables a cell phone to be used as a spy phone even when switched off. In this case, the spy cell phone is configured to function in a ghost mode, whereby it silently answers the calls it receives.

With the implementation of modern spying techniques, cell phones can be converted into call and text message interceptors or GSM trackers, thus making them function as spy cell phones. Besides tracking messages and conversations, spy cell phones can also alert third-party users of the target phone’s outgoing calls. Spy phones can also be used to track the target user’s location by means of GPS technologies. They can be programmed to record audio or video for a predetermined period of time.

Spy cell phones give rise to legal as well as ethical concerns. On the positive side, they can be used by parents to monitor the behavior of their children, by business officials to track workplace activity, and, most importantly, to track criminal activity. But there is another, less positive side. Spy cell phone software is easily available, and cell phones are easily converted into spy phones. This ready availability makes them subject to illegitimate use: malicious users can gain unauthorized access to other people's mobile phones and undermine the overall security of cellular networks.

Perhaps it is ironic that the technology that helps you stay connected to the world can also give any individual unwarranted access to your life. It is, in a way, contrary to the very concept of connectivity.

Buying Guide: Plasma TV vs LCD TV - Reviews and Comparison

In order to compare a plasma TV with an LCD, you need to understand the differences in their technologies, and it is important to analyze your usage requirements before you buy. The following television-buying guide will help you decide whether to go for a plasma TV or bring home an LCD. Here is an overview of both technologies, followed by their comparative study.

Plasma TV: A plasma TV is a flat-panel display commonly used for large television screens. Plasma displays are bright, though they have a lower luminance level than LCD screens. Their power consumption depends largely on the picture content: brighter pictures draw more power than darker ones.

How does a plasma TV work? A plasma television display consists of two plates of glass that hold thousands of small cells containing xenon and neon gases. Long electrodes are placed between the glass plates on both sides of the cells. The control circuits of a plasma television charge the electrodes to generate a potential difference. The voltage difference results in the ionization of the gases, forming a plasma. The collision of the gas ions as they move towards the electrodes results in the emission of photons. Each pixel of a plasma display is made of three subpixel cells of red, blue and green colored phosphors. The intensity of each subpixel color can be increased or decreased by varying the current flowing through it, creating different combinations of red, green and blue. This is why a plasma TV can produce most of the visible colors and give an enriched viewing experience.

LCD TV: An LCD display consists of an array of liquid crystals placed between two glass plates, with a light source at the back. An electric charge applied to the crystals results in the production of images.

An LCD display consists of a flat panel made up of pixels filled with liquid crystals. Each pixel consists of a layer of molecules placed between two transparent electrodes and two polarizing filters. Before an electric field is applied, the liquid crystal molecules are aligned in a particular direction. Varying the voltage applied to the liquid crystal layer in each pixel produces different levels of gray. In color LCD displays, each pixel is divided into three subpixels of red, green and blue, and the color elements are generated by subtracting colors from white light. As against the different-colored phosphors used in a plasma display, the cells in an LCD TV are colored by means of pigment filters, metal oxide filters and dye filters.
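Both technologies ultimately mix the intensities of red, green and blue subpixels to produce a pixel's color. A minimal sketch of that additive mixing, assuming 0-255 drive levels per channel (the function name and ranges are illustrative, not taken from any panel specification):

```python
# Minimal sketch (illustrative, not vendor code): additive color
# mixing from subpixel intensities, as used conceptually by both
# plasma and LCD panels. Intensities are assumed to be 0-255.

def mix_subpixels(red, green, blue):
    """Clamp each subpixel intensity to 0-255 and return the
    resulting (R, G, B) triple for one pixel."""
    clamp = lambda v: max(0, min(255, v))
    return (clamp(red), clamp(green), clamp(blue))

# Full drive on all three subpixels yields white...
print(mix_subpixels(255, 255, 255))  # (255, 255, 255)
# ...zero drive on all three yields black...
print(mix_subpixels(0, 0, 0))        # (0, 0, 0)
# ...and equal mid-level intensities give a neutral gray.
print(mix_subpixels(128, 128, 128))  # (128, 128, 128)
```

Varying the three drive levels independently is what lets either display type reach most of the visible color range.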

Plasma TV vs. LCD TV

Plasma TVs are available in larger screen sizes than LCDs. Plasma displays can render a better contrast ratio, enhanced color accuracy and deeper blacks. Plasma TVs support a wide range of colors and can produce large-sized pictures. LCDs produce brighter pictures; however, greens can sometimes appear over-green and reds might appear warmer. While plasma TVs come out ahead in contrast and color, LCDs are better off in terms of brightness and picture resolution.

In plasma TVs, light is not spread across the screen from a central source. Rather, each pixel of a plasma display produces its own light and hence, is readily visible with its brightness consistent with the other pixels on the screen. Plasma screens offer wider viewing angles.

At times, the pixels of LCD displays cannot respond quickly enough to changes in color in moving images. In such cases, pictures appear to smudge. Plasma displays have better motion-tracking capabilities, resulting in little or no motion lag in moving images. Plasma TVs have a good refresh capacity, due to which they can handle rapid movement in pictures.

One of the major disadvantages of plasma TV is that it suffers from screen burn-in. In phosphor-based electronic displays, a prolonged static image can result in the formation of ghost-like afterimages, caused by the loss of luminosity of the phosphor compounds. Screen burn-in results in a decline in picture quality. LCD TVs are not susceptible to burn-in. However, there is a chance that individual pixels of an LCD television burn out, resulting in small black or white dots on the screen.

Moreover, LCD TVs consume less power than plasma TVs; the estimated power saving is about 25%.
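To see what that ~25% saving means in money terms, here is a back-of-the-envelope sketch. The wattages, viewing hours and electricity tariff below are assumed figures for illustration, not measured data:

```python
# Illustrative calculation of the ~25% LCD power saving cited above.
# All inputs are assumptions, not measurements.

PLASMA_WATTS = 300               # assumed typical plasma draw
LCD_WATTS = PLASMA_WATTS * 0.75  # 25% less, per the estimate above
HOURS_PER_DAY = 5                # assumed daily viewing time
PRICE_PER_KWH = 0.12             # assumed electricity tariff (USD)

def annual_cost(watts):
    """Yearly running cost in dollars for a set drawing `watts`."""
    kwh_per_year = watts * HOURS_PER_DAY * 365 / 1000
    return kwh_per_year * PRICE_PER_KWH

saving = annual_cost(PLASMA_WATTS) - annual_cost(LCD_WATTS)
print(f"Annual saving: ${saving:.2f}")
```

With these assumptions the LCD saves a modest amount per year; the gap widens with larger screens, longer viewing hours or higher tariffs.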

The price difference between plasma TV and LCDs cannot be ignored. LCDs are affordable in small screen sizes while plasma screens are affordable in larger screen sizes, above 42".

Now that you understand the pros and cons of both plasma and LCD technologies, you should be able to decide which one suits your needs. While plasma TVs rank higher in terms of picture quality and overall viewing experience, LCDs are the better option for cost-cutting and power-saving. Whichever option you decide on, remember to approach renowned television companies and purchase only from authorized dealers. Market research will help you analyze the cost-effectiveness of each technology, weigh them against your requirements and budget, and find the best TV in the market. You need to find a trade-off.

Geothermal Energy - An Informative Introduction to Clean Energy

I'm sure you've heard terms like alternative energy or 'think green'. The fact is that there has never been a better time to start thinking of better ways to use and conserve our natural resources. Even if you don't care much about conserving resources, you can count on the fact that some types of alternative energy can save you money. That will spark most people's interest.

Geothermal energy is one way you can cut down on that ever-rising power bill. If you've never heard the word 'geothermal' before, that's okay. It is a process in which contractors drill deep into the earth to use thermal energy as a power source. Geothermal energy is thermal energy stored deep inside the earth. The different types of thermal resources available present different engineering and drilling challenges. Unlike fossil fuels, there is an almost unlimited amount of clean energy available in the earth's crust. There are several types of geothermal energy. They include:
  1. Conventional Geothermal sources
    • Binary cycle power plants - These draw up hot water from deep within the earth and then apply a second liquid with a much lower boiling point than water, which vaporizes instantly. The vapor that shoots out rotates large power turbines.
    • Hot dry rock geothermal power - This type is much simpler. A well is drilled so deep that it hits hot bedrock. You then pour fluid down the well to produce steam, which can then be converted into power.
  2. Geothermal heat pumps - This type of system is used mainly to heat and cool houses and buildings. If you dig ten to fifteen feet down, you will find that the ground maintains a nearly constant temperature of around sixty degrees Fahrenheit all year round. The idea is to draw that warmth into your home or office to maintain an ideal temperature for the whole year, or to pump a fluid through a series of pipes and run it through a heat pump, reducing or eliminating the need for electric heating and air conditioning.
  3. Pumping out hot geothermal water - This is self-explanatory. You pump hot water out of the earth's crust for use in homes or in commercial applications. This type is not available everywhere on earth unless you can drill extremely deep.
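The appeal of the heat pump approach above is that most of the delivered heat comes from the steady-temperature ground, not from electricity. A rough sketch of that advantage, assuming a coefficient of performance (COP) of 4 and a 10,000 kWh heating demand (both figures are illustrative assumptions, not engineering data):

```python
# Illustrative sketch (assumed numbers, not engineering data):
# electric resistance heat vs. a ground-source heat pump. A heat
# pump with a COP of 4 delivers 4 units of heat per unit of
# electricity, because most of the heat is drawn from the
# steady-temperature ground loop rather than generated.

HEAT_NEEDED_KWH = 10000   # assumed annual heating demand
COP_RESISTANCE = 1.0      # electric heat: 1 kWh in, 1 kWh of heat out
COP_GROUND_SOURCE = 4.0   # assumed ground-source heat pump COP

def electricity_used(heat_kwh, cop):
    """Electricity needed to deliver `heat_kwh` at the given COP."""
    return heat_kwh / cop

resistance = electricity_used(HEAT_NEEDED_KWH, COP_RESISTANCE)
heat_pump = electricity_used(HEAT_NEEDED_KWH, COP_GROUND_SOURCE)
print(f"Resistance heat: {resistance:.0f} kWh")  # 10000 kWh
print(f"Ground-source:   {heat_pump:.0f} kWh")   # 2500 kWh
```

Under these assumptions the heat pump uses a quarter of the electricity for the same heat, which is where the power-bill savings come from.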
There are geothermal power plants in just four states in the United States. These states are:
  • California - There are 33 geothermal power plants in this state, and they produce over 90% of the nation's geothermal power.
  • Nevada - There are 14 in this state.
  • Hawaii - Has only one.
  • Utah - Has only one.
Earthquakes and volcanoes, oh my...

The highest output of conventional geothermal energy is found along the boundaries of tectonic plates, where volcanoes and earthquakes are most common. This is why California has 33 geothermal power plants: it sits on the largest fault line in the USA. You can have a geothermal heat pump anywhere in the world, but conventional plants along the fault lines are the most common. These draw on what are called geothermal reservoirs, natural reservoirs filled with water. When magma from the earth's mantle rises due to the movement of tectonic plates, it heats the water in the geothermal reservoirs, making them perfect for harvesting pure, clean energy. A good example of a geothermal feature is Old Faithful in Yellowstone National Park. However, it is protected by law so as not to disturb this natural wonder.

I know what you're thinking now: how long does the water in the reservoirs last? The power plants that use this type of energy have a special process in which they inject the used steam and water back into the reservoir. This type of power plant is still very new, but in theory the water should last as long as the power plant.

These new power plants are about 99% clean. The natural sulfur that comes up with the water is filtered out, so this process is as good for the environment as possible.

How can you get geothermal heating and air conditioning?

There are plenty of contractors that provide a full installation service. There is even a two-thousand-dollar tax credit for people and businesses that install geothermal heating and cooling. The majority of the cost will be for the drilling and the other equipment you will need for the geothermal heat pump.

The earth absorbs about forty-seven percent of all the sunlight that hits the ground. That is what makes the ground below ten feet stay at a constant temperature. The holes are often drilled as a vertical series of loops about one hundred to two hundred feet down.

If you live close to water, the pipes can be placed around 50 feet down and run horizontally. If you live right at a body of water, you can put the pipes in the water itself, anchored to the bottom. Heat is exchanged as the fluid passes through a series of loops in the water, which will heat your home in the winter and cool it in the summer. You can also do this with a well if you have one. If you live in a geologically active area, you can tap into enough hot water that you need not rely on a water heater, saving you even more money.

Advantages of Information Technology

Information Technology:

Information Technology or IT mainly deals with computer applications. The common work environment today is totally dependent on computers. This has led to the need to develop and consistently upgrade dedicated computer software like project management software, for a number of related requirements. These include storage and protection of content, processing and transmitting of dedicated information and the secured retrieval of information, when and as required. IT promotes computing technology, covering everything from installing applications to developing databases.

Why is Information Technology Important:

All our work-related applications are now completely automated, thanks to the IT sector. IT professionals are involved in the essential management of sensitive data, computer networking and systems engineering. The advancement of the IT sector has resulted in the automation of:
  • Administration of entire systems.
  • Production and manipulation of sensitive information.
  • Cultural development and communication.
  • Streamlining of business processes and timely upgrades.
Advantages of Information Technology:

The advantages of information technology are many. True globalization has come about only via this automated system. The creation of one interdependent system helps us share information and overcome linguistic barriers across the continents. The collapse of geographic boundaries has made the world a 'global village'. The technology has made communication not only cheaper, but also much quicker and available 24x7. The wonders of text messaging, email and auto-response, backed by computer security applications, have opened up scope for direct communication.

Computerized, Internet-based business processes have made many businesses turn to the Internet for increased productivity, greater profitability, clutter-free working conditions and a global clientèle. It is mainly due to the IT industry that people from diverse cultures are able to communicate personally and exchange valuable ideas. This has greatly reduced prejudice and increased sensitivity. Businesses are able to operate 24x7, even from remote locations.

Information technology has rippled on in the form of a communication revolution. Specialists in this field, such as programmers, analysts and developers, are able to extend applications and improve business processes simultaneously. The management infrastructure thus generated defies all boundaries. Among the many advantages of the industry are technical support post-implementation, network and individual desktop management, dedicated business applications, and strategic planning for enhanced profitability and effective project management.

IT provides a number of low-cost business options to tap higher productivity, with dedicated small business CRM and a special category for larger operations. Regular upgrades have enabled many businessmen to increase productivity and identify a market niche that would never have been possible without this connectivity. With every subsequent increase in ROI, or Return on Investment, businesses are able to remain buoyant even amidst an economic recession. Not only do people connect faster with the help of information technology, but they are also able to identify like-minded individuals and extend help, while strengthening ties.

This segment revolves around automated processes that require little or no human intervention. This in turn has minimized job stress levels at the workplace and eliminated the repetition of tasks, losses due to human error, risks arising from the negligence of timely upgrades, and paper-intensive business applications that result in the accumulation of unnecessary bulk. The sophistication of modern workstations and general working conditions is possible only due to the development of Information Technology.