Monday, May 4, 2009

History of Microprocessor

A microprocessor is a single chip integrating all the functions of a central processing unit (CPU) of a computer. It includes all the logical functions, data storage, timing functions and interaction with other peripheral devices. In some cases, the terms 'CPU' and 'microprocessor' are used interchangeably to denote the same device. Like every genuine engineering marvel, the microprocessor too has evolved through a series of improvements throughout the 20th century. A brief history of the device along with its functioning is described below.


Working of a Microprocessor

It is the central processing unit that coordinates all the functions of a computer. It generates timing signals and sends and receives data to and from every peripheral used inside or outside the computer. The commands required to do this are fed into the device in the form of current variations, which are converted into meaningful instructions through Boolean logic. Its functions fall into two categories, logical and processing: the arithmetic and logic unit (ALU) and the control unit handle them respectively. Information is communicated through groups of wires called buses: the address bus carries the 'address' of the location with which communication is desired, while the data bus carries the data being exchanged.
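
To make the fetch-decode-execute rhythm described above concrete, here is a minimal sketch in Python of a toy machine with an accumulator, a program counter and a memory shared over the buses. The three-instruction program and its opcodes are invented for illustration and do not correspond to any real chip's instruction set.

```python
# A toy sketch (not any real chip's instruction set) of the cycle described
# above: the control unit fetches instructions over the address/data buses
# and the ALU carries out the arithmetic and logic.

memory = {
    0: ("LOAD", 10), 1: ("ADD", 11), 2: ("STORE", 12), 3: ("HALT", None),
    10: 2, 11: 3, 12: 0,          # addresses 10-12 hold data
}

accumulator = 0
program_counter = 0

while True:
    # Fetch: the address bus carries the program counter's value,
    # the data bus returns the word stored at that address.
    opcode, operand = memory[program_counter]
    program_counter += 1
    # Decode and execute: the control unit routes the work.
    if opcode == "LOAD":
        accumulator = memory[operand]
    elif opcode == "ADD":              # the ALU does the arithmetic
        accumulator += memory[operand]
    elif opcode == "STORE":
        memory[operand] = accumulator
    elif opcode == "HALT":
        break

print(memory[12])  # 5 -- the machine added 2 and 3 and stored the result
```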

Types of Microprocessors

There are different ways in which microprocessors are categorized. The main types are:
  • CISC (Complex Instruction Set Computers)
  • RISC (Reduced Instruction Set Computers)
  • VLIW (Very Long Instruction Word Computers)
  • Superscalar processors
Other types of specialized processors are:
  • General Purpose Processor (GPP)
  • Special Purpose Processor (SPP)
  • Application-Specific Integrated Circuit (ASIC)
  • Digital Signal Processor (DSP)
History and Evolution of Microprocessors

The invention of the transistor in 1947 was a significant development in the world of technology. It could perform the function of the bulky vacuum tube used in the computers of the early years. Shockley, Brattain and Bardeen are credited with this invention and were awarded the Nobel Prize for it. Soon it was found that the function of this component could be performed by a group of transistors arranged on a single platform. This platform, known as the integrated circuit (IC), turned out to be a crucial achievement and brought along a revolution in the use of computers. Jack Kilby of Texas Instruments was honored with the Nobel Prize for the invention of the IC, which laid the foundation on which microprocessors were developed. At about the same time, Robert Noyce of Fairchild made a parallel development in IC technology, for which he was awarded the patent.

ICs proved beyond doubt that complex functions could be integrated on a single chip with high speed and storage capacity. Both Fairchild and Texas Instruments began manufacturing commercial ICs in 1961. Later developments led to the addition of ever more complex functions on a single chip. The stage was set for a single controlling circuit for all the computer's functions. Finally, Intel Corporation's Ted Hoff and Federico Faggin were credited with the design of the first microprocessor.

The work on this project began with an order to Intel from Busicom, a Japanese calculator company, for a set of chips. Hoff felt that the design could integrate a number of functions on a single chip, making it feasible to provide the required functionality. This led to the design of the Intel 4004, the world's first microprocessor. Next in line was the 8-bit 8008 microprocessor, developed by Intel in 1972 to perform complex functions in harmony with the 4004.

This was the beginning of a new era in computer applications. The use of mainframes and huge computers was scaled down to a much smaller device that was affordable to many. Earlier, computer use was limited to large organizations and universities; with the advent of microprocessors, the use of computers trickled down to the common man. The next processor in line was Intel's 8080, with an 8-bit data bus and a 16-bit address bus. It was among the most popular microprocessors of all time.

Very soon, the Motorola Corporation developed its own 6800 in competition with Intel's 8080. Faggin left Intel and formed his own firm, Zilog, which launched a new microprocessor, the Z80, in 1976; it was far superior to the previous two. Similarly, a group of engineers who left Motorola designed the 6502, a chip heavily influenced by the 6800. Such attempts continued, with various modifications to the base structure.

The use of microprocessors was at first limited to task-specific operations, such as controllers for company projects in the automobile sector. The concept of a 'personal computer' was still a distant dream, and microprocessors were yet to come into personal use. 16-bit microprocessors became a commercial success in the late 1970s, one of the earliest being the TMS9900 from Texas Instruments.

Intel then developed the 8086, which laid the foundation for the x86 architecture that still underlies the microprocessor family today. It was largely a complete processor, integrating all the required features. Motorola's 68000 made extensive use of microcoding in its instruction set. These designs were further developed into 32-bit architectures. Many players like Zilog, IBM and Apple also succeeded in getting their own products into the market, but Intel held a commanding position right through the microprocessor era.

The 1990s saw large-scale application of microprocessors in personal computers driven by Apple, IBM and Microsoft. The decade witnessed a revolution in the use of computers, which by then had become household machines.

This growth was complemented by highly sophisticated development in the commercial use of microprocessors. In 1993, Intel brought out its 'Pentium' processor, one of the most popular processor lines in use to date. It was followed by a series of excellent processors of the Pentium family, leading into the 21st century. The latest in commercial use are the Pentium Dual-Core and Xeon processors. They have opened up a whole new world of diverse applications, and supercomputers have become common owing to this amazing development in microprocessors.

Certainly, these little chips will go down in history, and they will continue to reign in the future as an ingenious creation of the human mind.

Discovery of Gunpowder

Among chemical explosives, gunpowder was the only known 'recipe' for many centuries. Discovered by the ancient Chinese civilization, gunpowder and its ingredients were briefly mentioned in the Taoist text Zhenyuan miaodao yaolue, though its properties and its use as an explosive agent were not yet explored or experimented with. The first written procedure for the manufacture of gunpowder appears in the Chinese military guide Wujing Zongyao. Though many incendiary agents such as Greek fire had been used previously, gunpowder was the first true explosive, and its roots can be traced back to Chinese alchemical experiments. It is believed that Chinese alchemists of the 9th century discovered gunpowder accidentally, when an experiment in the search for the elixir of life went haywire. It is also sometimes argued that gunpowder might have been discovered or invented earlier, since Chinese alchemists were already familiar with substances like saltpeter and sulfur.


Gunpowder and Wujing Zongyao

In ancient China, gunpowder was initially used as a propellant in firecrackers. The invention of firearms and the discovery of gunpowder and related explosive recipes led to a drastic change on the battlefield. Crude bombs and firearms started appearing on the Asian continent in the 9th and 10th centuries. The first standardized and most successful procedures were laid down in the Wujing Zongyao, a Chinese military guide written by the prominent scholars Zeng Gongliang, Yang Weide and Ding Du, who collaborated in 1044 AD to pen a "collection of the most important military techniques". Chinese alchemists had, by this time, discovered highly explosive gunpowder consisting of saltpeter, sulfur, charcoal and some other ingredients. Many new discoveries and variants kept appearing until the modern era, when substances like nitrocellulose, nitroglycerin, smokeless powder and TNT were developed.

One of the most famous gunpowder variants mentioned in the Wujing Zongyao consists of 48.5% saltpeter, 25.5% sulfur and 21.5% other ingredients. This combination was used to manufacture incendiary bombs that were hurled by siege engines.

Another mixture contained 38.5% saltpeter, 19% sulfur, 6.4% charcoal and 35.85% other ingredients. This mixture was used as a fuel for poisonous smoke bombs. Arsenic and mercury were also often added to make the gunpowder poisonous.

Variants of Gunpowder

There are several variants and different uses of gunpowder. Its discovery drastically changed warfare: it not only led to the use of firearms on the battlefield, but also to the development of many more weapons such as poisonous bombs, grenades, fire arrows and even land mines.

A 14th-century Chinese text known as the Huo Long Jing depicts multi-stage rockets, fire arrows, different types of fireworks, and military as well as naval explosives and mines. The book also describes Chinese musketeers.

During the siege of Pyongyang in 1593, about 40,000 Chinese soldiers used a variety of cannons and firearms, including muskets.

The Chinese empire tried very hard to keep the recipe of gunpowder a secret. However, it soon leaked out. Kingdoms in Mongolia and India began to use gunpowder, especially on the battlefield, and for purposes like fireworks, sinking mine shafts, tunneling and constructing canals.

Dispute over Further Development of Gunpowder

Countries in Asia like China, India, Mongolia and the Islamic states came up with their own variants and innovations on gunpowder mixtures.

The variant of gunpowder used by the Europeans was black powder. The discovery of black powder is, however, disputed, as the credit is claimed for two people. Some believe the innovator was Roger Bacon, a Franciscan friar and alchemist. The other Franciscan monk said to have invented black powder is Berthold der Schwarze, also known as 'Berthold the Black', who is said to have invented the first gun. However, facts about Berthold the Black are not clearly known, and the dates of his birth, death and the time when he invented the gun or the black powder are all disputed. His epitaph reads:

"Here lies Berthold the Black,
the most abominable of humans,
who by his invention has brought misery,
to the rest of humanity."

One could argue with the writer of Berthold the Black's epitaph. The contribution of Chinese civilization in the discovery of gunpowder did not just change the battlefield. More innovative and creative uses of gunpowder have helped man move mountains in mining and tunneling, turn deserts into lush green fields by building canals, and make civilization more comfortable and safer than before. It is certain that the discovery of gunpowder has dictated the course of many events in history, in war and in peace.

Timeline and History of NASA

NASA was founded on July 29th, 1958 under the patronage of the US government. The statute that defined its role read, "An act to pioneer research in space exploration, scientific discovery and aeronautical fields".


The idea of setting up a scientifically and technologically advanced institute was conceived in the conditions prevailing during the Cold War. The USA and the USSR emerged as the two superpowers in the aftermath of World War II and began a race to establish their influence over the world. The resulting battle for intellectual and political supremacy led to a series of developments, especially in military and space research. Space exploration became an important area of competition, and each nation tried to outsmart the other to gain a stronghold in the 'space war'. The US pursued a policy of extensive work in astronomy and related space sciences to accentuate its technological supremacy.

However, the ingenious creation that is NASA was not a sudden fallout of this rivalry. There was a sequence of events that finally led to the creation of a super project called NASA. Space and aeronautics were subjects of great interest at the beginning of the 20th century. On March 3rd, 1915, the National Advisory Committee for Aeronautics (NACA) was formed in the USA. The following decades witnessed a series of scientific developments, such as the launch of liquid-fueled rockets by Dr. Goddard in the US, rocket planes in Germany, ballistic missiles in the erstwhile USSR and so on. Finally, October 4th, 1957 marked the dawn of the 'Space Age', when the Soviet Union launched Sputnik, the first man-made space satellite. It was soon followed by Sputnik 2, which carried a dog named Laika, the first animal aboard a space flight. The first successful US launch was Explorer 1, which discovered the 'Van Allen belts' around the Earth. This was followed by Vanguard 1 and Explorer 3. The US had arrived big time on the space research scene, with Russia challenging its dominion.

In the wake of all these developments, NACA was absorbed into the newly formed NASA, which was formally inaugurated on October 1st, 1958 as a dedicated body for advanced research. It began full-scale operations with a staff of around 8,000 people and three advanced laboratories: the Langley Aeronautical Laboratory, the Ames Aeronautical Laboratory and the Lewis Flight Propulsion Laboratory. Gradually, the number of centers increased; today, it has 10 different centers across the country. Several programs were undertaken during NASA's initial years. Wernher von Braun, a German who later became a US citizen, was the father of the US space program; he contributed heavily to the new setup with breakthrough research in rocket propulsion and aviation technology.

The year 1958 saw some of the first efforts to test human survival in space. The earliest NASA programs were devoted to launching a manned space flight as soon as possible. Trained officers from the US Army, Navy and Air Force worked in tandem with a specially formed NASA task group to test new inventions. In Project Mercury, a special team was dedicated to working out the environment aboard a spacecraft. These efforts bore fruit on May 5th, 1961, when Alan Shepard became the first American to pilot a space vehicle, 'Freedom 7'. John Glenn then became the first American to orbit the Earth, on February 20th, 1962.

This endeavor was succeeded by Project Gemini, which tested rendezvous, docking and long-duration flight in Earth orbit, and then by Project Apollo, which explored the Moon in detail from 1968 to 1972. NASA also conducted landmark research into space adaptability. Humans learned more about dealing with weightlessness, ways to safely return to the Earth's atmosphere, stationing a spacecraft in space and other vital techniques. Edward White is credited as the first US astronaut to perform a 'spacewalk'.

A determined president, John F. Kennedy, had instructed his nation's best minds to leave no stone unturned in their quest to reach the Moon. The Apollo series of space flights were missions to make this dream a reality. In the process, a crew of three astronauts died in a fire that swept through their Apollo capsule during a ground test on January 27th, 1967. It was on July 20th, 1969 that Neil Armstrong and Edwin 'Buzz' Aldrin landed on the Moon and were immortalized in human history as the first humans to do so. The famous words uttered by Neil Armstrong as he first stepped onto the Moon's surface were, "That's one small step for man, one giant leap for mankind." The rigorous work and money put in by the NASA staff were lauded throughout the world. It marked the beginning of a new phase in human evolution, 'The Space Age'.

Five more successful Apollo landings followed, and our knowledge of the lunar environment and of survival strategies in space grew ever more impressive. 1972 was a year of friendship and mutual cooperation in space technology, as the leaders of the US and the USSR joined hands for collaborative space projects. The next phase of human travel began in 1981 aboard the STS, the Space Transportation System, better known as the space shuttle. Sally Ride became the first American woman in space on the NASA shuttle mission STS-7, on June 18th, 1983.

NASA's journey of space exploration hasn't always been a pleasant experience. Tragedy struck on January 28th, 1986, when the orbiter 'Challenger' broke apart shortly after launch, resulting in the death of its 7-member crew, and again on February 1st, 2003, when the 'Columbia' disintegrated on re-entry into the Earth's atmosphere. The silver lining amidst these tragic losses is that NASA has been able to achieve many milestones in its never-ending quest for technological advancement. The various communication and weather satellites orbiting in space, the superfast and latest gadgetry in airplanes and jets, the scramjet technology to fly nearly ten times faster than the speed of sound; every little innovation of NASA is a testimony to the brilliance and dedication of its work culture.

NASA is still holding onto its place of prominence in science and technology and is definitely a great asset for the future of human innovation.

How are Crystals Formed

Crystals are solids formed by the orderly, repeated arrangement of constituent units such as atoms, ions or molecules. The word 'crystal' is derived from the Greek krustallos, which originally referred to quartz and rock crystal.

Types of Crystals
Crystals can be classified into different types depending on their shape and properties. Based on their shape, they are divided into seven types, namely cubic or isometric, tetragonal, orthorhombic, hexagonal, rhombohedral, monoclinic and triclinic crystals.
On the basis of their physical and chemical properties, crystals are classified into the following four types: covalent, metallic, molecular and ionic.

How are Crystals Formed
The process of crystal formation is known as crystallization: the formation of crystals from solutions, molten substances and even gases. The process can be divided into two main stages, namely nucleation and crystal growth.

Nucleation involves the accumulation of solute dissolved in the solvent into clusters. The clusters must be stable for crystals to form; otherwise they redissolve in the solution. The stable clusters form the nuclei. To form stable nuclei, the clusters have to attain a critical size determined by the operating conditions, such as temperature and supersaturation (a state in which a solution contains more dissolved material than it could dissolve under normal conditions). At this stage of nucleation, atoms become arranged in a periodic, repeating geometric pattern, which plays a significant role in determining the structure of the crystals.

The next stage of crystallization is crystal growth, the growth of the stable nuclei. A cluster that reaches the critical size can no longer dissolve back into the solution. Nucleation and crystal growth take place simultaneously as long as supersaturation exists, so the most important condition for crystallization is the existence of supersaturation, as it determines the rate of both nucleation and crystal growth. When supersaturation ceases to exist, the solid-liquid system attains equilibrium and the process of crystallization comes to an end.
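
For readers who want a number to attach to the 'critical size', classical nucleation theory gives a standard textbook estimate: the critical cluster radius is r* = 2γVm / (RT ln S). This is a generic formula, not specific to any particular solute; the material constants in the sketch below are assumed, illustrative values.

```python
import math

# Classical nucleation theory: clusters below the critical radius
#   r* = 2 * gamma * Vm / (R * T * ln(S))
# redissolve, while larger clusters survive and grow into crystals.
# gamma and Vm are assumed, illustrative values for a generic solute.

gamma = 0.08      # solid-liquid interfacial energy, J/m^2 (assumed)
Vm = 3.0e-5       # molar volume of the solid, m^3/mol (assumed)
R = 8.314         # gas constant, J/(mol*K)
T = 298.0         # temperature, K

for S in (1.1, 1.5, 2.0):   # supersaturation ratio: concentration / solubility
    r_star = 2 * gamma * Vm / (R * T * math.log(S))
    print(f"S = {S}: critical radius ~ {r_star * 1e9:.1f} nm")

# The output shows why supersaturation controls the process: raising S from
# 1.1 to 2.0 shrinks the critical radius from ~20 nm to ~3 nm, so far smaller
# (and far more frequent) clusters become stable and go on to grow.
```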

Crystals are commonly formed when magma, or molten rock, cools and solidifies. Rapid cooling of the molten rock generally results in the formation of small crystals, whereas slow cooling produces large crystals. Some crystals, like diamonds, are formed deep within the earth from carbon atoms present in the molten rock. The high pressure and intense heat cause the carbon atoms to come together into small diamond crystals, which are held in the molten rock.

Crystals can also form through evaporation. When you dissolve a soluble substance, or solute, in a solvent, the crystal structure of the substance breaks down into individual atoms, ions or molecules, which dissolve in the solution. As evaporation proceeds, the amount of solvent available for dissolution is reduced. This in turn causes the excess solute to gather into clusters and crystallize.

Crystallization can also be encouraged by changing the temperature of the solvent. Generally, solubility is reduced by lowering the temperature of the solvent, which helps in the formation of crystals. The rate of crystallization can also be increased by changing the nature of the solvent, for instance by adding a non-solvent to the solution, which reduces solubility and hence ensures rapid crystallization.

These shining, glittering crystals have a wide range of applications. Crystals like diamonds, emeralds, rubies and other gemstones are known for their dazzling beauty, while others, like sugar and salt, are an indispensable part of the human diet. They are also used in the alternative practice known as crystal healing, whose practitioners believe crystals can hold electric charge and emit uniform vibrations that enhance the energy fields of the body. Quartz is another important crystal, used nowadays in computers, watches and radio equipment for the remarkably stable frequency of its oscillations.

They Were Predicted to Fail - But Thank Goodness They Didn’t!

Experts in virtually every field of research and science have made predictions, both bad and good, in response to learning about some new innovation, design, or invention. Many have scoffed through the years, and many inventions never made it past the drawing-board stage, but a few items survived beyond their initial concept, and everyone has come to depend on them as a part of daily life.

Television
In the United States there are about 220 million "boob tubes" that Americans sit around for hours a day. Televisions have been the main source of news, entertainment, alerts, information, and water-cooler topics for more than half a century. A few decades ago designers were focused on making TVs smaller and more portable; today they are focused on making them thinner and larger. But no matter what the current design trend, televisions are firmly fixed in society the world over. Yet when pioneers of television technology first came on the scene in the early 1900s, people turned up their noses. Scientists said that although the basic idea of television was probably feasible, it was financially and commercially impossible, and developers need not waste their time dreaming about it. Can you imagine what the world would be like today if those early developers had abandoned those dreams?

Alternating current
George Westinghouse bought from Nikola Tesla the original patent for the transmission of alternating current, and that's what started it all. Thomas Edison had a good time taunting Westinghouse about the foolishness of his investment, but thank goodness the taunts didn't keep Westinghouse from perfecting it. The truth today is that distributing power with alternating current is even easier and more efficient than with the direct current championed by Edison!

Automobiles
Over a century ago, people thought the idea of a "horseless carriage" was just a luxury that only wealthy people would ever be able to indulge in. In fact, popular opinion held that although automobiles would cost less as time went by, they would never be as commonly used as the bicycle. Boy, were those soothsayers wrong. People today are rediscovering the joys and health benefits of cycling, but even so, more than 50 million new cars hit the road every year. It would be hard to take the family to DisneyWorld on a bike, wouldn't it?

Personal computers
A few decades ago pundits liked to scoff at designers, saying that the limits of what computers could do had already been reached, and there would certainly never be any way or need for regular people to use them at home. And then came the integrated circuit (known now as the microchip). Once that tiny gem was developed, the sky was the limit, and that limit keeps besting itself. Computers allowed fantastic advances in research, academia, astronomy, and numerous other disciplines. And once they were built in convenient desktop models, they allowed human beings all around the globe to connect in ways that had never been considered before. Talk about having the world at your front door! The real-time news and communication we enjoy today would not have been possible without the personal computer.

These inventions and many others are clear evidence that if you have what you think is a good design for a useful product, the worst thing you can do is pay attention to those who say it can’t be done. There are plenty of excellent, groundbreaking inventions that so-called "experts" were quick to discount. And if the designers had listened to them, where would society be today? Actor Peter Ustinov had it right when he said, "If the world should blow itself up, the last audible voice would be that of an expert saying it can’t be done."

Monday, April 27, 2009

What are Computers Going to be Like in the Future

What are computers going to be like in the future? Have you thought about this? Computers of the future might supersede human intellect. Some believe that computers might even acquire the ability to replace human brains. Some researchers have proposed that computers of the future will have built-in artificial intelligence and may be able to implement robotics. Computer networking is sure to result in the death of distance, and the world will become a very small place to live.

Looking at the history and timeline of computers, we realize that computers have evolved from simple electronic calculators. Great mathematicians and logicians of earlier times brought brilliant advances to computing and gave computers the potential for a bright future.

What does the future hold for computer technology? Future generations might experience interactions with robots. Robots may replace servants and may be employed for laborious, repetitive and life-risking tasks. Computer researchers also picture the advent of fully developed artificial intelligence in the world of computing; to know all about it, go through the pros and cons of artificial intelligence. Robotics and AI may make it possible for all daily tasks at home and in the workplace to be performed without human intervention.

Nanotechnology is one of the most eagerly anticipated fields in this regard. Researchers see it as one of the most promising fields to merge with computing technology in the future.

Looking at the different possibilities of what computers will be like in the forthcoming years, we know one thing for sure: computers are bound to have a great future.

Spy Cell Phones

Imagine a cell phone secretly watching your behavior. Think of your mobile phone eyeing your actions and recording them in its memory. Sounds impossible? I am afraid it isn't. In fact, cell phones can be used to track user behavior and record the conversations and text messages exchanged through them.

A spy phone is a mobile phone or spy device that allows a user to monitor, hear or record conversations and other activities taking place over the phone. Spy phones can function in different ways. They can be used as listening devices whereby secret conversations can be tracked. They are popularly used by secret agencies to track criminal activities carried out over cellular networks. They can be used to track periodic calls and record the frequency of calls from suspicious numbers. Spy phones can also be used to monitor business and household activities while the cell phone user is away.

Previously, simple wiretapping techniques were used to spy on telephone conversations. Secret agencies and security authorities commonly used phone tapping as a tool to track suspects' behavior; it enabled security officials to listen to conversations taking place over the phones they tapped. With advancements in technology, however, spying became even easier.

Today, installable software can be used to record cell phone calls and messages. There are also cell phone spying systems wherein a spy is automatically alerted when the cell phone user dials a certain number. Certain software applications can be loaded onto cell phones and used to maintain call logs, record text messages and monitor Internet activity on the mobile phone. Some of these applications also let a user call the target cell phone number from a preset number and track the target user's activities. 'Phone Dead' is a relatively recent technology that enables a cell phone to be used as a spy phone even when switched off. In this case, the spy cell phone is configured to function in a ghost mode, whereby it silently answers the calls it receives.

With the implementation of modern spying techniques, cell phones can be converted into call and text message interceptors or GSM trackers, thus making them function as spy cell phones. Besides tracking messages and conversations, spy cell phones can also alert third-party users of the target phone’s outgoing calls. Spy phones can also be used to track the target user’s location by means of GPS technologies. They can be programmed to record audio or video for a predetermined period of time.

Spy cell phones give rise to legal as well as ethical concerns. On the positive side, they can be used by parents to monitor the behavior of their children, by business officials to track workplace activity and, most importantly, to track criminal activities. But there is another, less positive side. Spy cell phone software is easily available, and cell phones are easily converted into spy phones. This ready availability makes them subject to illegitimate use. Malicious users can gain unauthorized access to other users' mobile phones and dampen the overall security of cellular networks.

Perhaps it is ironic that the technology that helps you stay connected to the world can also provide any individual with unwarranted access to your life. It runs contrary to the very concept of connectivity.

Buying Guide: Plasma TV vs LCD TV - Reviews and Comparison

In order to compare plasma TVs with LCDs, you need to understand the differences in their technologies. It is important to analyze your usage requirements before you buy. The following television-buying guide will help you decide whether to go for a plasma TV or bring home an LCD. Here is an overview of both technologies, followed by a comparative study.

Plasma TV: It is a flat-panel display commonly used for large television screens. Plasma displays are bright, though of lower luminance than a comparable LCD screen. Their power consumption depends largely on the picture content, with brighter pictures drawing more power than darker ones.

How does a plasma TV work? A plasma television display consists of two plates of glass holding thousands of small cells containing xenon and neon gases. Long electrodes are placed between the glass plates, on both sides of the cells. The control circuits of a plasma television charge the electrodes to generate a potential difference, and the voltage difference results in the ionization of the gases, forming a plasma. The collisions of the gas ions as they move towards the electrodes result in the emission of photons. Each pixel of a plasma display is made of three subpixel cells with red, blue and green colored phosphors. The intensity of each subpixel color can be increased or decreased by varying the current flowing through it, creating different combinations of red, green and blue. This is why a plasma TV can produce most of the visible colors and deliver a rich viewing experience.
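
The additive mixing described above can be sketched in a few lines. The drive levels and the 8-bit mapping below are illustrative only; a real panel's drive electronics and phosphor response are far more involved.

```python
# Illustrative sketch of additive subpixel mixing: each pixel's red, green
# and blue phosphor cells are driven independently (0.0 = off, 1.0 = full
# current) and the eye sums their light into one perceived color.

def pixel_color(red_drive, green_drive, blue_drive):
    """Map three subpixel drive levels to a conventional 8-bit RGB triple."""
    return tuple(round(255 * max(0.0, min(1.0, d)))
                 for d in (red_drive, green_drive, blue_drive))

print(pixel_color(1.0, 1.0, 0.0))   # (255, 255, 0): red + green -> yellow
print(pixel_color(0.5, 0.5, 0.5))   # (128, 128, 128): equal levels -> gray
print(pixel_color(1.0, 0.4, 0.7))   # a pink; most visible colors are reachable
```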

LCD TV: An LCD display consists of an array of liquid crystals placed between two glass plates, with a light source at the back. An electric charge applied to the crystals results in the production of images.

An LCD panel is made up of pixels filled with liquid crystals. Each pixel consists of a layer of molecules placed between two transparent electrodes and two polarizing filters. Before an electric field is applied, the liquid crystal molecules are aligned in a particular direction. Varying the voltage applied to the liquid crystal layer in each pixel produces different levels of gray. In color LCD displays, each pixel is divided into three subpixels of red, green and blue, and the color elements are generated by subtracting colors from white light. As against the different-colored phosphors used in a plasma display, the cells in an LCD TV are colored by means of pigment, metal oxide and dye filters.

Plasma TV vs. LCD TV

Plasma TVs provide larger screen sizes than LCDs. Plasma displays can render a better contrast ratio and enhanced color accuracy, reproduce deeper blacks, support a wide range of colors and produce large pictures. LCDs produce brighter pictures, but greens can sometimes appear over-green and reds might appear warmer. While plasma outweighs LCD in contrast and color, LCDs are better off in terms of picture resolution.

In plasma TVs, light is not spread across the screen from a central source. Rather, each pixel of a plasma display produces its own light, so every pixel is readily visible with brightness consistent with the other pixels on the screen. Plasma screens also offer wider viewing angles.

At times, the pixels of LCD displays cannot respond quickly enough to changes in color in moving images, and pictures appear to smudge. Plasma displays have better motion-tracking capabilities, resulting in little or no motion lag in moving images; their fast refresh capability lets them handle rapid movement in the picture.

One of the major disadvantages of plasma TV is that it suffers from screen burn-in. With phosphor-based electronic displays, prolonged display of a static image can result in the formation of ghost-like images, caused by the loss of luminosity of the phosphor compounds. Screen burn-in results in a decline in picture quality. LCD TVs are not susceptible to burn-in; however, there are chances that individual pixels of an LCD television burn out, resulting in small black or white dots on the screen.

Moreover, LCD TVs consume less power than plasma TVs; the estimated power saving is about 25%.

The price difference between plasma TVs and LCDs cannot be ignored. LCDs are more affordable in small screen sizes, while plasma screens are more affordable at larger screen sizes, above 42".

Now that you have understood the pros and cons of both plasma and LCD technologies, you should be able to decide which one suits your needs. While plasma TV ranks higher in terms of picture quality and overall user experience, LCDs are better options for cost-cutting and power-saving. Whichever option you decide to go for, remember to approach renowned television companies and purchase only from authorized dealers. Market research will help you analyze the cost-effectiveness of each technology, weigh it against your requirements and budget, and find the best TV on the market. You need to find a trade-off.

Geothermal Energy - An Informative Introduction to Clean Energy

I'm sure you've heard of terms like alternative energy, or 'think green'. The fact is that there's never been a better time to start thinking of better ways to use and conserve our natural resources. Even if you don't care much about conserving resources, you can count on the fact that you can save money through some types of alternative energy. That will spark most people's interest.

Geothermal energy is one way you can cut down on that ever-rising power bill. If you've never heard the word 'geothermal' before, that's OK. It's a process in which contractors drill deep into the earth to use thermal energy as a power source. Geothermal energy is thermal energy stored deep inside the earth. The different types of thermal resources available pose different engineering and drilling challenges. Unlike fossil fuels, there is an almost unlimited amount of clean energy available in the earth's crust. There are several types of geothermal energy. They include:
  1. Conventional Geothermal sources
    • Binary cycle power plants - These draw up hot water from deep within the earth and then apply a second liquid with a much lower boiling point than water, which vaporizes instantly. The steam that shoots out rotates large power turbines.
    • Hot dry rock geothermal power - This type is much simpler. It uses a well drilled so deep that it hits hot bedrock. You then pour fluid down the well to produce steam, which can be converted into power.
  2. A geothermal heat pump - This type of system is used mainly to heat and cool houses and buildings. If you dig ten to fifteen feet down, you will find that the ground maintains a roughly constant sixty degrees Fahrenheit all year round. The idea is to draw that warmth (or coolness) into your home or office to maintain an ideal temperature the whole year, or to pump a fluid through a series of pipes and run it through a heat pump, reducing or eliminating the need for electric heating and cooling; a rough sketch of why the constant ground temperature matters follows this list.
  3. Pumping out hot geothermal water - This is self-explanatory. You pump hot water out of the earth's crust for use in homes or for commercial applications. This type is not available in all places on earth unless you can drill extremely deep.
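
As promised above, here is a rough sketch of why the steady ~60 F ground helps a heat pump. A heat pump's ideal (Carnot) heating efficiency depends only on the temperature lift between the source and the room; real units reach only a fraction of these bounds, and the temperatures are illustrative, but the comparison with cold winter air shows the advantage.

```python
# Idealized (Carnot) comparison: a heat pump drawing heat from ~60 F ground
# versus from 20 F winter air. Real systems achieve only a fraction of these
# upper bounds, but the smaller temperature lift is what makes ground-source
# heat pumps cheap to run. All temperatures here are illustrative.

def f_to_k(deg_f):
    """Convert degrees Fahrenheit to kelvin."""
    return (deg_f - 32) * 5 / 9 + 273.15

def carnot_heating_cop(t_indoor, t_source):
    """Ideal coefficient of performance: heat delivered per unit of work."""
    return t_indoor / (t_indoor - t_source)

indoor = f_to_k(68)   # target room temperature
for label, source_f in (("ground at 60 F", 60), ("winter air at 20 F", 20)):
    cop = carnot_heating_cop(indoor, f_to_k(source_f))
    print(f"{label}: ideal COP ~ {cop:.0f}")

# Electric resistance heating is pinned at COP = 1 by definition, so even a
# real-world ground-source COP of 3-5 cuts the heating bill dramatically.
```
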
There are geothermal power plants in just four states in the United States. These states are:
  • California - There are 33 geothermal power plants in this state, and they produce over 90% of the nation's geothermal power.
  • Nevada - There are 14 in this state.
  • Hawaii - Has only one.
  • Utah - Has only one.
Earthquakes and volcanoes oh my...

The highest output of conventional geothermal energy is found along the boundaries of tectonic plates, where volcanoes and earthquakes are most common. This is why California has 33 geothermal power plants: it sits on the largest fault line in the USA. You can have a geothermal heat pump anywhere in the world, but conventional plants along the fault lines are the most common. These draw on what are called geothermal reservoirs, natural reservoirs filled with water. When magma from the earth's mantle rises with the shifting of tectonic plates, it heats the water in the geothermal reservoirs, making them perfect for harvesting pure, clean energy. A good example of a natural geothermal feature is Old Faithful in Yellowstone National Park; however, it is protected by law so as not to disturb this natural wonder.

I know what you're thinking now: how long does the water in the reservoirs last? Well, the power plants that use this type of energy have a special process in which they inject the used steam and water back into the reservoir. This type of power plant is still very new, but in theory the water should last as long as the power plant.

All these new types of power plants are nearly 99% clean. The natural sulfur that does come up with the water is filtered out, so this process is as good for the environment as possible.

How can you get geothermal heating and air conditioning?

There are plenty of contractors that provide a full installation service. There is even a two-thousand-dollar tax credit for people and businesses that install geothermal heating and cooling. The majority of the cost will be for the drilling and the other gear you will need for the geothermal heat pump.

The earth absorbs about forty-seven percent of all the sunlight that hits the ground. That's what makes the ground below ten feet stay at a constant temperature. The holes are often drilled as a vertical series of loops about one hundred to two hundred feet down.

If you live close to water, the pipes will be placed around 50 feet down, horizontally. If you live right at a body of water, you can put your pipes in the water, anchored to the bottom. The fluid passes through a series of loops in the water, which will heat it in the winter and cool it in the summer. You can also do this with a well if you have one. And if you live in a geothermally active area, you can tap into enough hot water that you don't have to rely on a water heater, saving you even more money.

Advantages of Information Technology

Information Technology:

Information Technology, or IT, mainly deals with computer applications. The common work environment today is totally dependent on computers. This has led to the need to develop and consistently upgrade dedicated computer software, like project management software, for a number of related requirements. These include the storage and protection of content, the processing and transmission of dedicated information, and the secure retrieval of information when and as required. IT promotes computing technology, covering everything from installing applications to developing databases.

Why is Information Technology Important:

All our work-related applications are now completely automated, thanks to the IT sector. IT professionals are involved in the essential management of sensitive data, computer networking and systems engineering. The advancement of the IT sector has resulted in the automation of:
  • Administration of entire systems.
  • Production and manipulation of sensitive information.
  • Cultural development and communication.
  • Streamlining of business processes and timely upgrades.
Advantages of Information Technology:

The advantages of information technology are many. True globalization has come about only via this automated system. The creation of one interdependent system helps us share information and end linguistic barriers across the continents. The collapse of geographic boundaries has made the world a 'global village'. The technology has made communication not only cheaper but also much quicker and available round the clock. The wonders of text messages, email and auto-response, backed by computer security applications, have opened up scope for direct communication.

Computerized, Internet-based business processes have made many businesses turn to the Internet for increased productivity, greater profitability, clutter-free working conditions and a global clientèle. It is mainly due to the IT industry that people from diverse cultures are able to communicate personally and exchange valuable ideas. This has greatly reduced prejudice and increased sensitivity. Businesses are able to operate 24x7, even from remote locations.

Information technology has rippled on in the form of a communication revolution. Specialists in this field, like programmers, analysts and developers, are able to further the applications and improve business processes simultaneously. The management infrastructure thus generated defies all boundaries. Among the many advantages of the industry are technical support post-implementation, network and individual desktop management, dedicated business applications, and strategic planning for enhanced profitability and effective project management.

IT provides a number of low-cost business options to tap higher productivity, with dedicated small-business CRM systems and a special category for larger operations. Regular upgrades have enabled many businessmen to increase productivity and identify market niches that would never have been found without this connectivity. With every subsequent increase in ROI, or Return On Investment, businesses are able to remain buoyant even amidst an economic recession. Not only do people connect faster with the help of information technology, but they are also able to identify like-minded individuals and extend help, while strengthening ties.

This segment revolves around automated processes that require little or no human intervention at all. This in turn has reduced job stress levels at the workplace and eliminated the repetition of tasks, losses due to human error, risks arising from neglect of timely upgrades, and paper-intensive business processes that result in the accumulation of unnecessary bulk. The sophistication of modern workstations and general working conditions is possible only due to the development of information technology.

Existence of Life on Mars

I was flown in a spacecraft one night, and the next day I landed on an entirely unfamiliar land. Little people, as they called themselves, inhabited the region. Many of them were running around aimlessly like little children, while others were busy working on miniature machines resembling computers. I was taken by surprise when one of them transformed into an orb-like blob of light and traversed a considerable distance within microseconds! I could not believe my ears when one of those unfamiliar little women came to me and said, "Are you an earthling? I welcome you to Mars." I soon learned that the Martians had already begun to establish contact with the Earth; in fact, they were trying to communicate with the people on Earth through those miniature machines! I was further amused when a few of those little people sang a welcome song for me. They called it the 'Martian melody'. I looked around to suddenly find that my tour of Mars was just a dream! I woke up and began my daily routine. But I still wonder... will my dream come true?


Mars, as most of us know, is the fourth planet from the Sun. It derives its name from Mars, the Roman god of war. Being reddish in color, it is also known as the 'Red Planet'. The geographical features of this planet, as well as its rotational speed and seasonal cycles, are similar to those of Earth. Among all the planets of the solar system, Mars has the highest probability of harboring water. Research has, to some extent, revealed the presence of water ice at the poles and mid-latitudes of Mars.

Owing to its similarities with the geographical and geological features of the Earth, scientists have always speculated about the existence of life on Mars. One striking similarity between the two planets is that the length of a day on Mars is almost the same as that on Earth. Also, owing to the similarity between the axial tilts of Earth and Mars, both planets experience similar seasons. William Whewell of Trinity College, Cambridge proposed that Mars had seas, land and perhaps harbored life. Telescopic observations of supposed Martian channels in the 19th century strengthened the belief that life could prevail on Mars. However, the observations were later found to be mere optical illusions. In 1894, William Campbell, an American astronomer, argued that neither water nor oxygen was present in the atmosphere of Mars, and by 1909 the presence of canals on Mars had been shown to be a myth.

Later, in 1964, the spacecraft Mariner 4 performed a flyby of Mars and managed to capture images of the Martian surface. The photographs taken by Mariner 4 revealed the presence of craters on Mars as well as the absence of a global magnetic field around the planet. The craters suggested that the Martian surface lacked plate tectonics and weathering, while the absence of a global magnetic field made it apparent that Mars had no protection against harmful cosmic rays. Moreover, the atmospheric pressure on Mars, calculated from Mariner 4 data to be about 0.6 kPa, showed that liquid water could not exist on the Martian surface. However, the search for life on Mars did not end here; it instead transformed into a search for bacteria-like organisms.

Another noteworthy mission aimed at the exploration of Mars was NASA's Viking program, which consisted of two probes, Viking 1 and Viking 2, and was regarded as one of the most expensive and ambitious endeavors in the search for life on Mars. Each space probe was equipped with an orbiter, meant to photograph the Martian surface, and a lander, designed to study the surface of the planet. The mission did not detect any organic molecules.

Meteorites from Mars that have reached the Earth's surface have been interpreted by some as evidence of single-celled life on Mars. Some of them have led scientists to speculate about the presence of methane, formaldehyde and silica in trace amounts. Some of NASA's recent experiments have concluded that hematite is found on Mars. Since this form of hematite typically forms only in the presence of water, researchers have suspected the past presence of liquid water on Mars.

Despite several facts and speculations about the existence of life on Mars, there is no conclusive evidence to support the proposition that Mars harbors life. Perhaps the Martians, if at all they exist, will have to come to Earth to prove their existence. How I wish they would invite me to their planet, the tour of my dreams!

Engine Troubleshooting - Diagnosing Car Engine Problems

Many people use cars to commute daily, and it is surely very irritating if the car develops an engine problem and breaks down before reaching your destination. If the car does break down, diagnosing the engine problem can be an easy task for the owner, provided he has some basic knowledge of engine problems and car troubleshooting.

Engine Start-up Problem
  • Ensure that you have enough gas in your car.
  • When you try to start the car, check whether there is a cranking noise. If there is no such noise, then you need to check the battery and ensure that there is no wiring problem. Before removing the battery terminals, note the positive and negative terminals of the battery, then try to tighten them, or unplug, clean and reconnect them. A low battery can be judged by dim lights, a weak horn, slow or no movement of the windscreen wipers, etc., and needs to be recharged.
  • If the battery is fine, you need to check the starter as it may have to be repaired or replaced.
  • If the engine starts but then stops: in cars with carbureted engines, you need to check the choke. Start the engine, press the accelerator lightly, and rev up the engine until it warms up. If you own a fuel-injected car, there can be many culprits, such as a vacuum hose leak, the air valve or the fuel pressure regulator.
Engine Overheats Quickly

If the engine overheats after driving for some time, or even a mile, there may be smoke or steam coming out from under the hood.
  • Check and adjust the ignition timing as it can be wrongly set.
  • Check if there are any mechanical problems in the engine, such as low compression, and ensure the proper working of the thermostat that regulates the temperature.
  • The level of the engine coolant can be low, and if it is, it should be refilled to a proper level. Examine if there's a leak in the cooling system and repair it as soon as possible.
  • The cylinder head gasket may need to be replaced, as it may have blown.
  • The drive belts of the engine may be slipping or broken; tighten them or replace them if needed.
Backfiring Engines

When you start the engine and accelerate, you hear a continuous noise like firecrackers from the exhaust of your vehicle. That is the engine backfiring, which can be harmful to the engine.
  • You need to check for a broken or burnt valve. A broken camshaft can also be a problem and should be repaired to avoid serious engine issues.
  • The ignition timing may need to be adjusted.
  • The timing chain or belt of the camshaft may have slipped and needs to be replaced at the earliest.
  • The wiring of the spark plugs may not be proper. Correct the wiring and listen again for the firing noise.
No Increase in the Car's Speed or Acceleration

When you accelerate, the power that the car is supposed to deliver is not being generated. The car may even stall when the gas pedal is pressed for acceleration.
  • Replace the air filter if it is dirty, or the fuel filter if it is clogged.
  • There may be water in the gasoline. Empty the gas tank and fill it with clean gas.
  • Check if your catalytic converter is clogged. If it is, it has to be replaced.
These are some common car troubleshooting steps that can be taken when needed. Diagnosing car engine problems and treating them as early as possible can prevent severe engine failures and can also help increase the life of the engine. Periodic servicing and maintenance help the car give its best performance without breakdowns.

WiTricity - Wireless Electricity

Our forefathers marveled at the glowing light bulb invented by Thomas Edison in 1879. To us 21st-centurions, however, the light bulb is nothing out of the ordinary. When computers, cellphones, laptops, iPods, etc. were invented, our antennas tweaked: now this is what you call invention! However, as time progresses we are getting used to these devices too. In fact, charging all these appliances has become quite cumbersome.

Each appliance has its own set of chargers, and with every family member owning a cellphone, the drawers are overflowing with all sorts of wires. How many times have you wished there were some way to do away with all the wiry clutter? When you are on the way to work and your cellphone beeps in hunger for a battery charge, haven't you wished your cellphone battery could charge itself? Well, your plight has been heard by doctor 'WiTricity'.

What is WiTricity?

WiTricity is nothing but wireless electricity: the transmission of electrical energy from one object to another without the use of wires. WiTricity would ensure that cellphones, laptops, iPods and other power-hungry devices get charged on their own, eliminating the need to plug them in. Even better, with WiTricity some devices won't require batteries to operate at all.

What's the Principle behind WiTricity?

WiTricity, wireless electricity: these words are easier said than done. The concept behind this fascinating term is a little complex. However, if you want to understand it, try to picture what I describe in the next few lines. Consider two self-resonating copper coils with the same resonant frequency, each about two feet in diameter. One coil is connected to the power source (the WiTricity transmitter), while the other is connected to the device (the WiTricity receiver).

The electric power from the source causes the copper coil connected to it to oscillate at a particular frequency (in the MHz range). The space around the coil then fills with a non-radiative magnetic field. This magnetic field transfers power to the other copper coil, connected to the receiver; since this coil has the same resonant frequency, it starts oscillating in sympathy with the first coil. This is known as 'coupled resonance' and is the principle behind WiTricity.
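
The 'same resonant frequency' requirement can be made concrete with the textbook LC relation f = 1/(2π√(LC)). The MIT coils were self-resonant, with their frequency set by their own inductance and stray capacitance, so the L and C values in this sketch are purely illustrative assumptions.

```python
import math

# Both coils must share one resonant frequency, f = 1 / (2*pi*sqrt(L*C)).
# L and C below are assumed, illustrative values chosen to land near the
# 10 MHz figure quoted later in the article; they are not MIT's parameters.

def resonant_frequency(inductance_h, capacitance_f):
    """Resonant frequency (Hz) of an LC pair."""
    return 1.0 / (2 * math.pi * math.sqrt(inductance_h * capacitance_f))

L = 25e-6    # coil inductance, henries (assumed)
C = 10e-12   # effective capacitance, farads (assumed)

print(f"resonant frequency ~ {resonant_frequency(L, C) / 1e6:.1f} MHz")

# Detuning either coil even slightly moves it off resonance and the coupling
# collapses -- which is why only a matched receiver draws power.
```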

The Brain behind WiTricity?

Prof. Marin Soljacic of the Massachusetts Institute of Technology (MIT) is the one who proved that magnetically coupled resonance can be utilized to transfer energy without wires. What's even more interesting is how he came upon this idea. Soljacic, just like any of us, was fed up with his 'low battery' beeping cellphone and wondered, just like any of us, whether there was a way to get rid of this charging problem. Here, however, is where the difference between Soljacic and the rest of us comes in: he didn't just stand there wondering. Instead, he tried to figure out whether any physical phenomenon could be of help. He remembered Michael Faraday's discovery of electromagnetic induction (1831) and built on it to come up with WiTricity.

MIT's Experiment

In 2007, Marin Soljacic led a five-member team of researchers at MIT (funded by the Army Research Office, the National Science Foundation, and the Department of Energy) that experimentally demonstrated the transfer of electricity without wires. The researchers were able to light a 60 W bulb from a source placed seven feet away, with absolutely no physical contact between the bulb and the power source.

The first copper coil (24 inches in diameter) was connected to the power source, the second to the bulb, and both were made to resonate at a frequency of 10 MHz. The bulb glowed even when different objects (like a wooden panel) were placed between the two coils. The system worked at 40% efficiency, and the power that wasn't utilized remained in the vicinity of the transmitter rather than radiating into the surrounding environment.
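A quick sanity check on those figures: at 40% end-to-end efficiency, keeping a 60 W bulb lit requires the transmitter to supply 60 / 0.40 = 150 W. In Python:

    def required_source_power(load_watts, efficiency):
        # Power the transmitter must supply to deliver load_watts
        # at the given transfer efficiency.
        return load_watts / efficiency

    # Figures reported for the MIT demonstration: a 60 W bulb at ~40% efficiency.
    print(required_source_power(60.0, 0.40))  # -> 150.0 watts at the source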

Is WiTricity a New Concept?

No, the concept of wireless electricity is not new. In fact, it dates back to the 19th century, when Nikola Tesla used conduction-based systems, rather than resonant magnetic fields, to transfer power wirelessly. Later, in 2005, Dave Gerding coined the term WiTricity, which is used by the MIT researchers today.

Moreover, we are all aware of electromagnetic radiation (radio waves), which is quite well known for the wireless transfer of information. Lasers have also been used to transmit energy without wires. However, radio waves are not feasible for power transmission because such radiation spreads in all directions, so a large amount of the radiated power is wasted. And lasers, apart from requiring an uninterrupted line of sight (obstacles hinder the transmission), are also dangerous.
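The waste from ordinary radio transmission follows directly from the inverse-square law: an isotropic transmitter spreads its power over a sphere of area 4πr², so a small receiver intercepts only a tiny fraction of it. A rough illustration, with an assumed receiver size:

    import math

    def intercepted_fraction(receiver_area_m2, distance_m):
        # Fraction of an isotropic transmitter's power caught by a receiver
        # of the given area at the given distance (inverse-square law).
        return receiver_area_m2 / (4.0 * math.pi * distance_m ** 2)

    # Assumed: a 0.01 m^2 antenna two meters from the source catches only
    # about 0.02% of the radiated power; the rest is lost to the surroundings.
    print("%.5f" % intercepted_fraction(0.01, 2.0))  # ~0.00020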

What's so Unique about Soljacic's Experiment?

What Soljacic's team has done is specifically tune the transmitting unit to the receiving device. The transmission is not hindered by the presence of objects in the line of sight: if the object to be charged is in the vicinity of the WiTricity source, the energy transfer will take place.

In this coupled-resonance system, the electric energy that is not used up by the receiver is not radiated into the surrounding environment, but remains in the vicinity of the transmitter. This ensures safety as well as minimal wastage of power. One of the five researchers, Dr. Aristeidis Karalis, says that their coupled-resonance system is one million times more efficient than Nikola Tesla's.

Why was WiTricity not Developed before?

It is often said that 'necessity is the best teacher', and that applies in this case as well. Only in this century has the need for wireless electricity emerged so rapidly, spearheaded by the agony of cumbersome charging for endless devices. Earlier, people didn't need it, so they didn't think about it.

How Safe is WiTricity?

Human beings or other objects placed between the transmitter and receiver do not hinder the transmission of power. But does magnetic or resonance coupling have any harmful effects on humans? MIT's researchers are quite confident that WiTricity's coupled resonance is safe for humans. They say that magnetic fields interact very weakly with the biological tissues of the body, and so are unlikely to cause any damage to living beings.

What's the Future of WiTricity?

MIT's WiTricity is only 40 to 45% efficient, and according to Soljacic, it has to be about twice as efficient to compete with traditional chemical batteries. The team's next aim is to get a robotic vacuum or a laptop working wirelessly, charging devices placed anywhere in the room, and even powering robots on factory floors. The researchers are also working on the health issues related to the concept, and have said that within another three to five years they will come up with a WiTricity system for commercial use.

WiTricity, if successful, will definitely change the way we live. Imagine cellphones, laptops, and digital cameras getting charged on their own! Wow! Let's hope the researchers come up with the commercial system soon. Till then, we wait in anticipation!

Compressed Air Cars

Compressed air cars are cars whose engines run on compressed air instead of the regular gas used in conventional fuel cars. The idea of such cars is greatly welcomed in the 21st century, when pollution caused by petrol and diesel is an extremely worrying factor.
Compressed Air Cars

History and Development

Many people may think that the concept of compressed air cars is a new discovery, but in reality, the idea was conceived and tried way back in time.
  • In 1687, Denis Papin came up with the idea of a compressed air engine.
  • In 1838, Andraud and Tessie from Motay, France, built a compressed air car which was eventually tested on a race track in July 1840. This test was a success, but the idea was not pursued any further.
  • Mekarski air engines were used in trams and locomotives for public transport in 1872. The front of the engine carried a tank that could be refilled at every station. Many locomotives were later manufactured along the same lines of engineering.
  • In 1892, Robert Hardie introduced a new method of heating the air, which increased the engine's range and hence the distance that could be traveled at a stretch. Later, in 1898, Hoadley and Knight invented a two-stage air engine with an even greater range.
  • In 1896, Charles B. Hodges built a commercially successful air car, many units of which were sold to the mining industry for mining-related operations.
  • In 1926, Lee Barton Williams from the US invented an automobile that started on gas; once the vehicle reached a speed of 10 mph, the gas supply was cut off and compressed air ran the engine, taking the car up to 62 mph.
  • An experimental model of the compressed air car was built by Sorgato of Italy in January 1975. It could run at 30 mph for approximately 2 hours.
  • Terry Miller developed the 'Air Car One' in 1979. This project cost him $1500, and he sold the rights in 1983.
Likewise, many other engineers invented techniques for running compressed air cars. Claud Mead, Des Hill, and Carl Leissler were among those who helped improve such engines.

Engine and Technology

The engine installed in a compressed air car runs on compressed air stored in the car's tank at pressures as high as 4500 psi. The technology used by air car engines is totally different from that used in conventional fuel cars: the pressure generated by the expansion of the compressed air drives the pistons. This results in 'no pollution', as air is the only substance the engine uses to produce power, and the exhaust is simply air.
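For a rough sense of how much energy such a tank can hold, the isothermal ideal-gas model gives an upper bound of E = P·V·ln(P/P0), where P is the tank pressure, V the tank volume, and P0 the ambient pressure. The sketch below assumes a 300-liter tank (a made-up figure for illustration) at the 4500 psi mentioned above; real-gas behavior and heat losses would reduce the usable energy.

    import math

    PSI_TO_PA = 6894.76
    AMBIENT_PA = 101325.0

    def isothermal_energy_joules(tank_pressure_pa, tank_volume_m3):
        # Idealized upper bound on extractable energy for isothermal expansion
        # of an ideal gas from tank pressure down to ambient: E = P*V*ln(P/P0).
        return tank_pressure_pa * tank_volume_m3 * math.log(tank_pressure_pa / AMBIENT_PA)

    pressure = 4500 * PSI_TO_PA   # ~31 MPa, the pressure cited for air car tanks
    volume = 0.300                # m^3, i.e. 300 liters (assumed for illustration)

    energy = isothermal_energy_joules(pressure, volume)
    print("~%.0f MJ, or ~%.1f kWh, before losses" % (energy / 1e6, energy / 3.6e6))

On these idealized numbers, the tank stores on the order of 50 MJ (about 15 kWh), which hints at why range and refueling time are central questions for air cars.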

Air Storage Tank/Fueling

Engineers and designers expect the storage tank to be made of carbon fiber, both to reduce the car's weight and to prevent an explosion in case of a direct collision. Carbon-fiber tanks can contain air at pressures up to 4500 psi, something steel tanks could match only with a severe weight penalty. To fuel the tank, a compressor is plugged into the car and fills it using the surrounding air. This could be a slow process, at least until air cars are in common use, after which high-end compressors at gas stations could fuel the car in almost no time.

Emission

The air-powered car would normally emit only air, since air is all it uses. The emission, however, depends entirely on the purity of the air put into the tank: if impure air is filled in, the emission will be equally impure. The emission level would thus depend largely on the location and time at which the tank is filled.

Advantages
  • Air is everywhere and freely available. Unlike gas, there is no fear of it running out, and no need to pay for it.
  • Air is not flammable. Hence, there are no chances of an explosion or fire in any way.
  • There would be no air pollution as the emission is the air itself.
  • If you own a compressor, refueling could be easily done at home.
  • The technology used in this type of car would be simple, as there would be no spark plugs, starter, muffler, or cooling system to deal with.
  • The production and maintenance cost of the car would also be very low.
Disadvantages
  • Refueling the air tank using a low performance compressor can take up to four hours, as compared to the compressors installed at gas stations which would take a few minutes to perform the same task.
  • A study has shown that the cars running on lithium-ion batteries perform more effectively than compressed air cars.
  • As the air car's body would be made of light material, questions remain about how safe and accident-proof it can be. Some people are not sure about the safety standards of these cars, while others think that technological advancements will meet consumers' safety expectations.
Some companies have already started working on the commercialization of compressed air cars; till then, we must wait for this ecological and economical traveler to arrive and become available to all.

How to Use an iPhone

The iPhone is a high-end mobile phone by Apple Inc. It has many enhanced features, such as a large 3.5-inch multi-touch screen with 16 million colors, a light sensor that adjusts the screen's brightness according to the surrounding light, a virtual keyboard, 3G, WiFi, and GPS mapping.
How to Use an iPhone

On/Off (Sleep/Wake): Switching the iPhone on and off is similar to other multimedia phones, but the same on/off button has many other functions to offer. It is situated at the top-right edge of the phone: a black plastic button shaped like a 'dash'. Tapping it once puts the phone in standby mode, which turns off the screen and consumes less power. You will still be able to receive calls while the phone is in standby mode. Tapping it again wakes the screen so you can carry out other functions.

Pressing the button for three seconds asks for confirmation to switch off the phone by displaying a 'Slide to Power Off' option. Place your fingertip on the red right-pointing arrow and slide it to the right to turn the phone off completely; you won't be able to receive calls or use any functions of the phone. To turn the phone back on, keep the button pressed for a second, which boots the device and displays the Apple logo. During an incoming call alert, tapping the button once silences the ringing or vibration, and tapping it twice immediately sends the call to voicemail.

Navigation: There is a home button below the screen that takes you directly to the home screen, no matter where you are in the menu. Navigation on the iPhone is done mostly by fingertip. Up to nine home screens can be created for quick access, and you can customize them by arranging the icons or moving them to another home screen. To view an item's contents, tap its icon and a screen will appear displaying them. If the displayed content is larger than the screen, just slide your fingertip across the screen in the appropriate direction to view the rest. Unlike many other touchscreen phones, the screen works only with a fingertip, not with a stylus or fingernail. You can zoom into photographs, maps, mail, or the web by placing two fingertips on the point you want to zoom into and stretching them in opposite directions. This method of navigation is interesting for a new user, and very convenient once you get used to it.

Volume Control: On the left edge of the iPhone, there is an up/down volume switch that works in different ways. When on a call, it adjusts the speaker or earbud volume; when listening to music, it adjusts playback volume; and it acts as the central volume control for any other function that needs one.

Incoming and Outgoing Calls: Tapping the green 'Call' button at the bottom of the screen brings up a list of contacts that you can scroll through with your fingertip. You can also tap a number that appears in an SMS, email, contacts, or favorites to make a call. If you have many numbers in your contact list, you can always use the search feature. You can even talk to more than one person at a time, switch between calls, or create a conference call. To answer a call, press the 'answer' key at the bottom of the screen.

On-screen Keyboard: The on-screen keyboard is the only way to type on the iPhone. It appears automatically when you tap in a place where typing is possible, like notes, SMS, text boxes, email, or the address bar. While typing, the phone suggests options for the words you type. Text messages are displayed on the iPhone as an ongoing chat, also known as a thread, which helps the user recall earlier messages and continue the conversation. The built-in dictionary suggests corrections for mistyped words. If you need the caps lock feature, turn on 'Enable Caps Lock' in the 'Settings' program. After that, double-tapping the shift key turns it blue, indicating that you are in caps lock mode. To turn off caps lock, just tap the shift key again.

Camera: The iPhone has a 2-megapixel camera that can be activated by tapping the 'Camera' icon on the main screen. You can turn the phone any way you like to frame the right angle for a picture. To take a picture, press the 'Camera' icon at the bottom of the view screen. To view all your images, select the 'Photos' icon on the main page, which looks like a sunflower.

Music Player: To get music onto your iPhone, install iTunes on your computer, connect the iPhone, and sync your audio files through iTunes. Songs on the iPhone can be sorted by artist name, genre, playlist, and so on. Simply select the 'iPod' option on the screen to listen to music.

Web Access: The internet can be accessed on your iPhone by tapping the 'Safari' icon at the bottom of the home screen. There is an address bar at the top of the browser where you can type the web address. For a fuller browsing experience, turn the iPhone sideways and the screen rotates to wide-screen mode; turn it back upright for the normal view. To zoom in on any content of a web page, double-tap it; to zoom out, double-tap again. You can scroll the page by dragging your fingertip up, down, or sideways. To bookmark a website, tap the 'plus' icon and then 'Add Bookmark'. Saved bookmarks can be reached later by tapping the 'open book' icon at the bottom of the page and then tapping the saved page.

These are some of the basic functionalities of an iPhone. There are many other features and functions supported by the iPhone, which can be learned only through hands-on practice.

History of Forensic Science

The application of scientific methods to examine gathered evidence, in order to answer questions and help the legal system establish the truth, is known as forensic science. This evidence may range from fingerprints to blood samples, or from a memory card to a hard disk. The word 'forensic' is derived from the Latin word 'forensis', meaning 'before the forum'.

The first documented use of medical knowledge to solve a crime can be traced to a Chinese book written in 1248 by Song Ci, titled Xi Yuan Ji Lu or 'Collected Cases of Injustice Rectified'. It contained descriptions of how to distinguish death by drowning from death by strangulation.

The 18th and 19th centuries saw a great deal of progress in forensic science in Europe. One of the first recorded applications of forensic science to a legal case came in 1784 in England, when a torn piece of paper recovered from the bullet wound in a victim's head matched a piece of paper in John Toms' pocket, leading to his conviction.

Fortunatus Fidelis, an Italian doctor, is regarded as the first person to practice modern forensic medicine, way back in 1598. Later, in the 19th century, forensic medicine became a recognized branch of medicine. In 1806, the German chemist Valentin Ross developed a method to detect poison in the walls of a victim's stomach. In a murder trial in 1836, James Marsh was able to identify arsenic intake as the cause of death with the help of forensic science.

Edmond Locard, a renowned criminologist from France, formulated the basic principle of forensic science: "Every contact leaves a trace". In 1910, he established the world's first laboratory fully dedicated to crime analysis, in association with the Lyons Police Department. His seven-volume work, Traité de Criminalistique, proved to be of great help in many criminal investigations. In 1932, the Federal Bureau of Investigation (FBI), under director J. Edgar Hoover, established its own forensics laboratory, known as the Federal Crime Laboratory.

Over time, forensic science has been divided into several sub-disciplines, each specializing in a particular aspect. Digital forensics applies scientific methods and techniques to recover data from digital media, while forensic toxicology determines the effects of drugs and poisons on living beings. Forensic anthropology is used to recover and identify skeletal remains, whereas veterinary forensics is used to solve crimes involving animals.

Forensic DNA Analysis
The DNA of every person is unique, the only exception being identical twins. An individual's DNA profile can be generated from samples of blood, bone, or hair. Comparing DNA samples collected from the crime scene with those of a suspect, checking for the presence of a set of specific DNA markers, can serve as proof of presence at the scene. DNA profiling was first described by the English geneticist Alec Jeffreys in 1985. It was first used in England to convict Colin Pitchfork for the murder of two girls, while in the United States, the first instance came with the conviction of Tommy Lee Andrews in a case involving a series of sexual assaults.
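Conceptually, comparing two profiles comes down to checking the alleles recorded at a fixed panel of marker loci. The toy Python sketch below is purely illustrative: the allele numbers are invented, and real forensic matching uses statistical weighting far beyond a simple equality test.

    # Toy DNA-profile comparison at a panel of STR-style loci.
    # Allele values here are invented for the example.
    crime_scene = {"D8S1179": (12, 14), "D21S11": (28, 30), "TH01": (6, 9)}
    suspect     = {"D8S1179": (12, 14), "D21S11": (28, 30), "TH01": (6, 9)}

    def profiles_match(a, b):
        # True if both profiles record the same alleles at every locus.
        return a.keys() == b.keys() and all(
            sorted(a[locus]) == sorted(b[locus]) for locus in a
        )

    print(profiles_match(crime_scene, suspect))  # True -> profiles are consistent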

The last two decades have seen tremendous growth in the use of DNA evidence, both for solving criminal cases and for paternity testing. Today, a few hundred forensic laboratories contribute to solving thousands of crimes in the United States.

Today, forensic science is used in the investigation of every major crime. Its popularity in the law enforcement community will only help establish justice.

Thursday, April 23, 2009

Apple to end music restrictions

Apple Inc has agreed to start selling digital songs from its iTunes store without copy protection software.

At present, most music downloaded from Apple's iTunes store can only be played through an iTunes interface or iPod.

The agreement with Sony BMG, Universal, and Warner Music will end digital rights management (DRM) software currently attached to iTunes music.

The changes were announced at the end of the keynote address, at the Macworld conference in San Francisco.

Apple's senior vice president of worldwide product marketing, Phil Schiller, delivered the speech, traditionally given by Steve Jobs.

"Over the last six years songs have been $0.99 [79p]. Music companies want more flexibility. Starting today, 8 million songs will be DRM free and by the end of this quarter, all 10 million songs will be DRM free," he told the crowd.

Apple has also revised its pricing structure, offering a three-tier system with songs available for £0.59, £0.79 and £0.99.

At present, the firm has a one-price-fits-all strategy - currently £0.79 per track - with no subscription fee.

The new model will have a varied pricing structure, with what the company calls "better quality iTunes Plus" costing more.


The move could potentially spell the end for DRM-limited music, which was never popular with users or the record industry.

Mark Mulligan, a director with market analysts Jupiter Research, said the end of DRM in music - in its current form - was inevitable.

"The only reason it has taken so long is that the record industry has been trying to level the playing field, by giving away DRM free to everyone else, but even that hasn't dented Apple's share," said Mr Mulligan.

"Ultimately, what I think we're going to end up with [in the industry] is a new form of DRM. The more you pay, the less DRM you get bolted onto your music. Premium music will be DRM free, the cheaper it gets, the more shackles are attached," he added.

In 2007, Apple's CEO, Steve Jobs, published an open letter called 'Thoughts on Music', in which he called on the big record companies to ditch DRM.