WHAT IS ELECTRICITY
Electricity is a form of energy: we can use it to do things for us, such as run machines and computers. Electricity can also be transformed into other types of energy such as heat or light, and is used to heat our homes, light our cities and towns, and power the computers we use.
The electricity that GPU provides travels through the wires you see on tall poles and towers around your town or city. Sometimes the wires run down into a fenced-off area full of large metal boxes, electrical wires, and other equipment. These areas are called substations, and they change the voltage of the electricity before it gets to your home.
Electricity is the movement of billions of electrons. Electrons are one component of an atom. Atoms are the basic building blocks of all matter and are composed of protons and neutrons in addition to electrons. The protons and neutrons of an atom are housed in its centre, called the nucleus.
Electricity is a phenomenon that is a result of the existence of electrical charge. The theory of electricity and its inseparable effect, magnetism, is probably the most accurate and complete of all scientific theories. The understanding of electricity has led to the invention of motors, generators, telephones, radio and television, X-ray devices, computers, and nuclear energy systems. Electricity is a necessity to modern civilization.
How is Electricity Produced?
Electricity is a secondary source of energy that is created at a generating plant. At the generating station, primary sources of energy such as coal, oil, and gas are used to boil water into steam; flowing water and wind can also turn a turbine directly. The steam provides the power to turn the blades of a device known as a turbine, much as the wind turns the blades of a windmill. The mechanical power created by the steam turning the turbine turns the shaft, and the shaft then turns the generator. A generator contains a magnet surrounded by a coil of wire; as the shaft spins the magnet inside the coil, it pushes electrons through the wire. The movement of electrons is called electric current.
Amber is a yellowish, translucent mineral. As early as 600 BC the Greeks were aware of its peculiar property: when rubbed with a piece of fur, amber develops the ability to attract small pieces of material such as feathers. For centuries this strange, inexplicable property was thought to be unique to amber.
Two thousand years later, in the 16th century, William Gilbert proved that many other substances are electric (from the Greek word for amber, elektron) and that they have two electrical effects. When rubbed with fur, amber acquires resinous electricity; glass, however, when rubbed with silk, acquires vitreous electricity. Each kind of electricity repels the same kind and attracts the opposite kind. Scientists thought that the friction actually created the electricity (their word for charge). They did not realize that an equal amount of opposite electricity remained on the fur or silk.
In 1747, Benjamin Franklin in America and William Watson (1715-87) in England independently reached the same conclusion: all materials possess a single kind of electrical “fluid” that can penetrate matter freely but that can be neither created nor destroyed. The action of rubbing merely transfers the fluid from one body to another, electrifying both. Franklin and Watson originated the principle of conservation of charge: the total quantity of electricity in an insulated system is constant.
Franklin defined the fluid, which corresponded to vitreous electricity, as positive and the lack of fluid as negative. Therefore, according to Franklin, the direction of flow was from positive to negative–the opposite of what is now known to be true. A subsequent two-fluid theory was developed, according to which samples of the same type repel, whereas those of opposite types attract.
Franklin was acquainted with the Leyden jar, a glass jar coated inside and outside with tinfoil. It was the first capacitor, a device used to store charge. The Leyden jar could be discharged by touching the inner and outer foil layers simultaneously, causing an electrical shock to a person. If a metal conductor was used, a spark could be seen and heard. Franklin wondered whether lightning and thunder were also a result of electrical discharge. During a thunderstorm in 1752, Franklin flew a kite that had a metal tip. At the end of the wet, conducting hemp line on which the kite flew he attached a metal key, to which he tied a non-conducting silk string that he held in his hand. The experiment was extremely hazardous, but the results were unmistakable: when he held his knuckles near the key, he could draw sparks from it. The next two who tried this extremely dangerous experiment were killed.
The Electrical Force
It was known as early as 1600 that the attractive or repulsive force diminishes as the charges are separated. This relationship was first placed on a numerically accurate, or quantitative, foundation by Joseph Priestley, a friend of Benjamin Franklin. In 1767, Priestley indirectly deduced that when the distance between two small, charged bodies is increased by some factor, the force between the bodies is reduced by the square of the factor. For example, if the distance between charges is tripled, the force decreases to one-ninth its former value. Although rigorous, Priestley’s proof was so simple that he did not strongly advocate it. The matter was not considered settled until 18 years later, when John Robison of Scotland made more direct measurements of the electrical force involved.
The French physicist Charles A. de Coulomb, whose name is used as the unit of electrical charge, later performed a series of experiments that added important details, as well as precision, to Priestley’s proof. He also promoted the two-fluid theory of electrical charges, rejecting both the idea of the creation of electricity by friction and Franklin’s single-fluid model.
Today the electrostatic force law, also known as COULOMB’S LAW, is expressed as follows: if two small objects, a distance r apart, have charges p and q and are at rest, the magnitude of the force F on either is given by F = Kpq/r², where K is a constant. According to the International System of Units, the force is measured in newtons (1 newton = 0.225 lb), the distance in meters, and the charges in coulombs. The constant K is then 8.988 × 10⁹ newton-meters² per coulomb². Charges of opposite sign attract, whereas those of the same sign repel.
A coulomb (C) is a large amount of charge. To hold a positive coulomb (+1 C) 1 meter away from a negative coulomb (-1 C) would require a force of 9 billion newtons (about 2 billion pounds). A typical charged cloud about to give rise to a lightning bolt has a charge of about 30 coulombs.
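The force quoted above can be checked with a short calculation; the only input is the constant K given in the text.

```python
# Coulomb's law: F = K * p * q / r**2, with K from the text above.
K = 8.988e9  # newton-meters^2 per coulomb^2

def coulomb_force(p, q, r):
    """Magnitude of the electrostatic force (newtons) between point
    charges p and q (coulombs) separated by r meters."""
    return K * abs(p * q) / r**2

# Two opposite 1-coulomb charges held 1 meter apart:
f = coulomb_force(+1.0, -1.0, 1.0)
print(f)            # about 9 billion newtons
print(f * 0.225)    # roughly 2 billion pounds
```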
Because of an accident the 18th-century Italian scientist Luigi Galvani started a chain of events that culminated in the development of the concept of voltage and the invention of the battery. In 1780 one of Galvani’s assistants noticed that a dissected frog leg twitched when he touched its nerve with a scalpel. Another assistant thought that he had seen a spark from a nearby charged electric generator at the same time. Galvani reasoned that the electricity was the cause of the muscle contractions. He mistakenly thought, however, that the effect was due to the transfer of a special fluid, or “animal electricity,” rather than to conventional electricity.
Experiments such as this, in which the legs of a frog or bird were stimulated by contact with different types of metals, led Luigi Galvani in 1791 to propose his theory that animal tissues generate electricity. (The Bettmann Archive)
In experimenting with what he called atmospheric electricity, Galvani found that a frog muscle would twitch when hung by a brass hook on an iron lattice. Another Italian, Alessandro Volta, a professor at the University of Pavia, affirmed that the brass and iron, separated by the moist tissue of the frog, were generating electricity, and that the frog’s leg was simply a detector. In 1800, Volta succeeded in amplifying the effect by stacking alternating plates of copper and zinc separated by moistened pasteboard, and in so doing he invented the battery.
A battery separates electrical charge by chemical means. If the charge is removed in some way, the battery separates more charge, thus transforming chemical energy into electrical energy. A battery can affect charges, for instance, by forcing them through the filament of a light bulb. Its ability to do work by electrical means is measured by the volt, named for Volta. A volt is equal to 1 joule of work or energy (1 joule = 2.78 × 10⁻⁷ kilowatt-hours) for each coulomb of charge. The electrical ability of a battery to do work is called the electromotive force, or emf.
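The volt’s definition, 1 joule per coulomb, lends itself to a one-line calculation; the 1.5-volt cell and 2-coulomb charge below are invented values for illustration.

```python
# A volt is 1 joule of energy per coulomb of charge, so W = q * V.
def energy_joules(charge_coulombs, volts):
    """Work done (joules) moving a given charge through a potential."""
    return charge_coulombs * volts

# Moving 2 coulombs through a hypothetical 1.5-volt flashlight cell:
w = energy_joules(2.0, 1.5)
print(w)             # 3.0 joules
print(w * 2.78e-7)   # the same energy expressed in kilowatt-hours
```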
The first electric battery, known as the voltaic pile, was invented in 1800 by Alessandro Volta (1745-1827). Voltaic piles consisted of a stack of alternating discs of zinc and copper or silver separated by felt soaked in brine. They provided, for the first time, a simple source of stored electrical energy that didn’t rely on mechanical means. (The Bettmann Archive)
Another device capable of electrical work is the capacitor, a descendant of the Leyden jar, which is used to store charge. If a charge Q is placed on the metal plates, the voltage rises to an amount V. The measure of a capacitor’s ability to store charge is the capacitance C, where C = Q/V. Charge flows from a capacitor just as it flows from a battery, but with one significant difference: when the charge leaves a capacitor’s plates, no more can be obtained without recharging. This happens because the electrical force is conservative; the energy released cannot exceed the energy stored. This ability to do work is called electric potential.
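The defining relation C = Q/V can be sketched directly; the charge and voltage below are invented values for illustration.

```python
# Capacitance as defined in the text: C = Q / V.
def capacitance(q, v):
    """Capacitance (farads) given stored charge q (coulombs)
    and the resulting voltage v (volts)."""
    return q / v

# A hypothetical capacitor holding 0.001 coulomb at 100 volts:
c = capacitance(1e-3, 100.0)
print(c)   # 1e-05 farads, i.e., 10 microfarads
```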
A type of conservation of energy is also associated with emf. The electrical energy obtainable from a battery is limited by the energy stored in chemical molecular bonds. Both emf and electric potential are measured in volts, and, unfortunately, the terms voltage, potential, and emf are used rather loosely. For example, the term battery potential is often used instead of emf.
Whether as an emf or an electric potential, voltage is a measure of the ability of a system to do work on a unit amount of charge by electrical means. Voltage is a better-known quantity than electric field. For instance, voltages measured in an electrocardiogram peak at 5 millivolts; many are familiar with the 115-volt potential of a house. The potential between a cloud and the ground just before a typical lightning bolt is a minimum of 10,000 volts.
Sometimes high voltages are needed. For instance, the electron beams in television tubes require more than 30,000 volts. Electrons “falling” through such a potential reach velocities as high as one-third the speed of light and have sufficient energy to cause a spot of light on the screen. Such high potentials may be developed from lower alternating potentials by using a transformer.
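The “one-third the speed of light” figure can be checked with a rough, nonrelativistic estimate, eV = ½mv². The electron charge, electron mass, and speed of light used below are standard physical constants, not values from the text.

```python
import math

# Speed gained by an electron "falling" through 30,000 volts,
# ignoring relativistic corrections: e*V = (1/2) * m * v**2.
e = 1.60e-19   # electron charge, coulombs (standard value)
m = 9.11e-31   # electron mass, kilograms (standard value)
c = 3.0e8      # speed of light, meters per second

v = math.sqrt(2 * e * 30000 / m)
print(v / c)   # roughly one-third the speed of light
```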
By scuffing shoes on a carpet on a dry day, an electric potential of more than 20,000 volts can be developed, resulting in a spark.
An electric charge in motion is called electric current. The strength of a current is the amount of charge passing a given point (as in a wire) per second, or I = Q/t, where Q coulombs of charge pass in t seconds. The unit for measuring current is the ampere or amp, which equals 1 coulomb/sec.
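The definition I = Q/t can be sketched in a few lines; the 0.0015-second discharge time below is an invented value for illustration.

```python
# Current is charge per unit time: I = Q / t (amperes = coulombs/second).
def current_amps(charge_coulombs, seconds):
    """Current (amps) when a given charge passes in a given time."""
    return charge_coulombs / seconds

# If the 30 coulombs of a typical charged cloud (per the text) were
# delivered in a hypothetical 0.0015 seconds:
print(current_amps(30.0, 0.0015))   # 20000.0 amps, lightning-scale
```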
Because it is the source of magnetism as well, current is the link between electricity and magnetism. In 1819 the Danish physicist Hans Christian Oersted found that a compass needle was affected by a current-carrying wire. Almost immediately, Andre Ampere in France discovered the magnetic force law. Michael Faraday in England and Joseph Henry in the United States added the idea of magnetic induction, whereby a changing magnetic field produces an electric field. The stage was then set for the encompassing electromagnetic theory of James Clerk Maxwell.
The variation of actual currents is enormous. A modern electrometer can detect currents as low as 10⁻¹⁷ amp, which is a mere 63 electrons per second. The current in a nerve impulse is approximately 10⁻⁵ amp; a 100-watt light bulb carries about 1 amp; a lightning bolt peaks at about 20,000 amps; and a 1,200-megawatt nuclear power plant can deliver 10,000,000 amps at 115 V.
Most materials are insulators. In them, all electrons are bound in individual atoms and do not permit a flow of charge unless the electric field acting on the material is so high that breakdown occurs. Then, in a process called ionisation, the most loosely bound electrons are torn from the atoms, allowing current flow. This condition exists during a lightning storm. The separation of charge between the clouds and the ground creates a large electric field that ionises the air atoms, thereby forming a conducting path from cloud to ground.
Although a conductor permits the flow of charge, it is not without a cost in energy. The electrons are accelerated by the electric field. Before they move far, however, they collide with one of the atoms of the conductor, slowing them down or even reversing their direction. As a result, they lose energy to the atoms. This energy appears as heat, and the scattering is a resistance to the current.
In 1827 a German teacher named Georg Ohm demonstrated that the current in a wire increases in direct proportion to the voltage V and the cross-sectional area of the wire A, and in inverse proportion to the length L. Because the current also depends on the particular material, Ohm’s law is written in two steps: I = V/R, and R = ρL/A, where ρ is the resistivity. The quantity R is called the resistance. The resistivity depends only on the type of material. The unit of resistance is the ohm, where 1 ohm is equal to 1 volt/amp.
Certain materials, such as lead, lose their resistance almost entirely when cooled to within a few degrees of absolute zero. Such materials are called superconductors. Substances have recently been found that become superconductive at much higher temperatures.
The resistive heating caused by electron scattering is a significant effect and is used in electric stoves and heaters as well as in incandescent light bulbs. In a resistor the power P, or energy per second, is given by P = I²R.
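The two steps of Ohm’s law and the power formula can be combined in a short sketch; the resistivity of copper is a standard handbook value, not given in the text, and the wire dimensions are invented for illustration.

```python
import math

# Ohm's law in two steps (R = rho * L / A, then I = V / R) and
# resistive heating (P = I**2 * R).
RHO_COPPER = 1.7e-8  # ohm-meters (standard handbook value)

def resistance(rho, length, area):
    """Resistance (ohms) of a uniform wire."""
    return rho * length / area

def power_watts(current, r):
    """Heat dissipated (watts) by a current through a resistance."""
    return current**2 * r

# A hypothetical 10-meter copper wire, 1.0 mm in diameter:
area = math.pi * (0.5e-3) ** 2
r = resistance(RHO_COPPER, 10.0, area)
print(r)                      # about 0.2 ohm
print(power_watts(10.0, r))   # heat from a 10-amp current, in watts
```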
Speed of Electricity
As electrons bounce along through the wire, the general charge drift constitutes the current. The average, or drift, speed is defined as the speed the electrons would have if all were moving with constant velocity parallel to the field. The drift speed is actually small even in good conductors. In a 1.0-mm-diameter copper wire carrying a current of 10 amps at room temperature, the drift speed of the electrons is 0.2 mm per second. In copper, the electrons rarely drift faster than one hundred-billionth the speed of light.
On the other hand, the speed of the electric signal is the speed of light. This means that, at the speed of light, the removal of one electron from one end of a long wire would affect electrons elsewhere. For example, consider a long, motionless freight train, with the cars representing electrons in a wire. Because the couplings between cars have play in them, the caboose is affected a short while after the engine begins moving.
During this time the engine moves forward a short distance. The signal telling the caboose to start moves backward quickly, travelling the length of the train in the same time it takes the engine to go forward a meter or so. Similarly, the electron drift speed in a conductor is low, but the signal moves at the speed of light in the opposite direction.
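The drift speed itself can be estimated from the current by the standard relation v = I/(neA), where n is the number of conduction electrons per unit volume; this formula and the copper electron density below are standard physics values, not given in the text. For a 1.0-mm wire carrying 10 amps the estimate comes out on the order of a millimeter per second or less, confirming how slow the drift is.

```python
import math

# Drift speed in a wire: v = I / (n * e * A).
n = 8.5e28   # conduction electrons per cubic meter in copper (standard estimate)
e = 1.60e-19 # electron charge, coulombs (standard value)
A = math.pi * (0.5e-3) ** 2   # cross-section of a 1.0-mm-diameter wire, m^2

v = 10.0 / (n * e * A)   # 10-amp current, as in the text's example
print(v)                 # on the order of a millimeter per second
```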
Electrical Theory of Matter
The possibility that electricity does not consist of a smooth, continuous fluid probably occurred to many scientists. Even Franklin once wrote that the “fluid” consists of “particles extremely subtile.”
Nevertheless, a great deal of evidence had to be accumulated before the view was accepted that electricity comes in tiny, discrete amounts, looking not at all like a fluid when viewed microscopically. James Clerk Maxwell opposed this particle theory. Toward the end of the 1800s, however, the work of Sir Joseph John Thomson (1856-1940) and others proved the existence of the electron.
Thomson had measured the ratio of the electron’s charge to its mass. Then in 1899 he inferred a value for the electronic charge itself by observing the behavior of a cloud of tiny charged water droplets in an electric field. This observation led to Millikan’s Oil-Drop Experiment.
Robert Millikan, a physicist at the University of Chicago, with the assistance of his student Harvey Fletcher, sought to measure the charge of a single electron, an ambitious goal in 1906. A tiny droplet of oil with an excess of a few electrons was formed by forcing the liquid through a device similar to a perfume atomizer. The drop was then, in effect, suspended, with an electric field pulling it up and the force of gravity pulling it down. By determining the mass of the oil drop and the value of the electric field, the charge on the drop was calculated. The result: the electron charge is negative and has magnitude e = 1.60 × 10⁻¹⁹ coulombs. This charge is so small that a single copper penny contains more than 10²² electrons.
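The suspension condition described above can be sketched in a few lines: the upward electric force qE balances the downward pull of gravity mg, so q = mg/E. The droplet mass and field strength below are invented for illustration.

```python
# Millikan's balance condition: q * E_field = m * g, so q = m * g / E_field.
g = 9.8          # gravitational acceleration, m/s^2 (standard value)
m = 3.27e-15     # droplet mass, kilograms (hypothetical)
E_field = 1.0e5  # electric field, volts per meter (hypothetical)

q = m * g / E_field
print(q)                      # charge on the droplet, coulombs
print(round(q / 1.60e-19))    # number of excess electrons it carries
```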
Robert Millikan (1868-1953) won the 1923 Nobel Prize in physics for his work on the elementary electric charge and on the photoelectric effect. He also did much work on cosmic rays, which he named. He is seen here (right) in his basement with his assistant and his self-recording electroscope. Under Millikan’s leadership the California Institute of Technology quickly developed into one of the foremost scientific centers in the world. (The Bettmann Archive)
Millikan also found that a charge always appears to be in exact integer multiples of plus or minus e; in other words, the charge is quantized. Other elementary particles discovered later were also found to have a charge of plus or minus e. For example, the positron, discovered in 1932 by Carl David Anderson of the California Institute of Technology, is exactly the same as the electron, except that it has a charge of +e.
Bulk matter is normally neutral. The tendency is for every positive proton in an atom to be electrically balanced against a negative electron, and the sum is as close to zero as anyone has been able to measure. In 1911, Ernest Rutherford proposed the nuclear atom. He suggested that electrons orbit a positively charged nucleus less than 10⁻¹⁴ meters in diameter, just as planets orbit the Sun. Rutherford also suggested that the nucleus is composed of protons, each having a charge +e.
This view of matter, still considered correct in many ways, established the electrical force as that which holds an atom together. After Rutherford presented his atom, the Danish physicist Niels Bohr proposed that the electrons have only certain orbits about the nucleus, that other orbits are impossible.
Early in the 20th century the quantum theory was developed. According to this theory, the electron is a smeared cloud of mass and charge. In some situations the electron cloud might be so small that the particle appears to be much like the tiny, charged marble of earlier views. In other situations, such as when the electron is in an atomic orbit, the cloud is many times larger.
In 1963, Murray Gell-Mann and George Zweig of the California Institute of Technology proposed a theory according to which the electronic charge e might not be the fundamental charge after all. In their theory, heavy particles such as protons and neutrons consist of various combinations of particles called quarks. One quark is supposed to have charge (-1/3)e and another (+2/3)e. This theory has prompted a major search for quarks.
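The arithmetic behind the quark model is simple to check: the standard assignments, charge +2/3 e for the “up” quark and -1/3 e for the “down”, combine to give the observed charges of the proton (two ups and a down) and the neutron (one up and two downs).

```python
from fractions import Fraction

# Quark charges in units of e, per the standard assignments.
up = Fraction(2, 3)
down = Fraction(-1, 3)

proton = up + up + down      # two "up" quarks and one "down"
neutron = up + down + down   # one "up" quark and two "down"
print(proton)    # 1 -> charge +e, as observed for the proton
print(neutron)   # 0 -> neutral, as observed for the neutron
```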