Monday, August 3, 2009

Telecommunication World



Early telecommunications

A replica of one of Chappe's semaphore towers


In the Middle Ages, chains of beacons were commonly used on hilltops as a means of relaying a signal. Beacon chains suffered the drawback that they could only pass a single bit of information, so the meaning of the message such as "the enemy has been sighted" had to be agreed upon in advance. One notable instance of their use was during the Spanish Armada, when a beacon chain relayed a signal from Plymouth to London signalling the arrival of Spanish ships.
In 1792, Claude Chappe, a French engineer, built the first fixed visual telegraphy system (or semaphore line) between Lille and Paris. However, semaphore suffered from the need for skilled operators and expensive towers at intervals of ten to thirty kilometres (six to nineteen miles). As a result of competition from the electrical telegraph, the last commercial line was abandoned in 1880.

Telegraph and telephone
The first commercial electrical telegraph was constructed by Sir Charles Wheatstone and Sir William Fothergill Cooke and opened on 9 April 1839. Both Wheatstone and Cooke viewed their device as "an improvement to the [existing] electromagnetic telegraph", not as a new device.


Samuel Morse independently developed a version of the electrical telegraph that he unsuccessfully demonstrated on 2 September 1837. His code was an important advance over Wheatstone's signaling method. The first transatlantic telegraph cable was successfully completed on 27 July 1866, allowing transatlantic telecommunication for the first time.

The conventional telephone was invented independently by Alexander Graham Bell and Elisha Gray in 1876. Antonio Meucci invented the first device that allowed the electrical transmission of voice over a line in 1849. However, Meucci's device was of little practical value because it relied upon the electrophonic effect and thus required users to place the receiver in their mouth to "hear" what was being said. The first commercial telephone services were set up in 1878 and 1879 on both sides of the Atlantic in the cities of New Haven and London.

Radio and television
In 1832, James Lindsay gave a classroom demonstration of wireless telegraphy to his students. By 1854, he was able to demonstrate a transmission across the Firth of Tay from Dundee, Scotland to Woodhaven, a distance of two miles (3 km), using water as the transmission medium. In December 1901, Guglielmo Marconi established wireless communication between St. John's, Newfoundland (Canada) and Poldhu, Cornwall (England), earning him the 1909 Nobel Prize in physics (which he shared with Karl Braun). However, small-scale radio communication had already been demonstrated in 1893 by Nikola Tesla in a presentation to the National Electric Light Association.

On 25 March 1925, John Logie Baird was able to demonstrate the transmission of moving pictures at the London department store Selfridges. Baird's device relied upon the Nipkow disk and thus became known as the mechanical television. It formed the basis of experimental broadcasts done by the British Broadcasting Corporation beginning 30 September 1929. However, for most of the twentieth century televisions depended upon the cathode ray tube invented by Karl Braun. The first version of such a television to show promise was produced by Philo Farnsworth and demonstrated to his family on 7 September 1927.


Computer networks and the Internet


On 11 September 1940, George Stibitz was able to transmit problems using teletype to his Complex Number Calculator in New York and receive the computed results back at Dartmouth College in New Hampshire.


This configuration of a centralized computer or mainframe with remote dumb terminals remained popular throughout the 1950s. However, it was not until the 1960s that researchers started to investigate packet switching — a technology that would allow chunks of data to be sent to different computers without first passing through a centralized mainframe. A four-node network emerged on 5 December 1969; this network would become ARPANET, which by 1981 would consist of 213 nodes.
ARPANET's development centred around the Request for Comment process and on 7 April 1969, RFC 1 was published. This process is important because ARPANET would eventually merge with other networks to form the Internet and many of the protocols the Internet relies upon today were specified through the Request for Comment process. In September 1981, RFC 791 introduced the Internet Protocol v4 (IPv4) and RFC 793 introduced the Transmission Control Protocol (TCP) — thus creating the TCP/IP protocol that much of the Internet relies upon today.
However, not all important developments were made through the Request for Comment process. Two popular link protocols for local area networks (LANs) also appeared in the 1970s. A patent for the token ring protocol was filed by Olof Soderblom on 29 October 1974 and a paper on the Ethernet protocol was published by Robert Metcalfe.


Telephone

Optical fiber provides cheaper bandwidth for long distance communication
In an analogue telephone network, the caller is connected to the person he wants to talk to by switches at various telephone exchanges. The switches form an electrical connection between the two users and the setting of these switches is determined electronically when the caller dials the number. Once the connection is made, the caller's voice is transformed to an electrical signal using a small microphone in the caller's handset. This electrical signal is then sent through the network to the user at the other end where it is transformed back into sound by a small speaker in that person's handset. There is a separate electrical connection that works in reverse, allowing the users to converse.
The fixed-line telephones in most residential homes are analogue — that is, the speaker's voice directly determines the signal's voltage. Although short-distance calls may be handled from end-to-end as analogue signals, increasingly telephone service providers are transparently converting the signals to digital for transmission before converting them back to analogue for reception. The advantage of this is that digitized voice data can travel side-by-side with data from the Internet and can be perfectly reproduced in long distance communication (as opposed to analogue signals that are inevitably impacted by noise).
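A minimal sketch of the analogue-to-digital conversion described above, assuming the common telephone parameters of 8,000 samples per second and 8-bit quantization (these specific figures are standard for PCM telephony but are not stated in the text):

```python
import math

SAMPLE_RATE = 8000   # samples per second, typical for telephone-quality speech
LEVELS = 256         # 8-bit quantization: 256 discrete values

def digitize(voice_signal, duration_s):
    """Sample an analogue signal (a function of time returning -1.0..1.0)
    and quantize each sample to one of 256 discrete levels."""
    samples = []
    n = int(duration_s * SAMPLE_RATE)
    for i in range(n):
        t = i / SAMPLE_RATE
        amplitude = voice_signal(t)                        # analogue value in [-1, 1]
        level = round((amplitude + 1) / 2 * (LEVELS - 1))  # map to an integer 0..255
        samples.append(level)
    return samples

# Example: a 400 Hz tone standing in for a voice waveform.
tone = lambda t: math.sin(2 * math.pi * 400 * t)
digital = digitize(tone, 0.01)   # 80 samples for 10 ms of audio
print(digital[:8])
```

The digitized samples can then travel alongside other digital data and be reproduced exactly at the far end, which is the advantage the paragraph above describes.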
Mobile phones have had a significant impact on telephone networks. Mobile phone subscriptions now outnumber fixed-line subscriptions in many markets. Sales of mobile phones in 2005 totalled 816.6 million, with that figure being almost equally shared amongst the markets of Asia/Pacific (204 m), Western Europe (164 m), CEMEA (Central Europe, the Middle East and Africa) (153.5 m), North America (148 m) and Latin America (102 m). In terms of new subscriptions over the five years from 1999, Africa has outpaced other markets with 58.2% growth. Increasingly, these phones are being serviced by systems where the voice content is transmitted digitally, such as GSM or W-CDMA, with many markets choosing to decommission analogue systems such as AMPS.
There have also been dramatic changes in telephone communication behind the scenes. Starting with the operation of TAT-8 in 1988, the 1990s saw the widespread adoption of systems based on optic fibres. The benefit of communicating with optic fibres is that they offer a drastic increase in data capacity. TAT-8 itself was able to carry 10 times as many telephone calls as the last copper cable laid at that time, and today's optic fibre cables are able to carry 25 times as many telephone calls as TAT-8. This increase in data capacity is due to several factors. First, optic fibres are physically much smaller than competing technologies. Second, they do not suffer from crosstalk, which means several hundred of them can be easily bundled together in a single cable. Lastly, improvements in multiplexing have led to an exponential growth in the data capacity of a single fibre.

Assisting communication across many modern optic fibre networks is a protocol known as Asynchronous Transfer Mode (ATM). The ATM protocol allows for the side-by-side data transmission mentioned earlier. It is suitable for public telephone networks because it establishes a pathway for data through the network and associates a traffic contract with that pathway. The traffic contract is essentially an agreement between the client and the network about how the network is to handle the data; if the network cannot meet the conditions of the traffic contract it does not accept the connection. This is important because telephone calls can negotiate a contract so as to guarantee themselves a constant bit rate, something that will ensure a caller's voice is not delayed in parts or cut off completely. There are competitors to ATM, such as Multiprotocol Label Switching (MPLS), that perform a similar task and are expected to supplant ATM in the future.
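The traffic-contract idea can be illustrated with a small admission-control sketch. The class names and the capacity figure below are invented for illustration; this is not part of any ATM standard, only the accept-or-refuse behaviour described above:

```python
class Link:
    """A network link with a fixed capacity, in kilobits per second."""
    def __init__(self, capacity_kbps):
        self.capacity_kbps = capacity_kbps
        self.reserved_kbps = 0

    def request_connection(self, constant_bit_rate_kbps):
        """Admit the connection only if the requested constant bit rate
        can still be guaranteed; otherwise reject it outright."""
        if self.reserved_kbps + constant_bit_rate_kbps <= self.capacity_kbps:
            self.reserved_kbps += constant_bit_rate_kbps
            return True       # contract accepted: the bit rate is guaranteed
        return False          # contract cannot be met: connection refused

link = Link(capacity_kbps=1000)
print(link.request_connection(64))    # True  -- a 64 kbps voice call is admitted
print(link.request_connection(2000))  # False -- exceeds what the link can guarantee
```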


Radio and television

Digital television standards and their adoption worldwide.
In a broadcast system, a central high-powered broadcast tower transmits a high-frequency electromagnetic wave to numerous low-powered receivers. The high-frequency wave sent by the tower is modulated with a signal containing visual or audio information. The antenna of the receiver is then tuned so as to pick up the high-frequency wave and a demodulator is used to retrieve the signal containing the visual or audio information. The broadcast signal can be either analogue (the signal is varied continuously with respect to the information) or digital (the information is encoded as a set of discrete values).
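As a rough illustration of the modulation step, the sketch below amplitude-modulates a low-frequency audio tone onto a high-frequency carrier. The frequencies and modulation depth are arbitrary illustrative values, not figures from the text:

```python
import math

CARRIER_HZ = 100_000     # illustrative high-frequency carrier wave
AUDIO_HZ = 1_000         # illustrative audio tone carrying the information
SAMPLE_RATE = 1_000_000  # samples per second for the sketch

def am_modulate(t):
    """Amplitude modulation: the carrier's amplitude follows the audio signal."""
    audio = math.sin(2 * math.pi * AUDIO_HZ * t)       # information signal
    carrier = math.cos(2 * math.pi * CARRIER_HZ * t)   # high-frequency wave
    return (1 + 0.5 * audio) * carrier                 # modulated broadcast signal

samples = [am_modulate(i / SAMPLE_RATE) for i in range(100)]
print(samples[:5])
```

A receiver tuned to the carrier frequency recovers the audio by tracking the envelope of this waveform, which is the demodulation step mentioned above.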

The broadcast media industry is at a critical turning point in its development, with many countries moving from analogue to digital broadcasts. This move is made possible by the production of cheaper, faster and more capable integrated circuits. The chief advantage of digital broadcasts is that they avoid a number of the problems of traditional analogue broadcasts. For television, this includes the elimination of problems such as snowy pictures, ghosting and other distortion. These occur because of the nature of analogue transmission, which means that perturbations due to noise will be evident in the final output. Digital transmission overcomes this problem because digital signals are reduced to discrete values upon reception and hence small perturbations do not affect the final output. In a simplified example, if a binary message 1011 was transmitted with signal amplitudes [1.0 0.0 1.0 1.0] and received with signal amplitudes [0.9 0.2 1.1 0.9] it would still decode to the binary message 1011 — a perfect reproduction of what was sent. From this example, a problem with digital transmissions can also be seen: if the noise is great enough it can significantly alter the decoded message. Using forward error correction, a receiver can correct a handful of bit errors in the resulting message, but too much noise will lead to incomprehensible output and hence a breakdown of the transmission.

In digital television broadcasting, there are three competing standards that are likely to be adopted worldwide. These are the ATSC, DVB and ISDB standards; the adoption of these standards thus far is presented in the captioned map. All three standards use MPEG-2 for video compression. ATSC uses Dolby Digital AC-3 for audio compression, ISDB uses Advanced Audio Coding (MPEG-2 Part 7) and DVB has no single standard for audio compression. The choice of modulation also varies between the schemes. In digital audio broadcasting, standards are much more unified, with practically all countries choosing to adopt the Digital Audio Broadcasting standard (also known as the Eureka 147 standard). The exception is the United States, which has chosen to adopt HD Radio. HD Radio, unlike Eureka 147, is based upon a transmission method known as in-band on-channel transmission that allows digital information to "piggyback" on normal AM or FM analogue transmissions.

However, despite the pending switch to digital, analogue television is still transmitted in most countries. An exception is the United States, which ended analogue television transmission on 12 June 2009 after twice delaying the switch-over deadline. For analogue television, there are three standards in use. These are known as PAL, NTSC and SECAM. For analogue radio, the switch to digital is made more difficult by the fact that analogue receivers are a fraction of the cost of digital receivers. The choice of modulation for analogue radio is typically between amplitude modulation (AM) and frequency modulation (FM). To achieve stereo playback, an amplitude modulated subcarrier is used for stereo FM.
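The 1011 example above can be expressed in a few lines of code: each received amplitude is simply reduced to the nearest of the two discrete values. The threshold of 0.5 is an assumed decision point for illustration:

```python
def decode(received_amplitudes, threshold=0.5):
    """Recover a binary message by reducing each received amplitude
    to the nearest discrete value (0 or 1)."""
    return [1 if a >= threshold else 0 for a in received_amplitudes]

sent     = [1.0, 0.0, 1.0, 1.0]      # binary message 1011 as transmitted
received = [0.9, 0.2, 1.1, 0.9]      # the same message after mild noise
print(decode(received))              # [1, 0, 1, 1] -- perfectly reproduced

# With severe noise a sample crosses the threshold and the message is corrupted;
# forward error correction can repair a handful of such bit errors, but not many.
print(decode([0.9, 0.7, 1.1, 0.9]))  # [1, 1, 1, 1] -- a bit error
```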

The Internet

The OSI reference model

The Internet is a worldwide network of computers and computer networks that can communicate with each other using the Internet Protocol. Any computer on the Internet has a unique IP address that can be used by other computers to route information to it. Hence, any computer on the Internet can send a message to any other computer using its IP address. These messages carry with them the originating computer's IP address, allowing for two-way communication. In this way, the Internet can be seen as an exchange of messages between computers.

As of 2008, an estimated 21.9% of the world population had access to the Internet, with the highest access rates (measured as a percentage of the population) in North America (73.6%), Oceania/Australia (59.5%) and Europe (48.1%). In terms of broadband access, Iceland (26.7%), South Korea (25.4%) and the Netherlands (25.3%) led the world.

The Internet works in part because of protocols that govern how the computers and routers communicate with each other. The nature of computer network communication lends itself to a layered approach where individual protocols in the protocol stack run more-or-less independently of other protocols. This allows lower-level protocols to be customized for the network situation while not changing the way higher-level protocols operate. A practical example of why this is important is that it allows an Internet browser to run the same code regardless of whether the computer it is running on is connected to the Internet through an Ethernet or Wi-Fi connection. Protocols are often talked about in terms of their place in the OSI reference model (pictured above), which emerged in 1983 as the first step in an unsuccessful attempt to build a universally adopted networking protocol.
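A minimal sketch of this exchange of messages using TCP/IP sockets. The address and port below are placeholders (192.0.2.10 is a documentation address, not a real host), and the higher-level code stays the same whether the machine is connected via Ethernet or Wi-Fi, which is the layering point made above:

```python
import socket

def send_message(ip_address, port, payload):
    """Open a TCP connection to another computer's IP address, send a message,
    and wait for a reply. The reply can be routed back because every packet
    carries the sender's own IP address."""
    with socket.create_connection((ip_address, port), timeout=5) as conn:
        conn.sendall(payload.encode("utf-8"))
        return conn.recv(4096).decode("utf-8")

# Hypothetical usage -- would only work against a real listening host:
# reply = send_message("192.0.2.10", 8080, "hello")
```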
Local area networks

Despite the growth of the Internet, the characteristics of local area networks (computer networks that run at most a few kilometres) remain distinct. This is because networks on this scale do not require all the features associated with larger networks and are often more cost-effective and efficient without them.
In the mid-1980s, several protocol suites emerged to fill the gap between the data link and application layers of the OSI reference model. These were AppleTalk, IPX and NetBIOS, with the dominant protocol suite during the early 1990s being IPX due to its popularity with MS-DOS users. TCP/IP existed at this point but was typically only used by large government and research facilities. As the Internet grew in popularity and a larger percentage of traffic became Internet-related, local area networks gradually moved towards TCP/IP, and today networks mostly dedicated to TCP/IP traffic are common. The move to TCP/IP was helped by technologies such as DHCP that allowed TCP/IP clients to discover their own network address — a functionality that came standard with the AppleTalk/IPX/NetBIOS protocol suites.
It is at the data link layer though that most modern local area networks diverge from the Internet.

Media

Media can refer to channels of communication that serve many diverse functions, such as offering a variety of entertainment with either mass or specialized appeal, communicating news and information, or displaying advertising messages. The term can also refer to objects on which data can be stored, such as hard disks, floppy disks, CD-ROMs, and tapes.

Transmission Media:

A transmission medium (plural transmission media) is a material substance (solid, liquid or gas) which can propagate energy waves.
For example, the transmission medium for sound received by the ears is usually air, but solids and liquids may also act as transmission media for sound.
The absence of a material medium (the vacuum of empty space) can also be thought of as a transmission medium for electromagnetic waves such as light and radio waves. While material substance is not required for electromagnetic waves to propagate, such waves are usually affected by the transmission media through which they pass, for instance by absorption or by reflection or refraction at the interfaces between media.
The term transmission medium can also refer to the technical device which employs the material substance to transmit or guide the waves. Thus an optical fiber or a copper cable can be referred to as a transmission medium.
A transmission medium can be classified as a linear medium if different waves at any particular point in the medium can be superposed.
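Stated as a formula (a standard expression of the superposition principle, added here for clarity rather than taken from the text):

```latex
% Superposition in a linear medium: two waves y_1 and y_2 passing through the
% same point combine by simple addition.
y(x, t) = y_1(x, t) + y_2(x, t)
```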

Transmission Media

Transmission media are actually located below the physical layer and are directly controlled by the physical layer. You could say that transmission media belong to layer zero. Figure shows the position of transmission media in relation to the physical layer.


In data communications, the definition of the information and the transmission medium is more specific. The transmission medium is usually free space, metallic cable, or fiber-optic cable. The information is usually a signal that is the result of a conversion of data from another form.

Classes Of Transmission Media:

In telecommunications, transmission media can be divided into two broad categories: guided and unguided.

Guided media include twisted-pair cable, coaxial cable, and fiber-optic cable. The unguided medium is free space. Figure shows this taxonomy.

GUIDED MEDIA:






Guided media, which are those that provide a conduit from one device to another, include twisted-pair cable, coaxial cable, and fiber-optic cable. A signal traveling along any of these media is directed and contained by the physical limits of the medium. Twisted-pair and coaxial cable use metallic (copper) conductors that accept and transport signals in the form of electric current. Optical fiber is a cable that accepts and transports signals in the form of light.

Open Wire:






Open Wire is traditionally used to describe the electrical wire strung along power poles. There is a single wire strung between poles. No shielding or protection from noise interference is used. We are going to extend the traditional definition of Open Wire to include any data signal path without shielding or protection from noise interference. This can include multiconductor cables or single wires. This medium is susceptible to a large degree of noise and interference and is consequently not acceptable for data transmission except for short distances under 20 ft.

Twisted-Pair Cable

A twisted pair consists of two conductors (normally copper), each with its own plastic insulation, twisted together, as shown in Figure. One of the wires is used to carry signals to the receiver, and the other is used only as a ground reference. The receiver uses the difference between the two. In addition to the signal sent by the sender on one of the wires, interference (noise) and crosstalk may affect both wires and create unwanted signals.
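A toy numerical illustration of why the receiver "uses the difference between the two" wires. The signal and noise values are invented; the point is that noise coupled equally onto both conductors cancels when they are subtracted:

```python
def differential_receive(wire_a, wire_b):
    """Recover the signal by subtracting the two conductors of a twisted pair.
    Noise coupled equally onto both wires (common-mode noise) cancels."""
    return [a - b for a, b in zip(wire_a, wire_b)]

signal = [0.5, -0.5, 0.5, 0.5]        # what the sender drives onto wire A
noise  = [0.2, -0.1, 0.3, 0.0]        # interference picked up along the cable

wire_a = [s + n for s, n in zip(signal, noise)]  # signal plus coupled noise
wire_b = noise[:]                                # ground-reference wire, same noise
print(differential_receive(wire_a, wire_b))      # [0.5, -0.5, 0.5, 0.5]
```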

Unshielded Versus Shielded Twisted-Pair Cable

The most common twisted-pair cable used in communications is referred to as unshielded twisted-pair (UTP). IBM has also produced a version of twisted-pair cable for its use called shielded twisted-pair (STP). STP cable has a metal foil or braided mesh covering that encases each pair of insulated conductors. Although the metal casing improves the quality of the cable by preventing the penetration of noise or crosstalk, it is bulkier and more expensive. Figure 7.4 shows the difference between UTP and STP. Our discussion focuses primarily on UTP because STP is seldom used outside of IBM.

Applications





Coaxial cable was widely used in analog telephone networks where a single coaxial network could carry 10,000 voice signals. Later it was used in digital telephone networks where a single coaxial cable could carry digital data up to 600 Mbps. However, coaxial cable in telephone networks has largely been replaced today with fiber-optic cable.

Fiber-Optic Cable

A fiber-optic cable is made of glass or plastic and transmits signals in the form of light. To understand optical fiber, we first need to explore several aspects of the nature of light. Light travels in a straight line as long as it is moving through a single uniform substance. If a ray of light traveling through one substance suddenly enters another substance (of a different density), the ray changes direction. Figure shows how a ray of light changes direction when going from a more dense to a less dense substance.
As the figure shows, if the angle of incidence I (the angle the ray makes with the line perpendicular to the interface between the two substances) is less than the critical angle, the ray refracts and moves closer to the surface. If the angle of incidence is equal to the critical angle, the light bends along the interface. If the angle is greater than the critical angle, the ray reflects (makes a turn) and travels again in the denser substance. Note that the critical angle is a property of the substance, and its value differs from one substance to another.
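A worked example of the critical angle, using Snell's law with assumed refractive indices (a glass core of index 1.5 surrounded by air at 1.0; these values are illustrative, not from the text):

```python
import math

def critical_angle_degrees(n_dense, n_less_dense):
    """Critical angle from Snell's law: sin(theta_c) = n2 / n1,
    valid when light travels from the denser (n1) into the less dense (n2) medium."""
    return math.degrees(math.asin(n_less_dense / n_dense))

# Illustrative values: glass (n = 1.5) into air (n = 1.0)
print(round(critical_angle_degrees(1.5, 1.0), 1))  # about 41.8 degrees
```

Rays striking the boundary at angles greater than this stay inside the denser substance by total internal reflection, which is what keeps light confined inside an optical fiber.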


UNGUIDED MEDIA: WIRELESS

Unguided media transport electromagnetic waves without using a physical conductor. This type of communication is often referred to as wireless communication. Signals are normally broadcast through free space and thus are available to anyone who has a device capable of receiving them.
Figure shows the part of the electromagnetic spectrum, ranging from 3 kHz to 900 THz, used for wireless communication.

Unguided signals can travel from the source to the destination in several ways: ground propagation, sky propagation, and line-of-sight propagation, as shown in Figure.

The section of the electromagnetic spectrum defined as radio waves and microwaves is divided into eight ranges, called bands, each regulated by government authorities. These bands are rated from very low frequency (VLF) to extremely high frequency (EHF).

We can divide wireless transmission into three broad groups:
1. Radio waves
2. Microwaves
3. Infrared waves
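The bands mentioned above are defined by frequency, and each frequency corresponds to a wavelength through λ = c / f. A quick numerical sketch, assuming the standard ITU band edges of 3 kHz (bottom of VLF) and 300 GHz (top of EHF), which are not stated explicitly in the text:

```python
SPEED_OF_LIGHT = 3.0e8   # metres per second (approximate)

def wavelength_m(frequency_hz):
    """Wavelength of an electromagnetic wave, given its frequency."""
    return SPEED_OF_LIGHT / frequency_hz

print(wavelength_m(3e3))    # 3 kHz   -> 100,000 m  (very low frequency end)
print(wavelength_m(300e9))  # 300 GHz -> 0.001 m    (extremely high frequency end)
```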





Satellite Technology

Satellites are launched on rockets which are significantly larger than their scientific payload, requiring rocket fuel to reach outer space. Once the satellite disengages, solar panels open to power the sensors and navigation features. Newton's law of gravity and Kepler's laws of motion keep satellites on course.
The instruments that measure electromagnetic energy are called radiometers. We'll focus on two kinds of scanning radiometers designed to detect upwelling terrestrial radiation in discrete wavelengths: imagers and sounders. "Imagers" produce the striking images readily available on television or the Internet. Imagers can measure either visible sunlight reflected back to space from the Earth's surface and clouds or the amount of radiation emitted by these entities. The output voltage from a particular radiometer detector is proportional to the energy striking that detector area per unit time. On a satellite image, each pixel is assigned a shade according to the energy measured. "Sounder" is a shorter name for Vertical Atmospheric Sounder (VAS), but don't let the name mislead you: these radiometers measure infrared radiation, not sound waves. Sounders provide vertical profiles of temperature, pressure, water vapor and critical trace gases in Earth's atmosphere. Trace gas profiles such as carbon dioxide or ozone are important for climate studies, while temperature, water vapor and pressure information are crucial to forecasting severe weather.
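A toy sketch of the pixel-shading step described above. The detector readings and the scaling to an 8-bit greyscale range are invented for illustration only:

```python
def to_grayscale(energies, max_energy):
    """Map each detector reading (energy per unit time) to a pixel shade 0-255."""
    return [min(255, round(255 * e / max_energy)) for e in energies]

# Hypothetical detector readings in arbitrary units
print(to_grayscale([0.0, 1.2, 2.4, 4.8], max_energy=4.8))  # [0, 64, 128, 255]
```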
Satellite Orbits

There are two main satellite orbits: Low Earth Orbiting Satellites or LEO and Geostationary Earth Orbiting Satellites or GEO. Most satellite images seen on the local television news are produced by GEO satellites which orbit the earth above the equator at the same speed as the earth rotates in order to transmit a continuous picture of the region below. LEO orbits are lower, shorter and nearly perpendicular to GEO orbits, traveling from pole to pole taking "snapshots" of Earth rotating below.
GEO Satellites

Geostationary Earth Orbiting satellites hover over a single point at an altitude of about 36,000 kilometers. To maintain constant height and momentum, a geostationary satellite must be located over the equator. The United States typically operates two geostationary satellites called GOES, or Geostationary Operational Environmental Satellites. One has a good view of the East Coast; the other is focused on the West Coast. Other countries operate GEO satellite systems and scientists traditionally share data freely.