Saturday, October 3, 2009

An Explanation of Wireless Communication Technology Part 2: The Modern G

[Image: high-fiving phones]

We last left off discussing the transition from 1G to 2G, but popular advertising tells us that we can expect more. At this point, the technology becomes a bit too advanced to explain in an engaging and angst-appealing way, so descriptions will become looser in this segment. Additionally, I will attempt to use the word Telephony at least once more during this post, because for a long time I didn’t believe it was a real word.

Though the first 3G system was launched in Japan in 2001, 2G systems are still the most widely used networks in the world today. They were heavily invested in initially, and that kind of development was hard to dismantle and replace in the dawn of the extra G. In the United States, Motorola developed a 2G TDMA-based technology called iDEN (Integrated Digital Enhanced Network) that is still used by Sprint Nextel and SouthernLINC Wireless. Internationally, the most popular standard for mobile phones is called GSM (Global System for Mobile Communications), with its promoters estimating that this 2G technology is used by approximately 80% of the global market. Its ubiquity makes international roaming its strongest feature.

The transition from second- to third-generation platforms came in baby steps such as 2.5G and 2.75G. TDMA was discarded because of the increased pause length that needed to be injected between time slots. FDMA consumed too much bandwidth. So in addition to refining the compression methods of CDMA, these approaches were hybridized in various ways during the 2G-to-3G evolution.
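
For the curious, here is a toy Python sketch of the basic CDMA idea: two callers sharing the exact same frequency at the same time, separated only by their codes. The four-chip codes and bare integers are my own simplification; real CDMA uses long pseudo-random spreading sequences, power control, and actual radio modulation.

```python
# Toy illustration of CDMA spreading: two users share the same channel at the
# same time and frequency, separated only by orthogonal spreading codes.
# (Illustrative only -- real CDMA uses long pseudo-random codes, power control,
# and modulated radio signals, not bare integers.)

# Orthogonal Walsh-style codes for two users (their dot product is zero).
code_a = [1, 1, 1, 1]
code_b = [1, -1, 1, -1]

def spread(bits, code):
    """Represent each data bit (+1/-1) as a full code sequence."""
    return [bit * chip for bit in bits for chip in code]

def despread(signal, code):
    """Correlate the combined signal against one user's code to recover bits."""
    n = len(code)
    bits = []
    for i in range(0, len(signal), n):
        corr = sum(s * c for s, c in zip(signal[i:i + n], code))
        bits.append(1 if corr > 0 else -1)
    return bits

user_a_bits = [1, -1, 1]
user_b_bits = [-1, -1, 1]

# Both transmissions occupy the channel simultaneously; they simply add.
channel = [a + b for a, b in zip(spread(user_a_bits, code_a),
                                 spread(user_b_bits, code_b))]

print(despread(channel, code_a))  # [1, -1, 1]  -> user A recovered
print(despread(channel, code_b))  # [-1, -1, 1] -> user B recovered
```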

What truly made much of this advancement possible was GPRS (General Packet Radio Service). The principle of this technology was “packet-switching,” meaning that transmitted data was now grouped into blocks irrespective of type or content and then sent in bulk. This optimized the link capacity and made it possible to include WAP (Wireless Application Protocol, an open international standard used for web browsing), MMS (Multimedia Messaging Service, meaning picture messages and longer strings of text), and minimal internet services such as email access and online ringtone purchasing.
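
If you like seeing things in code, here is a tiny Python sketch of what “packet-switching” means in spirit: chop any payload into numbered blocks, send them independently, reassemble at the far end. The 16-byte packet size and sequence-number scheme are invented for illustration and have nothing to do with real GPRS framing.

```python
# Toy packet-switching demo: split a payload (text, picture bytes, etc.) into
# fixed-size numbered blocks and reassemble them at the receiver. The 16-byte
# packet size and header layout here are made up purely for illustration.

PACKET_SIZE = 16  # bytes of payload per packet (illustrative)

def packetize(data: bytes):
    """Split data into (sequence_number, chunk) packets."""
    return [(seq, data[i:i + PACKET_SIZE])
            for seq, i in enumerate(range(0, len(data), PACKET_SIZE))]

def reassemble(packets):
    """Rebuild the original payload, even if packets arrived out of order."""
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"Picture messages, WAP pages, and email all ride the same packets."
packets = packetize(message)
packets.reverse()  # simulate out-of-order arrival
assert reassemble(packets) == message
print(f"{len(packets)} packets delivered and reassembled correctly")
```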

The transition to 2.75G was marked by the hybrid GSM-EDGE network introduced by Cingular in 2003. The novelty of this network was that it was backwards compatible with 2G (one of the few that was) and was easily absorbed into the 3G entity. It introduced more sophisticated methods of coding and transmitting data that my BA in physics couldn’t quite prepare me to understand (one of these is called Phase-Shift Keying, if you’re so inclined to look it up). This system delivered higher bit-rates of transfer per channel of frequency, bringing broadband to cell phones.
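
If you want a taste of Phase-Shift Keying without the degree, here is a minimal Python sketch of 8-PSK, the modulation EDGE leans on: every group of three bits picks one of eight carrier phases, so each transmitted symbol carries three bits instead of one, which is the basic reason EDGE squeezes more data through the same channel. Real EDGE modulators also add Gray-coded bit mapping, a 3π/8 rotation, and pulse shaping, all of which I have skipped.

```python
import cmath

# Toy 8-PSK mapper: each group of 3 bits selects one of 8 phases of the
# carrier, so every symbol carries 3 bits instead of 1. Real EDGE modulators
# also use Gray coding, a 3*pi/8 rotation, and pulse shaping (omitted here).

def psk8_modulate(bits):
    """Map a bit string (length a multiple of 3) to complex 8-PSK symbols."""
    symbols = []
    for i in range(0, len(bits), 3):
        index = int(bits[i:i + 3], 2)          # 3 bits -> integer 0..7
        phase = 2 * cmath.pi * index / 8       # one of 8 evenly spaced phases
        symbols.append(cmath.exp(1j * phase))  # unit-magnitude carrier sample
    return symbols

for symbol in psk8_modulate("000011110"):
    print(f"{symbol.real:+.3f} {symbol.imag:+.3f}j")
```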

True 3G was unveiled at a big ITU conference, where it was first called IMT-2000 (a tech-age, human-angle name designed to grab the attention of the new-millennium gadget enthusiast). This family of standards absorbed GSM-EDGE and included other hot systems such as CDMA2000 and WiMAX.

I must point out here that the ITU has not actually provided a very clear or well-defined standard for the third generation of mobile technology. Where 0G, 1G, and 2G all have very distinct, very basic technological differences, 3G only offers “higher” data rates and a few basic minimums for speed. It accommodated both evolutionary and revolutionary technologies. It was supposedly more secure, thanks to more complicated coding and encryption algorithms. It boasted improved “spectral efficiency” (which can be expressed as the number of simultaneous phone calls per area per 1 MHz of frequency spectrum).
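
To make “spectral efficiency” concrete, here is a back-of-the-envelope Python calculation of how many simultaneous voice calls fit in a 5 MHz slice of spectrum. The efficiency figures (0.2 and 0.5 bits per second per hertz) and the 12.2 kbit/s voice codec rate are illustrative assumptions on my part, not official numbers for any particular network.

```python
# Back-of-the-envelope spectral-efficiency comparison. All numbers below are
# illustrative assumptions, not official figures for any real network.

def calls_supported(bandwidth_mhz, efficiency_bits_per_sec_per_hz,
                    bits_per_sec_per_call=12_200):
    """How many simultaneous voice calls fit in a slice of spectrum?"""
    capacity_bps = bandwidth_mhz * 1e6 * efficiency_bits_per_sec_per_hz
    return int(capacity_bps // bits_per_sec_per_call)

# Assume a 5 MHz carrier and a 12.2 kbit/s voice codec.
for label, efficiency in [("2G-ish", 0.2), ("3G-ish", 0.5)]:
    print(label, calls_supported(5, efficiency), "simultaneous calls")
```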

I realize that most people don’t care about the techno-terms, and so I’ll summarize:  the third generation of wireless telephony gave birth to the high-speed communication and media-information device…aka the Smart Phone.

With the 3G network, sending high-data messages (picture and video) was a breeze. Broadband speeds of communication turned the phone into a mobile, internet-accessing mini-computer. High data transfer rates made it possible to perform multiple functions simultaneously. New operating software was installed into a device that housed a central computing system, rather than just a radio transceiver.

The first true 3G network to hit America was none other than Verizon’s, in October 2003, which is why they still boast their status as the 3G forerunner. Verizon Wireless has lately been focusing their energy on increasing the strength, scope, reliability, and speed of their 3G cell tower network, rather than spending resources on 4G research or smart-device development.

Many companies didn’t adopt 3G because of the initial expense. In other countries, it was cheaper to bargain for increased 2G bandwidth than it was to install a whole new relay network. Some WCCs had rushed so quickly into the 2G era that they had amassed too much debt to invest (many of you might remember how companies like Verizon and Cingular and T-Mobile seemed to just pop up out of nowhere between 2003 and 2006, leaving other cell-staples in the dust). The cost of a smart phone was nearly five times that of a 2G phone, which deterred those who didn’t have that United-States-Innovation-Crazy attitude from investing in them. And with that much demand on the hardware, battery life dropped again and power output spiked back up, rekindling public fears of brain cancer and other radiation exposure.

As if 3G’s party lines weren’t cloudy enough, enter 4G, which is hardly fourth generation at all. Most likely, future innovators will look back on this entity cropping up in the marketplace today and designate it as another half or three-quarters G status. The only real clear detail is that, in order to be considered a “4G” network (currently called IMT-Advanced), a system must meet a data transfer speed of one Giga[billion]bit per second for a stationary system, and 100 Mega[million]bits per second for moving systems. It’s being developed to run applications like wireless broadband access, MMS, video chat, mobile TV, HDTV, and Digital Video Broadcasting all at the same time. It aims to be MORE spectrally efficient, to accommodate HIGHER network capacity, to have a minimum data transfer rate of at least 100 Mbits/s between any two points anywhere in the world, and to have a perfectly seamless handoff between networks (old, current, and international).
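
To put those target rates in perspective, here is a quick Python calculation of how long a roughly CD-sized (700 MB) file would take to transfer at each of them. The file size is just a familiar yardstick I picked, not anything from the spec.

```python
# What the IMT-Advanced targets mean in practice: time to move one file at
# each rate. The 700 MB file size is just a familiar example (about one CD).

FILE_BYTES = 700 * 1024 * 1024  # ~700 MB

rates_bits_per_sec = {
    "4G target, stationary (1 Gbit/s)": 1e9,
    "4G target, moving (100 Mbit/s)": 100e6,
}

for label, rate in rates_bits_per_sec.items():
    seconds = FILE_BYTES * 8 / rate
    print(f"{label}: {seconds:.1f} s")
```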

Ambitious? About as much as it is vague and secret, yes. It’s packet-switched just like 3G, but that’s about as much as I (or anyone on the internet, it seems) am capable of understanding without an advanced degree. Newer MA systems include hot-bed technologies like Orthogonal FDMA (OFDMA), Single Carrier FDMA (SC-FDMA), Interleaved FDMA, Multi-Carrier CDMA (you get the point), and if you want to try to understand these systems, more power to you.
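
That said, the core trick behind the “O” in OFDMA can at least be sketched in a few lines of Python: data symbols are parked on many narrow, mutually orthogonal subcarriers and combined with an inverse FFT. This is a bare-bones illustration of plain OFDM, not any real 4G transmitter; cyclic prefixes, pilot tones, channel coding, and the multi-user scheduling that makes it “OFDMA” are all missing.

```python
import numpy as np

# Bare-bones OFDM round trip: put one data symbol on each of N orthogonal
# subcarriers and combine them with an inverse FFT. Real OFDMA systems add
# cyclic prefixes, pilot tones, and channel coding; none of that appears here.

N_SUBCARRIERS = 64

rng = np.random.default_rng(0)
# QPSK-style symbols: one of four constellation points per subcarrier.
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=N_SUBCARRIERS)

time_signal = np.fft.ifft(symbols)   # what the radio would transmit
recovered = np.fft.fft(time_signal)  # what the receiver gets back

print(np.allclose(recovered, symbols))  # True: subcarriers stay orthogonal
```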

4G is only available in select cities this year, and the legal obstacles for it are significant. It certainly will be expensive, both to build and to participate in, but I will keep my opinions to myself as much as I tried to keep my personal feelings out of these two articles…

And that is the end of my learning affair with G. Thanks for reading!