A modulator-demodulator or modem is a computer hardware device that converts data from a digital format into a format suitable for an analog transmission medium such as telephone or radio. A modem transmits data by modulating one or more carrier wave signals to encode digital information, while the receiver demodulates the signal to recreate the original digital information. The goal is to produce a signal that can be transmitted easily and decoded reliably. Modems can be used with almost any means of transmitting analog signals, from light-emitting diodes to radio.
Early modems were devices that used audible sounds suitable for transmission over traditional telephone systems and leased lines. These generally operated at 110 or 300 bits per second (bit/s), and the connection between devices was normally manual, using an attached telephone handset. By the 1970s, higher speeds of 1,200 and 2,400 bit/s for asynchronous dial connections, 4,800 bit/s for synchronous leased line connections and 35 kbit/s for synchronous conditioned leased lines were available. By the 1980s, less expensive 1,200 and 2,400 bit/s dial-up modems were being released, and modems working on radio and other systems were available. As device sophistication grew rapidly in the late 1990s, telephone-based modems quickly exhausted the available bandwidth, reaching the ultimate standard of 56 kbit/s.

The rise of public use of the internet during the late 1990s led to demands for much higher performance, leading to the move away from audio-based systems to entirely new encodings on cable television lines and short-range signals in subcarriers on telephone lines. The move to cellular telephones, especially in the late 1990s, and the emergence of smartphones in the 2000s led to the development of ever-faster radio-based systems. Today, modems are ubiquitous and largely invisible, included in almost every mobile computing device in one form or another, and generally capable of speeds on the order of tens or hundreds of megabits per second.

Modems are frequently classified by the maximum amount of data they can send in a given unit of time, usually expressed in bits per second (symbol bit/s, sometimes abbreviated "bps") or rarely in bytes per second (symbol B/s). Modern broadband modem speeds are typically expressed in megabits per second (Mbit/s). Historically, modems were often classified by their symbol rate, measured in baud. The baud unit denotes symbols per second, or the number of times per second the modem sends a new signal.
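To make the relationship concrete, here is a short illustrative sketch (not tied to any particular standard) computing the bit rate from the symbol rate and the number of distinct symbols the modem can send:

```python
import math

def bit_rate(symbol_rate_baud: int, num_symbols: int) -> int:
    """Bit rate in bit/s: symbol rate times the bits encoded per symbol.

    A modem that distinguishes num_symbols distinct signal states carries
    log2(num_symbols) bits with each symbol it transmits.
    """
    bits_per_symbol = int(math.log2(num_symbols))
    return symbol_rate_baud * bits_per_symbol

# Two tones (1 bit per symbol) at 300 baud -> 300 bit/s
print(bit_rate(300, 2))
# Four phase states (2 bits per symbol) at 600 baud -> 1,200 bit/s
print(bit_rate(600, 4))
```

The two calls correspond to the V.21 and V.22 examples discussed below: identical-looking "speed" numbers can come from very different combinations of symbol rate and symbol alphabet.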
For example, the ITU V.21 standard used audio frequency-shift keying with two possible frequencies, corresponding to two distinct symbols (one bit per symbol), to carry 300 bits per second using 300 baud. By contrast, the original ITU V.22 standard, which could transmit and receive four distinct symbols (two bits per symbol), transmitted 1,200 bits per second by sending 600 symbols per second (600 baud) using phase-shift keying.

Many modems are variable-rate, permitting them to be used over a medium with less than ideal characteristics, such as a telephone line that is of poor quality or is too long. This capability is often adaptive, so that a modem can discover the maximum practical transmission rate during the connect phase, or during operation.

Collection of modems once used in Australia, including dial-up, DSL, and cable modems

Modems grew out of the need to connect teleprinters over ordinary phone lines instead of the more expensive leased lines which had previously been used for current loop–based teleprinters and automated telegraphs. The earliest devices that satisfy the definition of a modem may be the multiplexers used by news wire services in the 1920s.[1] In 1941, the Allies developed a voice encryption system called SIGSALY which used a vocoder to digitize speech, then encrypted the speech with a one-time pad and encoded the digital data as tones using frequency-shift keying. This was also a digital modulation technique, making this an early modem.[2] Commercial modems largely did not become available until the late 1950s, when the rapid development of computer technology created demand for a method of connecting computers together over long distances, resulting in the Bell Company and then other businesses producing an increasing number of computer modems for use over both switched and leased telephone lines.
Later developments would produce modems that operated over cable television lines, power lines, and various radio technologies, as well as modems that achieved much higher speeds over telephone lines.

Dial-up

A dial-up modem transmits computer data over an ordinary switched telephone line that has not been designed for data use. This contrasts with leased line modems, which also operate over lines provided by a telephone company, but ones which are intended for data use and do not impose the same signaling constraints. The modulated data must fit the frequency constraints of a normal voice audio signal. Early modems, including acoustic coupled modems, relied on the communicating parties or an automatic calling unit to dial and establish a voice connection before switching their modems to the line; more modern devices are able to perform the actions needed to connect a call through a telephone exchange, e.g., picking up the line, dialing, understanding signals sent back by phone company equipment (dialtone, ringing, busy signal), recognizing incoming ring signals, and answering calls.

Dial-up modems have been made in a wide variety of speeds and capabilities, with many capable of testing the line they are calling over and selecting the most advanced signaling mode that the line can support. Generally speaking, the fastest dial-up modems ever available to consumers never exceeded 56 kbit/s and never achieved that speed in both directions. The dial-up modem was once a widely known technology, since it was mass-marketed to consumers in many countries for dial-up internet access. In the 1990s, tens of millions of people in the United States used dial-up modems for internet access.[3] Dial-up service has since been largely supplanted by broadband internet,[4] such as DSL, which typically still uses a modem, but of a very different type which may still operate over a normal phone line, but with substantially relaxed constraints.
History

1950s

TeleGuide terminal

Mass production of telephone line modems in the United States began as part of the SAGE air-defense system in 1958, connecting terminals at various airbases, radar sites, and command-and-control centers to the SAGE director centers scattered around the United States and Canada. Shortly afterwards in 1959, the technology in the SAGE modems was made available commercially as the Bell 101, which provided 110 bit/s speeds. Bell called this and several other early modems "datasets".

1960s

Some early modems were based on touch-tone frequencies, such as Bell 400-style touch-tone modems.[5] The Bell 103A standard was introduced by AT&T in 1962. It provided full-duplex service at 300 bit/s over normal phone lines. Frequency-shift keying was used, with the call originator transmitting at 1,070 or 1,270 Hz and the answering modem transmitting at 2,025 or 2,225 Hz.[6] The 103 modem would eventually become a de facto standard once third-party (non-AT&T) modems reached the market, and throughout the 1970s, independently made modems compatible with the Bell 103 de facto standard were commonplace.[7] Example models included the Novation CAT and the Anderson-Jacobson. A lower-cost option was the Pennywhistle modem, designed to be built using readily available parts.[8] Teletype machines were granted access to remote networks such as the Teletypewriter Exchange using the Bell 103 modem.[9] AT&T also produced reduced-cost units, the originate-only 113D and the answer-only 113B/C modems.

1970s

The 201A Data-Phone was a synchronous modem using two-bit-per-symbol phase-shift keying (PSK) encoding, achieving 2,000 bit/s half-duplex over normal phone lines.[10] In this system the two tones for any one side of the connection are sent at similar frequencies as in the 300 bit/s systems, but slightly out of phase.
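The frequency-shift keying used by the Bell 103 can be sketched in a few lines: each bit simply selects one of two tones for one symbol period. This is an illustrative sketch only; the sample rate is an arbitrary choice for the example, not part of the standard.

```python
import math

# Bell 103 originate-side FSK: mark (1) = 1,270 Hz, space (0) = 1,070 Hz.
MARK_HZ, SPACE_HZ = 1270, 1070
SAMPLE_RATE = 8000   # assumed sample rate, chosen only for this sketch
BAUD = 300           # one bit per symbol, hence 300 bit/s

def fsk_modulate(bits: str) -> list:
    """Return raw audio samples (floats in [-1, 1]) encoding a bit string."""
    samples_per_bit = SAMPLE_RATE // BAUD
    out, phase = [], 0.0
    for bit in bits:
        freq = MARK_HZ if bit == "1" else SPACE_HZ
        step = 2 * math.pi * freq / SAMPLE_RATE  # phase advance per sample
        for _ in range(samples_per_bit):
            out.append(math.sin(phase))
            phase += step
    return out

audio = fsk_modulate("10110")
print(len(audio))  # 5 bits * 26 samples per bit = 130 samples
```

A receiver would run the inverse process, deciding between the two frequencies for each symbol period; keeping the phase continuous across bit boundaries, as above, avoids audible clicks on the line.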
In early 1973, Vadic introduced the VA3400, which performed full-duplex at 1,200 bit/s over a normal phone line.[11] In November 1976, AT&T introduced the 212A modem, similar in design, but using the lower frequency set for transmission. It was not compatible with the VA3400,[12] but it would operate with the 103A modem at 300 bit/s. In 1977, Vadic responded with the VA3467 triple modem, an answer-only modem sold to computer center operators that supported Vadic's 1,200-bit/s mode, AT&T's 212A mode, and 103A operation.[13]

The original 300-baud Hayes Smartmodem

1980s

A significant advance in modems was the Hayes Smartmodem, introduced in 1981. The Smartmodem was an otherwise standard 103A 300 bit/s direct-connect modem, but it introduced a command language which allowed the computer to make control requests, such as commands to dial or answer calls, over the same RS-232 interface used for the data connection.[14] The command set used by this device became a de facto standard, the Hayes command set, which was integrated into devices from many other manufacturers. Automatic dialing was not a new capability (it had been available via separate Automatic Calling Units, and via modems using the X.21 interface[15]), but the Smartmodem made it available in a single device that could be used with even the most minimal implementations of the ubiquitous RS-232 interface, making this capability accessible from virtually any system or language.[16]

The introduction of the Smartmodem made communications much simpler and more easily accessed. This provided a growing market for other vendors, who licensed the Hayes patents and competed on price or by adding features.[17] This eventually led to legal action over use of the patented Hayes command language.[18] Dial modems generally remained at 300 and 1,200 bit/s (eventually becoming standards such as V.21 and V.22) into the mid-1980s. In 1984, V.22bis was created, a 2,400-bit/s system similar in concept to the 1,200-bit/s Bell 212.
This bit rate increase was achieved by defining four or eight distinct symbols, which allowed the encoding of two or three bits per symbol instead of only one. By the late 1980s, many modems could support improved standards like this, and 2,400-bit/s operation was becoming common.

Increasing modem speed greatly improved the responsiveness of online systems and made file transfer practical. This led to rapid growth of online services with large file libraries, which in turn gave more reason to own a modem. The rapid uptake of modems led to a similar rapid increase in BBS use. The introduction of microcomputer systems with internal expansion slots made small internal modems practical. This led to a series of popular modems for the S-100 bus and Apple II computers that could directly dial out, answer incoming calls, and hang up entirely from software, the basic requirements of a bulletin board system (BBS). The seminal CBBS, for instance, was created on an S-100 machine with a Hayes internal modem, and a number of similar systems followed.

Echo cancellation became a feature of modems in this period, which improved the bandwidth available to both modems by allowing them to ignore their own reflected signals. Additional improvements were introduced by quadrature amplitude modulation (QAM) encoding, which increased the number of bits per symbol to four through a combination of phase shift and amplitude. Transmitting at 1,200 baud produced the 4,800 bit/s V.27ter standard, and at 2,400 baud the 9,600 bit/s V.32. The carrier frequency was 1,650 Hz in both systems.

The introduction of these higher-speed systems also led to the development of the digital fax machine during the 1980s. While early fax technology also used modulated signals on a phone line, digital fax used the now-standard digital encoding used by computer modems. This eventually allowed computers to send and receive fax images.
1990s

USRobotics Sportster 14,400 Fax modem (1994)

In the early 1990s, V.32 modems operating at 9,600 bit/s were introduced, but were expensive and were only starting to enter the market when V.32bis was standardized, which operated at 14,400 bit/s. Rockwell International's chip division developed a new driver chip set incorporating the V.32bis standard and priced it aggressively. Supra, Inc. arranged a short-term exclusivity arrangement with Rockwell and developed the SupraFAXModem 14400 based on it. Introduced in January 1992 at $399 (or less), it was half the price of the slower V.32 modems already on the market. This led to a price war, and by the end of the year V.32 was dead, never having been really established, and V.32bis modems were widely available for $250.

V.32bis was so successful that the older high-speed standards had little advantage. USRobotics (USR) fought back with a 16,800 bit/s version of HST, while AT&T introduced a one-off 19,200 bit/s method they referred to as V.32ter, but neither non-standard modem sold well.

V.34 modem implemented as an internal ISA card

V.34 data/fax modem as PC card for notebooks
Consumer interest in these proprietary improvements waned during the lengthy introduction of the 28,800 bit/s V.34 standard. While waiting, several companies decided to release hardware early and introduced modems they referred to as V.FAST. In order to guarantee compatibility with V.34 modems once the standard was ratified (1994), manufacturers used more flexible components, generally a DSP and microcontroller, as opposed to purpose-designed ASIC modem chips. This would allow later firmware updates to conform with the standard once ratified.

The ITU standard V.34 represents the culmination of these joint efforts. It employed the most powerful coding techniques available at the time, including channel encoding and shape encoding. From the mere four bits per symbol (9.6 kbit/s), the new standards used the functional equivalent of 6 to 10 bits per symbol, plus increasing baud rates from 2,400 to 3,429, to create 14.4, 28.8, and 33.6 kbit/s modems. This rate is near the theoretical Shannon limit of a phone line.[19]

56 kbit/s technologies

While 56,000 bit/s speeds had been available for leased-line modems for some time, they did not become available for dial-up modems until the late 1990s.

Dial-up modem bank at an ISP

In the late 1990s, technologies to achieve speeds above 33.6 kbit/s began to be introduced. Several approaches were used, but all of them began as solutions to a single fundamental problem with phone lines. By the time technology companies began to investigate speeds above 33.6 kbit/s, telephone companies had switched almost entirely to all-digital networks. As soon as a phone line reached a local central office, a line card converted the analog signal from the subscriber to a digital one and vice versa. While digitally encoded telephone lines notionally provide the same bandwidth as the analog systems they replaced, the digitization itself placed constraints on the types of waveforms that could be reliably encoded.
The first problem was that the process of analog-to-digital conversion is intrinsically lossy, but second, and more importantly, the digital signals used by the telcos were not "linear": they did not encode all frequencies the same way, instead utilizing a nonlinear encoding (μ-law and A-law) meant to match the nonlinear response of the human ear to voice signals. This made it very difficult to find a 56 kbit/s encoding that could survive the digitizing process.

Modem manufacturers discovered that, while the analog-to-digital conversion could not preserve higher speeds, digital-to-analog conversion could. Because it was possible for an ISP to obtain a direct digital connection to a telco, a digital modem (one that connects directly to a digital telephone network interface, such as T1 or PRI) could send a signal that utilized every bit of bandwidth available in the system. While that signal still had to be converted back to analog at the subscriber end, that conversion would not distort the signal in the way that the opposite direction did.

Early 56k dial-up products

The first 56k dial-up option was a proprietary design from USRobotics, which they called "X2" because 56k was twice the speed (×2) of 28k modems. At that time, USRobotics held a 40% share of the retail modem market, while Rockwell International held an 80% share of the modem chipset market. Concerned with being shut out, Rockwell began work on a rival 56k technology and joined with Lucent and Motorola to develop what they called "K56Flex" or just "Flex". Both technologies reached the market around February 1997; although problems with K56Flex modems were noted in product reviews through July, within six months the two technologies worked equally well, with variations dependent largely on local connection characteristics.[20] The retail price of these early 56k modems was about US$200, compared to $100 for standard 33k modems.
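The μ-law companding that complicated 56k encoding can be illustrated directly. The continuous-domain sketch below shows why the encoding is nonlinear: quiet samples are stretched across proportionally more of the output range than loud ones. Real line cards additionally quantize the companded value to 8-bit codewords, which is omitted here for brevity.

```python
import math

MU = 255  # μ-law parameter used in North American telephone networks

def mu_law_compress(x: float) -> float:
    """Map a linear sample in [-1, 1] to a companded value in [-1, 1]."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def mu_law_expand(y: float) -> float:
    """Inverse of mu_law_compress: recover the linear sample."""
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

# Quiet signals occupy a disproportionately large share of the output range:
for x in (0.01, 0.1, 1.0):
    print(round(mu_law_compress(x), 3))
```

A waveform crafted for a flat, linear channel is warped by this curve, which is why finding an analog signal that survived the subscriber-side analog-to-digital step at 56 kbit/s proved so difficult, while the digital-to-analog direction could be driven codeword by codeword.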
Compatible equipment was also required at the Internet service provider's (ISP's) end, with costs varying depending on whether their existing equipment could be upgraded. About half of all ISPs offered 56k support by October 1997. Consumer sales were relatively low, which USRobotics and Rockwell attributed to conflicting standards.[21]

Standardized 56k (V.90/V.92)

In February 1998, the International Telecommunication Union (ITU) announced the draft of a new 56 kbit/s standard, V.90, with strong industry support. Incompatible with either existing standard, it was an amalgam of both, but was designed to allow both types of modem to be converted to it by a firmware upgrade. The V.90 standard was approved in September 1998 and widely adopted by ISPs and consumers.[21][22]

The V.92 standard was approved by the ITU in November 2000[23] and utilized digital PCM technology to increase the upload speed to a maximum of 48 kbit/s. The high upload speed was a tradeoff: a 48 kbit/s upstream rate would reduce the downstream to as low as 40 kbit/s due to echo effects on the line. To avoid this problem, V.92 modems offer the option to turn off the digital upstream and instead use a plain 33.6 kbit/s analog connection in order to maintain a high digital downstream of 50 kbit/s or higher.[24]

V.92 also added two other features. The first is the ability for users who have call waiting to put their dial-up Internet connection on hold for extended periods of time while they answer a call. The second is the ability to quickly connect to one's ISP, achieved by remembering the analog and digital characteristics of the telephone line and using this saved information when reconnecting.

Evolution of dial-up speeds

These values are maximum values, and actual values may be slower under certain conditions (for example, noisy phone lines).[25] For a complete list, see the companion article list of device bandwidths. A baud is one symbol per second; each symbol may encode one or more data bits.
Compression
Many dial-up modems implement standards for data compression to achieve higher effective throughput for the same bitrate. V.44 is an example, used in conjunction with V.92 to achieve speeds greater than 56k over ordinary phone lines.

As telephone-based 56k modems began losing popularity, some Internet service providers such as Netzero/Juno, Netscape, and others started using pre-compression to increase apparent throughput. This server-side compression can operate much more efficiently than the on-the-fly compression performed within modems, because the compression techniques are content-specific (JPEG, text, EXE, etc.). Website text, images, and Flash media are typically compacted to approximately 4%, 12%, and 30% of their original size, respectively. The drawback is a loss in quality, as the servers use lossy compression, which causes images to become pixelated and smeared. ISPs employing this approach often advertise it as "accelerated dial-up". Such accelerated downloads are integrated into the Opera and Amazon Silk web browsers, using their own server-side text and image compression.

Methods of attachment

Dial-up modems can attach in two different ways: with an acoustic coupler, or with a direct electrical connection.

Directly connected modems

The case Hush-A-Phone Corp. v. United States, which legalized acoustic couplers, applied only to mechanical connections to a telephone set, not electrical connections to the telephone line. The Carterfone decision of 1968, however, permitted customers to attach devices directly to a telephone line as long as they followed stringent Bell-defined standards for non-interference with the phone network.[31] This opened the door to independent (non-AT&T) manufacture of direct-connect modems that plugged directly into the phone line rather than going through an acoustic coupler.
While Carterfone required AT&T to permit connection of devices, AT&T successfully argued that they should be allowed to require the use of a special device placed between the third-party modem and the line, called a Data Access Arrangement or DAA, to protect their network. The use of DAAs was mandatory from 1969 to 1975, when the new FCC Part 68 rules allowed the use of devices without a Bell-provided DAA, subject to equivalent circuitry being included in the third-party device.[32] Virtually all modems produced after the 1980s are direct-connect.

Acoustic couplers

The Novation CAT acoustically coupled modem

While Bell (AT&T) provided modems that attached via direct wire connection to the phone network as early as 1958, their regulations at the time did not permit the direct electrical connection of any non-Bell device to a telephone line. However, the Hush-A-Phone ruling allowed customers to attach any device to a telephone set as long as it did not interfere with its functionality. This allowed third-party (non-Bell) manufacturers to sell modems utilizing an acoustic coupler.[31]

With an acoustic coupler, an ordinary telephone handset was placed in a cradle containing a speaker and microphone positioned to match up with those on the handset. The tones used by the modem were transmitted and received into the handset, which then relayed them to the phone line.[33] Because the modem was not electrically connected, it was incapable of picking up, hanging up, or dialing, all of which required direct control of the line. Touch-tone dialing would have been possible, but touch-tone was not universally available at this time. Consequently, the dialing process was executed by the user lifting the handset, dialing, then placing the handset on the coupler. To accelerate this process, a user could purchase a dialer or Automatic Calling Unit.
Automatic Calling Units / Dialers

Early modems could not place or receive calls on their own, but required human intervention for these steps. As early as 1964, Bell provided Automatic Calling Units that connected separately to a second serial port on a host machine and could be commanded to open the line, dial a number, and even ensure the far end had successfully connected before transferring control to the modem.[34] Later on, third-party models would become available, sometimes known simply as dialers, offering features such as the ability to automatically sign in to time-sharing systems.[35] Eventually this capability would be built into modems and no longer require a separate device.

Controller-based modems vs. soft modems

A PCI Winmodem soft modem (on the left) next to a conventional ISA modem (on the right)

Prior to the 1990s, modems contained all the electronics and intelligence to convert data in discrete form to an analog (modulated) signal and back again, and to handle the dialing process, as a mix of discrete logic and special-purpose chips. This type of modem is sometimes referred to as controller-based.[36]

In 1993, Digicom introduced the Connection 96 Plus, a modem which replaced the discrete and custom components with a general-purpose digital signal processor, which could be reprogrammed to upgrade to newer standards.[37] Subsequently, USRobotics released the Sportster Winmodem, a similarly upgradable DSP-based design.[38] As this design trend spread, both terms – soft modem and Winmodem – obtained a negative connotation in non-Windows-based computing circles, because the drivers were either unavailable for non-Windows platforms, or were only available as unmaintainable closed-source binaries, a particular problem for Linux users.[39] Later in the 1990s, software-based modems became available.
These are essentially sound cards; in fact, a common design uses the AC'97 audio codec, which provides multichannel audio to a PC and includes three audio channels for modem signals. The audio sent and received on the line by a modem of this type is generated and processed entirely in software, often in a device driver. There is little functional difference from the user's perspective, but this design reduces the cost of a modem by moving most of the processing into inexpensive software instead of expensive hardware DSPs or discrete components.

Soft modems of both types are either internal cards or devices that connect over external buses such as USB. They never utilize RS-232, because they require high-bandwidth channels to the host computer to carry the raw audio signals generated (sent) or analyzed (received) by software. Since the interface is not RS-232, there is no standard for communicating with the device directly. Instead, soft modems come with drivers which create an emulated RS-232 port, which standard modem software (such as an operating system dialer application) can communicate with.

Voice/fax modems

"Voice" and "fax" are terms added to describe any dial-up modem that is capable of recording/playing audio or transmitting/receiving faxes. Some modems are capable of all three functions.[40] Voice modems are used for computer telephony integration applications as simple as placing/receiving calls directly through a computer with a headset, and as complex as fully automated robocalling systems. Fax modems can be used for computer-based faxing, in which faxes are sent and received without inbound or outbound faxes ever needing to be printed on paper. This differs from efax, in which faxing occurs over the internet, in some cases involving no phone lines whatsoever.
Modem Over IP (Modem Relay)

The ITU-T V.150.1 Recommendation defines procedures for the inter-operation of PSTN-to-IP gateways.[41] In a classic example of this setup, each dial-up modem would connect to a modem relay gateway. The gateways are then connected to an IP network (such as the Internet). The analog connection from the modem is terminated at the gateway and the signal is demodulated. The demodulated control signals are transported over the IP network in an RTP packet type defined as State Signaling Events (SSEs). The data from the demodulated signal is sent over the IP network via a transport protocol (also defined as an RTP payload) called Simple Packet Relay Transport (SPRT). Both the SSE and SPRT packet formats are defined in the V.150.1 Recommendation (Annex C and Annex B respectively). The gateway at the remote end that receives the packets uses the information to re-modulate the signal for the modem connected at that end. While the V.150.1 Recommendation is not widely deployed, a pared-down version of the recommendation called "Minimum Essential Requirements (MER) for V.150.1 Gateways" (SCIP-216) is used in secure telephony applications.[42]

Cloud-based Modems

While the modem has traditionally been a hardware device, fully software-based modems that can be deployed in a cloud environment (such as Microsoft Azure or AWS) do exist.[43] Leveraging a Voice-over-IP (VoIP) connection through a SIP trunk, the modulated audio samples are generated and sent over an IP network via RTP using an uncompressed audio codec (such as G.711 μ-law or A-law).

Popularity

A 1994 Software Publishers Association study found that although 60% of computers in US households had a modem, only 7% of households went online.[44] A CEA study in 2006 found that dial-up Internet access was declining in the US.
In 2000, dial-up Internet connections accounted for 74% of all US residential Internet connections. The United States demographic pattern for dial-up modem users per capita has been more or less mirrored in Canada and Australia for the past 20 years. Dial-up modem use in the US had dropped to 60% by 2003, and in 2006 stood at 36%. Voiceband modems were once the most popular means of Internet access in the US, but with the advent of new ways of accessing the Internet, the traditional 56k modem was losing popularity. The dial-up modem is still widely used by customers in rural areas where DSL, cable, wireless broadband, satellite, or fiber optic service is either not available or where customers are unwilling to pay what the available broadband companies charge.[45] In its 2012 annual report, AOL showed it still collected around $700 million in fees from about three million dial-up users.

TTY/TDD

TDD devices are a subset of the teleprinter intended for use by the deaf or hard of hearing, essentially a small teletype with a built-in dial-up modem and acoustic coupler. The first models, produced in 1964, utilized FSK modulation much like early computer modems.

Leased-line modems

A leased line modem also uses ordinary phone wiring, like dial-up and DSL, but does not use the same network topology. While dial-up uses a normal phone line and connects through the telephone switching system, and DSL uses a normal phone line but connects to equipment at the telco central office, leased lines do not terminate at the telco. Leased lines are pairs of telephone wire that have been connected together at one or more telco central offices so that they form a continuous circuit between two subscriber locations, such as a business' headquarters and a satellite office. They provide no power or dialtone; they are simply a pair of wires connected at two distant locations.
A dialup modem will not function across this type of line, because it does not provide the power, dialtone and switching that those modems require. However, a modem with leased-line capability can operate over such a line, and in fact can have greater performance because the line is not passing through the telco switching equipment, the signal is not filtered, and therefore greater bandwidth is available. Leased-line modems can operate in 2-wire or 4-wire mode. The former uses a single pair of wires and can only transmit in one direction at a time, while the latter uses two pairs of wires and can transmit in both directions simultaneously. When two pairs are available, bandwidth can be as high as 1.5 Mbit/s, a full data T1 circuit.[46]
Broadband

DSL modem

Cable modem

The term broadband was previously[47][48] used to describe communications faster than what was available on voice-grade channels. The term gained widespread adoption in the late 1990s to describe internet access technology exceeding the 56 kbit/s maximum of dial-up. There are many broadband technologies, such as the various DSL (digital subscriber line) technologies and cable broadband.

DSL technologies such as ADSL, HDSL, and VDSL use telephone lines (wires that were installed by a telephone company and originally intended for use by a telephone subscriber) but do not utilize most of the rest of the telephone system. Their signals are not sent through ordinary phone exchanges, but are instead received by special equipment (a DSLAM) at the telephone company central office. Because the signal does not pass through the telephone exchange, no "dialing" is required, and the bandwidth constraints of an ordinary voice call are not imposed. This allows much higher frequencies, and therefore much faster speeds. ADSL in particular is designed to permit voice calls and data usage over the same line simultaneously. Similarly, cable modems use infrastructure originally intended to carry television signals, and like DSL, typically permit receiving television signals at the same time as broadband internet service. Other broadband modems include FTTx modems, satellite modems, and power line modems.

Terminology

Different terms are used for broadband modems, because they frequently contain more than just a modulation/demodulation component. Because high-speed connections are frequently used by multiple computers at once, many broadband modems do not have direct (e.g. USB) PC connections, but connect over a network such as Ethernet or Wi-Fi. Early broadband modems offered Ethernet handoff, allowing the use of one or more public IP addresses, but no other services such as NAT and DHCP that would allow multiple computers to share one connection.
This led to many consumers purchasing separate "broadband routers," placed between the modem and their network, to perform these functions.[49][50] Eventually, ISPs began providing residential gateways which combined the modem and broadband router into a single package that provided routing, NAT, security features, and even Wi-Fi access in addition to modem functionality, so that subscribers could connect their entire household without purchasing any extra equipment. Even later, these devices were extended to provide "triple play" features such as telephony and television service. Nonetheless, these devices are still often referred to simply as "modems" by service providers and manufacturers.[51] Consequently, the terms "modem", "router", and "gateway" are now used interchangeably in casual speech, but in a technical context "modem" may carry a specific connotation of basic functionality with no routing or other features, while the others describe a device with features such as NAT.[52][53] Broadband modems may also handle authentication such as PPPoE. While it is often possible to authenticate a broadband connection from a user's PC, as was the case with dial-up internet service, moving this task to the broadband modem allows it to establish and maintain the connection itself, which makes sharing access between PCs easier since each one does not have to authenticate separately. Broadband modems typically remain authenticated to the ISP as long as they are powered on.
Any communication technology sending digital data wirelessly involves a modem. This includes direct broadcast satellite, Wi-Fi, WiMAX, mobile phones, GPS, Bluetooth and NFC. Modern telecommunications and data networks also make extensive use of radio modems where long-distance data links are required. Such systems are an important part of the PSTN, and are also in common use for high-speed computer network links to outlying areas where fiber optic is not economical. Wireless modems come in a variety of types, bandwidths, and speeds, and are often referred to as transparent or smart. They transmit information that is modulated onto a carrier frequency to allow many wireless communication links to work simultaneously on different frequencies. Transparent modems operate in a manner similar to their phone-line modem cousins. They are typically half-duplex, meaning that they cannot send and receive data at the same time. Typically, transparent modems are polled in a round-robin manner to collect small amounts of data from scattered locations that do not have easy access to wired infrastructure, and are most commonly used by utility companies for data collection. Smart modems come with media access controllers inside, which prevents random data from colliding and resends data that is not correctly received. Smart modems typically require more bandwidth than transparent modems, and typically achieve higher data rates. The IEEE 802.11 standard defines a short-range modulation scheme that is used on a large scale throughout the world.
Mobile broadband
Modems which use a mobile telephone system (GPRS, UMTS, HSPA, EVDO, WiMAX, 5G etc.) are known as mobile broadband modems (sometimes also called wireless modems).
Wireless modems can be embedded inside a laptop, mobile phone or other device, or be connected externally. External wireless modems include connect cards, USB modems, and cellular routers. Most GSM wireless modems come with an integrated SIM cardholder (e.g. the Huawei E220 and Sierra 881). Some models also provide a microSD memory slot and/or a jack for an additional external antenna (e.g. the Huawei E1762 and Sierra Compass 885).[54][55] The CDMA (EVDO) versions do not typically use R-UIM cards, but use an Electronic Serial Number (ESN) instead. Until the end of April 2011, worldwide shipments of USB modems surpassed embedded 3G and 4G modules by 3:1 because USB modems can be easily discarded. Embedded modems may overtake separate modems as tablet sales grow and the incremental cost of the modems shrinks, so by 2016, the ratio may change to 1:1.[56] Like mobile phones, mobile broadband modems can be SIM locked to a particular network provider. Unlocking a modem is achieved the same way as unlocking a phone, by using an 'unlock code'.[citation needed] A modem that connects to a fiber optic network is known as an optical network terminal (ONT) or optical network unit (ONU). These are commonly used in fiber-to-the-home installations, installed inside or outside a house to convert the optical medium to a copper Ethernet interface, after which a router or gateway is often installed to perform authentication, routing, NAT, and other typical consumer internet functions, in addition to "triple play" features such as telephony and television service. Fiber optic systems can use quadrature amplitude modulation to maximize throughput. 16QAM uses a 16-point constellation to send four bits per symbol, with speeds on the order of 200 or 400 gigabits per second.[57][58] 64QAM uses a 64-point constellation to send six bits per symbol, with speeds up to 65 terabits per second.
Although this technology has been announced, it may not yet be commonly used.[59][60][61]
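The constellation sizes above follow a simple rule: a QAM constellation with 2^k points carries k bits per symbol. A minimal Python sketch of this relationship (the helper name is ours, not from any standard):

```python
import math

def bits_per_symbol(constellation_points: int) -> int:
    """Bits carried by one symbol of a square QAM constellation."""
    return int(math.log2(constellation_points))

# 16QAM carries 4 bits per symbol; 64QAM carries 6.
for points in (16, 64, 256):
    print(f"{points}QAM: {bits_per_symbol(points)} bits/symbol")
```

Raw bit rate is then the symbol rate (baud) multiplied by bits per symbol, which is why moving from 16QAM to 64QAM raises throughput by 50% on the same symbol rate.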
Although the name modem is seldom used, some high-speed home networking applications do use modems, such as powerline Ethernet. The G.hn standard, for instance, developed by ITU-T, provides a high-speed (up to 1 Gbit/s) local area network using existing home wiring (power lines, phone lines, and coaxial cables). G.hn devices use orthogonal frequency-division multiplexing (OFDM) to modulate a digital signal for transmission over the wire. As described above, technologies like Wi-Fi and Bluetooth also use modems to communicate over radio at short distances. A null modem cable is a specially wired cable connected between the serial ports of two devices, with the transmit and receive lines reversed. It is used to connect two devices directly without a modem. The same software or hardware typically used with modems (such as Procomm or Minicom) can be used with this type of connection. A null modem adapter is a small device with plugs on both ends which is placed on the end of a normal "straight-through" serial cable to convert it into a null-modem cable. A "short haul modem" is a device that bridges the gap between leased-line and dial-up modems. Like a leased-line modem, they transmit over "bare" lines with no power or telco switching equipment, but are not intended for the same distances that leased lines can achieve. Ranges up to several miles are possible, but short-haul modems are chiefly used for medium distances, greater than the maximum length of a basic serial cable but still relatively short, such as within a single building or campus. This allows a serial connection to be extended for perhaps only several hundred to several thousand feet, a case where obtaining an entire telephone or leased line would be overkill. While some short-haul modems do in fact use modulation, low-end devices (for reasons of cost or power consumption) are simple "line drivers" that increase the level of the digital signal but do not modulate it.
These are not technically modems, but the same terminology is used for them.[62]
In telecommunications, 5G is the fifth-generation technology standard for broadband cellular networks, which cellular phone companies began deploying worldwide in 2019, and is the planned successor to the 4G networks which provide connectivity to most current cellphones. 5G networks are predicted to have more than 1.7 billion subscribers and account for 25% of the worldwide mobile technology market by 2025, according to the GSM Association and Statista.[1][2] Like its predecessors, 5G networks are cellular networks, in which the service area is divided into small geographical areas called cells. All 5G wireless devices in a cell are connected to the Internet and telephone network by radio waves through a local antenna in the cell. The new networks have higher download speeds, eventually up to 10 gigabits per second (Gbit/s).[3] In addition to being faster than existing networks, 5G has higher bandwidth and can thus connect more devices, improving the quality of Internet services in crowded areas.[4] Due to the increased bandwidth, it is expected the networks will increasingly be used as general internet service providers (ISPs) for laptops and desktop computers, competing with existing ISPs such as cable internet, and also will make possible new applications in internet-of-things (IoT) and machine-to-machine areas. Cellphones with only 4G capability are not able to use the 5G networks, which require 5G-capable wireless devices.
5G networks are cellular networks, in which the service area is divided into small geographical areas called cells. All 5G wireless devices in a cell communicate by radio waves with a cellular base station via fixed antennas, over frequency channels assigned by the base station. The base stations, termed nodes, are connected to switching centers in the telephone network and routers for Internet access by high-bandwidth optical fiber or wireless backhaul connections. As in other cellular networks, a mobile device moving from one cell to another is automatically handed off seamlessly. 5G is expected to support up to a million devices per square kilometer.
The industry consortium setting standards for 5G, the 3rd Generation Partnership Project (3GPP), defines "5G" as any system using 5G NR (5G New Radio) software, a definition that came into general use by late 2018.
Several network operators use millimeter waves, called FR2 in 5G terminology, for additional capacity and higher throughput. Millimeter waves have a shorter range than the lower-frequency microwaves, so the cells are smaller. Millimeter waves also have more trouble passing through building walls. Millimeter-wave antennas are smaller than the large antennas used in previous cellular networks; some are only a few centimeters long.
The increased data rate is achieved partly by using additional higher-frequency radio waves in addition to the low- and medium-band frequencies used in previous cellular networks. To provide a wide range of services, 5G networks can operate in three frequency bands: low, medium, and high.
5G can be implemented in low-band, mid-band, or high-band (millimeter-wave, 24 GHz up to 54 GHz) spectrum. Low-band 5G uses a similar frequency range to 4G cellphones, 600–900 MHz, giving download speeds a little higher than 4G: 30–250 megabits per second (Mbit/s).[5] Low-band cell towers have a range and coverage area similar to 4G towers. Mid-band 5G uses microwaves of 1.7–4.7 GHz, allowing speeds of 100–900 Mbit/s, with each cell tower providing service up to several kilometers in radius. This level of service is the most widely deployed, and was deployed in many metropolitan areas in 2020. Some regions are not implementing the low band, making mid-band the minimum service level. High-band 5G uses frequencies of 24–47 GHz, near the bottom of the millimeter-wave band, although higher frequencies may be used in the future. It often achieves download speeds in the gigabit-per-second (Gbit/s) range, comparable to cable internet. However, millimeter waves (mmWave or mmW) have a more limited range, requiring many small cells.[6] They can be impeded or blocked by materials in walls or windows.[7]
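The three tiers above can be summarized in a small lookup table. This is an illustrative sketch only: the frequency and speed ranges are the approximate figures quoted in the text, and the function name is ours, not part of any standard.

```python
# Approximate 5G band tiers as described in the text (figures are indicative).
BAND_TIERS = {
    "low":  {"freq_ghz": (0.6, 0.9),   "downlink_mbps": (30, 250)},
    "mid":  {"freq_ghz": (1.7, 4.7),   "downlink_mbps": (100, 900)},
    "high": {"freq_ghz": (24.0, 47.0), "downlink_mbps": (1000, 4000)},
}

def tier_for_frequency(freq_ghz):
    """Return the band tier containing the given carrier frequency, if any."""
    for name, tier in BAND_TIERS.items():
        lo, hi = tier["freq_ghz"]
        if lo <= freq_ghz <= hi:
            return name
    return None

print(tier_for_frequency(3.5))  # the n78 band used by Korean carriers → "mid"
```

The gaps between tiers are real: a carrier at, say, 10 GHz falls into none of the ranges described above.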
Due to their higher cost, plans are to deploy these cells only in dense urban environments and areas where crowds of people congregate such as sports stadiums and convention centers. The above speeds are those achieved in actual tests in 2020, and speeds are expected to increase during rollout.[5] The spectrum ranging from 24.25–29.5 GHz has been the most licensed and deployed 5G mmWave spectrum range in the world.[citation needed] Rollout of 5G technology has led to debate over its security and relationship with Chinese vendors. It has also been the subject of health concerns and misinformation, including discredited conspiracy theories linking it to the COVID-19 pandemic.
The ITU-R has defined three main application areas for the enhanced capabilities of 5G: Enhanced Mobile Broadband (eMBB), Ultra-Reliable Low-Latency Communications (URLLC), and Massive Machine-Type Communications (mMTC).[8] Only eMBB was deployed in 2020; URLLC and mMTC are several years away in most locations.[9] Enhanced Mobile Broadband (eMBB) uses 5G as a progression from 4G LTE mobile broadband services, with faster connections, higher throughput, and more capacity. This will benefit areas of higher traffic such as stadiums, cities, and concert venues.[10] Ultra-Reliable Low-Latency Communications (URLLC) refers to using the network for mission-critical applications that require uninterrupted and robust data exchange. Short-packet data transmission is used to meet both the reliability and latency requirements of wireless communication networks.
Massive Machine-Type Communications (mMTC) would be used to connect a large number of devices. 5G technology will connect some of the 50 billion connected IoT devices;[11] most will use the less expensive Wi-Fi. Drones, transmitting via 4G or 5G, will aid in disaster recovery efforts, providing real-time data for emergency responders.[11] Most cars will have a 4G or 5G cellular connection for many services. Autonomous cars do not require 5G, as they have to be able to operate where they do not have a network connection.[12] However, most autonomous vehicles also feature teleoperation, which greatly benefits from 5G technology.[13][14] 5G speeds will range from around 50 Mbit/s to 1,000 Mbit/s (1 Gbit/s) depending on the RF channel and base station (BS) load. The fastest 5G speeds would be in the mmWave bands and can reach 4 Gbit/s with carrier aggregation and MIMO (assuming a perfect channel and no other BS load).
Sub-6 GHz 5G (mid-band), by far the most common, can deliver between 10 and 1,000 Mbps; it will have a much further reach than mmWave bands. In the sub-6 bands, C-Band (n77/n78) will be deployed by various U.S. operators in 2022. C-Band had been planned to be deployed by Verizon and AT&T in early January 2022 but was delayed due to safety concerns raised by the Federal Aviation Administration.[15][16] Low bands (such as n5) offer a greater range, thereby a greater coverage area for a given site, but their speeds are lower than the mid and high bands.
In 5G, the ideal "air latency" is of the order of 8–12 milliseconds, i.e., excluding delays due to HARQ retransmissions, handovers, etc. Retransmission latency and backhaul latency to the server must be added to the "air latency" for correct comparisons. Verizon reported the latency on its early 5G deployment as 30 ms. Edge servers close to the towers can probably reduce latency to 10–15 ms.
Latency is much higher during handovers, ranging from 50 to 500 milliseconds depending on the type of handover. Reducing handover interruption time is an ongoing area of research and development.
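The point that air latency alone understates end-to-end latency can be made concrete with a simple budget. The component values below are assumptions chosen to match the figures quoted above, not measurements:

```python
# Illustrative end-to-end latency budget (milliseconds). The split between
# components is an assumption for illustration, not measured data.
def end_to_end_latency_ms(air_ms, retransmission_ms, backhaul_ms):
    """Air latency alone understates what an application actually sees."""
    return air_ms + retransmission_ms + backhaul_ms

# An ideal air latency of ~10 ms plus HARQ retransmissions and backhaul to
# the server can plausibly land near the 30 ms Verizon reported.
print(end_to_end_latency_ms(air_ms=10, retransmission_ms=5, backhaul_ms=15))  # → 30
```

Moving the server closer (edge computing) shrinks only the backhaul term, which is why edge servers can cut the total to roughly 10–15 ms but no lower than the air latency itself.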
5G uses an adaptive modulation and coding scheme (MCS) to keep the block error rate (BLER) extremely low. Whenever the error rate crosses a (very low) threshold, the transmitter switches to a lower MCS, which is less error-prone. In this way, speed is sacrificed to ensure an almost zero error rate.
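The threshold behavior described above can be sketched as a simple control loop. This is a hypothetical illustration: the threshold value, function names, and step-by-one policy are ours; real 5G NR link adaptation is driven by CQI reports and is considerably more involved.

```python
# Hypothetical sketch of link adaptation: step down the MCS index when the
# measured block error rate (BLER) crosses a threshold, step back up when
# the link is clean. All constants here are illustrative assumptions.
BLER_TARGET = 0.001  # the "(very low) threshold" from the text, value assumed

def adapt_mcs(current_mcs, measured_bler, max_mcs=28):
    if measured_bler > BLER_TARGET and current_mcs > 0:
        return current_mcs - 1   # trade speed for robustness
    if measured_bler < BLER_TARGET / 10 and current_mcs < max_mcs:
        return current_mcs + 1   # link is clean; raise throughput
    return current_mcs           # hold steady in between

print(adapt_mcs(15, measured_bler=0.05))     # → 14: error rate too high
print(adapt_mcs(15, measured_bler=0.00001))  # → 16: headroom to speed up
```

The hysteresis gap between the two thresholds prevents the scheme from oscillating between neighboring MCS values on a borderline link.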
The range of 5G depends on many factors: transmit power, frequency, and interference. For example, mmWave (e.g. the n258 band) will have a lower range than mid-band (e.g. the n78 band), which in turn will have a lower range than low-band (e.g. the n5 band).
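One reason frequency dominates range is free-space path loss, which grows with the square of the carrier frequency. The sketch below uses the standard free-space formula; it deliberately ignores wall penetration, rain fade, and antenna gain, all of which further disadvantage mmWave.

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB (idealized; ignores obstructions and antennas)."""
    # FSPL(dB) = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            - 147.55)  # 20*log10(4*pi / 299,792,458 m/s)

# At 100 m, moving from a low band (700 MHz) to mmWave (28 GHz) costs about
# 32 dB of extra path loss before any obstructions are even counted.
low = fspl_db(100, 700e6)   # ≈ 69.4 dB
high = fspl_db(100, 28e9)   # ≈ 101.4 dB
print(round(high - low, 1))  # → 32.0
```

A 32 dB penalty is a factor of about 1,600 in received power, which is why high-band cells must be so much denser than low-band ones.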
Given the marketing hype on what 5G can offer, simulators and drive tests are used by cellular service providers for the precise measurement of 5G performance.
Initially, the term was associated with the International Telecommunication Union's IMT-2020 standard, which required a theoretical peak download speed of 20 gigabits per second and 10 gigabits per second upload speed, along with other requirements.[17] Then, the industry standards group 3GPP chose the 5G NR (New Radio) standard together with LTE as their proposal for submission to the IMT-2020 standard.[18][19] 5G NR can include lower frequencies (FR1), below 6 GHz, and higher frequencies (FR2), above 24 GHz. However, the speed and latency in early FR1 deployments, using 5G NR software on 4G hardware (non-standalone), are only slightly better than new 4G systems, estimated at 15 to 50% better.[20][21][22] The standard documents for 5G are organized by 3GPP.[23][24] The 5G system architecture is defined in TS 23.501.[25] The packet protocol for mobility management (establishing connection and moving between base stations) and session management (connecting to networks and network slices) is described in TS 24.501.[26] Specifications of key data structures are found in TS 23.003.[27] IEEE covers several areas of 5G, with a core focus on the wireline sections between the Remote Radio Head (RRH) and Base Band Unit (BBU). The 1914.1 standards focus on network architecture and on dividing the connection between the RRH and BBU into two key sections: the Radio Unit (RU) to the Distributed Unit (DU) being the NGFI-I (Next Generation Fronthaul Interface), and the DU to the Central Unit (CU) being the NGFI-II interface, allowing a more diverse and cost-effective network. NGFI-I and NGFI-II have defined performance values which should be complied with to ensure that the different traffic types defined by the ITU can be carried.[page needed] The IEEE 1914.3 standard is creating a new Ethernet frame format capable of carrying IQ data in a much more efficient way, depending on the functional split utilized.
This is based on the 3GPP definition of functional splits.[page needed] 5G NR (New Radio) is a new air interface developed for the 5G network.[28] It is intended to be the global standard for the air interface of 3GPP 5G networks.[29] In the Internet of things (IoT), 3GPP is going to submit the evolution of NB-IoT and eMTC (LTE-M) as 5G technologies for the LPWA (Low Power Wide Area) use case.[32] Beyond mobile operator networks, 5G is also expected to be used for private networks with applications in industrial IoT, enterprise networking, and critical communications, in what is being described as NR-U (5G NR in Unlicensed Spectrum).[33] Initial 5G NR launches depended on pairing with existing LTE (4G) infrastructure in non-standalone (NSA) mode (5G NR radio with 4G core), before maturation of the standalone (SA) mode with the 5G core network.[34] As of April 2019, the Global Mobile Suppliers Association had identified 224 operators in 88 countries that have demonstrated, are testing or trialing, or have been licensed to conduct field trials of 5G technologies, are deploying 5G networks or have announced service launches.[35] The equivalent numbers in November 2018 were 192 operators in 81 countries.[36] The first country to adopt 5G on a large scale was South Korea, in April 2019.
Swedish telecoms giant Ericsson predicted that 5G internet will cover up to 65% of the world's population by the end of 2025.[37] It also plans to invest 1 billion reals ($238.30 million) in Brazil to add a new assembly line dedicated to fifth-generation technology (5G) for its Latin American operations.[38] When South Korea launched its 5G network, all carriers used Samsung, Ericsson, and Nokia base stations and equipment, except for LG U Plus, which also used Huawei equipment.[39][40] Samsung was the largest supplier of 5G base stations in South Korea at launch, having shipped 53,000 of the 86,000 base stations installed across the country at the time.[41] The first fairly substantial deployments were in April 2019. In South Korea, SK Telecom claimed 38,000 base stations, KT Corporation 30,000 and LG U Plus 18,000, of which 85% were in six major cities.[42] They used 3.5 GHz (sub-6) spectrum in non-standalone (NSA) mode, and tested download speeds ranged from 193 to 430 Mbit/s.[43] 260,000 users signed up in the first month and 4.7 million by the end of 2019.[44] T-Mobile US was the first company in the world to launch a commercially available 5G NR standalone network.[45] Nine companies sell 5G radio hardware and 5G systems for carriers: Altiostar, Cisco Systems, Datang Telecom/Fiberhome, Ericsson, Huawei, Nokia, Qualcomm, Samsung, and ZTE.[46][47][48][49][50][51][52]
Spectrum
Large quantities of new radio spectrum (5G NR frequency bands) have been allocated to 5G.[53] For example, in July 2016, the U.S. Federal Communications Commission (FCC) freed up vast amounts of bandwidth in underused high-band spectrum for 5G.
The Spectrum Frontiers Proposal (SFP) doubled the amount of millimeter-wave unlicensed spectrum to 14 GHz and created four times the amount of flexible, mobile-use spectrum the FCC had licensed to date.[54] In March 2018, European Union lawmakers agreed to open up the 3.6 and 26 GHz bands by 2020.[55] As of March 2019, there were reportedly 52 countries, territories, special administrative regions, disputed territories and dependencies that were formally considering introducing certain spectrum bands for terrestrial 5G services, holding consultations regarding suitable spectrum allocations for 5G, had reserved spectrum for 5G, had announced plans to auction frequencies or had already allocated spectrum for 5G use.[56] In March 2019, the Global Mobile Suppliers Association released the industry's first database tracking worldwide 5G device launches.[57] In it, the GSA identified 23 vendors who had confirmed the availability of forthcoming 5G devices, with 33 different devices including regional variants. There were seven announced 5G device form factors: telephones (×12 devices), hotspots (×4), indoor and outdoor customer-premises equipment (×8), modules (×5), snap-on dongles and adapters (×2), and USB terminals (×1).[58] By October 2019, the number of announced 5G devices had risen to 129, across 15 form factors, from 56 vendors.[59] In the 5G IoT chipset arena, as of April 2019 there were four commercial 5G modem chipsets and one commercial processor/platform, with more launches expected in the near future.[60] On March 6, 2020, the first all-5G smartphone, the Samsung Galaxy S20, was released.
According to Business Insider, the 5G feature was marketed at a premium compared with 4G: the lineup started at US$1,000, compared with the Samsung Galaxy S10e, which started at US$750.[61] On March 19, HMD Global, the current maker of Nokia-branded phones, announced the Nokia 8.3 5G, which it claimed had a wider range of 5G compatibility than any other phone released to that time. The mid-range model, with an initial Eurozone price of €599, is claimed to support all 5G bands from 600 MHz to 3.8 GHz.[62] Many phone manufacturers support 5G. The Apple iPhone 12 and later versions support 5G.[63][64] Google Pixel phones support it since the Pixel 5a.[65] The air interface defined by 3GPP for 5G is known as New Radio (NR), and the specification is subdivided into two frequency bands, FR1 (below 6 GHz) and FR2 (24–54 GHz).
Frequency range 1 (< 6 GHz)
Otherwise known as sub-6, the maximum channel bandwidth defined for FR1 is 100 MHz, due to the scarcity of continuous spectrum in this crowded frequency range. The band most widely used for 5G in this range is 3.3–4.2 GHz. The Korean carriers use the n78 band at 3.5 GHz. Some parties used the term "mid-band" frequency to refer to the higher part of this frequency range that was not used in previous generations of mobile communication.
Frequency range 2 (24–54 GHz)
The minimum channel bandwidth defined for FR2 is 50 MHz and the maximum is 400 MHz, with two-channel aggregation supported in 3GPP Release 15. The higher the frequency, the greater the ability to support high data-transfer speeds. Signals in this frequency range have been described as mmWave.
FR2 coverage
5G in the 24 GHz range or above uses higher frequencies than 4G, and as a result, some 5G signals are not capable of traveling large distances (over a few hundred meters), unlike 4G or lower-frequency 5G signals (sub-6 GHz). This requires placing 5G base stations every few hundred meters in order to use the higher frequency bands.
Also, these higher frequency 5G signals cannot penetrate solid objects easily, such as cars, trees, and walls, because of the nature of these higher frequency electromagnetic waves. 5G cells can be deliberately designed to be as inconspicuous as possible, which finds applications in places like restaurants and shopping malls.[66]
Massive MIMO
MIMO systems use multiple antennas at the transmitter and receiver ends of a wireless communication system. Multiple antennas use the spatial dimension for multiplexing in addition to the time and frequency ones, without changing the bandwidth requirements of the system. Massive MIMO (multiple-input and multiple-output) antennas increase sector throughput and capacity density by using large numbers of antennas. This includes Single-User MIMO and Multi-User MIMO (MU-MIMO). Each antenna is individually controlled and may embed radio transceiver components.[citation needed]
Edge computing
Edge computing is delivered by computing servers closer to the ultimate user. It reduces latency and data traffic congestion[67][68] and can improve service availability.[69]
Small cell
Small cells are low-powered cellular radio access nodes that operate in licensed and unlicensed spectrum with a range of 10 meters to a few kilometers. Small cells are critical to 5G networks, as 5G's radio waves cannot travel long distances because of 5G's higher frequencies.[70][71][72][73]
Beamforming
There are two kinds of beamforming: digital and analog. Digital beamforming involves sending the data across multiple streams (layers), while analog beamforming involves shaping the radio waves to point in a specific direction. The analog beamforming technique combines the power from elements of the antenna array in such a way that signals at particular angles experience constructive interference, while signals at other angles experience destructive interference. This improves signal quality in the specific direction, as well as data transfer speeds.[citation needed] 5G uses both digital and analog beamforming to improve system capacity.[74]
Convergence of Wi-Fi and cellular
One expected benefit of the transition to 5G is the convergence of multiple networking functions to achieve cost, power, and complexity reductions.
LTE has targeted convergence with Wi-Fi band/technology via various efforts, such as License Assisted Access (LAA; 5G signal in unlicensed frequency bands that are also used by Wi-Fi) and LTE-WLAN Aggregation (LWA; convergence with Wi-Fi radio), but the differing capabilities of cellular and Wi-Fi have limited the scope of convergence. However, significant improvement in cellular performance specifications in 5G, combined with migration from Distributed Radio Access Network (D-RAN) to Cloud- or Centralized-RAN (C-RAN) and the rollout of cellular small cells, can potentially narrow the gap between Wi-Fi and cellular networks in dense and indoor deployments. Radio convergence could result in sharing ranging from the aggregation of cellular and Wi-Fi channels to the use of a single silicon device for multiple radio access technologies.[75]
NOMA (non-orthogonal multiple access)
NOMA is a proposed multiple-access technique for future cellular systems based on the allocation of power.[citation needed]
SDN/NFV
Initially, cellular mobile communications technologies were designed in the context of providing voice services and Internet access. Today, new tools and technologies are directed towards developing a new pool of applications.
This pool of applications consists of different domains such as the Internet of Things (IoT), the web of connected autonomous vehicles, remotely controlled robots, and heterogeneous sensors connected to serve versatile applications.[76] In this context, network slicing has emerged as a key technology to efficiently embrace this new market model.[77]
Channel coding
The channel coding techniques for 5G NR have changed from Turbo codes in 4G to polar codes for the control channels and LDPC (low-density parity-check codes) for the data channels.[78][79]
Operation in unlicensed spectrum
In December 2018, 3GPP began working on unlicensed spectrum specifications known as 5G NR-U, targeting 3GPP Release 16.[80] Qualcomm has made a similar proposal for LTE in unlicensed spectrum. 5G-Advanced is the name for 3GPP Release 18, which as of 2021 is under conceptual development.[81][82][83] A report published by the European Commission and the European Agency for Cybersecurity details the security issues surrounding 5G. The report warns against using a single supplier for a carrier's 5G infrastructure, especially one based outside the European Union. (Nokia and Ericsson are the only European manufacturers of 5G equipment.)[84] On October 18, 2018, a team of researchers from ETH Zurich, the University of Lorraine and the University of Dundee released a paper entitled "A Formal Analysis of 5G Authentication".[85][86] It alerted that 5G technology could open ground for a new era of security threats. The paper described the technology as "immature and insufficiently tested," and one that "enables the movement and access of vastly higher quantities of data, and thus broadens attack surfaces". Simultaneously, network security companies such as Fortinet,[87] Arbor Networks,[88] A10 Networks,[89] and Voxility[90] advised on personalized and mixed security deployments against massive DDoS attacks foreseen after 5G deployment.
IoT Analytics estimated an increase in the number of IoT devices, enabled by 5G technology, from 7 billion in 2018 to 21.5 billion by 2025.[91] This can raise the attack surface for these devices to a substantial scale, and the capacity for DDoS attacks, cryptojacking, and other cyberattacks could grow proportionally.[86] Due to fears of potential espionage of users of Chinese equipment vendors, several countries (including the United States, Australia and the United Kingdom as of early 2019)[92] have taken actions to restrict or eliminate the use of Chinese equipment in their respective 5G networks. Chinese vendors and the Chinese government have denied claims of espionage. On 7 October 2020, the UK Parliament's Defence Committee released a report claiming that there was clear evidence of collusion between Huawei and the Chinese state and the Chinese Communist Party. The committee said that the government should consider removing all Huawei equipment from its 5G networks earlier than planned.[93]
Electromagnetic interference
Weather forecasting
The spectrum used by various 5G proposals, especially the n258 band centered at 26 GHz, will be near that of passive remote sensing such as by weather and Earth observation satellites, particularly for water vapor monitoring at 23.8 GHz.[94] Interference is expected to occur due to such proximity and its effect could be significant without effective controls. An increase in interference already occurred with some other prior proximate band usages.[95][96] Interference to satellite operations impairs numerical weather prediction performance with substantially deleterious economic and public safety impacts in areas such as commercial aviation.[97][98] The concerns prompted U.S. Secretary of Commerce Wilbur Ross and NASA Administrator Jim Bridenstine in February 2019 to urge the FCC to delay some spectrum auction proposals, which was rejected.[99] The chairs of the House Appropriations Committee and House Science Committee wrote separate letters to FCC chairman Ajit Pai asking for further review and consultation with NOAA, NASA, and DoD, and warning of harmful impacts to national security.[100] Acting NOAA director Neil Jacobs testified before the House Committee in May 2019 that 5G out-of-band emissions could produce a 30% reduction in weather forecast accuracy and that the resulting degradation in ECMWF model performance would have resulted in failure to predict the track and thus the impact of Superstorm Sandy in 2012. 
The United States Navy in March 2019 wrote a memorandum warning of deterioration, and made technical suggestions on controlling band bleed-over limits, on testing and fielding, and on coordination between the wireless industry, regulators, and weather forecasting organizations.[101] At the 2019 quadrennial World Radiocommunication Conference (WRC), atmospheric scientists advocated for a strong buffer of −55 dBW, European regulators agreed on a recommendation of −42 dBW, and US regulators (the FCC) recommended a restriction of −20 dBW, which would permit signals 150 times stronger than the European proposal. The ITU decided on an intermediate −33 dBW until September 1, 2027, and a standard of −39 dBW after that.[102] This is closer to the European recommendation, but even the delayed stricter standard is much weaker than the level atmospheric scientists pleaded for, triggering warnings from the World Meteorological Organization (WMO) that the ITU standard, 10 times less stringent than its recommendation, brings the "potential to significantly degrade the accuracy of data collected".[103] A representative of the American Meteorological Society (AMS) also warned of interference,[104] and the European Centre for Medium-Range Weather Forecasts (ECMWF) sternly warned that society risks "history repeat[ing] itself" by ignoring atmospheric scientists' warnings (referencing global warming, monitoring of which could be imperiled).[105] In December 2019, a bipartisan request was sent from the US House Science Committee to the Government Accountability Office (GAO) to investigate why there is such a discrepancy between the recommendations of US civilian and military science agencies and those of the regulator, the FCC.[106]

Aviation

The United States FAA has warned that radar altimeters on aircraft, which operate between 4.2 and 4.4 GHz, might be affected by 5G operations between 3.7 and 3.98 GHz.
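The decibel limits and band separations above reduce to simple arithmetic. A minimal sketch, using only figures cited in this article (the function and variable names are my own):

```python
def dbw_ratio(a_dbw, b_dbw):
    """Linear power ratio implied by the gap between two dBW limits."""
    return 10 ** ((a_dbw - b_dbw) / 10)

# FCC proposal (-20 dBW) vs European recommendation (-42 dBW): a 22 dB
# gap, i.e. roughly a 158-fold difference in permitted signal power,
# consistent with the "150 times stronger" figure cited above.
fcc_vs_eu = dbw_ratio(-20, -42)

# Any 10 dB gap is exactly a factor of 10 in power, which is the sense
# in which the WMO called the interim ITU limit "10 times less stringent"
# than its own recommendation.
ten_db = dbw_ratio(0, -10)

# Guard band between US C-band 5G (up to 3.98 GHz) and radar altimeters
# (4.2-4.4 GHz): 220 MHz, versus 400 MHz in Europe, where 5G tops out
# at 3.8 GHz.
us_guard_mhz = (4.2 - 3.98) * 1000
eu_guard_mhz = (4.2 - 3.8) * 1000

print(round(fcc_vs_eu), ten_db, round(us_guard_mhz), round(eu_guard_mhz))
# -> 158 10.0 220 400
```

The logarithmic scale is why seemingly small differences between the regulators' positions matter: each 3 dB of extra allowance roughly doubles the permitted interference power at the satellite sensor.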
Interference is particularly an issue with older altimeters whose RF filters[107] lack protection from neighboring bands.[108] This is not as much of an issue in Europe, where 5G uses lower frequencies, between 3.4 and 3.8 GHz.[109] Nonetheless, the DGAC in France has expressed similar worries and recommended that 5G phones be turned off or put in airplane mode during flights.[110] On December 31, 2021, U.S. Transportation Secretary Pete Buttigieg and Steve Dickson, administrator of the Federal Aviation Administration, asked the chief executives of AT&T and Verizon to delay 5G implementation over aviation concerns. The government officials asked for a two-week delay starting on January 5, 2022, while investigations were conducted into the effects on radar altimeters. They also asked the cellular providers to hold off on activating new 5G service near 50 priority airports, to minimize the disruption to air traffic that would be caused by some planes being disallowed from landing in poor visibility.[111] After coming to an agreement with government officials the day before,[112] Verizon and AT&T activated their 5G networks on January 19, 2022, except for certain towers near 50 airports.[113] AT&T scaled back its deployment even further than its agreement with the FAA required.[114] The FAA rushed to test and certify radar altimeters for interference so that planes could be allowed to perform instrument landings (e.g. at night and in low visibility) at affected airports. By January 16, it had certified equipment on 45% of the U.S.
fleet, and 78% by January 20.[115] Airlines complained about the avoidable impact on their operations, and commentators said the affair called into question the competence of the FAA.[116] Several international airlines substituted different planes so they could avoid problems landing at scheduled airports, and about 2% of flights (320) were cancelled by the evening of January 19.[117]

Satellite

5G networks deployed in the 3.3–3.6 GHz radio frequency band are expected to cause interference with C-band satellite ground stations, which receive satellite signals at 3.4–4.2 GHz.[118] This interference can be mitigated with low-noise block downconverters and waveguide filters.[118]

Wi-Fi

In regions like the US and EU, the 6 GHz band is to be opened up for unlicensed applications, which would permit the deployment of 5G NR-U (the 5G counterpart of LTE in unlicensed spectrum) as well as Wi-Fi 6E. However, interference could occur between the different standards coexisting in the band.[119]

Overhype

There have been concerns surrounding the promotion of 5G, questioning whether the technology is overhyped. Questions have been raised about whether 5G will truly change the customer experience,[120] whether 5G's mmWave signal can provide significant coverage,[121][122] whether what 5G can achieve is overstated or continuous technological improvement is misattributed to "5G",[123] the lack of new use cases for carriers to profit from,[124] a misplaced focus on direct benefits to individual consumers instead of on Internet of Things devices or solving the last-mile problem,[125] and the possibility that in some respects other technologies might be more appropriate.[126] Such concerns have also led consumers to distrust information provided by cellular providers on the topic.[127] There is a long history of fear and anxiety surrounding wireless signals that predates 5G technology.
The fears about 5G are similar to those that have persisted throughout the 1990s and 2000s. They center on fringe claims that non-ionizing radiation poses dangers to human health.[128] Unlike ionizing radiation, non-ionizing radiation cannot remove electrons from atoms. The CDC says "Exposure to intense, direct amounts of non-ionizing radiation may result in damage to tissue due to heat. This is not common and mainly of concern in the workplace for those who work on large sources of non-ionizing radiation devices and instruments."[129] Some fringe health advocates claim the regulatory standards are insufficiently protective and influenced by lobbying groups.[128] Many popular books of dubious merit have been published on the subject, including one by Joseph Mercola alleging that wireless technologies cause numerous conditions, from ADHD to heart disease and brain cancer. Mercola drew sharp criticism for his anti-vaccinationism during the COVID-19 pandemic and was warned by the FDA to stop selling fake COVID-19 cures through his online alternative medicine business.[128][130] According to the New York Times, one origin of the 5G health controversy was an erroneous unpublished study that physicist Bill P. Curry did for the Broward County School Board in 2000, which indicated that the absorption of external microwaves by brain tissue increased with frequency.[131] According to experts, this was wrong: the millimeter waves used in 5G are safer than lower-frequency microwaves because they cannot penetrate the skin and reach internal organs; Curry had confused in vitro and in vivo research. However, Curry's study was widely distributed on the internet. Writing in The New York Times in 2019, William Broad reported that RT America began airing programming linking 5G to harmful health effects which "lack scientific support", such as "brain cancer, infertility, autism, heart tumors, and Alzheimer's disease". Broad asserted that such claims had increased.
RT America had run seven programs on this theme by mid-April 2019, but only one in the whole of 2018. The network's coverage had spread to hundreds of blogs and websites.[132] In April 2019, the city of Brussels in Belgium blocked a 5G trial because of radiation rules.[133] In Geneva, Switzerland, a planned upgrade to 5G was stopped for the same reason.[134] The Swiss Telecommunications Association (ASUT) has said that studies have been unable to show that 5G frequencies have any health impact.[135] According to CNET,[136] "Members of Parliament in the Netherlands are also calling on the government to take a closer look at 5G. Several leaders in the United States Congress have written to the Federal Communications Commission expressing concern about potential health risks. In Mill Valley, California, the city council blocked the deployment of new 5G wireless cells."[136][137][138][139][140] Similar concerns were raised in Vermont[141] and New Hampshire.[136] The US FDA is quoted as saying that it "continues to believe that the current safety limits for cellphone radiofrequency energy exposure remain acceptable for protecting the public health."[142] After campaigning by activist groups, a series of small localities in the UK, including Totnes, Brighton and Hove, Glastonbury, and Frome, passed resolutions against the implementation of further 5G infrastructure, though these resolutions have no impact on rollout plans.[143][144][145]

COVID-19 conspiracy theories and arson attacks

The World Health Organization published a mythbuster infographic to combat conspiracy theories linking COVID-19 and 5G.
As the introduction of 5G technology coincided with the COVID-19 pandemic, several conspiracy theories circulating online posited a link between COVID-19 and 5G.[146] This has led to dozens of arson attacks on telecom masts in the Netherlands (Amsterdam, Rotterdam, etc.), Ireland (Cork,[147] etc.), Cyprus, the United Kingdom (Dagenham, Huddersfield, Birmingham, Belfast and Liverpool[148][149]), Belgium (Pelt), Italy (Maddaloni), Croatia (Bibinje[150]) and Sweden.[151] There were at least 61 suspected arson attacks against telephone masts in the United Kingdom alone[152] and over twenty in the Netherlands. In the early months of the pandemic, anti-lockdown protesters in Australia were seen with anti-5G signs, an early sign of what became a wider campaign by conspiracy theorists to link the pandemic with 5G technology. There are two versions of the 5G-COVID-19 conspiracy theory:[128]
In various parts of the world, carriers have launched numerous differently branded technologies, such as "5G Evolution", which advertise improving existing networks with the use of "5G technology".[153] However, these pre-5G networks improve on the specifications of existing LTE networks in ways that are not exclusive to 5G. While the technology promises to deliver higher speeds, and is described by AT&T as a "foundation for our evolution to 5G while the 5G standards are being finalized," it cannot be considered true 5G. When AT&T announced 5G Evolution, 4x4 MIMO, the technology AT&T uses to deliver the higher speeds, had already been deployed by T-Mobile without the 5G moniker. Critics claim that such branding is a marketing move that will confuse consumers, as it is not made clear that such improvements are not true 5G.[154]
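As a rough illustration of why a spatial-multiplexing technique like 4x4 MIMO raises peak rates on an existing LTE network: under ideal conditions, peak throughput scales linearly with the number of spatial streams. A minimal sketch, where the 20 MHz carrier and 5 bit/s/Hz per-stream figures are hypothetical values chosen only to show the scaling:

```python
def ideal_peak_rate_mbps(spatial_streams, bandwidth_mhz, spectral_eff_bps_per_hz):
    """Idealized peak rate: streams x bandwidth x per-stream spectral efficiency."""
    return spatial_streams * bandwidth_mhz * spectral_eff_bps_per_hz

# Hypothetical 20 MHz LTE carrier at 5 bit/s/Hz per stream:
print(ideal_peak_rate_mbps(2, 20, 5))  # 200 Mbit/s with 2x2 MIMO
print(ideal_peak_rate_mbps(4, 20, 5))  # 400 Mbit/s with 4x4 MIMO
```

Real-world rates fall well short of this ideal, since the streams need uncorrelated propagation paths and a good signal-to-noise ratio; the point is only that the speedup comes from an LTE-era enhancement, not from anything exclusive to 5G.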
The 5G Automotive Association has been promoting the C-V2X communication technology, which will first be deployed in 4G. It provides for communication between vehicles and infrastructure.[176]

Digital twins

A real-time digital twin mirrors a physical object such as a turbine engine, an aircraft, a wind turbine, an offshore platform, or a pipeline. 5G networks help[177] in building digital twins, as their low latency and high throughput make it possible to capture near-real-time IoT data.[178]

Public safety

Mission-critical push-to-talk (MCPTT) and mission-critical video and data are expected to be furthered in 5G.[179]

Fixed wireless

Fixed wireless connections will offer an alternative to fixed-line broadband (ADSL, VDSL, fiber-optic, and DOCSIS connections) in some locations.[180][181][182]

Wireless video transmission for broadcast applications

Sony has tested the possibility of using local 5G networks to replace the SDI cables currently used in broadcast camcorders.[183] 5G Broadcast tests started around 2020 (Orkneys, Bavaria, Austria, Central Bohemia), based on FeMBMS (Further evolved Multimedia Broadcast Multicast Service).[184] The aim is to serve an unlimited number of mobile or fixed devices with video (TV) and audio (radio) streams without these consuming any data allowance or even being authenticated on a network.