In fiber-optic communications, wavelength-division multiplexing (WDM) is a technology that multiplexes a number of optical carrier signals onto a single optical fiber by using different wavelengths (i.e., colors) of laser light. This technique enables bidirectional communications over a single strand of fiber, as well as multiplication of capacity.
The term WDM is commonly applied to an optical carrier, which is typically described by its wavelength, whereas frequency-division multiplexing typically applies to a radio carrier, which is more often described by frequency. This is purely conventional, because wavelength and frequency convey the same information: frequency (in hertz, i.e. cycles per second) multiplied by wavelength (the physical length of one cycle) equals the velocity of the carrier wave. In a vacuum, this is the speed of light, usually denoted c. In glass fiber the velocity is substantially slower, usually about 0.7 times c. The data rate, which ideally might approach the carrier frequency, is in practical systems always a fraction of the carrier frequency.
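The relation between carrier frequency and wavelength can be sketched in a few lines. The function name and the choice of a 1550 nm example are illustrative, not taken from any particular system:

```python
# Relationship between carrier frequency and wavelength: f * lambda = v.
# In vacuum v = c; in silica fiber the velocity is roughly 0.7 * c.

C_VACUUM = 299_792_458.0  # speed of light in vacuum, m/s

def frequency_thz(wavelength_nm: float, velocity: float = C_VACUUM) -> float:
    """Carrier frequency in THz for a given wavelength in nm."""
    return velocity / (wavelength_nm * 1e-9) / 1e12

# A 1550 nm carrier corresponds to roughly 193.4 THz in vacuum.
print(round(frequency_thz(1550.0), 1))  # 193.4
```

This is why a 1550 nm optical carrier is equivalently described as a carrier near 193 THz, the anchor region of the DWDM frequency grid.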
A WDM system uses a multiplexer at the transmitter to join the several signals together and a demultiplexer at the receiver to split them apart. With the right type of fiber, it is possible to have a device that does both simultaneously and can function as an optical add-drop multiplexer. The optical filtering devices used have conventionally been etalons (stable solid-state single-frequency Fabry-Pérot interferometers in the form of thin-film-coated optical glass). As there are three different WDM types, one of which is itself called "WDM", the notation "xWDM" is normally used when discussing the technology as such.
The concept was first published in 1978, and by 1980 WDM systems were being realized in the laboratory. The first WDM systems combined only two signals. Modern systems can handle 160 signals and can thus expand a basic 100 Gbit/s system over a single fiber pair to over 16 Tbit/s. Systems with 320 channels also exist (using 12.5 GHz channel spacing; see below).
WDM systems are popular with telecommunications companies because they allow them to expand the capacity of the network without laying more fiber. By using WDM and optical amplifiers, they can accommodate several generations of technology development in their optical infrastructure without having to overhaul the backbone network. The capacity of a given link can be expanded simply by upgrading the multiplexers and demultiplexers at each end.
This is often done by the use of optical-to-electrical-to-optical (O/E/O) translation at the very edge of the transport network, thus permitting interoperation with existing equipment with optical interfaces.
Most WDM systems operate on single-mode fiber optical cables which have a core diameter of 9 µm. Certain forms of WDM can also be used in multi-mode fiber cables (also known as premises cables) which have core diameters of 50 or 62.5 µm.
Early WDM systems were expensive and complicated to run. However, recent standardization and a better understanding of the dynamics of WDM systems have made WDM less expensive to deploy.
Optical receivers, in contrast to laser sources, tend to be wideband devices. Therefore, the demultiplexer must provide the wavelength selectivity of the receiver in the WDM system.
WDM systems are divided into three different wavelength patterns: normal (WDM), coarse (CWDM) and dense (DWDM). Normal WDM (sometimes called BWDM) uses the two normal wavelengths of 1310 nm and 1550 nm on one fiber. Coarse WDM provides up to 16 channels across multiple transmission windows of silica fibers. Dense WDM (DWDM) uses the C-band (1530-1565 nm) transmission window but with denser channel spacing. Channel plans vary, but a typical DWDM system would use 40 channels at 100 GHz spacing or 80 channels with 50 GHz spacing. Some technologies are capable of 12.5 GHz spacing (sometimes called ultra-dense WDM). New amplification options (Raman amplification) enable the extension of the usable wavelengths to the L-band (1565-1625 nm), more or less doubling these numbers.
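The frequency spacings above translate into wavelength spacings via the approximation Δλ ≈ λ²·Δf/c, which explains why 100 GHz is quoted as "about 0.8 nm" near 1550 nm. A minimal sketch (the function name is illustrative):

```python
# Convert a channel spacing given in frequency (GHz) into the equivalent
# wavelength spacing (nm) near a given centre wavelength, using the
# narrowband approximation delta_lambda ~= lambda^2 * delta_f / c.

C = 299_792_458.0  # speed of light in vacuum, m/s

def spacing_nm(center_nm: float, spacing_ghz: float) -> float:
    lam = center_nm * 1e-9                       # centre wavelength in metres
    return (lam ** 2) * (spacing_ghz * 1e9) / C * 1e9  # result in nm

# Near 1550 nm: 100 GHz ~ 0.80 nm, 50 GHz ~ 0.40 nm, 12.5 GHz ~ 0.10 nm.
print(round(spacing_nm(1550.0, 100.0), 2))  # 0.8
```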
Coarse wavelength division multiplexing (CWDM), in contrast to DWDM, uses increased channel spacing to allow less-sophisticated and thus cheaper transceiver designs. To provide 16 channels on a single fiber, CWDM uses the entire band spanning the second and third transmission windows (around 1310 nm and 1550 nm respectively), including the critical wavelengths where OH absorption (the "water peak") may occur. OH-free silica fibers are recommended if the wavelengths between the second and third transmission windows are to be used. Avoiding this region, the channels 47, 49, 51, 53, 55, 57, 59 and 61 remain, and these are the most commonly used. With OS2 fibers the water-peak problem is overcome, and all 18 possible channels can be used.
WDM, CWDM and DWDM are based on the same concept of using multiple wavelengths of light on a single fiber, but they differ in the spacing of the wavelengths, the number of channels, and the ability to amplify the multiplexed signals in the optical domain. EDFAs provide efficient wideband amplification for the C-band, and Raman amplification adds a mechanism for amplification in the L-band. For CWDM, wideband optical amplification is not available, limiting the optical spans to several tens of kilometres.
Originally, the term coarse wavelength division multiplexing (CWDM) was fairly generic and described a number of different channel configurations. In general, the choice of channel spacings and frequency in these configurations precluded the use of erbium doped fiber amplifiers (EDFAs). Prior to the relatively recent ITU standardization of the term, one common definition for CWDM was two or more signals multiplexed onto a single fiber, with one signal in the 1550 nm band and the other in the 1310 nm band.
In 2002, the ITU standardized a channel spacing grid for CWDM (ITU-T G.694.2) using the wavelengths from 1270 nm through 1610 nm with a channel spacing of 20 nm. ITU-T G.694.2 was revised in 2003 to shift the channel centers by 1 nm, so, strictly speaking, the center wavelengths are 1271 to 1611 nm. Many CWDM wavelengths below 1470 nm are considered unusable on older G.652-specification fibers, due to the increased attenuation in the 1270-1470 nm bands. Newer fibers which conform to the G.652.C and G.652.D standards, such as Corning SMF-28e and Samsung Widepass, nearly eliminate the "water peak" attenuation at 1383 nm and allow for full operation of all 18 ITU CWDM channels in metropolitan networks.
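The revised grid is simple enough to enumerate directly: 18 center wavelengths from 1271 nm in 20 nm steps. A minimal sketch:

```python
# Centre wavelengths of the 18-channel CWDM grid (revised ITU-T G.694.2):
# 1271 nm to 1611 nm in 20 nm steps.

cwdm_channels = [1271 + 20 * i for i in range(18)]

print(cwdm_channels[0], cwdm_channels[-1], len(cwdm_channels))  # 1271 1611 18
```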
The main characteristic of the recent ITU CWDM standard is that the signals are not spaced appropriately for amplification by EDFAs. This limits the total CWDM optical span to somewhere near 60 km for a 2.5 Gbit/s signal, which is suitable for use in metropolitan applications. The relaxed optical frequency stabilization requirements allow the associated costs of CWDM to approach those of non-WDM optical components.
CWDM is being used in cable television networks, where different wavelengths are used for the downstream and upstream signals. In these systems, the wavelengths used are often widely separated. For example, the downstream signal might be at 1310 nm while the upstream signal is at 1550 nm.
Some GBIC and small form factor pluggable (SFP) transceivers utilize standardized CWDM wavelengths. GBIC and SFP CWDM optics allow a legacy switch system to be "converted" to enable wavelength multiplexed transport over a fiber by selecting compatible transceiver wavelengths for use with an inexpensive passive optical multiplexing device.
The 10GBASE-LX4 10 Gbit/s physical layer standard is an example of a CWDM system in which four wavelengths near 1310 nm, each carrying a 3.125 gigabit-per-second (Gbit/s) data stream, are used to carry 10 Gbit/s of aggregate data.
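The lane arithmetic works out because 10GBASE-LX4 uses 8b/10b line coding, so each 3.125 Gbit/s line rate carries 2.5 Gbit/s of payload (a detail assumed here from the IEEE 802.3 standard, not stated above):

```python
# 10GBASE-LX4 aggregate rate: four wavelength lanes, each at a
# 3.125 Gbit/s line rate. With 8b/10b coding, 8 payload bits travel
# in every 10 line bits, so the aggregate payload is 10 Gbit/s.

lanes = 4
line_rate_gbps = 3.125
coding_efficiency = 8 / 10  # 8b/10b

payload_gbps = lanes * line_rate_gbps * coding_efficiency
print(payload_gbps)  # 10.0
```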
Passive CWDM is an implementation of CWDM that uses no electrical power. It separates the wavelengths using passive optical components such as bandpass filters and prisms. Many manufacturers are promoting passive CWDM to deploy fiber to the home.
Dense wavelength division multiplexing (DWDM) refers originally to optical signals multiplexed within the 1550 nm band so as to leverage the capabilities (and cost) of erbium doped fiber amplifiers (EDFAs), which are effective for wavelengths between approximately 1525-1565 nm (C band), or 1570-1610 nm (L band). EDFAs were originally developed to replace SONET/SDH optical-electrical-optical (OEO) regenerators, which they have made practically obsolete. EDFAs can amplify any optical signal in their operating range, regardless of the modulated bit rate. In terms of multi-wavelength signals, so long as the EDFA has enough pump energy available to it, it can amplify as many optical signals as can be multiplexed into its amplification band (though signal densities are limited by choice of modulation format). EDFAs therefore allow a single-channel optical link to be upgraded in bit rate by replacing only equipment at the ends of the link, while retaining the existing EDFA or series of EDFAs through a long haul route. Furthermore, single-wavelength links using EDFAs can similarly be upgraded to WDM links at reasonable cost. The EDFA's cost is thus leveraged across as many channels as can be multiplexed into the 1550 nm band.
At this stage, a basic DWDM system contains several main components:
The introduction of the ITU-T G.694.1 frequency grid in 2002 has made it easier to integrate WDM with older but more standard SONET/SDH systems. WDM wavelengths are positioned in a grid with exactly 100 GHz (about 0.8 nm) spacing in optical frequency, with a reference frequency fixed at 193.10 THz (1552.52 nm). The main grid is placed inside the optical fiber amplifier bandwidth but can be extended to wider bandwidths. The first commercial deployment of DWDM was made by Ciena Corporation on the Sprint network in June 1996. Today's DWDM systems use 50 GHz or even 25 GHz channel spacing for up to 160-channel operation.
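The grid itself is purely arithmetic: channel n sits at 193.1 THz plus n times the spacing. A minimal sketch (the function name is illustrative):

```python
# ITU-T G.694.1 DWDM grid: channel frequencies at 193.1 THz + n * spacing,
# with the corresponding vacuum wavelength derived from lambda = c / f.

C = 299_792_458.0   # speed of light in vacuum, m/s
ANCHOR_THZ = 193.1  # grid reference frequency

def grid_channel(n: int, spacing_ghz: float = 100.0) -> tuple[float, float]:
    """Return (frequency in THz, vacuum wavelength in nm) for grid index n."""
    f_thz = ANCHOR_THZ + n * spacing_ghz / 1000.0
    wavelength_nm = C / (f_thz * 1e12) * 1e9
    return f_thz, wavelength_nm

for n in (-1, 0, 1):
    f, lam = grid_channel(n)
    print(f"{f:.1f} THz  {lam:.2f} nm")
```

Note that higher channel indices mean higher frequencies and therefore shorter wavelengths; the 0.8 nm step quoted above is the wavelength-domain image of the fixed 100 GHz frequency step.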
DWDM systems have to maintain a more stable wavelength or frequency than is needed for CWDM because of the closer spacing of the wavelengths. Precision temperature control of the laser transmitter is required in DWDM systems to prevent "drift" out of a very narrow frequency window on the order of a few GHz. In addition, since DWDM provides greater maximum capacity, it tends to be used at a higher level in the communications hierarchy than CWDM, for example on the Internet backbone, and is therefore associated with higher modulation rates, creating a smaller market for DWDM devices of very high performance. These factors of smaller volume and higher performance result in DWDM systems typically being more expensive than CWDM.
Recent innovations in DWDM transport systems include pluggable and software-tunable transceiver modules capable of operating on 40 or 80 channels. This dramatically reduces the need for discrete spare pluggable modules, when a handful of pluggable devices can handle the full range of wavelengths.
At this stage, some details concerning wavelength-converting transponders should be discussed, as this will clarify the role played by current DWDM technology as an additional optical transport layer. It will also serve to outline the evolution of such systems over the last 10 or so years.
As stated above, wavelength-converting transponders originally served to translate the transmit wavelength of a client-layer signal into one of the DWDM system's internal wavelengths in the 1550 nm band (note that even external wavelengths in the 1550 nm band will most likely need to be translated, as they will almost certainly have neither the required frequency stability tolerances nor the optical power necessary for the system's EDFA).
In the mid-1990s, however, wavelength converting transponders rapidly took on the additional function of signal regeneration. Signal regeneration in transponders quickly evolved through 1R to 2R to 3R and into overhead-monitoring multi-bitrate 3R regenerators. These differences are outlined below:
As mentioned above, intermediate optical amplification sites in DWDM systems may allow for the dropping and adding of certain wavelength channels. In most systems deployed as of August 2006 this is done infrequently, because adding or dropping wavelengths requires manually inserting or replacing wavelength-selective cards. This is costly, and in some systems requires that all active traffic be removed from the DWDM system, because inserting or removing the wavelength-specific cards interrupts the multi-wavelength optical signal.
With a ROADM, network operators can remotely reconfigure the multiplexer by sending soft commands. The architecture of the ROADM is such that dropping or adding wavelengths does not interrupt the "pass-through" channels. Numerous technological approaches are utilized for various commercial ROADMs, the tradeoff being between cost, optical power, and flexibility.
When the network topology is a mesh, where nodes are interconnected by fibers to form an arbitrary graph, an additional fiber interconnection device is needed to route the signals from an input port to the desired output port. These devices are called optical cross-connects (OXCs). Various categories of OXCs include electronic ("opaque"), optical ("transparent"), and wavelength-selective devices.
Cisco's Enhanced WDM system combines 1 Gb Coarse Wave Division Multiplexing (CWDM) connections using SFPs and GBICs with 10 Gb Dense Wave Division Multiplexing (DWDM) connections using XENPAK, X2 or XFP DWDM modules. These DWDM connections can either be passive or boosted to allow a longer range for the connection. In addition to this, CFP modules deliver 100 Gbit/s Ethernet suitable for high speed Internet backbone connections.
See also transponders (optical communications) for different functional views on the meaning of optical transponders.
There are several simulation tools that can be used to design WDM systems.