So You Want Wi-Fi?

Developers implementing Wi-Fi in their products face a choice among several 802.11 standards: 802.11a, b, g, n, or a combination of them. So which one should you choose? Let’s review the advantages and tradeoffs of these technologies to help you make an educated selection.

Background on Wi-Fi

Wi-Fi, used by more than 700 million people, is one of the fastest-growing wireless technologies. Its precursor was invented in 1991 by NCR Corporation/AT&T in the Netherlands. “Wi-Fi” is not a technical term but is commonly used to refer to the IEEE 802.11 family of standards, the first of which was published in 1997. The standard has evolved over the past 15 years to support more applications in a rapidly growing market. In 1999, the Wi-Fi Alliance was formed to promote the growth of Wi-Fi and to ensure industry success by certifying products that use Wi-Fi technology.

802.11b

802.11b was ratified by the IEEE in 1999 to improve on the data rate of the original 802.11 standard of 1997, which supported a maximum of only 2 Mbps. The original standard is no longer supported by the industry and therefore is not relevant to this analysis. All 802.11/Wi-Fi standards operate in unlicensed radio spectrum; 802.11b supports only the license-free ISM band around 2.4 GHz. Its maximum data rate of 11 Mbps is the slowest of the standards discussed here. Today, the main advantages of 802.11b are low chipset cost and robust signal integrity, a consequence of the higher receiver sensitivity achievable at lower data rates.

802.11a

802.11a was ratified around the same time as 802.11b, but the market has taken far longer to adopt it. 802.11a supports data rates up to 54 Mbps and uses the license-free band around 5 GHz. The higher frequency reduces the range of the RF signal, which also does not transmit well through obstructions such as walls. However, the 5 GHz band is far less crowded, so 802.11a excels in interference performance. 802.11a solutions are also typically more expensive than 802.11b solutions because of their more costly RF components.

802.11g

The 802.11g standard was released in 2003. The IEEE intended 802.11g to combine the benefits of 802.11a and 802.11b: it uses the 2.4 GHz band and supports data rates up to 54 Mbps. The higher data rate at the lower frequency offers the best of both worlds by enabling high-bandwidth systems to work over a longer range, and 802.11g is backward-compatible with 802.11b. Because it remains in the 2.4 GHz band rather than the less crowded 5 GHz band, however, 802.11g is more susceptible to interference from other 2.4 GHz devices such as mobile phones, microwave ovens, and Bluetooth headsets.

802.11n

802.11n is the newest of these standards, ratified by the IEEE in 2009. It was designed to deliver more bandwidth than the existing 802.11 standards by using wider channels and a technology called MIMO (multiple input, multiple output). With four spatial streams in a 40 MHz channel, the theoretical maximum data rate rises to 600 Mbps. In reality, most implementations do not come close to 600 Mbps; they are closer to 65 Mbps for a single stream in a 20 MHz channel and 300 Mbps for 2x2 MIMO in a 40 MHz channel.
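
Those figures follow directly from the 802.11n PHY arithmetic. The sketch below (Python, assuming the standard HT parameters: 52 data subcarriers in a 20 MHz channel, 108 in a 40 MHz channel, 64-QAM with rate-5/6 coding at the top per-stream MCS, and a 4 µs or 3.6 µs symbol time) reproduces the quoted numbers. Actual application throughput will be noticeably lower than these PHY rates.

    # Theoretical 802.11n (HT) PHY rate at the highest per-stream MCS (64-QAM, rate 5/6).
    def ht_phy_rate_mbps(spatial_streams, channel_mhz, short_gi=True):
        data_subcarriers = {20: 52, 40: 108}[channel_mhz]   # HT OFDM numerology
        bits_per_subcarrier = 6 * 5 / 6                     # 64-QAM (6 bits) x coding rate 5/6
        symbol_time_us = 3.6 if short_gi else 4.0           # short vs. long guard interval
        bits_per_symbol = spatial_streams * data_subcarriers * bits_per_subcarrier
        return bits_per_symbol / symbol_time_us             # Mbit/s

    print(ht_phy_rate_mbps(1, 20, short_gi=False))  # 65.0  -> single stream, 20 MHz ("without MIMO")
    print(ht_phy_rate_mbps(2, 40))                  # 300.0 -> 2x2 MIMO, 40 MHz channel
    print(ht_phy_rate_mbps(4, 40))                  # 600.0 -> theoretical maximum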

802.11n also operates in both the 2.4 GHz and 5 GHz bands; however, most available chipsets support only the 2.4 GHz band to keep solution cost down. Another advantage of MIMO is better range in multipath environments, because the multiple radio chains improve signal integrity. The cost of implementing an 802.11n solution is typically higher than that of 802.11b/g.

So which one to use?

The major silicon providers, including Atheros, Broadcom, Marvell, and Texas Instruments, support multiple IC options; however, they are clearly promoting the two most comprehensive solutions: 802.11a/b/g/n and 802.11b/g/n. The lack of promotion for 802.11b/g-only chips seems to be driven by demand from applications that require 802.11n. Perhaps the better questions are whether to use the 2.4 GHz or 5 GHz band, and whether MIMO is required to meet your data rate requirements.
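
As one way of framing that decision, here is a purely illustrative sketch; the thresholds and trade-offs encoded below are assumptions for discussion, not rules from the standard or from any vendor.

    # Illustrative band/feature selection helper; thresholds are assumptions, not requirements.
    def suggest_radio(required_mbps, crowded_2g4=False, must_penetrate_walls=True):
        needs_mimo = required_mbps > 65       # beyond a single-stream 20 MHz 802.11n link
        needs_n = required_mbps > 25          # beyond what an 802.11g link delivers in practice (assumed)
        band = "5 GHz" if (crowded_2g4 and not must_penetrate_walls) else "2.4 GHz"
        if not needs_n:
            return "802.11a (5 GHz)" if band == "5 GHz" else "802.11b/g (2.4 GHz)"
        streams = "2x2 MIMO" if needs_mimo else "single-stream"
        return f"802.11n, {streams}, {band}"

    print(suggest_radio(5))                                  # 802.11b/g (2.4 GHz)
    print(suggest_radio(100, crowded_2g4=True,
                        must_penetrate_walls=False))         # 802.11n, 2x2 MIMO, 5 GHz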

Module versus integrated design

Once you have deliberated over which standard to use, you are faced with how to implement it. It is not unheard of for silicon providers to require extremely high minimum order quantities (MOQs) to support a chip-level design. Given the history and business models of these companies, there are probably very good reasons behind the large MOQ requirement. A true make-versus-buy analysis warrants its own article, but to leave you with a few thoughts, here are some of the main points to consider (a rough break-even sketch follows the list):

  • Compliance: EMC testing and certification can cost tens of thousands of dollars. By using a pre-certified module, you can eliminate the need for testing the product as an “intentional radiator.”
  • Test equipment: Designing an 802.11 radio requires the appropriate test equipment to validate conformance to the standard and the RF performance. This equipment can cost more than $50,000, and more than one set may be required.
  • Risk and time to market: Designing in a discrete solution can take months to complete and validate. Using a pre-certified and validated module drastically reduces your risk and development time.
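
To make those points concrete, the sketch below walks through a simple break-even calculation. All of the figures are illustrative assumptions chosen only to show the shape of the trade-off; substitute your own certification, test-equipment, engineering, and per-unit costs.

    # Rough make-vs-buy break-even sketch with purely illustrative numbers.
    # Only the differences matter: a module carries a per-unit price premium,
    # while a chip-down (discrete) design carries up-front NRE.
    def total_cost(units, unit_premium, nre):
        return nre + units * unit_premium

    CHIP_DOWN_NRE = 50_000 + 30_000 + 60_000   # test gear + certification + engineering (assumed)
    MODULE_PREMIUM = 8.0                        # extra cost per unit for a module vs. discrete (assumed)

    for units in (1_000, 5_000, 20_000, 100_000):
        module = total_cost(units, MODULE_PREMIUM, nre=0)
        chip_down = total_cost(units, 0.0, nre=CHIP_DOWN_NRE)
        better = "module" if module < chip_down else "chip-down"
        print(f"{units:>7} units: module premium ${module:>9,.0f}  chip-down NRE ${chip_down:>9,.0f}  -> {better}")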

Laird Embedded Wireless Solutions offers fully certified modules for 802.11a/b/g/n, Bluetooth 2.1+EDR, and Bluetooth Low Energy 4.0 based on Texas Instruments’ WL1271 and WL1273. Laird also offers a serial-to-WLAN module based on Texas Instruments’ CC3000. The TiWi™ modules are part of a complete line of certified RF modules.
