A rare large-scale wireless LAN stress test using three vendors' equipment found that many WLANs could run into a performance ceiling as they grow in size and traffic. The study's authors say the problem lies not with the vendors' gear but with the design of the 802.11 protocol.
The tests confirm two troubling issues for high-density nets, according to Novarum, a consulting firm that developed and ran the tests. First, co-channel radio interference among the access points clobbers the aggregate throughput of the WLAN. Second, the conventional thin access point/controller architecture doesn't scale well as the number of access points increases in a given area.
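The co-channel interference problem can be illustrated with a back-of-envelope model (this is our sketch, not a calculation from the Novarum report, and the per-channel rate is an assumed figure): in the 2.4GHz band only three channels do not overlap, so access points within radio range of one another on the same channel must share that channel's airtime, and aggregate capacity plateaus as access points are added.

```python
# Back-of-envelope model (assumed numbers, not from the Novarum report).
# In 2.4GHz, only channels 1, 6 and 11 do not overlap, so co-channel
# access points in one collision domain share airtime rather than add it.

def aggregate_capacity(num_aps, per_channel_mbps=22.0, channels=3):
    """Crude ceiling on aggregate throughput in a single collision domain.

    per_channel_mbps is an assumed figure for typical 802.11g goodput.
    Once every non-overlapping channel is in use, adding access points
    adds contention, not capacity.
    """
    active = min(num_aps, channels)
    return active * per_channel_mbps

for n in (3, 10, 15):
    print(n, aggregate_capacity(n))  # capacity flattens at 3 APs
```

The model ignores many real-world effects (partial overlap, capture, rate adaptation), but it captures why 15 access points in one office need not deliver more aggregate throughput than 3.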
These problems stem from the design of the 802.11 media access control (MAC) layer -- in particular, how it handles acknowledgments and retries -- which becomes problematic under high, sustained traffic loads. In many WLAN products, and the WLANs built with them, this protocol inefficiency hasn't been an issue, because networks have served relatively few wireless clients generating light, bursty data traffic.
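A simple airtime model shows how that inefficiency compounds under load (the overhead and error-rate figures below are illustrative assumptions, not measurements from the report): every 802.11 data frame must be individually acknowledged, and a lost frame is retransmitted in full, so each retry burns a whole frame's worth of airtime.

```python
# Illustrative sketch with assumed numbers (not from the Novarum report).
# 802.11 acknowledges every data frame and retransmits lost frames whole.

def goodput_fraction(frame_error_rate, overhead_fraction=0.3):
    """Share of airtime carrying useful payload.

    overhead_fraction approximates fixed per-frame costs (preamble,
    MAC header, interframe gaps, the ACK itself, contention backoff).
    With frame error rate p, the expected transmissions per delivered
    frame are 1 / (1 - p), so goodput shrinks as collisions rise.
    """
    payload_share = 1.0 - overhead_fraction
    expected_tx = 1.0 / (1.0 - frame_error_rate)
    return payload_share / expected_tx

# Clean air vs. a dense deployment where co-channel collisions are common:
print(goodput_fraction(0.05))
print(goodput_fraction(0.40))
```

In a lightly loaded network the fixed overhead is tolerable; in a dense one, collisions drive up the retry rate, and the retries themselves consume airtime that causes further collisions.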
High-throughput WLANs based on 802.11n, especially running in the 5GHz band, will partly mitigate these effects, according to the study's authors, but not entirely. Because 11n offers a larger 'data pipe,' it takes more traffic to reach network overload in a dense WLAN. But because of 11n's higher throughput, enterprises are looking to do more with it, such as voice and streaming video, which can push the net toward overload.
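The trade-off the authors describe can be put in rough numbers (the capacity and per-client rates below are our assumptions for illustration, not figures from the study): 11n's larger pipe raises the overload point, but heavier applications consume the extra headroom.

```python
# Rough illustration with assumed rates (not from the Novarum study).
G_CAPACITY_MBPS = 22.0    # plausible 802.11g goodput per channel
N_CAPACITY_MBPS = 120.0   # plausible 802.11n goodput with channel bonding

def clients_before_overload(capacity_mbps, per_client_mbps):
    """How many clients fit before offered load exceeds capacity."""
    return int(capacity_mbps // per_client_mbps)

# Light data users (~0.5 Mbps each) vs. streaming-video users (~4 Mbps each):
print(clients_before_overload(G_CAPACITY_MBPS, 0.5))   # 44
print(clients_before_overload(N_CAPACITY_MBPS, 0.5))   # 240
print(clients_before_overload(N_CAPACITY_MBPS, 4.0))   # 30
```

The same network that comfortably carries hundreds of light data users supports only a few dozen video streams, which is why the authors say 11n mitigates the problem without eliminating it.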
"If you don't deal with co-channel interference and [the need for] access point coordination, it will put more pressure on the protocols supporting [wireless] voice and video," says Phil Belanger, co-founder of Novarum.
Designing a real-world test
Novarum's tests, conducted in the fall of 2007 in a vacant second-floor office in Sunnyvale, Calif., are unusual because of the number of access points and clients actually deployed. As the report points out, WLAN tests typically involve a single access point and about a dozen clients deployed in a lab-like environment.
In this case, Novarum used 72 wireless laptops and up to 54 wireless VoIP handsets, linking to a typical office wireless LAN first with 15 and then with 10 access points. The net was recreated each time with access points from Aruba Networks, Cisco, and Meru Networks.
Novarum designed and ran the tests. But the project was paid for by Meru (the office space had been rented by Meru for employees who had not yet moved in).
Seven tests were run, in most cases using first 15 and then 10 access points. The access points had 802.11a/b/g radios, but Novarum ran the tests only with 802.11g in the 2.4GHz band. One test was data only, with the 72 laptops; others tested only voice traffic, with 24, 48 and 72 simulated VoIP conversations. Two tests mixed voice and data, and one tested the VoIP handsets to find out how many simultaneous calls the WLAN could support.
In e-mails to Network World after the original version of this story was posted online, Aruba's head of strategic marketing, Michael Tennefoss, objected to what he called a "biased report" based on "bad science," and a "Meru puff piece." We invited Aruba to post its objections in the reader comments mechanism at the end of the story online, but Tennefoss declined, saying readers would interpret the post as the "sour grapes" of the "loser."
Tennefoss made a number of technical objections, but one stood out. He said the antennas on the Aruba access points were in the "closed" position (that is, folded on their hinges against the case) instead of in the "open" position as instructed in the Aruba user's guide. "They might as well have removed the antennas altogether," he writes.
He knows this was done because the Novarum report explicitly mentions it, in Appendix C. That entry says the tests were initially run with the antennas correctly positioned but the data throughput results were "very disappointing." The results improved dramatically when the antennas were folded down, so the tests were run with the antennas in this position.
Tennefoss says this behavior should have been a clue that something was misconfigured or that the access points were wrongly deployed.
In an e-mail response, Novarum's Belanger says the stress test was not intended as a product review of the vendors' equipment, though they took a similar starting point: "standard [vendor] product tested with the latest released software and standard tools. Whenever we deviated from that it was explicitly called out [in the report]," he writes.
He defends testing the Aruba access points with their antennas folded because it resulted in much better performance for the Aruba WLAN.
In our own large-scale WLAN test in 2006, with 25 access points, Aruba won the Network World Clear Choice Award for its performance. Meru was the only other vendor, out of 19 invited, willing to participate, but various problems, including beta software code, prevented that vendor from completing the test. Full test results are online, including a deficiency in the 802.11 protocol that could lower throughput.