Updated 3/10/2005

Wi-Fi Test Procedure

This page describes the test procedure used in Rounds One and Three. The procedure for Round Two is described on the Round Two Wi-Fi Baseline Tests page.

We performed quantitative measurements of network speed using a modified version of HONEST (HOme NEtworking Speed Test), the diagnostic we created for the HomePlug tests reported earlier. HONEST times the transfer of large files between PCs; we believe it provides a realistic estimate of throughput for comparing different products and networking technologies. Note that our measured transfer rates are substantially lower than the "raw" network speeds (11 Mbps for 802.11b and 54 Mbps for 802.11a and 802.11g) prominently featured in marketing claims.
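
To show the basic idea behind this kind of measurement, here is a minimal Python sketch of a HONEST-style throughput test: it times the transfer of a large file and converts bytes per second into megabits per second. This is an illustration of the approach, not the HONEST code itself, and the server path is a placeholder.

    # Illustrative sketch of a HONEST-style throughput measurement, not the
    # actual HONEST tool. The file path is a placeholder for a large file
    # reachable across the wireless link (for example, a share on the server).
    import time

    CHUNK = 64 * 1024  # read the file in 64 KB blocks

    def measure_throughput_mbps(path):
        """Time the transfer of a large file and return the rate in Mbps."""
        total_bytes = 0
        start = time.monotonic()
        with open(path, "rb") as f:
            while True:
                block = f.read(CHUNK)
                if not block:
                    break
                total_bytes += len(block)
        elapsed = time.monotonic() - start
        return (total_bytes * 8) / (elapsed * 1_000_000)

    if __name__ == "__main__":
        print(f"{measure_throughput_mbps('//server/share/testfile.bin'):.1f} Mbps")

Measured this way, throughput includes all protocol overhead and retransmissions, which is one reason it falls well short of the raw signaling rates.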

We chose nineteen locations around the house for testing: the fourteen used for the HomePlug evaluation, plus two more locations inside the house and three outside. See the test locations page for a diagram of our house showing each test location including the access point.

Sony VAIO at location 11 for the Round Three tests.

We selected the locations to get a good measure of the sensitivity of 802.11 to obstacles. One location (11 in the diagram) is in the same room as the access point and has a direct "line-of-sight" path with no obstacles. The others are separated from the access point by one or several walls, a floor, ducting and/or a steel beam. Except for the beam, we think this is typical of North American suburban houses.

Our testing model was as follows:

  • We placed each access point at the same location -- labelled AP on the diagram -- in approximately the center of the main floor of our house.
  • To avoid interference, we unplugged all other access points, notebook adapters, and other wireless devices during each test series.
  • We used the default channels for each access point: channels 1, 6 or 11 for 802.11b and channel 60 for 802.11a.
  • We installed each notebook adapter in a notebook PC (an IBM X-30 for Round One and a Sony VAIO for Round Three), which we carried around the house for each test series.
  • For each test, we used our HONEST test program to measure the transfer speed of a large file from our server to the test PC; in other words, we measured the file "download" speed from our server. We usually ran HONEST once at each test location, measuring the amount of data that could be transferred in 100 seconds. If we saw an unexpected result, we ran the test again and averaged the results (see the sketch after this list).
  • We generally completed one test series before starting another one, but we "filled in" a few measurements.
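
The sketch below, again in Python, follows the per-location procedure described above: it reads from the server for a fixed 100-second window, and re-runs and averages when the first result looks unexpected. The file path and the 50% deviation band used to flag an "unexpected" result are placeholders, not values from our tests.

    # Sketch of one per-location measurement following the procedure above.
    # The file path and the 50% "unexpected" band are placeholders.
    import time

    TEST_SECONDS = 100
    CHUNK = 64 * 1024

    def timed_download_mbps(path, seconds=TEST_SECONDS):
        """Measure how much data can be read from `path` in a fixed time window."""
        total_bytes = 0
        deadline = time.monotonic() + seconds
        with open(path, "rb") as f:
            while time.monotonic() < deadline:
                block = f.read(CHUNK)
                if not block:      # reached end of file: start over
                    f.seek(0)
                    continue
                total_bytes += len(block)
        return (total_bytes * 8) / (seconds * 1_000_000)

    def measure_location(path, expected_mbps):
        """Run once; if the result deviates sharply, run again and average."""
        first = timed_download_mbps(path)
        if abs(first - expected_mbps) > 0.5 * expected_mbps:
            second = timed_download_mbps(path)
            return (first + second) / 2
        return first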

AirMagnet

"Signal Strength" portion of AirMagnet AirWISE screenWe used AirMagnet Laptop throughout our testing. AirMagnet is a new hardware/software system that provides a full suite of tools to make precise measurements and diagnose problems with wireless networks. This is a screenshot of the "Signal Strength" portion of the AirMagnet AirWISE screen. (Also see a complete screen shot of the AirWISE screen showing security warnings since we made all of our measurements with WEP encryption disabled.)

During our testing, we used the AirMagnet survey tool several times to run "long surveys" of each access point from a fixed location (11 or 15) close to the access point. Each long survey ran for at least ten hours, with all other wireless equipment powered off.

Before starting each series of test measurements, we used the AirMagnet survey tool to log the access point signal strength and noise level. We ran a similar survey at the end of each series.

Start-Up Problems

We had several false starts in running the Wi-Fi test series. While the tests were delayed, vendors released several generations of upgraded firmware for the access points and upgraded drivers for the notebook adapters. In our Round Two testing, we found that upgrading the firmware and drivers often improved performance.

We also encountered several problems with the AirMagnet test tool. We found that the software worked best for our purposes with the newer Netgear WAB501 notebook adapter, and we upgraded the software several times as new versions were released.

We kept encountering inconsistent test measurements. Each time we made a new set of measurements, we found that the same combination of access point and notebook adapter would give different results, sometimes radically different, in tests separated by a few hours or days. It took us several months to gain some understanding of what might be causing these effects, and to develop a methodology that minimizes them enough to provide meaningful results.
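
One simple way to see this inconsistency in the numbers is to summarize repeated runs of the same access point and adapter combination with a mean, standard deviation, and coefficient of variation, as in the Python sketch below; the throughput figures are placeholders, not our measurements.

    # Sketch of summarizing run-to-run variation for one access point / adapter
    # combination. The throughput figures are placeholders, not measured data.
    from statistics import mean, stdev

    runs_mbps = [5.2, 4.9, 2.1, 5.0]  # hypothetical repeated HONEST results

    avg = mean(runs_mbps)
    spread = stdev(runs_mbps)
    # A large coefficient of variation flags the kind of inconsistency we saw.
    print(f"mean {avg:.1f} Mbps, std dev {spread:.1f} Mbps, CV {100 * spread / avg:.0f}%")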


Next: Wi-Fi Test Results: Round One