
The Noise Floor and What To Do With It

May 30, 2012


In previous articles (here and here), I have attempted to steer you away from trying to locate sources of interference and provided a framework for troubleshooting to test for and isolate problems stemming from noise or interference. I thought it would be good to next demonstrate an example of how this procedure can be put to use. So allow me to spin a terrifying yarn… a true story of a WISP that experiences an unexplained sector outage at the start of a holiday weekend.

To review, the term ‘noise floor’ is defined in Wikipedia as:

“The measure of the signal created from the sum of all the noise sources and unwanted signals within a measurement system, where noise is defined as any signal other than the one being monitored.”

To paraphrase, the noise floor is the murky, nasty stuff you want your operating frequencies to stay well above. Ideally, you want the noise floor to hover around -100 dBm or lower. Always. The higher the noise floor, the stronger your legitimate signals need to be to guarantee optimal performance of your links. So what happens when the noise floor rises to your network's detriment?
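The arithmetic behind that statement is just subtraction in dB. A minimal sketch, with an assumed CPE receive level of -70 dBm (illustrative, not from this network):

```python
def snr_db(rss_dbm, noise_floor_dbm):
    """Link margin in dB: received signal strength minus the noise floor."""
    return rss_dbm - noise_floor_dbm

# Assumed -70 dBm receive level against the ideal -100 dBm noise floor:
healthy = snr_db(-70, -100)   # 30 dB of headroom

# The same link after a 20 dB rise in the noise floor:
degraded = snr_db(-70, -80)   # only 10 dB left
print(healthy, degraded)
```

Nothing about the signal itself changed in the second case; the noise floor alone eroded the margin.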

Observe the spectrum analysis below. It is a snapshot from a dynamic baseline of an access point operating at 924.8 MHz:

As you can see, this sector was already experiencing less-than-ideal noise levels, mainly due to antenna height (70 meters) and its directionality. However, almost all customers were close enough to the tower that the receive signal strength (RSS) on both sides sat well above the noise floor, exceeding the manufacturer's recommended noise-tolerance margin (20 dB). Even with fluctuating interference (the red line) reaching up to -65 dBm, average throughput was consistently tested and verified at the rates promised in the WISP's service agreement.

Now for the scary stuff:

A week later, the same radio's spectrum analysis was reporting an unprecedented 20 dB rise in the noise floor, effectively severing all customer connections. So what to do? Run to the hills or stand and fight? The crux was to quickly rule out anything within my domain of control as the cause, so that I could start formulating a plan to solve the crisis. To do this I used the troubleshooting process I previously described in the Noise Detection article.


Scale: A single 90º sector at 924.8 MHz, affecting roughly 20 customers within 2 km of a 70 m, four-sector tower.
Internal factors: No known changes or additions to the WISP's wireless network.
External factors: No known tower (or other) construction projects in the coverage area.
Timing: Interesting but ultimately unhelpful: the abrupt rise in noise level was recorded shortly after midnight on Canada Day (July 1).


Step 1. Baseline analysis: 20 dB increase in the noise floor in the upper 900 MHz band.
Step 2. Configuration changes: No changes to hardware or software within the 24-hour window in which the problem occurred.
Step 3. Customer feedback: Unnecessary; the effect on affected customers was already known (unable to connect).
Step 4. Radio configuration: Verified correct.
Step 5. Radio operation and monitoring: AP radio operating nominally; monitoring showed RSS flatlined for affected CPEs and noise levels through the roof.
Step 6. Hardware inspection and component swapping: All hardware visually inspected and components swapped with spares; no effect.
Step 7. Parallel system testing: A spare radio with a panel antenna attached detected similar noise levels at the original antenna height (70 m); noise floor levels decreased slightly at each 10 m step down to 30 m.


This real-life example describes every wireless operator's worst nightmare in all-too-vivid detail. And as nightmares usually go, deploying the solution was no easier than admitting the cause of the problem. Given the affected sector's height and the fact that it was pointing into a densely populated area, it was concluded that the raised noise floor was caused by a newly installed, unidentified system transmitting from somewhere within an approximately 250 km² area south of the tower. As such, the sector's channel was retired and the remaining three sectors were rotated to compensate. Service to 98% of customers was restored with similar or better performance.
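To put that search area in perspective, a back-of-envelope sketch: if the ~250 km² region is modeled as a simple 90-degree pie slice extending south from the tower (an assumption for illustration; the real coverage footprint is irregular), the offending transmitter could have been nearly 18 km away.

```python
import math

# Assumption: model the search area as a quarter-circle (90-degree sector)
# centered on the tower. Area of a sector = pi * r^2 * (angle / 360).
search_area_km2 = 250.0
sector_fraction = 90.0 / 360.0

# Solve area = pi * r^2 * fraction for the radius r:
radius_km = math.sqrt(search_area_km2 / (math.pi * sector_fraction))
print(round(radius_km, 1))  # ~17.8 km
```

That radius is why hunting for the source was a non-starter: door-to-door direction finding over tens of kilometers of dense urban area is rarely worth the truck rolls compared with re-planning the sector.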

