Economic and practical motivations are making wireless communication a winning technology in present and future network deployments. Depending on the specifications of the considered network scenarios, service types and size of the deployed network, different technologies (802.11a/b/g/n, 802.16/e, 802.15.4, Bluetooth, Hiperlan/2, etc.) can be exploited in order to best satisfy service requirements. Many of the aforementioned technologies share the same spectrum bands, hence potentially conflicting with each other. This phenomenon is a threat to wireless communications, especially in densely populated areas, where the ISM (Industrial, Scientific, Medical) bands are typically overcrowded by many private and public services. Only recently has the research community begun considering the possibility of turning the issue of coexistence of heterogeneous access technologies competing within the same frequency bands into an advantage, by exploiting the principles of cooperation [1].
Yet, this challenge calls for clever management and optimization of the available spectrum resources. In particular, the first and foremost step to be able to coordinate heterogeneous technologies coexisting within the same frequency bands is to identify them. Most currently available commercial technologies support only basic interference detection and avoidance techniques, whereas active cooperation for the identification of and the coexistence with other radio systems is not available. In order to determine the level of utilization of different parts of the radio spectrum and to infer which technologies are active in the wireless neighborhood of a device, we developed CRABSS, i.e., a CalRAdio-Based advanced Spectrum Scanner platform, which integrates the capabilities of the CalRadio 1 Software Defined Radio (SDR) [2] with the flexibility and modularity offered by the Unified Link Layer API (ULLA) [3] project.
CalRadio 1 is a development platform supporting an IEEE 802.11b RF transceiver and an open and completely reprogrammable Medium Access Control (MAC) firmware. Taking advantage of the flexibility of the platform, we designed and developed a suitable MAC protocol that realizes the advanced spectrum scanning functionalities of CRABSS. More specifically, CRABSS provides two different operational modes, namely horizontal and vertical.
In the horizontal mode, CRABSS operates as a legacy IEEE 802.11b station, thus enabling the exchange of data with other stations while collecting link-related performance measurements. Compared with most commercial cards, CRABSS offers a richer set of metrics that, besides the usual link-layer counters such as the number of transmitted and received packets, also includes system-related indices, such as the number of active stations, the channel utilization per station, the station activity rate, and so on. Some of these metrics can also be obtained by using commercial WiFi cards in promiscuous mode, which, however, does not allow for active data exchange. Conversely, CRABSS is capable of collecting these metrics while maintaining its standard operational capabilities.
In the vertical mode, the software of the CalRadio 1 device is reconfigured in such a way that the processing capabilities of the board are used to perform Clear Channel Assessment (CCA) only, i.e., to reveal the presence of energy above threshold on a certain portion of the radio spectrum. With respect to other spectrum scanners, CRABSS offers much larger flexibility in setting the sniffing pattern, which can range from an extremely quick (tens of milliseconds) sweeping of the whole 2.4 GHz ISM band, spending just a few milliseconds in each sub-band, to the persistent scanning of a single channel, or a group of channels. In short, CRABSS supports basically any frequency-scanning pattern, making it possible to specify the sequence of RF channels to be visited and the dwell time in each of them, within the limits of the CalRadio 1 hardware architecture.
The great flexibility of CRABSS can facilitate the identification of link or network critical situations, while offering a more complete picture of the current spectrum utilization that can be used to correctly drive cognitive optimization procedures. As an example, CRABSS can operate in the horizontal mode to exchange data with a legacy access point. During periods of inactivity, CRABSS can switch to the vertical mode to perform a quick scanning of the whole spectrum, thus maintaining an updated view of the current occupancy level of the different channels. If link measurements collected in the horizontal mode reveal that the link performance is going to degrade below a critical threshold, then CRABSS can make use of the information collected during its idle periods to switch to a better channel, thus saving time and limiting the performance loss.
The integration of CRABSS with the ULLA framework further enhances the potential of the solution by offering a standardized and rather intuitive interface to access and control CRABSS functionalities. In this way, CRABSS can be easily integrated in different cognitive architectures as either an advanced IEEE 802.11b station or a spectrum scanning device. The information collected by CRABSS and exported through ULLA can be stored in a database for different purposes, such as tracking the utilization level of the ISM spectrum over time, inferring the different technologies that are active in the frequency band, and so on.
In the remainder of this paper, we briefly describe the basic ingredients of CRABSS, namely CalRadio 1 and the ULLA framework. Then, we describe the CRABSS architecture, specifying some details of the supported functionalities, present some preliminary experimental results and discuss how CRABSS may be used to help alleviate the current problem of the overcrowded ISM bands. We conclude the paper with some final remarks.
2. CalRadio 1 Platform
CalRadio 1 is an open 802.11b-compatible development platform, designed and developed at the University of California, San Diego, with the aim of providing the research community with an open and fully reprogrammable board for experimental testing. The main purpose of this device is to study the 802.11 [4] MAC protocol in order to understand its critical points and to identify possible enhancements.
In this section, we summarize the most important features of the platform, referring to [5] for a more detailed analysis of CalRadio 1 functionalities and performance.
2.1. Functional and Architectural Description
CalRadio 1 consists of four main components, namely an ARM (ARM7TDMI) processor [6], a DSP (TI 5471 Digital Signal Processor) [7], a Baseband processor (Intersil HFA3863) [8] and an RF transceiver (MAX2821) [9]. With respect to the ISO/OSI reference protocol stack, the ARM processor supports the ‘higher layer’ functionalities, from the network layer up, the DSP carries out the Data Link Layer (DLL) functionalities, including the MAC, whereas the physical layer (PHY) functionalities are implemented by the Baseband processor and the RF transceiver, as schematically represented in Figure 1.
ARM and DSP are located within a single chip and are interfaced by means of a memory-mapped shared buffer residing on the DSP. This buffer is used for exchanging both data packets and control information. In order to avoid possible race conditions, the buffer is spin-locked before being accessed. Reading and writing the shared memory involves the Arithmetic-Logic Unit (ALU) of the DSP, which therefore cannot perform other operations in parallel. This characteristic represents a potential bottleneck for the performance of the CalRadio 1 board. In fact, reading/writing data on the shared buffer can take a relatively long time, which can impact the overall packet delivery delay of the board.
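The handshake just described can be sketched as follows. All names and sizes are illustrative, not the actual CalRadio 1 firmware code, and a simple flag stands in for the hardware lock, assuming a single-threaded harness.

```c
#include <string.h>

#define BUF_WORDS 256

/* Memory-mapped buffer shared by the ARM and the DSP (illustrative). */
struct shared_buf {
    volatile int   lock;             /* 0 = free, 1 = taken           */
    unsigned short data[BUF_WORDS];
    int            len;              /* valid words; 0 means empty    */
};

static void sb_lock(struct shared_buf *b)   { while (b->lock) { /* spin */ } b->lock = 1; }
static void sb_unlock(struct shared_buf *b) { b->lock = 0; }

/* ARM side: copy a packet into the buffer; fail if it is still occupied. */
int sb_write(struct shared_buf *b, const unsigned short *pkt, int len)
{
    sb_lock(b);
    if (b->len != 0 || len > BUF_WORDS) { sb_unlock(b); return -1; }
    memcpy(b->data, pkt, (size_t)len * sizeof *pkt);
    b->len = len;
    sb_unlock(b);
    return 0;
}

/* DSP side: fetch the packet, freeing the buffer for the next transfer. */
int sb_read(struct shared_buf *b, unsigned short *out)
{
    int len;
    sb_lock(b);
    len = b->len;
    memcpy(out, b->data, (size_t)len * sizeof *out);
    b->len = 0;
    sb_unlock(b);
    return len;
}
```

Note that while one side holds the lock, the copy keeps the ALU busy, which is exactly the bottleneck discussed above.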
The feature that distinguishes CalRadio 1 from common commercial WiFi cards is the ability to access and reprogram the DLL and MAC protocols running on the DSP. Moreover, the DSP controls the setup of the Baseband processor and the RF transceiver, thus making it possible to directly manipulate most of the PHY settings, even at runtime.
In order for the ARM processor to communicate with the DSP, a device driver module is loaded into the 2.4 Linux kernel; this implements a standard L3 network interface that performs the basic operations on the kernel socket buffers, transferring outgoing packets to the lower layers and delivering received packets to the upper network layers. When a new packet is placed in the transmission queue, the driver fetches it and writes it onto a dedicated segment of the shared buffer through which the ARM processor and the DSP communicate with each other.
Besides data packets, specific commands can be sent to the DSP through the shared buffer in order to reconfigure the WiFi interface. A modified version of the Linux iwtools [10] has been cross-compiled for the purpose of controlling the WiFi interface, and the corresponding command hooks have been implemented inside the device driver.
As mentioned, the core of the CalRadio 1 platform is a 100 MHz clocked, programmable DSP. The DSP code can be written in standard ANSI-C language and compiled on an external PC. The generated binary code can be transferred to the ARM via standard File Transfer Protocol and then loaded on the DSP through a loader kernel module.
In Figure 2, we sketch the general operational flow performed by the CalRadio 1 firmware [5].
While cycling in the main loop (ML), the ARM processor delivers transmission packets to the DSP by copying them onto the shared buffer. Once per ML, the DSP checks for packets to transmit. If a new packet is found in the shared memory, the DSP transfers it to the Baseband processor for transmission. Packet reception at the PHY is triggered by a correlation peak in the incoming signal, generated by the physical layer convergence procedure (PLCP). The sensitivity of the reception circuit at the RF stage can be adjusted by configuring a dedicated register in the transceiver. This allows the reception procedure to be more or less selective. Transmission of any packet is subject to the sensing mechanism of the RF transceiver. As long as the channel is sensed busy, the backoff countdown is frozen, resuming in steps of 20 µs time slots once the channel becomes idle again, as dictated by the 802.11 protocol [4]. The channel can be sensed busy based on two types of events: either the detected energy exceeds a given threshold (Energy Detection threshold) or a correlation peak has been detected by the RF circuitry. Either one of these events, or their combination, can be selected by setting the appropriate register in the Baseband processor. Note that correlation alone cannot indicate the presence of interference when this is generated by non-802.11b/g devices.
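The busy/idle decision just described can be modeled as follows. The mode names and threshold handling are our own illustration, not the actual HFA3863 register encoding.

```c
/* CCA modes selectable in the Baseband processor (illustrative names). */
enum cca_mode {
    CCA_ENERGY_ONLY,           /* busy if energy above threshold     */
    CCA_CORRELATION_ONLY,      /* busy if a correlation peak is seen */
    CCA_ENERGY_OR_CORRELATION  /* busy if either event occurs        */
};

/* Return 1 if the channel must be declared busy, 0 otherwise. */
int cca_busy(enum cca_mode mode, int rssi_dbm, int ed_threshold_dbm,
             int correlation_peak)
{
    int energy = (rssi_dbm >= ed_threshold_dbm);
    switch (mode) {
    case CCA_ENERGY_ONLY:      return energy;
    case CCA_CORRELATION_ONLY: return correlation_peak;
    default:                   return energy || correlation_peak;
    }
}
```

A strong non-802.11b/g interferer produces no correlation peak: in correlation-only mode it goes undetected, whereas the energy test reveals it.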
As in any transmission protocol, 802.11b includes atomic operations that require a higher priority than the other interrupts that can occur on the DSP. By implementing a simplified set of macros that mask the available DSP hardware interrupts, we designed a spin-locking mechanism that allows atomic execution of the most critical portions of the code. In particular, when a new data transfer is initiated by the DSP Direct Memory Access (DMA) channel, packet reception is disabled and any reception flags are cleared at the end of the transfer. In all the other parts of the code, we grant the reception procedure higher priority. When a packet is detected, the Baseband signals its presence to the DSP by means of a hardware interrupt. This interrupt pre-empts the code execution to start a new reception routine involving the DMA channels. Once the reception routine is over, standard code flow execution is restored. For atomic operations that do not involve the RF stage, we use flags to signal pending reception operations. When the atomic code execution ends, we check these flags and initiate a new DMA reception transfer if a pending packet is signaled.
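The deferred-reception logic of this paragraph can be sketched as follows (illustrative names; the real firmware masks hardware interrupts rather than testing a flag).

```c
static volatile int irq_masked;   /* set inside atomic sections        */
static volatile int rx_pending;   /* packet arrived while masked       */
static int rx_served;             /* number of reception routines run  */

/* Interrupt raised by the Baseband when a packet is detected. */
void rx_interrupt(void)
{
    if (irq_masked) { rx_pending = 1; return; }  /* defer the reception */
    rx_served++;                                 /* start DMA reception */
}

void atomic_enter(void) { irq_masked = 1; }

void atomic_exit(void)
{
    irq_masked = 0;
    if (rx_pending) {             /* serve the reception deferred      */
        rx_pending = 0;           /* during the atomic section         */
        rx_interrupt();
    }
}
```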
2.2. CalRadio 1 Sensing Capabilities
Regarding the potential offered by the CalRadio 1 platform, our first concern was to determine the maximum device agility, which is the major requirement to guarantee fast energy detection within a certain band. In a second stage, we focused on the issues related to the integration of the sensing capabilities of CalRadio 1 within the 802.11 MAC code, while preserving compliance with the protocol.
Spectrum agility is related to the minimum duration of an event in order for CalRadio 1 to detect it. While CalRadio 1 is monitoring a certain portion of the spectrum, other bands are left unmonitored. The time required by CalRadio 1 to shift its carrier frequency and start monitoring those other bands can be considered as a lower bound for the maximum spectrum agility achievable by any application that wants to exploit CalRadio 1's scanning functionalities. The time spent on a certain band can then be defined by the application itself and only affects the reliability of the information retrieved on that band.
A carrier shifting operation on CalRadio 1 directly involves the RF stage registers, which only allow write operations. Since the registers cannot be read, it is not possible to directly measure the time taken by the RF transceiver to actuate the frequency shifting command. To overcome this problem, we set up an experiment in two stages. In the first stage, a packet was repeatedly transmitted on the same frequency. In order to avoid any source of randomness in the measurement, we modified the MAC code on the DSP to bypass all DLL and MAC procedures, such as the backoff mechanism, physical and virtual carrier sense (CS), retransmissions, rate adaptation algorithms, and so on. At the receiver side, we logged the timestamp of each packet arrival. In the second stage of the experiment, before each packet transmission, the sender switched the carrier frequency to a different channel and then back. In this way, we accounted for the additional time required by the radio to perform two frequency shifts. The distributions of the packet inter-arrivals for both experiments are plotted in Figure 3.
By calculating the difference between the mean values of the two distributions, we obtain the average time required for two carrier frequency switches. Halving this value, the average channel switching time turns out to be approximately 7 µs.
Note that, in our experiment, we set the frequency shift to 5 MHz. In the RF transceiver datasheet [9], the time for shifting from 2.400 GHz to 2.499 GHz is specified as 150 µs (typical value). Under the assumption that the time grows linearly with the frequency offset, our results are in good agreement with the values reported in the datasheet and yield a delay of approximately 1.4 µs per MHz of shift. We remark that this delay figure actually refers to the latency experienced at the DSP level; therefore, it accounts neither for the additional time required by the ARM to process a carrier shifting command nor for the additional time required to sample the channel and, if needed, post-process the acquired data.
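The agreement with the datasheet can be checked with a one-line extrapolation, under the same linearity assumption made above.

```c
/* Extrapolate the measured per-hop delay (7 us per 5 MHz hop) to an
   arbitrary frequency offset, assuming linear growth with the offset. */
double switch_time_us(double offset_mhz)
{
    const double us_per_mhz = 7.0 / 5.0;   /* = 1.4 us/MHz, measured */
    return us_per_mhz * offset_mhz;
}
```

For the 99 MHz span of the datasheet test (2.400 to 2.499 GHz), this yields about 138.6 µs, close to the 150 µs typical value.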
3. ULLA Framework
ULLA is an open framework developed within the European project ‘Generic Open Link-Layer API for Unified Media Access’ (GOLLUM) [3]. The major focus of GOLLUM was to remedy the situation where a separate programming interface exists for almost every wireless technology. The ULLA framework proposed in GOLLUM implements an operating system-independent link-layer API to support heterogeneous systems, by unifying the various methods for accessing different wired and, especially, wireless links.
ULLA enables better portability of applications between devices using different communication interfaces. ULLA implements three distinct methods to access the resources of a network device. First of all, a query mechanism can be used for single request-response transactions between the ULLA framework and the targeted network device. In addition, ULLA supports methods to set up asynchronous notifications that can be triggered periodically or when a predefined set of conditions occurs. Finally, a command API provides direct access to the network device resources. All these APIs can be accessed by means of Universal Query Language (UQL) statements to express either requests, commands or conditions for triggering certain actions (e.g., trigger a measurement update).
As sketched in Figure 4, the ULLA architecture has three major components, namely
the Link Users (LUs), namely any entity, layer or single application registering and accessing the ULLA core to gather PHY and MAC information from the ULLA registered network devices;
the ULLA core, which implements all the logic required to interface the upper network layers with the PHY and MAC layers of the physical devices;
the Link Layer Adapter (LLA), which is responsible for translating UQL statements into device specific commands and device messages into ULLA compatible information.
Each device that wants to interact with the ULLA core must first register and notify ULLA of its capabilities. In this way, ULLA can prevent Link Users from requesting information or issuing commands that are not supported.
The ULLA Core optionally implements database information storage for statistical purposes, which may for instance be used by a Cognitive Resource Manager (CRM) to perform long term optimizations based on historical data sets.
Table I shows the classes of object abstractions defined by ULLA to issue commands and retrieve information. Each of these abstract objects can be accessed by any Link User.
Table I. ULLA abstract object classes.
Abstraction of the radio device attributes and operations. Radio devices are represented as a set of radio links that are supported by the available Radio Access Technologies. The scanAvailableLinks operation is used to populate the available links within the corresponding device profile.
Represents a layer 2 link between the local radio device and another.
Represents a radio resource, which is used by one or more links.
4. CRABSS Architectural Description
The solution proposed in this paper is based on the integration of the CalRadio 1 platform with the ULLA framework. The first step to realize CRABSS was to enhance the CalRadio 1 MAC protocol with sensing capabilities while ensuring a minimal impact on the CalRadio 1 device efficiency, studied in Reference [5]. To ease the setting of CRABSS parameters and the visualization of the collected measurements, we also developed a Link User application with a Graphical User Interface (GUI) that translates the user's settings into ULLA commands and shows ULLA information in graphical form.
4.1. Providing CalRadio 1 with 802.11 MAC Protocol and Spectrum Sensing
The CalRadio 1 MAC layer software purchased along with the board consists of a basic set of functionalities that support transmission and reception procedures. On top of it, we designed and implemented the main communication functionalities of the 802.11b MAC layer. In order to check their compliance with the standard, we tested the communication of the developed DSP code with commercial boards (i.e., Atheros and Intel WiFi chipsets). With reference to Figure 2, any time a packet is transferred to the DSP by the ARM, a new backoff procedure is started by the DSP. During this time, the CCA binary indicator is periodically checked to determine whether the current 20 µs time slot is busy or not. The backoff countdown process is frozen during busy periods and resumed after the channel has remained idle for a sufficient amount of time, as dictated by the IEEE 802.11 specifications.
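The countdown just described can be sketched at slot granularity as follows; the CCA indicator is abstracted as a per-slot busy flag, and the deferral after a busy period is omitted for brevity.

```c
/* Run the backoff countdown over a sequence of 20 us slots.  The
   counter decreases in idle slots and is frozen in busy ones.
   Returns the number of slots elapsed before transmission, or -1 if
   the countdown does not finish within n_slots. */
int backoff_slots_until_tx(const int *slot_busy, int n_slots, int backoff)
{
    int t;
    for (t = 0; t < n_slots; t++) {
        if (!slot_busy[t])
            backoff--;          /* idle slot: count down       */
        if (backoff == 0)       /* busy slot: countdown frozen */
            return t + 1;
    }
    return -1;
}
```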
While operating according to the 802.11 MAC protocol, the software also maintains a list of counters that reflect the current state of the link and of the network. These procedures realize what we call the horizontal scanning mode. Upon request, this operational mode can be suspended to activate the vertical scanning mode. In this case, all the functionalities of the 802.11 protocol are temporarily disabled, and the platform starts collecting CCA samples from the PHY layer according to the selected frequency scanning pattern.
To support ULLA, we introduced a new set of input/output control (ioctl) primitives that map ULLA queries into device-specific requests. We exploited the same shared memory used to transfer MAC Packet Data Units (MPDUs) from the ARM processor to the DSP to forward ULLA queries and commands. Inside the DSP, each request is buffered and executed when the MAC enters the idle state (no packets need to be transmitted or received). The number of subsequent requests that can be sent to CalRadio 1 is limited by the fact that, once the ARM has written a new query on the shared memory, it cannot write any other request until the DSP has finished fetching the previous one. In case multiple requests are pushed onto the CalRadio 1 driver, a mutex condition implemented on the shared buffer skips those that cannot be transferred to the DSP, to avoid deadlocks that could freeze and possibly crash the kernel.
4.2. CRABSS Sensing Features in Vertical Mode
The PHY parameters exported by CalRadio 1 are the Average Channel Busy Time and the Energy Burstiness detected on a channel. The former quantifies the amount of interference present on a WiFi channel (a 20 MHz band) as the number of CCA readings that indicate the medium as busy, divided by the total number of collected samples. The board also provides access to the Energy Detection threshold used by the integrator circuit to generate the binary CCA signal.
The other indicator reports the average value and standard deviation of the power burstiness detected on a channel. This metric quantifies the average length of a communication or, taking the reciprocal value, the number of detected communications within a certain time frame. From these indicators, it is possible to estimate the probability of successfully accessing the medium for transmission. This access probability tends to decrease with the number of network entities competing for transmission and can be used to estimate the residual network capacity and, from that, the maximum goodput that a link can sustain on a certain channel.
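A minimal sketch of how these two indicators could be combined is given below; the idle-fraction estimator is our own illustration, not the exact formula used by CRABSS.

```c
/* Fraction of CCA samples that reported the channel as busy. */
double busy_fraction(int busy_samples, int total_samples)
{
    return (double)busy_samples / (double)total_samples;
}

/* Number of distinct bursts, given the mean burst length in samples. */
double burst_count(int busy_samples, double mean_burst_len)
{
    return busy_samples / mean_burst_len;
}

/* Crude access-probability estimate: the idle fraction of the channel. */
double access_probability(int busy_samples, int total_samples)
{
    return 1.0 - busy_fraction(busy_samples, total_samples);
}
```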
It is important to note that, while these indicators consider the energy sensed on a channel in a technology agnostic fashion, the way they are obtained is technology dependent. Specifically, CalRadio 1 is based on an 802.11b transceiver, whose RF reception filter has a bandwidth of 20 MHz, as shown in Figure 5. The standard specifies a frequency offset of 5 MHz for each WiFi channel in the 2.4–2.499 GHz ISM band. The RF transceiver, however, permits frequency shifts of 1 MHz (hence potentially violating the standard), but still with a reception filter bandwidth of 20 MHz. Having a frequency resolution of 20 MHz is, on the one hand, disadvantageous as we cannot tell which portions of these 20 MHz are actually being used for communication. On the other hand, this makes it possible to sweep the available frequencies in a shorter amount of time, hence having faster data refresh rates. This is important, for instance, in order to exploit the idle periods that may occur during standard 802.11 functioning to temporarily switch to vertical operation mode and collect useful information from the PHY layer. As previously described, the time for a frequency shift of 5 MHz is estimated as 7 µs. The minimum amount of time for CalRadio 1 to gather a reasonable amount of samples for a given channel and to complete the data transfer to the LLA has been estimated to be 10 ms, which makes the time required for frequency shifting operations negligible. To complete a whole scanning procedure of the 2.4 GHz ISM band without frequency overlap in the observation windows requires about 50 ms.
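The 50 ms figure follows from simple arithmetic, as sketched below; the roughly 100 MHz band width, the 10 ms dwell time and the 7 µs retune delay are the values given above.

```c
/* Time to sweep a band with non-overlapping observation windows:
   ceil(band/filter) windows, one dwell each, plus a retune in between. */
double sweep_time_ms(double band_mhz, double filter_mhz,
                     double dwell_ms, double retune_us)
{
    int windows = (int)((band_mhz + filter_mhz - 1.0) / filter_mhz);
    return windows * dwell_ms + (windows - 1) * retune_us / 1000.0;
}
```

With a 100 MHz band and a 20 MHz filter, five windows are needed and the sweep takes just over 50 ms; the retuning contribution is indeed negligible.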
Besides being technology dependent, the acquired information is also hardware dependent: each piece of hardware has its own limitations and advantages. As an example, the CalRadio 1 DSP provides access to 72K words of memory for program code and data. This means that virtually no space is left on the DSP for data gathering and storage. For this reason, we cannot store the whole trace of the detected energy over time, but can only keep track of the average value and standard deviation of this quantity.
Table II summarizes the main features of CRABSS in vertical mode. It is important to note that the proposed solution makes it possible to collect data on multiple selectable frequencies in a rather flexible manner. The collected measurements, such as Average Channel Busy Time and Energy Burstiness, can help a network entity to discover a new eligible channel to switch to when the link conditions experienced in the current channel degrade below the requirements. Once the new channel has been determined based on the raw information provided by CRABSS in vertical mode, a finer set of measurements (based, for instance, on the MAC layer indicators) can be exploited to have a more accurate description of the selected channel status and to monitor its behavior. These data are provided by CRABSS horizontal mode, as described next.
Table II. Sensing features of the CRABSS architecture in vertical mode.
CCA reading period of 0.2 µs.
2^16 CCA samples collected before reporting the total number of busy samples to the ULLA core.
CCA mean and standard deviation values recorded and exported to the ULLA core.
Reception filter bandwidth of 20 MHz.
Carrier frequency freely shiftable in 1 MHz steps within the 2.4–2.499 GHz ISM band.
5 MHz carrier frequency shift performed in approximately 7 µs.
Minimum dwell time of 10 ms for each channel.
4.3. CRABSS Sensing Features in Horizontal Mode
While the physical measurements can be collected on any 20 MHz-wide channel within the 2.4 GHz ISM band, the 802.11 MAC layer statistics are available in horizontal mode only and necessarily refer to standard 802.11 channels. MAC layer statistics are used to monitor link quality. When the current link conditions drop below the requirements, we can use the PHY indicators provided by CRABSS in vertical mode to select a different channel.
In Table III we report some of the MAC indicators exported by CRABSS in horizontal mode. Note that the indicators are collected for any link detected by CalRadio 1, even when the device is not directly involved in the communication. Furthermore, counters are updated even for packets with an erroneous payload, provided that the PLCP and MAC headers are correct.
Table III. CRABSS exported statistics in horizontal mode.
Number of advertised 802.11 APs (through beaconing)
Number of sensed 802.11 stations
Number of tx/rx bytes for each [STA,AP] pair
Ratio between the number of sensed packets and the number of corresponding Acks
5. Experimental Results
In this section, we report and discuss some preliminary results obtained using CRABSS, with the aim of illustrating the potential and limitations of the proposed solution.
5.1. Channel Sensing and Technology Inference
One of the possibilities enabled by CRABSS is to infer which wireless technologies are active in a certain region of the 2.4 GHz ISM spectrum from the spectrogram provided by CRABSS vertical scanning mode.
A snapshot of the spectrogram panel shown by the GUI Link User application is reported in Figure 6, where the major commercial technologies in use in the 2.4 GHz ISM band are emphasized. The graph shows the energy detected on each of the 13 available 802.11 channels over time. Darker color pixels show a time/frequency slot where no interference was detected (a free channel). Lighter color pixels denote an interfered time/frequency slot. In the picture, it is possible to observe the presence of an 802.11 network on channel 13 (lower part of the graph), a traffic burst generated by Tmote-Sky [11] (ZigBee) nodes in the rightmost part of the panel and the frequency hopping interference due to Bluetooth traffic for three distinct file transfers.
The data shown in Figure 6 were obtained by setting a scanning period of 100 ms for each channel. For each period, the average value of the CCA busy readings is returned. If we increase the time resolution of the spectrogram, it is possible to isolate two distinct bursts for each Bluetooth transfer. The first is generated by the paging operations (handshake and synchronization between master and slave nodes), while the second corresponds to the data transfer itself.
Technology identification algorithms can analyze the interference pattern, signal bandwidth, and frequency hopping behavior over time to detect the presence of different overlapping technologies. A cognitive engine can then coordinate nodes and optimize the available resources with respect to the ongoing traffic flows.
We observe that the interference is measured in terms of the fraction of CCA samples that report the channel as busy; we cannot actually measure the interference power level. However, the CalRadio 1 platform makes it possible to set the power threshold used by the CCA circuitry to mark the channel as idle or busy. In this way, we can adjust the sensitivity of CRABSS as needed, for instance to reveal only major interference sources.
CRABSS data can be collected either on a very short time scale or for a longer time. While mid term optimization can be performed by knowing the state of the available channels on a short time scale, prediction and coordination become possible on a statistical basis when longer time-range datasets are collected and analyzed.
As shown by the graph in Figure 6, common technologies operating in the 2.4 GHz ISM band can be easily recognized by human inspection. However, the automatic inference of such technologies through pattern matching techniques is more challenging, in particular in case of co-presence of multiple technologies, which may shade each other's spectrum footprint. A further remark concerns the relationship between transmitter distance and the bandwidth occupancy reported by CRABSS. When a node is transmitting nearby, the CalRadio 1 device may detect a wider interfered band. As the distance from CalRadio 1 increases, we observe a narrower interfered band, since CalRadio 1's reception filter roll-off regions no longer capture enough power to indicate the channel as busy.
A key indicator of the fairness of the communication between different wireless interferers is the burstiness of the communication, which accounts for the number of channel busy periods observed within a scanning period on a given channel. By knowing this value and associating each burst to a transmitter of known technology, we can derive the probability of successful access to the channel. CRABSS provides this indicator in terms of the mean value and standard deviation of the length of the bursts sensed within a scanning period. This length is measured in units of 20 µs (equal to the 802.11 time slot unit). In order for this indicator to correctly depict the interference burstiness in a certain channel, the energy detection threshold must be coherent with the interference level of the environment. Moreover, a hysteresis model must be used to prevent weak bursts from being misinterpreted as a series of multiple short bursts.
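The effect of the hysteresis can be illustrated with the following burst counter, in which a burst is considered over only after a given number of consecutive idle samples; the parameterization is our own, not the one used by CRABSS.

```c
/* Count bursts in a sequence of CCA samples (1 = busy, 0 = idle).
   A burst ends only after 'hysteresis' consecutive idle samples, so
   short fades inside a weak burst do not split it in two. */
int count_bursts(const int *busy, int n, int hysteresis)
{
    int bursts = 0, in_burst = 0, idle_run = 0, i;
    for (i = 0; i < n; i++) {
        if (busy[i]) {
            if (!in_burst) { bursts++; in_burst = 1; }
            idle_run = 0;
        } else if (in_burst && ++idle_run >= hysteresis) {
            in_burst = 0;            /* burst confirmed as finished */
        }
    }
    return bursts;
}
```

With hysteresis of two samples, the trace 1,1,0,1,1,0,0,0,1 counts as two bursts; without hysteresis, the single idle sample splits the first burst and three bursts are counted.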
Figure 7 provides a comparison between a 3-dimensional trace of a WiFi beaconing activity on channel 13 (2.472 GHz) and the trace of the interference introduced by a Bluetooth file transfer overlapping with the WiFi signal. The presence of the plateau in the right-hand side graph confirms that CRABSS is actually capable of revealing Bluetooth signals despite the rather fast frequency hopping pattern of this technology. Furthermore, the WiFi beaconing activity can still be detected regardless of the masking effect of the Bluetooth trace and the rather low traffic generated by beacons. Hence, in this case CRABSS preserves the frequency fingerprint of the technologies that are active in the monitored area, potentially enabling their recognition through the analysis of the interference spectrogram. This information may enhance the context awareness of a cognitive engine, thus enabling better management of the transmission resources. The technology inference process may be automated by using different classification techniques, such as Discriminant Functions, Probabilistic Discriminative Models, and so on [12]. The investigation of these techniques is currently in progress.
5.2. Example of Channel Switching Optimization
Channel switching is probably the first and the simplest optimization that can be performed based on the channel occupancy information provided by CRABSS. In fact, it is well known that switching a data link to a less congested/interfered channel can significantly improve the quality of the link, almost irrespective of the wireless technology adopted. The spectrum sniffing capabilities of CRABSS can be easily used to drive this primary optimization procedure. As a proof-of-concept, we developed a simple experimental setup, where we tracked the performance achieved by a tagged data link in the presence of an IEEE 802.11 interferer, in order to assess the benefit that can be obtained by applying a basic channel switching policy, driven by the information exported by CRABSS.
The experiment setup is sketched in Figure 8. The tagged data link consists of a source node that transmits Internet Control Message Protocol (ICMP) packets to a dummy receiver, by using an ad hoc WiFi connection. To mitigate the impact of the 802.11 MAC protocol, we disabled retransmissions and RTS/CTS handshakes. Moreover, we set up a fake ARP entry as destination of the ICMP packets, hence cutting off any additional source of randomness introduced by MAC acknowledgments and ICMP replies. This restricts the randomness in the timings of the tagged connection to the sole delay introduced by the countdown of the first backoff stage after each packet transmission. Moreover, as we repeatedly send ICMP packets without waiting for a reply, we avoid any retransmission delay that may originate in case of packet reception errors, due to collisions with interfering transmissions or other sources of channel errors. Finally, we disabled the rate-adaptation algorithm in the tagged link, fixing the transmission rate to 54 Mbps. We observe that, with this setting, the throughput of the tagged station, defined as the mean number of ICMP packets that the source is capable of sending on-air in the unit time, only depends on the channel occupancy level, i.e., on the presence of other transmissions in the channel. Hence, from the perspective of the tagged link, the other transmissions in the same radio channel are seen as interference. In our experiment, apart from the non-controllable environmental interference produced by nearby networks, which was rather negligible, we created an interfering link by establishing a long file transfer between a WiFi Access Point (AP) and a station node (STA). To recreate a typical source of interference, the PHY transmission rate on both the AP and the STA was determined by the Sample Rate 13 rate adaptation algorithm, which is common in commercial cards.
In the experiments, we use the busy channel information exported by CRABSS as a raw feed to a dummy algorithm that commands frequency shifts on the tagged link. The algorithm monitors the radio channel currently in use by the tagged station and, when alien interference is detected, it checks whether a different interference-free band is available. If a non-interfered or at least less interfered portion of the spectrum exists, the algorithm switches the tagged link to that channel.
Note that, in order for the algorithm to work properly, the traffic generated by the tagged data flow must be subtracted from the total amount of interference sensed on the channel. This is achieved by jointly using the information provided by the vertical and horizontal modes in CRABSS: the vertical mode gives the picture of the interference levels in all the available bands; the horizontal mode, in turn, makes it possible to single out the traffic generated by the tagged link, which can then be removed from the total interference measured in that channel. In the experimental setup, the tagged transmitter and the CRABSS node were both connected to the same host running the optimization program. This program commands CRABSS through ULLA and receives periodic channel-status reports. When the interference level on the channel used by the tagged link increases above a given threshold, the optimization algorithm selects a new channel, according to the interference level reported by CRABSS, and issues a channel switching command to the tagged connection, still using ULLA.
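A minimal sketch of the own-traffic subtraction and of the dummy switching policy might look as follows; we assume here that occupancy is reported as a busy fraction per channel, and the channel numbers, fractions, and threshold are hypothetical (the real program issues its commands through ULLA).

```python
def alien_interference(total_by_channel, own_channel, own_traffic):
    """Subtract the tagged link's own traffic (singled out by the
    horizontal mode) from the vertical-mode occupancy of its channel."""
    net = dict(total_by_channel)
    net[own_channel] = max(0.0, net[own_channel] - own_traffic)
    return net

def pick_channel(total_by_channel, own_channel, own_traffic, threshold):
    """Return the channel the tagged link should use: keep the current
    one while alien interference stays below the threshold, otherwise
    move to the least interfered channel reported by the scanner."""
    net = alien_interference(total_by_channel, own_channel, own_traffic)
    if net[own_channel] <= threshold:
        return own_channel
    return min(net, key=net.get)
```

For example, with occupancy `{1: 0.7, 6: 0.2, 11: 0.05}` and the tagged link generating 0.6 of the busy fraction on channel 1, the alien interference on channel 1 is only 0.1 and no switch is commanded; if the tagged link accounts for only 0.2, the residual 0.5 exceeds the threshold and channel 11 is selected.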
The results of the experiments are shown in Figure 9, where we compare the transmission throughput (S) achieved by the tagged transmission when coexisting with the interfering link (white bars) and after switching to a less interfered channel (black bars), when varying the length of the ICMP packets used in the tagged link. We can appreciate how channel switching generally yields a significant performance gain, in particular when the packet size used in the tagged link is large, whereas, as the packet size decreases, the throughput gain obtained by switching to a non-interfered channel becomes less and less significant. The reason is that, with short packets, the throughput of the tagged link is limited by the protocol overhead and the finite rate at which the operating system of the source node can handle packet transmissions. In such a situation, the contribution of the interference to the efficiency loss becomes less relevant with respect to the inefficiency generated by the operating system's inherent latencies. This observation emphasizes the importance of a smart channel switching algorithm that can correctly interpret the information provided by a spectrum sensing device.
5.3. Channel Switching Criteria
In the experiment described above, we proved the effectiveness of a long-term optimization in a scenario where the interference was confined within a certain band for a sufficiently long time. In more dynamic scenarios, for channel switching to be advantageous, the performance loss caused by the experienced interference must be larger than the coordination cost of switching. Below we provide a rule-of-thumb criterion that can be used to assess the effectiveness of channel switching. We assume that the link is saturated. Let τobs be the observation time required to detect an interfered channel and τneg the time for the two nodes to negotiate a new channel. We consider a worst-case scenario in which no useful data can be transferred during the negotiation, nor during the time τsw required to switch between channels and to re-establish the link between the two nodes. If R denotes the saturation rate, Ri the interfered rate (that is, the transmission rate in the presence of interference) and Ti the duration of the interference, switching turns out to be effective when the following relation is satisfied:

Ri Ti < Ri τobs + R (Ti − τobs − τneg − τsw).   (1)

The left-hand side is the amount of data delivered over Ti without switching; the right-hand side is the amount delivered when the link transmits at the interfered rate Ri during the observation time, is silent while negotiating and switching, and then transmits at the full rate R on the new channel.
By inverting Equation (1), we can specify the minimum Ti for which it is convenient to switch channels as

Ti > τobs + R (τneg + τsw) / (R − Ri).   (2)
We observe that the negotiation and switching times become more and more significant as the interfered rate Ri approaches R. Conversely, if the channel is heavily interfered, i.e., Ri ≪ R, then channel switching should be triggered as soon as the duration of the interference burst is expected to exceed the negotiation plus switching time.
5.3.1. A Case Study
As a proof-of-concept experiment, we consider a transmission between two Tmote-Sky wireless sensors operating in the 2.4 GHz ISM band with the 802.15.4 technology, whose physical rate in this band is RPHY = 250 Kbit/s. For this purpose, we estimated the time for a frequency-shifting operation between adjacent channels to be τsw = 1.5 ms.
The experiment was conducted with the following setup:
- carrier sensing mechanism activated;
- no acknowledgement/retransmission mechanism.
In this scenario, the two nodes could transmit bursts of 10,000 packets of 20 bytes each with an effective rate RTX = 20 Kbit/s, corresponding to a transmission efficiency η = RTX/RPHY.
We repeated the experiment with two links transmitting with the same setup and measured the new (interfered) rate Ri = 18.6 Kbit/s on both links. τobs can be defined as the time required by CRABSS to sweep the whole 2.4 GHz ISM band, which was estimated to be 50 ms. By applying these results in Equation (2), we obtain

Ti > τobs + R (τneg + τsw) / (R − Ri) ≈ 50 ms + 14.3 (τneg + 1.5 ms).
Assuming a three-way negotiation procedure, which requires the exchange of three control packets of 20 bytes each, the negotiation time can be estimated as τneg = 24 ms. Under these conditions, channel switching makes sense when the interference persists on the channel for longer than approximately half a second.
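As a sanity check, the case-study figures can be plugged into the criterion; we assume here that Equation (2) takes the form Ti > τobs + R (τneg + τsw) / (R − Ri).

```python
def min_interference_duration(r_sat, r_int, t_obs, t_neg, t_sw):
    """Minimum interference duration for which switching pays off,
    assuming the criterion T_i > t_obs + r_sat*(t_neg + t_sw)/(r_sat - r_int).
    Rates in bit/s, times in seconds."""
    return t_obs + r_sat * (t_neg + t_sw) / (r_sat - r_int)

# Case-study figures: R = 20 Kbit/s, Ri = 18.6 Kbit/s,
# tau_obs = 50 ms, tau_neg = 24 ms, tau_sw = 1.5 ms
t_min = min_interference_duration(20e3, 18.6e3, 0.050, 0.024, 0.0015)
print(round(t_min, 2))  # ~0.41 s, i.e. roughly half a second
```

Note how the small rate gap R − Ri = 1.4 Kbit/s amplifies the 25.5 ms of negotiation and switching by a factor R/(R − Ri) ≈ 14.3, which is what pushes the break-even duration toward half a second.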
6. Conclusions
In this work, we presented CRABSS, a new framework for interference detection and technology inference. The solution empowers the IEEE 802.11 MAC protocol with sensing capabilities for multi-technology interference detection and avoidance, while preserving its standard functionalities. The framework takes advantage of a modular cross-layer approach to transparently export MAC and PHY statistics to upper layers' prospective users (e.g., a CRM engine). The solution was tested by generating interference with the most common commercial technologies in the 2.4 GHz ISM band and proved to be effective in detecting and possibly identifying them.
Our current and future developments are focused on creating an automated self-training technology inference algorithm based on energy pattern recognition and machine learning techniques. Based on the information exported by CRABSS, this algorithm tries to recognize the presence of conflicts among multiple overlapping technologies in the 2.4 GHz ISM band, partly based on technology-specific interference fingerprints, and partly through an initial training phase, possibly helped by some static knowledge of the given environment. We plan to characterize the throughput and energy performance of a system in which the operating frequency of each node is selected based on this solution. We also plan to test the algorithm in a real-world wireless mesh network with remotely controlled nodes, in order to verify the effectiveness of this approach in practical scenarios.
Acknowledgments
This work was partially supported by the European Commission through the ARAGORN project (FP7 ICT-216856) and by the US Army Research Office through Grant No. W911NF-09-1-0456.
Riccardo Manfrin received his master's degree in Telecommunication Engineering in December 2007 from the University of Padova. While completing the degree, he worked at NEC Labs, Germany. He is now a research fellow with the Consorzio Ferrara Ricerche (CFR), Ferrara, Italy, and collaborates with Patavina Technologies, a spin-off of the University of Padova operating in the ICT field, and with the University of Padova. He has been involved in the ARAGORN European project on reconfigurable cognitive networks. His research focuses on the MAC and PHY layers of wireless networks, with a deep background in 802.11 and 802.15.4 networks. In particular, he has been working on the development of protocols for wireless networks and wireless sensor networks based on embedded devices.
Andrea Zanella is an assistant professor at the Department of Information Engineering (DEI), University of Padova, Italy. He received his Laurea degree in Computer Engineering in 1998 from the same university, and his PhD degree in Electronic and Telecommunication Engineering in 2002. Before that, he was a visiting scholar at the Department of Computer Science of the University of California, Los Angeles (UCLA), where he worked with Prof. Mario Gerla on wireless networks and wireless access to the Internet. Andrea Zanella's major research interest is in the field of protocol design and performance evaluation of WPANs, MANETs, VANETs and WSNs. He has been actively involved in a number of European and national research projects, and he serves as a reviewer for several IEEE journals and international conferences in the ICT area.
Michele Zorzi was born in Venice, Italy, in 1966. He received his Laurea degree and Ph.D. in electrical engineering from the University of Padova, Italy, in 1990 and 1994, respectively. During academic year 1992/93, he was on leave at the University of California, San Diego (UCSD) attending graduate courses and doing research on multiple access in mobile radio networks. In 1993, he joined the faculty of the Dipartimento di Elettronica e Informazione, Politecnico di Milano, Italy. After spending three years with the Center for Wireless Communications at UCSD, in 1998 he joined the School of Engineering of the University of Ferrara, Italy, and in 2003 joined the Department of Information Engineering of the University of Padova, Italy, where he is currently a Professor. His present research interests include performance evaluation in mobile communications systems, random access in mobile radio networks, ad hoc and sensor networks, energy constrained communications protocols, and underwater communications and networking. He was Editor-in-Chief of IEEE Wireless Communications from 2003 to 2005, is currently Editor-in-Chief of IEEE Transactions on Communications, and serves on the Editorial Board of the Wiley Journal of Wireless Communications and Mobile Computing. He was also guest editor for special issues in IEEE Personal Communications and IEEE Journal on Selected Areas in Communications. He is a Fellow of the IEEE and a Member-at-Large of the Board of Governors of the IEEE Communications Society.