SWOON: Simultaneous Wireless Organic Optimization within Nodewatcher – Final steps.

This blog post concludes the work I did during the Google Summer of Code 2016.

Spending the summer on nodewatcher was a pleasantly laborious exercise. Although my proposal concerned the development of a new algorithm, I soon realized I also had to prepare various other components for the algorithm to work, so I started by sketching a short plan of everything I would have to integrate. To remind readers, SWOON is an algorithm for the efficient allocation of wireless frequencies in densely populated areas where many wireless access points interfere with each other.

Before any work could begin, I had to install a local version of nodewatcher: I did not want to interfere with wlanslovenija’s mature network, and I needed a small network of my own to run tests on. To find such a network, I collaborated with a non-profit organization based in Berkeley, California. They provided me with access to a building outfitted with more than 10 wireless access points. The access points in this testbed are extremely diverse: they come from various manufacturers, and not all of them broadcast on the 5 GHz band. This let me work on issues unique to community networks, where software has to be built from the ground up to be extremely modular and has to support a diverse range of equipment, since there is no central planning authority.

I started by writing the data collection module, which I have reported on in my previous blog posts. Once I was able to work with the data, I wrote a separate algorithm that detects rogue nodes in a network. This is especially useful for network maintainers, who now have information on which nodes are most harmful to their network and can significantly reduce interference by working together. Algorithms at this scale have to be extremely efficient: I derived a complexity bound of O(E * log(V)), where E is the number of edges and V is the number of vertices in the graph. Since this is almost linear, the algorithm scales all the way to wlanslovenija’s network and beyond. To achieve such scalability and efficiency, I had to dig into state-of-the-art graph theory research and use a more sophisticated data structure (a disjoint-set, also known as union-find) to quickly organize the data according to our needs.
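The rogue-node criterion itself is beyond the scope of this post, but the disjoint-set structure is simple enough to sketch. Below is a minimal Python sketch with path compression and union by rank, plus an illustrative edge-processing loop; it is not the actual nodewatcher code, and the log factor in the O(E * log(V)) bound comes from processing the edges in sorted order.

    class DisjointSet:
        """Minimal disjoint-set (union-find) with path compression and union by rank."""

        def __init__(self, elements):
            self.parent = {e: e for e in elements}
            self.rank = {e: 0 for e in elements}

        def find(self, x):
            # Path compression: point every visited element directly at the root.
            root = x
            while self.parent[root] != root:
                root = self.parent[root]
            while self.parent[x] != root:
                self.parent[x], x = root, self.parent[x]
            return root

        def union(self, a, b):
            ra, rb = self.find(a), self.find(b)
            if ra == rb:
                return False
            # Union by rank keeps the trees shallow.
            if self.rank[ra] < self.rank[rb]:
                ra, rb = rb, ra
            self.parent[rb] = ra
            if self.rank[ra] == self.rank[rb]:
                self.rank[ra] += 1
            return True

    def cluster_by_interference(nodes, edges):
        # `edges` is an iterable of (weight, u, v) tuples, where the weight is some
        # measure of how strongly u and v interfere (illustrative names, not
        # nodewatcher's actual data model). Sorting dominates: O(E log E) = O(E log V).
        dsu = DisjointSet(nodes)
        for _, u, v in sorted(edges, reverse=True):
            dsu.union(u, v)
        return dsu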

I then moved on to the main algorithm, which took weeks to even rigorously define, let alone code up. I first considered implementing the Beigel-Eppstein coloring algorithm, which is known to run in roughly O(1.3^n) time, but its complexity rendered it impractical: it relies on 19 different subcases which are iteratively applied to simplify the constraint satisfaction problem (CSP) instance. Instead, I fell back on a custom implementation of a state-of-the-art greedy approach, adapted from the source code of NetworkX, a leading open-source graph manipulation package, and devised a custom way of ordering the nodes in the graph to correctly prioritize wireless access points. The algorithm starts by looking at all of a node’s ‘neighbors’ (the other nodes its radio detects) and computes how ‘busy’ each frequency band is; it then assigns channels so as to minimize the joint interference.
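To make the greedy idea concrete, here is a minimal Python sketch. It is not the actual nodewatcher implementation (which adapts NetworkX’s greedy coloring code); the node ordering and the ‘busyness’ computation are deliberately simplified, and all names are illustrative.

    def assign_channels(graph, channels):
        # `graph` maps each node to a dict of {neighbor: interference_weight};
        # `channels` is the list of available channels, e.g. [1, 6, 11].
        # Handle the most constrained nodes (strongest total interference) first.
        order = sorted(graph, key=lambda n: sum(graph[n].values()), reverse=True)

        assignment = {}
        for node in order:
            # How 'busy' does each channel look from this node's perspective,
            # counting only neighbors that already have a channel assigned?
            busyness = {c: 0.0 for c in channels}
            for neighbor, weight in graph[node].items():
                channel = assignment.get(neighbor)
                if channel is not None:
                    busyness[channel] += weight
            # Pick the least busy channel for this node.
            assignment[node] = min(channels, key=lambda c: busyness[c])
        return assignment

    # Toy example: three mutually interfering access points on the 2.4 GHz band.
    example = {
        "ap1": {"ap2": 1.0, "ap3": 0.5},
        "ap2": {"ap1": 1.0, "ap3": 0.8},
        "ap3": {"ap1": 0.5, "ap2": 0.8},
    }
    print(assign_channels(example, [1, 6, 11]))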

We observed an interesting property when running this algorithm on our wireless testbed. Most nodes were moved to frequencies with less interference, but a few actually got worse. At first we thought it was a bug, but we eventually realized that some nodes must pick up more interference so that others can remain relatively noise-free. This is exactly in line with our goal, which is to minimize the total interference across all nodes, not the interference at every single node.

Next steps

One major problem with the algorithm is how unreliable the ‘signal’ data is. We are thinking about a better metric for link quality than raw signal strength; one candidate is the number of packet collisions. The modular design allows us to swap the metric painlessly, without having to understand the entire algorithm, as sketched below.
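As a rough illustration of what such a swap could look like, here is a hypothetical sketch; nodewatcher’s real interfaces differ, and the function and field names below are made up.

    def signal_strength_metric(link):
        # Weight a link by reported signal strength; `signal_dbm` is a
        # hypothetical field. Stronger signal -> larger interference weight.
        return max(0.0, 100.0 + link["signal_dbm"])

    def packet_collision_metric(link):
        # Alternative: weight a link by the number of observed packet collisions.
        return float(link.get("collisions", 0))

    def build_interference_graph(links, metric=signal_strength_metric):
        # Build {node: {neighbor: weight}} using whichever metric is plugged in;
        # the rest of the algorithm never needs to know which one it was.
        graph = {}
        for link in links:
            graph.setdefault(link["src"], {})[link["dst"]] = metric(link)
        return graph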

We also noticed that OpenWrt’s iwinfo doesn’t provide information about the channel widths of access points. Adding this data would allow more accurate calculations instead of assuming that all channels are 20 MHz wide.

Finally, nodewatcher doesn’t yet have support for issuing node-specific warnings. I’d love to use such a warning to let a maintainer know that their node needs to change the channel.

Despite these open issues, I am confident this algorithm will enhance our users’ experience.

All my contributions can be seen at this link:
https://github.com/wlanslovenija/nodewatcher/commits/development?author=CdavM

If you’re interested in the timeline of the development, look at the following pull requests on nodewatcher’s GitHub:
Bug fixes related to a local installation of nodewatcher:
https://github.com/wlanslovenija/nodewatcher/pull/18
https://github.com/wlanslovenija/nodewatcher/pull/20
Data collection module:
https://github.com/wlanslovenija/nodewatcher/pull/22
Command that users can use to export data from the data collection module:
https://github.com/wlanslovenija/nodewatcher/pull/23
Improving the data collection module:
https://github.com/wlanslovenija/nodewatcher/pull/24
Algorithm that detects rogue nodes in one’s network:
https://github.com/wlanslovenija/nodewatcher/pull/25
Additional options for user-friendly data export command:
https://github.com/wlanslovenija/nodewatcher/pull/26
Testing framework for the rogue node algorithm + improvements:
https://github.com/wlanslovenija/nodewatcher/pull/27
Channel allocation algorithm implementation (not merged yet):
https://github.com/wlanslovenija/nodewatcher/pull/28
Updated data in nodewatcher/core:
https://github.com/wlanslovenija/nodewatcher/pull/29

SWOON: Simultaneous Wireless Organic Optimization within Nodewatcher – in the middle of the road.

Hi everyone,

This is my first update on SWOON, an algorithm to automatically allocate channels to nodes in nodewatcher. Not a lot of time has passed since I started working, but the wlanslovenija community has been immensely helpful in getting me started. Until now, I had only worked on my own coding projects, where I got to be the main architect behind the codebase: I decided where functions were defined, I decided on the coding style, I did the code reviews and I was the one to approve branch merges (of which there were none, because I wrote all the code 🙂 ). Now GSoC has thrown me into the middle of a well-established open source project where you have to get accustomed to the existing coding practices, and that was more difficult than I expected. The people from wlanslovenija had to be really patient with me (and I think they’ll have to keep that up for a while 🙂 ), but I am slowly getting the hang of it.

My overarching plan for the algorithm is to first read the data, then process it, and finally issue a warning to those who maintain nodes with inefficient channel allocations. My work on reading the data from nodes is almost done (almost, because an amazing new feature was suggested only yesterday!). On a tangent, I think this shows another immense benefit of community-oriented projects: everyone is free to suggest additional features that make the process more efficient.

To get data about neighboring nodes, a node has to temporarily turn its wireless radios off. Because of that, nodewatcher-agent only performs a survey of neighbors every two hours, yet we collect monitoring data much more regularly. So how do we deal with this? My current implementation simply stores the data into the database every hour and does not take the age of the data into account. I will now work on comparing the current datapoint with the latest stored one and only inserting it into the datastream if the data has changed, as sketched below. This will reduce the space we’re using and could even be expanded for use by other modules. It’s great to know that you can contribute to any aspect of the project: Mitar, a senior developer of nodewatcher, said that they built all of it themselves, so we can rewrite anything as well.
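To illustrate the change-detection idea, here is a minimal sketch; the datastream API shown (`latest` and `insert`) is hypothetical and only meant to convey the approach.

    def maybe_insert_survey(datastream, node_id, new_survey):
        # Insert a neighbor survey only if it differs from the last stored one.
        # `datastream.latest` and `datastream.insert` are assumed methods, not
        # nodewatcher's actual API.
        latest = datastream.latest(node_id)
        if latest is not None and latest == new_survey:
            # The radio survey only changes every couple of hours, so most
            # hourly collection runs can skip the insert and save space.
            return False
        datastream.insert(node_id, new_survey)
        return True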

I also worked on a parallel implementation of the mechanism, which reads the data from old dumps. This gave me insight into the architecture and made writing the actual implementation much easier.

My next steps are to finish the data collection part and create a module to analyze this data. This is exciting because there’s no documentation about writing such modules, so I’ll get to be the first one to do it! I can only hope I’ll document it well enough to make the process easier for those who follow as new members of the community. The GitHub PR can be found here.

SWOON: Simultaneous Wireless Organic Optimization within Nodewatcher

Hi everyone,

This summer I will be contributing to nodewatcher, one of the Freifunk projects, via Google Summer of Code, and I want to keep you updated on my progress as well as exchange thoughts about my ideas.

First of all, nodewatcher is an open-source, modular, community-oriented platform used for network planning, node deployment, node monitoring and maintenance. It was initially developed primarily for the wlan slovenija project, and with 1336 nodes it is really successful and a great example for community networks. As nodewatcher gets deployed elsewhere with even more nodes, it is natural to ask whether we can be smarter about allocating spectrum to our wireless nodes. These nodes are mostly inexpensive wireless routers, but it is natural to extend the term to dedicated wireless access points (e.g. a UniFi AP).

The theoretical foundation for this problem is fascinating by itself: each node has a different amount of noise in each channel (the 2.4 GHz band allows 3 non-overlapping channels, each 20 MHz wide), and each node wants to maximize its SNR (signal-to-noise ratio). I will call this the greedy approach; it is already used in enterprise-level devices. However, in an urban setting, nodes are close enough to each other for their signal to act as noise to other nodes, so the greedy approach is no longer optimal: it bears a high price of anarchy. Instead, our goal is to maximize the sum of channel capacities (under a power constraint). I will have to devise an algorithm to solve this problem, and it does not look trivial, since the number of combinations increases exponentially with the number of nodes in the system. Even with only 10 nodes, we have over 59,000 possible allocations on the 2.4 GHz band and over 95 trillion on the 5 GHz band.
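These counts come from simple counting: with c available channels and n nodes there are c^n possible allocations. The quick check below assumes the 3 non-overlapping 2.4 GHz channels mentioned above and roughly 25 usable 20 MHz channels on the 5 GHz band (the exact 5 GHz count depends on the regulatory domain).

    nodes = 10
    print(3 ** nodes)   # 59049 allocations on the 2.4 GHz band
    print(25 ** nodes)  # 95367431640625, roughly 95 trillion on the 5 GHz band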

Traditional networking literature tackles this optimization problem with Lagrange multipliers. An alternative is to look at approximate graph coloring schemes and compute chromatic numbers. I hope to experiment across various settings and approaches.
Over the course of the project, I hope to experiment with a real network consisting of at least 10 nodes and measure the improvements. One exciting thing about real-life experiments is that nodewatcher has mostly been used inside wlan slovenija’s network, and I get to run it independently! This will probably allow me to fix some bugs along the way and contribute to nodewatcher in this respect as well.

The algorithm will initially be developed as a nodewatcher module, but I hope to eventually port it to OpenWrt (possibly after the summer ends). The main difficulty is that nodewatcher can act as a central planner, whereas the OpenWrt scenario requires negotiation among the nodes themselves, so it is harder to convince a node to decrease its TX power to benefit other users. But imagine a network where nodes can communicate and reach a socially optimal spectrum allocation! A glorious future awaits us.