GSoC – 2021 OpenWRT PPA final evaluation

Hey all, hope you are all doing well. With a heavy heart I would like to conclude this beautiful journey of GSoC 2021 with one last blog.

First I would like to start by thanking Freifunk, Google and my mentor Mr. Benjamin Henrion for guiding me to become a better developer during the course of this program.

The link to my work can be accessed from here

Work done till now

As stated in the first blog, the main project revolved around making a working OpenWRT SDK available to users with an abstraction layer for their easy use.

The first order of business was to present the user with a web interface where they just have to provide their repository link, which can then be built using the OpenWRT SDK, with the final packages and targets hosted on a server.

The website looks like this:

The final builds are hosted on an httpd server running in the background.

Challenges encountered

During the development of this project, the issue I faced was building the user's packages in the Docker container containing the SDK. The default config file also had to be implemented to override the menuconfig settings.

To overcome this, I created a Docker image containing the SDK which automatically builds the user packages. You can find it here.

The second challenge was to run the build container in the background without disturbing the web-interface functionality.

For this I used Python's multiprocessing module and subprocess's Popen respectively. Together they create a separate process for the build step and run it in parallel alongside the web interface, thereby achieving concurrency.
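To make the idea concrete, here is a minimal sketch of that pattern (not the actual portal code; the Docker image name, paths and environment variable are placeholders): the web handler only spawns a worker process, and the worker drives the SDK container through Popen.

import multiprocessing
import subprocess

def run_build(repo_url, out_dir):
    # Launch the SDK container; wait() blocks only this worker process,
    # never the web interface that spawned it.
    proc = subprocess.Popen(
        ["docker", "run", "--rm",
         "-e", f"REPO_URL={repo_url}",      # repository passed into the container (placeholder variable)
         "-v", f"{out_dir}:/builder/bin",   # packages land where the httpd server serves them (placeholder path)
         "openwrt-sdk-builder"],            # placeholder image name
        stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    with open(f"{out_dir}/build.log", "wb") as log:
        for line in proc.stdout:
            log.write(line)
    return proc.wait()

def start_build(repo_url, out_dir="/srv/builds/latest"):
    # Called from the web handler: fork a worker and return immediately,
    # so the web interface stays responsive while the build runs.
    worker = multiprocessing.Process(target=run_build, args=(repo_url, out_dir), daemon=True)
    worker.start()
    return worker.pid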

Future Plans

  1. Improve the web portal with better UI/UX.
  2. Find more ways to host these packages.
  3. Improve the build process by running builds in a Kubernetes cluster.
  4. Add a provision for multiple simultaneous builds.
  5. Possibly include a CI/CD pipeline for the same.

Conclusion

During this program I learnt many different things ranging from tools to technologies.

There is still a lot to be achieved for the project, which is why I'll keep working on it. Contributions, suggestions and feedback are always welcome.

In the end, this is just the beginning of another journey. Good day and farewell, everyone.


LibreMesh Pirania UI – A final overview

Hello Freifunk community!

Working on this project has been amazing: not only have I been able to work hand in hand with great developers, but I have also learned a lot of things related to software development that will probably be useful for me in the future. The main idea of this project is to improve the administration interface of the Pirania captive portal, a plugin implemented in LibreMesh that is used in different Community Networks.

During these weeks I focused on three aspects: design, testing and implementation. To create the sketches that would let me detail our ideas, we chose a tool called Figma, in which I could lay out the initial designs that we plan to implement for this project. Below you can see some of the screens I created:

List of vouchers screen

For this first screen we thought of functionalities such as searching for vouchers, listing vouchers and accessing another screen to create vouchers. When we choose the option to create vouchers, we see a screen like the following one:

Voucher creation screen

In this screen you can find some functionalities such as:

– Description field, to identify who the voucher is for or what it is used for.

– Choice of the duration time since the voucher activation.

– Choice of voucher permanence, to establish whether a voucher can be used for “unlimited” time or not.

– Possibility to choose how many vouchers to create.

– Possibility to edit a voucher created to correct any typo in the description or to “delete” a voucher so that it can no longer be used.

– The possibility to choose some other advanced options, such as setting an expiration date to activate the vouchers.

– At the end of the voucher creation, a generated metadata page that delivers the voucher passwords and other data of interest, such as the description and the voucher creation date.

Voucher details

We access this screen after selecting a specific voucher in the voucher list, and here we can find features such as the availability status, the creation date and the password that enables access.

As you can see, the idea of these screens is to maintain a similar and clear aesthetic, just like the LimeApp.

After that, I worked with a TDD approach to write the tests for these screens, using technologies like React Testing Library and Jest.

Working on these tests was my favorite part of the project, as I feel that I was able to learn many things in this area and discover tools that I did not know.

This is where the project currently stands; however, I will continue working on new updates to implement all the ideas I have been developing over these weeks.

Finally, I want to thank Freifunk and my mentors for helping me in this process, it has been very rewarding for me to share and learn so much from them!

Thanks for reading,

Angie.

[GSoC’21] API Generator and tools with Draft-7 JSON schema

Before you ingress

Hello everyone! The main intention of this project is to update the existing API spec schema files to the latest version of JSON Schema, and also the few tools which depend on the spec files.

The current latest version of JSON Schema is 2020-12, but unfortunately there isn't much tooling support yet to update the dependent tools (to be specific, the generator). So my mentor Andibraeu and I chose to work with the draft 7 version, as it has the upper hand in implementation support compared to other recent versions.

Our initial map out

  • Migrate all the spec schema files to the draft 7 version.
  • Pick out a framework to update the generator.
  • Generate and test the forms.
  • List and update all the remaining dependent tools.
  • Test the updated tools.
  • Fix bugs, if any.

Spec files

Migrating the spec files to the draft 7 version was not a difficult task. At the beginning I migrated only the most recent version of the spec file, so that we could immediately start working with the tools. After updating all the tools, I migrated all the spec files to the draft 7 version.
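As a quick illustration of what "validating against draft 7" means in practice, here is a minimal Python sketch using the jsonschema package (the file names are placeholders, not the actual repository layout): it first checks that a migrated spec is itself a valid draft-7 schema, then validates a community API file against it.

import json
from jsonschema import Draft7Validator

with open("api-spec-draft7.json") as f:        # placeholder name for a migrated spec file
    spec = json.load(f)
with open("community-api-file.json") as f:     # placeholder name for a community's API file
    data = json.load(f)

# Make sure the migrated spec itself is a valid draft-7 schema ...
Draft7Validator.check_schema(spec)

# ... then validate the community data against it and report every error.
validator = Draft7Validator(spec)
for error in validator.iter_errors(data):
    print("/".join(map(str, error.path)), "->", error.message)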

Some features to point out:

References:

  1. Initial Migration -pull request
  2. Patch -pull request
  3. Patch -pull request

Generator

This is one of the significant tools that depend on the spec schema files. The job of the tool is very simple: it takes a JSON schema as input, generates HTML forms to render in the browser, handles validation of the form data against the input JSON schema and, finally, generates a JSON output file.

Generator Layout

Unfortunately, no framework seemed perfect at the beginning, so I picked several frameworks to try out and weighed their pros and cons to finally pick one. I listed all the pros and cons in a document (check the references section).

Frameworks that I have tried

  • UI Schema for React
  • React JSON schema forms (mozilla)
  • Restspace Schema forms
  • JSONForms (Eclipse Source)

After weighing all the pros and cons, we chose JSONForms (Eclipse Source) for the further development.

JSONFORMS (Eclipse Source)

Thereafter I developed a UI schema for a custom layout of the form fields on the web page. At this point I also had to develop a custom renderer to render a map for picking the latitude and longitude of a community's location. After studying the documentation I developed the custom renderer (renderer, control and tester). To adapt this renderer to the schema we needed a single object that embeds only the longitude and latitude fields, so I quickly discussed it with my mentor and added a new spec file to the organization. In conclusion, we have our generator up and running as a demo on GitHub Pages.

References:

  1. Evaluation of frameworks -document
  2. Implemented frameworks -repo
  3. Generator tool -repo
  4. live demo -GHPages

Dependent tools

The API viewer and the Travis CI job are completely dependent on the spec schema files to validate the communities' API files.

API Viewer

This tool generates a static build of pages which show the validation results for the communities' API file data.

Screenshots: a valid API file, an invalid API file, and the validation errors.

Improvements:

  • The prior tool was written in Python 2; I have updated it to Python 3.
  • Updated it to validate data against the draft 7 schema and show the validation results.
  • Added DataTables to list the communities.

References:

  1. Python 3 migration with Draft 7 data validation -pull request
  2. Patch -pull request

Travis Job

All the API files are collected in the directory repository, and this Travis job validates the API file data whenever entries are updated or added to directory.json.

Screenshots: Travis job build and job console output.

A build of the Travis job can be found here.
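Roughly speaking, the job does something like the following sketch (a simplification, not the actual script from the linked pull request; the URL, the spec path and the assumption that directory.json maps community names to API file URLs are mine):

import json
import requests
from jsonschema import Draft7Validator

DIRECTORY_URL = "https://example.org/directory.json"   # placeholder URL of the directory file
SPEC_PATH = "specs/api-spec-draft7.json"                # placeholder path to the draft-7 spec

with open(SPEC_PATH) as f:
    validator = Draft7Validator(json.load(f))

directory = requests.get(DIRECTORY_URL, timeout=10).json()

failures = 0
for community, api_url in directory.items():
    try:
        data = requests.get(api_url, timeout=10).json()
    except (requests.RequestException, ValueError) as exc:
        print(f"{community}: could not fetch or parse API file ({exc})")
        failures += 1
        continue
    errors = list(validator.iter_errors(data))
    for error in errors:
        print(f"{community}: {'/'.join(map(str, error.path))} -> {error.message}")
    failures += bool(errors)

# A non-zero exit code makes the CI job fail when any community file is invalid.
raise SystemExit(1 if failures else 0)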

Improvements:

  • This tool (the test) also existed in Python 2; I have updated it to Python 3.
  • Updated it to validate data against draft 7.

References:

  1. Python 3 migration with draft 7 validation -pull request

Common API(collector script)

As mentioned above, I added a new spec file that embeds latitude and longitude into an object, to work with the JSONForms custom renderer for the map picker built on react-leaflet. This would affect a lot of other tools like Community finden, Kontakt, etc., which rely directly on the lon/lat fields of the API files. Luckily, all these tools use a summarized API file, and the collector script is used to collect all the communities' files.
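The adjustment in the collector can be pictured with a small sketch (illustrative only: the exact key names and collector code differ; I am assuming the new spec nests the coordinates in a geoCode object as described above):

def flatten_geocode(location):
    # Copy lat/lon out of the nested geoCode object back onto the location itself,
    # so downstream tools keep seeing the old flat lat/lon fields.
    geo = location.pop("geoCode", None)
    if geo:
        location["lat"] = geo.get("lat")
        location["lon"] = geo.get("lon")
    return location

# Example: a location entry as produced by the new map-picker renderer.
entry = {"name": "Community HQ", "geoCode": {"lat": 52.52, "lon": 13.405}}
print(flatten_geocode(entry))
# {'name': 'Community HQ', 'lat': 52.52, 'lon': 13.405}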

Improvements:

  • Deserialized the geoCode object and appended its fields to the respective locations, so that the fields end up in their old places.

References:

  1. Altering Location fields -pull request

References

Here are the previous blogs of the project at different stages:

  1. Initial Stage(Before coding period)
  2. Phase I evaluation

Wind-Up

I started the project with a minimal understanding of React, TypeScript and JSON Schema, but it was great fun to understand and work on. I really liked learning new things this way rather than by reading or doing a course. Every issue that I encountered led me to understand things more deeply. I'm really thankful to Freifunk for the opportunity, and a big shout out to my mentor Andreas Bräu for absolutely wonderful guidance and support.

~ Shiva Shankar aka sh15h4nk

[GSoC’21] Irdest Android Client – Work Report

Note: You can read the same post in LaTeX here

Prelude

Hi, super happy to see you here! It has been an exciting and productive summer from which I learnt a bunch of new stuff, and Irdest plus GitLab have been gracious to me. The project went through many ups and downs, but we made our way through, fixing the bugs and making things work as expected. I hope I'll be able to convey some of the work I did on a piece of this magnificent software over the course of the summer. Let's begin!

Irdest!?

Okay, so first things first. Let me introduce you to Irdest and what it does, to make sure we are on the same page and irde.st doesn't sound completely (we)ird to you; then we'll get to the title afterwards. First, I'll try explaining it in a single line,

"Irdest... is a beast!"

Okay, no jokes this time (that was no joke btw), going to the explanation for real xD. irde.st is a software suite that allows users to create an internet-independent, decentralized & ad-hoc wireless mesh network. It removes the user's dependency on any specific service and enables users to create a local network mesh of their own. It does not expose the user's data or information. Even the IPs of the peers present in the mesh are not known; they communicate via routers, and the entire communication between users is end-to-end encrypted, thereby increasing the privacy of user data. As of now, Irdest supports various functionalities like sharing files over the created network, calls between users, and messaging.

A Gist

This summer was focused on building the FFI layer to implement the features supported by the upstream library. So if you have been following my initial three posts on the same topic, you will be aware that the biggest challenge encountered was compiling the library properly and linking it to the application at compile time itself. Apart from this, considerable challenges were maintaining a robust CI, which makes sure we don't break stuff at any point of the development process, and the very sensitive FFI layer. We got through these challenges and finally implemented some of the upstream features in the application, but not all: with this sophisticated arrangement of components we need to move forward carefully in order not to break stuff, and with limited time on our hands we decided to implement some of the very basic features in the application and write an unbreakable CI for them, from which we can use the build artifacts and keep track of where things break.

Work Done

Without going into too much depth on the concepts, thought process and discussion, let's quickly touch upon the work done over the course of this summer of code. You can refer to the previous posts if you find yourself interested in the details of the changes made, the steps taken, and why.

I. Compiling the Rust Library

The very first thing I did as part of the summer of code was fixing the Rust library compilation. Initially the Rust library was broken due to the massive refactor and some portion of the huge codebase being left behind. Due to the compilation errors in the Rust library (and me being a beginner to Rust back then) it took some time to fix the errors, refactor the remaining portion of code accordingly and make the build green. As soon as the Rust library was up, the target was to make the application compile and link the library to the application at compile time. With all these changes being made, a further challenge was to write CI for this whole cross-compilation setup, which I had never done before.

II. Writing the CI

Writing the CI for the Android components, including our FFI bridge, wasn't that tricky, but it did require some good knowledge of cross-compilation, Cargo and obviously Android :P. But we ended up implementing that too, and with the current state of the CI nothing can break easily; we have awesome and strict checks that compile the components as needed. We made use of one of GitLab's greatest and finest works, their CI: how they organize and define pipelines, jobs, triggering mechanisms and artifact handling in subsequent and post jobs. We combined the power of GitLab CI and our own custom Docker image, irdest-android-build-env. This made our CI run lightning fast: jobs that took 11 minutes to run without any purpose-built image now finished in 3 to 4 minutes. This was a huge gain, and we were able to optimize our CI runs even more by redefining pipelines and job flow and by introducing child pipelines, another great piece of work by GitLab.

III. Implementing the Features Supported by Library

After all this CI and basic stuff was done, we moved ahead with implementing the functions supported by our Rust library in the application. I implemented the login and registration features, both in a single MR, and due to the very limited time left I had to make major UI changes in the same MR, thereby increasing its size. The UI changes were not stellar, but they made the application layouts very responsive, with almost zero hardcoded dimensions: everything works like springs, and the other views adjust automatically whenever a change happens in a particular layout.

IV. Some UI Fixes

There was also a very nasty UI bug that I can remember, in the Login/Registration screen, where the screen got split into two components, the login one and the registration one. To fix it, I set up the proper fragment transactions and created an abstract layout in the root screen, which is empty by default and sets the desired layout file as required; e.g., it shows the Registration layout if the registration button is clicked, and similarly for the others.

V. Codebase Modernization

In the final days we moved towards modernizing the application codebase by following some best practices and removing the old/deprecated ones :P. Sadly this couldn't be merged because of the changes made in the NDK v23 API, which made our cross-compiler plugin incompatible with the project, thereby leading to CI failures. All of this has now been fixed locally on my fork, but we wish to implement a stable and elegant solution after pondering the problem for some time. Along the lines of codebase modernization, the opened MRs included the migration from ol' school Groovy Gradle files for dependency management to human-readable Kotlin DSLs, along with some tool version bumps (one of which was our NDK, which I bumped to v23 from v21 xD, yeah I can see ya a bit sad, it hurts ;( ). With some changes in the Kotlin scripts we were also able to compile the library directly from Android Studio itself, which previously was a great PITA, as we had to compile the library manually. The next MR targeted the migration from legacy view scans to ViewBinding, increasing the application performance!


Ah, I am not going to list all the MRs opened by me in the summer here, but if interested you can give ’em a look here:
* we/irdest/merge_requests?author=s-ayush2903

Further Possible Improvements

Well, there are really a bunch of improvements that can be made in the existing codebase! Let me help you think of a few:

* Writing Unit tests for the features implemented by far
* Writing Instrumented tests for UI flow implemented by far
* Making the application support many/some more functions that the library supports
* Running instrumentation tests on CI
* Fixing the NDK v23 incompatibility with our cross-compiler plugin

The last entry was a joke (that was no joke btw), ignore it xD

Acknowledgements : )

Well, we finally arrive here. A huge thanks to my amazing mentor, Spacekookie ❤️, who was always there to help me out when I was stuck and shared their valued thoughts on what direction the project needed to take. Discussions with them have always been super insightful and let me ponder for a while about their thought process in figuring out solutions. A big thanks to you again! Next, this project would never have been possible without the organization Freifunk, where I got accepted as a GSoC'21 student to work on one of their projects. It was a truly amazing experience where I learnt a lot of new stuff and met people with similar interests, which made the project and discussions more involved, productive and helpful. Thanks to all. Although I'm a bit disappointed about the very limited time we had to work on the project and that we couldn't take it to the level we envisioned at one point.

But anyways, super happy after working on Irdest!

Btw you can find me on GitHub with username: s-ayush2903 👀

Cheers Until next time we meet 🥂
~Ayush Shrivastava

[GSoC’21] Irdest Android Client – Coding Phase II

Note: You can read the same post in LaTeX here

Prelude

Hell yeah, we paved our way to the conclusion of the summer of code while working on this magnificent piece of software, Irdest, and I'm super excited that you too are here! I'm really happy if you've been following the series of preceding blogs, where I shared the progress the project made over the course of the summer against the proposed timeline. Okay, so in the final phase of the summer of code I focused on implementing, in the Android application, the features supported upstream by Irdest (in the Rust library), along with implementing a better CI (I'll touch upon it later), revisiting how we use our pipelines, jobs and our custom Docker image for CI, easing cross-compilation for developers, and modernizing the application codebase using best practices. Although we faced blockers due to internal changes in NDK v23 and could not go ahead with all the changes, yeah quite sad :(

Okay, so now let's see in detail and discuss the work done on each component and the brief thought process behind the decisions made. This post covers the work done in the final phase of the summer of code.

I. Implementation of Features Supported by Irdest

This was the crux of the project and quite a tricky and technical task to implement; all the work done in the previous phase on fixing and rewriting the FFI layer, whether on the Android application side or in the android-support crate of the Rust library, comes into action here. Considering the time available to us, and to avoid being overwhelmed or too excited and writing a bunch of core-library functions, we decided to implement the basic functionality of user registration and login in the application, manually test that these functionalities work fine, and write CI for them as well, to not let regressions creep into our codebase again. For implementing the registration feature in the application, I fixed the FFI layer (again :P) and correctly set the wrap/unwrap functions on the Rust side of the FFI layer; fixing the package name, along with the mentioned tweaks, resulted in the registration feature functioning correctly. So you can now create a new user, get a cryptographic ID assigned to it and use the credentials to log in to the application. Making similar changes in the login function of the library fixed that too. With these library functions fixed, authentication began to function, theoretically. I still had to change and fix the UI/navigation setup in the application, i.e. how screens are changed/exchanged, in order to make authentication work from the point of view of an end-user.

II. Redefining & Re-architecting the App Navigation

Previously, the Register screen wasn't being displayed properly: it was nested, or, a better word to use, it split the screen into two parts, the Login one and the Registration one (see we/irdest/#21 for more context and a clearer picture). The problem turned out to be how fragment transactions in the application were being handled and how we exchanged layout files on-the-fly along with those transactions. Previously everything was handled inside a single root file: the layout (of the login screen) was already present there by default, and going to registration did not entirely remove the login layout but instead split the screen; some hardcoded dimensions were present too, making the problem persist and harder to fix. What I did was create an abstraction in the root layout file and keep that abstract layout empty by default, with proper dimensions, which made sure the entire space is occupied by the screen concerned. That root layout in the main file is essentially a FrameLayout which spans the screen accordingly and holds exactly the layout that is going to be displayed. You can consider this FrameLayout as a container which shows layouts as required and initially contains nothing. And yet you never get to see an empty screen, because we dynamically set the layout to be displayed in the FrameLayout via the Kotlin files, in the order the screens are supposed to appear. Well, that's enough discussion on the topic.
I made all the changes discussed in the previous two sections in a single MR :P, so here it goes:
* we/irdest/!38

III. Revisiting the Project CI

Okay, so by then we had our Rust library being compiled in our CI pipelines, but we wanted more than that: usability of the components/artifacts that were produced as a result of the builds. So we decided to publish the Rust library to GitLab CI directly from the pipelines and use those artifacts as needed. Also, we used to publish the APK, but since no cross-compilation was taking place in the CI, the APK being published from there was pretty much useless; so we enabled cross-compilation in the CI and continued the APK uploading, as a result of which an application installed from the CI-pipeline APK ran properly on a device. Next were some productivity-related changes made in the CI: e.g., by design, the APK obtained from the application build is stored deep down at app/build/.../.../debug/app-debug.apk and was being uploaded to the same path in the artifacts archive from GitLab CI. I removed this Matryoshka-dolls-style hierarchy and moved the needed build files/reports to the top-level directory.

You can find the corresponding MRs below:
* Enabling the Cross-compilation: we/irdest/!34
* Uploading Rust Library as CI artifact: we/irdest/!35
* Removing Matryoshka dolls style artifacts archive hierarchy: we/irdest/!40
* Uploading Lint Reports on Failure: we/irdest/!42

IV. Modernizing the Application Codebase

In the final days of the summer of code, we took active and fast steps to migrate chunks of our application codebase to follow Modern Android Development practices. Due to an NDK version incompatibility with the cross-compiler plugin we were unable to merge these changes, and unable to fix our Docker image too. But anyway, since the CI was previously green with optimized build times and our exhaustive Docker image, it'll have to work again this time too! Coming back to the topic, the first MR I created in this direction was the migration from Groovy Gradle files to Kotlin DSLs. This migration already has numerous and obvious benefits over conventional Groovy Gradle files, but the cherry on top was that with the commits in this MR, cross-compilation was automatically triggered just by hitting the build button/icon! Previously we had to compile the library first and then the application, to link the library to the application, but this MR saved us a huge amount of time and PITA :)

The next step was improving application performance by reducing memory consumption while it is running. To achieve this we first eradicated all the findViewById() calls and the not-so-recommended Kotlin Synthetics as well; you can learn about the reason for the change in the linked issue(s). We instead used ViewBinding to bind and reference the views at runtime without worrying about application crashes. This was a huge asset: since no view scans are performed at runtime anymore, the application's runtime speed also increased and memory consumption went down. But sadly, we couldn't go ahead with merging these MRs because of the mentioned NDK version and cross-compilation plugin incompatibilities :(
We'll be able to merge these as soon as we fix the Docker image. There is a way to fix it, but it is not elegant; we also want to do it once and for all, so that we don't need to touch that CI file again unless we have to introduce some entirely new job.
Find the corresponding linked MRs here:
* Migration from Groovy Files to Kotlin DSLs: we/irdest/!36
* Using ViewBinding & Remove Slow stuff: we/irdest/!41

And, the issue:
* Using ViewBinding instead old methods: we/irdest/#22

Cheers Until next time we meet and hope to see ‘ya in the final report!
~Ayush Shrivastava

[GSoC’21] RetroShare Mobile

Prologue :

Hello everyone 👋

This is the first update on the GSoC 2021- Retroshare Mobile App Project.

Recap :

The goals for the first half of the project were:

  • Upgrading the project to a new state-management tool.
  • Adding short-invite and QR-code invite features.
  • Adding import-account functionality.
  • Upgrading the project to a new version of Flutter.
  • Adding support for the createSigned Identity, deleteIdentity and UpdateIdentity features.
  • Improving the UI of the Add Friend screen.
  • Fixing minor bugs.
  • Adding support for retroshare-service inside the app.
  • Adding emoji support.
  • Refactoring code.

Achievements :

I have added almost all the features to the app that I proposed in my proposal for the first half. Currently, I am working on improving the chat backend.

Screenshots: QR Scanner, Add Friend, About, Create Identity, Change Identity, Update Identity, Identity Info, and Emoji Support screens.

Related Patches :

Next steps:

  • Work on Retroshare API wrapper with Elrepo mentor.
  • Will work on Forums support in Retroshare App.

See you in the next few days with a new update post. ✌️

[GSoC’21] 802.11v Client-Capability Measurement

As the next part of my work, I want to start with a client-capability measurement.

I want to find out how many of my WiFi clients support the different standards like 802.11v / 802.11k / 802.11r etc. Since the access points know which features the clients have, I just have to be able to read out this data.

The heart of the access point is the interconnect system ubus, which is used by most services running on an OpenWrt setup. ubus works as a message bus between different daemons in OpenWrt. The user-space daemon hostapd ensures that the network interface card functions as an access point. Furthermore, the OLSR protocol is installed on the access point to always ensure the shortest route between the individual access points.

To find out which clients support which standard, I will write a new Lua script and install it on a virtual machine. This script asks hostapd via ubus for the clients that support 802.11v. Since I want to evaluate the data and also display it graphically, I will install the Prometheus exporter as monitoring software.
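As a rough illustration of the idea (a sketch only, not the actual Lua script: the shape of hostapd's get_clients output, and in particular the name of the per-client capability flag, varies between builds and is an assumption here):

import json
import subprocess

def get_clients(iface="wlan0"):
    # Ask hostapd over ubus for the stations associated with this interface.
    out = subprocess.check_output(["ubus", "call", f"hostapd.{iface}", "get_clients"])
    return json.loads(out).get("clients", {})

def count_bss_transition_capable(iface="wlan0"):
    # Count clients that advertise 802.11v BSS Transition Management support.
    # NOTE: the "bss_transition" flag name is an assumption; check your hostapd's output.
    clients = get_clients(iface)
    capable = [mac for mac, info in clients.items() if info.get("bss_transition")]
    return len(capable), len(clients)

if __name__ == "__main__":
    capable, total = count_bss_transition_capable()
    print(f"{capable} of {total} clients support 802.11v BSS transition")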

After I have evaluated the collected data with Prometheus, I can start the further implementation of the 802.11v standard within the network.

Freifunk Digital Twin – GSoC 2021 – Phase I

Prologue

Hey community 👋
This is the first update on the GSoC 2021-Freifunk Digital Twin Project.

Recap

The goals for the first phase were:

  • determine a management tool for 100+ VMs and OpenWrt devices
  • figure out how to fetch and prepare topology information from routing daemons

So let's see how that went…

Achievements

After testing and playing around with QEMU, the next step was to find a suitable VM-management tool that is able to manage 100+ VMs. After several hours of reading and testing, I finally chose libvirt, because it's open source, lightweight and CLI-controllable. But because it's so extensive, I needed plenty of time to read up and get a proper OpenWrt VM running, but now it works quite well. After that, we decided to migrate our test environment from my local computer to our lab server, which is more powerful. After some technical issues, which cost us plenty of time, the environment was finally up and running and it is now able to start and host 100+ OpenWrt VMs. Because of the stated problems, I don't have much to show you at the moment, so I'm going to update this post in the next few days.
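To give an idea of why an API-driven tool matters at this scale, here is a minimal sketch using the libvirt Python bindings (the template path, domain names and VM count are assumptions, not the actual lab setup): it defines and boots a batch of OpenWrt domains from one XML template.

import libvirt

# Assumed libvirt domain XML template containing a {name} placeholder.
TEMPLATE = open("openwrt-domain-template.xml").read()

def boot_fleet(count=100):
    conn = libvirt.open("qemu:///system")      # connect to the local QEMU/KVM hypervisor
    try:
        for i in range(count):
            name = f"openwrt-{i:03d}"
            xml = TEMPLATE.format(name=name)   # each VM gets its own name via the template
            dom = conn.defineXML(xml)          # make the domain persistent ...
            dom.create()                       # ... and start it
        running = conn.listAllDomains(libvirt.VIR_CONNECT_LIST_DOMAINS_ACTIVE)
        print(f"{len(running)} domains running")
    finally:
        conn.close()

if __name__ == "__main__":
    boot_fleet(100)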

Next steps

  • figure out how to fetch and prepare topology information from routing daemons
  • Write a script that combines VM management and network management

See you in the next few days with a new update post. ✌️

Building an app for network capability


Hi! I'm Tomás. This post is a brief summary of the work we did over the last few weeks. We achieved a prototype of the network capability app, and we're starting to test it in communities. The app is still a prototype: it has only three functions (connect to a webpage using the WiFi, check if you're in a LibreMesh network, and check the private IP of the device) and the front end consists of only these three buttons, but it now has all the logic that was needed to start working on the rest of the app.

Basic functions

The first approach was to check whether the user was able to reach the LibreMesh local address with a ping; then we decided to move to an HTTP GET instead. With this idea in mind, we prepared a new version of the application that runs a command on the device (a curl command) instead of the Java method with the previously developed Android interface (used for the ping version).

public boolean httpGetToLibreMesh() throws InterruptedException, IOException {
    //FIXME: replace google with the LibreMesh IP
    String[] cmdLine = {"sh", "-c", "curl --head --silent --fail google.com"};
    Process p1 = java.lang.Runtime.getRuntime().exec(cmdLine);
    int returnVal = p1.waitFor();
    return returnVal == 0;
}

This simple code solves the problem. It returns true if the HTTP GET to google.com worked, and false if it didn't. It can easily be modified to use the LibreMesh IP address.

The next objective was to inform the user if the device wasn't connected to the WiFi. In order to do so, we have to get the WifiManager from the ApplicationContext and then check if the WiFi is working.

public boolean verifyLibreMeshConnection() {
    WifiManager wm = (WifiManager) getApplicationContext().getSystemService(WIFI_SERVICE);
    if(wm.isWifiEnabled()) {
        return wm.getConnectionInfo().getNetworkId() != -1;
    }
    return false;
}

Then we needed a web navigator (WebView) inside the app with the capability to open the LibreMesh router website (in the first approach, a google.com website).

Using a WebView object with shouldOverrideUrlLoading overridden, we can show a webpage in the app without having to open an external browser (Android provides the Android System WebView that renders it inside the LibreMesh app).

So with this simple code, we can configure the WebView to enter to a site inside the app.

WebView navegador;
navegador = (WebView) findViewById(R.id.navegadorLibreMesh);
navegador.setWebViewClient(new WebViewClient() {

    @Override
    public boolean shouldOverrideUrlLoading(WebView view, String url) {
        view.loadUrl(url);
        return true;
    }
});
navegador.loadUrl("http://www.google.com");

Choosing through which network interface to send data to

Once we have the WebView, the next step was to control through which network interface the application sends its network requests. In order to do that we have to access the ConnectivityManager. It was created as a class variable and assigned in the “onCreate” function of the activity that holds the WebView. The connectivityManager isn’t a new instance but a reference to the object that controls the connections in the context of the app.

connectivityManager = (ConnectivityManager) getApplicationContext().getSystemService(Context.CONNECTIVITY_SERVICE);

Then we needed a function that requests use of the WiFi. The idea is to build a NetworkRequest and send it to the connectivityManager, but it also needs a NetworkCallback to specify what to do when the network becomes available to fulfill the request. So the second parameter of the request is an anonymous class that overrides the needed methods.

@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
private void requestWifi() {
    final NetworkRequest networkRequest = new NetworkRequest.Builder()
           .addTransportType(NetworkCapabilities.TRANSPORT_WIFI)
           .build();

    connectivityManager.requestNetwork(networkRequest, new ConnectivityManager.NetworkCallback() {
        @Override
        public void onAvailable(Network network) {
            if(Build.VERSION.SDK_INT >= Build.VERSION_CODES.M)
                connectivityManager.bindProcessToNetwork(network);
            else
                ConnectivityManager.setProcessDefaultNetwork(network);
        }

        @Override
        public void onLost(Network network) {
            if(Build.VERSION.SDK_INT >= Build.VERSION_CODES.M)
                connectivityManager.bindProcessToNetwork(null);
            else
                ConnectivityManager.setProcessDefaultNetwork(null);
        }

        @Override
        public void onUnavailable() {
            super.onUnavailable();
        }
    });
}

The last thing I needed to do was write a function that starts the WebView.

private void iniciarNavegador() {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) requestWifi();

    WebView navegador;
    navegador = (WebView) findViewById(R.id.navegadorLibreMesh);
    navegador.setWebViewClient(new WebViewClient() {

       @Override
        public boolean shouldOverrideUrlLoading(WebView view, String url) {
            view.loadUrl(url);
            return true;
        }
    });
    navegador.loadUrl("http://192.168.0.2");
}

Getting the LibreMesh address

The next step was to move forward with getting the LibreMesh IP address. This is nothing more than either a small algorithm over the device's own IP or reading the gateway from DHCP; either way gives the same result. It also gives us an alternative way to check whether the user is connected to the LibreMesh server (we get the IP through the algorithm and compare it with the gateway version).

The idea was pretty simple and only required an int-to-IP auxiliary function. We decided to collect all the methods that return WiFi information and move them into a new WifiInformationManager class, which provides all the information we need from the WiFi:

public class WifiInformationManager extends AppCompatActivity {
    private static String intToIp(int addr) {
        return  ((addr & 0xFF) + "." +
                ((addr >>>= 8) & 0xFF) + "." +
                ((addr >>>= 8) & 0xFF) + "." +
                ((addr >>>= 8) & 0xFF));
    }

    public static String getPrivateIp(WifiManager wm) {
        int ip = wm.getConnectionInfo().getIpAddress();
        return intToIp(ip);
    }

    public static boolean verifyWifiConnection(WifiManager wm) {
        if (wm.isWifiEnabled()) {
            return wm.getConnectionInfo().getNetworkId() != -1;
        }
        return false;
    }

    public static String getGateway(WifiManager wm) {
        return intToIp(wm.getDhcpInfo().gateway);
    }

}

The getGateway function elegantly solves the problem of the LibreMesh local address. The rest of the job was simply to change the address of the WebView to this one.

Using logcat to find bugs

The logical next step was to try the application and test whether it worked, but when we did, the WebView that shows the Lime-App displayed a white screen instead. Using logcat inside Android Studio we were able to find the error easily, showing the importance of using this type of debugging tool.

Using the logs it's easy to see that there's a TypeError when trying to get the property 'getVoices'. The problem comes from 'window.speechSynthesis', which isn't available in some browsers.

The Lime-App is the graphical interface that LibreMesh uses for the configuration of community networks. We found the .js that was calling the function:

let synth = window.speechSynthesis;
let voices = synth.getVoices();

export const speech = (text, lang) => {
let utterThis = new SpeechSynthesisUtterance(text);
utterThis.pitch = 0.9;
utterThis.rate = 1.2;
utterThis.voice = voices.filter(x => x.lang === lang)[0];
synth.cancel();
synth.speak(utterThis);
};

It can be seen that in line 2 the variable voices is set by calling synth.getVoices(), but if synth is undefined, that line fails.
The solution was pretty simple: with a control structure we check whether speechSynthesis is available or not. So the fixed code is:

let synth = window.speechSynthesis;

export const speech = (text, lang) => {
    // Only use speech synthesis when the browser actually provides it.
    if (typeof synth !== "undefined") {
        let voices = synth.getVoices();
        let utterThis = new SpeechSynthesisUtterance(text);
        utterThis.pitch = 0.9;
        utterThis.rate = 1.2;
        utterThis.voice = voices.filter(x => x.lang === lang)[0];
        synth.cancel();
        synth.speak(utterThis);
    }
};

I sent a pull request to the Lime-App repository fixing this problem and it’s currently waiting to be merged.

Next steps

With the functions for detecting and connecting to a LibreMesh network in place, we plan to add some features to the app in the coming weeks:

  • A better graphical interface with the integration of all the planned functions of the app.
  • Support other services in addition to the Lime-App.
  • Add the app to the LibreMesh operating system, giving the user the possibility to obtain the app directly from the router.

Video

Github project

[GSoC’21] Irdest Android Client – Coding Phase I

NOTE: You can read the same post in LaTeX here

Prelude

Hello! Good to see you here :) This blog is mostly a summary of the work done so far in the first coding phase of the summer of code '21. Picking up from the end of the previous blog post, we planned to implement the chat feature in the application module, but due to the aforementioned massive refactor of the entire codebase and the upgrade of existing modules to support modern hardware, the chat API has been deprecated, and some components, not being part of CI, got broken :( To implement features in the module, the very first step was to get the project to build properly. Previously, (maybe) due to the migration from a different version control system to GitLab and that massive refactor, there were some unidentified problems that did not let the application codebase build properly; also, the main lead of the picture, the `android-support` crate, not being part of our GitLab CI workspace either, was broken. We fixed all of this in multiple steps, each solving a mini problem, and wrote CI for each missing component so that we, or anyone joining the project, never encounter similar problems in the future.

I. Fixing the Android Application Codebase

The application codebase was considerably broken, and the very first time I built the application it instantly said build failed in 10 ms, which is really weird: when you build an Android application for the very first time, it takes noticeable time (~6 minutes) to fetch the dependencies declared in the dependency management files of the Android codebase (the ones named build.gradle), followed by compiling the Android project codebase. It was quite astonishing at first sight, but on a closer look at the files present in the Android codebase the cause was observable. It was the presence of dependency archives in the Android codebase and their corresponding XML files; and since these dependencies were already present, Studio didn't take the pain of fetching them from Maven. So the question arises: when the dependencies were already present, why did the application still not compile? What happens is that these dependencies' XML files are editable, so even the slightest edit renders them useless, and Studio doesn't even report any kind of problem with them. Another thing that happens under the hood is that Studio stores these dependencies in its local cache, so that when the user recompiles the application, no time is spent fetching the dependencies and it can perform the real build. Also, over time these caches get corrupted, and using a very old cache does not let the project work the way it should.
Okay, now let's come to how we fixed it. As should be clear by now, the problem was the existence of dependency binaries and XMLs in the Android codebase, so the solution I anticipated was the deletion of these files. It was not sufficient. After doing this, dependency management did go as expected, but the build still failed :( After some more inspection, I found there was also a problem with the Gradle executable scripts and with gradle-wrapper.properties. So I just took these scripts from my previous working projects and it finally started working, a moment of joy 🥳, after many days + nights of pain ;) Once this problem was fixed, and after working on some other crucial matter (see the next section, II), we wrote a CI pipeline specially for our Android application codebase so that it never breaks again in the mainstream. The android-application pipeline comprises three stages, in which lint is checked, followed by the build, and then the tests are run. In the upcoming coding phase we plan to make this CI pipeline even more robust, enforce stricter formatting rules, introduce static analysis and run Android integration tests on GitLab CI; well, we'll discuss it the next time we meet, leaving some topics for then 👀
You can find the corresponding MRs here:
* Fixing the android application codebase: we/irdest/!21
* CI for android application codebase: we/irdest/!23

II. Fixing the FFI Layer

After the previously mentioned refactor, everything was working fine; only the Android-specific components of the codebase were broken. Part of that was our FFI layer, the android-support crate. This layer still held references to several deleted and deprecated APIs, so compiling this crate also gave a bunch of errors. Fixing them took much more time, as this crate was written in Rust and back then I was not that fluent with it. Fixing it included updating/modifying existing functions, and we had to remove functions as well because of the deprecations. A nice challenge we encountered was saving state across multiple platforms (supported hardware), because the crate we used for saving state supported almost all operating systems other than Android. So what we did was use the knowledge that an Android application has access to its own private storage, which no other application/service can see; all we needed to do was find this directory in the Android device file system, and this path we obtained using ADB. Then we investigated where our crate went wrong: we dove into the crate's API and read how it achieved similar behavior on other platforms, which was that it first found HOME (an environment variable) for the OS and then located the corresponding path(s) for saving state in dirs/files. It turned out that the crate was identifying HOME wrong only on the Android file system. After this was diagnosed, we wrote a custom API that finds the HOME env var on all platforms irrespective of their OS (see this patch); with this API we were able to access the app-specific private directory and save state there. It was quite challenging, but we figured it out! Everything related to the FFI layer after this fix was quite easy. We eliminated the problems that existed in the FFI layer via refactors and some modifications to functions, and then it built green! After fixing the FFI layer we wrote CI for it that makes sure it builds each time a commit is pushed to any MR or branch, and we can see the build status in the pipelines too. Writing the CI for the android-support crate was not a cakewalk: the application needs cross-compilation of our Rust library in order to function, so we need to make sure that the library the application is going to use on Android devices is really compatible with the Android platform, and compiling it directly on our PCs doesn't quite achieve that. To make the expected behavior happen there are two options:
* Compiling the Rust library codebase on android-device(less feasible), or
* Cross-compiling the library on our PC/CI runner via providing support tools for android components
So quite obviously we went with the second option. For this we installed Rust and Android-compatible components in our runners during the CI runtime and then compiled the library after checking out the correct directory. But since, for compiling the Rust library in each CI run, we had to install the components, and this specific pre-compilation part (or better to say, setup portion) consumed a considerable portion of our CI script (and made it look a bit daunting too), we packaged these components into our custom Docker image and pulled it each time in our CI runs. This made our life easy and the scripts beautiful :)

NOTE: If you don't have much of an idea of cross-compilation, then you can have a look at this awesome blog post by Milan✨. It gives the reader a clear understanding of what cross-compilation is, irrespective of their previous knowledge on the topic (a bit of basic knowledge of compilers is needed, though). Spoiler alert: that someone in the opening of the blog post is me 😛

Also, since the refactor was incomplete in our Android application codebase and the android-support crate, between these two big tasks I fitted in this small refactoring, as a light break 😛
You can see the MRs for them here:
* Fixing the FFI Layer: we/irdest/!31
* Refactoring the android-components: we/irdest/!32
* Adding the android-support crate in our CI Pipelines: we/irdest/!33

III. UI Improvements

After fixing the issues in the Android application and our Rust library, and writing a robust end-to-end CI for them, we moved on to improving the UI of the application. Previously the application used legacy design components and ideology; under this task we modernized these UI components and followed the Material Design guidelines (material.io), which improved the overall look of our authentication screens. There is not much to explain here, as it was really quite easy to achieve and we didn't encounter any problems.
You can see the related MR here: we/irdest/!26

Acknowledgements

Well, by far it has been the most exciting summer for me and I had interesting experiences working on the project. Fixing components was very difficult at the beginning for many reasons; one I would mention is the huge codebase we have: it is not easy to learn the functionality of each component in it in a short time, and everything is intertwined at many places too. Going through it and fixing issues would definitely not have been possible without the immense support of my mentor, Spacekookie, who was always there to help me out and direct what to do. Their advice greatly helped in speeding up the development process and they are also a source of inspiration to me. Most importantly, Milan, who is not officially my mentor but has helped me a ton of times with the technicalities of CI and with setting up the Nix environment (which I initially used for cross-compilation), about which I knew nothing, and on many more occasions. It wouldn't have been easy for me to accomplish the aforementioned tasks without direction and help from Spacekookie & Milan.
Thanks again to both of them : ) and I’m more excited to work on the project with them further!

Cheers Until next time we meet!
~Ayush Shrivastava