A view from the trenches
Some thoughts from my experience in Autonomous Vehicles. On Medium.
All views expressed are entirely my own.
It has taken me 8 months to complete this writeup, but better late than never. Blame a mixture of life and things going (too) well at Oxbotica.
Anyway, the TL;DR is that this is a summary of my trip to Borneo in May this year with v2.0 of my drone-based orangutan tracking system, developed in partnership with International Animal Rescue (IAR). The story of v1.0, including why one would need to track orangutans in the jungle at all and why a drone is useful for it, can be found in this post.
As an interesting aside, the v1.0 system was lost in a tree last time. A few months after my return, however, somebody stumbled across the drone's remains in the jungle (see the top-right picture in the collage below).
I am very fortunate to have worked in many stimulating environments, on many very different applications, and with some very awesome people. For the past three months or so, I'm happy to see that trend continue.
I recently joined Oxbotica, a startup out of the Oxford University Mobile Robotics Group (MRG), specialising in mobile autonomy. For various reasons it wasn't a straightforward decision, but so far it has panned out well. Within the company we are developing Selenium, a cross-vehicle, cross-platform autonomy stack that third parties can license in whole or in part. Note that this is about building and licensing software, not cars.
Update: this has now morphed into https://www.theplastictide.com/
A project I have been thinking about and wanting to do for a very long time is to build a fully autonomous litter-collecting robot. This is driven by the annoyance I always feel when passing a nearby park, littered with Twix wrappers, Coke cans, and the like. Challenging? Very much so. Impossible? No. You just have to pick your constraints.
I wish this were the blog post in which I explain how I built the robot and how it all works, snazzy YouTube video included. However, while I have already started down that path, I still have a long (but fun!) way to go. In particular, my orangutans have been keeping me busy and will continue to do so for a while. I have also changed my professional affiliation, but that's a topic for the next post. It does explain, though, why my interest was piqued when Peter Kohler (GIS expert of SESexplore and Fishackathon fame) told me about his project to raise awareness around marine litter. That is the real topic of this post.
Update: read part II here
TL;DR: You are faced with a few thousand hectares of rainforest that you know harbours one or more orangutans that you need to track down. Where, how, and why do you start looking?
About a year ago I was doing a lot of drone related work and was presented with the following problem: Would it be possible to use a drone to fly above the Bornean jungle and search for tagged orangutans?
To understand the motivation behind the question we need some background.
In my last post I discussed the little flying object detector project that I've been doing for fun. While it worked, it relied on communication with an external laptop (the on-board ODROID was not powerful enough to run the convnet model).
Hence, what better excuse to buy an NVIDIA Jetson TK1 dev board, which should have more than enough juice to run everything on board the drone itself. As an added benefit, it should come in useful for my litter robot. I then added websocket support to the Flask web app, so you can now see the detections appear in real time on a map.
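For the curious, the payload pushed over the websocket might look something like the sketch below. The field names and the `detection_message` helper are my own illustration, not the actual protocol; on the server side something like Flask-SocketIO's `emit` would broadcast each message to connected browsers, which then drop a marker on the map.

```python
import json
import time

def detection_message(lat, lon, label, confidence):
    """Build a JSON payload for one detection, to be pushed over the
    websocket to the map client. Fields are illustrative only."""
    return json.dumps({
        "type": "detection",
        "lat": lat,
        "lon": lon,
        "label": label,
        "confidence": round(confidence, 3),
        "ts": time.time(),  # server-side timestamp
    })

# Example: one detection near Oxford; with Flask-SocketIO this would be
# something like socketio.emit("detection", msg) to all connected clients.
msg = detection_message(51.752, -1.2577, "person", 0.912)
print(msg)
```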
It took some fiddling to get WiFi working with the Jetson, and I'm still surprised at how quickly the WiFi degrades (even though I'm using the 5 GHz band to avoid collisions with the Tx/Rx). But it did work in the end:
It wasn't the best place to test it out, but I was short on time and it should be good enough to illustrate the concept.
PS: on a related note, I love what the folks at vertical.ai are working on.
About a year and a half ago I saw Jetpac's video of their Spotter app, and I remember thinking at the time that it would be so cool to get this flying on a drone. I didn't have the bandwidth to work on it then, but ended up poking at it with Markus Aschinger at the ASI and with two A-level students (Jawad and Isaac) from the Nuffield Foundation. While they did good work and it got me a step closer, it still hadn't quite come together. So over the past week I sat down to do a full rewrite, integrate it with a quad I had lying around, and put together a little demo. The result can be seen in the video below.
There are many applications for GPR, but the one I have been focussing on the most is landmine and UXO detection. GPR is a popular sensor for this, but by no means the only one: from big, complex neutron sources to small thermal cameras and magnetometers, a wide variety of sensors have been developed to detect landmines.
While we were focussing on GPR, I wanted to explore what other sensors we could mount or fly alongside it to provide a complementary picture. This crystallised into a summer project for Skycap intern Alex Davey. After some in-depth research into the spectrum of relevant sensors, the decision was made to experiment with an active EMI sensor (i.e., a metal detector).
Besides the machine learning angle discussed in the previous post, the UAV-based GPR system that I have been working on has involved an interesting foray into robot swarming technology. The reason is that the sensor places a number of restrictions on the operation of the aerial robot; in particular, the swath that can be covered in one pass is rather limited. There are different routes to ameliorating this, and one of them is using multiple drones to increase coverage rates.
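To make the coverage idea concrete, here is a minimal sketch of the planning step: split a survey strip into parallel lanes one swath wide, then deal the lanes out round-robin to the available drones. The function name and numbers are illustrative assumptions on my part, not our actual planner.

```python
import math

def plan_coverage(width_m, swath_m, n_drones):
    """Divide a strip of given width into parallel lanes (one sensor
    swath wide each) and assign lane centre offsets to drones
    round-robin. Returns {drone_id: [lane centre offsets in metres]}."""
    n_lanes = math.ceil(width_m / swath_m)
    assignment = {d: [] for d in range(n_drones)}
    for lane in range(n_lanes):
        centre = lane * swath_m + swath_m / 2  # offset of lane centre line
        assignment[lane % n_drones].append(centre)
    return assignment

# E.g. a 100 m wide field with a 0.5 m GPR swath needs 200 passes;
# three drones cut the sweep time to roughly a third.
lanes = plan_coverage(100, 0.5, 3)
print({d: len(v) for d, v in lanes.items()})
```

With a narrow-swath sensor the win from multiple vehicles is almost linear, which is exactly why swarming becomes attractive here.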
One of the projects that has taken up a lot of my time over the past few months is a UAV (drone) based Ground Penetrating Radar (GPR) system. There are a number of applications for this, but the one we have been focussing on initially is landmine and UXO clearance. The elements that make up such a system are quite broad, ranging from sensor design, UAV integration, positioning, and terrain following to data analysis. As with many drone projects, most of the attention tends to go to the hardware and the flying. While that is certainly important, and I have been working on those elements too, the whole system is only as good as the quality and interpretability of the data you get back. That is key. With this post I'll aim to give a brief summary of the work I have been leading on this front.
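To give a flavour of the data-analysis side: a classic first step when interpreting GPR B-scans is background (mean-trace) removal, which subtracts the average trace from every trace so that the flat, repetitive ground reflection drops out and echoes from buried objects stand out. The toy implementation below is my own sketch of that standard technique, not our actual pipeline.

```python
def remove_background(bscan):
    """Mean-trace background removal for a GPR B-scan.

    bscan: list of traces (A-scans), each a list of amplitude samples.
    Returns the B-scan with the across-track mean trace subtracted,
    suppressing the constant ground reflection common to all traces."""
    n_traces = len(bscan)
    n_samples = len(bscan[0])
    mean_trace = [sum(tr[i] for tr in bscan) / n_traces
                  for i in range(n_samples)]
    return [[tr[i] - mean_trace[i] for i in range(n_samples)]
            for tr in bscan]

# Two toy traces: the shared component vanishes, the anomaly remains.
print(remove_background([[1.0, 2.0], [1.0, 4.0]]))
```

Real processing chains add dewow filtering, gain correction, and migration on top, but even this one step makes a B-scan far more interpretable.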