I've spent the last couple months on a small ocean engineering project with a dual purpose: to provide me an opportunity to test some engineering hypotheses, and to provide an opportunity for a friend in the humanities, Christina Vagt, to observe the engineering and operation of a simple oceanographic platform, a Lagrangian drifter. We called our drifter "Message in a Bottle", or MiB for short. Christina and I co-presented this at the Modeling the Pacific workshop a few days ago.
The project is still a work in progress, because, on the one hand, I have a lot of ideas about where to take it next, and, on the other, Christina would like to continue her work with it. But I think it's OK to wrap the work so far with a bow, and call phase I done.
From my perspective the MiB was built to test the hypothesis that a modern smartphone is a good platform to send to sea to collect data, equipped as it is with integrated sensors (GPS, accelerometer, mic and cameras) and communications (cell & WiFi). I understand why an ocean vehicle, i.e. something that navigates, may be better served by a flight controller (something in the Pixhawk family, for example), and why an ocean observing tool that integrates a sensor might better be served by a mini-computer (like a BeagleBone or a Raspberry Pi). But I don't get why there aren't at least some options out there, especially ones more focused on doing rather than making, that are based on old smartphones, which are abundant, cheap and universally available. So I set out to figure out that why or, failing that, to fill the niche.
So far I've gone through four versions of deployable hardware, three of which were actually deployed. The smartphone used for all of them was the same: my old Nexus 5X, which was dropped on its head and retired due to a cracked front camera lens. The housing was also the same for all: two acrylic hemispheres, which are sold on Amazon as fence windows that dogs can look through. The versions were differentiated by a) the external battery used in each; b) the scaffold used to stabilize the phone, battery and solar light inside the housing; c) the data acquired by each version; and d) small differences in the gasketing, taping and ballasting.

Drifters often have an underwater component, for example a drogue, that anchors them to the surface water and resists the action of wind, so that their motion mostly captures the effect of ocean surface currents rather than wind. In our case we had to decide whether to incorporate such a component and risk entanglement in the kelp that is so pervasive in the Santa Barbara Channel, or to skip it and suffer the effects of the wind. We went with the latter, and I have some thoughts about how to use math to unmix the portion of the MiB's motion that's due to current from the portion that's due to wind.
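One way that unmixing could work, under strong assumptions, is a linear leeway model: drifter velocity = current + α · wind, with the leeway factor α fit by least squares against wind measurements from a nearby buoy or a reanalysis product. Here's a minimal sketch of that idea; the function name is mine, and the key assumption, that current fluctuations are uncorrelated with wind fluctuations over the record, is untested for the MiB:

```python
import numpy as np

def unmix_wind(v_drifter, v_wind):
    """Split drifter velocity into a wind-driven part and a residual current.

    Assumes the linear leeway model v_drifter = v_current + alpha * v_wind,
    applied per component (east or north), and that fluctuations of the
    current are uncorrelated with fluctuations of the wind over the record.
    v_drifter and v_wind are 1-D arrays sampled at the same times.
    """
    w = v_wind - v_wind.mean()
    d = v_drifter - v_drifter.mean()
    alpha = np.dot(d, w) / np.dot(w, w)      # least-squares leeway factor
    v_current = v_drifter - alpha * v_wind   # what's left after removing leeway
    return alpha, v_current
```

Since an undrogued drifter's leeway can be a few percent of wind speed, even moderate winds can dominate the track, which makes the residual series the interesting one.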
The software for all deployments in this phase was simply Tasker (TODO: add Tasker scripts to GitHub). I originally developed two generations of Python alternatives for the data collection, one based on QPython and one based on Kivy (TODO: add to GitHub). In the end, for what we were doing in this phase, Python, with its second-class-citizen status on Android, proved to be a bug, not a feature. Neither QPython nor Kivy played especially well with Android 8.1 power management. To use either I would have had to resort to work-arounds, use extremely power-hungry wakelock-acquisition tactics, or, in the case of a Kivy app, launch the code through a scheduler like Tasker anyway. For the simple things we wanted to do, it was ultimately more power-efficient and more native to simply use Tasker.
Telemetry (time, GPS coordinates, GPS accuracy and battery state) was always uploaded to a Google Sheet, a simple and accessible solution. (TODO: put instructions for doing this in GitHub) In deployment 2 we collected 23 hours of audio using the smartphone's mic, and in deployment 3 we collected an hour of photos using the selfie camera, which was less obstructed by the scaffold due to its location closer to the smartphone's bezel, and did not have a broken lens. Recordings and photos were downloaded upon retrieval of the MiB. These can, in principle, be synchronized to shore while the MiB is still at sea, at some additional cost. For connectivity we used a Google Fi data-only SIM. For security reasons, the smartphone in the MiB was not signed into to my personal Google account, but rather into a dummy one, so that if it were retrieved by someone other than us, it would not open a giant security hole in my digital life. The SIM, on the other hand, had to be activated by my personal Google Fi account for the billing to work. I activated it in a device connected to my account, and then moved it to the MiB smartphone, where the dummy account was logged in. So, it seems that, as of this writing in fall 2019, a Google Fi data-only SIM is a decent option for field IoT projects, with very little added effort required.
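For the curious, the upload amounts to POSTing one row per wake-up. A sketch of the equivalent in plain Python follows; the Apps Script web-app URL and field names here are hypothetical placeholders, not our actual endpoint, and on the MiB itself Tasker's HTTP action does this job:

```python
import json
import urllib.request

# Hypothetical Google Apps Script "web app" URL whose doPost() appends each
# POSTed JSON object as a row to the sheet; the real setup will go in the repo.
SHEET_URL = "https://script.google.com/macros/s/EXAMPLE_DEPLOYMENT_ID/exec"

def telemetry_row(timestamp, lat, lon, accuracy_m, battery_pct):
    """Bundle one telemetry sample into the row we upload."""
    return {"time": timestamp, "lat": lat, "lon": lon,
            "gps_acc_m": accuracy_m, "battery_pct": battery_pct}

def post_row(row, url=SHEET_URL, timeout_s=30):
    """POST the row as JSON; returns True on HTTP 200."""
    req = urllib.request.Request(
        url,
        data=json.dumps(row).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout_s) as resp:
        return resp.status == 200
```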
Launches were performed from a powerboat and a sailboat with equal success. Retrievals were done from a powerboat, a sailboat and shore -- the MiB beached on sand at the end of its second deployment and was retrieved an hour later.
Neither the audio nor the photos we recorded were anything special. The MiB has a lot of self-noise: the sound of water slapping against the enclosure as it rocks back and forth. We captured some airplanes, a boat or two, and a handful of larger crash-splashes that could have been the MiB being completely engulfed, or a marine mammal surfacing nearby. The audio could be analyzed to get wave spectra, but that's about the extent of its usefulness, and presumably the accelerometer data would be more precise for that task. The photo series was too short to really show anything. There was glare from the enclosure, and later there were focusing issues as condensation formed inside it. (That deployment had condensation problems due to another hardware change: more foam, added to tension the phone-battery-light sandwich more firmly inside the enclosure, which led to a bad seal along the gasket.) The photos of the water showed a couple of different hues of blue, which could have been a real effect or an artifact of the orientation of the MiB at the time, which we did not record.
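For completeness, the wave-spectrum idea is just an averaged periodogram of the motion record, whether that record is the audio envelope or, better, vertical acceleration. A sketch using only NumPy is below; the segment length and windowing choices are mine, this is the standard Welch-style estimate rather than code we actually ran on MiB data:

```python
import numpy as np

def wave_spectrum(signal, fs, nseg=1024):
    """Welch-style power spectral density estimate: Hann-windowed,
    non-overlapping segments, periodograms averaged.
    Returns (freqs_hz, psd) for a 1-D signal sampled at fs Hz."""
    win = np.hanning(nseg)
    nsegs = len(signal) // nseg
    psd = np.zeros(nseg // 2 + 1)
    for i in range(nsegs):
        seg = signal[i * nseg:(i + 1) * nseg] * win
        psd += np.abs(np.fft.rfft(seg)) ** 2
    psd /= nsegs * fs * np.sum(win ** 2)      # normalize to a density
    freqs = np.fft.rfftfreq(nseg, d=1.0 / fs)
    return freqs, psd
```

Ocean swell lives roughly in the 0.05-0.5 Hz band, so even a modest accelerometer sampling rate would resolve it comfortably.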
The final task left to wrap up phase I is to assemble all the digital artifacts (instructions, code, the BoM, and the telemetry) in GitHub. After that I have many ideas for what to tackle first and what to tackle next, but it might be nice to spend some time finding a tiny bit of funding for this. Stay tuned, and let me know if you have any comments or ideas.
1 Comment
Greg
10/17/2019 06:00:26 pm
What a wonderful piece of research. I don't know anything about Android, but am curious about the Python CPU problems... were these problems specifically with QPython and Kivy, in that they don't properly sleep? Worth trying a Ruby-based platform? Admittedly this is off-topic from the main thrust of your work, which is fascinating BTW.