MaSCOT Report: Part I
I need to write a final report covering the MaSCOT project's work during 2016. I thought drafting the report as a series of blog posts would kill two birds with one stone…
This is an introduction to the Marinized Stereo Camera Operational Testbed (MaSCOT), an internal R&D project supported by the APL Science and Engineering Group (SEG). The project had multiple goals:
- to procure one or more stereo vision cameras and package them for shallow-water underwater testing;
- to package any sort of machine vision camera at all for use underwater (which is mostly redundant with the first goal); and
- to support my serious engagement with at least one open source SLAM package – there is such a diversity of code available, I felt it was better to incrementally improve an existing codebase than to start from scratch.
Besides hardware costs, the project supported a small amount of time for a mechanical engineer at APL to design and construct the camera housing. My time was supported under the APL postdoc program.
The original project plan called for purchasing two stereo cameras: a StereoLabs Zed and an Elphel 393 two-camera configuration. It rapidly became apparent that I wouldn’t have the bandwidth to adequately deal with two radically different cameras. As the out-of-the-box experience with the Zed camera was very appealing, I dedicated my energies towards the Zed.
The system-level design was driven by two factors: first, the Zed is a USB 3 camera and its software is CUDA-accelerated, which necessitates a USB 3 host with CUDA cores in close proximity; second, we had an existing 50-foot underwater gigabit ethernet cable terminated in a Subconn DIL8M, originally purchased for use with a Sexton underwater IP camera. Reusing this cable would save a significant up-front cost.
[I know there are a variety of USB3 extenders available, including USB3 over fiber optics, though you have to consider that I would need to move the USB3 not just over a distance, but through a subsea connector, then over a distance underwater. The cost of fiber-based penetrations and cabling would have sunk the project. Putting a computer in proximity to the camera had two positive side effects: first, it justified buying a Jetson board, and second it was a positive step towards running SLAM at the camera itself as in an untethered or autonomous robot.]
The USB3+CUDA requirement led me to the NVidia Jetson TX1, which offers a quad-core 64-bit ARM processor and 256 CUDA cores on a single module. Moreover, the Jetson is one of Stereolabs’ officially supported platforms, albeit at reduced performance relative to a desktop PC.
I completed the core system with a SATA SSD (the Jetson development kit motherboard has an M.2 slot but it is the relatively obscure E keying rather than the B and M keyings used on most consumer SSDs). I also purchased two Blue Robotics 1500 lumen LED lights.
The lights consume a nominal 15W each; two lights plus a conservative 10W for the Jetson gives 40W total. Given the goal of reusing the existing gigabit cabling, I focused on Power over Ethernet. “Standard” PoE (15.4 watts) and PoE+ (25.5 watts) weren’t up to the task, but a variety of vendors make non-standard PoE systems delivering up to 60W. The remaining problem was finding a high-power PoE “splitter” to take the PoE’ed ethernet and extract the DC power from it. Ultimately I purchased a matching injector and splitter from Planet PoE (via Amazon).
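The power budget above can be sketched as a quick sanity check. This is just the back-of-the-envelope arithmetic from the text; the wattages are the nominal figures quoted, and the PoE limits are the headline numbers for the two standards:

```python
# Back-of-the-envelope power budget for the MaSCOT bottle.
LIGHT_W = 15        # per Blue Robotics light (nominal)
NUM_LIGHTS = 2
JETSON_W = 10       # conservative allowance for the Jetson TX1 + SSD

total_w = LIGHT_W * NUM_LIGHTS + JETSON_W

POE_W = 15.4        # "standard" PoE (802.3af)
POE_PLUS_W = 25.5   # PoE+ (802.3at)

print(f"Total load: {total_w} W")
print(f"Fits standard PoE? {total_w <= POE_W}")
print(f"Fits PoE+?         {total_w <= POE_PLUS_W}")
```

Neither standard covers the 40W load, which is what pushed the design to a non-standard high-power (60W) injector/splitter pair.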