Building a Logistics demo with RC models and indoor navigation
In my current role I am often told I have the best job in the world. I get to play with lots of cool ‘toys’ and connect them to SAP systems.
One of the fun demos I had the chance to work on was the Delivering the Products Demo that was at Leonardo Live in Chicago in 2017.
This is a high-level description of the demo architecture, along with an overview video.
The demo was made up of multiple parts that were shared between the team:
- Factory to create the Product
- Truck to deliver the product to the distribution center
- Drone to take the product from the distribution center to the customer
- Indoor navigation to provide location data from drone and truck
- SAP Analytics Cloud to show real time data from the truck and drone
Marvelmind Indoor Navigation
GPS is a little tricky for indoor navigation. This is where Marvelmind came in really handy. Marvelmind gives you the ability to map a zone and provide GPS or mm coordinates within that zone. This allowed us to track the truck and drone, and also provide geofencing for the truck to be semi-autonomous.
This is a sonar-based system; you can hear the clicks while it’s working.
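As a rough sketch of the geofencing idea, the zone check can be as simple as point-in-rectangle tests against the coordinates Marvelmind reports. The zone names, boundaries, and speed limits below are made up for illustration, not taken from the actual demo:

```javascript
// Hypothetical geofence zones in Marvelmind map coordinates (meters).
// The real demo's zone boundaries are not reproduced here.
const zones = [
  { name: 'factory',      xMin: 0.0, xMax: 1.5, yMin: 0.0, yMax: 1.0, speedLimit: 0.3 },
  { name: 'distribution', xMin: 3.0, xMax: 4.5, yMin: 0.0, yMax: 1.0, speedLimit: 0.3 },
];

// Return the zone containing (x, y), or null for the open track.
function zoneFor(x, y) {
  return zones.find(z => x >= z.xMin && x <= z.xMax && y >= z.yMin && y <= z.yMax) || null;
}
```

On every location update the truck looks up its current zone and applies that zone’s speed limit (or none, on the open track).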
The kit consisted of a number of stationary beacons, mobile beacons (“Hedgehogs”), and a modem.
The drone and the truck were fitted with mobile beacons for tracking, and the stationary beacons were placed around the area.
The dashboard shows the beacons (green) and the truck moving (blue). This allows for calibration and setup of the area.
The modem provides the location data from all the beacons and Hedgehogs; it was attached via a serial port to a gateway.
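The gateway’s job at this point is just to turn the serial stream into location records. The real Marvelmind modem speaks its own binary protocol, which isn’t reproduced here; as an illustrative stand-in, assume newline-delimited `address,x,y,z` records:

```javascript
// Illustrative parser for a hypothetical "address,x,y,z" line format.
// The actual Marvelmind modem uses a binary protocol over the serial port.
function parseBeaconLine(line) {
  const [address, x, y, z] = line.trim().split(',').map(Number);
  // Reject malformed records rather than forwarding garbage upstream.
  if ([address, x, y, z].some(Number.isNaN) || z === undefined) return null;
  return { address, x, y, z };
}
```

Each parsed record can then be matched to the truck or drone by beacon address before being forwarded.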
It’s fun to watch a truck drive itself around a track and interact with a drone, but it’s even more fun to be able to interact with the truck yourself.
We wanted the truck to be semi-autonomous, slowing down in the factory and the distribution center, while still letting anyone attending experience driving it.
Those familiar with RC cars will recognize the standard setup: speed and direction come into a receiver, and the receiver outputs Pulse Width Modulation (PWM) signals to control the truck’s Electronic Speed Controller (ESC) as well as the steering servo.
To combine the data from Marvelmind and the receiver we used an Arduino 101 (unfortunately discontinued; we now use a Feather32) to read the location from the mobile beacon via UART and intercept the PWM signals via an interrupt.
If you are ever building a hardware demo you need to support at a conference, I can highly recommend a pocket oscilloscope.
The PWM signal was then modified based on the truck’s location before being applied to the servos.
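The speed-limiting step can be sketched as scaling the throttle pulse toward neutral. The pulse widths below follow the standard RC convention (roughly 1000–2000 µs with 1500 µs as neutral); the function itself is my own illustration, not the demo’s firmware:

```javascript
const NEUTRAL_US = 1500; // standard RC neutral pulse width, in microseconds

// Scale the driver's throttle pulse toward neutral by `limit` (0..1).
// limit = 1 passes the input through unchanged; limit = 0 forces a stop.
function limitThrottle(pulseUs, limit) {
  return Math.round(NEUTRAL_US + (pulseUs - NEUTRAL_US) * limit);
}
```

So full forward throttle (2000 µs) under a 0.3 limit becomes 1650 µs, and a limit of 0 pins the output at neutral regardless of the driver’s input.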
This is the tractor part of the truck being tested for location before putting it on the ground. The truck’s Hedgehog is being moved a few centimeters, which is enough to trigger a location threshold crossing and change the truck’s speed, in this case to zero.
We added a DHT11 temperature and humidity sensor to the truck to extend the data we were collecting. This gave us speed, RPM, location, temperature, humidity, and vibration data we could push to the cloud for the truck.
When building physical demos like this it helps to have a great space in which to do it. The d-shop in our Newtown Square office was definitely a help. We are lucky to have over 30 of these maker spaces globally. If you are ever in Newtown Square you are more than welcome to visit.
As the truck drove through the factory, a package was loaded onto its flatbed from the conveyor belt. The factory was built using Lego Mindstorms as a simple way to create an autonomous, sensor-driven factory.
This took a lot more Lego than we had; thankfully the SAP Autism at Work program came to our aid. As part of their program they use Mindstorms for design prototyping, and they were kind enough to lend us the parts we needed.
The truck’s position in the factory was detected with EV3 sensors. Once the truck was in position, the box was moved from the conveyor belt onto its flatbed.
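The loading sequence boils down to a small state machine: wait for the truck, run the conveyor, then go back to waiting. A minimal sketch of that logic (the state names and event hooks are mine; the actual factory ran on Mindstorms):

```javascript
// Tiny state machine for the loading sequence. In the real factory the
// events would be driven by EV3 sensor readings and conveyor motor control.
function createLoader() {
  let state = 'WAITING_FOR_TRUCK';
  return {
    get state() { return state; },
    onTruckDetected() {            // EV3 sensor sees the truck in position
      if (state === 'WAITING_FOR_TRUCK') state = 'LOADING';
    },
    onPackagePlaced() {            // conveyor has delivered the box
      if (state === 'LOADING') state = 'WAITING_FOR_TRUCK';
    },
  };
}
```

Guarding each transition on the current state keeps a stray sensor reading (say, a hand passing the sensor) from starting the conveyor at the wrong time.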
Choosing the drone was an interesting challenge, and something I will never forget.
The drone needed to be able to lift and carry an object, release the object, and be safely flown indoors.
The first drone we considered was the 3DR Solo as this model gave us the potential for auto navigation, leveraging a Pixhawk and the Marvelmind navigation system.
Unfortunately we were not able to achieve the stability we needed indoors with this approach. During testing, even though the drone was in a cage, I still jumped behind tables!
Another option we considered was the DJI Spark. The DJI drones come with a controller unit that also utilizes a smartphone, and DJI provides a Mobile SDK (https://developer.dji.com/mobile-sdk/) for interacting with them.
We finally settled on a DJI Mavic Pro. This gave us the required interactions through the SDK along with a drone with enough lift capability to transfer the package from the truck to the delivery area.
We used neodymium magnets of varying strengths on each station to allow the package to be held and released. The underside of the drone also had a carriage with magnets to allow pickup of the package.
SAP Analytics Cloud was used for the dashboard to visualize data collected from both the drone and the truck via SAP Cloud Platform (SCP).
A Node.js service on an x86 gateway was used with Edge Services to retrieve the data. We captured location data from the Marvelmind modem and device data via Bluetooth from the truck, and forwarded both to the cloud via MQTT.
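A stripped-down sketch of that forwarding step is below. The field names and device id are illustrative, and `publish` is a stub standing in for a real MQTT client (the demo used an MQTT client library to publish to SCP):

```javascript
// Merge the latest location fix and Bluetooth telemetry into one message.
// Field names and the device id are illustrative, not the demo's schema.
function buildTruckMessage(location, telemetry) {
  return {
    deviceId: 'truck-1',             // hypothetical device id
    ts: Date.now(),                  // gateway-side timestamp
    x: location.x,
    y: location.y,
    temperature: telemetry.temperature,
    humidity: telemetry.humidity,
    speed: telemetry.speed,
  };
}

// Serialize and hand off to an MQTT-style client (e.g. an mqtt.js client,
// whose publish(topic, payload) call this stub mimics).
function publish(client, topic, message) {
  client.publish(topic, JSON.stringify(message));
}
```

Batching the two data sources into one message per interval keeps the cloud side simple: the dashboard subscribes to a single topic per vehicle.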
When putting together disparate systems on a gateway, Node modules are a time-saving tool. The Bluetooth modules come with platform-specific twists that at the time had issues with OS versions, but we were able to get the gateway functioning in a few hours.
This was definitely a fun project to work on, spanning a wide range of technologies, and it was built together with my colleagues Dan McIntosh (Factory), Kevin Changela (Drone), and Anthony McLeod (SAC).