Our smart car platform is the ZRobot, provided by Digilent. Its main control component is a ZedBoard (Xilinx Zynq-7000 SoC, combining FPGA fabric with a dual-core ARM Cortex-A9). The board connects to many extensions, such as a camera, a wireless router, ultrasonic detectors, and wheel encoders.
This platform allows us to develop both hardware and software, giving us a deeper understanding of embedded systems, operating systems, wireless communication, robotics, and control theory.
A Linux system runs on the car and controls it; we can also program the board bare-metal, over cable, with the Xilinx toolchain. The hardware modules and driver code for wheel driving (PWM), the ultrasonic sensors, Bluetooth, and the wheel encoders have already been developed. These functionalities allow the smart car either to run autonomously, avoiding obstacles, or to be controlled from computers and smartphones.
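As a rough illustration of how the PWM wheel driver can be exercised from Linux user space, the sketch below assumes the standard Linux sysfs PWM interface; the channel path and the 20 ms period are illustrative assumptions, since the actual driver on the ZedBoard may expose a different interface.

```python
# Sketch: driving a wheel motor via the Linux sysfs PWM interface.
# The sysfs path and the 50 Hz period are assumed for illustration.

PERIOD_NS = 20_000_000  # 20 ms PWM period (50 Hz), an assumed value

def duty_ns(speed_percent: float) -> int:
    """Map a speed in [0, 100] percent to a duty cycle in nanoseconds."""
    if not 0.0 <= speed_percent <= 100.0:
        raise ValueError("speed must be between 0 and 100 percent")
    return int(PERIOD_NS * speed_percent / 100.0)

def set_wheel_speed(channel_dir: str, speed_percent: float) -> None:
    """Write period and duty cycle to a sysfs PWM channel directory."""
    with open(f"{channel_dir}/period", "w") as f:
        f.write(str(PERIOD_NS))
    with open(f"{channel_dir}/duty_cycle", "w") as f:
        f.write(str(duty_ns(speed_percent)))

# e.g. set_wheel_speed("/sys/class/pwm/pwmchip0/pwm0", 75.0)
```

The duty-cycle helper is what a smartphone or PC controller would ultimately drive when commanding the car remotely.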
Currently, we are doing upper-level system design on this platform. One project is to automatically gather indoor WiFi fingerprints for use in indoor localization. This resembles the SLAM problem, but we must also design a cooperation scheme for multiple robots to scan the entire indoor area. Minimizing localization drift is our biggest challenge, since the sensors on the smart car are noisy. We plan to reduce the localization error with probabilistic filtering techniques such as the Kalman filter and the particle filter.
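To make the filtering idea concrete, here is a minimal one-dimensional Kalman filter sketch for the car's position along a corridor, fusing wheel-encoder odometry (prediction) with an ultrasonic range reading (update). The noise variances and measurements are illustrative assumptions, not tuned values from our system.

```python
# Minimal 1D Kalman filter: fuse encoder odometry with ultrasonic ranging.
# Process/measurement noise variances q and r are assumed for illustration.

def kalman_step(x, p, u, z, q=0.05, r=0.2):
    """One predict/update cycle.
    x, p : prior position estimate and its variance
    u    : displacement reported by the wheel encoders since last step
    z    : position implied by the ultrasonic range reading
    """
    # Predict: dead-reckon with the encoder displacement; uncertainty grows.
    x_pred = x + u
    p_pred = p + q
    # Update: blend in the ultrasonic measurement; uncertainty shrinks.
    k = p_pred / (p_pred + r)           # Kalman gain
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0                          # uncertain initial position
for u, z in [(0.10, 0.12), (0.10, 0.21), (0.10, 0.33)]:
    x, p = kalman_step(x, p, u, z)
# After a few steps the variance p is far below the initial 1.0.
```

The same predict/update structure extends to 2D pose estimation, and the particle filter replaces the Gaussian belief with a sampled one when the noise is non-Gaussian.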
Another system operates at the networking level: building a mobile wireless network of smart cars. We want to connect the cars with ad hoc networking to support various communication protocols and increase throughput, and to extend the network interface on the ZedBoard to enable multicasting among several cars. A networked group of smart cars can then perform intelligent tasks such as football competitions and house monitoring.
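One simple realization of group communication is standard IPv4 UDP multicast, sketched below. The group address, port, and the (id, x, y) status-message layout are illustrative choices, not a fixed protocol of ours.

```python
# Sketch: a multicast status channel among smart cars over UDP.
# Group address, port, and message layout are assumptions for illustration.
import socket
import struct

GROUP, PORT = "239.1.1.1", 5007  # assumed administratively scoped group

def pack_status(car_id: int, x: float, y: float) -> bytes:
    """Encode one car's id and 2D position as a fixed 12-byte message."""
    return struct.pack("!Iff", car_id, x, y)

def unpack_status(data: bytes):
    """Decode a status message back to (car_id, x, y)."""
    return struct.unpack("!Iff", data)

def make_sender() -> socket.socket:
    """UDP socket configured to send to the multicast group."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
    return s

# e.g. make_sender().sendto(pack_status(3, 1.5, -0.25), (GROUP, PORT))
```

Each car would periodically send its status to the group and listen for the others', giving every member a shared picture of the team without any central coordinator.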
We chose the Parrot AR.Drone 2.0 as our research platform. The drone runs Linux with BusyBox as its operating system. It has a 720p front camera and a 480p vertical camera, plus integrated sensors such as an accelerometer, gyroscope, magnetometer, and pressure sensor.
The drone supports WiFi, so communication links can be set up between it and other devices: smartphones, laptops, remote servers, and even other drones and smart cars. This lets us design complex working scenarios for the drone. By extending its wireless communication methods, for instance using a bundled smartphone to relay data transmission as shown in the photo, we can realize large-scale communication among diverse devices.
We can interact with the on-drone Linux terminal through telnet, and we can run our own programs on the drone. By analyzing the on-board sensor data and the two cameras' video streams, we can implement autopilot schemes for various uses.
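Besides logging in over telnet, the drone can be piloted by sending ASCII "AT" commands over UDP, following the convention in Parrot's SDK documentation in which float arguments are transmitted as the signed 32-bit integer sharing the same IEEE-754 bit pattern. The sketch below builds such commands; the address and port follow the SDK's documented defaults, and the helper names are our own.

```python
# Sketch: building AR.Drone 2.0 piloting ("AT") commands.
# Floats are sent as the int32 with the same IEEE-754 float32 bits,
# per the convention described in Parrot's SDK documentation.
import struct

def f2i(value: float) -> int:
    """Reinterpret a 32-bit float's bits as a signed 32-bit integer."""
    return struct.unpack("<i", struct.pack("<f", value))[0]

def at_pcmd(seq: int, roll: float, pitch: float, gaz: float, yaw: float) -> bytes:
    """Progressive movement command; arguments are fractions in [-1, 1]."""
    args = ",".join(str(f2i(v)) for v in (roll, pitch, gaz, yaw))
    return f"AT*PCMD={seq},1,{args}\r".encode("ascii")

# e.g. sock.sendto(at_pcmd(1, 0.0, -0.5, 0.0, 0.0), ("192.168.1.1", 5556))
```

An autopilot loop on a laptop (or on the drone itself) would read the sensor stream, decide on a motion, and emit one such command per control tick with an increasing sequence number.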
Computer vision techniques can be applied to the drone's video streams. We are using the two video streams together with sensor data to achieve simultaneous localization and mapping, and with person recognition we are developing a new indoor navigation method.
Currently we are using the drone to verify our ideas on 3D indoor localization. We are also studying drone-car networks, and we are building a demo that uses the drone as a personal assistant.