Category Archives: Video

Dyson Released its New Vacuum Cleaner Robot: 360 Eye

This state-of-the-art vacuum cleaner robot, the 360 Eye, was released by Dyson a few months ago. According to the official website, the robot uses a V-SLAM technique, which dramatically increases the computational load. To handle that load, I think the processor of this robot should be at least a Cortex-A8 running at 1 GHz, or an ARM-DSP SoC such as the TI DaVinci DM64xx. The innovative part that makes this robot unique is its tank-like tread structure and a 360-degree camera for visual navigation:

Figure. 360 Eye Vacuum Robot by Dyson (picture from IEEE Spectrum)

The tank-tread structure makes this robot robust to small-scale obstacles such as carpets and low rising edges on the floor. As for the 360-degree camera, I have to say it is a genius concept. A full-view camera provides more high-contrast features and corners for the robot to navigate by, while also reducing the chance that its view is blocked by a large obstacle such as a person.

There is no doubt that this is the most advanced vacuum cleaner robot so far, and only the newest generation of microprocessors makes such an aggressive design possible. The only problem is that the cost of this robot is also considerable: its selling price is more than 900 dollars.

The last thing I am wondering about is: what happens if this robot encounters a mirror?

(For Chinese users: please visit Youku to see the video: )

AR.Drone Position Servoing and Visual Tracking

A demonstration of my Master's Thesis, Visual-Based Localization and Tracking of a UGV with a Quadcopter. In this project, a visual tracking framework is designed to track a UGV with an AR.Drone quadcopter from Parrot. The system uses centralized control from a ground station running ROS on Ubuntu 12.04 LTS.

The first two experiments were conducted with the support of a global vision system built from a low-cost web camera, while in the last experiment the quadcopter relies only on IMU data for navigation. Images were captured from the bottom camera of the AR.Drone and processed with OpenCV. Four PID controllers were designed to control the motion of the quadcopter so that it can hold a position or track a trajectory.
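The per-axis control described above can be sketched as a textbook PID loop. This is a minimal illustration, not the thesis's actual controller: the gains, time step, and error source (the offset of the UGV in the bottom-camera image) are all assumed for the example.

```python
# Minimal sketch of one axis of PID position control. In the real system,
# the ground station would compute the position error from the AR.Drone's
# bottom-camera image; here the error is just a number we feed in.
# Gains and names below are illustrative, not the thesis's actual values.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        """Return a control command (e.g. a tilt angle) for this error."""
        self.integral += error * self.dt          # accumulate error over time
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Position hold on one axis: the error is the UGV marker's offset from
# the image center; the output drives the quadcopter's pitch (or roll).
pid_x = PID(kp=0.4, ki=0.05, kd=0.1, dt=0.05)
cmd = pid_x.update(error=1.2)
```

Running four such controllers in parallel (x, y, altitude, yaw) at the camera frame rate gives the position-hold behavior; feeding a moving setpoint instead of a fixed one turns the same loop into trajectory tracking.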

The next step is to use such a robot system for factory and infrastructure inspection. But since I had to return my quadcopter to the department, it is hard for me to pursue this idea at the moment. I hope I can find the chance to get another AR.Drone soon.


First Impressions of the Leap Motion – Cut the Rope



Maker Faire Bay Area 2012, San Francisco

Not long ago I attended the Shenzhen Maker Faire, arguably the first of China's maker events, though it was still quite limited in content and scale. In comparison, the Maker Faire in the San Francisco Bay Area is like another dream world.