I enjoy turning my wild ideas into reality! From 2018 to 2020, I was the president of 'PKU Makerspace', an organization for 'makers' and 'geeks' to create things in the GeekLab of Peking University. I often build hardware and software systems from the ground up, such as robots, wearable devices, IoT devices, and software applications. Fortunately, most of my work has been invited for exhibition by well-known technology companies or has won awards in competitions.
Developed a quadruped robot dog from the ground up, including the 3D-printed body, the hardware, and ROS-based software.
Implemented a computer vision algorithm that recognizes and obeys traffic signs while the robot dog walks along a road (a rough sketch of the idea follows below).
Exhibited at the PKU International Culture Festival
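For illustration only, here is a minimal Python/OpenCV sketch of the kind of traffic-sign logic described above; the color thresholds, the `detect_stop_sign`/`gait_command` names, and the stop/walk commands are assumptions for the example, not the project's actual code.

```python
# Hypothetical sketch: color-threshold-based stop-sign detection with OpenCV.
import cv2
import numpy as np

def detect_stop_sign(frame_bgr: np.ndarray, min_area: int = 500) -> bool:
    """Return True if a large red region (a crude stop-sign cue) is visible."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two hue ranges.
    lower = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
    upper = cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    mask = cv2.bitwise_or(lower, upper)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return any(cv2.contourArea(c) > min_area for c in contours)

def gait_command(frame_bgr: np.ndarray) -> str:
    """Map the detection result to a walking command for the robot dog."""
    return "stop" if detect_stop_sign(frame_bgr) else "walk"
```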
UbiTouch is a novel interaction system that combines a depth camera with projection.
The Linux desktop is projected onto an ordinary surface in everyday life (such as a wall or the ground),
where users can use a finger as the mouse cursor to interact with it (move, click, double-click, etc.); the touch-detection idea is sketched below.
Invited by ARM China for exhibition
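A hedged sketch of the touch-detection idea, assuming a background depth map of the surface captured at calibration and a camera-to-desktop homography; the thresholds and the `fingertip_to_cursor` helper are illustrative, not UbiTouch's real pipeline.

```python
# Minimal sketch (assumed details): find the point closest to the projection
# surface in a depth frame, map it to desktop pixels, and decide if it "touches".
import numpy as np

TOUCH_MM = 15  # assumed: finger within 15 mm of the surface counts as a touch

def fingertip_to_cursor(depth_mm: np.ndarray,
                        surface_mm: np.ndarray,
                        homography: np.ndarray):
    """Return (x, y, is_touching) in desktop pixels, or None if no finger."""
    diff = surface_mm.astype(np.int32) - depth_mm.astype(np.int32)
    candidates = np.argwhere((diff > 5) & (diff < 100))  # points above the plane
    if candidates.size == 0:
        return None
    v, u = candidates[diff[candidates[:, 0], candidates[:, 1]].argmin()]
    p = homography @ np.array([u, v, 1.0])   # camera pixel -> desktop pixel
    x, y = p[:2] / p[2]
    return int(x), int(y), bool(diff[v, u] < TOUCH_MM)
```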
We built a six-legged (hexapod) robot from the ground up.
We implemented WiFi-based communication for remote web control and
designed an acoustic processing algorithm that lets the robot dance to the changing tone of the music (see the sketch below).
Invited to exhibit at Baidu ABC Summit 2019
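As a rough illustration of the acoustic processing, the sketch below estimates the dominant tone of an audio chunk with an FFT and maps pitch bands to dance moves; the move names and frequency bands are made up for the example, not the robot's firmware.

```python
# Rough sketch: dominant-tone estimation and a pitch-to-move lookup.
import numpy as np

def dominant_freq(chunk: np.ndarray, sample_rate: int = 44100) -> float:
    """Return the strongest frequency (Hz) in one mono audio chunk."""
    spectrum = np.abs(np.fft.rfft(chunk * np.hanning(len(chunk))))
    freqs = np.fft.rfftfreq(len(chunk), d=1.0 / sample_rate)
    return float(freqs[spectrum.argmax()])

def pick_move(freq_hz: float) -> str:
    """Map the tone to one of a few canned gaits (names are hypothetical)."""
    if freq_hz < 200:
        return "sway_low"
    if freq_hz < 800:
        return "step_side"
    return "wave_legs"
```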
Developed a wearable virtual musical "instrument" that detects the bending of the fingers,
enabling people to play the piano or other instruments cooperatively in the air.
WiFi-based communication and multithreaded programming enable multi-user access and cooperative playing (illustrated below).
Invited to exhibit at Baidu ABC Summit 2019
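A minimal sketch of how the multi-user part could look, assuming each glove streams comma-separated finger-bend values over TCP; the port, threshold, and note mapping are assumptions, not the deployed protocol.

```python
# Hedged sketch: one thread per connected glove, bends above a threshold
# become note-on events, which is what allows cooperative multi-user playing.
import socketserver

BEND_THRESHOLD = 0.6                      # assumed normalized bend value
NOTES = ["C4", "D4", "E4", "F4", "G4"]    # one note per finger

class GloveHandler(socketserver.StreamRequestHandler):
    def handle(self):
        for line in self.rfile:                        # e.g. "0.1,0.8,0.2,0.0,0.9\n"
            bends = [float(x) for x in line.decode().strip().split(",")]
            pressed = [n for n, b in zip(NOTES, bends) if b > BEND_THRESHOLD]
            if pressed:
                print(f"{self.client_address}: play {pressed}")

if __name__ == "__main__":
    with socketserver.ThreadingTCPServer(("0.0.0.0", 9000), GloveHandler) as srv:
        srv.serve_forever()
```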
Developed an interactive robot hand that uses computer vision to recognize human actions and respond accordingly.
The interactions realized so far include rock-paper-scissors, handshaking, and hip-hop (rapper) moves; the response logic is sketched below.
Exhibited at the Central Academy of Fine Arts (CAFA)
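Purely illustrative sketch of the rock-paper-scissors response logic, assuming a separate vision classifier already returns "rock", "paper", or "scissors"; the pose table and servo angles are hypothetical.

```python
# Sketch: choose the winning hand pose and express it as per-finger servo targets.
BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}

# Hypothetical servo angles for thumb..pinky: 0 = curled, 90 = extended.
POSES = {
    "rock":     [0, 0, 0, 0, 0],
    "paper":    [90, 90, 90, 90, 90],
    "scissors": [0, 90, 90, 0, 0],
}

def respond(human_gesture: str) -> list[int]:
    """Return servo targets so the robot hand plays the winning move."""
    return POSES[BEATS[human_gesture]]

if __name__ == "__main__":
    print(respond("rock"))  # -> paper pose
```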
GeneMirror reflects a virtual image of the viewer as if they carried a genetic disease (such as ichthyosis or chondropathy), depending on the given gene sequence.
We use a depth camera and image processing algorithms for human extraction, denoising, body deformation, and filtering (a minimal extraction sketch follows below).
First Prize in the Global Youth Artificial Intelligence and Robotics Competition
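A minimal sketch of the human-extraction and denoising steps, assuming an aligned color/depth pair from the depth camera; the distance range and kernel sizes are illustrative, and the deformation and filtering stages are omitted.

```python
# Sketch: segment the viewer by depth, clean the mask, keep only their pixels.
import cv2
import numpy as np

def person_mask(depth_mm: np.ndarray, near: int = 500, far: int = 2500) -> np.ndarray:
    """Binary mask of pixels within the expected viewer distance range."""
    mask = ((depth_mm > near) & (depth_mm < far)).astype(np.uint8) * 255
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # drop speckle noise
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill small holes
    return mask

def extract_person(color_bgr: np.ndarray, depth_mm: np.ndarray) -> np.ndarray:
    """Keep only the viewer's pixels; the background goes black."""
    return cv2.bitwise_and(color_bgr, color_bgr, mask=person_mask(depth_mm))
```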
Designed an electronic glove equipped with an IMU and flexible sensors that recognizes
different gestures using machine learning to remotely control a mobile robot and a robot arm (see the sketch below).
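Hedged sketch of the gesture-recognition idea using scikit-learn; the 11-feature layout (accelerometer and gyro axes plus five flex sensors), the gesture labels, the random placeholder training data, and the command mapping are all assumptions for illustration.

```python
# Sketch: classify glove readings with a small kNN model, map gestures to commands.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Each sample: [ax, ay, az, gx, gy, gz, flex1..flex5] -> 11 features.
X_train = np.random.rand(100, 11)                            # placeholder recordings
y_train = np.random.choice(["fist", "point", "open"], size=100)

clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

COMMANDS = {"fist": "robot_stop", "point": "robot_forward", "open": "arm_release"}

def gesture_to_command(sample: np.ndarray) -> str:
    """Predict the gesture for one 11-dim reading and return its command."""
    gesture = clf.predict(sample.reshape(1, -1))[0]
    return COMMANDS[gesture]
```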