
[Figure: Components of the 3D Vision-Driven Robot being developed by Professor Liu’s team]

The team is aiming to give these abilities a major boost. “In the current project, we are also getting the robot to accurately estimate the pose of the object in front of it and to calculate the relative positions of the object and the robotic arm in three-dimensional space, so that the robot can plan the motion of its arm to grasp randomly placed target objects from suitable angles. This way, robotic arms will provide greater flexibility than before.” (The coordinate arithmetic behind this step is sketched at the end of this section.)

Artificial intelligence models currently on the market mostly focus on collecting and organising various corpora and feeding them into neural network inference models for the computer to learn from, she notes. They rely primarily on language model-based machine learning techniques to accomplish natural language processing effectively.

“However, these AI systems are generally believed to have a limited understanding of the physical world. For instance, when we need a robot to grasp an object, where should it place the object afterwards? We need to help the robot develop a thorough understanding of the real world, collecting data from warehouses and actual scenarios in the service industry in order to build and adapt large AI models.”

In preparing for the RAISe+ application, Professor Liu says, everything built on the team’s collective efforts over the years. “You need to accumulate a lot of experience and knowledge over a long period of time and then identify suitable applications. It’s equally important to develop what we call ‘deep tech’, where your technology is competitive and innovative,” Professor Liu remarks. “Lastly, bring in industry partners.”

The project team will collaborate closely with some of these partners to drive the commercialisation of three-dimensional, vision-driven robots in various fields. Companies like China Resources and Wuling Motors are on board to provide different service, industrial and construction scenarios for the team to explore.
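Professor Liu’s description of grasping turns on one concrete step: expressing the object’s estimated pose and the robotic arm in a common three-dimensional frame. The sketch below is a minimal illustration of that step using homogeneous transforms, not the team’s actual pipeline; the function, frame names and numerical values are all illustrative assumptions.

    import numpy as np

    def pose_to_matrix(rotation, translation):
        # Pack a 3x3 rotation matrix and a 3-vector translation
        # into a 4x4 homogeneous transform.
        T = np.eye(4)
        T[:3, :3] = rotation
        T[:3, 3] = translation
        return T

    # Camera pose in the arm's base frame, normally obtained from
    # hand-eye calibration (values are illustrative placeholders).
    T_base_camera = pose_to_matrix(np.eye(3), np.array([0.30, 0.00, 0.50]))

    # Object pose as estimated by the 3D vision model: here, an object
    # 0.8 m straight ahead of the camera (again, a made-up value).
    T_camera_object = pose_to_matrix(np.eye(3), np.array([0.00, 0.00, 0.80]))

    # Chaining the two transforms expresses the object's pose in the
    # base frame, giving the motion planner a grasp target for the arm.
    T_base_object = T_base_camera @ T_camera_object

    grasp_position = T_base_object[:3, 3]      # where the gripper should move
    grasp_orientation = T_base_object[:3, :3]  # how it should be oriented
    print("Grasp target in base frame:", grasp_position)

In a real system, the rotations would come from the pose estimator and the calibration rather than identity matrices, and the resulting target pose would feed a motion planner that selects a suitable approach angle for the grasp.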

RkJQdWJsaXNoZXIy NDE2NjYz