Prof. Hiroshi Ishiguro

Hiroshi Ishiguro (石黒浩 Ishiguro Hiroshi) is director of the Intelligent Robotics Laboratory, part of the Department of Systems Innovation in the Graduate School of Engineering Science at Osaka University, Japan. A notable development of the laboratory is the Actroid, a humanoid robot with a lifelike appearance and behaviour, such as facial movements. In robot development, Ishiguro concentrates on making robots that are as similar as possible to a live human being. At the July 2005 unveiling of the gynoid Repliee Q1Expo (gynoid is the term for a female android, from the ancient Greek gynē, "woman"), he was quoted as saying, "I have developed many robots before, but I soon realised the importance of its appearance. A human-like appearance gives a robot a strong feeling of presence. ... Repliee Q1Expo can interact with people. It can respond to people touching it. It's very satisfying, although we obviously have a long way to go yet." In his opinion, it may be possible to build an android that is indistinguishable from a human, at least during a brief encounter.

Ishiguro has made an android that resembles him, called the Geminoid. The Geminoid was among the robots featured by James May in his 5 October 2008 BBC2 documentary on robots, Man-Machine, part of May's series Big Ideas. He has also introduced a telecommunication robot called the Telenoid R1. Ishiguro uses the android to teach his classes at Osaka University and likes to scare his students by having the Geminoid perform human-like movements such as blinking, "breathing" and fidgeting with its hands.

In 2011, Asian Scientist Magazine listed Ishiguro as one of its 15 Asian Scientists to Watch. In 2018, he was interviewed interacting with one of his robots for the artificial-intelligence documentary Do You Trust This Computer?.
Title: Coming Soon...


Abstract: Coming Soon...

Intelligent Robotics Laboratory, Osaka University


A Perceptual Information Infrastructure monitors and recognizes the real environment through sensor networks. The sensor networks track people in real time and recognize human behaviors, providing rich information for understanding real-world events and helping people and robots work in the real world.
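As a toy illustration of this kind of tracking (a minimal sketch, not the laboratory's actual system), the loop below fuses person detections pooled from networked sensors into persistent tracks using greedy nearest-neighbor association; the data format, gating distance, and coordinates are all assumptions made for the example.

```python
import math

GATE = 1.0  # maximum association distance in meters (assumed)

class Track:
    _next_id = 0
    def __init__(self, x, y):
        self.id = Track._next_id
        Track._next_id += 1
        self.x, self.y = x, y

def associate(tracks, detections):
    """Greedily match each existing track to its nearest detection within GATE."""
    unmatched = list(detections)
    for track in tracks:
        if not unmatched:
            break
        best = min(unmatched, key=lambda d: math.dist((track.x, track.y), d))
        if math.dist((track.x, track.y), best) < GATE:
            track.x, track.y = best            # update the track in place
            unmatched.remove(best)
    return [Track(*d) for d in unmatched]      # unmatched detections spawn tracks

tracks = []
# Per frame: (x, y) floor positions pooled from all sensors in the network.
for detections in [[(0.0, 0.0)],
                   [(0.2, 0.1), (5.0, 5.0)],
                   [(0.4, 0.2), (5.1, 4.9)]]:
    tracks += associate(tracks, detections)
    print([(t.id, t.x, t.y) for t in tracks])
```

A real deployment would replace the greedy matching with probabilistic data association and a motion model, but the hypothesize-match-update cycle is the same.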

An Intelligent Robot Infrastructure is an interaction-based infrastructure. By interacting with robots, people can establish nonverbal communication with artificial systems. That is, the purpose of a robot is to exist as a partner and to have valuable interactions with people.

Our objective is to develop technologies for a new generation of information infrastructures based on Computer Vision, Robotics and Artificial Intelligence. Introduction (PDF)

Prof. Davide Scaramuzza

Davide Scaramuzza is a Professor of Robotics and Perception at the University of Zurich, where he does research at the intersection of robotics, computer vision, and machine learning, using standard cameras and event cameras, with the aim of enabling autonomous, agile navigation of micro drones in search-and-rescue applications. For his research contributions he has won prestigious awards, including a European Research Council (ERC) Consolidator Grant, the IEEE Robotics and Automation Society Early Career Award, a Google Research Award, and two Qualcomm Innovation Fellowships. In 2015 he co-founded Zurich-Eye, today Facebook Zurich, which developed the visual-inertial SLAM system running in Oculus Quest VR headsets. He was also a strategic advisor to Dacuda, today Magic Leap Zurich. Many aspects of his research have been prominently featured in the wider media, including The New York Times, BBC News, and the Discovery Channel.
Title: Autonomous, Agile Micro Drones: Perception, Learning, and Control


Abstract: Autonomous quadrotors will soon play a major role in search-and-rescue, delivery, and inspection missions, where a fast response is crucial. However, their speed and maneuverability are still far from those of birds and human pilots. High speed is particularly important: since drone battery life is usually limited to 20-30 minutes, drones need to fly faster to cover longer distances. To do so, however, they need faster sensors and algorithms. Human pilots take years to learn the skills to navigate drones. What does it take to make drones navigate as well as, or even better than, human pilots? Autonomous, agile navigation through unknown, GPS-denied environments poses several challenges for robotics research in terms of perception, planning, learning, and control. In this talk, I will show how combining model-based and machine-learning methods with the power of new, low-latency sensors, such as event cameras, can allow drones to achieve unprecedented speed and robustness by relying solely on onboard computing.
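As a concrete illustration of the low-latency sensing mentioned in the abstract (a minimal sketch, not the speaker's actual pipeline): an event camera reports asynchronous, per-pixel brightness changes as signed events, and summing their polarities over a short window yields an image-like "event frame" that model-based or learned estimators can consume. The resolution, window length, and event values below are toy assumptions.

```python
import numpy as np

H, W = 4, 6        # sensor resolution (toy values)
WINDOW = 0.010     # accumulation window in seconds (assumed)

# Toy event stream: (x, y, timestamp_s, polarity in {-1, +1}).
events = [(1, 2, 0.001, +1), (1, 2, 0.004, +1), (4, 0, 0.006, -1),
          (2, 3, 0.009, +1), (5, 1, 0.015, -1)]  # last event falls outside window

frame = np.zeros((H, W), dtype=np.int32)
for x, y, t, p in events:
    if t < WINDOW:
        frame[y, x] += p   # accumulate signed polarity per pixel

print(frame)
```

Because each event arrives with microsecond latency instead of at a fixed frame rate, representations like this can be refreshed far faster than conventional camera images, which is what makes them attractive for agile flight.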

Contributions


Davide Scaramuzza is best known for:

(1) pioneering contributions to learning agile vision-based flight. video, paper, research.
(2) pioneering contributions to event-camera-based algorithms for mobile robots. 2014, 2017. Link to the research page.
(3) pioneering contributions to visual-inertial-SLAM-based autonomous navigation of micro drones. ICRA'10 paper.
(4) inventing the 1-point RANSAC algorithm, an effective and computationally efficient (roughly 1000 times faster) reduction of the standard 5-point RANSAC for visual odometry when the vehicle motion is non-holonomic. IJCV'11 paper. A sketch of the idea follows this list.
(5) authoring the Omnidirectional Camera Calibration Toolbox for MATLAB (OCamCalib), used at many companies (e.g., NASA, Philips, Bosch, Daimler). LINK. The toolbox is also part of the MATLAB Computer Vision Toolbox; a sketch of its camera model appears below as well.
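For intuition on item (4), here is a minimal, hypothetical sketch of the 1-point RANSAC idea: under locally planar, circular motion (the non-holonomic constraint), the relative pose has a single degree of freedom θ, so every single correspondence yields a complete motion hypothesis. The axis convention and closed form below are one consistent parameterization chosen for illustration, not a line-by-line reproduction of the IJCV'11 paper.

```python
import math

def theta_from_match(p1, p2):
    """One motion hypothesis from ONE correspondence (normalized (u, v) coords).

    Assumes locally planar circular motion about the camera's y-axis, so the
    essential matrix reduces to a single parameter t:
        E(t) = [[0, -cos(t/2), 0], [cos(t/2), 0, sin(t/2)], [0, sin(t/2), 0]]
    Solving p2^T E(t) p1 = 0 for t gives the closed form below.
    """
    (u1, v1), (u2, v2) = p1, p2
    return 2.0 * math.atan2(u2 * v1 - u1 * v2, v1 + v2)

def epipolar_error(theta, p1, p2):
    """Algebraic epipolar error |p2^T E(theta) p1| under the same model."""
    c, s = math.cos(theta / 2.0), math.sin(theta / 2.0)
    (u1, v1), (u2, v2) = p1, p2
    return abs(c * (u1 * v2 - u2 * v1) + s * (v1 + v2))

def one_point_ransac(matches, threshold=1e-3, max_hypotheses=30):
    """Hypothesize-and-verify with one correspondence per hypothesis."""
    best_theta, best_inliers = 0.0, []
    for p1, p2 in matches[:max_hypotheses]:   # each match IS a full hypothesis
        theta = theta_from_match(p1, p2)
        inliers = [m for m in matches
                   if epipolar_error(theta, *m) < threshold]
        if len(inliers) > len(best_inliers):
            best_theta, best_inliers = theta, inliers
    return best_theta, best_inliers
```

Because the model has a single degree of freedom, very few hypotheses suffice (the paper even allows replacing sampling with a histogram vote over all matches), which is the source of the large speedup over 5-point RANSAC.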
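And for item (5), the camera model underlying OCamCalib back-projects a pixel to a viewing ray through a polynomial in the pixel's radial distance from the distortion center. The sketch below uses made-up coefficients (A, CX, CY are placeholders, not a real calibration) and omits the affine stretch/skew correction of the full model.

```python
import numpy as np

# Hypothetical coefficients (NOT from a real calibration); OCamCalib estimates
# them per camera. Model: a centered pixel (x, y) at radius rho back-projects
# to the ray (x, y, f(rho)), where f is a polynomial in rho.
A = [-200.0, 0.0, 1.2e-3, -2.5e-6, 8.0e-9]   # a0..a4, placeholder values
CX, CY = 320.0, 240.0                         # distortion center, placeholder

def cam2world(u, v):
    """Back-project pixel (u, v) to a unit-norm ray in the camera frame."""
    x, y = u - CX, v - CY                     # center the pixel
    rho = float(np.hypot(x, y))
    z = sum(a * rho**k for k, a in enumerate(A))
    ray = np.array([x, y, z])
    return ray / np.linalg.norm(ray)

print(cam2world(400.0, 300.0))                # direction of the viewing ray
```

A single polynomial of this form covers fisheye and catadioptric lenses alike, which is why the model generalizes across omnidirectional cameras.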
