Osaka University
Videos
  • Offer Profile
  • Our objective is to develop technologies for new-generation information infrastructures based on Computer Vision, Robotics, and Artificial Intelligence.

    Videos 1-6: copyright Prof. Ishiguro, Osaka University, and Kokoro Co., Ltd. - All rights reserved

    Videos 7-14: copyright ATR Intelligent Robotics and Communication Laboratories - All rights reserved
Product Portfolio
  • Android Science: Towards a new methodology of cognitive science

  • Both appearance and behavior are significant issues in the development of humanoid robots. However, designing a robot's appearance, especially a humanlike one, has traditionally been the role of industrial designers. Tackling the problem of appearance and behavior together requires two approaches: one from robotics and one from cognitive science. The robotics approach tries to build very humanlike robots based on knowledge from cognitive science. The cognitive-science approach uses the robot to verify hypotheses for understanding humans. We call this interdisciplinary framework android science.
    • Integration of science and engineering

    • Uncanny valley

    • A fundamental issue in android science is the existence of the uncanny valley. As robots appear more human, they seem more familiar, until a point is reached at which subtle imperfections create a sensation of strangeness or eeriness. An important part of our work is to verify the existence of the uncanny valley and to explore how androids can overcome it.
  • Androids: Repliee Q2 (shown as Repliee Q1expo): 'An adult type android'

  • Repliee Q2 is an upgraded version of Repliee Q1. Its face has become more humanlike, and it has 13 DoFs in the head, so it can make facial expressions and mouth shapes. We showed Repliee Q2 under the name Repliee Q1expo at the World Expo held in Aichi, Japan, in June 2005. In the demonstrations, Repliee Q1expo interacted with people in the role of a TV interviewer. Surrounded by omnidirectional cameras and microphones, and with tactile sensors embedded under the carpet, the android could recognize a person's gestures, voice, and standing position.
    • Repliee Q2

    • Adult android: it has 42 air actuators in the upper torso. Its appearance was realized by making a copy of an existing person.
    • Repliee Q2

    • Degree of freedom:
      • Eyes: 3
      • Face: eyebrows ×1, eyelids ×1, cheeks ×1
      • Mouth: 7
      • Neck: 3
      • Arms: 9x2
      • Fingers: 2x2
      • Torso: 4
    • Repliee Q2

    • Facial expressions of the adult android: 13 of the 42 actuators are used in the head. Humanlike facial expressions are realized by the motion of the eyes and mouth.
  • Repliee Q1 'An adult type android'

  • Natural communication with humans in everyday situations is one of the most important functions of humanoid robots. We believe that humanlike appearance and motion are essential for this, and we call a humanoid robot that has these qualities an "android".

    To perform humanlike motion, an android needs motion mechanisms comparable to a human's, yet fitting such mechanisms into a humanlike appearance is difficult. We developed Repliee Q1 in collaboration with KOKORO, Inc. Repliee Q1 has a humanlike appearance and 31 degrees of freedom in the upper body, so the android is able to reproduce humanlike motion. Repliee Q1expo has been upgraded to 41 degrees of freedom, allowing a greater range of humanlike motions. All of the joints are driven by air actuators. An air actuator has an inherent damping characteristic, so the android can be controlled softly without explicit compliance control.
    • Repliee Q1: Natural communication

    • Repliee Q1 : Appearance and touch

    • The skin of Repliee Q1 is made of silicone, so both its appearance and its feel are humanlike. Highly sensitive skin sensors based on piezoelectric elements are attached in 11 places: on the brow, cheeks, shoulders, upper arms, forearms, and palms. Each sensor outputs a value according to how quickly the element is deformed, so the android can respond in various ways depending on how it is touched.
    • Repliee Q1: Degree of freedom

    • Degree of freedom:
      • Eyes: 5
      • Mouth: 1
      • Neck: 3
      • Arms: 9x2
      • Torso: 4
  • Androids: An Android robot whose appearance is humanlike 'Repliee R1'

  • Appearance and motion both shape the impression a robot makes on people during communication between humans and robots. With existing robots it is hard to separate the effect of appearance from the effect of motion. With a robot of humanlike appearance, however, we can study the effect of motion alone, and then also the effect of appearance.
    • This android has 9 degrees of freedom in its head; it can move its eyes, eyelids, mouth, and neck.
    • Its body is covered with silicone, so the skin feels humanlike.
    • It has 4 highly sensitive skin sensors under the skin.
    • Repliee R1

    • This android looks humanlike in detail because it was created from a model of a real person.
    • Repliee R1

    • Detailed view of the inside.
    • Repliee R1

    • The skin is made of silicone and the interior of urethane.
  • GEMINOID

  • Appropriate teleoperation methods
    To convey "presence", we need to study methods of teleoperating an android, which is quite different from traditional teleoperation of mobile robots and industrial robots. We will study a method of controlling the android by transferring the operator's motions, measured with a motion-capture system, and we will also investigate methods of autonomously controlling eye gaze and small idle motions.
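The motion-transfer step can be pictured as a simple retargeting loop: operator joint angles from motion capture are mapped onto the android's joints and clamped to each actuator's range. This is only a minimal sketch; the joint names, limits, and interface below are hypothetical illustrations, not the actual Geminoid software.

```python
# Hypothetical motion-capture-to-android retargeting sketch.
# Per-joint actuator limits in radians (illustrative values only).
JOINT_LIMITS = {
    "neck_yaw":   (-0.8, 0.8),
    "neck_pitch": (-0.5, 0.5),
    "mouth_open": (0.0, 0.6),
}

def retarget(captured: dict) -> dict:
    """Map captured operator joint angles to android commands,
    clamping each angle to the corresponding actuator's range."""
    commands = {}
    for joint, (lo, hi) in JOINT_LIMITS.items():
        angle = captured.get(joint, 0.0)   # missing joints -> neutral pose
        commands[joint] = max(lo, min(hi, angle))
    return commands

if __name__ == "__main__":
    # One motion-capture frame: the operator turns the head too far
    # for the android, so the yaw command is clamped to the limit.
    frame = {"neck_yaw": 1.2, "neck_pitch": -0.1}
    print(retarget(frame))
```

In a real system this loop would run at the capture frame rate, with filtering to respect the air actuators' response time.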

    Natural behavior production matched with speech utterances on teleoperation
    We will investigate how to produce natural behaviors during speech utterances in order to transmit "presence" through teleoperation of the android. Beyond the information carried by the words themselves, we will study the effects on non-verbal communication by investigating not only the synchronization of speech and lip movements but also the effects of facial expressions, head movements, and even whole-body movements.
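A common minimal approach to the speech-lip synchronization mentioned above (offered here only as an illustrative sketch, not the ATR method) is to drive the mouth-opening actuator from the short-term amplitude envelope of the speech signal:

```python
import math

def mouth_opening(samples, max_open=0.6):
    """Map the RMS amplitude of one audio frame (values in [-1, 1])
    to a mouth-opening command in [0, max_open].

    The 0.5 reference RMS and the max_open limit are hypothetical
    tuning values, not taken from any real android controller.
    """
    if not samples:
        return 0.0
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    # Scale so that a loud frame (RMS 0.5) fully opens the mouth.
    return min(max_open, rms * max_open / 0.5)
```

Running this per audio frame (e.g. every 20 ms) yields a mouth trajectory that roughly follows the speech envelope; smoothing between frames would hide actuator jitter.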

    Understanding and transmitting Human Presence
    We will investigate the effect of transmitting "Sonzai-Kan" (human presence) from a remote place, such as having the android participate in a meeting in place of the person. Moreover, we will investigate what presence is through experiments. For example, we will study whether the android can convey the person's authority by comparing the android with the person himself.