
Design of an Intelligent Visual Acoustic Humanoid Robot
Ming-Yuan Shieh, Ming-Hung Tsai, Yen-Shao Chen
Southern Taiwan University, Taiwan
Outline
Abstract
Introduction
Visual System
Acoustic System
Locomotion Control System
Simulation Results
Experimental Results
Conclusions
Abstract

The Intelligent Visual Acoustic humanoid Robot (IVAR) consists of:
Visual recognition subsystem
Speech recognition subsystem
Locomotion controller subsystem
The robot is mainly based on its visual system, but assisted by its acoustic and postural control systems.
Introduction

Observers like Bill Gates believe that by 2025 we could have
robots in every home.

In the famous movie “A.I.”, the protagonist is a humanoid robot full of passion who often pours out his feelings to a considerate mechanical bear.
Because of this, many approaches and projects aim to design a robot to accompany and entertain children.
Visual System

The diagram of color recognition is shown below; it consists of an image color filter, a color space transform, and color pattern matching.
[Flow: Input → image color filter (lowpass filter) → color space transform (RGB→YCbCr) → color pattern matching (fit → original color; not fit → black) → Output]
Color Space Transform

We transform the image into the YCbCr color space with the formula:
$$\begin{bmatrix} Y \\ C_b \\ C_r \end{bmatrix} = \begin{bmatrix} 16 \\ 128 \\ 128 \end{bmatrix} + \begin{bmatrix} 0.2549 & 0.5059 & 0.0980 \\ -0.1451 & -0.2902 & 0.4392 \\ 0.4392 & -0.3647 & -0.0706 \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix}$$
By using the YCbCr model, one can easily distinguish the intensity information of the image from its color data.

A jump searching algorithm is adopted to reduce the computing load.
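As a concrete illustration, here is a minimal Python sketch of the transform above together with a coarse "jump" sampling step. The function names are hypothetical, and treating the jump search as scanning only every few pixels is an assumption about how it reduces the computing load.

```python
import numpy as np

# Offset and coefficient matrix from the formula above (BT.601-style YCbCr).
OFFSET = np.array([16.0, 128.0, 128.0])
M = np.array([[ 0.2549,  0.5059,  0.0980],
              [-0.1451, -0.2902,  0.4392],
              [ 0.4392, -0.3647, -0.0706]])

def rgb_to_ycbcr(rgb):
    """Convert an H x W x 3 RGB image (values 0-255) to YCbCr."""
    return rgb.astype(np.float64) @ M.T + OFFSET

def jump_search(ycbcr, is_target, step=4):
    """Scan only every `step`-th pixel (a coarse 'jump' search) and return
    the coordinates whose Cb/Cr values satisfy the `is_target` predicate."""
    sub = ycbcr[::step, ::step]
    rows, cols = np.nonzero(is_target(sub[..., 1], sub[..., 2]))
    return rows * step, cols * step

# Example on a random test frame, using the ball thresholds given later.
frame = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
ycc = rgb_to_ycbcr(frame)
ys, xs = jump_search(ycc, lambda cb, cr: (cb > 140) & (cb < 160) &
                                         (cr > 100) & (cr < 120))
```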
6
Acoustic System
The figure displays the proposed configuration of the acoustic system.
In this arrangement, four microphones are mounted like a mask, one on each side.
Since this quadratic arrangement can receive sounds coming from any direction, it provides an omnibearing detection method to determine the exact direction of the signal.
[Figure: IVAR at the center with MIC 1, MIC 2, MIC 3, and MIC 4 mounted on its four sides, locating the direction of the sound SOURCE.]
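The slides do not spell out how the direction is computed, so the following Python sketch only illustrates one common possibility: estimating the time difference of arrival (TDOA) between the two opposite microphone pairs by cross-correlation and converting it into a bearing. The microphone layout, spacing, sampling rate, and all function names are assumptions.

```python
import numpy as np

def tdoa(sig_a, sig_b, fs):
    """Time difference of arrival of sig_b relative to sig_a, in seconds,
    estimated from the peak of their cross-correlation."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    lag = np.argmax(corr) - (len(sig_a) - 1)
    return lag / fs

def bearing(mics, fs, spacing=0.2, c=343.0):
    """Estimate the source bearing (radians), assuming MIC1/MIC3 lie on the
    x axis and MIC2/MIC4 on the y axis, `spacing` metres apart."""
    dx = tdoa(mics[0], mics[2], fs) * c / spacing   # ~cos(theta)
    dy = tdoa(mics[1], mics[3], fs) * c / spacing   # ~sin(theta)
    return np.arctan2(np.clip(dy, -1, 1), np.clip(dx, -1, 1))

# Synthetic check: the same noise burst reaching the four mics with delays.
fs = 16000
rng = np.random.default_rng(0)
s = rng.standard_normal(800)
mics = [np.roll(s, d) for d in (0, 3, 6, 3)]        # MIC1..MIC4, in samples
print(np.degrees(bearing(mics, fs)))
```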
Locomotion Control System

We adopt an ANFIS-based identifier to determine the dynamic model between the inputs and the outputs of the locomotion control system.
[Block diagram: the reference input r(t) feeds a Reference Model that produces the desired output yd(t); the ANFIS Controller generates u(t), which drives the unknown biped robot dynamics to produce yp(t); an ANFIS Identification Model, built by off-line modeling, runs in parallel, and the identification error ei(t) and control error ec(t) are fed back for tuning.]
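Since the slide only shows the block diagram, here is a rough Python sketch of the off-line identification step: excite the plant, record input/output pairs, and fit a model whose prediction error ei(t) is then available for tuning. The toy plant and the least-squares ARX model are stand-ins of my own for the unknown biped dynamics and the ANFIS identification model.

```python
import numpy as np

# Stand-in "unknown biped" dynamics: a nonlinear map we can only sample.
def plant(u, y_prev):
    return 0.6 * y_prev + 0.3 * np.tanh(u)

# 1) Excite the plant and record input/output pairs (off-line data set).
rng = np.random.default_rng(1)
u = rng.uniform(-1, 1, 500)
y = np.zeros(501)
for t in range(500):
    y[t + 1] = plant(u[t], y[t])

# 2) Fit an identification model y(t+1) ~ f(y(t), u(t)).
#    A least-squares ARX model is used here only as a simple stand-in for
#    the ANFIS identification model described on the slide.
X = np.column_stack([y[:-1], u])
theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)

# 3) Identification error e_i(t) = y_p(t) - y_hat(t) drives further tuning.
e_i = y[1:] - X @ theta
print("mean |e_i|:", np.abs(e_i).mean())
```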
Integrated ANFIS Controller

The ANFIS controller consists of 10 sub-controllers (JC1 ... JC10), one for each joint angle θ1 ... θ10.
The detected data (Vx, Vy, θtilt_x, θtilt_y, PZMP_x, PZMP_y, and the previous joint angles qn(t-1)) serve as the inputs of the proposed integrated ANFIS controller to determine adaptive control actions.
The controller needs to be trained for better system performance, as sketched below.
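To make the fan-out concrete, here is a small Python sketch of an integrated controller built from 10 sub-controllers. The placeholder sub-controllers and the assumption that each JCi sees its own previous joint angle qi(t-1) along with the shared sensed data are mine, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_sub_controller():
    """Placeholder JC_i: any map from the 7-d sensed vector to one joint
    angle would do here (e.g. the ANFIS network on the next slide)."""
    w = rng.standard_normal(7) * 0.1
    return lambda x: float(np.tanh(x @ w))

sub_controllers = [make_sub_controller() for _ in range(10)]   # JC1 ... JC10

def integrated_controller(v_x, v_y, tilt_x, tilt_y, zmp_x, zmp_y, q_prev):
    """Return the 10 joint-angle commands theta_1 ... theta_10."""
    return np.array([
        jc(np.array([v_x, v_y, q_prev[i], tilt_x, tilt_y, zmp_x, zmp_y]))
        for i, jc in enumerate(sub_controllers)
    ])

theta = integrated_controller(0.1, 0.0, 0.02, -0.01, 0.0, 0.01, np.zeros(10))
```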
ANFIS Structure

It is a seven-input and single-output neuro-fuzzy network; the inputs are Vx, Vy, q, θtilt_x, θtilt_y, PZMP_x, and PZMP_y.
The error signals of the outputs can be propagated back and used to adjust the parameters of the controller.
[Figure: ANFIS structure - each input passes through its membership functions (NB ... PB), product nodes (Π) compute the rule firing strengths, N nodes normalize them, consequent functions f1 ... fn are evaluated, and a summation node (Σ) yields the joint-angle output θn.]
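Here is a minimal Python sketch of such a network under stated assumptions: Gaussian membership functions stand in for the NB ... PB sets, the rule count is arbitrary, and only the consequent parameters are adjusted by the back-propagated error, whereas a full ANFIS would also tune the membership functions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_rules = 7, 5          # 7 inputs as on the slide; rule count is arbitrary

# Gaussian membership functions (standing in for the NB ... PB labels).
centers = rng.uniform(-1, 1, (n_rules, n_inputs))
sigmas = np.full((n_rules, n_inputs), 0.8)
# First-order Sugeno consequents: f_i = a_i . x + b_i
a = rng.standard_normal((n_rules, n_inputs)) * 0.1
b = np.zeros(n_rules)

def forward(x):
    """One ANFIS pass: memberships -> Pi (firing) -> N (normalize) -> Sigma."""
    mu = np.exp(-((x - centers) ** 2) / (2 * sigmas ** 2))  # layer 1
    w = mu.prod(axis=1)                                     # layer 2 (Pi)
    w_bar = w / (w.sum() + 1e-12)                           # layer 3 (N)
    f = a @ x + b                                           # layer 4
    return w_bar @ f, w_bar                                 # layer 5 (Sigma)

def train_step(x, target, lr=0.05):
    """Propagate the output error back to adjust the consequent parameters."""
    global a, b
    y, w_bar = forward(x)
    err = y - target
    a -= lr * err * np.outer(w_bar, x)   # d y / d a_i = w_bar_i * x
    b -= lr * err * w_bar                # d y / d b_i = w_bar_i
    return err

# Toy check: fit a single target joint angle for one sensed input vector.
x = rng.uniform(-1, 1, n_inputs)
for _ in range(200):
    e = train_step(x, target=0.3)
print("final error:", e)
```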
Simulation Results
Image Recognition

One can search for the color of the ball within 140 < Cb < 160 and 100 < Cr < 120 in this image.
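A minimal sketch of this matching step, assuming the rgb_to_ycbcr helper from the color-space slide; the function name and structure are hypothetical, but the thresholds and the "fit keeps the original color, otherwise black" rule come from the slides.

```python
import numpy as np

def match_ball(ycbcr, cb_range=(140, 160), cr_range=(100, 120)):
    """Keep pixels whose Cb/Cr fall inside the ball's color range
    (fit -> original color) and blank out the rest (not fit -> black)."""
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    fit = (cb > cb_range[0]) & (cb < cb_range[1]) & \
          (cr > cr_range[0]) & (cr < cr_range[1])
    out = np.where(fit[..., None], ycbcr, 0.0)
    return out, fit

# Usage with the rgb_to_ycbcr sketch from the color-space slide (assumed):
# segmented, mask = match_ball(rgb_to_ycbcr(frame))
```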
The Hardware
Experimental Results
1. Robot Walking
Experimental Results
2. Visual Recognition of the Object
Conclusions

This paper presents a scheme based on a visual and acoustic
servo system.

The design procedures include



distinguishing the color of objects
to perform system identification of the biped robot by an ANFIS
The successful results of simulations and experiments
demonstrate the feasibility of the proposed schemes.
Thanks for your attendance!