

Research unit
EU Framework Programme (PCRD EU)
Project number
99.0675-1
Project title
INTERFACE: Multimodal analysis / synthesis system for human interaction to virtual and augmented environments
Project title (English)
INTERFACE: Multimodal analysis / synthesis system for human interaction to virtual and augmented environments

Entered texts

Category / Text
Keywords (English)
Facial animation; MPEG-4; dialogue manager
Other project number (English)
EU project number: IST-1999-10036
Research programme (English)
EU programme: 5th Framework Research Programme - 1.2.4 Essential technologies and infrastructures
Brief description (English)
See abstract
Partners and international organisations (English)
Coordinator: University of Genova (I)
Summary of results (Abstract) (English)
During this second year we have contributed to various activities of the InterFace project, especially to the machine-to-man part (Facial Animation Tables) and to the server modules (the Phoneme/Bookmark-to-FAP converter and the Dialogue Manager).
For feedback based on facial animation, we have worked on a facial animation engine based on MPEG-4 Facial Animation Tables (FATs). We focused our work on several aspects (a sketch of the FAT mechanism follows this list):
- extension of the MIRALab facial animation engine to provide automatic FAT generation
- definition of several facial test models
- development of tools to compile deformations designed by an animator into FATs
- study and implementation of different techniques to improve the visual quality of facial animation
- development of a Java applet for FAT-based facial animation synthesis and its integration into different demonstrations (sample on http://www.miralab.unige.ch/ ). We have also provided the partners with a common applet template so that face and body synthesis deformation can be integrated in the same application.
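A rough Java sketch of this FAT mechanism is given below, assuming a single linear interval per FAP: each table lists the vertices moved by one FAP together with their displacement directions, and a deformed vertex is the neutral vertex offset along those directions by the FAP value scaled by its FAP unit. The class and field names (FatEntry, applyFats) are illustrative assumptions, not the actual API of the MIRALab engine.

// Sketch of applying MPEG-4 Facial Animation Tables (FATs) to a neutral mesh.
// Assumes one linear interval per FAP; piecewise-linear FATs would add interval lookup.
import java.util.List;

class Vertex3f {
    float x, y, z;
    Vertex3f(float x, float y, float z) { this.x = x; this.y = y; this.z = z; }
}

/** One FAT entry: the vertices moved by a given FAP and their displacement directions. */
class FatEntry {
    int fapId;                 // FAP number (e.g. 3 = open_jaw)
    int[] vertexIndices;       // indices of the affected mesh vertices
    Vertex3f[] displacements;  // per-vertex displacement for a unit FAP amplitude
}

class FatEngine {
    /** Deform the neutral mesh: vertex += fapValue * fapUnit * table displacement. */
    static Vertex3f[] applyFats(Vertex3f[] neutralMesh, List<FatEntry> tables,
                                float[] fapValues, float fapUnit) {
        Vertex3f[] deformed = new Vertex3f[neutralMesh.length];
        for (int i = 0; i < neutralMesh.length; i++) {
            Vertex3f v = neutralMesh[i];
            deformed[i] = new Vertex3f(v.x, v.y, v.z);   // start from the neutral face
        }
        for (FatEntry fat : tables) {
            float amplitude = fapValues[fat.fapId] * fapUnit;
            for (int k = 0; k < fat.vertexIndices.length; k++) {
                Vertex3f d = fat.displacements[k];
                Vertex3f target = deformed[fat.vertexIndices[k]];
                target.x += amplitude * d.x;
                target.y += amplitude * d.y;
                target.z += amplitude * d.z;
            }
        }
        return deformed;
    }
}
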
In system integration, we have worked on the development and implementation of the Dialogue Manager (DM). The DM serves as the link between the synthesis and analysis parts of the interface platform. As a software module, it takes as input text messages annotated with emotions coming from the Input Module: text annotated with MPEG-4 TTS tags signalling emotions and face actions. The output of the DM is the response text, annotated with face actions and emotions in SAPI text format using emotional bookmarks. The format of the input and the output was agreed in coordination with all the concerned partners. The DM is based on a freely available/freeware text analysis tool, extended to provide support for emotions.
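As a minimal sketch of the output side described above, the Java fragment below interleaves SAPI bookmark tags with a response string. Only the <bookmark mark="..."/> element itself is standard SAPI XML; the annotation class and the cue names ("joy", "smile") are hypothetical and not the project's actual tag set.

// Sketch: serialise a DM response as SAPI text with emotional bookmarks.
// Annotations are assumed to be sorted by character offset.
import java.util.List;

class EmotionAnnotation {
    int charOffset;   // position in the response text where the cue applies
    String cue;       // e.g. "joy", "smile" (illustrative names)
    EmotionAnnotation(int charOffset, String cue) { this.charOffset = charOffset; this.cue = cue; }
}

class SapiSerializer {
    /** Interleave SAPI <bookmark> tags with the plain response text. */
    static String toSapiText(String response, List<EmotionAnnotation> annotations) {
        StringBuilder out = new StringBuilder();
        int cursor = 0;
        for (EmotionAnnotation a : annotations) {
            out.append(response, cursor, a.charOffset);
            out.append("<bookmark mark=\"").append(a.cue).append("\"/>");
            cursor = a.charOffset;
        }
        out.append(response.substring(cursor));
        return out.toString();
    }

    public static void main(String[] args) {
        String sapi = toSapiText("Hello, nice to see you again!",
                List.of(new EmotionAnnotation(0, "joy"),
                        new EmotionAnnotation(7, "smile")));
        System.out.println(sapi);
        // -> <bookmark mark="joy"/>Hello, <bookmark mark="smile"/>nice to see you again!
    }
}
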
We have also worked to improve the Phoneme/Bookmark-to-FAP converter. This tool translates the data coming from the Dialogue Manager and the Text-To-Speech module into MPEG-4 FAP streams.
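As a hedged illustration of what such a converter has to do, the sketch below maps timed phonemes coming from the TTS onto the MPEG-4 viseme parameter (FAP 1), sampled at the FAP frame rate. The phoneme-to-viseme table shown is a small illustrative subset, the class names are assumptions, and bookmark-to-expression handling is omitted; this is not the project's actual converter code.

// Sketch: convert TTS phoneme timing into per-frame MPEG-4 viseme values (FAP 1).
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

class TimedPhoneme {
    String phoneme;         // SAMPA-like symbol coming from the TTS
    double startSec, endSec;
    TimedPhoneme(String p, double s, double e) { phoneme = p; startSec = s; endSec = e; }
}

class PhonemeToFapConverter {
    // Illustrative subset of the phoneme -> MPEG-4 viseme mapping.
    static final Map<String, Integer> VISEME = Map.of(
            "p", 1, "b", 1, "m", 1,   // viseme 1: p, b, m
            "f", 2, "v", 2,           // viseme 2: f, v
            "A", 10);                 // viseme 10: open vowel

    /** Produce one viseme value (FAP 1) per FAP frame at the given frame rate. */
    static List<Integer> toVisemeFrames(List<TimedPhoneme> phonemes, double frameRate) {
        List<Integer> frames = new ArrayList<>();
        for (TimedPhoneme ph : phonemes) {
            int viseme = VISEME.getOrDefault(ph.phoneme, 0);   // 0 = neutral
            int n = (int) Math.round((ph.endSec - ph.startSec) * frameRate);
            for (int i = 0; i < n; i++) frames.add(viseme);
        }
        return frames;
    }
}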

Database references (English)
Swiss Database: Euro-DB of the State Secretariat for Education and Research
Hallwylstrasse 4
CH-3003 Berne, Switzerland
Tel. +41 31 322 74 82
Swiss Project-Number: 99.0675-1