Jean-Yves ANTOINE


NLP & Handicap
My research on computer-based assistance for people with special needs concerns Natural Language Processing (NLP) as well as Human-Machine Interaction (HMI). In particular, I am working on the following applications:
  • Augmentative and Alternative Communication (AAC) 
  • Monitoring and control systems for Ambient Assisted Living (AAL) 

Augmentative and Alternative Communication (AAC)


Augmentative and Alternative Communication (AAC) aims at restoring communication abilities for people with severe speech and motor impairments (for instance, persons with cerebral palsy or severe physical disabilities). Whatever the underlying condition, oral communication is impossible for these persons, who also have serious difficulties in physically controlling their environment. In particular, they are not able to use the standard input devices of a computer. Communicating with an AAC system means communicating with the help of a table of symbols (words, letters or even icons), from which the disabled person selects one item after another. The selection is achieved by pointing at a virtual keyboard displayed on the screen of the computer.
Basically, an AAC system consists of four components. First, a physical input interface connected to a computer. This interface is adapted to the control capacities of the user; often, these amount only to a binary reply (e.g. an eye blink), so that control of the environment is restricted to a Yes/No command. Second, a virtual keyboard that lets the user successively select symbols to compose messages. For instance, key selection can be achieved by a linear scan: a cursor successively highlights each key, which can then be selected. The last two components are a text editor (to write e-mails or other documents) and a speech synthesizer, which is activated for oral communication.
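The cost of linear scanning can be made concrete with a small sketch. The code below is purely illustrative (it is not the SIBYLLE implementation, and the symbol frequencies are hypothetical): it counts how many scan steps a single-switch user needs before a key is highlighted, and shows why placing the most probable symbols first reduces the average effort.

```python
# Illustrative sketch of linear scanning on a virtual keyboard driven by
# a single binary switch: the cursor highlights keys one after the other,
# and a switch press selects the currently highlighted key.

def scan_steps(keyboard, target):
    """Number of scan steps before `target` is highlighted (1-based)."""
    return keyboard.index(target) + 1

def expected_steps(keyboard, frequencies):
    """Average number of scan steps per selection, weighted by how often
    each symbol is actually used."""
    total = sum(frequencies.values())
    return sum(scan_steps(keyboard, s) * f
               for s, f in frequencies.items()) / total

# Hypothetical usage frequencies: 'e' is by far the most common symbol.
freqs = {"e": 50, "a": 30, "z": 5}

alphabetical = ["a", "e", "z"]
by_frequency = ["e", "a", "z"]   # most probable symbols presented first

print(expected_steps(alphabetical, freqs))
print(expected_steps(by_frequency, freqs))
```

Ordering the keyboard by frequency lowers the average scan cost, which is exactly the rationale behind dynamically reordering keys by probability.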

    AAC system


The main weakness of AAC systems is the slowness of message composition (on average, 1 to 5 words per minute). Moreover, this task is extremely tiring for the users. Two complementary approaches can speed up communication:
  • optimizing selection on the virtual keyboard: the most probable symbols are dynamically displayed first on the screen;
  • minimizing the number of keystrokes: the system tries to predict the words which are likely to occur after those already typed.
Several approaches can be used to carry out this prediction, among which Markov language models that provide a list of word suggestions depending on the n (typically 1 to 4) last inserted words. I am currently investigating the use of advanced language models (LMs) for word prediction: structural LMs that integrate syntactic information, and adaptive LMs that take into account the semantic background of the current topic of communication (Latent Semantic Analysis).
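The baseline Markov approach mentioned above can be sketched in a few lines. This is a deliberately minimal bigram predictor (n = 1 context word) on a toy corpus, not SIBYLLE's actual models, which are far richer (structural and LSA-adapted LMs):

```python
from collections import Counter, defaultdict

class BigramPredictor:
    """Toy bigram word predictor: suggests the most frequent
    continuations of the last typed word, as seen in a training corpus."""

    def __init__(self):
        # next_words["to"] counts every word observed right after "to".
        self.next_words = defaultdict(Counter)

    def train(self, corpus):
        for sentence in corpus:
            words = sentence.lower().split()
            for prev, cur in zip(words, words[1:]):
                self.next_words[prev][cur] += 1

    def predict(self, last_word, k=3):
        """Return up to k suggestions, most probable first."""
        counts = self.next_words[last_word.lower()]
        return [w for w, _ in counts.most_common(k)]

corpus = [
    "i want to drink water",
    "i want to sleep",
    "i want to drink coffee",
]
predictor = BigramPredictor()
predictor.train(corpus)
print(predictor.predict("to"))  # ['drink', 'sleep']
```

Selecting a suggested word in one keystroke instead of typing it letter by letter is what yields the keystroke savings discussed above.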

Besides, my work also concerns the ergonomics of AAC systems and the study of how AAC systems are actually used by people with disabilities (see for instance the current PhD of Samuel Pouplin).

Ambient Assisted Living for dependent people (AAL)


Care for dependent people is becoming a major social and economic issue for the coming years. By 2050, 30% of the population of European countries will be at least 65 years old. Home automation for ageing and disabled people is considered a solution to facilitate or automate the activation of devices and to provide services tailored to the user's needs: this is what we call a Smart Home.
Due to the miniaturization of electronic devices and their dissemination in the environment, the smart home is no longer confined to the simple control of household appliances. This field of application is evolving more and more toward pervasive systems, where context awareness is an important challenge. Indeed, knowing who is doing what, and where, is essential to adapt a "smart" behavior, especially in the area of Ambient Assisted Living. The use of sensors distributed in the environment is the most common way to collect these contextual data. Two main methods can be distinguished, and possibly coupled: the use of cameras, which is intrusive, and the massive deployment of sensors in the environment, which implies a significant financial cost.

During my research period in the Lab-STICC lab, I am investigating with Willy ALLEGRE (PhD in progress) the interest of monitoring activities based on home automation logs (user requests only) in order to limit the use of these sensors. We first based the monitoring system on a frequency analysis in which a service is considered as a pattern repeating over time. Under the assumption that some activities are very regular, the goal was to automatically extract daily living scenarios and propose them to wheelchair-bound users with limited capabilities. However, this assumption is not always verified in practice, so that some potential scenarios are never proposed. This is why we are now investigating a semantic analysis of each home automation log entry, in order to interpret service activation at a given time. In this perspective, it becomes possible to provide the user with adapted services without any training step, while using as few sensors as possible.
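The frequency-analysis idea can be illustrated with a short sketch. The log format, time granularity and threshold below are hypothetical, chosen only to show the principle: a candidate daily-living scenario is a home-automation request that repeats in the same time slot across many days.

```python
# Illustrative frequency analysis over home automation logs (user
# requests only): a request that recurs in the same hourly slot on
# enough distinct days becomes a candidate daily-living scenario.

def candidate_scenarios(log, min_days=3):
    """log: iterable of (day, hour, request) tuples.
    Returns the (hour, request) pairs observed on at least
    `min_days` distinct days."""
    days_seen = {}
    for day, hour, request in log:
        days_seen.setdefault((hour, request), set()).add(day)
    return [slot for slot, days in days_seen.items()
            if len(days) >= min_days]

log = [
    (1, 8, "open_shutters"), (2, 8, "open_shutters"), (3, 8, "open_shutters"),
    (1, 20, "tv_on"), (2, 21, "tv_on"),   # irregular: different time slots
]
print(candidate_scenarios(log))  # [(8, 'open_shutters')]
```

The irregular "tv_on" requests illustrate the limit discussed above: an activity that drifts across time slots never reaches the threshold, so its scenario is never proposed, which motivates the semantic analysis of individual log entries.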

To take up this challenge, we propose to model user-system interactions and their semantics by means of an ontology. Rules applied to the ontology concepts and properties allow an inference engine to deduce the user's location and intention, leading to service delivery. In the context of a large-scale deployment at the Kerpape rehabilitation center, we propose to derive the ontology automatically through a model-driven process.
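The rule-based inference step can be sketched as follows. This is a drastically simplified stand-in for the actual ontology and inference engine: the device names, rooms and rules are hypothetical, and serve only to show how rules over interaction facts can derive location, intention and a service proposal.

```python
# Illustrative rule-based inference over a toy model of user-system
# interactions (a stand-in for ontology concepts/properties plus an
# inference engine). All device names and rules are hypothetical.

ROOM_OF_DEVICE = {"bedside_lamp": "bedroom", "kitchen_light": "kitchen"}

def infer(facts):
    """Apply simple forward-chaining rules to interaction facts and
    return the enlarged fact set."""
    derived = set(facts)
    for fact in facts:
        kind, *args = fact
        # Rule 1: activating a device locates the user in its room.
        if kind == "activated" and args[0] in ROOM_OF_DEVICE:
            derived.add(("located_in", ROOM_OF_DEVICE[args[0]]))
    # Rule 2: presence in the bedroom suggests a "going to bed"
    # intention, so an adapted service can be proposed.
    if ("located_in", "bedroom") in derived:
        derived.add(("propose_service", "close_shutters"))
    return derived

facts = {("activated", "bedside_lamp")}
result = infer(facts)
print(("propose_service", "close_shutters") in result)  # True
```

In the real system such rules operate on ontology concepts and properties rather than Python tuples, but the principle is the same: each home automation request is a fact from which location and intention are deduced, without extra sensors or a training phase.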

Works and projects


This research is centered on the development of the Sibylle AAC system. SIBYLLE benefits from seven years of daily use in the Kerpape rehabilitation center (Brittany, France). While we are working on ergonomic aspects with the computer science lab of Kerpape (Jean-Paul Departe), I have supervised two PhDs focused on word prediction:
  • chunk-based structural language model for word prediction: PhD of Igor Schadle (2003)
  • adaptive word prediction, using Latent Semantic Analysis (LSA) for topic adaptation: PhD of Tonio Wandmacher (Oct. 2008)

Screenshot of the Sibylle system

  • ESAC_IMC project (Fondation Motrice, 2006-2007) - AAC system for people suffering from additional language disorders.
  • VOLTAIRE project (AFM, 2008-2009) - Integration of Sibylle word prediction in the CVK / CiViKey open source virtual keyboard.
  • EMOTIROB project (2007-2010) - Emotional companion robot for children in hospitals.

Selection of publications



  • Samuel POUPLIN, Johanna ROBERTSON, Jean-Yves ANTOINE, Antoine BLANCHET, Jean-Loup KAHLOUN, Philippe VOLLE, Justine BOUTEILLE, Frédéric LOFASO, Djamel BENSMAIL (2014) Effect of a Dynamic Keyboard and Word Prediction Systems on Text Input Speed in Patients with Functional Tetraplegia. Journal of Rehabilitation Research and Development, 51(3), 467-480.
  • Willy ALLEGRE, Thomas BURGER, Jean-Yves ANTOINE, Jean-Paul DEPARTE, Pascal BERRUET (2013) A Non-Intrusive Context-Aware System for Ambient Assisted Living in Smart Home. Health & Technology, 3(2), 129-138, Springer. [HAL-00910386]
  • Samuel POUPLIN, Johanna ROBERTSON, Jean-Yves ANTOINE, Antoine BLANCHET, Jean-Loup KAHLOUN, Philippe VOLLE, Justine BOUTEILLE, Djamel BENSMAIL (2012) Integration and evaluation of a lexical prediction engine in a virtual keyboard: support of text input for people with mobility impairments. Annals of Physical and Rehabilitation Medicine, 55(S1), 131-132. Supplement volume, 27th Congress of Physical and Rehabilitation Medicine, Toulouse, France, October 2012.
  • Willy ALLEGRE, Pascal BERRUET, Thomas BURGER, Jean-Yves ANTOINE (2012) A Non-Intrusive Monitoring System for Ambient Assisted Living Service Delivery. Proc. 10th International Conference on Smart Homes and Health Telematics, ICOST'2012, Florence, Italy, pp. 148-156. [HAL-00788163]
  • Marc LE TALLEC, Jeanne VILLANEAU, Jean-Yves ANTOINE, Dominique DUHAUT (2011) Affective Interaction with a Companion Robot for Vulnerable Children: a Linguistically Based Model for Emotion Detection. Proc. LTC'2011, Language Technology Conference, Poznan, Poland, pp. 445-450. [HAL-00664618]
  • Tonio WANDMACHER, Jean-Yves ANTOINE, Jean-Paul DEPARTE, Franck POIRIER (2009) SIBYLLE, an assistive communication system adapting to the context and its user. ACM Transactions on Accessible Computing, 1(1), pp. 1-30. Extended and revised version of a paper presented at ASSETS'2007, International ACM Conference on Assistive Technologies, Phoenix, Arizona.
  • Tonio WANDMACHER, Jean-Yves ANTOINE (2007) Methods to integrate a language model with semantic information for a word prediction component. Proc. ACL SIGDAT Joint Conference EMNLP-CoNLL'2007, Prague, Czech Republic, pp. 503-513. [HAL-00280477 / ArXiv 0801.4716]
  • Igor SCHADLE (2004) AAC system using NLP techniques. Proc. 9th International Conference on Computers Helping People with Special Needs, ICCHP'2004, Paris, France. Lecture Notes in Computer Science, LNCS 3118, Springer-Verlag, pp. 1009-1015.

Jean-Yves ANTOINE - Last update: June 30th, 2014