Trouver et confondre les coupables : un processus sophistiqué de correction de lexique
Abstract
The coverage of a parser depends mostly on the quality of the underlying grammar and lexicon. Developing a lexicon that is both complete and accurate is an intricate and demanding task, especially when aiming at a certain level of quality and coverage. We introduce an automatic process able to detect missing or incomplete entries in a lexicon, and to suggest correction hypotheses for these entries. The detection of dubious lexical entries is tackled by two techniques relying either on a specific statistical model or on the information provided by a part-of-speech tagger. The generation of correction hypotheses for the detected entries is achieved by studying which modifications could improve the parse rate of the sentences in which these entries occur. This process brings together various techniques based on different tools such as taggers, parsers and entropy classifiers. Applying it to the Lefff, a large-coverage morphological and syntactic French lexicon, has already allowed us to perform noticeable improvements.
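To make the correction step concrete, here is a minimal sketch of the idea described above, in Python. All names (parse_rate, candidate_fixes, the lexicon's apply/revert methods) are hypothetical illustrations, not the actual tools or APIs used with the Lefff: for a suspicious entry, each candidate modification is tried in turn, the sentences containing the entry are re-parsed, and the modification that most improves the parse rate is retained.

def parse_rate(parser, sentences):
    """Fraction of sentences that receive a full parse."""
    if not sentences:
        return 0.0
    parsed = sum(1 for s in sentences if parser.parse(s) is not None)
    return parsed / len(sentences)

def best_correction(parser, lexicon, entry, candidate_fixes, sentences):
    """Return the modification of `entry` that most improves the parse
    rate of the sentences in which the entry occurs, or None if no
    candidate helps."""
    baseline = parse_rate(parser, sentences)
    best_fix, best_gain = None, 0.0
    for fix in candidate_fixes:
        lexicon.apply(entry, fix)        # temporarily modify the entry
        gain = parse_rate(parser, sentences) - baseline
        lexicon.revert(entry, fix)       # restore the original entry
        if gain > best_gain:
            best_fix, best_gain = fix, gain
    return best_fix

In the actual process, the list of suspicious entries comes from the two detection techniques mentioned in the abstract (the statistical model or the part-of-speech tagger), and the re-parsing is done with a full syntactic parser over the corpus sentences in which the entry occurs.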
Domains
Computation and Language [cs.CL]

Origin: Files produced by the author(s)