Book chapter, 2009

Megamodeling and Metamodel-Driven Engineering for Plastic User Interfaces: MEGA-UI

Abstract

Models are not new in Human-Computer Interaction (HCI). Consider the Model-Based Interface Design Environments (MB-IDEs) that emerged in the 1990s for generating User Interfaces (UIs) from more abstract descriptions. Unfortunately, the poor usability of the resulting UIs killed the approach, burying models in HCI for a long time, until new requirements sprang up, pushed by ubiquitous computing (e.g., the need for device independence). These requirements, bolstered by the large effort expended on Model-Driven Engineering (MDE) by the Software Engineering (SE) community, have brought models back to life in HCI.

This chapter draws on both the know-how of HCI and recent advances in MDE to address the challenge of engineering plastic UIs, i.e., UIs capable of adapting to their context of use (user, platform, environment) while preserving usability. Whereas most work so far has concentrated on the functional aspect of adaptation, this chapter focuses on usability. The key idea is to keep track of the UI's design rationale at runtime, so that the system can reason about its own design when the context of use changes. As design transformations link together different perspectives on the same UI (e.g., the user's tasks and the workspaces that spatially group items together), the chapter argues for embedding, at runtime, a graph that depicts the UI from different perspectives while explaining its design rationale. This matches the notion of Megamodel as promoted in MDE. The first Megamodel was used to make explicit the relations between the core concepts of MDE: System, Model, Metamodel, Mapping, and Transformation. Transposed to HCI, the Megamodel gives rise to the notion of Mega-UI, which makes it possible for the user (designer and/or end-user) to browse and/or control the system at different levels of abstraction (e.g., user's tasks, workspaces, interactors, code) and different levels of genericity (e.g., model, metamodel, meta-metamodel).

A first prototype (a rapid prototyping tool) has been implemented using general-purpose MDE tools (e.g., EMF, ATL). So far, the effort has been directed at the subset of the graph that links together different perspectives on the same UI, including its mapping onto the platform. Via an Extra-UI, the designer controls the UI's molding and distribution based on a library of self-explanatory transformations. Extra-UIs were previously called Meta-UIs, but because that prefix clashes with the Meta prefix of MDE, we prefer Extra to stress that there is no change in the level of genericity. By contrast, the Meta-UI manipulates higher levels of genericity (the Meta levels of MDE), making it possible for the user (designer and/or end-user) to observe and/or define languages for specifying UIs and Meta-UIs. Meta-UIs are the next step in our research agenda. The Mega-UI is the overall UI that encompasses UIs, Extra-UIs, and Meta-UIs.
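As an illustration only — a minimal sketch in plain Java, not the chapter's actual EMF/ATL implementation — the runtime graph described above might be represented as follows. The class names (ModelNode, Mapping), the transformation name Task2Workspace, and the rationale strings are all hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: a runtime megamodel graph whose nodes are models of
// the same UI seen from different perspectives, and whose edges are mappings
// that record the transformation and design rationale linking them.
public class MegamodelSketch {

    // One perspective on the UI (e.g., task model, workspace model,
    // interactor model, executable code).
    static final class ModelNode {
        final String perspective;
        final List<Mapping> outgoing = new ArrayList<>();
        ModelNode(String perspective) { this.perspective = perspective; }
    }

    // A mapping produced by a design transformation, annotated with the
    // usability rationale behind the design choice.
    static final class Mapping {
        final ModelNode source;
        final ModelNode target;
        final String transformation; // hypothetical name, e.g. "Task2Workspace"
        final String rationale;      // why this design choice was made
        Mapping(ModelNode source, ModelNode target,
                String transformation, String rationale) {
            this.source = source;
            this.target = target;
            this.transformation = transformation;
            this.rationale = rationale;
            source.outgoing.add(this);
        }
    }

    public static void main(String[] args) {
        ModelNode task = new ModelNode("Task");
        ModelNode workspace = new ModelNode("Workspace");
        ModelNode interactor = new ModelNode("Interactor");

        new Mapping(task, workspace, "Task2Workspace",
                "sibling tasks are grouped into one workspace to preserve task continuity");
        new Mapping(workspace, interactor, "Workspace2Interactor",
                "each workspace is rendered as a window sized for the target platform");

        // At runtime the system can walk the graph and consult the rationale
        // before deciding how to remold or redistribute the UI when the
        // context of use changes.
        for (ModelNode node = task; !node.outgoing.isEmpty(); ) {
            Mapping m = node.outgoing.get(0);
            System.out.println(m.source.perspective + " -> " + m.target.perspective
                    + " via " + m.transformation + " (" + m.rationale + ")");
            node = m.target;
        }
    }
}
```

In the actual prototype, such nodes and edges would correspond to EMF models and ATL transformations drawn from the library of self-explanatory transformations mentioned above.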
No file deposited

Dates and versions

hal-00953104, version 1 (28-02-2014)

Identifiers

  • HAL Id: hal-00953104, version 1

Cite

Jean-Sébastien Sottet, Gaëlle Calvary, Jean-Marie Favre, Joëlle Coutaz. Megamodeling and Metamodel-Driven Engineering for Plastic User Interfaces: MEGA-UI. In: Ahmed Seffah, Jean Vanderdonckt, Michel C. Desmarais (eds.), Human-Centered Software Engineering, Human-Computer Interaction Series, Springer London, 2009, pp. 173-200, ISBN 978-1-84800-906-6. ⟨hal-00953104⟩