Layered Architecture
Abstraction Levels in Interactive Systems
As often happens, the solution to a complex problem, such as the design of an interactive system, can be based on a small set of clear basic concepts. To address such issues it is important to consider the various viewpoints that can be taken on an interactive system. Such viewpoints differ in their abstraction level (to what extent details are considered) and in their focus (whether the task or the user interface is considered). The model-based community has long discussed these possible viewpoints (see for example [S96]). The abstraction levels are:
- Task and object model: at this level, the logical activities that need to be performed in order to reach the users’ goals are considered. They are often represented hierarchically, along with the temporal relations among them and their associated attributes. The objects that must be manipulated in order to perform the tasks can be identified as well.
- Abstract user interface: here the focus shifts to the user interface supporting task performance. Only the logical structure is considered, in a modality-independent manner, thereby avoiding low-level details. Interaction objects are described in terms of their semantics through interactors [PL94]. Thus it is possible to indicate, for example, that at a given point a selection object is needed, without specifying whether the selection is performed graphically, vocally, through a gesture, or in some other modality.
- Concrete user interface: at this point each abstract interactor is replaced with a concrete interaction object that depends on the type of platform and media available, and that has a number of attributes defining more concretely how it should be perceived by the user.
- Final user interface: at this level the concrete interface is translated into an interface defined in a specific software environment (e.g. XHTML, Java, …).
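To make the four levels more tangible, the sketch below models them as plain data structures. This is a hypothetical illustration only: the class and field names (`Task`, `AbstractInteractor`, `ConcreteInteractor`, `to_final_ui`) are ours and do not come from any particular model-based tool.

```python
from dataclasses import dataclass, field

# Task and object model: a hierarchy of logical activities,
# with temporal relations among them (e.g. enabling, choice).
@dataclass
class Task:
    name: str
    temporal_relation: str = ""              # relation to the next sibling task
    subtasks: list["Task"] = field(default_factory=list)

# Abstract user interface: a modality-independent interactor,
# described only by its semantics (e.g. "single selection").
@dataclass
class AbstractInteractor:
    semantics: str                           # what the object is for
    supported_task: str                      # the task it supports

# Concrete user interface: a platform-specific interaction object
# with attributes defining how it is perceived by the user.
@dataclass
class ConcreteInteractor:
    widget: str                              # e.g. "graphical list" on desktop
    attributes: dict

# Final user interface: the concrete object rendered in a specific
# software environment; here, an XHTML-style select element.
def to_final_ui(concrete: ConcreteInteractor) -> str:
    options = "".join(f"<option>{item}</option>"
                      for item in concrete.attributes.get("items", []))
    return f"<select>{options}</select>"
```

A concrete interactor holding a list of days, for instance, would be rendered by `to_final_ui` as a `<select>` element containing one `<option>` per day.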
To better understand these abstraction levels, consider an example task: making a hotel reservation. This task can be decomposed into selecting arrival and departure dates and other subtasks. At the abstract user interface level we need to identify the interaction objects needed to support such tasks; for example, easily specifying arrival and departure days calls for selection interaction objects. When we move on to the concrete user interface, we need to consider the specific interaction objects supported by the platform. In a desktop interface, selection can be supported by a graphical list object, a choice more effective than others because the list supports a single selection from a potentially long list of elements. The final user interface is the result of these choices and others involving attributes such as the type and size of the font, the colours, and decoration images that, for example, can show the list in the form of a calendar.

Many transformations are possible among these four levels for each interaction platform considered: from higher-level descriptions to more concrete ones, or vice versa, or between the same level of abstraction for different types of platforms, or any combination of these. Consequently, a wide variety of situations can be addressed. More generally, the possibility of linking user interface elements to more semantic aspects opens up the possibility of intelligent tools that can help in design, evaluation and run-time execution.
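The hotel-reservation walk-through amounts to one such transformation: concretising an abstract selection interactor differently depending on the target platform. The mapping below is a hypothetical sketch; the table entries and the `concretise` function are illustrative assumptions, not the behaviour of an actual model-based tool.

```python
# Hypothetical mapping from (abstract semantics, platform) to a
# concrete interaction object. A real tool would apply richer
# design criteria when choosing among candidate objects.
CONCRETISATION = {
    ("selection", "desktop"): "graphical list",
    ("selection", "vocal"):   "spoken menu",
    ("selection", "mobile"):  "drop-down list",
}

def concretise(semantics: str, platform: str) -> str:
    """Replace an abstract interactor with a platform-specific one."""
    return CONCRETISATION[(semantics, platform)]

# Selecting the arrival date of the reservation on a desktop platform:
widget = concretise("selection", "desktop")   # a graphical list object
```

The same abstract "selection" interactor thus yields a graphical list on the desktop but a spoken menu on a vocal platform, which is exactly the platform-dependent step the concrete-user-interface level introduces.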
Figure 1: Models and Related Tools in the Model-based Development Process (taken from [P05])
Figure 1 shows the models considered and how tools can exploit them in the development process. Some tools help in developing the models, others analyse their content, and still others use them to generate the user interface; for this purpose the latter support a set of design criteria and can implement transformations across the abstraction levels. Further possibilities are offered by tools that take a user interface implementation and reconstruct the corresponding models, which can then be modified or analysed with the other tools.
References
[S96] Szekely, P.: “Retrospective and Challenges for Model-Based Interface Development”. Proceedings of the 2nd International Workshop on Computer-Aided Design of User Interfaces, Namur, Namur University Press, 1996.
[PL94] Paternò, F., Leonardi, A.: “A Semantics-based Approach to the Design and Implementation of Interaction Objects”. Computer Graphics Forum, Blackwell, Vol. 13, No. 3, pp. 195–204, 1994.
[P05] Paternò, F.: “Model-based Tools for Pervasive Usability”. Interacting with Computers, Elsevier, Vol. 17, Issue 3, pp. 291–315, May 2005.