
Re: AUIHomeFinderCaseStudy (public-mbui@w3.org)

From: Sebastian Feuerstack <Sebastian@Feuerstack.org>
Date: Tue, 25 Sep 2012 16:40:38 -0300
Message-ID: <506208B6.3070503@Feuerstack.org>
To: public-mbui@w3.org
Dear colleagues,

Thanks for this interesting example. I agree that one basic distinction
between AUI and CUI modeling is answering the question of “what” should
be part of the interaction vs. “how” this interaction is performed for a
specific mode. Thus, in my understanding, the “bedSelector” is actually a
“rangeSelector” that is “instantiated” as a “NumberOfBedroomsSelector”
for the AUI model. “How” a range is selected depends on the mode used
and needs to be modeled as part of the CUI.

Further on, I would like to argue against using a “composition” relation
to specify behavior (constraints) and would instead prefer to specify
just an abstract behavior class that is associated with the
AbstractInteractionUnit class. Using a composition relation to define
constraints like the min/max selectors is only one way to specify
behavior. Other possibilities are ECA rules, Petri nets, or the state
chart models that we use for our implementation (for instance, we
specify the “DistanceSelector” with such a state chart in SCXML).
Therefore, I would like to propose keeping the behavior specification
optional in the standard and thereby open to different approaches.

I see the advantage of an agreed task/AUI model as a first step towards
interoperable tools and use cases. Thus, I would like to propose that we
focus on discussing differences between the working group submissions
instead of questioning similarities before we have a first draft. The
recursive AbstractInteractionUnit design is an interesting idea, but as
the initial review of the submissions revealed, none of the submissions
uses this approach (all submissions use an "interactor container"). I
assume that for most of the submissions design tools have already been
developed, and every change to already agreed concepts would make it
even more challenging for the developers to make their tools compatible
than it already is.


On 09/24/2012 12:58 PM, Paolo Bottoni wrote:
> Thank you Davide and Carmen for stating your point of view.
> Rather than punctual comments on the file, I would like to sum up
> here what seems to me to be the point of difference, a philosophical
> one, I should say.
> Citing from the beginning of the document MBUI - Abstract User
> Interface Models, I read
> The Abstract User Interface (AUI) (corresponding to the
> Platform-Independent Model –PIM– in MDE) is an expression of the UI
> in terms of interaction units without making any reference to
> implementation both in terms of interaction modalities and in terms
> of computing platform.
> and
> An AUI could be connected upwards to a task model and/or downwards to
> one or many Concrete User Interface Models. This connection could be
> achieved via different mechanisms, such as mapping, transformation
> either at design-time or at run-time.
> In your comments I see a rather specific way of defining this upwards
> relation, where the elements in the AUI are only there to allow the
> execution of tasks. I rather see them as ways of logically organising
> the information needed to perform the task as well as the support for
> their execution, without necessarily being in any specific relation
> with them. The relation would be given by the transformation,
> mapping, etc.
> For example, a spreadsheet could support a number of tasks, while a
> complex interface such as the one in the case study basically
> supports only one. So, the difference between abstract and concrete
> UI models, to me is not a difference between WHAT and HOW, but
> between platform and modality independent vs dependent.
> In this sense, the tree I drew was to be meant as a lower limit. I
> agree that one could stop at any level above the ones I put there,
> but my intention was that one should not go further. In any case, it
> is undeniable that defining a range means defining a minimum and a
> maximum, so even at the abstract level one might require that the
> interface provides ways to receive these. Even the example where one
> writes a string "From X to Y" (and the system parses it) would in any
> case be an example of this, as X and Y are two different
> place-holders for those specific roles, whereas writing a string
> "Range centered in Z of size W" would not.
> So, if one stops the modeling at distASelector, both solutions are
> acceptable, whereas if one distinguishes minDistASelector and
> maxDistASelector then only the first is. I think that a language
> should not force one in either way, as it is the decision of the
> designer up to which point the AUI must compel the CUI. To reiterate,
> to me the AUI is a way of specifying units of interactions, based on
> some logical connection which can then be related to the task model,
> not units with specific tasks. Otherwise, why could we not have a
> direct relation between the task model and the CUI?
> But now, there is the second philosophical point, concerning the
> whole enterprise. If we are going to define a metamodel, I think this
> means that this must represent a common interchange language in which
> to express different languages. This is an important question to
> understand if we are having a metamodeling approach, or we are
> defining a standard language. In the metamodeling approach, instances
> of AbstractInteractionUnits are not specific elements, but types with
> which to model the abstract interaction. So, for example in UML, the
> class WorkingGroup (which is part of the W3C model) is an instance of
> the class Class (which is part of the UML metamodel) and MBUI is an
> instance of WorkingGroup. If this is the way we intend to proceed,
> then I would say that we have to be as liberal as possible in leaving
> languages to realise the notion of AIU, and not commit to any
> specific modeling discipline, except that represented by having this
> collection of models. If we intend to define a language, then I think
> this has to be discussed in much more detail.
> It might be that I completely misunderstood what you have in mind, in
> which case, I would be happy to revise my understanding.
> best
> paolo

Sebastian Feuerstack
Department of Computer Science
Federal University of Sao Carlos - Brazil

Check out MINT 2010 - the Multimodal INTeraction Framework
Received on Tuesday, 25 September 2012 19:41:05 UTC
