
Visual mediation mechanisms for collaborative design

Carmelo Ardito1, Barbara Rita Barricelli2, Maria Francesca Costabile1, Piero Mussio2, Rosa Lanzilotti1, Antonio Piccinno1, Stefano Valtolina2, Li Zhu2
1 Dipartimento di Informatica, Università di Bari, Bari, Italy
2 Dipartimento di Informatica e Comunicazione, Università di Milano, Milano, Italy
{ardito, costabile, lanzilotti, piccinno}@di.uniba.it, {barricelli, mussio, valtolina, zhu}@dico.unimi.it

Abstract
Collaborative design has emerged as a response to organizational needs determined by the evolution of interactive software, which, in most cases, must be modified in order to be adapted to specific environments and users. Over time, users and environments change: this is another reason why software has to be modified. Different stakeholders, including consultants, designers internal to the organization and, more recently, end users, have to collaborate among themselves, and possibly with the software providers, to shape software. This collaboration creates many problems; researchers are now exploring different ways of supporting it effectively. Technical means must be identified that relate and integrate users' and developers' views, in order to provide a seamless way of moving between use and design environments, thus facilitating software adaptation to users' needs and environments. This paper contributes to this research by presenting visual mediation mechanisms for collaborative design. A case study illustrating their application is discussed.

1. Introduction
Collaborative design has emerged as a response to organizational needs determined by the evolution of interactive software, which, in most cases, must be modified in order to be adapted to specific environments and users. The software is also modified at use time to comply with changes in users and environments. Product developers already cooperate with end users and implementation consultants when developing new software, but we are now witnessing a paradigm change in the way software is designed, developed, used and evolved. The report from a study commissioned by the US Department of Defense to define a research agenda on Ultra-Large-Scale Systems [1] (ULSS report) confirms this change and states that software will evolve during usage at all levels: system software, middleware, and the applications themselves. Development activities will be distributed over and initiated by various stakeholders, including end users. The report also uses the term socio-technical ecosystem, which indicates the complex mutual dependencies among the different parts of the software development infrastructure.

Researchers are now exploring different ways of effectively supporting the collaborative activities of the diverse stakeholders in such ecosystems. In current development practices, communication and collaboration take place through channels that are separated from the actual software, e.g. phone and e-mail. There is no support to bridge the software's run-time environments and its development environments, which greatly limits communication and cooperation between end users and the other stakeholders involved in collaborative design. One main problem is that stakeholders are very diverse: they are characterized by different cultures and skills, they use different languages and notations, and they adopt different documentation styles, i.e. they belong to different communities of practice (CoPs) [2]. There is the need to develop technical means to relate and integrate users' and developers' views, in order to provide a seamless way of moving between use and design environments, thus facilitating software adaptation to users' needs and environments. Such technical means include new modeling languages and architectures that support multilevel design and development, but also mediation mechanisms capable of allowing the integration of professional development environments with end-user environments across the ecosystem. This paper contributes to this innovative research by formally defining visual mediation mechanisms for collaborative design.

The paper is organized as follows. Section 2 describes the mediation process, while Section 3 defines the mediation mechanisms. Section 4 illustrates a case study and reports an example of a mediation process. Section 5 concludes the paper.

2. Mediation process

A mediation process allows two human actors (shortly, actors) to reach a common understanding, related to a specific domain, with the support of an agent, the Mediator [3]. In a collaborative design context, the mediation process consists of exchanging messages between the two actors, each playing a certain role in the collaboration. Such actors are generally members of different Communities of Practice (CoPs) [2]. In an ecosystem, members of a CoP should use dedicated interactive environments to reason on and perform their activities. Each actor interacts with the CoP environment through an interaction language reflecting the CoP language, notations and culture, the actor's role in the collaboration, and the platform the actor is using. The CoP environments support actors in the communication process by allowing them to create and transmit boundary objects and annotations attached to such objects. Boundary objects are artifacts that collaborating actors use to reach a common understanding of the activities they are performing; they provide "a means for team members to interact, react and negotiate around a concept using concrete representation to create a common language and experience for dialogue and critique" [4]. The messages exchanged in the mediation process are annotated boundary objects. Each CoP environment is equipped with message translators, called Mediator agents, which translate the exchanged messages into the addressee's interaction language.

The first two steps of a generic mediation process are illustrated in Figure 1. The first step, represented by the black arrows at the top, starts with H1, a human actor, who sends a message (Message1) to another human actor, H2. Before reaching H2, Message1 is captured and managed by the Mediator that, by exploiting the knowledge base (KB) for the current domain, translates it into the interaction language used by H2, so that H2 can understand it. The translated message is then delivered to H2. The second step of the mediation process is represented in Figure 1 by the white arrows at the bottom. H2 replies to H1's message by sending a new message (Message2) to H1. Also this message is captured and managed by the Mediator that, by exploiting the knowledge base, translates it into the interaction language used by H1; the translated message is then delivered to H1. An example of such message exchange and translation is provided in Section 4, with reference to the case study.

Figure 1. Mediation process between two human actors (H1 and H2).
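To make the two-step exchange concrete, the following minimal TypeScript sketch models the Mediator as an agent that intercepts a message and translates its payload through the domain knowledge base before delivering it. All names used here (Actor, KnowledgeBase, deliver, the language identifiers) are illustrative assumptions, not part of the system described in this paper.

```typescript
// Hypothetical identifiers for the interaction languages of two CoPs.
type InteractionLanguage = "contentProviderVisual" | "editorialStaffVisual";

interface Actor {
  id: string;
  language: InteractionLanguage; // language of the actor's CoP environment
}

interface Message {
  sender: Actor;
  receiver: Actor;
  payload: unknown; // an annotated boundary object (see Section 3)
}

// Domain knowledge base used to translate between CoP languages.
interface KnowledgeBase {
  translate(
    payload: unknown,
    from: InteractionLanguage,
    to: InteractionLanguage
  ): unknown;
}

class Mediator {
  constructor(private kb: KnowledgeBase) {}

  // One step of the mediation process: capture a message, translate its
  // payload into the receiver's interaction language, then deliver it.
  deliver(msg: Message, send: (m: Message) => void): void {
    const translated = this.kb.translate(
      msg.payload,
      msg.sender.language,
      msg.receiver.language
    );
    send({ ...msg, payload: translated });
  }
}
```

In this sketch, H2's reply follows exactly the same path with sender and receiver swapped.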

3. Mediation mechanisms

The elements involved in a mediation process constitute a Mediation Mechanism (MM), defined as MM = (Mediator, KB, MIVL), where: Mediator is the agent that supports the two human actors in reaching an agreement through a mediation process; KB is the knowledge base accumulated in the specific domain in which the actors collaborate; MIVL (Mediation Interactive Visual Language) is the interactive visual language constituted by the set of messages exchanged by the two human actors by means of the Mediator.

A MIVL message is a pair <data, metadata>. Data describe an instance of the interaction process at hand; metadata specify some characteristics of the sending and receiving actors and of the protocol used in the communication. When an actor sends a message, a mediation process starts. Such a message is constituted as follows.
Data:
a. EP, an executable program, i.e. the software artifact (or a part of it) that the two actors are collaboratively designing;
b. A, the annotation that the sender attached to EP in order to communicate with the receiver.
Metadata:
c. S, the profile of the human actor acting as sender;
d. R, the profile of the human actor acting as receiver;
e. Pl, the definition of the platform being used;
f. P, the protocol that underlies the mediation process;
g. Mep, metadata related to the program EP;
h. Md, metadata related to the annotation A.
All the messages following the first one convey the opinions of the actors participating in the collaboration process. These messages consist only of the data EP and A and the metadata Mep and Md; the other metadata are not necessary because they do not change throughout the process.

In [5], examples of message exchange among different CoPs, each working with the software environment specific for the CoP its members belong to, are provided, in which boundary objects are augmented with multimedia annotations to support negotiation among the different actors. The annotations allow each actor to explain the modifications s/he introduces in the boundary objects. Boundary objects and annotations are the elements of the common language, called mediation language, which must be explicitly designed as an element of the ecosystem.

Because of the diversity of people belonging to different CoPs, the first challenge is to provide them with easily accessible and manipulable software environments, specific for their characteristics and needs, as foreseen by the Software Shaping Workshop design methodology [5]. Since each CoP develops its own notations to share, communicate and store information, the interaction language exploited in each software environment has to be derived from the notation used by the CoP the environment is devoted to. Another important challenge is how to enable such CoPs, working with different software environments, to properly communicate and collaborate within the ecosystem: concepts expressed in the language of one community must be made understandable to the members of the other CoPs through mediation mechanisms.
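The message structure can be summarized as a typed pair. The following TypeScript sketch only illustrates that structure: the field names follow the list above (EP, A, S, R, Pl, P, Mep, Md), while the concrete types are assumptions; the optional fields mark the metadata that only the first message of a process needs to carry.

```typescript
// Profile of a human actor taking part in the mediation process.
interface ActorProfile {
  cop: string;  // community of practice the actor belongs to
  role: string; // role played in the collaboration
}

// Data part of a MIVL message: the boundary object and its annotation.
interface MessageData {
  EP: unknown; // executable program (the software artifact being co-designed)
  A: string;   // annotation attached to EP by the sender
}

// Metadata part: S, R, Pl and P are needed only in the first message,
// since they do not change during the process.
interface MessageMetadata {
  S?: ActorProfile;             // sender profile
  R?: ActorProfile;             // receiver profile
  Pl?: string;                  // platform definition
  P?: string;                   // protocol underlying the mediation process
  Mep: Record<string, string>;  // metadata about EP
  Md: Record<string, string>;   // metadata about A
}

// A MIVL message is a pair <data, metadata>.
interface MIVLMessage {
  data: MessageData;
  metadata: MessageMetadata;
}
```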

4. A case study: the virtual showroom

The case study described in this paper refers to a web portal that provides virtual shop windows for shops of various natures, which advertise their goods. The company that has developed the portal actually sells advertisement spaces (the virtual windows) to shop owners. The novelty of the system is that such owners are allowed to create and manage the content of the virtual windows, so that they can update them as they like. Such virtual windows are of different types, depending on specific templates, which are made available by the company and sold at different prices: more complex templates (e.g. with more multimedia elements) are more expensive. Figure 2 depicts a virtual window showing women's accessories, whose template is composed of a textual description at the left and four frames at the right, one large and three small below it.

Figure 2. An example of shop window.

Figure 3 shows four CoPs, which are some of the stakeholders of this system:
- Customers, who are web surfers interested in looking at the shop windows.
- Content providers, i.e. shop owners, who provide the contents to be shown in their shop window(s).
- Editorial staff, who create the templates to be sold to the content providers.
- Administrators, who design the software environment in which the editorial staff design the window templates.

Figure 3. The CoPs involved in the case at hand (the bold arrow identifies the considered mediation process).

Members of these CoPs (customers excluded) collaborate in designing the portal by interacting with software environments specific for them. None of them is required to have any computer science expertise, although administrators have to be familiar with the application domain and have some experience in Web development. The software environment used by the administrators is designed by professional software developers, but this is out of the scope of this paper; the reader interested in the whole design process can refer to [5].

In order to provide an example of a mediation process, let us consider the case in which the content provider wants a feature of her/his shop window to be modified. S/he can use the annotation tool available in the system, which allows the content provider to annotate the application s/he is managing (Figure 4), adding a specific request for the editorial staff (e.g. a request for a higher number of pictures to be shown in her/his shop window). A mediation process is thus activated. The mediation mechanism involved in this process has the following three components: the Mediator is the annotation tool through which the content provider makes her/his requests to change the contents of the shop window; the KB is the database underlying the system, containing all the information regarding the shop window (photos, address information, contract details, etc.); the MIVL is the interactive visual language composed of the messages exchanged during the entire mediation process.

Figure 4. An example of annotation containing a request of the content provider.

As an example, let us consider the mediation process between a member of the content providers and a member of the editorial staff (see Figure 3). The first message in the process is constituted by:
a. EP, the shop window application that the content provider is asking to modify;
b. A, the annotation that the content provider attached to EP in order to communicate her/his request to the editorial staff;
c. S, the profile of the content provider (the sender in this mediation process);
d. R, the profile of the editorial staff member (the receiver in this mediation process);
e. Pl, the definition of the Web portal;
f. P, the protocol that underlies the mediation process;
g. Mep, metadata related to the program EP;
h. Md, metadata related to the annotation A.
This message is represented as a bold arrow in Figure 3; the screenshot in Figure 4 shows the content provider's request expressed through an annotation. The receiver, a member of the editorial staff, gets the message translated into the language of her/his CoP and, if the request is acceptable, s/he applies the requested change; otherwise s/he activates another mediation process with a member of the administrator CoP to report the request for further modifications. In the first case, the reply message to the content provider consists of: EP, the shop window application modified according to the content provider's request; A, the annotation created by the editorial staff to explain the applied modifications; Md and Mep, respectively the metadata describing A and EP. After these two steps, the mediation process keeps going until the content provider and the editorial staff reach an agreement. An instance of the first message of this process is sketched below.
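The sketch below instantiates the <data, metadata> structure of Section 3 for this first message. Every concrete value (shop identifier, annotation text, profile fields, protocol name) is an invented example, not data from the actual portal.

```typescript
// Illustrative first message of the case-study mediation process
// (content provider -> editorial staff). All values are invented examples.
const firstMessage = {
  data: {
    // EP: the shop window application the content provider asks to modify
    EP: { shopId: "shop-042", template: "text-plus-four-frames" },
    // A: the annotation attached to EP
    A: "Please allow more pictures to be shown in my shop window.",
  },
  metadata: {
    S: { cop: "Content providers", role: "sender" },  // sender profile
    R: { cop: "Editorial staff", role: "receiver" },  // receiver profile
    Pl: "Web portal",                                 // platform definition
    P: "annotation-based mediation",                  // mediation protocol
    Mep: { kind: "shop window application" },         // metadata about EP
    Md: { medium: "textual annotation" },             // metadata about A
  },
};

// The editorial staff's reply would carry only EP (the modified application),
// A (an annotation explaining the changes), and the metadata Mep and Md.
```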

5. Conclusions
This paper has discussed and provided a novel definition of visual mediation mechanisms for collaborative design. Visual mediation mechanisms provide a means to improve communication and cooperation among the stakeholders involved in the design of software artifacts, including end users. A case study referring to a web portal that provides advertisements for shops of various natures has been presented; it gives an example of how visual mediation mechanisms are used to enable collaboration among the different stakeholders. The work presented is part of ongoing research that aims at investigating the integration of professional development environments with end-user environments, in order to cooperatively adapt software to users' needs and environments.

6. Acknowledgments
This work is supported by Italian MIUR and DIPIS.
The work of Li Zhu was supported by the Initial Training Network "Marie Curie Actions", funded by the FP7 People Programme with reference PITN-GA-2008215446, entitled "DESIRE: Creative Design for Innovation in Science and Technology". We also thank Nicola Claudio Cellamare for his collaboration in the development of the case study.

7. References
[1] Software Engineering Institute, Ultra-Large-Scale Systems: The Software Challenge of the Future, 2006; http://www.sei.cmu.edu.
[2] E. Wenger, Communities of Practice: Learning, Meaning, and Identity, Cambridge University Press, 1998.
[3] L. Boulle, Mediation: Principles, Process, Practice, LexisNexis Butterworths, 2005.
[4] P. Jennings, "Tangible Social Interfaces: Critical Theory, Boundary Objects and Interdisciplinary Design Methods", Proc. 5th Conference on Creativity & Cognition (C&C), ACM, 2005, pp. 176-186.
[5] M.F. Costabile, D. Fogli, P. Mussio and A. Piccinno, "Visual Interactive Systems for End-User Development: A Model-based Design Methodology", IEEE Trans. on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 37, no. 6, 2007, pp. 1029-1046.
