affordance evaluation?

February 10, 2007

I was reading an article that referred to several evaluation techniques for collaborative software in table format (see the references below). Several evaluation methods are briefly discussed, with references to the interesting studies that consider real activities in the evaluation.
It seems to me that the evaluation of any software has to be affordance-based: the affordances of tools are revealed in activities.

The article elaborates on the following aspects of evaluation:
*Single-user and multi-user software evaluation differ, because the latter requires modelling the complicated user behaviour that occurs in collaborative systems, which cannot be done in a laboratory setting
*Systems must be evaluated in the context of learning

The main methods of groupware evaluation have been based on:
*heuristics
*scenarios
*frameworks
*group task models

In single-user evaluation, the unit of analysis is the task: a set of interactions between a single user and the system, performed to accomplish a goal.
Groupware evaluation must extend this to focus on teamwork, defined by Pinelle and Gutwin as “[…] the actions that group members must carry out in order to complete a task as a group” [30, p.456].

Neale et al. [26] list the different variables, including usability for both single users and groups, the social and organizational impact, and the context of use. Their approach is to develop a multi-faceted model based on the concept of “activity awareness” (ibid., p.115).

Bodker [8], in the tradition of activity theory, uses the terms system, medium and tool to denote the different perspectives on how a computer artifact mediates the work of collective and individual subjects. These are useful concepts for distinguishing the different aspects of an artifact that have to be evaluated.
The systems perspective is the artifact perceived from an organizational viewpoint – how it contributes to the realization of organizational goals. The tool perspective emphasizes how it is experienced by the individual subject carrying out his work, while “[…] the media perspective emphasizes the human engagement with other human beings through the computer application. Thus, a medium mediates the relation between the acting subject and the community of practice surrounding the subject and the activity.” (Ibid., p.154).

From the article:
“…real, concrete facts about what works…”: Integrating Evaluation and Design Through Patterns
by Elizabeth S. Guy

References on the evaluation of collaborative tools:

[4] Baker, K., Greenberg, S. and Gutwin, C. Empirical Development of a Heuristic Evaluation Methodology for Shared Workspace Groupware. In Proceedings of the Conference on Computer Supported Cooperative Work (CSCW ’02, New Orleans, Louisiana, November 16-20, 2002), 96-105.

[8] Bodker, S. Applying Activity Theory to Video Analysis. In: Nardi, B. A., Ed. Context and Consciousness: Activity Theory and Human-Computer Interaction. MIT Press, Cambridge, MA, 1996, 147-174.

[16] Greenberg, S., Fitzpatrick, G., Gutwin, C. and Kaplan, S. Adapting the Locales Framework for Heuristic Evaluation of Groupware. In Proceedings of the Australian Conference on Human Computer Interaction (OZCHI ’99, Wagga Wagga, NSW, November 28-30, 1999), 30-36.

[20] Haynes, S., Purao, S. and Skattebo, A. Situating Evaluation in Scenarios of Use. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (CSCW 2004, Chicago, November 6-10, 2004), 92-100.

[26] Neale, D., Carroll, J. and Rosson, M. B. Evaluating Computer-Supported Cooperative Work: Models and Frameworks. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (CSCW 2004, Chicago, November 6-10, 2004), 112-121.

[30] Pinelle, D. and Gutwin, C. Groupware Walkthrough: Adding Context to Groupware Usability Evaluation. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2002, Minneapolis, Minnesota, USA, April 20-25, 2002), 455-462.

[31] Potts Steves, M., Morse, E., Gutwin, C. and Greenberg, S. A Comparison of Usage Evaluation and Inspection Methods for Assessing Groupware Usability. In Proceedings of the International ACM SIGGROUP Conference on Supporting Group Work (GROUP ’01, Boulder, Colorado, September 30 - October 3, 2001), 125-134.

[32] Ross, S., Ramage, M. and Rogers, Y. PETRA: Participatory Evaluation Through Redesign and Analysis. Interacting with Computers 7, 4 (1995), 335-360.

[35] Stiemerling, O. and Cremers, A. The Use of Cooperation Scenarios in the Design and Evaluation of a CSCW System. IEEE Transactions on Software Engineering 24, 12 (1998), 1171-1181.
