Design user evaluation of initial framework design

Evaluation Approaches

The following sections explain different approaches to evaluating the initial framework design. The group interview, in combination with the feature-hiding approach described below, is currently the preferred one.

Group interview

After the new version is completely rolled out, the main users are interviewed together in a group interview. The basic idea is to bring all users together and discuss the value of every feature (or defined criterion) until they collaboratively agree on a single value. The participants are asked to evaluate both the old version and the new version. As a result, a spider diagram illustrates the difference between the two versions.

Would be good to include here a picture of a spider diagram. -- Yolanda
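
One way such a diagram could be produced is with a client-side charting library. The following is a minimal sketch assuming Chart.js (version 2 syntax) is loaded and a canvas element exists on the page; the feature labels and scores are hypothetical placeholders, not actual evaluation results.

 // Minimal sketch of a spider (radar) diagram comparing both versions.
 // Assumes Chart.js is loaded and <canvas id="evaluation"> exists;
 // feature labels and scores are hypothetical placeholders.
 var ctx = document.getElementById('evaluation').getContext('2d');
 new Chart(ctx, {
   type: 'radar',
   data: {
     labels: ['Usability', 'Task tracking', 'Search', 'Notifications', 'Collaboration'],
     datasets: [
       { label: 'Old version', data: [3, 2, 4, 1, 2] },
       { label: 'New version', data: [4, 4, 4, 3, 5] }
     ]
   },
   options: { scale: { ticks: { beginAtZero: true, max: 5 } } }
 });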

Comparison with a baseline

Comparing the system with a baseline version is helpful to show the overall improvement. This approach also requires tracking the old version and how it was used. Alternatively, an artificial evaluation scenario can be set up, and the time needed to accomplish it in the different versions is measured.
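
The scenario time could be measured directly in the browser. A minimal sketch, assuming task start and completion are detected in client-side code; the '/track' endpoint is a hypothetical example:

 // Minimal sketch for timing an artificial evaluation scenario.
 // startScenario() is called when the task begins, endScenario() when
 // the user completes it; '/track' is a hypothetical logging endpoint.
 var scenarioStart = null;

 function startScenario() {
   scenarioStart = Date.now();
 }

 function endScenario(version) {
   var elapsedMs = Date.now() - scenarioStart;
   var xhr = new XMLHttpRequest();
   xhr.open('POST', '/track', true);
   xhr.setRequestHeader('Content-Type', 'application/json');
   xhr.send(JSON.stringify({ version: version, elapsedMs: elapsedMs }));
 }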

Comparison with an ablated version of the system

An ablated version of the system removes features for a certain time, a certain user group, or both. This approach allows evaluating selected features of the system. Since the ablated version differs from the full system only in the removed features, it is the most directly comparable baseline possible.
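
Such an ablation could be implemented with simple feature flags that disable selected features for a given user group and time window. A minimal sketch; the feature names, group names, dates, and the shape of the user object are all hypothetical:

 // Minimal sketch of ablation via feature flags: each entry disables a
 // feature for certain user groups until a given date.
 var ablation = {
   taskTimeline:    { hiddenFor: ['groupB'], until: new Date('2014-10-01') },
   expertiseBadges: { hiddenFor: ['groupA'], until: new Date('2014-10-01') }
 };

 function featureEnabled(name, user) {
   var rule = ablation[name];
   if (!rule) return true;                   // feature is not ablated
   if (new Date() > rule.until) return true; // ablation period is over
   return rule.hiddenFor.indexOf(user.group) === -1;
 }

 // Example: during the ablation period, a user in groupB does not see
 // the task timeline: featureEnabled('taskTimeline', { group: 'groupB' })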

Instrument the user's selection of a feature

A given feature is instrumented by hiding it until the user asks for it: the new feature is replaced by a simple placeholder box that users need to click to activate it. This approach ensures that a user who enables a feature really wants to use it.
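
A minimal sketch of this instrumentation: the feature element is hidden behind a placeholder box, and the activation click is recorded. The element id and the '/track' endpoint are hypothetical placeholders:

 // Minimal sketch: hide a feature behind a placeholder box and record
 // when a user explicitly activates it.
 function instrumentFeature(featureId) {
   var feature = document.getElementById(featureId);
   feature.style.display = 'none';

   var box = document.createElement('div');
   box.className = 'feature-placeholder';
   box.textContent = 'Click to enable this feature';
   feature.parentNode.insertBefore(box, feature);

   box.onclick = function () {
     box.parentNode.removeChild(box);
     feature.style.display = '';
     logEvent({ type: 'feature-activated', feature: featureId });
   };
 }

 // Hypothetical helper that forwards the event to a logging endpoint.
 function logEvent(record) {
   var xhr = new XMLHttpRequest();
   xhr.open('POST', '/track', true);
   xhr.setRequestHeader('Content-Type', 'application/json');
   xhr.send(JSON.stringify(record));
 }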

Tracking tools

Most approaches require tracking user behavior, and different tools can be used for this. Currently the site is tracked with the Yandex.Metrica tool. However, the new features use JavaScript that modifies the page content heavily, which limits what most tracking tools can capture. Therefore we have also implemented our own JavaScript tracker, which catches all user events directly within the code.
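
A minimal sketch of the idea behind such a tracker: user events are caught at the document level in the capture phase and forwarded to a logging endpoint. The '/track' endpoint and the recorded fields are hypothetical, not the actual implementation:

 // Minimal sketch of an in-page event tracker: listen at the document
 // level and forward each user event to the server.
 ['click', 'change', 'submit'].forEach(function (type) {
   document.addEventListener(type, function (event) {
     var target = event.target;
     var record = {
       type: type,
       element: target.tagName,
       id: target.id || null,
       page: window.location.pathname,
       time: new Date().toISOString()
     };
     var xhr = new XMLHttpRequest();
     xhr.open('POST', '/track', true);
     xhr.setRequestHeader('Content-Type', 'application/json');
     xhr.send(JSON.stringify(record));
   }, true); // capture phase: fires even if a page handler stops propagation
 });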

Yandex.Metrica