Design user evaluation of initial framework design

Evaluation Approaches

The following sections describe different approaches for evaluating the initial framework design. Currently, the preferred option is the group interview combined with the feature-hiding approach.

Group interview

After the new version is completely rolled out, the main users are interviewed together in a group interview. The basic idea is to bring all users together and discuss each feature (or defined criterion) until they collaboratively agree on a single value for it. The participants are asked to rate both the old version and the new version. As a result, a spider diagram illustrates the differences between the two versions.

It would be good to include a picture of a spider diagram here. -- Yolanda

Comparing a baseline with the system

Comparing the system with a baseline version is helpful for showing the overall improvement. This approach also requires tracking the old version and how it was used. Alternatively, an artificial evaluation scenario can be set up and the time needed to accomplish the scenario in the different versions can be measured.

Remove feature

Features are removed for a certain time period, for a certain user group, or both. This approach makes it possible to compare individual features within the new version without needing a baseline version, as sketched below.
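A minimal sketch of how such temporary feature removal could be configured, assuming a hypothetical isFeatureEnabled helper; the feature names, user groups, and dates are made up for illustration and are not the framework's actual implementation:

// Hypothetical feature-removal configuration (names, groups, and dates are
// illustrative only): each entry disables a feature for a user group
// and/or until a given date.
var removedFeatures = {
  taskTimeline: { groups: ['students'], until: new Date('2014-10-01') },
  progressBar:  { groups: ['all'],      until: new Date('2014-09-15') }
};

// Returns true if the feature should currently be shown to the given user group.
function isFeatureEnabled(featureName, userGroup) {
  var rule = removedFeatures[featureName];
  if (!rule) {
    return true; // no removal rule, feature stays visible
  }
  var groupMatches = rule.groups.indexOf('all') !== -1 ||
                     rule.groups.indexOf(userGroup) !== -1;
  var stillRemoved = new Date() < rule.until;
  return !(groupMatches && stillRemoved);
}

// Example usage: decide whether to render a component for a user.
console.log(isFeatureEnabled('taskTimeline', 'students')); // false while the removal window is active
console.log(isFeatureEnabled('taskTimeline', 'faculty'));  // true, this group is not affected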

Hide features with the option to activate them

New features are hidden until the user explicitly asks for them. Each new feature component is replaced by a simple box that the user needs to click to activate it. With this approach we can ensure that a user who enables a component really wants to see the new feature. A sketch of this mechanism is shown below.
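A minimal sketch of how a hidden feature could be replaced by a clickable placeholder box, assuming the feature markup already exists on the page; the element ids and the logging call are illustrative assumptions, not the framework's actual code:

// Replace a hidden feature with a simple placeholder box; clicking the box
// reveals the feature and records that the user explicitly asked for it.
// Element ids and the logging call are illustrative assumptions.
function installFeaturePlaceholder(featureId, label) {
  var feature = document.getElementById(featureId);
  feature.style.display = 'none'; // feature starts hidden

  var box = document.createElement('div');
  box.className = 'feature-placeholder';
  box.textContent = 'Show ' + label;

  box.addEventListener('click', function () {
    feature.style.display = '';   // reveal the feature
    box.parentNode.removeChild(box);
    // Record the activation so we know the user really wanted this feature.
    console.log('feature activated:', featureId, new Date().toISOString());
  });

  feature.parentNode.insertBefore(box, feature);
}

// Example usage for a hypothetical "task progress" component:
// installFeaturePlaceholder('task-progress', 'task progress');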

Tracking tools

Most approaches need to track user behavior, and different tools can be used for this. Currently the site is tracked with the [Metrica] tool. The new features use JavaScript that heavily modifies the page content, which limits the use of most tracking tools. Therefore we have also implemented our own JavaScript tracker, which catches all user events directly within the code (see the sketch below).
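A minimal sketch of how such an in-code event tracker could look, assuming events are collected with a document-level listener and sent to a hypothetical /track endpoint; the endpoint and payload fields are assumptions for illustration, not the framework's actual tracker:

// Minimal client-side event tracker: catches all clicks at the document level
// and sends them to a logging endpoint. The /track URL and the payload fields
// are illustrative assumptions.
function trackEvent(eventType, target) {
  var payload = {
    type: eventType,
    element: target.tagName,
    id: target.id || null,
    page: window.location.pathname,
    time: new Date().toISOString()
  };
  // Fire-and-forget request to the tracking endpoint.
  var request = new XMLHttpRequest();
  request.open('POST', '/track', true);
  request.setRequestHeader('Content-Type', 'application/json');
  request.send(JSON.stringify(payload));
}

// Capturing clicks on the whole document also covers dynamically added content,
// which is why a tracker inside the code works where page-based tools struggle.
document.addEventListener('click', function (event) {
  trackEvent('click', event.target);
}, true);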

Yandex.Metrica