Design user evaluation of initial framework design
Evaluation Approaches
The following sections describe different approaches to evaluating the initial framework design.
The currently preferred approaches are the group interview combined with instrumenting the user's selection of features. The other approaches require testing users with artificial scenarios whose results can be compared between the system and the baseline, which demands considerable effort from the users as well as from the evaluators.
- How long before the instrumentation of features is implemented? -- Yolanda
Group interview
After the new version is completely rolled out, the main users take part in a group interview. The basic idea is to bring all users together and discuss the value of every feature (or defined criterion) until they collaboratively agree on a single value. The participants are asked to evaluate both the old version and the new version. As a result, a spider diagram illustrates the difference between the two versions.
- Would be good to include here a picture of a spider diagram. -- Yolanda
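To illustrate, the agreed scores could be rendered as a radar (spider) chart, for example with the Chart.js library; a minimal sketch, where the library choice, the criteria, the scores, and the canvas id 'evaluation' are all placeholder assumptions:

    // Minimal sketch of a spider diagram comparing both versions (Chart.js).
    // Criteria names and scores are placeholders, not real evaluation data.
    const criteria = ['Usability', 'Speed', 'Reliability', 'Feature coverage'];

    new Chart(document.getElementById('evaluation'), {
      type: 'radar',
      data: {
        labels: criteria,
        datasets: [
          { label: 'Old version', data: [3, 4, 2, 3] },  // agreed group scores
          { label: 'New version', data: [4, 3, 4, 5] }
        ]
      },
      options: { scales: { r: { min: 0, max: 5 } } }  // fixed scale keeps versions comparable
    });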
Comparison with a baseline
Comparing the system with a baseline version is helpful to show the overall improvement. This approach also requires tracking the old version and how it was used. Alternatively, an artificial evaluation scenario can be set up, and the time needed to accomplish the scenario is measured in each version.
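For such a scenario, completion time could be measured directly in the browser; a minimal sketch, where the scenario boundaries and the /track reporting endpoint are assumptions:

    // Minimal sketch: measure how long a user needs to complete the
    // evaluation scenario; the /track endpoint is an assumption.
    let scenarioStart = 0;

    function startScenario() {
      scenarioStart = performance.now();  // high-resolution start timestamp
    }

    function finishScenario(version) {
      const durationMs = performance.now() - scenarioStart;
      // Report the duration together with the version label ('old' or 'new')
      fetch('/track', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ event: 'scenario-completed', version, durationMs })
      });
    }

The same scenario is run against the old and the new version, and the reported durations are compared.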
Comparison with an ablated version of the system
An ablated version of the system removes features for a certain time, for a certain user group, or both. This approach allows selected features of the system to be evaluated in isolation. The ablated version serves as a baseline system, and it is the closest possible baseline for comparison.
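Such an ablation could be realized with a simple feature flag that disables a feature for one user group; a minimal sketch, where the group assignment and the feature name are assumptions:

    // Minimal sketch of ablation by user group: users with an even id form
    // the ablated group and do not get the feature. Names are placeholders.
    const ablatedFeatures = ['inline-preview'];  // features removed in the ablated version

    function isAblated(userId, feature) {
      const inAblatedGroup = userId % 2 === 0;   // deterministic group assignment
      return inAblatedGroup && ablatedFeatures.includes(feature);
    }

    console.log(isAblated(42, 'inline-preview'));  // true  -> feature hidden
    console.log(isAblated(43, 'inline-preview'));  // false -> feature shown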
Instrument the user's selection of a feature
A given feature is instrumented by hiding it until the user asks for it. The new feature is replaced by a simple box that the user must click to activate it. This approach ensures that a user who enables a feature really wants to use it.
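A minimal sketch of such an instrumented feature, assuming the element ids and the /track endpoint; the feature stays hidden until the user clicks the placeholder box:

    // The feature is hidden behind a simple placeholder box and only
    // activated (and logged) once the user explicitly clicks the box.
    const box = document.getElementById('feature-box');        // placeholder box (assumed id)
    const feature = document.getElementById('feature-panel');  // hidden feature (assumed id)

    box.addEventListener('click', () => {
      box.hidden = true;
      feature.hidden = false;  // the user asked for the feature, so show it
      // Record that this user deliberately enabled the feature
      fetch('/track', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ event: 'feature-activated', feature: 'feature-panel' })
      });
    });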
Tracking tools
Most approaches need to track user behavior, and different tools can be used for this. Currently the site is tracked with the [Metrica] tool. The new features are implemented in JavaScript, which modifies the page content heavily; this limits the use of most tracking tools. Therefore we have also implemented our own JavaScript tracker, which catches all user events directly within the code.
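A minimal sketch of such a tracker, assuming a /track endpoint on the server; listening at the document level means that content inserted later by the JavaScript features is tracked as well:

    // Catch user events globally in the capture phase, so dynamically
    // inserted elements are covered too; the /track endpoint is an assumption.
    function report(event) {
      const payload = JSON.stringify({
        type: event.type,
        target: event.target.id || event.target.tagName,  // coarse target description
        timestamp: Date.now()
      });
      navigator.sendBeacon('/track', payload);  // survives page unloads
    }

    ['click', 'change', 'submit'].forEach((type) =>
      document.addEventListener(type, report, true)  // true = capture phase
    );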