Design user evaluation of initial framework design
Group interview
After the new version is completely rolled out, the main users are interviewed in a group interview. The basic idea behind the group interview is to bring all users together and have them discuss the value of every feature (or defined criterion) until they collaboratively agree on a single value. The participants are asked to evaluate both the old version and the new version. As a result, a spider diagram illustrates the difference between the old and the new version.
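A minimal sketch of how the agreed ratings might be represented and compared is given below; the criterion names and the 1-5 scale are placeholders for illustration, not the actual evaluation criteria.

 // Sketch: agreed group ratings per criterion (placeholder criteria and values)
 const ratings = {
   oldVersion: { usability: 3, performance: 2, clarity: 4 },
   newVersion: { usability: 4, performance: 4, clarity: 4 }
 };
 
 // Per-criterion difference, i.e. the gap between the two spider diagram outlines
 const delta = {};
 for (const criterion of Object.keys(ratings.oldVersion)) {
   delta[criterion] = ratings.newVersion[criterion] - ratings.oldVersion[criterion];
 }
 console.log(delta); // { usability: 1, performance: 2, clarity: 0 }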
Comparison of baseline with new version
Comparing the new version with the baseline version is helpful to show the overall improvement. This approach also requires tracking the old version and how it was used. Alternatively, an artificial evaluation scenario can be set up and the time needed to accomplish the scenario is measured in the different versions.
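A possible way to measure the scenario time is sketched below, assuming the start and end of the scenario are marked by two known page elements; the element ids, the reporting endpoint and the version label are placeholders.

 // Sketch: measure how long a user needs to complete the evaluation scenario
 let scenarioStart = null;
 
 document.querySelector('#start-task')?.addEventListener('click', () => {
   scenarioStart = performance.now();
 });
 
 document.querySelector('#finish-task')?.addEventListener('click', () => {
   if (scenarioStart === null) return;
   const durationMs = performance.now() - scenarioStart;
   // Report the duration together with the version label so old and new runs can be compared
   navigator.sendBeacon('/eval/scenario-time', JSON.stringify({ version: 'new', durationMs }));
 });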
Remove feature
Features are removed for a certain time, for a certain user group, or both. This approach allows comparing individual features within the new version without a baseline version.
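One way this could be implemented is a simple rule table that decides per feature whether it is currently removed for a given user; the feature names, group labels and date in the sketch are placeholders.

 // Sketch: remove a feature for a user group or a time window (placeholder rules)
 const removedFeatures = {
   relatedArticles: (user) => user.group === 'B',                    // removed for group B
   quickSearch:     ()     => Date.now() < Date.parse('2024-07-01')  // removed until a fixed date
 };
 
 function isFeatureVisible(name, user) {
   const removedFor = removedFeatures[name];
   return !(removedFor && removedFor(user));
 }
 
 if (isFeatureVisible('quickSearch', { group: 'A' })) {
   // render the quick search component
 }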
Hide features with option to activate them
New features are hidden until the user explicitly asks for them. The new feature components are replaced by simple boxes that the user needs to click to activate them. This approach ensures that a user who enables a component really wants to see the new feature.
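The placeholder-box pattern could look roughly like the following sketch; the element id, the renderFeature function and the logging endpoint are assumptions for illustration.

 // Sketch: replace a feature with a clickable placeholder box
 const box = document.querySelector('#related-articles-placeholder');
 
 box?.addEventListener('click', () => {
   renderFeature(box); // swap the box for the actual feature component
   // Log the explicit activation so only deliberate opt-ins are counted
   navigator.sendBeacon('/eval/feature-activated', JSON.stringify({
     feature: 'relatedArticles', time: Date.now()
   }));
 }, { once: true });
 
 function renderFeature(container) {
   container.textContent = ''; // placeholder: the real component is mounted here
 }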
Tracking tools
Most of these approaches require tracking user behavior, and different tools can be used for this. Currently the site is tracked with the [Metrica] tool. However, the new features use JavaScript that heavily modifies the page content, which limits the use of most tracking tools. Therefore we have also implemented our own JavaScript tracker, which catches all user events directly within the code.
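The sketch below illustrates the idea of such an in-code tracker; the endpoint, the payload fields and the set of tracked event types are assumptions, not the actual implementation.

 // Sketch: catch user events directly in the code and report them
 function track(eventType, detail) {
   navigator.sendBeacon('/eval/track', JSON.stringify({
     type: eventType,
     detail,
     page: location.pathname,
     time: Date.now()
   }));
 }
 
 // Listen in the capture phase so events from dynamically inserted
 // JavaScript components are covered as well
 ['click', 'submit', 'change'].forEach((type) => {
   document.addEventListener(type, (event) => {
     track(type, { target: event.target.tagName, id: event.target.id || null });
   }, true);
 });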