Write about the evaluation

Possible dimensions for evaluation:

  1. Show how easy it is for new people to participate. For example, show total training time, how often they go back to consult the documentation, and how often they delete things they have created because they made mistakes (see the sketch after this list). We could also survey the new users. New users will include: Jordan, Craig, Hilary, Gopal (anyone else?).
  2. Show how people find relevant tasks. For example, show how they use search and how they navigate to the tasks they want to work on.
  3. Show how people track their own tasks. Here we would run a survey asking how easy it is for people to track their tasks.
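
One way the "delete things they have created" signal from item 1 could be extracted is sketched below. This is a minimal Python sketch, assuming the wiki logs can be exported as records with user, action, and page fields; those field names and the example data are assumptions, not part of the framework.

  from collections import defaultdict

  # Hypothetical wiki log export: one record per logged action.
  # The field names ("user", "action", "page") and the export format
  # are assumptions, not part of the Organic Data Science Framework.
  log_entries = [
      {"user": "Jordan", "action": "create", "page": "Task:Soil sampling"},
      {"user": "Jordan", "action": "delete", "page": "Task:Soil sampling"},
      {"user": "Craig",  "action": "create", "page": "Task:Data cleanup"},
  ]

  new_users = {"Jordan", "Craig", "Hilary", "Gopal"}

  created = defaultdict(set)   # user -> pages the user created
  deleted = defaultdict(set)   # user -> pages the user deleted

  for entry in log_entries:
      if entry["user"] not in new_users:
          continue
      if entry["action"] == "create":
          created[entry["user"]].add(entry["page"])
      elif entry["action"] == "delete":
          deleted[entry["user"]].add(entry["page"])

  # Deleting a page you created yourself is used here as a rough
  # proxy for "made a mistake and removed it again".
  for user in sorted(new_users):
      own_deletions = created[user] & deleted[user]
      print(f"{user}: created {len(created[user])} pages, "
            f"deleted {len(own_deletions)} of them again")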

Need to show collaboration touch points:

  1. Is more than one person signed up to each task?
       Nr. of participants | Tasks in %
       1                   | 10
       2                   | 25
       3                   | 50
       4                   | 10
       >4                  | 5
     Data Source: JSON Serialized Tasks (see the sketch after this list for how this distribution can be computed).
  2. Is more than one person editing the metadata of tasks?
     Data Source: Tracking Data.
  3. Is more than one person editing the content text of tasks?
     Data Source: Wiki Logs.
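
The participant distribution in item 1 could be derived directly from the JSON serialized tasks. Below is a minimal Python sketch, assuming the export is a JSON array of task objects that each carry a "participants" list; the file name tasks.json and the field name are assumptions about the export format.

  import json
  from collections import Counter

  # Hypothetical export of the JSON serialized tasks; the file name
  # and the "participants" field are assumptions.
  with open("tasks.json") as f:
      tasks = json.load(f)

  # Bucket each task by its number of signed-up participants.
  def bucket(n: int) -> str:
      return str(n) if n <= 4 else ">4"

  counts = Counter(bucket(len(task.get("participants", []))) for task in tasks)
  total = sum(counts.values())

  # Print the share of tasks in each bucket, matching the table above.
  for label in ["1", "2", "3", "4", ">4"]:
      share = 100.0 * counts.get(label, 0) / total if total else 0.0
      print(f"{label:>2} participants: {share:.0f}% of tasks")

The same loop could later be extended to count distinct metadata editors (item 2) and distinct text editors (item 3) per task, once the tracking data and wiki logs are exported in a similar record form.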

