This page and its children are old pages to be removed at some point.
Link to the TB.WP4 SharePoint site: https://portal.ait.ac.at/sites/Scape/TB/TB.WP.4/default.aspx
Evaluation areas
Level 1: Preservation system
Goal | Objective (what to obtain) | Metrics | Current result | How assessed | Who assesses
---|---|---|---|---|---
Data preparation (how easy it is to prepare data, how long it takes) | How much time? | | | Timing and comments | Technical staff who run the experiment
Effectiveness of preservation (whether the system as a whole is capable of effectively preserving the digital assets against the issues identified in the scenarios) | | Storytelling | | A subjective judgement of whether the issue is solved | Repository owners in cooperation with issue owners
Performance, scalability | | | | Running tests against sample datasets and measuring performance in aspects of interest | Automated; should be output of components/workflows and/or Taverna
System implementation effectiveness (based on the concrete hardware, software, and infrastructure setup; proactive monitoring should be done to identify hidden bottlenecks at strategic measurement points, which depend on the implementation: an HDFS implementation has different critical parameters than a NAS implementation and needs an expert to identify which parameters to monitor proactively, e.g. current disk queue length, average disk queue length, disk idle time, network packet receive errors, connections established, page reads, page writes, etc.; see the monitoring sketch below the table) | | | | Implementation of system monitoring at strategic measurement points, depending on the infrastructure implementation | System engineer for the particular system
User experience | | | | Structured questionnaire | Users of the preservation system (managers, data acquisition staff, …)
Organisational fit (whether the system as a whole integrates well into the operations of the organisation that runs the archive/repository) | | | | Structured questionnaire | Repository managers
Industrial/commercial readiness | | | | Personal judgement supported by evidence | Suitably qualified project partners
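The system-monitoring row above is the most concrete of these assessments, so a minimal sketch may help. The following script samples a few of the counters named in the table (disk I/O, network receive errors) at a fixed interval and reports per-interval deltas. It assumes the third-party psutil library; the chosen counters, interval, and plain printed output are illustrative stand-ins for the infrastructure-specific parameters and monitoring store a system engineer would actually select.

```python
"""Minimal sketch of proactive monitoring at strategic measurement points.

Assumes the cross-platform psutil library (pip install psutil). The sampled
counters are illustrative; a real deployment would monitor the parameters an
expert has identified for that particular infrastructure (HDFS, NAS, ...).
"""
import time

import psutil


def sample_counters():
    """Read cumulative system counters once."""
    disk = psutil.disk_io_counters()
    net = psutil.net_io_counters()
    return {
        "disk_read_bytes": disk.read_bytes,
        "disk_write_bytes": disk.write_bytes,
        "net_recv_errors": net.errin,  # analogous to "Net Packet Received Errors"
    }


def monitor(interval_s=60, rounds=5):
    """Report per-interval deltas so hidden bottlenecks show up as trends."""
    prev = sample_counters()
    for _ in range(rounds):
        time.sleep(interval_s)
        cur = sample_counters()
        deltas = {name: cur[name] - prev[name] for name in cur}
        print(time.strftime("%H:%M:%S"), deltas)
        prev = cur


if __name__ == "__main__":
    monitor(interval_s=5, rounds=3)
```

In practice the deltas would be shipped to whatever monitoring system the installation already runs rather than printed, but the measurement points stay the same.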
Level 2: Preservation tools
Comment: This level may need to cover not only individual tools but whole workflows.
Goal | Aspects | How assessed | Who assesses
---|---|---|---
Correctness of performance (the extent to which the tool does the job it is supposed to do, i.e. verification) | | Runs over sample datasets, with some extreme cases | Tool developers
Robustness, failure rates | | Runs over sample datasets, with some extreme cases | Tool developers
Performance of tool (distinct from the performance of the preservation system as a whole; a format converter that took 10 minutes per file would probably be of little practical use! See the timing sketch below the table.) | | Running tests against sample data objects and measuring performance | Tool developers
Ease of integration (how easily the tool can be integrated into the SCAPE platform) | | Personal judgement supported by evidence | Testbed partners in conjunction with tool developers
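For the tool-performance row, a minimal timing harness is sketched below. It runs a command-line tool over every object in a sample directory and reports per-object and mean wall-clock times. The tool command (`file --mime-type`) and the `sample-objects` directory are hypothetical placeholders; a real benchmark would substitute the migration or characterisation tool under test and the agreed sample dataset.

```python
"""Minimal sketch of timing a command-line preservation tool over sample files.

The tool command and sample directory below are hypothetical placeholders,
not the actual SCAPE tools or datasets.
"""
import pathlib
import subprocess
import time

SAMPLE_DIR = pathlib.Path("sample-objects")  # placeholder sample dataset
TOOL_CMD = ["file", "--mime-type"]           # placeholder tool under test


def time_tool(path):
    """Run the tool on one object and return (elapsed seconds, exit code)."""
    start = time.perf_counter()
    result = subprocess.run(TOOL_CMD + [str(path)], capture_output=True, text=True)
    return time.perf_counter() - start, result.returncode


def run_benchmark():
    timings = []
    for path in sorted(SAMPLE_DIR.iterdir()):
        elapsed, rc = time_tool(path)
        timings.append(elapsed)
        print(f"{path.name}: {elapsed:.3f}s (exit {rc})")
    if timings:
        print(f"mean {sum(timings) / len(timings):.3f}s over {len(timings)} objects")


if __name__ == "__main__":
    run_benchmark()
```

The per-object timings also feed the robustness row: non-zero exit codes over the sample set give a simple failure rate for free.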
Scenarios <--> Evaluation Areas Matrix