
h1. Definition of top-10 goals and objectives

The top-10 goals and objectives will be defined by the testbed WP-leads and reviewed by the SP-leads. The following two documents were used in the process of defining the overall goals and objectives:
* An overview of scenarios (now named user stories) and how they relate to work packages in [this matrix|http://wiki.opf-labs.org/download/attachments/14352645/Scenarios_WPs_matrix.pdf?version=1&modificationDate=1340274253000]
* An overview of goals, objectives and suggested metrics defined by TB.WP4 and reviewed by SP-leads in [this document|http://wiki.opf-labs.org/download/attachments/14352645/TB4-objectives-metrics-evaluation-20120530.docx?version=1&modificationDate=1340270446000]

Goals and objectives at the component and platform level will be mapped to the SQuaRE software quality model. The following diagram is taken from D14.1. !SQUARE quality attributes.png|border=1,width=924,height=572!

A number of project-wide objectives are defined in the [Description of Work|https://portal.ait.ac.at/sites/Scape/Shared%20Documents/Contractual/Description%20of%20Work/DOW%20SCAPE%20(270137)%202013-08-30.pdf], Part B, pages 9-11:
* DoW-1: Addressing the problem of scalability in four dimensions: number of objects, size of objects, complexity of objects, and heterogeneity of collections
* DoW-2: Introducing automation and scalability in the areas of (2a) Preservation actions, (2b) Quality assurance, (2c) Technical watch, and (2d) Preservation planning
* DoW-3: Answering the question, what tools and technologies are optimal for scalable preservation actions, given a defined set of institutional policies?
* DoW-4: Providing a methodology and tools for capturing contextual information across the entire digital object lifecycle
* DoW-5: Producing a reliable, robust integrated preservation system prototype within the timeframe of the project
* DoW-6: Validating and demonstrating the scalability and reliability of this system against large collections from three different Testbeds
* DoW-7: Developing a skills-base through training
* DoW-8: Ensuring a viable future for the results of this and other successful digital preservation projects and engaging with users, vendors, and stakeholders from outside the digital preservation community
* DoW-9: Providing insight into remaining barriers to take-up, clarifying the business cases for preservation, and investigating models for the provision of scalable preservation services
* DoW-10: Increase the variety of SCAPE deployments to include data center environments and new hardware facilities
* DoW-11: Extend the functionality of SCAPE services to ensure the integrity and privacy of data that is preserved by remote and third party institutions
* DoW-12&13: Extend the SCAPE user-base by large-scale preservation scenarios from domain scientists and data-center customers
* DoW-14: Increased publication and dissemination activities in particular beyond the preservation community

The specific evaluations in the testbed evaluation methodology will be linked to these DoW objectives where appropriate.


h1. Top-10 goals and objectives

The table describes the ten goals and objectives that have been chosen as topics for evaluating experiments. The last three columns are solely for overview and should be filled in when experiments and evaluations are performed. Each experiment will select the goals and objectives that are relevant for the particular experiment.
More information about this topic can be found in deliverable [D18.1|https://portal.ait.ac.at/sites/Scape/Shared%20Documents/Deliverables/Final/SCAPE_D18.1_SB_V1.0.pdf], the first deliverable of the _Evaluation of Results_ work package (TB.WP4).

| *No* | *Goal* | *Sub-goal* | *Objective* | *Comments* | *DoW objectives* | *Relevant user stories* | *Evaluations* |
| 1 | Performance efficiency | Capacity \\
Resource utilization \\
Time behaviour | Improve DP technology to handle large preservation actions within a reasonable amount of time on a multi-node cluster | Evaluates different kinds of performance, e.g. throughput, time per MB, time per sample, memory per sample, maximum files | | | |
| 2 | Reliability | Stability indicators | Package tools with known methods and run development with good open source practices | Support available, release cycle, active community. Not directly relevant for testbeds, but components developed in SCAPE in connection with scenarios in all testbeds could be used to evaluate this | | | |
| 3 | Reliability | Runtime stability | Improve DP technology (platform and tools) to run automatically with proper error handling and fault tolerance | E.g. ability to handle invalid input, error codes | | | |
| 4 | Functional suitability | Completeness | Increase the number of file formats correctly identified within a heterogeneous corpus | Identification, Automated Watch | | | |
| 5 | Functional suitability | Correctness | Develop and improve components to do preservation actions more correctly | Valid and well-formed objects from action tools \\
QA accuracy (e.g. correct similarity between two files) \\
Automated Watch: correct information | | | |
| 6 | Organisational maturity | Dimensions of maturity: Awareness and Communication; Policies, Plans and Procedures; Tools and Automation; Skills and Expertise; Responsibility and Accountability; Goal Setting and Measurement | Improve the capabilities of organisations to monitor and control preservation operations to a point where SCAPE methods, models and tools enable a best-practice organisation to be at level 4 | This is the compound effect of policy-based planning and watch, cf. the vision described in the paper at [ASIST-AM 2011|https://portal.ait.ac.at/sites/Scape/TU/Lists/Dissemination%20log/DispForm.aspx?ID=6] | | | |
| 7 | Maintainability | Reusability | Increase the number of tools registered in the components catalogue, making them discoverable | This is more of a platform/watch evaluation - not directly linked to any specific scenarios or components | | | |
| 8 | Maintainability | Organisational fit | Ensure SCAPE technology fits organisational needs and competences | How does it fit in an organisation? How easy is it to integrate with existing infrastructure and processes? This should be implicit in all we're doing rather than an explicit testbed requirement. We should be able to evaluate this in any solutions actually implemented in real organisations within the project | | | |
| 9 | Planning and monitoring efficiency | Information gathering and decision making effort | Drastically reduce the effort required to create and maintain a preservation plan | cf. the metrics described in the paper at [ASIST-AM 2011|https://portal.ait.ac.at/sites/Scape/TU/Lists/Dissemination%20log/DispForm.aspx?ID=6] | | | |
| 10 | Commercial readiness | | Evaluate to what extent SCAPE technology is going in a direction that makes it ready for commercial exploitation | | | | |


*Example table*
| *No* | *Goal* | *Sub-goal* | *Objective* | *Comments* | *DoW objectives* | *Relevant user stories* | *Evaluations* |
| EXAMPLE-1 | Performance efficiency | Capacity | Improve DP technology to handle large preservation actions within a reasonable amount of time on a multi-node cluster | This is an example - all definitions and figures are examples | DoW-1: \\
number of objects | | [EVAL-EX-1|EVAL-EX-1] |
| EXAMPLE-2 | Functional suitability | Completeness | Improve DP technology to identify the majority of web content files with the correct MIME type | This is an example - all definitions and figures are examples | DoW-1: \\
heterogeneity of collections | | [EVAL-EX-2|EVAL-EX-2] |