
h2. Evaluator(s)

_Stefan Proell, [email protected]_

h2. Evaluation points

h5. Assessment of measurable points

|| Metric || Description || Metric baseline || Metric goal || _evaluation date, e.g. December 20, 2013_ || _evaluation date_ || _evaluation date_ ||
| | _Goal, objective, baseline notes_ | _10_ | _1000_ | _115_ | | |
| | | | | | | |
_Note: Metrics must be registered in the [metrics catalogue|SP:Metrics catalogue]._

h5. Assessment of non-measurable points

_For some evaluation points it makes most sense to provide a textual description/explanation._

Please include a note about any goals or objectives that were omitted, and why.

h2. Technical details

_Remember to include relevant information, links, and versions for workflows, tools, and APIs (e.g. Taverna, command line, Hadoop; links to MyExperiment; links to tools or their SCAPE names; links to distinct versions of specific components/tools in the component registry)._

h5. WebDAV

We would like to store sufficient information about an experiment (Hadoop program, configuration, etc.) so that we are able to rerun it. For this purpose, ONB provides a WebDAV share; if you have questions or need more information, please contact Sven or Reinhard at ONB.
Taverna workflows will still be stored on [|].

Link: [|]

Please use the following structure for storing experiment results:



where institutionid = onb, storyid = arc2warc, experimentid = jwat, timestamp = 1374526050
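The directory layout itself is not recorded on this page, so the following is only a hedged sketch: it assumes one nested directory per component, in the order the identifiers are listed above. The `results` base directory name is likewise an assumption for illustration.

```shell
# Hypothetical layout -- assumes the components nest in the order listed
# on this page; the actual structure on the WebDAV share may differ.
institutionid=onb
storyid=arc2warc
experimentid=jwat
timestamp=1374526050   # Unix epoch seconds of the experiment run

# "results" is a placeholder base directory, not taken from this page.
mkdir -p "results/${institutionid}/${storyid}/${experimentid}/${timestamp}"

# Experiment outputs (Hadoop program, configuration, logs, etc.) would be
# placed under this directory before uploading to the WebDAV share.
```

With this layout, everything needed to rerun the jwat experiment of the arc2warc story would sit under one timestamped directory per run.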

h2. Evaluation notes

_Could be such things as identified issues, workarounds, data preparation, if not already included above_