
Contents of the Release

  • Installation Requirements (e.g. Hadoop version)
  • Installation Instructions
  • A Virtual Machine Image (VirtualBox)
  • A downloadable binary release package

Candidates for First Platform Release are:

MapRed Tool Executor (AIT)
  • Code: github here; package will contain toolspecs and scripts, needs to be updated
  • Documentation: readme and javadoc
  • TODO: downloadable package

Hadoop METS InputFileFormat (AIT)
  • Code: github here
  • Provided Examples: simple example Hadoop application included
  • Documentation: readme and javadoc
  • TODO: downloadable package

TCK for Repository Connector API (FIZ)
  • Code: github here
  • Provided Examples: unit tests
  • Documentation: readme
  • TODO: javadoc, downloadable package

Repository Loader applications (EXL/FIZ)
  • Code: github here
  • Provided Examples: unit tests
  • Documentation: readme and cmd-line help
  • TODO: javadoc, downloadable package

eSciDoc Reference Implementation (FIZ)
  • Code: github here
  • Download: installed on the central instance but not released yet
  • Documentation: will be provided here on the wiki
  • TODO: finish implementation, documentation, and central deployment; downloadable package

Taverna Component Plugin (UNIMAN)
  • Code: here
  • Provided Examples: here on myExperiment; will be found by the plugin
  • Documentation: general plugin documentation will be provided on the Taverna web site (plugin part); the SCAPE ontology will be put into a github repository
  • TODO: documentation, publish ontology, customized SCAPE-specific plugin version for migration paths

Component Registration and Lookup (UNIMAN)
  • Code: part of here, integrated with the myExperiment code base
  • Provided Examples: included with the Taverna component plugin
  • Documentation: included with the myExperiment documentation (here); component search docs (pending merge with the rest of the myExperiment docs): http://wiki.opf-labs.org/display/SP/Components+on+myExperiment
  • TODO: add documentation for semantic search (SCAPE components)

PPL (Program for parallel Preservation Load) (TUB)
  • Code: github here
  • Provided Examples: workflows in Resources
  • Documentation: screencast; readme (on github)
  • TODO: implementation, documentation
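The MapRed Tool Executor above wraps command-line preservation tools so they can be invoked per input record inside a MapReduce job. As a rough illustration of the toolspec idea, the sketch below expands a command template with record-specific values before the executor would shell out to the tool. The class name, the `%{...}` placeholder syntax, and the method signature are illustrative assumptions, not the actual SCAPE toolspec API.

```java
import java.util.Map;

/**
 * Minimal sketch (NOT the real MapRed Tool Executor API) of expanding a
 * toolspec-style command template for one input record. All names and the
 * placeholder syntax are assumptions made for illustration only.
 */
public class ToolspecCommandBuilder {

    /** Replaces %{key} placeholders in the template with record values. */
    public static String buildCommand(String template, Map<String, String> params) {
        String cmd = template;
        for (Map.Entry<String, String> e : params.entrySet()) {
            cmd = cmd.replace("%{" + e.getKey() + "}", e.getValue());
        }
        return cmd;
    }

    public static void main(String[] args) {
        // One record of a hypothetical identification toolspec:
        String template = "file --mime-type %{input}";
        String cmd = buildCommand(template, Map.of("input", "/data/book1/page_0001.tif"));
        System.out.println(cmd); // file --mime-type /data/book1/page_0001.tif
    }
}
```

In a real job, a mapper would call something like this once per record and hand the resulting command line to the executor.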

Planned SCAPE Scenarios (for Dev. Workshop):

  • Automatic population (and distributed processing) of digital objects using ONB book scans (METS) as input (AIT, FIZ, ONB).
    This use case depends on the Hadoop METS Input/Output FileFormat, the SCAPE DataConnector API implementation, pre-processing of METS files, and the Loader Application.
    Example with ONB book scans (METS + OCR files): (1) ingest the METS files into the SCAPE repository reference implementation, (2) execute a MapReduce application that processes the digital objects in parallel, e.g. creating a Lucene index for the OCR text, and (3) add this information to the objects and update them within the repository.
  • Component Creation, Registration, and Lookup integrated with PW.
    Example use case: PLATO being used for planning file format migrations.
    Relation to scenarios: annotated/registered Taverna preservation workflows are needed.
    Required: a list of available tools that can be semantically described and registered using the Component Catalogue.
  • Automatic Workflow Parallelization and Execution.
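Step (2) of the first scenario can be pictured as a classic MapReduce inverted-index pass over the OCR text of each digital object. The plain-Java sketch below simulates the map and reduce phases in memory purely to show the data flow; the real implementation would run as a Hadoop job over METS records and write a Lucene index, and every name here is an illustrative assumption rather than SCAPE code.

```java
import java.util.*;
import java.util.stream.*;

/**
 * In-memory sketch of the OCR-indexing MapReduce pass from scenario 1.
 * Real SCAPE code would use Hadoop and Lucene; this only illustrates the
 * (term, objectId) map output and the grouped reduce output.
 */
public class OcrIndexSketch {

    /** Map phase: emit one (term, objectId) pair per token of the OCR text. */
    static Stream<Map.Entry<String, String>> map(String objectId, String ocrText) {
        return Arrays.stream(ocrText.toLowerCase().split("\\W+"))
                .filter(t -> !t.isEmpty())
                .map(t -> Map.entry(t, objectId));
    }

    /** Reduce phase: group the pairs into term -> sorted set of object ids. */
    static Map<String, Set<String>> reduce(Stream<Map.Entry<String, String>> pairs) {
        return pairs.collect(Collectors.groupingBy(Map.Entry::getKey,
                Collectors.mapping(Map.Entry::getValue,
                        Collectors.toCollection(TreeSet::new))));
    }

    public static void main(String[] args) {
        // Two hypothetical pages of one digitised book:
        Map<String, String> objects = Map.of(
                "book1/page1", "Vienna city archive",
                "book1/page2", "archive of the city");
        Map<String, Set<String>> index = reduce(
                objects.entrySet().stream()
                        .flatMap(e -> map(e.getKey(), e.getValue())));
        System.out.println(index.get("archive")); // both pages contain "archive"
    }
}
```

The reduce output is exactly the information that step (3) would attach to the objects when updating them in the repository.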