
Candidates for the First Platform Release are:

MapRed Tool Executor (AIT)
  Code: https://github.com/openplanets/scape/tree/master/pt-mapred
  Provided Examples: package will contain the toolspecs and scripts; needs to be updated
  Documentation: on GitHub
  TODO: final commit, update of documentation and examples, download package

Hadoop METS InputFileFormat (AIT)
  Code: https://github.com/openplanets/scape-platform
  Provided Examples: a simple example Hadoop application is included
  Documentation: readme and Javadoc
  TODO: download package

TCK for Repository Connector API (FIZ)
  Code: https://github.com/fasseg/scape-tck
  Provided Examples: unit tests
  Documentation: readme
  TODO: Javadoc, download package

Repository Loader applications (EXL/FIZ)
  Code: https://github.com/shaibe/loader-app
  Provided Examples: unit tests
  Documentation: readme and command-line help
  TODO: Javadoc, download package

eSciDoc Reference Implementation (FIZ)
  Code: https://github.com/escidoc/escidoc-core/tree/SCAPE-1.4
  Download Link: installed on the central instance but not released yet; will be provided here on the wiki
  TODO: finish implementation, documentation, and central deployment; download package

Taverna Component Plugin (UNIMAN)
  Download Link: http://build.mygrid.org.uk/taverna/internal/scape/240
  Code: https://taverna.googlecode.com/svn/unsorted/component/trunk/component
  Provided Examples: on myExperiment; will be found by the plugin
  Documentation: general plugin documentation will be provided on the Taverna web site (plugin part); the SCAPE ontology will be put into a GitHub repository
  TODO: documentation; publish ontology; customized SCAPE-specific plugin version for migration paths

Component Registration and Lookup (UNIMAN)
  Code: part of http://www.myexperiment.org/workflows.xml, integrated with the myExperiment code base
  Provided Examples: included with the Taverna component plugin
  Documentation: included with the myExperiment documentation: http://wiki.myexperiment.org/index.php/Developer:API
  TODO: add documentation for semantic search (SCAPE components)

Taverna to PPL translator (TUB)
  (no details provided yet)

Planned SCAPE Scenarios (for the Developer Workshop):

  • Automatic population (and distributed processing) of digital objects, using ONB book scans (METS) as input (AIT, FIZ, ONB).
    This use case depends on the Hadoop METS Input/Output FileFormat, the SCAPE DataConnector API implementation, pre-processing of METS files, and the Loader Application.
    Example with ONB book scans (METS + OCR files): (1) ingest the METS files into the SCAPE repository reference implementation, (2) execute a MapReduce application that processes the digital objects in parallel, e.g. creating a Lucene index over the OCR text, and (3) add this information to the objects and update them within the repository.
  • Component Creation, Registration, and Lookup integrated with PW.
    Example use case: PLATO is used for planning file format migrations.
    Relation to scenarios: there is a need for annotated/registered Taverna preservation workflows.
    Required: a list of available tools that can be semantically described and registered using the Component Catalogue.
  • Automatic Workflow Parallelization and Execution.
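
The indexing step in the first scenario can be sketched as a pair of map and reduce functions. The sketch below is plain Java with no Hadoop dependencies so it runs standalone; the class name, method signatures, and the simple inverted-index output are hypothetical illustrations, not code from the SCAPE repositories (a real job would use Hadoop's Mapper/Reducer classes and a Lucene index instead of an in-memory map):

```java
import java.util.*;

// Hypothetical sketch of step (2) of the book-scan scenario: building an
// inverted index over OCR text with map/reduce-style functions.
public class OcrIndexSketch {

    // "map" phase: emit (term, docId) pairs for one digital object's OCR text
    static List<Map.Entry<String, String>> map(String docId, String ocrText) {
        List<Map.Entry<String, String>> pairs = new ArrayList<>();
        for (String token : ocrText.toLowerCase().split("\\W+")) {
            if (!token.isEmpty()) {
                pairs.add(new AbstractMap.SimpleEntry<>(token, docId));
            }
        }
        return pairs;
    }

    // "reduce" phase: group the pairs by term into term -> sorted set of docIds
    static Map<String, SortedSet<String>> reduce(List<Map.Entry<String, String>> pairs) {
        Map<String, SortedSet<String>> index = new TreeMap<>();
        for (Map.Entry<String, String> p : pairs) {
            index.computeIfAbsent(p.getKey(), k -> new TreeSet<>()).add(p.getValue());
        }
        return index;
    }
}
```

In the scenario, the map calls would run in parallel over the objects delivered by the Hadoop METS InputFileFormat, and the reduced index would be written back to the objects via the DataConnector API.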