
Evaluator(s)
Tomasz Hofmann (PSNC)
Evaluation points
This task evaluates the efficiency of a Hadoop job that gathers information about abnormal laboratory results for specified ICD10 codes. The metric used is the number of objects per second (the number of HL7 files parsed per second). Note that Test 2 was performed on the local file system, so its result is not comparable with the results of the other tests.
Assessment of measurable points
Metric | Description | Metric baseline | Metric goal | July 21, 2014 [Test 1] | July 28, 2014 [Test 2] | July 30, 2014 [Test 3] |
---|---|---|---|---|---|---|
number of objects per second* | the number of HL7 files parsed in one second | - | - | 69.60 [obj/s] | 66.17 [obj/s] | 70.75 [obj/s] |
Note: *one HL7 file is counted as one object
Metrics must be registered in the metrics catalogue
Visualisation of results
The chart below presents the results of the analysis for Test 2. Colours indicate different ICD10 disease codes. The test was performed for patients who visited the WCPT hospital between 1-01-2013 and 31-12-2013. The height of each column equals the number of abnormal results from laboratory examinations of those patients. The notes below the chart describe the presented ICD10 codes.
Notes:
- A15.0 - Tuberculosis of lung, confirmed by sputum microscopy with or without culture
- A15.1 - Tuberculosis of lung, confirmed by culture only
- J85.1 - Abscess of lung with pneumonia
Tables 2-3 present additional statistics. Tables 2 and 3 show the execution times of the map and reduce tasks; overall statistics are presented in Table 1.
Table 1. Overall statistics
Parameter | Test 1 | Test 2 | Test 3 |
---|---|---|---|
Analyzed period | 1.06.2013-1.07.2014 | 1.01.2013-31.12.2013 | 1.01.2014-1.05.2014 |
Processing time | 196 [s] | 284 [s] | 281 [s] |
Table 2. Statistics for map task
Parameter | Test 1 | Test 2 | Test 3 |
---|---|---|---|
Processing time (for all records) | 196 [s] | 284 [s] | 281 [s] |
Number of records | 13 658 | 18 827 | 19 894 |
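The objects-per-second metric can be reproduced approximately from Table 2 by dividing the number of records by the processing time; the small differences from the reported values (69.60, 66.17, 70.75) presumably come from how the measurement scripts round or time the job:

```python
# Approximate reproduction of the objects-per-second metric from Table 2:
# throughput = number of HL7 records / processing time in seconds.
tests = {
    "Test 1": (13658, 196),
    "Test 2": (18827, 284),
    "Test 3": (19894, 281),
}
for name, (records, seconds) in tests.items():
    print(f"{name}: {records / seconds:.2f} obj/s")
```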
Table 3. Statistics for reduce task
Parameter | Test 1 | Test 2 | Test 3 |
---|---|---|---|
Processing time (for all records) | 0.008 [s] | 0.006 [s] | 0.002 [s] |
Number of records | - | - | - |
Raw log files
Technical details
Workflow
The experiment is composed of the following steps (according to the MapReduce schema):
- the map task [laboratory.sh]:
  - for each HL7 file stored on HDFS:
    - parse the document to find abnormal laboratory results, count them, and emit the following pair into the context: Key=ICD10 code, Value=count of abnormal results
- the reduce task [laboratory.sh]:
  - for each ICD10 code, accumulate the count of abnormal results
  - produce the result pair: Key=ICD10 code, Value=number of abnormal results
- statistics are gathered by downloading and parsing the log files [test.sh]
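The map/reduce steps above can be sketched as plain Python functions. This is only an illustrative sketch: it assumes each HL7 file has already been parsed into an (ICD10 code, abnormal-result count) pair, whereas the real parsing and job submission happen inside the Hadoop job driven by laboratory.sh:

```python
from collections import defaultdict

def map_phase(parsed_files):
    """Map: emit one (icd10, count) pair per HL7 file.

    parsed_files is a hypothetical stand-in for the HL7 parsing done
    by the actual Hadoop job.
    """
    for icd10, abnormal_count in parsed_files:
        yield icd10, abnormal_count

def reduce_phase(pairs):
    """Reduce: accumulate abnormal-result counts per ICD10 code."""
    totals = defaultdict(int)
    for icd10, count in pairs:
        totals[icd10] += count
    return dict(totals)

# Hypothetical sample: three HL7 files covering two ICD10 codes
sample = [("A15.0", 3), ("A15.1", 1), ("A15.0", 2)]
print(reduce_phase(map_phase(sample)))  # {'A15.0': 5, 'A15.1': 1}
```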
Scripts used to execute evaluation
Execution commands
    ./laboratory.sh -admission 20090601 -destination tomek/tmp/laboratory11 -discharge 20140710 -hospital wcpit -icd10s J85.1 -icd10s A15.0 -icd10s A15.1 -laboratory RDW -width 800 -height 600
    ./test.sh laboratory

where:
- -admission : date of patient admission to hospital
- -discharge : date of patient discharge from hospital
- -destination : folder for Hadoop job results (only one per job execution)
- -width : width of the chart in pixels
- -height : height of the chart in pixels
- -icd10s : list of ICD10 codes
Important note: please change the -destination for each job execution.
Hadoop job
https://git.man.poznan.pl/stash/projects/SCAP/repos/mr-jobs/browse/epidemic-jobs/laboratory