Final Rank
Below is the final ranking of all eligible algorithms in the ICPR2016 MICHE-II contest. The authors of all algorithms in the list have been invited to submit an extended version of their work to the PRL Special Issue. Note that the final ranking reports the best-performing version among those submitted by each author. The ranks were obtained from the Recognition Rate (RR) and the Area Under Curve (AUC) achieved.
Detailed information behind these results follows the final ranking.
Rank | Algorithm | ALLvsALL | GS4vsGS4 | IP5vsIP5 | Final Score |
---|---|---|---|---|---|
1 | tiger_miche | 0.99 | 1.00 | 1.00 | 1.00 |
2 | Bata | 0.98 | 0.98 | 1.00 | 0.99 |
3 | karanahujax | 0.89 | 0.89 | 0.96 | 0.91 |
4 | irisom | 0.79 | 0.82 | 0.88 | 0.83 |
5 | FICO_matcher | 0.77 | 0.78 | 0.92 | 0.82 |
6 | otsedom | 0.78 | 0.80 | 0.78 | 0.79 |
7 | ccpsiarb | 0.75 | 0.72 | 0.77 | 0.75 |
Detailed data
The complete ranking list for all algorithms and variants submitted.
Rank | Algorithm | ALLvsALL | GS4vsGS4 | IP5vsIP5 | Final Score |
---|---|---|---|---|---|
1 | tiger_miche | 0.99 | 1.00 | 1.00 | 1.00 |
2 | Bata | 0.98 | 0.98 | 1.00 | 0.99 |
3 | karanahujax_Model2 | 0.89 | 0.89 | 0.96 | 0.91 |
4 | karanahujax_Model1 | 0.82 | 0.90 | 1.00 | 0.91 |
5 | irisom_10x10 | 0.79 | 0.82 | 0.88 | 0.83 |
6 | FICO_matcher_V1 | 0.77 | 0.78 | 0.92 | 0.82 |
7 | Irisom_5x5 | 0.77 | 0.75 | 0.90 | 0.81 |
8 | otsedom | 0.78 | 0.80 | 0.78 | 0.79 |
9 | ccpsiarb_17 | 0.75 | 0.72 | 0.77 | 0.75 |
10 | ccpsiarb_2 | 0.74 | 0.72 | 0.75 | 0.74 |
11 | ccpsiarb_42 | 0.73 | 0.72 | 0.72 | 0.73 |
12 | FICO_matcher_V2 | 0.61 | 0.65 | 0.75 | 0.67 |
Below is the complete collection of scores used to build the final ranking.
Algorithm | RR (ALLvsALL) | AUC (ALLvsALL) | Global Score (ALLvsALL) | RR (GS4vsGS4) | AUC (GS4vsGS4) | Global Score (GS4vsGS4) | RR (IP5vsIP5) | AUC (IP5vsIP5) | Global Score (IP5vsIP5) |
---|---|---|---|---|---|---|---|---|---|
Bata | 0.98 | 0.98 | 0.98 | 0.97 | 0.99 | 0.98 | 1.00 | 1.00 | 1.00 |
ccpsiarb_17 | 0.68 | 0.83 | 0.75 | 0.63 | 0.81 | 0.72 | 0.70 | 0.85 | 0.77 |
ccpsiarb_2 | 0.65 | 0.82 | 0.74 | 0.63 | 0.81 | 0.72 | 0.63 | 0.86 | 0.75 |
ccpsiarb_42 | 0.65 | 0.81 | 0.73 | 0.63 | 0.81 | 0.72 | 0.63 | 0.81 | 0.72 |
FICO_matcher_V1 | 0.73 | 0.80 | 0.77 | 0.67 | 0.89 | 0.78 | 0.87 | 0.98 | 0.92 |
FICO_matcher_V2 | 0.48 | 0.73 | 0.61 | 0.50 | 0.79 | 0.65 | 0.57 | 0.93 | 0.75 |
irisom_10x10 | 0.80 | 0.78 | 0.79 | 0.77 | 0.88 | 0.82 | 0.83 | 0.92 | 0.88 |
Irisom_5x5 | 0.75 | 0.79 | 0.77 | 0.63 | 0.88 | 0.75 | 0.87 | 0.93 | 0.90 |
karanahujax_Model1 | 0.88 | 0.76 | 0.82 | 0.83 | 0.97 | 0.90 | 1.00 | 1.00 | 1.00 |
karanahujax_Model2 | 0.92 | 0.86 | 0.89 | 0.83 | 0.95 | 0.89 | 0.93 | 0.98 | 0.96 |
otsedom | 0.63 | 0.93 | 0.78 | 0.67 | 0.94 | 0.80 | 0.63 | 0.92 | 0.78 |
tiger_miche | 1.00 | 0.99 | 0.99 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 |
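The values in the tables above appear consistent, up to rounding, with a simple aggregation rule: the Global Score of each dataset is the mean of RR and AUC, and the Final Score is the mean of the three Global Scores. This rule is not stated explicitly in the contest description, so the sketch below is an assumption inferred from the reported numbers, not the organizers' official scoring code.

```python
def global_score(rr, auc):
    """Per-dataset score: assumed to be the mean of Recognition Rate (RR)
    and Area Under Curve (AUC)."""
    return (rr + auc) / 2

def final_score(global_scores):
    """Overall score: assumed to be the mean of the per-dataset Global Scores."""
    return sum(global_scores) / len(global_scores)

# Example with ccpsiarb_17 (RR/AUC values taken from the table above):
g_all = global_score(0.68, 0.83)  # ALLvsALL: exactly 0.755, reported as 0.75
g_gs4 = global_score(0.63, 0.81)  # GS4vsGS4: exactly 0.72
g_ip5 = global_score(0.70, 0.85)  # IP5vsIP5: exactly 0.775
final = final_score([g_all, g_gs4, g_ip5])  # exactly 0.75, matching the ranking table
```

A few reported values (e.g. tiger_miche's ALLvsALL Global Score of 0.99 from RR 1.00 and AUC 0.99) suggest the published tables round or truncate from higher-precision RR/AUC values than the two decimals shown.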
Here is the link to download the list of files composing the Probe and Gallery sets used during the experimental sessions.