Results for IJCNN 2011 competition (1st stage)

Please find below all results submitted during the first stage of the IJCNN 2011 competition "The German Traffic Sign Recognition Benchmark" between January 19 and January 21, 2011. The top contestants won a free conference registration and were invited to take part in the final live competition session held at IJCNN 2011 in San Jose, CA, USA. The results of that final session are published in the official GTSRB result table.


Team | Method | Total | Subset
[32] AlcalaGRAM2 | Gaussian SVM param selection | 75.67% | 75.67%
[36] AlcalaGRAM2 | Contrast Gaussian param modif | 85.87% | 85.87%
[30] Alcalá-GRAM | Contrast-Gaussian SVM | 84.96% | 84.96%
[152] bilbo | ANN~80/600 (sect. hue+moments) | 38.06% | 38.06%
[165] bilbo | ANN~88/1200(sect. hue+moments) | 28.63% | 28.63%
[169] bilbo | ANN~88/700s(sect. hue+moments) | 42.91% | 42.91%
[155] Brainsignals | simple alignment plus htsa | 94.24% | 94.24%
[48] Brainsignals | met1 | 94.61% | 94.61%
[174] CAOR | Random trees HOG3 | 92.13% | 92.13%
[102] CAOR | KdTree + HOG 9720 | 93.21% | 93.21%
[117] CAOR | KdTree Hue | 17.53% | 17.53%
[109] CAOR | KdTree HOG Set1 | 70.65% | 70.65%
[116] CAOR | KdTree HOG 2 2 | 68.80% | 68.80%
[100] CAOR | KdTree + HOG 2520 | 92.90% | 92.90%
[115] CAOR | KdTree HOG Set3 2 | 70.86% | 70.86%
[119] CAOR | KdTree + HOG 3456 kNN 5 | 88.27% | 88.27%
[103] CAOR | KdTree + HOG 9720 kNN 5 | 93.40% | 93.40%
[118] CAOR | KdTree HOG9720 kNN5 iter 1000 | 93.72% | 93.72%
[23] Diego Peteiro-Barral | Optimal 1-layer ANN (4) | 95.57% | 95.57%
[22] Diego Peteiro-Barral | Optimal 1-layer ANN (3) | 95.47% | 95.47%
[15] Diego Peteiro-Barral | Optimal 1-layer ANN | 95.47% | 95.47%
[19] Diego Peteiro-Barral | Optimal 1-layer ANN (2) | 95.57% | 95.57%
[24] Diego Peteiro-Barral | Optimal 1-layer ANN (5) | 95.47% | 95.47%
[181] FiVENiNEs | Multiclass SVM with CV preproc | 74.83% | 74.83%
[170] IDSIA | CNN(IMG)_MLP(HOG3) | 98.79% | 98.79%
[192] IDSIA | CNN 7HL newnorm | 0.00% | 0.02%
[196] IDSIA | cnn_cnn_hog3 | 98.98% | 98.98%
[166] IDSIA | CNN 6HL | 97.56% | 97.56%
[177] IDSIA | CNN(IMG)_MLP(HOG3)_MLP(HAAR) | 98.72% | 98.72%
[191] IDSIA | CNN 7HL | 98.10% | 98.10%
[195] IDSIA | cnn_cnn_hog3_haar | 98.97% | 98.97%
[193] IDSIA | CNN 7HL norm | 98.46% | 98.46%
[197] IDSIA | cnn_hog3 | 98.98% | 98.98%
[2] INI-RTCV | HOG features (Set 2) + LDA | 96.32% | 96.32%
[4] INI-RTCV | HOG 1 + 1-NN (Euclidean) | 73.65% | 73.65%
[1] INI-RTCV | HOG features (Set 1) + LDA | 94.51% | 94.51%
[8] INI-RTCV | HOG 2 + 3-NN (Euclidean) | 72.81% | 72.81%
[199] INI-RTCV | Human performance | 98.81% | 98.81%
[5] INI-RTCV | HOG 2 + 1-NN (Euclidean) | 72.81% | 72.81%
[9] INI-RTCV | HOG 3 + 3-NN (Euclidean) | 73.82% | 73.82%
[6] INI-RTCV | HOG 3 + 1-NN (Euclidean) | 73.82% | 73.82%
[3] INI-RTCV | HOG features (Set 3) + LDA | 94.73% | 94.73%
[7] INI-RTCV | HOG 1 + 3-NN (Euclidean) | 73.89% | 73.89%
[149] italian_crash | MultiDataset Alg Fix1 | 95.34% | 95.34%
[156] italian_crash | MultiDataset Alg Fix3 | 95.48% | 95.48%
[72] italian_crash | MultiDataset Algorithm | 83.08% | 83.08%
[159] italian_crash | MultiDataset Alg Fix4 | 91.16% | 91.16%
[188] italian_crash | MultiDataset Alg Final | 95.35% | 95.35%
[150] italian_crash | multiDataset Alg Fix2 | 95.28% | 95.28%
[160] italian_crash | MultiDataset Alg Fix4.1 | 91.03% | 91.03%
[175] italian_crash | MultiDataset Alg Fix 5 | 96.78% | 96.78%
[164] MadRose | Hopfield Network | 88.84% | 88.84%
[16] masaki | linear SVM fixed | 95.60% | 95.60%
[43] masaki | Ensemble | 95.83% | 95.83%
[50] masaki | mix23 | 95.78% | 95.78%
[54] masaki | mix23scale | 96.07% | 96.07%
[65] masaki | mergef3 | 96.20% | 96.20%
[83] masaki | mergef8 | 96.29% | 96.29%
[94] masaki | whybrid1 | 96.34% | 96.34%
[17] masaki | linear SVM fixed 2 | 95.60% | 95.60%
[46] masaki | cmmodel 2 | 95.72% | 95.72%
[53] masaki | logistic23 | 95.82% | 95.82%
[57] masaki | merge5 | 96.25% | 96.25%
[64] masaki | HOG02s4c2w1 | 95.72% | 95.72%
[68] masaki | mergef5 | 96.31% | 96.31%
[79] masaki | mergef6 | 96.20% | 96.20%
[82] masaki | scale23s4c_5 | 96.29% | 96.29%
[86] masaki | 0000 | 96.27% | 96.27%
[93] masaki | hybrid3 | 96.28% | 96.28%
[18] masaki | linear SVM param selection | 95.81% | 95.81%
[45] masaki | Ensemble 4 | 95.82% | 95.82%
[49] masaki | Ensemble23 | 91.25% | 91.25%
[52] masaki | HOG2 | 95.54% | 95.54%
[56] masaki | mergeall | 96.05% | 96.05%
[63] masaki | scale23vote-s0 | 95.73% | 95.73%
[67] masaki | mergef4 | 96.28% | 96.28%
[78] masaki | scale23s4de | 96.29% | 96.29%
[81] masaki | mergef7 | 96.24% | 96.24%
[85] masaki | mergeff | 96.28% | 96.28%
[89] masaki | 0002 | 96.30% | 96.30%
[92] masaki | hybrid2 | 96.07% | 96.07%
[14] masaki | linear SVM | 3.67% | 3.67%
[33] masaki | libsvm --s | 95.82% | 95.82%
[37] masaki | linear svm --s param selection | 90.39% | 90.39%
[40] masaki | CMmodel | 95.72% | 95.72%
[44] masaki | zeroLinearSVM | 95.47% | 95.47%
[51] masaki | reduced2 svm | 95.24% | 95.24%
[55] masaki | mergef - mix, logis,23 cm2 | 96.20% | 96.20%
[66] masaki | hog02s4c4w1 | 95.72% | 95.72%
[80] masaki | HOG02s16c4w1 | 95.72% | 95.72%
[88] masaki | 0001 | 96.28% | 96.28%
[91] masaki | hybird-scale23s4de-mergef8 | 96.07% | 96.07%
[95] masaki | scale23s4c4 | 96.29% | 96.29%
[125] matiko | LDA+4SVM | 96.45% | 96.45%
[180] matiko | LDA+OVO+OVA | 96.16% | 96.16%
[124] matiko | 2LDA+3SVM | 96.44% | 96.44%
[129] matiko | 2LDA+6SVM, various features | 96.44% | 96.44%
[194] matiko | LDA+OVA | 96.44% | 96.44%
[123] matiko | 2LDA+2SVM | 96.40% | 96.40%
[128] matiko | 2LDA+4SVM - various features | 96.40% | 96.40%
[132] matiko | 2LDA+2SVM (2) | 96.37% | 96.37%
[122] matiko | LDA+2SVM | 96.38% | 96.38%
[127] matiko | 2LDA+5SVM | 96.42% | 96.42%
[167] nistor | HueHist normalization and LDA | 14.50% | 14.50%
[29] nistor | SOMCL | 1.93% | 1.93%
[162] nistor | HOG2 nomralize with LDA | 96.31% | 96.31%
[35] nistor | SOM CL | 3.55% | 3.55%
[39] nistor | HOGNEW2 | 2.04% | 2.04%
[172] nistor | HOG3 normalization + LDA | 94.70% | 94.70%
[38] nistor | HOGNEW | 2.04% | 2.04%
[96] nistor | sMsup | 2.32% | 2.32%
[135] nistor | HOG1 LDA modified | 0.29% | 0.29%
[58] noob | Subspace Analysis 1 | 88.54% | 88.54%
[61] noob | Subspace Analysis 4 | 92.09% | 92.09%
[69] noob | Subspace Analysis 6 | 94.84% | 94.84%
[76] noob | Subspace Analysis 10 | 96.42% | 96.42%
[13] noob | Nearest Subspace 1 | 18.77% | 18.77%
[87] noob | Subspace Analysis 13 | 96.73% | 96.73%
[90] noob | Subspace Analysis 14 | 96.65% | 96.65%
[138] noob | HOG + Data Clustering | 96.39% | 96.39%
[60] noob | Subspace Analysis 3 | 91.98% | 91.98%
[101] noob | HOG2 + VQ + Variant of LDA | 96.59% | 96.59%
[71] noob | Subspace Analysis 8 | 95.47% | 95.47%
[75] noob | Nearest Subspace 2 | 86.18% | 86.18%
[140] noob | HOG + Diaglinear Analysis | 95.82% | 95.82%
[70] noob | Subspace Analysis 7 | 96.67% | 96.67%
[157] noob | Combined HOG+VQ+LDA | 96.79% | 96.79%
[59] noob | Subspace Analysis 2 | 72.71% | 72.71%
[10] noob | Nearest Subspace | 87.70% | 87.70%
[62] noob | Subspace Analysis 5 | 91.57% | 91.57%
[73] noob | Subspace Analysis 9 | 96.74% | 96.74%
[77] noob | Subspace Analysis 11 | 96.61% | 96.61%
[84] noob | HOG+LDA+VQ | 96.87% | 96.87%
[47] olbustosa | RandomForest_ini | 64.15% | 64.15%
[121] olbustosa | RandomForest_Hog_20tree | 91.34% | 91.34%
[134] olbustosa | HOG_SVM | 76.35% | 76.35%
[148] olbustosa | RandomForest_Hog_Hue_100tree | 94.47% | 94.47%
[151] olbustosa | RandomForest_hog_hue_200tree | 95.05% | 95.05%
[133] olbustosa | NaiveBayesMultinomial | 73.94% | 73.94%
[107] olbustosa | NaiveBayes | 74.64% | 74.64%
[114] olbustosa | RandomForest-Hog | 83.24% | 83.24%
[130] olbustosa | RandomForest_HOG_100tree | 94.18% | 94.18%
[98] Radu.Timofte@VISICS | CS+HOGs | 96.63% | 96.63%
[184] Radu.Timofte@VISICS | SRC + LDAs I/HOG1/HOG2 | 97.35% | 97.35%
[176] Radu.Timofte@VISICS | CS+HOGs+limited | 97.01% | 97.01%
[183] Radu.Timofte@VISICS | IKSVM + PHOG + HOG2 | 97.88% | 97.88%
[163] RMULG | Subwindows+Filters+ET+LIBLINEA | 76.31% | 76.31%
[97] RMULG | Subwindows+ETGRAY+LIBLINEAR | 79.71% | 79.71%
[31] RMULG | Subwindows+ET | 67.43% | 67.43%
[42] RMULG | Subwindows+ETGRAY | 74.80% | 74.80%
[41] RMULG | Subwindows+ETK28 | 59.64% | 59.64%
[178] sermanet | EBLearn 2LConvNet ms 108 feats | 98.97% | 98.97%
[185] sermanet | EBLearn 2L CNN ms + validation | 98.41% | 98.41%
[27] sermanet | EBLearn 2-layer ConvNet ss | 98.20% | 98.20%
[187] sermanet | EBLearn 2LConvNet ms 108 + val | 98.89% | 98.89%
[198] sermanet | EBLearn 2-layer ConvNet ms reg | 98.41% | 98.41%
[26] sermanet | EBLearn 2-layer ConvNet ms | 98.59% | 98.59%
[173] shn | 2L NL SFA(HOG2) CCC | 95.19% | 95.19%
[190] shn | SFA(SFA(IM)+HOG) | 95.16% | 95.16%
[21] shn | test CC | 87.02% | 87.02%
[34] shn | Second Network CC | 89.70% | 89.70%
[179] shn | 2L NL SFA(HOG02)+RG+GC | 96.40% | 96.40%
[182] shn | 2L NL SFA(IM+HOG02)+CC | 95.12% | 95.11%
[20] shn | test GC | 86.62% | 86.62%
[189] soumith | Cascaded ConvNet Mixed arch | 94.06% | 94.06%
[158] soumith | Cascaded cscscf ConvNet | 90.19% | 90.19%
[186] soumith | Cascaded ConvNet 70 feat | 94.41% | 94.41%
[11] TDC | CVOG + CCV + NN (Team 1) | 82.37% | 82.37%
[105] TDC | CVOG+HOG(set1)+ANN | 91.81% | 91.81%
[12] TDC | CVOG + CCV + NN (Team 2) | 82.67% | 82.67%
[143] TDC | HOG+textures+ANN | 91.99% | 91.99%
[154] TDC | hybrid feature + ANN | 94.73% | 94.73%
[74] TDC | CVOG + ANN (Team 3) | 81.80% | 81.80%
[142] TDC | CVOG+HOG+texture+ANN | 94.73% | 94.73%
[99] TDC | HOG(set 1) + ANN | 90.99% | 90.99%
[153] TDC | Voting+16ANN | 48.25% | 48.25%
[161] testn | HOG1 features + LDA | 94.48% | 94.48%
[145] titan | linearSVM_Allpairs_HOG02 | 93.67% | 93.67%
[141] titan | linearSVM_ECOC_HOG02 | 95.70% | 95.70%
[144] titan | linearSVM_Allpairs_HOG123 | 94.16% | 94.16%
[147] titan | linearSVM_Allpairs_HOG02_2 | 93.67% | 93.67%
[168] titan | linearSVM_ECOC_ensemble | 95.90% | 95.90%
[171] titan | linearSVM_Allpairs_ensemble | 94.12% | 94.12%
[106] Tomato | HOG2 + matlab LDA | 96.32% | 96.32%
[137] Tomato | HOG0203 + LDA | 96.40% | 96.40%
[104] Tomato | HOG_02-L2RL2SVM | 95.89% | 95.89%
[108] Tomato | HOG_02_CramerAndSinger | 95.85% | 95.85%
[136] Tomato | HOG+LDA | 96.53% | 96.53%
[146] Tomato | HOGNoScaleLDA | 96.53% | 96.53%
[139] Tomato | HOGs+LDA | 91.88% | 91.88%
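
Several entries in the table, including the organizer baselines labelled "HOG features + LDA" [1-3], name a standard pipeline: compute histogram-of-oriented-gradients descriptors on the cropped sign images and classify them with linear discriminant analysis. The sketch below is a generic, minimal illustration of such a pipeline, not the organizers' or any team's implementation; the HOG parameters, the 40x40 grayscale input size, and the use of scikit-image/scikit-learn are assumptions for the example only.

# Minimal HOG + LDA sketch (illustrative; not the submitted implementations).
# Assumes images are already cropped and resized, e.g. to 40x40 grayscale arrays;
# dataset loading is omitted.
import numpy as np
from skimage.feature import hog
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import accuracy_score

def hog_features(images):
    """Compute one HOG descriptor per image (parameters chosen for illustration)."""
    return np.array([
        hog(img, orientations=8, pixels_per_cell=(5, 5), cells_per_block=(2, 2))
        for img in images
    ])

def evaluate(train_imgs, train_labels, test_imgs, test_labels):
    """Fit LDA on training descriptors and return the correct classification rate."""
    clf = LinearDiscriminantAnalysis()
    clf.fit(hog_features(train_imgs), train_labels)
    predictions = clf.predict(hog_features(test_imgs))
    return accuracy_score(test_labels, predictions)

The accuracy returned by such a run would correspond to the "Total" column above when computed on the full test set, and to the "Subset" column when computed on the evaluation subset.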

References

[1] HOG features (Set 1) + LDA, INI-RTCV

[2] HOG features (Set 2) + LDA, INI-RTCV

[3] HOG features (Set 3) + LDA, INI-RTCV

[4] HOG 1 + 1-NN (Euclidean), INI-RTCV

[5] HOG 2 + 1-NN (Euclidean), INI-RTCV

[6] HOG 3 + 1-NN (Euclidean), INI-RTCV

[7] HOG 1 + 3-NN (Euclidean), INI-RTCV

[8] HOG 2 + 3-NN (Euclidean), INI-RTCV

[9] HOG 3 + 3-NN (Euclidean), INI-RTCV

[10] Nearest Subspace, noob

[11] CVOG + CCV + NN (Team 1), TDC

[12] CVOG + CCV + NN (Team 2), TDC

[13] Nearest Subspace 1, noob

[14] linear SVM, masaki

[15] Optimal 1-layer ANN, Diego Peteiro-Barral

[16] linear SVM fixed, masaki

[17] linear SVM fixed 2, masaki

[18] linear SVM param selection, masaki

[19] Optimal 1-layer ANN (2), Diego Peteiro-Barral

[20] test GC, shn

[21] test CC, shn

[22] Optimal 1-layer ANN (3), Diego Peteiro-Barral

[23] Optimal 1-layer ANN (4), Diego Peteiro-Barral

[24] Optimal 1-layer ANN (5), Diego Peteiro-Barral

[26] EBLearn 2-layer ConvNet ms, sermanet

[27] EBLearn 2-layer ConvNet ss, sermanet

[29] SOMCL, nistor

[30] Contrast-Gaussian SVM, Alcalá-GRAM

[31] Subwindows+ET, RMULG

[32] Gaussian SVM param selection, AlcalaGRAM2

[33] libsvm --s, masaki

[34] Second Network CC, shn

[35] SOM CL, nistor

[36] Contrast Gaussian param modif, AlcalaGRAM2

[37] linear svm --s param selection, masaki

[38] HOGNEW, nistor

[39] HOGNEW2, nistor

[40] CMmodel, masaki

[41] Subwindows+ETK28, RMULG

[42] Subwindows+ETGRAY, RMULG

[43] Ensemble, masaki

[44] zeroLinearSVM, masaki

[45] Ensemble 4, masaki

[46] cmmodel 2, masaki

[47] RandomForest_ini, olbustosa

[48] met1, Brainsignals

[49] Ensemble23, masaki

[50] mix23, masaki

[51] reduced2 svm, masaki

[52] HOG2, masaki

[53] logistic23, masaki

[54] mix23scale, masaki

[55] mergef - mix, logis,23 cm2, masaki

[56] mergeall, masaki

[57] merge5, masaki

[58] Subspace Analysis 1, noob

[59] Subspace Analysis 2, noob

[60] Subspace Analysis 3, noob

[61] Subspace Analysis 4, noob

[62] Subspace Analysis 5, noob

[63] scale23vote-s0, masaki

[64] HOG02s4c2w1, masaki

[65] mergef3, masaki

[66] hog02s4c4w1, masaki

[67] mergef4, masaki

[68] mergef5, masaki

[69] Subspace Analysis 6, noob

[70] Subspace Analysis 7, noob

[71] Subspace Analysis 8, noob

[72] MultiDataset Algorithm, italian_crash

[73] Subspace Analysis 9, noob

[74] CVOG + ANN (Team 3), TDC

[75] Nearest Subspace 2, noob

[76] Subspace Analysis 10, noob

[77] Subspace Analysis 11, noob

[78] scale23s4de, masaki

[79] mergef6, masaki

[80] HOG02s16c4w1, masaki

[81] mergef7, masaki

[82] scale23s4c_5, masaki

[83] mergef8, masaki

[84] HOG+LDA+VQ, noob

[85] mergeff, masaki

[86] 0000, masaki

[87] Subspace Analysis 13, noob

[88] 0001, masaki

[89] 0002, masaki

[90] Subspace Analysis 14, noob

[91] hybird-scale23s4de-mergef8, masaki

[92] hybrid2, masaki

[93] hybrid3, masaki

[94] whybrid1, masaki

[95] scale23s4c4, masaki

[96] sMsup, nistor

[97] Subwindows+ETGRAY+LIBLINEAR, RMULG

[98] CS+HOGs, Radu.Timofte@VISICS

[99] HOG(set 1) + ANN, TDC

[100] KdTree + HOG 2520, CAOR

[101] HOG2 + VQ + Variant of LDA, noob

[102] KdTree + HOG 9720, CAOR

[103] KdTree + HOG 9720 kNN 5, CAOR

[104] HOG_02-L2RL2SVM, Tomato

[105] CVOG+HOG(set1)+ANN, TDC

[106] HOG2 + matlab LDA, Tomato

[107] NaiveBayes, olbustosa

[108] HOG_02_CramerAndSinger, Tomato

[109] KdTree HOG Set1, CAOR

[114] RandomForest-Hog, olbustosa

[115] KdTree HOG Set3 2, CAOR

[116] KdTree HOG 2 2, CAOR

[117] KdTree Hue, CAOR

[118] KdTree HOG9720 kNN5 iter 1000, CAOR

[119] KdTree + HOG 3456 kNN 5, CAOR

[121] RandomForest_Hog_20tree, olbustosa

[122] LDA+2SVM, matiko

[123] 2LDA+2SVM, matiko

[124] 2LDA+3SVM, matiko

[125] LDA+4SVM, matiko

[127] 2LDA+5SVM, matiko

[128] 2LDA+4SVM - various features, matiko

[129] 2LDA+6SVM, various features, matiko

[130] RandomForest_HOG_100tree, olbustosa

[132] 2LDA+2SVM (2), matiko

[133] NaiveBayesMultinomial, olbustosa

[134] HOG_SVM, olbustosa

[135] HOG1 LDA modified, nistor

[136] HOG+LDA, Tomato

[137] HOG0203 + LDA, Tomato

[138] HOG + Data Clustering, noob

[139] HOGs+LDA, Tomato

[140] HOG + Diaglinear Analysis, noob

[141] linearSVM_ECOC_HOG02, titan

[142] CVOG+HOG+texture+ANN, TDC

[143] HOG+textures+ANN, TDC

[144] linearSVM_Allpairs_HOG123, titan

[145] linearSVM_Allpairs_HOG02, titan

[146] HOGNoScaleLDA, Tomato

[147] linearSVM_Allpairs_HOG02_2, titan

[148] RandomForest_Hog_Hue_100tree, olbustosa

[149] MultiDataset Alg Fix1, italian_crash

[150] multiDataset Alg Fix2, italian_crash

[151] RandomForest_hog_hue_200tree, olbustosa

[152] ANN~80/600 (sect. hue+moments), bilbo

[153] Voting+16ANN, TDC

[154] hybrid feature + ANN, TDC

[155] simple alignment plus htsa, Brainsignals

[156] MultiDataset Alg Fix3, italian_crash

[157] Combined HOG+VQ+LDA, noob

[158] Cascaded cscscf ConvNet, soumith

[159] MultiDataset Alg Fix4, italian_crash

[160] MultiDataset Alg Fix4.1, italian_crash

[161] HOG1 features + LDA, testn

[162] HOG2 nomralize with LDA, nistor

[163] Subwindows+Filters+ET+LIBLINEA, RMULG

[164] Hopfield Network, MadRose

[165] ANN~88/1200(sect. hue+moments), bilbo

[166] CNN 6HL, IDSIA

[167] HueHist normalization and LDA, nistor

[168] linearSVM_ECOC_ensemble, titan

[169] ANN~88/700s(sect. hue+moments), bilbo

[170] CNN(IMG)_MLP(HOG3), IDSIA

[171] linearSVM_Allpairs_ensemble, titan

[172] HOG3 normalization + LDA, nistor

[173] 2L NL SFA(HOG2) CCC, shn

[174] Random trees HOG3, CAOR

[175] MultiDataset Alg Fix 5, italian_crash

[176] CS+HOGs+limited, Radu.Timofte@VISICS

[177] CNN(IMG)_MLP(HOG3)_MLP(HAAR), IDSIA

[178] EBLearn 2LConvNet ms 108 feats, sermanet

[179] 2L NL SFA(HOG02)+RG+GC, shn

[180] LDA+OVO+OVA, matiko

[181] Multiclass SVM with CV preproc, FiVENiNEs

[182] 2L NL SFA(IM+HOG02)+CC, shn

[183] IKSVM + PHOG + HOG2, Radu.Timofte@VISICS

[184] SRC + LDAs I/HOG1/HOG2, Radu.Timofte@VISICS

[185] EBLearn 2L CNN ms + validation, sermanet

[186] Cascaded ConvNet 70 feat, soumith

[187] EBLearn 2LConvNet ms 108 + val, sermanet

[188] MultiDataset Alg Final, italian_crash

[189] Cascaded ConvNet Mixed arch, soumith

[190] SFA(SFA(IM)+HOG), shn

[191] CNN 7HL, IDSIA

[192] CNN 7HL newnorm, IDSIA

[193] CNN 7HL norm, IDSIA

[194] LDA+OVA, matiko

[195] cnn_cnn_hog3_haar, IDSIA

[196] cnn_cnn_hog3, IDSIA

[197] cnn_hog3, IDSIA

[198] EBLearn 2-layer ConvNet ms reg, sermanet

[199] Human performance, INI-RTCV
