TY - GEN
T1 - Joint and progressive learning from high-dimensional data for multi-label classification
AU - Hong, Danfeng
AU - Yokoya, Naoto
AU - Xu, Jian
AU - Zhu, Xiaoxiang
N1 - Publisher Copyright:
© Springer Nature Switzerland AG 2018.
PY - 2018
Y1 - 2018
N2 - Despite the fact that nonlinear subspace learning techniques (e.g., manifold learning) have been successfully applied to data representation, there is still room for improvement in explainability (explicit mapping), generalization (out-of-sample extension), and cost-effectiveness (linearization). To this end, a novel linearized subspace learning technique is developed in a joint and progressive way, called the joint and progressive learning strategy (J-Play), with its application to multi-label classification. J-Play learns high-level and semantically meaningful feature representations from high-dimensional data by (1) jointly performing multiple subspace learning and classification to find a latent subspace where samples are expected to be better classified; (2) progressively learning multi-coupled projections to linearly approach the optimal mapping bridging the original space with the most discriminative subspace; and (3) locally embedding the manifold structure in each learnable latent subspace. Extensive experiments demonstrate the superiority and effectiveness of the proposed method in comparison with previous state-of-the-art methods.
AB - Despite the fact that nonlinear subspace learning techniques (e.g., manifold learning) have been successfully applied to data representation, there is still room for improvement in explainability (explicit mapping), generalization (out-of-sample extension), and cost-effectiveness (linearization). To this end, a novel linearized subspace learning technique is developed in a joint and progressive way, called the joint and progressive learning strategy (J-Play), with its application to multi-label classification. J-Play learns high-level and semantically meaningful feature representations from high-dimensional data by (1) jointly performing multiple subspace learning and classification to find a latent subspace where samples are expected to be better classified; (2) progressively learning multi-coupled projections to linearly approach the optimal mapping bridging the original space with the most discriminative subspace; and (3) locally embedding the manifold structure in each learnable latent subspace. Extensive experiments demonstrate the superiority and effectiveness of the proposed method in comparison with previous state-of-the-art methods.
KW - Alternating direction method of multipliers
KW - High-dimensional data
KW - Joint learning
KW - Manifold regularization
KW - Multi-label classification
KW - Progressive learning
UR - http://www.scopus.com/inward/record.url?scp=85055421780&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-01237-3_29
DO - 10.1007/978-3-030-01237-3_29
M3 - Conference contribution
AN - SCOPUS:85055421780
SN - 9783030012366
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 478
EP - 493
BT - Computer Vision – ECCV 2018 – 15th European Conference, 2018, Proceedings
A2 - Ferrari, Vittorio
A2 - Sminchisescu, Cristian
A2 - Weiss, Yair
A2 - Hebert, Martial
PB - Springer Verlag
T2 - 15th European Conference on Computer Vision, ECCV 2018
Y2 - 8 September 2018 through 14 September 2018
ER -