TY - JOUR
T1 - DeepTLF
T2 - robust deep neural networks for heterogeneous tabular data
AU - Borisov, Vadim
AU - Broelemann, Klaus
AU - Kasneci, Enkelejda
AU - Kasneci, Gjergji
N1 - Publisher Copyright:
© 2022, The Author(s).
PY - 2023/6
Y1 - 2023/6
N2 - Although deep neural networks (DNNs) constitute the state of the art in many tasks based on visual, audio, or text data, their performance on heterogeneous, tabular data is typically inferior to that of decision tree ensembles. To bridge the gap between the difficulty of DNNs in handling tabular data and the flexibility of deep learning under input heterogeneity, we propose DeepTLF, a framework for deep tabular learning. The core idea of our method is to transform the heterogeneous input data into homogeneous data to considerably boost the performance of DNNs. For the transformation step, we develop a novel knowledge distillation approach, TreeDrivenEncoder, which exploits the structure of decision trees trained on the available heterogeneous data to map the original input vectors onto homogeneous vectors that a DNN can use to improve its predictive performance. Within the proposed framework, we also address the issue of multimodal learning, since it is challenging to apply decision tree ensemble methods when other data modalities are present. Through extensive and challenging experiments on various real-world datasets, we demonstrate that the DeepTLF pipeline leads to higher predictive performance. On average, our framework shows a 19.6% performance improvement compared to DNNs. The DeepTLF code is publicly available.
AB - Although deep neural networks (DNNs) constitute the state of the art in many tasks based on visual, audio, or text data, their performance on heterogeneous, tabular data is typically inferior to that of decision tree ensembles. To bridge the gap between the difficulty of DNNs in handling tabular data and the flexibility of deep learning under input heterogeneity, we propose DeepTLF, a framework for deep tabular learning. The core idea of our method is to transform the heterogeneous input data into homogeneous data to considerably boost the performance of DNNs. For the transformation step, we develop a novel knowledge distillation approach, TreeDrivenEncoder, which exploits the structure of decision trees trained on the available heterogeneous data to map the original input vectors onto homogeneous vectors that a DNN can use to improve its predictive performance. Within the proposed framework, we also address the issue of multimodal learning, since it is challenging to apply decision tree ensemble methods when other data modalities are present. Through extensive and challenging experiments on various real-world datasets, we demonstrate that the DeepTLF pipeline leads to higher predictive performance. On average, our framework shows a 19.6% performance improvement compared to DNNs. The DeepTLF code is publicly available.
KW - Deep neural networks
KW - Heterogeneous data
KW - Multimodal learning
KW - Tabular data
KW - Tabular data encoding
UR - http://www.scopus.com/inward/record.url?scp=85136811073&partnerID=8YFLogxK
U2 - 10.1007/s41060-022-00350-z
DO - 10.1007/s41060-022-00350-z
M3 - Article
AN - SCOPUS:85136811073
SN - 2364-415X
VL - 16
SP - 85
EP - 100
JO - International Journal of Data Science and Analytics
JF - International Journal of Data Science and Analytics
IS - 1
ER -