TY - GEN
T1 - Tuning machine-learning algorithms for battery-operated portable devices
AU - Lin, Ziheng
AU - Gu, Yan
AU - Chakraborty, Samarjit
N1 - Funding Information:
★ This work was partially supported by a National Research Foundation grant “Interactive Media Search” (grant # R-252-000-325-279).
PY - 2010
Y1 - 2010
N2 - Machine learning algorithms in various forms are now increasingly being used on a variety of portable devices, ranging from cell phones to PDAs. They often form part of standard applications (e.g., grammar checking in email clients) that run on these devices and occupy a significant fraction of processor and memory bandwidth. However, most research within the machine learning community has ignored issues such as the memory usage and power consumption of the processors running these algorithms. In this paper we investigate how machine-learned models can be developed in a power-aware manner for deployment on resource-constrained portable devices. We show that by tolerating a small loss in accuracy, it is possible to dramatically improve the energy consumption and data cache behavior of these algorithms. More specifically, we explore a typical sequential labeling problem, part-of-speech tagging in natural language processing, and show that a power-aware design can achieve up to a 50% reduction in power consumption at the cost of a minimal 3% decrease in tagging accuracy.
AB - Machine learning algorithms in various forms are now increasingly being used on a variety of portable devices, ranging from cell phones to PDAs. They often form part of standard applications (e.g., grammar checking in email clients) that run on these devices and occupy a significant fraction of processor and memory bandwidth. However, most research within the machine learning community has ignored issues such as the memory usage and power consumption of the processors running these algorithms. In this paper we investigate how machine-learned models can be developed in a power-aware manner for deployment on resource-constrained portable devices. We show that by tolerating a small loss in accuracy, it is possible to dramatically improve the energy consumption and data cache behavior of these algorithms. More specifically, we explore a typical sequential labeling problem, part-of-speech tagging in natural language processing, and show that a power-aware design can achieve up to a 50% reduction in power consumption at the cost of a minimal 3% decrease in tagging accuracy.
KW - Low-power Machine Learned Models
KW - Mobile Machine Learning Applications
KW - Part-of-speech Tagging
KW - Power-aware Design
UR - http://www.scopus.com/inward/record.url?scp=78650911244&partnerID=8YFLogxK
U2 - 10.1007/978-3-642-17187-1_48
DO - 10.1007/978-3-642-17187-1_48
M3 - Conference contribution
AN - SCOPUS:78650911244
SN - 3642171869
SN - 9783642171864
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 502
EP - 513
BT - Information Retrieval Technology - 6th Asia Information Retrieval Societies Conference, AIRS 2010, Proceedings
T2 - 6th Asia Information Retrieval Societies Conference, AIRS 2010
Y2 - 1 December 2010 through 3 December 2010
ER -