Journal Article

A Computing-in-Memory-Based One-Class Hyperdimensional Computing Model for Outlier Detection
Document Type
Periodical
Source
IEEE Transactions on Computers, 73(6):1559-1574, Jun. 2024
Subject
Computing and Processing
Anomaly detection
Task analysis
Training
Testing
Software algorithms
Forestry
Computers
Hyperdimensional computing
outlier detection
computing-in-memory
hardware/software codesign
Language
English
ISSN
0018-9340
1557-9956
2326-3814
Abstract
In this work, we present ODHD, an algorithm for outlier detection based on hyperdimensional computing (HDC), a non-classical learning paradigm. Along with the HDC-based algorithm, we propose IM-ODHD, a computing-in-memory (CiM) implementation based on hardware/software (HW/SW) codesign for improved latency and energy efficiency. The training and testing phases of ODHD may be performed with conventional CPU/GPU hardware or with IM-ODHD, our SRAM-based CiM architecture, using the proposed HW/SW codesign techniques. We evaluate the performance of ODHD on six datasets from different application domains using three metrics, namely accuracy, F1 score, and ROC-AUC, and compare it with multiple baseline methods such as OCSVM, isolation forest, and autoencoder. The experimental results indicate that ODHD outperforms all the baseline methods in terms of these three metrics on every dataset for both CPU/GPU and CiM implementations. Furthermore, we perform an extensive design space exploration to demonstrate the tradeoff between delay, energy efficiency, and performance of ODHD. We demonstrate that the HW/SW codesign implementation of outlier detection on IM-ODHD is able to outperform the GPU-based implementation of ODHD by at least 331.5×/889× in terms of training/testing latency (and on average 14.0×/36.9× in terms of training/testing energy consumption).
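At a high level, one-class HDC outlier detection of the kind the abstract describes encodes inlier training samples into high-dimensional vectors, bundles them into a single class hypervector, and flags test samples whose similarity to that hypervector is low. The following NumPy sketch illustrates that general scheme only; the random bipolar projection encoder, dimensionality, and scoring used here are illustrative assumptions, not the actual ODHD design from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4096          # hypervector dimensionality (illustrative choice)
n_features = 8    # input feature dimensionality (illustrative)

# Random bipolar projection encoder: maps a feature vector to {-1, +1}^D.
# (A common HDC encoding; the actual ODHD encoder may differ.)
proj = rng.standard_normal((D, n_features))

def encode(x):
    return np.where(proj @ x >= 0, 1.0, -1.0)

def train_one_class(X):
    # Bundle (elementwise-sum) the hypervectors of all inlier training
    # samples into one class hypervector, then binarize by majority sign.
    return np.where(np.sum([encode(x) for x in X], axis=0) >= 0, 1.0, -1.0)

def cosine(a, b):
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Toy data: inliers are small perturbations of a common direction mu;
# the "outlier" points the opposite way in feature space.
mu = rng.standard_normal(n_features)
inliers = mu + 0.1 * rng.standard_normal((200, n_features))
H = train_one_class(inliers)

inlier_score = cosine(encode(inliers[0]), H)   # high similarity to the class
outlier_score = cosine(encode(-mu), H)         # low similarity => outlier
print(inlier_score > outlier_score)
```

A sample is scored by its similarity to the bundled class hypervector, and a threshold on that score (omitted here) separates inliers from outliers; the paper's CiM contribution maps this similarity search onto SRAM-based in-memory hardware.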