Approximate computing survey, part II: Application-specific & architectural approximation techniques and applications. ACM Computing Surveys, 57(7):1–36.
1 Pith paper cites this work. Polarity classification is still indexing.
Fields: cs.LG
Year: 2026
Verdict: UNVERDICTED
Representative citing papers: 1
TRAM: Training Approximate Multiplier Structures for Low-Power AI Accelerators
TRAM achieves up to 27% power reduction in multipliers for CNNs and vision transformers by jointly training model weights and approximate multiplier designs.
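The abstract describes jointly training model weights against an approximate multiplier. As a rough illustration only (not the authors' actual method), the sketch below models a hypothetical truncating fixed-point multiplier and trains a single weight through it with a straight-through-style gradient: the forward pass uses the approximate product, while the backward pass uses the exact one. The multiplier model, bit width, and training setup are all assumptions for illustration.

```python
import numpy as np

def approx_mul(a, b, bits=4):
    # Hypothetical approximate multiplier: operands are truncated to
    # `bits` fractional bits before multiplying, modeling hardware error.
    scale = 2 ** bits
    return (np.round(a * scale) / scale) * (np.round(b * scale) / scale)

# Toy joint-training loop: fit y = 0.7 * x where the forward pass goes
# through the approximate multiplier, but the gradient is computed as if
# the multiply were exact (a straight-through-style estimator).
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 256)
y = 0.7 * x          # target function
w = 0.0              # trainable weight
lr = 0.5
for _ in range(200):
    pred = approx_mul(w, x)             # hardware-aware forward pass
    grad = np.mean(2 * (pred - y) * x)  # exact-product backward pass
    w -= lr * grad
```

After training, `w` settles near 0.7, the point where the quantized weight best compensates for the multiplier's truncation error; this is the basic idea behind making a model aware of its approximate arithmetic during training.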