
Amortized Baseline Selection via Rank-Revealing QR for Efficient Model Explanation

Category: Paper
Status: Registered
Date: 2025/11/25
Year: 2025
Venue: CIKM2025
Authors: Yeeun Yoo, Daehee Han, Hyeongeun Lee

Abstract

Model-agnostic explanation methods are essential for interpreting machine learning models, but they suffer from prohibitive computational costs that scale with the number of baselines. Existing acceleration approaches either lack a theoretical foundation or provide no principled guidance for baseline selection. To address this gap, we present ABSQR (Amortized Baseline Selection via Rank-Revealing QR), a framework that exploits the low-rank structure of value matrices to accelerate multi-baseline attribution methods. Our approach combines deterministic baseline selection via SVD-guided QR decomposition with an amortized inference mechanism that uses cluster-based retrieval. We reduce computational complexity from O(m · 2^d) to O(k · 2^d), where k ≪ m. Experiments demonstrate that ABSQR achieves a 91.2% agreement rate with full-baseline methods while providing an 8.5× speedup across diverse datasets. As the first acceleration approach that preserves explanation error guarantees under computational speedup, ABSQR makes the practical deployment of interpretable AI systems feasible at scale.
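The selection step described in the abstract, estimating the numerical rank of the value matrix via SVD and then picking representative baselines with a rank-revealing (column-pivoted) QR, can be illustrated with a minimal sketch. This is not the paper's implementation: the function name `select_baselines`, the `energy` threshold, and the synthetic data are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import qr

def select_baselines(V, energy=0.95):
    """Pick k representative rows (baselines) of a value matrix V (m x n).

    Sketch of the idea in ABSQR: estimate the numerical rank k of V from
    its singular-value energy, then use column-pivoted QR on V^T so the
    first k pivots index the most independent baseline rows.
    """
    # Estimate rank k: smallest k whose leading singular values
    # capture the requested fraction of the total energy.
    s = np.linalg.svd(V, compute_uv=False)
    cum = np.cumsum(s**2) / np.sum(s**2)
    k = int(np.searchsorted(cum, energy) + 1)

    # Rank-revealing QR: pivoting orders columns of V^T (rows of V)
    # by how much new directionality each one contributes.
    _, _, piv = qr(V.T, pivoting=True, mode='economic')
    return np.sort(piv[:k])

# Example: 100 candidate baselines lying in an exact rank-3 subspace,
# with singular values 10, 8, 6 so the rank estimate is deterministic.
rng = np.random.default_rng(0)
U = np.linalg.qr(rng.normal(size=(100, 3)))[0]
W = np.linalg.qr(rng.normal(size=(20, 3)))[0]
V = U @ np.diag([10.0, 8.0, 6.0]) @ W.T

idx = select_baselines(V)
print(len(idx))  # 3 baselines selected out of 100
```

The amortization side of the method (cluster-based retrieval of precomputed selections at inference time) would sit on top of a routine like this, reusing the selected subset across queries instead of re-running the decomposition per explanation.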

KakaoBank Financial Tech Lab
Copyright ⓒ KakaoBank Corp. All rights reserved.