ProtoFL: Unsupervised Federated Learning via Prototypical Distillation

Category: Paper
Date: 2023/10/02
Year: 2023
Venue: ICCV 2023
Authors: Hansol Kim, Youngjun Kwak, Jinho Shin
Hansol Kim and Youngjun Kwak contributed equally to this work.

Abstract

Federated learning (FL) is a promising approach for enhancing data privacy preservation, particularly for authentication systems. However, limited round communications, scarce representation, and scalability pose significant challenges to its deployment, hindering its full potential. In this paper, we propose ‘ProtoFL’, Prototypical Representation Distillation based unsupervised Federated Learning to enhance the representation power of a global model and reduce round communication costs. Additionally, we introduce a local one-class classifier based on normalizing flows to improve performance with limited data. Our study represents the first investigation of using FL to improve one-class classification performance. We conduct extensive experiments on five widely used benchmarks, namely MNIST, CIFAR-10, CIFAR-100, ImageNet-30, and Keystroke-Dynamics, to demonstrate the superior performance of our proposed framework over previous methods in the literature.
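For illustration, the sketch below shows one way a prototype-based representation distillation objective could be written in PyTorch. The abstract does not give ProtoFL's actual loss, so the prototype_distillation_loss function, its softened KL formulation, the temperature value, and the tensor shapes are all assumptions for exposition, not the paper's implementation.

# Minimal illustrative sketch (assumed formulation, not ProtoFL's published loss):
# align a local (student) encoder's prototype-similarity distribution with the
# global (teacher) encoder's, so clients only need compact prototypes, not raw data.
import torch
import torch.nn.functional as F


def prototype_distillation_loss(student_feats, teacher_feats, prototypes, temperature=0.1):
    """KL divergence between softened prototype-similarity distributions.

    student_feats, teacher_feats: (batch, dim) embeddings from the local and
    global encoders; prototypes: (num_prototypes, dim). All names and shapes
    here are hypothetical.
    """
    protos = F.normalize(prototypes, dim=-1)
    s = F.normalize(student_feats, dim=-1) @ protos.T   # (batch, num_prototypes)
    t = F.normalize(teacher_feats, dim=-1) @ protos.T   # (batch, num_prototypes)
    # Teacher similarities are detached: gradients flow only to the student.
    return F.kl_div(
        F.log_softmax(s / temperature, dim=-1),
        F.softmax(t.detach() / temperature, dim=-1),
        reduction="batchmean",
    )


if __name__ == "__main__":
    student = torch.randn(8, 128)
    teacher = torch.randn(8, 128)
    prototypes = torch.randn(16, 128)
    print(prototype_distillation_loss(student, teacher, prototypes))

Exchanging a small set of prototypes instead of raw features is one plausible way such a scheme could keep per-round communication light, in line with the abstract's stated goal of reducing round communication costs; the paper itself should be consulted for the actual objective and protocol.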

KakaoBank Financial Tech Lab (금융기술연구소)
Copyright ⓒ KakaoBank Corp. All rights reserved.