Academic Paper
Incorporating BERT-based NLP and Transformer for An Ensemble Model and its Application to Personal Credit Prediction
Views: 77
- English title
- Incorporating BERT-based NLP and Transformer for An Ensemble Model and its Application to Personal Credit Prediction
- Publisher
- Korean Society of Smart Media (한국스마트미디어학회)
- Authors
- Sophot Ky, Kwangtek Na, Ju-Hong Lee
- Journal information
- Smart Media Journal (스마트미디어저널), Vol. 13, No. 4, pp. 9-15 (7 pages)
- Subject classification
- Engineering > Computer Science
- File format
- Publication date
- 2024.04.30
Korean Abstract
English Abstract
Tree-based algorithms have been the dominant methods for building prediction models on tabular data, including personal credit data. However, they are compatible only with categorical and numerical features and do not capture information about the relationships between features. In this work, we propose an ensemble model based on the Transformer architecture that incorporates text features and harnesses the self-attention mechanism to address this limitation on feature relationships. We describe a text formatter module that converts the original tabular data into sentence data, which is fed into FinBERT along with the other text features. In addition, we employ an FT-Transformer trained on the original tabular data. We evaluate this multi-modal approach against two popular tree-based algorithms, Random Forest and Extreme Gradient Boosting (XGBoost), as well as TabTransformer. Our proposed method achieves superior Default Recall, F1 score, and AUC results on two public data sets. These results can help financial institutions reduce the risk of financial loss from defaulters.
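The text-formatter idea described in the abstract can be illustrated with a minimal sketch. The serialization template and the row_to_sentence helper below are assumptions for illustration only, not the authors' implementation; the paper's actual formatting of tabular records into sentences is not reproduced here.

```python
# Hypothetical sketch of a text-formatter module: it serializes one tabular
# credit record (column -> value) into a natural-language sentence so that a
# BERT-style encoder such as FinBERT can consume it alongside free-text features.
# The template used here is an assumption, not the paper's exact format.

def row_to_sentence(row: dict) -> str:
    """Convert a tabular record into a single descriptive sentence."""
    parts = [f"{col.replace('_', ' ')} is {val}" for col, val in row.items()]
    return "The applicant's " + ", ".join(parts) + "."

# Example usage on a toy credit record.
record = {
    "age": 35,
    "annual_income": 48000,
    "loan_amount": 12000,
    "employment_type": "salaried",
}
print(row_to_sentence(record))
# -> "The applicant's age is 35, annual income is 48000,
#     loan amount is 12000, employment type is salaried."
```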
Table of Contents
Ⅰ. INTRODUCTION
Ⅱ. PROPOSED METHOD
Ⅲ. EXPERIMENT
Ⅳ. CONCLUSION
REFERENCES