Monash University

Adversarial Regularisation and Knowledge Distillation for Deep Learning Tasks.

Thesis
Posted on 2024-01-08, authored by Thanh Duc Van Nguyen
Deep learning has made remarkable progress in various applications. In this thesis, we aim to advance robust machine learning, deep semi-supervised learning, and model compression. We improve these learning tasks by developing adversarial regularisation and knowledge distillation methods. The first objective is to gain an understanding of adversarial attacks and knowledge distillation; through this exploration, we provide valuable interpretations and insights into both the adversarial attack mechanism and the knowledge distillation process. The second objective is to develop and enhance adversarial regularisation and knowledge distillation techniques, investigating their underlying principles to improve robust deep learning, deep semi-supervised learning, and model compression tasks.

History

Campus location

Australia

Principal supervisor

Dinh Phung

Additional supervisor 1

Jianfei Cai

Additional supervisor 2

He Zhao

Year of Award

2024

Department, School or Centre

Data Science & Artificial Intelligence

Course

Doctor of Philosophy

Degree Type

DOCTORATE

Faculty

Faculty of Information Technology