Saliency-based Weighted Multilabel Linear Discriminant Analysis

Lei Xu*, Jenni Raitoharju, Alexandros Iosifidis, Moncef Gabbouj

*Corresponding author for this work

    Research output: Journal article › Research › peer-review


    Abstract

    Linear discriminant analysis (LDA) is a classical statistical machine-learning method that seeks a linear data transformation increasing class discrimination in an optimal discriminant subspace. Traditional LDA assumes Gaussian class distributions and single-label data annotations. In this article, we propose a new variant of LDA for dimensionality reduction in multilabel classification tasks, intended to enhance the subsequent performance of any multilabel classifier. A probabilistic class saliency estimation approach is introduced for computing saliency-based weights for all instances. We use these weights to redefine the between-class and within-class scatter matrices needed for calculating the projection matrix. We formulate six variants of the proposed saliency-based multilabel LDA (SMLDA), each based on different prior information, extracted from labels and features, about the importance of each instance for its class(es). Our experiments show that the proposed SMLDA leads to performance improvements in various multilabel classification problems compared to several competing dimensionality reduction methods.
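    The weighted scatter construction outlined in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the per-instance, per-class weight matrix `W` is taken as given (the paper derives it via probabilistic class saliency estimation), and the function name `smlda_projection` and the small regularization term are assumptions made for the sketch.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    def smlda_projection(X, Y, W, n_components):
        """Sketch of a saliency-weighted multilabel LDA projection.

        X : (n, d) data matrix
        Y : (n, C) binary multilabel indicator matrix
        W : (n, C) per-instance, per-class saliency weights
            (assumed given; the paper estimates them probabilistically)
        """
        n, d = X.shape
        C = Y.shape[1]

        # Only instances labeled with class c contribute to class c.
        WY = W * Y                                   # (n, C) effective weights
        class_weight = WY.sum(axis=0)                # total weight per class

        # Weighted class means and weighted global mean.
        means = (WY.T @ X) / class_weight[:, None]   # (C, d)
        mu = (class_weight @ means) / class_weight.sum()

        # Weighted between-class scatter.
        diff_b = means - mu
        Sb = (diff_b.T * class_weight) @ diff_b

        # Weighted within-class scatter.
        Sw = np.zeros((d, d))
        for c in range(C):
            diff_w = X - means[c]
            Sw += (diff_w.T * WY[:, c]) @ diff_w

        # Solve the generalized eigenproblem Sb v = lambda Sw v;
        # a small ridge keeps Sw positive definite.
        Sw += 1e-6 * np.eye(d)
        _, evecs = eigh(Sb, Sw)
        return evecs[:, ::-1][:, :n_components]      # top eigenvectors
    ```

    With uniform weights (`W` all ones) and single-label `Y`, this reduces to classical LDA, which is one way to sanity-check an implementation.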

    Original language: English
    Journal: IEEE Transactions on Cybernetics
    Volume: 52
    Issue: 10
    Pages (from-to): 10200-10213
    Number of pages: 14
    ISSN: 2168-2267
    DOIs
    Publication status: Published - Oct 2022

    Keywords

    • Class saliency
    • dimensionality reduction
    • linear discriminant analysis (LDA)
    • multilabel classification

