Hongjun Wang
Research Associate
Supervisor of Master's Candidates
- Education Level: PhD graduate
- Degree: Doctor of Engineering
- Business Address: Room 31529, Teaching Building No. 3, Xipu Campus
- Professional Title: Research Associate
- Alma Mater: Sichuan University (四川大学)
- School/Department: School of Computing and Artificial Intelligence (计算机与人工智能学院)
- Discipline: Electronic Information; Software Engineering; Computer Application Technology
Contact Information
- Postal Address:
- Email:
Paper Publications
Multi-local Collaborative AutoEncoder
- Impact Factor: 8.139
- DOI: 10.1016/j.knosys.2021.107844
- Affiliation of Author(s): Southwest Jiaotong University (西南交通大学)
- Journal: KNOWLEDGE-BASED SYSTEMS
- Place of Publication: Netherlands
- Key Words: Restricted Boltzmann machine; Autoencoder; Deep collaborative representation; Feature learning; Unsupervised clustering
- Abstract: The excellent representation-learning performance of autoencoders has attracted considerable interest in various applications. However, their encoding procedure ignores the structure and multi-local collaborative relationships of unlabeled data, which limits their feature-extraction capability. This paper presents a Multi-local Collaborative AutoEncoder (MC-AE), which consists of novel multi-local collaborative representation RBM (mcrRBM) and multi-local collaborative representation GRBM (mcrGRBM) models. Here, the Locality Sensitive Hashing (LSH) method is used to divide the input data into multi-local cross blocks that contain the multi-local collaborative relationships of the unlabeled data and features, since similar multi-local instances and features of the input data are placed in the same block. In the mcrRBM and mcrGRBM models, the structure and multi-local collaborative relationships of the unlabeled data are integrated into the encoding procedure. The local hidden features then converge on the center of each local collaborative block. Under the joint influence of the local collaborative blocks, the proposed MC-AE has a powerful representation-learning capability for unsupervised clustering. However, the MC-AE model may require a long training time on large-scale, high-dimensional datasets, because more local collaborative blocks are integrated into it. The five most closely related deep models are compared with MC-AE. The experimental results show that the proposed MC-AE has better collaborative representation and generalization capabilities than the compared deep models.
- Co-authors: Jing Liu, Zeng Yu, Tianrui Li
- First Author:Jielei Chu
- Indexed by:Academic papers
- Correspondence Author:Hongjun Wang
- Document Code:20220211432876
- Discipline:Engineering
- First-Level Discipline:Computer Science and Technology
- Volume:Volume 239
- Issue:5 March 2022
- Page Number:107844
- ISSN No.:0950-7051
- Translation or Not:no
- Date of Publication:2021-12-29
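The abstract's first step, using Locality Sensitive Hashing to group similar instances into the same local block, can be sketched with random-hyperplane LSH. This is an illustrative reconstruction under assumed details, not the authors' implementation; `lsh_blocks`, `n_planes`, and the toy data are my own names and choices.

```python
import numpy as np

def lsh_blocks(X, n_planes=8, seed=0):
    """Hash each row of X into a bucket via random-hyperplane LSH.

    Rows with the same sign pattern under the random projections share a
    bucket key, so similar instances tend to land in the same local block.
    """
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((X.shape[1], n_planes))  # random hyperplanes
    signs = (X @ planes) >= 0                             # one sign bit per plane
    # Pack the per-row sign bits into a single integer bucket key.
    keys = signs @ (1 << np.arange(n_planes))
    blocks = {}
    for i, k in enumerate(keys):
        blocks.setdefault(int(k), []).append(i)
    return blocks

# Toy usage: two well-separated clusters; nearby points tend to share a block.
rng = np.random.default_rng(1)
X = np.vstack([rng.standard_normal((50, 16)) + 5,
               rng.standard_normal((50, 16)) - 5])
blocks = lsh_blocks(X)
```

Each resulting block would then feed the collaborative encoding step of the mcrRBM/mcrGRBM models; more planes give smaller, finer-grained blocks.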