Wang Chengjing, Associate Professor
  • Education: doctoral graduate
  • Degree: Ph.D. in Science
  • Office: School of Mathematics, Southwest Jiaotong University
  • Alma mater: National University of Singapore
  • Affiliation: School of Mathematics
Publications
  • Journal: IEEE Transactions on Signal Processing
  • Keywords: Support vector machines, semismooth Newton method, augmented Lagrangian method, hidden sparsity
  • Abstract: Support vector machines (SVMs) are successful supervised learning models that analyze data for classification and regression. Previous work has demonstrated the superiority of SVMs on high-dimensional, low-sample-size problems. However, growing sample sizes make it challenging to solve large-scale SVMs accurately and efficiently, especially nonlinear kernel SVMs, which may incur huge computational costs and an unaffordable storage burden. In this paper, we propose a highly efficient sparse semismooth Newton (SsN) based augmented Lagrangian (AL) method for solving a class of large-scale SVMs that can be formulated as a convex quadratic programming problem with a linear equality constraint and a simple box constraint. The asymptotic superlinear convergence rate of both the primal and the dual iteration sequences generated by the AL method is guaranteed by the piecewise linear-quadratic structure of the problem. Furthermore, we reveal the close connection between the number of support vectors and the sparse structure of the generalized Jacobian for the inner subproblem of the AL method. By exploiting this hidden sparsity, the inner subproblem can be solved by the SsN method efficiently and accurately, which greatly reduces the storage burden and computational costs. In particular, for nonlinear kernel SVMs, since the sparse structure may not manifest in the early iterations of the AL method, we first solve a linear kernel SVM approximated by the random Fourier features method to produce a good initial point, and then switch to solving the original problem. Numerical experiments demonstrate that the proposed algorithm outperforms the current state-of-the-art solvers for large-scale SVMs.
  • Co-authors: Tang Peipei, Wang Qingsong, Song Enbin
  • First author: Niu Dunbiao
  • Paper type: SCI
  • Corresponding author: Wang Chengjing
  • Discipline: Natural Sciences
  • Volume: 70
  • Pages: 5608-5623
  • Translated work:
  • Publication date: 2022-12-07
  • Indexed in: SCI
  • Attachment: Published_version_An_Efficient_Algorithm_for_a_Class_of_Large-scale.pdf
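The abstract mentions using the random Fourier features method to approximate a nonlinear kernel SVM by a linear one and produce a good initial point. As an illustration only (not the paper's code), the sketch below shows the standard random Fourier features construction for an RBF kernel in NumPy; all variable names and parameter values are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, D = 200, 5, 2000   # samples, input dimension, number of random features
sigma = 1.0              # RBF bandwidth

X = rng.standard_normal((n, d))

# Exact RBF kernel matrix: k(x, y) = exp(-||x - y||^2 / (2 sigma^2))
sq = np.sum(X**2, axis=1)
K_exact = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / (2 * sigma**2))

# Random Fourier features (Rahimi & Recht): phi(x) = sqrt(2/D) cos(W^T x + b),
# with W drawn from the kernel's spectral density N(0, sigma^{-2} I)
W = rng.standard_normal((d, D)) / sigma
b = rng.uniform(0, 2 * np.pi, size=D)
Phi = np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Phi @ Phi.T approximates K_exact, so a linear SVM on Phi
# approximates the nonlinear kernel SVM
K_approx = Phi @ Phi.T
err = np.abs(K_exact - K_approx).max()
print(err)  # shrinks as D grows
```

With the features in hand, a linear SVM trained on `Phi` stands in for the nonlinear kernel SVM; in the paper this approximate solution only serves as a warm start before the original problem is solved.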

  • How to apply to study under this supervisor

    You are welcome to apply to become a graduate student of Prof. Wang Chengjing. There are the following routes:

    1. Participate in the Southwest Jiaotong University summer camp. When submitting your supervisor preference, select Prof. Wang Chengjing; your application information will be forwarded to him, and he will contact you after reviewing it.

    2. If you obtain recommendation-for-exemption (推免) eligibility from your home institution, you are welcome to apply via that route: submit an application through the pre-registration system for recommended students and select Prof. Wang Chengjing as your intended supervisor; he will contact you after seeing your information.

    3. Take the National Postgraduate Entrance Examination and apply for a program and research direction under Prof. Wang Chengjing; after advancing to the second-round interview, select him when submitting your supervisor preference.

    4. If you are interested in pursuing a doctorate under Prof. Wang Chengjing, you may apply through the application-assessment route or the unified entrance examination.
