The use of optimization techniques for data separation and data analysis dates back more than thirty years. According to O. L. Mangasarian, his group formulated linear programming as a large-margin classifier in the 1960s. Today, classical optimization techniques are widely used to solve various data mining problems, with convex optimization and mathematical programming occupying center stage. Thanks to convex optimization's elegant property that any local optimum is global, many problems can be cast into the convex optimization framework, such as Support Vector Machines, graph-based manifold learning, and clustering; these can usually be solved by convex Quadratic Programming, Semi-Definite Programming, or Eigenvalue Decomposition. Another research emphasis is the application of mathematical programming to classification. Over the last twenty years, researchers have extensively applied quadratic programming to classification, most notably in V. Vapnik's Support Vector Machine and its many applications.
As time goes by, new problems constantly emerge in the data mining community, such as Time-Evolving Data Mining, On-Line Data Mining, Relational Data Mining, and Transfer Data Mining. Some of these recently emerged problems are more complex than traditional ones and are usually formulated as nonconvex problems. Consequently, general optimization methods such as gradient descent, coordinate descent, and convex relaxation have returned to the stage and become increasingly popular in recent years. On another front of mathematical programming, in the 1970s A. Charnes and W. W. Cooper initiated Data Envelopment Analysis, in which fractional programming is used to evaluate decision making units, represented by economic data in a given training dataset. From the 1980s to the 1990s, F. Glover proposed a number of linear programming models to solve discriminant problems with small sample sizes. Since 1998, multiple criteria linear programming (MCLP) and multiple criteria quadratic programming (MCQP) have also been extended to classification. All of these methods differ from statistics, decision tree induction, and neural networks. To date, more than 200 scholars around the world have been actively working on the use of optimization techniques to handle data mining problems.
This workshop intends to promote research interest in the connection between optimization and data mining, as well as in real-life applications, among the growing data mining communities. It invites researchers in these interface fields to submit papers and participate in the conference. The workshop welcomes both high-quality academic (theoretical or empirical) papers and practical papers across the broad range of optimization and data mining related topics, including, but not limited to, the following:
Convex optimization for data mining problems
Multiple criteria and constraint programming for data mining problems
Nonconvex optimization (Gradient Descent, DC Programming…)
Linear and Nonlinear Programming based methods
Matrix/Tensor based methods (PCA, SVD, NMF, Parafac, Tucker…)
Large margin methods (SVM, Maximum Margin Clustering…)
Randomized algorithms (Random Projection, Random Sampling…)
Sparse algorithms (Lasso, Elastic Net, Structural Sparsity…)
Regularization techniques (L2 norm, Lp,q norm, Nuclear Norm…)
Combinatorial optimization
Large scale numerical optimization
Stochastic optimization
Graph analysis
Theoretical advances
Conference date: December 12, 2016, Barcelona, Spain
Registration deadline: November 18
The 8th Workshop on Optimization Based Techniques for Emerging Data Mining