Please use this identifier to cite or link to this item:
http://hdl.handle.net/2031/6096

Title:  Learning from data : Hermite scheme and normal estimation on Riemannian manifolds 
Other Titles:  Ji yu shu ju de xue xi : Aiermite suan fa yu Liman liu xing shang de fa xiang liang gu ji 基於數據的學習 : 埃爾米特算法與黎曼流形上的法向量估計 
Authors:  Shi, Lei (石磊) 
Department:  Department of Mathematics 
Degree:  Doctor of Philosophy 
Issue Date:  2010 
Publisher:  City University of Hong Kong 
Subjects:  Hermite polynomials. Riemannian manifolds. 
Notes:  CityU Call Number: QA404.5 .S44 2010. vii, 126 leaves ; 30 cm. Thesis (Ph.D.)--City University of Hong Kong, 2010. Includes bibliographical references (leaves [117]-126)
Type:  thesis 
Abstract:  In this thesis, we investigate some algorithms in learning theory for the purposes of regression,
manifold learning and data analysis. Their design and asymptotic performance
are discussed in detail from the viewpoint of approximation theory.
In the first part, the problem of learning from data involving function values and
gradients is studied in a framework of least-square regularized regression in reproducing
kernel Hilbert spaces. The algorithm is implemented by a linear system whose
coefficient matrix involves block matrices for generating graph Laplacians and
Hessians. The additional data for function gradients improve the learning performance of
the algorithm. Error analysis is carried out by means of sampling operators for the sample
error and integral operators in Sobolev spaces for the approximation error.
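The block structure of such a system can be illustrated with a minimal one-dimensional sketch: the model matches both function values and derivative data, and fitting reduces to one block linear solve. The Gaussian kernel, bandwidth `sigma` and regularization parameter `lam` below are illustrative assumptions, not the thesis's exact scheme.

```python
import numpy as np

def gaussian_blocks(t, x, sigma):
    """Kernel block K, kernel-derivative block D, and mixed second-derivative
    block H for the Gaussian kernel k(t, z) = exp(-(t - z)^2 / (2 sigma^2))."""
    diff = t[:, None] - x[None, :]
    K = np.exp(-diff**2 / (2 * sigma**2))
    D = (diff / sigma**2) * K                       # d/dz k(t, z) at z = x_j
    H = (1 / sigma**2 - diff**2 / sigma**4) * K     # d/dt d/dz k(t, z)
    return K, D, H

def hermite_fit(x, y, dy, sigma=0.5, lam=1e-8):
    """Fit f(t) = sum_j c_j k(t, x_j) + d_j (d/dz k)(t, x_j) so that both
    values y and derivatives dy are matched, via one block linear system."""
    K, D, H = gaussian_blocks(x, x, sigma)
    # Rows: predicted values [K, D]; predicted derivatives [-D, H]
    # (note d/dt k = -d/dz k for this radial kernel).
    A = np.block([[K, D], [-D, H]])
    b = np.concatenate([y, dy])
    coef = np.linalg.solve(A + lam * np.eye(A.shape[0]), b)
    n = len(x)
    return coef[:n], coef[n:]

def hermite_predict(t, x, c, d, sigma=0.5):
    """Evaluate the fitted function at new points t."""
    K, D, _ = gaussian_blocks(t, x, sigma)
    return K @ c + D @ d
```

With exact derivative data, the extra block rows pin down the local slope of the estimator, which is the sense in which gradient samples improve the fit.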
Normal estimation is an important topic in processing point cloud data and in surface
reconstruction in computer graphics. In the second part of the thesis, we consider the
problem of estimating normals for an (unknown) submanifold of a Euclidean space
of codimension 1 from random points on the manifold. We propose a kernel-based
learning algorithm in an unsupervised form of gradient learning. The algorithm can be
implemented by solving a linear algebra problem. Error analysis is conducted under
conditions on the true normals of the manifold and the sampling distribution.
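The reduction to linear algebra can be illustrated by a common kernel-weighted variant (weighted local PCA, a stand-in rather than the thesis's exact algorithm): the normal at a point is estimated as the eigenvector of a kernel-weighted covariance matrix belonging to its smallest eigenvalue. The Gaussian weight and bandwidth `h` are illustrative choices.

```python
import numpy as np

def estimate_normal(points, x0, h=0.3):
    """Estimate the unit normal of a codimension-1 manifold at x0 from
    sample points, via a kernel-weighted covariance (weighted local PCA)."""
    diff = points - x0
    # Gaussian weights concentrate the covariance on neighbors of x0.
    w = np.exp(-np.sum(diff**2, axis=1) / (2 * h**2))
    C = (diff * w[:, None]).T @ diff / w.sum()
    # Eigenvalues in ascending order; tangent directions carry the large
    # eigenvalues, so the smallest-eigenvalue eigenvector is the normal.
    _, vecs = np.linalg.eigh(C)
    n = vecs[:, 0]
    return n / np.linalg.norm(n)
```

For points sampled from the unit circle in the plane, the estimate at (1, 0) recovers the radial direction up to sign, since the tangential spread dominates the weighted covariance.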
In the last part of this thesis, we consider the regression problem by learning with
a regularization scheme in a data-dependent hypothesis space. For a given set of samples,
functions in this hypothesis space are defined to be linear combinations of basis functions generated by a kernel function and the sample data, and are thus entirely determined
by the combination coefficients. The data-dependent nature of the kernel-based hypothesis
space provides flexibility and adaptivity for the learning algorithms. The regularization
scheme is essentially different from the standard one in a reproducing kernel
Hilbert space: the kernel function is not necessarily symmetric or positive semi-definite,
and the regularizer, as a functional acting on functions in such hypothesis
spaces, is taken to be the pth power of the lᵖ norm of the corresponding combination
coefficients. These differences lead to additional difficulty in the error analysis.
To be more specific, we mainly study two cases in this thesis: p = 1 and p = 2.
When p = 1, the l¹ regularizer often leads to sparsity of the solution vector, which can
greatly improve the efficiency of computations. When p = 2, the corresponding algorithm
is linear and is easy to implement by solving a linear system. Both algorithms
have been studied in the literature. In this thesis, we apply concentration techniques
with l²-empirical covering numbers to obtain the best learning rates for these
algorithms. Since our aim is a capacity-dependent analysis, we also show that the
function spaces involved in the error analysis, induced by the non-symmetric kernel
function, have nice behavior in terms of the l²-empirical covering numbers of their unit
balls. 
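The two coefficient-based schemes can be sketched minimally as follows: the p = 2 case is solved by one linear system (the normal equations of the quadratic objective, which remain valid when the kernel matrix is non-symmetric), and the p = 1 case by iterative soft-thresholding (ISTA), one standard way to compute the sparse solution, not necessarily the method analyzed in the thesis. The kernel matrix and parameters in the usage below are illustrative.

```python
import numpy as np

def l2_coef_regression(K, y, lam):
    """p = 2: minimize (1/n)||Kc - y||^2 + lam ||c||_2^2 over the
    coefficient vector c. K need not be symmetric or PSD, so we solve
    the normal equations of this quadratic objective directly."""
    n = K.shape[0]
    return np.linalg.solve(K.T @ K / n + lam * np.eye(n), K.T @ y / n)

def l1_coef_regression(K, y, lam, steps=5000):
    """p = 1: minimize (1/(2n))||Kc - y||^2 + lam ||c||_1 by proximal
    gradient descent; the soft-threshold step zeroes small coefficients,
    which is the source of sparsity."""
    n = K.shape[0]
    c = np.zeros(n)
    L = np.linalg.norm(K, 2) ** 2 / n   # Lipschitz constant of the gradient
    for _ in range(steps):
        grad = K.T @ (K @ c - y) / n
        z = c - grad / L
        c = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return c

# Illustrative non-symmetric kernel matrix: a Gaussian bump rescaled by a
# factor depending only on the second argument, so K[i, j] != K[j, i].
x = np.linspace(0, 1, 30)
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.02) * (1 + x[None, :])
y = np.sin(2 * np.pi * x)
c2 = l2_coef_regression(K, y, 1e-6)   # dense coefficients, one linear solve
c1 = l1_coef_regression(K, y, 1e-3)   # typically sparse coefficients
```

Both fits are entirely determined by the coefficient vector, matching the coefficient-based hypothesis space described above; only the regularizer differs.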
Online Catalog Link:  http://lib.cityu.edu.hk/record=b3947530 
Appears in Collections:  MA - Doctor of Philosophy

Items in CityU IR are protected by copyright, with all rights reserved, unless otherwise indicated.
