Ceolin, Simone Regina (2012) Facial Shape Space using Statistical Models from Surface Normal. PhD thesis, University of York.
Available under License Creative Commons Attribution-Noncommercial-No Derivative Works 2.0 UK: England & Wales.
The analysis of shape variations due to changes in facial expression and gender is a key problem in face recognition. In this thesis, we use statistical shape analysis to construct shape-spaces that span facial expression and gender, and use the resulting shape-model to perform face recognition under varying expression and gender. Our novel contribution is to show how to construct shape-spaces over fields of surface normals rather than over Cartesian landmark points. In this model, facial needle-maps (fields of surface normals) are points on a high-dimensional manifold referred to as a shape-space. We begin by using distance measures to gauge the similarity between faces and gender differences, considering a number of alternatives including the geodesic, Euclidean and cosine distances between points on the manifold. For recognition, we compare the performance of the Euclidean, cosine and geodesic distances associated with the shape manifold, and explore whether these distances can be used to distinguish gender and to recognise the same face under different expressions. We also explore whether the Fisher-Rao metric can be used to characterise the shape changes due to differences in expression and gender. We use a 2.5D representation based on facial surface normals (facial needle-maps) for gender classification and facial expression recognition. The needle-map is a shape representation which can be acquired from 2D intensity images using shape-from-shading (SFS). Using the von Mises-Fisher distribution, we compute the elements of the Fisher information matrix, and use these to compute geodesic distances between fields of surface normals and so construct a shape-space. We embed the fields of facial surface normals into a lower-dimensional pattern space using a number of alternative methods, including multidimensional scaling, heat kernel embedding and commute time embedding. We present results on clustering the embedded faces using the BU-3DFEDB, Max Planck and EAR databases.
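The distance measures named in the abstract can be sketched as follows. This is a minimal illustration under the assumption that a needle-map is stored as an (H, W, 3) NumPy array of unit surface normals; the function name and aggregation choices are illustrative, not the thesis's implementation:

```python
import numpy as np

def needle_map_distances(N1, N2):
    """Compare two needle-maps, each an (H, W, 3) array of unit normals.

    Returns the Euclidean and cosine distances between the long-vector
    representations, and the summed per-pixel geodesic (arc-length)
    distance on the unit sphere.
    """
    # Long-vector representation: flatten each field of normals.
    v1, v2 = N1.reshape(-1), N2.reshape(-1)
    euclidean = np.linalg.norm(v1 - v2)
    cosine = 1.0 - np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    # Per-pixel angle between corresponding unit normals, summed over the map.
    dots = np.clip(np.einsum('ijk,ijk->ij', N1, N2), -1.0, 1.0)
    geodesic = np.arccos(dots).sum()
    return euclidean, cosine, geodesic
```

For identical needle-maps all three distances are zero; tilting every normal by a fixed angle scales the summed geodesic distance linearly with that angle, which is what makes it a natural measure on the sphere of directions.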
The thesis is divided into five chapters. Chapter 1 gives a brief introduction and an outline of the thesis. Chapter 2 provides a literature review of aspects related to this topic. More specifically, it starts by describing the relevant literature on facial expression recognition and gender classification in computer vision research, together with approaches to facial shape recovery. We then give a detailed review of the methods developed for facial expression and gender recognition. Since we work with facial shape recovery, we also review relevant techniques, including the popular shape-from-shading methods. Chapter 3 presents our first attempt to perform facial expression recognition using facial needle-maps recovered using shape-from-shading. We explore how different distance measures can be used to capture different facets of facial shape, including gender and expression, computing the geodesic, Euclidean, cosine and Mahalanobis distances for the long-vector representation of the facial needle-maps. In Chapter 4 we explore how the Fisher-Rao metric can be used to measure different facets of facial shape estimated from fields of surface normals using the von Mises-Fisher (vMF) distribution. In particular, we aim to characterise the shape changes due to differences in gender and in facial expression. We make use of the vMF distribution since we are dealing with surface normal data distributed over the sphere. The thesis concludes with Chapter 5, which summarises the contributions of the research work and discusses overall conclusions as well as possible future work.
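The vMF distribution used in Chapter 4 models unit vectors clustered around a mean direction with a concentration parameter. As a rough illustration only (the `vmf_fit` name and the Banerjee-style closed-form kappa approximation are assumptions, not the thesis's method), fitting a vMF to a sample of surface normals can be sketched as:

```python
import numpy as np

def vmf_fit(X):
    """Fit a von Mises-Fisher distribution to unit vectors X of shape (n, d).

    Returns the maximum-likelihood mean direction mu and an approximate
    concentration parameter kappa (Banerjee et al.'s approximation).
    """
    s = X.sum(axis=0)                       # resultant vector of the sample
    R = np.linalg.norm(s) / X.shape[0]      # mean resultant length in [0, 1]
    mu = s / np.linalg.norm(s)              # mean direction (unit vector)
    d = X.shape[1]
    kappa = R * (d - R ** 2) / (1.0 - R ** 2)
    return mu, kappa
```

The tighter the normals cluster around a common direction, the closer R is to 1 and the larger kappa becomes, which is why kappa plays the role of an inverse-variance-like concentration for directional data.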
|Item Type:||Thesis (PhD)|
|Academic Units:||The University of York > Computer Science (York)|
|Depositing User:||Mrs Simone Regina Ceolin|
|Date Deposited:||06 Feb 2012 12:08|
|Last Modified:||08 Aug 2013 08:48|