Modeling and Analysis of Neural Data (2006 - Present)
My long-term research on neural modeling is an interdisciplinary study in which I have been developing mathematical and statistical models to understand how the brain codes (or represents) information and how neural signals can be decoded to reconstruct internal states or external behaviors. My work in this field has focused on classical methods in dynamical systems such as Kalman filters, hidden Markov models, and point processes. Recently, I have begun to investigate neural coding in the function space of neural spike trains and have proposed a data-driven framework that treats each spike train as a point on an infinite-dimensional "spike manifold". In this framework we aim to construct new tools for: 1) quantifying differences between spike trains using a "Euclidean distance", 2) computing, based on this distance, summary statistics such as the mean and covariance of spike trains, and 3) performing statistical inference such as confidence intervals, hypothesis tests, regressions, and PCA in the spike train space. This new set of statistical tools is expected to provide an alternative to the commonly used model-based methods (i.e., rate models and temporal models) in the community. Representative Publications:
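As a rough illustration of the distance-based toolkit described above, the following is a minimal sketch, not the published metric: each spike train is mapped to a smoothed function on a common time grid (the Gaussian-kernel smoothing, the bandwidth, and all function names here are illustrative assumptions), and the "Euclidean distance" and sample mean are then computed in that function space.

```python
import numpy as np

def smooth_spike_train(spike_times, grid, bandwidth=0.02):
    """Map a spike train (event times, in seconds) to a smoothed function on a
    common grid via Gaussian kernel smoothing (an illustrative choice only)."""
    f = np.zeros_like(grid)
    for t in spike_times:
        f += np.exp(-0.5 * ((grid - t) / bandwidth) ** 2)
    return f / (bandwidth * np.sqrt(2 * np.pi))

def spike_distance(s1, s2, grid, bandwidth=0.02):
    """'Euclidean' (L2) distance between two spike trains after smoothing."""
    f1 = smooth_spike_train(s1, grid, bandwidth)
    f2 = smooth_spike_train(s2, grid, bandwidth)
    return np.sqrt(np.trapz((f1 - f2) ** 2, grid))

def spike_mean(trains, grid, bandwidth=0.02):
    """Cross-sectional mean of a sample of spike trains in the smoothed space."""
    fs = np.array([smooth_spike_train(s, grid, bandwidth) for s in trains])
    return fs.mean(axis=0)

# Toy usage: three spike trains observed on [0, 1] seconds.
grid = np.linspace(0.0, 1.0, 1000)
trains = [np.array([0.10, 0.35, 0.70]),
          np.array([0.12, 0.40, 0.72, 0.90]),
          np.array([0.08, 0.33, 0.69])]
print(spike_distance(trains[0], trains[1], grid))
mean_curve = spike_mean(trains, grid)
```

Once a distance and a mean are available in this space, covariance, PCA, and hypothesis tests can in principle be built on top of them, which is the spirit of the framework.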
My work in functional data analysis has been a collaboration with Anuj Srivastava. We aim to develop a comprehensive framework for the joint registration and analysis of functional data. The term functional data here covers both real-valued functions and curves in higher-dimensional Euclidean spaces. We also carry out theoretical investigations of consistency and template estimation. Registration is mathematically characterized through the notion of time warping, which is well addressed with nonlinear methods on manifolds. In a recent study, I became interested in constructing a linear system for warping via an appropriate isometric isomorphism. Various statistical analysis and inference methods, such as hypothesis testing, principal component analysis, and regression, can then be naturally carried out in this new system. Representative Publications:
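To give a flavor of registration by time warping, here is a minimal sketch under simplifying assumptions: functions are represented by the square-root velocity function (SRVF), and alignment is restricted to a one-parameter family of power warpings on [0, 1]. The restriction to power warpings and the function names are my own illustrative choices; the full framework optimizes over all boundary-preserving diffeomorphisms, typically by dynamic programming.

```python
import numpy as np

def srvf(f, t):
    """Square-root velocity function q = sign(f') * sqrt(|f'|) of f on grid t."""
    df = np.gradient(f, t)
    return np.sign(df) * np.sqrt(np.abs(df))

def warp(f, t, gamma_t):
    """Evaluate the time-warped function f(gamma(t)) by interpolation."""
    return np.interp(gamma_t, t, f)

def elastic_distance(f1, f2, t, exponents=np.linspace(0.5, 2.0, 31)):
    """Toy 'elastic' distance: L2 distance between SRVFs, minimized over the
    power warpings gamma_a(t) = t**a on [0, 1] (an illustrative subfamily)."""
    q1 = srvf(f1, t)
    best = np.inf
    for a in exponents:
        gamma_t = t ** a
        q2 = srvf(warp(f2, t, gamma_t), t)
        best = min(best, np.sqrt(np.trapz((q1 - q2) ** 2, t)))
    return best

# Toy usage: two bumps that differ mainly by a time warp.
t = np.linspace(0.0, 1.0, 500)
f1 = np.exp(-0.5 * ((t - 0.4) / 0.05) ** 2)
f2 = np.exp(-0.5 * ((t - 0.6) / 0.05) ** 2)
print(elastic_distance(f1, f2, t))
```

The SRVF representation is what makes warping act by isometries, which is also the starting point for mapping the warping functions into a linear system where standard inference tools apply.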
Statistical depth has been a useful tool for measuring the center-outward rank of multivariate and functional data. To adapt this notion to temporal point process observations, we need to address two types of randomness: 1) the number of events, and 2) the distribution of these events conditioned on that number. We have proposed to define the depth as a weighted product of a marginal term and a conditional term, where the conditional depth can be specified using distributions such as the Gaussian, Dirichlet, or other parametric forms. Alternatively, we have provided a unified nonparametric framework for defining depth based on function smoothing. We have recently extended these depth methods to multi-dimensional spatial point processes. Representative Publications:
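The two-term construction can be sketched as follows under assumptions I am adding for illustration: the marginal term is a normalized Poisson probability of the event count, and the conditional term is a Mahalanobis-type depth of the normalized event times relative to uniform order statistics (whose mean and covariance are known in closed form). The specific distributions, weighting, and function names are hypothetical choices, not the definitions used in the published work.

```python
import numpy as np
from scipy.stats import poisson

def count_depth(n, rate, T):
    """Marginal term: Poisson pmf of the event count, normalized by its modal value."""
    lam = rate * T
    mode = int(np.floor(lam))
    return poisson.pmf(n, lam) / poisson.pmf(mode, lam)

def conditional_depth(times, T):
    """Conditional term: Mahalanobis-type depth of the normalized event times,
    compared with uniform order statistics on [0, 1] (the conditional law of a
    homogeneous Poisson process given its count)."""
    n = len(times)
    if n == 0:
        return 1.0
    u = np.sort(np.asarray(times)) / T
    i = np.arange(1, n + 1)
    mu = i / (n + 1.0)
    # Cov(U_(i), U_(j)) = min(i, j) * (n + 1 - max(i, j)) / ((n + 1)^2 (n + 2))
    I, J = np.meshgrid(i, i, indexing="ij")
    cov = np.minimum(I, J) * (n + 1 - np.maximum(I, J)) / ((n + 1.0) ** 2 * (n + 2.0))
    m2 = (u - mu) @ np.linalg.solve(cov, u - mu)
    return 1.0 / (1.0 + m2)

def point_process_depth(times, rate, T, w=0.5):
    """Overall depth: weighted product of the marginal and conditional terms."""
    return count_depth(len(times), rate, T) ** w * conditional_depth(times, T) ** (1 - w)

# Toy usage: rank three realizations on [0, 5] under an assumed rate of 2 events/s.
for spikes in ([1.0, 2.2, 3.1, 4.4], [0.1, 0.2, 0.3], list(np.linspace(0.4, 4.6, 10))):
    print(len(spikes), point_process_depth(np.array(spikes), rate=2.0, T=5.0))
```

A realization with a typical count and evenly spread events receives a depth near the center of the sample, while a realization with too few events or strongly clustered times is ranked toward the outside.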
I have collaborated with Frank Johnson, Richard Bertram, and Rick Hyson in the FSU Neuroscience Program on birdsong analysis. We investigate the functional integration of three distinct brain pathways (auditory, pre-motor, and striatal) during juvenile learning and adult recitation of songbird vocal patterns. My work focuses on statistical methods for analyzing the experimental data and on developing models to characterize song production. Representative Publications: