{"@context":"http://iiif.io/api/presentation/2/context.json","@id":"https://repo.library.stonybrook.edu/cantaloupe/iiif/2/manifest.json","@type":"sc:Manifest","label":"Log-linear Model Based Tree and Latent Variable Model for Count Data","metadata":[{"label":"dc.description.sponsorship","value":"This work is sponsored by the Stony Brook University Graduate School in compliance with the requirements for completion of degree."},{"label":"dc.format","value":"Monograph"},{"label":"dc.format.medium","value":"Electronic Resource"},{"label":"dc.identifier.uri","value":"http://hdl.handle.net/11401/77432"},{"label":"dc.language.iso","value":"en_US"},{"label":"dc.publisher","value":"The Graduate School, Stony Brook University: Stony Brook, NY."},{"label":"dcterms.abstract","value":"Events that occur randomly over time or space result in count data. Poisson models are widely used for analyses. However, simple log-linear forms are often insufficient for complex relationship between variables. Thus we study tree-structured log-linear models and latent variables models for count data. First, we consider extending Poisson regression for independent observations. Decision trees exhibit the advantage of interpretation. Constant fits are too simple to interpret within strata nonetheless. We hence propose to embed log-linear models to decision trees, and use negative binomial distribution for overdispersion. Second, we consider latent variable models for point process observation in neuroscience. Neurons fire sequences of electrical spikes as signals which can naturally be treated as point processes disregarding the analog difference. Large scale neural recordings have shown evidences of low-dimensional nonlinear dynamics which describe the neural computations implemented by a large neuronal network. 
Sufficient redundancy in population activity would give us access to the underlying neural process of interest while observing only a small subset of neurons, helping us understand how neural systems work. Thus we propose a probabilistic method that recovers the latent trajectories nonparametrically under a log-linear generative model with minimal assumptions. Third, we aim to model the continuous dynamics to further understand neural computation. Theories of neural computation are characterized by dynamic features such as fixed points and continuous attractors. However, reconstructing the corresponding low-dimensional dynamical system from neural time series is usually difficult. Typical linear dynamical system and autoregressive models are either too simple to reflect complex features or prone to extrapolating wildly. Thus we propose a flexible nonlinear time series model that directly learns the velocity field associated with the dynamics in the state space and produces reliable future predictions for a variety of dynamical models and for real neural data."},{"label":"dcterms.available","value":"2017-09-20T16:52:41Z"},{"label":"dcterms.contributor","value":"Ahn, Hongshik"},{"label":"dcterms.creator","value":"Zhao, Yuan"},{"label":"dcterms.dateAccepted","value":"2017-09-20T16:52:41Z"},{"label":"dcterms.dateSubmitted","value":"2017-09-20T16:52:41Z"},{"label":"dcterms.description","value":"Department of Applied Mathematics and Statistics"},{"label":"dcterms.extent","value":"113 pg."},{"label":"dcterms.format","value":"Monograph"},{"label":"dcterms.identifier","value":"http://hdl.handle.net/11401/77432"},{"label":"dcterms.issued","value":"2016-12-01"},{"label":"dcterms.language","value":"en_US"},{"label":"dcterms.provenance","value":"Made available in DSpace on 2017-09-20T16:52:41Z (GMT). No. 
of bitstreams: 1\nZhao_grad.sunysb_0771E_13014.pdf: 5503934 bytes, checksum: 8b167de7c6cc8d7b59d2edf4d6397168 (MD5)\n Previous issue date: 1"},{"label":"dcterms.publisher","value":"The Graduate School, Stony Brook University: Stony Brook, NY."},{"label":"dcterms.subject","value":"Count, Decision tree, Dynamics, Log-linear model, Variational Bayes"},{"label":"dcterms.title","value":"Log-linear Model Based Tree and Latent Variable Model for Count Data"},{"label":"dcterms.type","value":"Dissertation"},{"label":"dc.type","value":"Dissertation"}],"description":"This manifest was generated dynamically","viewingDirection":"left-to-right","sequences":[{"@type":"sc:Sequence","canvases":[{"@id":"https://repo.library.stonybrook.edu/cantaloupe/iiif/2/canvas/page-1.json","@type":"sc:Canvas","label":"Page 1","height":1650,"width":1275,"images":[{"@type":"oa:Annotation","motivation":"sc:painting","resource":{"@id":"https://repo.library.stonybrook.edu/cantaloupe/iiif/2/12%2F99%2F38%2F129938688796204531563041426686172408066/full/full/0/default.jpg","@type":"dctypes:Image","format":"image/jpeg","height":1650,"width":1275,"service":{"@context":"http://iiif.io/api/image/2/context.json","@id":"https://repo.library.stonybrook.edu/cantaloupe/iiif/2/12%2F99%2F38%2F129938688796204531563041426686172408066","profile":"http://iiif.io/api/image/2/level2.json"}},"on":"https://repo.library.stonybrook.edu/cantaloupe/iiif/2/canvas/page-1.json"}]}]}]}