{"@context":"http://iiif.io/api/presentation/2/context.json","@id":"https://repo.library.stonybrook.edu/cantaloupe/iiif/2/manifest.json","@type":"sc:Manifest","label":"Sequential Monte Carlo Methods for Inference and Prediction of Latent Time-series","metadata":[{"label":"dc.description.sponsorship","value":"This work is sponsored by the Stony Brook University Graduate School in compliance with the requirements for completion of degree."},{"label":"dc.format","value":"Monograph"},{"label":"dc.format.medium","value":"Electronic Resource"},{"label":"dc.identifier.uri","value":"http://hdl.handle.net/11401/77449"},{"label":"dc.language.iso","value":"en_US"},{"label":"dc.publisher","value":"The Graduate School, Stony Brook University: Stony Brook, NY."},{"label":"dcterms.abstract","value":"In the era of information-sensing mobile devices, the Internet-of-Things and Big Data, research on advanced methods for extracting information from data has become extremely critical. One important task in this area of work is the analysis of time-varying phenomena, observed sequentially in time. This endeavor is relevant in many applications, where the goal is to infer the dynamics of events of interest described by the data, as soon as new data-samples are acquired. This dissertation is on novel methods for sequential inference and prediction of latent time-series. We assume that a sequence of observations is a function of a hidden process of interest and the goal is to estimate the latent process from the observed data. We consider flexible models that capture the dynamics of real phenomena and can deal with many practical burdens. The embraced methodology is based on Bayesian theory and Monte Carlo algorithms. The former provides a consistent framework for the analysis of latent random processes. The latter allows for overcoming the inherent difficulties of nonlinear and non-Gaussian models. The goal is to propose methods that can extract the hidden dynamics from observed data in the most generic and challenging scenarios. We start by investigating short-memory processes, that is, time-series where most of the relevant information is contained only within the most recent past. In particular, we study latent ARMA processes with independent Gaussian innovations. We first assume that the parameters are known and then, we relax the assumptions until they are all unknown. The analysis of latent time-series is extended to processes with different memory characteristics, including those with long-memory features. On the one hand, we investigate latent processes that show self-similarity properties and are correlated in time, such as the fractional Gaussian process. On the other, we study extensions of the ARMA(p,q) model, by considering fractional differencing, which leads to FARIMA processes. We further generalize our work to allow for broad memory properties, by relaxing previous modeling and parameterization assumptions. We resort to wide-sense stationary time-series in general. Within this new framework, all the previously considered models are covered. As a result, a generic sequential Monte Carlo (SMC) method for inference of Gaussian wide-sense stationary latent time-series is proposed. We broaden our work by investigating a hierarchical model where correlation amongst multiple time-series is accommodated. Finally, we focus on model uncertainty; that is, we consider that one may not know the specific form of the underlying dynamics. 
For this problem, we investigate both Bayesian model selection and averaging-based solutions. The outcome is a dynamically adjustable SMC method with improved accuracy. The contribution of this dissertation lies both in the theoretical Bayesian analysis of the models and in its application to computational algorithms for challenging problems of practical interest. Short- and long-memory processes are examined, and the modeling assumptions are relaxed as much as possible for added flexibility and applicability. The performance of the proposed SMC methods is thoroughly evaluated via simulations of the most challenging scenarios, and illustrative examples of where they can be applied are provided."},{"label":"dcterms.available","value":"2017-09-20T16:52:43Z"},{"label":"dcterms.contributor","value":"Zhao, Yue"},{"label":"dcterms.creator","value":"Urteaga, Inigo"},{"label":"dcterms.dateAccepted","value":"2017-09-20T16:52:43Z"},{"label":"dcterms.dateSubmitted","value":"2017-09-20T16:52:43Z"},{"label":"dcterms.description","value":"Department of Electrical Engineering"},{"label":"dcterms.extent","value":"267 pg."},{"label":"dcterms.format","value":"Monograph"},{"label":"dcterms.identifier","value":"http://hdl.handle.net/11401/77449"},{"label":"dcterms.issued","value":"2016-12-01"},{"label":"dcterms.language","value":"en_US"},{"label":"dcterms.provenance","value":"Made available in DSpace on 2017-09-20T16:52:43Z (GMT). No. of bitstreams: 1\nUrteaga_grad.sunysb_0771E_13023.pdf: 3288993 bytes, checksum: 17a6b3b165e3d99e4f5103c8628bb827 (MD5)\n Previous issue date: 1"},{"label":"dcterms.publisher","value":"The Graduate School, Stony Brook University: Stony Brook, NY."},{"label":"dcterms.subject","value":"Bayes Theory, Estimation, Prediction, Sequential Monte Carlo, State-space model, Time-series"},{"label":"dcterms.title","value":"Sequential Monte Carlo Methods for Inference and Prediction of Latent Time-series"},{"label":"dcterms.type","value":"Dissertation"},{"label":"dc.type","value":"Dissertation"}],"description":"This manifest was generated dynamically","viewingDirection":"left-to-right","sequences":[{"@type":"sc:Sequence","canvases":[{"@id":"https://repo.library.stonybrook.edu/cantaloupe/iiif/2/canvas/page-1.json","@type":"sc:Canvas","label":"Page 1","height":1650,"width":1275,"images":[{"@type":"oa:Annotation","motivation":"sc:painting","resource":{"@id":"https://repo.library.stonybrook.edu/cantaloupe/iiif/2/14%2F33%2F08%2F143308456344616758342013871827528107611/full/full/0/default.jpg","@type":"dctypes:Image","format":"image/jpeg","height":1650,"width":1275,"service":{"@context":"http://iiif.io/api/image/2/context.json","@id":"https://repo.library.stonybrook.edu/cantaloupe/iiif/2/14%2F33%2F08%2F143308456344616758342013871827528107611","profile":"http://iiif.io/api/image/2/level2.json"}},"on":"https://repo.library.stonybrook.edu/cantaloupe/iiif/2/canvas/page-1.json"}]}]}]}