{"@context":"http://iiif.io/api/presentation/2/context.json","@id":"https://repo.library.stonybrook.edu/cantaloupe/iiif/2/manifest.json","@type":"sc:Manifest","label":"Collaborative and Heterogeneous Signal Processing Methodology for Mobile Sensor Based Applications","metadata":[{"label":"dc.description.sponsorship","value":"This work is sponsored by the Stony Brook University Graduate School in compliance with the requirements for completion of degree."},{"label":"dc.format","value":"Monograph"},{"label":"dc.format.medium","value":"Electronic Resource"},{"label":"dc.identifier.uri","value":"http://hdl.handle.net/1951/55391"},{"label":"dc.language.iso","value":"en_US"},{"label":"dc.publisher","value":"The Graduate School, Stony Brook University: Stony Brook, NY."},{"label":"dcterms.abstract","value":"Multiple object tracking and association are key capabilities in mobile sensor based applications (i.e., a large scale flexible surveillance system and multiple robots application system). Such systems track and identify multiple objects autonomously and intelligently without human operators. They also flexibly control deployed sensors to maximize resource utilization as well as system performance. Moreover, methodologies for the tracking and association should be robust against non-ideal phenomena such as false or failed data processing. In this thesis, we address various issues and present approaches to resolve them in collaborative and heterogeneous single processing for the applications.Multiple object association (finding the correspondence of objects among cameras) is an important capability in multiple cameras environment. We introduce a locally initiating line-based object association to support flexible camera movements. The method can be extended to support multiple cameras through pair-wise collaboration for the object association. While the pair-wise collaboration is effective for objects with the enough separation, the association is not well-established for objects without the enough separation and it may generate the false association. We extend the locally initiating homographic lines based association method to two different multiple camera collaboration strategies that reduce the false association. Collaboration matrices are defined with the required minimum separation for an effective collaboration. The first strategy uses the collaboration matrices to select the best pair out of many cameras having the maximum separation to efficiently collaborate on the object association. The association information in selected cameras is propagated to unselected cameras by the global information constructed from the associated targets. While the first strategy requires the long operation time to achieve the high association rate due to the limited view by the best pair, it reduces the computational cost using homographic lines. The second strategy initiates the collaboration process of objects association for all the pairing cases of cameras regardless of the separation. While the repetitive association processes improve the association performance, the transformation processes of homographic lines increase exponentially.Identification of tracked objects is achieved by using two different signals. The RFID tag is used for object identification and a visual sensor is used for estimating object movements. Visual sensors find the correspondence among cameras and localize them. 
The association of tracked positions with identifications utilizes the dynamics of objects crossing the modeled boundary of the identification sensors. The proposed association method provides association recovery against tracking and association failures. We also consider coverage uncertainty induced by identification-signal characteristics or by multiple objects near the boundary of the identification sensor coverage. Group and incomplete-group associations are introduced to resolve identification problems under coverage uncertainty. The simulation results demonstrate the stability of the proposed method against non-ideal phenomena such as false detection, false tracking, and an inaccurate coverage model. Finally, a novel self-localization method is presented to support mobile sensors. The algorithm estimates the coordinates and the orientation of a mobile sensor using projected references on a visual image. The proposed method considers the lens non-linearity of the camera and compensates for the distortion by using a calibration table. The algorithm can be utilized in mobile robot navigation as well as in positioning applications where accurate self-localization is necessary."},{"label":"dcterms.available","value":"2012-05-15T18:02:43Z"},{"label":"dcterms.contributor","value":"Hong, Sangjin"},{"label":"dcterms.creator","value":"Cho, Shung Han"},{"label":"dcterms.dateAccepted","value":"2012-05-15T18:02:43Z"},{"label":"dcterms.dateSubmitted","value":"2012-05-15T18:02:43Z"},{"label":"dcterms.description","value":"Department of Computer Engineering"},{"label":"dcterms.format","value":"Monograph"},{"label":"dcterms.identifier","value":"Cho_grad.sunysb_0771E_10195.pdf"},{"label":"dcterms.issued","value":"2010-08-01"},{"label":"dcterms.language","value":"en_US"},{"label":"dcterms.provenance","value":"Made available in DSpace on 2012-05-15T18:02:43Z (GMT). No. of bitstreams: 1\nCho_grad.sunysb_0771E_10195.pdf: 7284456 bytes, checksum: a045c05537ffa9ecfe90cfd34e3071b0 (MD5)\n Previous issue date: 1"},{"label":"dcterms.publisher","value":"The Graduate School, Stony Brook University: Stony Brook, NY."},{"label":"dcterms.subject","value":"Heterogeneous sensor network, Multiple camera collaboration, Multiple object association, Multiple object identification, Multiple object tracking, Self-localization"},{"label":"dcterms.title","value":"Collaborative and Heterogeneous Signal Processing Methodology for Mobile Sensor Based Applications"},{"label":"dcterms.type","value":"Dissertation"},{"label":"dc.type","value":"Dissertation"}],"description":"This manifest was generated dynamically","viewingDirection":"left-to-right","sequences":[{"@type":"sc:Sequence","canvases":[{"@id":"https://repo.library.stonybrook.edu/cantaloupe/iiif/2/canvas/page-1.json","@type":"sc:Canvas","label":"Page 1","height":1650,"width":1275,"images":[{"@type":"oa:Annotation","motivation":"sc:painting","resource":{"@id":"https://repo.library.stonybrook.edu/cantaloupe/iiif/2/16%2F48%2F77%2F164877359940649325652644127742790986649/full/full/0/default.jpg","@type":"dctypes:Image","format":"image/jpeg","height":1650,"width":1275,"service":{"@context":"http://iiif.io/api/image/2/context.json","@id":"https://repo.library.stonybrook.edu/cantaloupe/iiif/2/16%2F48%2F77%2F164877359940649325652644127742790986649","profile":"http://iiif.io/api/image/2/level2.json"}},"on":"https://repo.library.stonybrook.edu/cantaloupe/iiif/2/canvas/page-1.json"}]}]}]}