Publication detail

Distributed Visual Sensor Network Fusion

CHMELAŘ, P.; ZENDULKA, J. Distributed Visual Sensor Network Fusion. 4th Joint Workshop on Multimodal Interaction and Related Machine Learning Algorithms. Brno: 2007. pp. 2.
Title in English
Distributed Visual Sensor Network Fusion
Type
abstract
Language
Czech
Autoři
Chmelař Petr, Ing.
Zendulka Jaroslav, doc. Ing., CSc. (UIFS)
Keywords

Visual sensor, distributed network, metadata management, moving objects,
spatio-temporal data, Kalman filter, sensor fusion, object tracking, large area
surveillance system.

Abstract

The poster presents a framework for a distributed visual sensor network metadata
management system. It is assumed that data coming from many cameras is annotated
by computer vision modules to produce metadata representing moving objects and
their states. The data is expected to be noisy and uncertain, and some states may
be missing. First, the poster describes spatio-temporal data cleaning using a
Kalman filter. Second, it addresses the fusion of many visual sensors and
persistent object tracking within a large area. Third, it describes the data and
architecture model.
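The spatio-temporal cleaning step mentioned in the abstract can be sketched with a standard constant-velocity Kalman filter over 2-D object positions. The function name `kalman_track`, the state layout, and the noise parameters below are illustrative assumptions, not the poster's actual implementation; missing states (a point the abstract raises) are bridged by prediction alone.

```python
import numpy as np

def kalman_track(observations, dt=1.0, q=1e-2, r=1.0):
    """Clean a noisy 2-D trajectory with a constant-velocity Kalman filter.

    observations: iterable of (x, y) tuples; None marks a missing state,
    which is bridged by the prediction step alone.
    Returns a list of filtered (x, y) positions, one per input step.
    """
    # State vector: [x, y, vx, vy]; constant-velocity transition model.
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    # We observe position only, not velocity.
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    Q = q * np.eye(4)        # process (model) noise covariance
    R = r * np.eye(2)        # measurement noise covariance
    x = np.zeros(4)          # initial state
    P = 1e3 * np.eye(4)      # large initial uncertainty
    out = []
    for z in observations:
        # Predict the next state from the motion model.
        x = F @ x
        P = F @ P @ F.T + Q
        if z is not None:
            # Correct the prediction with the measurement.
            z = np.asarray(z, dtype=float)
            S = H @ P @ H.T + R              # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
            x = x + K @ (z - H @ x)
            P = (np.eye(4) - K @ H) @ P
        out.append((x[0], x[1]))
    return out
```

For example, `kalman_track([(0, 0), (1.1, 0.9), None, (3.2, 2.8)])` returns four smoothed positions, with the third produced purely from the motion model across the gap.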

Year
2007
Pages
2
Book
4th Joint Workshop on Multimodal Interaction and Related Machine Learning Algorithms
Conference
Machine Learning and Multimodal Interaction, Brno, CZ
Place
Brno
BibTeX
@misc{BUT192634,
  author="Petr {Chmelař} and Jaroslav {Zendulka}",
  title="Distributed Visual Sensor Network Fusion",
  booktitle="4th Joint Workshop on Multimodal Interaction and Related Machine Learning Algorithms",
  year="2007",
  pages="2",
  address="Brno",
  note="abstract"
}