'''Sensor fusion''' is the combining of [[sensor]]y data or data derived from sensory data from disparate sources such that the resulting information is in some sense ''better'' than would be possible when these sources were used individually. The term ''better'' in this case can mean more accurate, more complete, or more dependable, or refer to the result of an emerging view, such as [[stereoscopy|stereoscopic]] vision (calculation of depth information by combining two-dimensional images from two cameras at slightly different viewpoints).<ref>{{cite book|last = Elmenreich|first = W.|title = Sensor Fusion in Time-Triggered Systems, PhD Thesis|publisher = Vienna University of Technology|location = Vienna, Austria|year = 2002|page = 173|url=http://www.vmars.tuwien.ac.at/~wilfried/papers/elmenreich_Dissertation_sensorFusionInTimeTriggeredSystems.pdf}}</ref><ref>Haghighat, M. B. A., Aghagolzadeh, A., & Seyedarabi, H. (2011). [http://dx.doi.org/10.1016/j.compeleceng.2011.04.016 Multi-focus image fusion for visual sensor networks in DCT domain]. Computers & Electrical Engineering, 37(5), 789-797.</ref>

The data sources for a fusion process are not required to originate from identical sensors. One can distinguish ''direct fusion'', ''indirect fusion'' and fusion of the outputs of the former two. Direct fusion is the fusion of sensor data from a set of [[homogeneity and heterogeneity|heterogeneous]] or [[wiktionary:Homogeneous|homogeneous]] sensors, [[soft sensor]]s, and [[history value]]s of sensor data, while indirect fusion uses information sources like ''[[A priori and a posteriori|a priori]]'' knowledge about the environment and human input.

Sensor fusion is also known as ''(multi-sensor) [[Data fusion]]'' and is a subset of ''[[Information integration|information fusion]]''.

== Examples of sensors ==
* [[Radar]]
* [[Sonar]] and other acoustic sensors
* Infrared / thermal imaging cameras
* [[Professional video camera|TV camera]]s
* [[Sonobuoy]]s
* [[Seismometer|Seismic sensor]]s
* [[Magnetometer|Magnetic sensor]]s
* Electronic Support Measures (ESM)
* [[Phased array]]s
* [[Microelectromechanical systems|MEMS]]
* [[Accelerometer]]s
* [[Global Positioning System]] (GPS)

== Sensor fusion algorithms ==
Sensor fusion is a term that covers a number of methods and algorithms, including:
* [[Central limit theorem|Central Limit Theorem]]
* [[Kalman filter]]
* [[Bayesian network]]s
* [[Dempster–Shafer theory]]
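
As a brief illustration of the last of these, the sketch below applies Dempster's rule of combination to evidence from two sensors. The frame of discernment, hypothesis names, and mass values are illustrative, not drawn from a specific source.

<syntaxhighlight lang="python">
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Each mass function is a dict mapping frozenset hypotheses to mass.
    """
    combined, conflict = {}, 0.0
    for (s1, w1), (s2, w2) in product(m1.items(), m2.items()):
        intersection = s1 & s2
        if intersection:
            combined[intersection] = combined.get(intersection, 0.0) + w1 * w2
        else:
            conflict += w1 * w2  # mass assigned to contradictory hypotheses
    # Renormalise, discarding the conflicting mass.
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two sensors reporting over the frame {target, clutter}
A, B = frozenset({"target"}), frozenset({"clutter"})
m1 = {A: 0.6, A | B: 0.4}  # sensor 1: fairly confident it sees a target
m2 = {B: 0.5, A | B: 0.5}  # sensor 2: weak evidence for clutter
print(dempster_combine(m1, m2))
</syntaxhighlight>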

== Example sensor fusion calculations ==
Two example sensor fusion calculations are illustrated below.

Let <math>\textbf{x}_1</math> and <math>\textbf{x}_2</math> denote two sensor measurements with noise variances <math>\sigma_1^2</math> and <math>\sigma_2^2</math>, respectively. One way of obtaining a combined measurement <math>\textbf{x}_3</math> is to apply the [[Central limit theorem|Central Limit Theorem]], which is also employed within the Fraser–Potter fixed-interval smoother, namely<ref name="GAE12">{{cite book | author = Einicke, G.A. | year = 2012 | title = Smoothing, Filtering and Prediction: Estimating the Past, Present and Future | publisher = Intech | location = Rijeka, Croatia | isbn = 978-953-307-752-9 | url = http://www.intechopen.com/books/smoothing-filtering-and-prediction-estimating-the-past-present-and-future}}</ref>

: <math>\textbf{x}_3 = \sigma_3^{2} \left( \sigma_1^{-2}\textbf{x}_1 + \sigma_2^{-2}\textbf{x}_2 \right),</math>

where <math>\sigma_3^{2} = \left( \sigma_1^{-2} + \sigma_2^{-2} \right)^{-1}</math> is the variance of the combined estimate. The fused result is thus a linear combination of the two measurements, each weighted by the inverse of its respective noise variance.
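
A minimal Python sketch of this inverse-variance weighting follows; the function name and sample readings are illustrative.

<syntaxhighlight lang="python">
def fuse_measurements(x1, var1, x2, var2):
    """Inverse-variance weighted fusion of two noisy measurements.

    Implements x3 = sigma3^2 * (x1/sigma1^2 + x2/sigma2^2) with
    sigma3^2 = 1 / (1/sigma1^2 + 1/sigma2^2).
    """
    var3 = 1.0 / (1.0 / var1 + 1.0 / var2)  # variance of the combined estimate
    x3 = var3 * (x1 / var1 + x2 / var2)     # inverse-variance weighted mean
    return x3, var3

# Example: a precise sensor (variance 0.1) and a noisy one (variance 1.0)
x3, var3 = fuse_measurements(10.2, 0.1, 11.0, 1.0)
print(x3, var3)  # the estimate sits much closer to the precise sensor's reading
</syntaxhighlight>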

Another method to fuse two measurements is to use the optimal [[Kalman filter]]. Suppose that the data is generated by a first-order system and let <math>\textbf{P}_k</math> denote the solution of the filter's [[Riccati equation]]. By applying [[Cramer's rule]] within the gain calculation it can be found that the filter gain is given by<ref name="GAE12" />

: <math>\textbf{L}_k = \begin{bmatrix} \tfrac{\sigma_2^{2}\textbf{P}_k}{\sigma_2^{2}\textbf{P}_k + \sigma_1^{2}\textbf{P}_k + \sigma_1^{2}\sigma_2^{2}} & \tfrac{\sigma_1^{2}\textbf{P}_k}{\sigma_2^{2}\textbf{P}_k + \sigma_1^{2}\textbf{P}_k + \sigma_1^{2}\sigma_2^{2}} \end{bmatrix}.</math>

By inspection, when the first measurement is noise free, the filter ignores the second measurement, and vice versa. That is, the combined estimate is weighted by the quality of the measurements.
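
The sketch below implements this update for a scalar state observed by two sensors at once; evaluating the gain <math>\textbf{L}_k</math> numerically reproduces the closed form above. The model parameters and readings are illustrative assumptions.

<syntaxhighlight lang="python">
import numpy as np

a, q = 0.95, 0.01                    # state transition, process-noise variance
sigma1, sigma2 = 0.5, 2.0            # measurement noise standard deviations
H = np.array([[1.0], [1.0]])         # both sensors observe the state directly
R = np.diag([sigma1**2, sigma2**2])  # measurement noise covariance

x_hat, P = 0.0, 1.0                  # initial state estimate and covariance
z = np.array([[1.1], [0.7]])         # one pair of simultaneous sensor readings

# Measurement update: L_k = P H^T (H P H^T + R)^{-1}; expanding the 2x2
# inverse by Cramer's rule gives the closed-form gain shown above.
S = P * (H @ H.T) + R                # innovation covariance (2x2)
L = (P * H.T) @ np.linalg.inv(S)     # 1x2 Kalman gain
x_hat = x_hat + (L @ (z - H * x_hat)).item()
P = ((1.0 - L @ H) * P).item()

# Time update for the first-order model x_{k+1} = a x_k + w_k
x_hat, P = a * x_hat, a**2 * P + q
</syntaxhighlight>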

== Centralized versus decentralized ==
In sensor fusion, centralized versus decentralized refers to where the fusion of the data occurs. In centralized fusion, the clients simply forward all of the data to a central location, and some entity at the central location is responsible for correlating and fusing the data. In decentralized fusion, the clients take full responsibility for fusing the data. "In this case, every sensor or platform can be viewed as an intelligent asset having some degree of autonomy in decision-making."<ref>{{cite journal|title=Multi-sensor management for information fusion: issues and approaches|author1=N. Xiong|author2=P. Svensson|journal=Information Fusion|year=2002|volume=3|issue=2|pages=163–186|url=http://www.elsevier.com/locate/inffus}}</ref>

Multiple combinations of centralized and decentralized systems exist.
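
As a sketch of the centralized case, the toy fusion node below collects raw readings forwarded by clients and combines them by the inverse-variance weighting derived earlier. The class and method names are illustrative, not from the cited literature.

<syntaxhighlight lang="python">
from dataclasses import dataclass

@dataclass
class Reading:
    value: float
    variance: float

class CentralFusionNode:
    """Central entity that correlates and fuses forwarded sensor data."""

    def __init__(self):
        self.inbox = []

    def receive(self, reading):
        # In a centralized design, clients forward data and do no fusion.
        self.inbox.append(reading)

    def fuse(self):
        inv_var = sum(1.0 / r.variance for r in self.inbox)
        value = sum(r.value / r.variance for r in self.inbox) / inv_var
        return Reading(value, 1.0 / inv_var)

node = CentralFusionNode()
node.receive(Reading(10.2, 0.1))  # client with a precise sensor
node.receive(Reading(11.0, 1.0))  # client with a noisy sensor
print(node.fuse())
</syntaxhighlight>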

== Levels ==
There are several categories or levels of sensor fusion that are commonly used.<ref>http://defensesystems.com/articles/2009/09/02/c4isr1-sensor-fusion.aspx</ref>
* Level 0 – Data alignment
* Level 1 – Entity assessment (e.g. signal/feature/object)
** Tracking and object detection/recognition/identification
* Level 2 – Situation assessment
* Level 3 – Impact assessment
* Level 4 – Process refinement (i.e. sensor management)
* Level 5 – User refinement

== Applications ==
One application of sensor fusion is [[GPS/INS]], where [[Global Positioning System]] and [[inertial navigation system]] data are fused using various methods, e.g. the [[extended Kalman filter]]. This is useful, for example, in determining the altitude of an aircraft using low-cost sensors.<ref>{{cite journal|last=Gross|first=Jason|author2=Yu Gu|author3=Matthew Rhudy|author4=Srikanth Gururajan|author5=Marcello Napolitano|title=Flight Test Evaluation of Sensor Fusion Algorithms for Altitude Estimation|journal=IEEE Transactions on Aerospace and Electronic Systems|date=July 2012|volume=48|issue=3|pages=2128–2139|url=http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6237583&tag=1|doi=10.1109/TAES.2012.6237583}}</ref>
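
As a simplified stand-in for the extended Kalman filter approach, the complementary-filter sketch below blends dead-reckoned INS altitude (accurate short term, but drifting) with GPS altitude (noisy short term, but drift free). The gains and sample period are illustrative assumptions.

<syntaxhighlight lang="python">
class AltitudeFuser:
    """Complementary filter blending INS dead reckoning with GPS altitude."""

    def __init__(self, dt=0.01, alpha=0.98):
        self.dt = dt        # INS sample period in seconds
        self.alpha = alpha  # weight given to the integrated INS solution
        self.altitude = 0.0
        self.vertical_speed = 0.0

    def step(self, accel_z, gps_altitude):
        # Dead-reckon from the vertical accelerometer (drifts over time).
        self.vertical_speed += accel_z * self.dt
        ins_altitude = self.altitude + self.vertical_speed * self.dt
        # GPS pulls the estimate back, bounding the accumulated drift.
        self.altitude = self.alpha * ins_altitude + (1 - self.alpha) * gps_altitude
        return self.altitude

fuser = AltitudeFuser()
print(fuser.step(accel_z=0.2, gps_altitude=102.0))  # one 100 Hz fusion step
</syntaxhighlight>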

== See also ==
* [[Information integration]]
* [[Data mining]]
* [[Data fusion]]
* [[Image fusion]]
* [[Information#Information is not data|Information: Information is not data]]
* [[Data (computing)]]
* [[Multisensory integration|Multimodal integration]]
* [[Fisher's method]] for combining independent tests of significance
* [[TransducerML|Transducer Markup Language]] (TML), an XML-based markup language which enables sensor fusion
* [[Brooks–Iyengar algorithm]]
* [[Inertial navigation system]]
* [[Sensor grid|Sensor Grid]]
* [[Semantic perception|Semantic Perception]]

== References ==
{{Reflist}}

* [http://www.infofusion.buffalo.edu/tm/Dr.Llinas'stuff/Rethinking%20JDL%20Data%20Fusion%20Levels_BowmanSteinberg.pdf Rethinking JDL Data Fusion Levels]
* E. P. Blasch and S. Plano, "Level 5: User Refinement to aid the Fusion Process", Proceedings of the SPIE, Vol. 5099, 2003.
* {{cite conference | author1 = J. Llinas | author2 = C. Bowman | author3 = G. Rogova | author4 = A. Steinberg | author5 = E. Waltz | author6 = F. White | id = {{citeseerx|10.1.1.58.2996}} | title = Revisiting the JDL data fusion model II | conference = International Conference on Information Fusion | year = 2004 }}
* E. Blasch, "[http://www.iut-amiens.fr/~ricquebourg/these/fusion_2006/Papers/394.pdf Sensor, user, mission (SUM) resource management and their interaction with level 2/3 fusion]", International Conference on Information Fusion, 2006.
* J. L. Crowley and Y. Demazeau, "[http://www-prima.inrialpes.fr/Prima/Homepages/jlc/papers/SigProc-Fusion.pdf Principles and Techniques for Sensor Data Fusion]", Signal Processing, Volume 32, Issues 1–2, May 1993, Pages 5–27.

== External links ==
* [http://www.isif.org/ International Society of Information Fusion]

[[Category:Robotic sensing]]
[[Category:Computer data]]
[[Category:Sensors]]