Data Fusion Methods
Sensor Fusion Categories
- Complementary. Each sensor provides data about a different aspect or attribute of the environment. By combining the data from each sensor, we arrive at a more global view of the environment or situation. Since there is no dependency between the sensors, combining the data is relatively easy.
- Competitive. Several sensors measure the same or similar attributes, and the data from all of them is used to determine the overall value of the attribute under measurement. The measurements are taken independently and can also include measurements at different time instants from a single sensor. This method is useful in fault-tolerant architectures to provide increased reliability of the measurement.
- Co-operative. Data from two or more independent sensors is combined to derive information that no single sensor could provide on its own about the environment.
Several other types of sensor networks exist such as corroborative, concordant, redundant, etc. Most of them are derived from the above mentioned sensor fusion categories.
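A minimal sketch of the competitive and co-operative categories, using made-up readings and assumed stereo-camera parameters purely for illustration:

```python
import numpy as np

# Competitive fusion: two sensors redundantly measure the same temperature;
# combining them (here by simple averaging) improves reliability.
temp_sensor_a = 21.8   # degrees C (illustrative values)
temp_sensor_b = 22.4
fused_temp = np.mean([temp_sensor_a, temp_sensor_b])

# Co-operative fusion: neither camera of a stereo pair can measure depth
# alone; depth is derived only from both measurements together.
focal_length_px = 700.0   # assumed camera parameters
baseline_m = 0.12
disparity_px = 35.0       # pixel shift of the same point between the cameras
depth_m = focal_length_px * baseline_m / disparity_px

print(f"fused temperature: {fused_temp:.2f} C, stereo depth: {depth_m:.2f} m")
```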
Sensor Fusion Types
- Data In - Data Out (DAI-DAO) Fusion
- Data In - Feature Out (DAI-FEO) Fusion
- Feature In - Feature Out (FEI-FEO) Fusion
- Feature In - Decision Out (FEI-DEO) Fusion
- Decision In - Decision Out (DEI-DEO) Fusion
Sensor Fusion Topologies
- Centralized Architecture. A single node handles the fusion process. The sensor data is preprocessed locally before it is sent to the central node, where the fusion takes place.
- Decentralized Architecture. Each sensor processes its data at its own node; since the information is processed locally, there is no need for a global or central node.
- Hierarchical Architecture. A combination of the centralized and decentralized (distributed) types.
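A minimal sketch of the centralized topology, with hypothetical class and method names chosen only for illustration: each sensor node preprocesses its own raw samples, and a single central node performs the fusion.

```python
import numpy as np

class SensorNode:
    """Hypothetical local node: preprocesses its own raw samples
    before forwarding an estimate to the central fusion node."""
    def __init__(self, bias):
        self.bias = bias

    def preprocess(self, raw_samples):
        # Local preprocessing: remove a known bias and reduce noise
        # by averaging over the raw samples.
        return float(np.mean(np.asarray(raw_samples) - self.bias))

class CentralFusionNode:
    """Single node that performs the actual fusion step."""
    def fuse(self, local_estimates):
        # Simple fusion rule for illustration: average the
        # preprocessed estimates from all sensor nodes.
        return float(np.mean(local_estimates))

nodes = [SensorNode(bias=0.1), SensorNode(bias=-0.2)]
readings = [[20.3, 20.5, 20.4], [19.9, 20.1, 20.0]]
estimates = [n.preprocess(r) for n, r in zip(nodes, readings)]
print(CentralFusionNode().fuse(estimates))
```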
Sensor Fusion Models
- JDL Fusion Architecture
- Waterfall Fusion Process Model
- Sensing ⭢ Signal Processing ⭢ Feature Extraction ⭢ Pattern Processing ⭢ Situation Assessment ⭢ Decision Making
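A minimal sketch of the waterfall stages chained as plain functions; every stage body is a placeholder assumption standing in for the real processing at that step.

```python
import numpy as np

# Waterfall model stages chained as simple functions; each body is a
# placeholder standing in for the real processing at that stage.

def sensing():                       # acquire raw samples
    return np.sin(np.linspace(0, 4 * np.pi, 200)) + 0.3 * np.random.randn(200)

def signal_processing(raw):          # e.g. smoothing / denoising
    kernel = np.ones(5) / 5
    return np.convolve(raw, kernel, mode="same")

def feature_extraction(signal):      # e.g. summary statistics
    return {"rms": float(np.sqrt(np.mean(signal ** 2))),
            "peak": float(np.max(np.abs(signal)))}

def pattern_processing(features):    # e.g. threshold-based classification
    return "vibration" if features["rms"] > 0.5 else "quiet"

def situation_assessment(pattern):   # interpret the pattern in context
    return {"state": pattern, "alert": pattern == "vibration"}

def decision_making(situation):      # act on the assessed situation
    return "inspect machine" if situation["alert"] else "no action"

print(decision_making(situation_assessment(
    pattern_processing(feature_extraction(signal_processing(sensing()))))))
```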
Sensor Fusion Levels
- Signal Level Fusion
- Feature Level Fusion
- Decision Level Fusion
Signal Level Fusion
- Data from multiple sources (sensors) are combined to obtain better-quality data and a deeper understanding of the environment being observed.
- Goals:
- Obtain a higher-quality version of the input signals, i.e. a higher signal-to-noise ratio.
- Obtain feature- or mid-level information about the system that a single measuring node cannot reveal.
- Common representation format
- spatial alignment ⭢ temporal alignment ⭢ normalization ⭢ scaling
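A minimal sketch of bringing two sensor streams into a common representation format, covering only temporal alignment and normalization (spatial alignment is omitted); the sample rates and signals are assumptions for illustration:

```python
import numpy as np

# Two sensors sampled at different rates (assumed values): before fusion
# they are brought to a common representation by temporal alignment
# (resampling onto a shared time base) and normalization/scaling.
t_a = np.linspace(0.0, 1.0, 100)           # 100 Hz sensor
t_b = np.linspace(0.0, 1.0, 40)            # 40 Hz sensor
sig_a = np.sin(2 * np.pi * 5 * t_a)
sig_b = np.sin(2 * np.pi * 5 * t_b) + 0.5  # same phenomenon, offset sensor

t_common = np.linspace(0.0, 1.0, 200)      # shared time base
a_aligned = np.interp(t_common, t_a, sig_a)
b_aligned = np.interp(t_common, t_b, sig_b)

def normalize(x):
    # Zero-mean, unit-variance scaling so both signals are comparable.
    return (x - x.mean()) / x.std()

a_norm, b_norm = normalize(a_aligned), normalize(b_aligned)
print(a_norm.shape, b_norm.shape)   # both now share the same length and scale
```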
Signal Level Fusion Methods
- Weighted Averaging. Taking an average of the various sensor signals measuring a particular parameter of the environment (see the inverse-variance sketch after this list).
- Kalman Filter. An adaptive method of sensor fusion used to remove redundancy in the system and to predict the state of the system. If two sensors send data simultaneously, the stacked measurement is $$ z(k) = \begin{bmatrix} z_1(k) \\ z_2(k) \end{bmatrix} = H x(k) + v(k) $$ (a single predict/update cycle is sketched below).
- Track-to-Track Fusion. Local tracks are generated by distinct local sensors and then fused at a central node at the local track level. The local states are fused into a state vector that combines the information from all the local sensor nodes (an information-form sketch appears below).
- Neural Network. Data fusion models can be built with neural networks, where the neurons and interconnecting weights are assigned based on the relationship between the multi-sensor data input and the signal output (a toy forward pass is sketched below).
- Joint probability distribution and Gaussian distribution
- Bayesian estimator, least-squares for feature extraction
- Adaptive observer
- Composite coherent spectrum (CCS), poly-CCS (see Yanusa Kaltungo), embedding both sensor and feature level fusion
Examples: (amerineniFusionModelsGeneralized2021), with varieties such as the input vector model, local matrices input, etc.
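A minimal sketch of weighted averaging using inverse-variance weights; the measurements and per-sensor variances are assumed values:

```python
import numpy as np

# Inverse-variance weighted average of redundant measurements of the
# same quantity; the sensor variances below are assumptions.
measurements = np.array([20.1, 19.7, 20.4])
variances = np.array([0.04, 0.09, 0.16])   # per-sensor noise variances

weights = (1.0 / variances) / np.sum(1.0 / variances)
fused = np.sum(weights * measurements)
fused_variance = 1.0 / np.sum(1.0 / variances)
print(fused, fused_variance)
```

With equal variances this reduces to a plain average; the inverse-variance weighting is also the least-squares (and, under independent Gaussian noise, the Bayesian) estimate mentioned in the list above.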
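A minimal sketch of one Kalman predict/update cycle with the stacked two-sensor measurement from the equation above; the motion model, measurement model, and noise levels are all illustrative assumptions:

```python
import numpy as np

# One predict/update cycle of a Kalman filter for a 1-D position/velocity
# state, with a stacked measurement z(k) = [z1(k), z2(k)]^T = H x(k) + v(k).
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition
Q = 0.01 * np.eye(2)                          # process noise covariance
H = np.array([[1.0, 0.0], [1.0, 0.0]])        # both sensors observe position
R = np.diag([0.25, 0.49])                     # per-sensor measurement noise

x = np.array([0.0, 1.0])                      # state: [position, velocity]
P = np.eye(2)                                 # state covariance

# Predict
x = F @ x
P = F @ P @ F.T + Q

# Update with the stacked measurement from both sensors
z = np.array([0.12, 0.05])                    # z1(k), z2(k)
y = z - H @ x                                 # innovation
S = H @ P @ H.T + R
K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
x = x + K @ y
P = (np.eye(2) - K @ H) @ P
print(x)
```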
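A minimal sketch of fusing two local track estimates in information form; for simplicity it assumes the local tracks are uncorrelated, which practical track-to-track fusion schemes often have to account for:

```python
import numpy as np

# Two local trackers each provide a state estimate and covariance for the
# same target; the central node fuses them. Cross-covariance between the
# local tracks is ignored here (a simplifying assumption).
x1 = np.array([10.2, 1.1]); P1 = np.diag([0.5, 0.2])
x2 = np.array([ 9.8, 0.9]); P2 = np.diag([0.3, 0.4])

P1_inv, P2_inv = np.linalg.inv(P1), np.linalg.inv(P2)
P_fused = np.linalg.inv(P1_inv + P2_inv)          # fused covariance
x_fused = P_fused @ (P1_inv @ x1 + P2_inv @ x2)   # fused state vector
print(x_fused)
```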
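A toy sketch of a neural-network fusion model: features from several sensors are concatenated into one input vector and mapped through a small network; the weights here are random placeholders and no training is shown:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fusion network: features from three sensors are concatenated into one
# input vector and mapped through a single hidden layer to a fused output.
# A real model would learn the weights from the relationship between the
# multi-sensor inputs and the desired output.
sensor_features = [rng.normal(size=4), rng.normal(size=4), rng.normal(size=4)]
x = np.concatenate(sensor_features)               # input vector, shape (12,)

W1 = rng.normal(size=(8, 12)); b1 = np.zeros(8)   # interconnecting weights
W2 = rng.normal(size=(1, 8));  b2 = np.zeros(1)

hidden = np.tanh(W1 @ x + b1)
fused_output = W2 @ hidden + b2
print(fused_output)
```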
Feature Level Fusion
- YHHAZ/NetworkFusion: detection algorithm for pigmented skin disease based on classifier-level and feature-level fusion (wanDetectionAlgorithmPigmented2022)
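A minimal sketch of the general feature-level fusion idea (not of the cited paper's method): per-sensor feature vectors are concatenated into a single joint vector that feeds one classifier; the feature values and classifier weights are placeholders.

```python
import numpy as np

# Feature-level fusion: feature vectors extracted from each sensor are
# concatenated into a joint feature vector before classification.
features_camera = np.array([0.8, 0.1, 0.3])     # e.g. color/texture features
features_dermoscope = np.array([0.6, 0.4])      # e.g. lesion-shape features

fused_features = np.concatenate([features_camera, features_dermoscope])

w = np.array([0.9, -0.4, 0.2, 1.1, 0.5])        # placeholder linear classifier
b = -1.0
score = fused_features @ w + b
prediction = int(score > 0)
print(fused_features, prediction)
```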
Decision Level Fusion
References
- (chandrasekaranSurveyMultisensorFusion2017)
- (niuDataDrivenTechnologyEngineering2017)