In robotic systems, sensor fusion combines data from multiple sensors into a single, more accurate representation of the environment. Bayesian methods underpin this fusion by weighting each measurement according to its uncertainty. For example, if a camera reports an object at 1.2 m ± 0.1 m and a LiDAR reports 1.3 m ± 0.3 m, the Bayesian combination weights the more precise camera reading more heavily: inverse-variance weighting yields a fused estimate of roughly 1.21 m ± 0.095 m, closer to the camera's value and more certain than either sensor alone (see the first sketch below).

The Kalman Filter is a popular algorithm for estimating the state of a dynamic system over time. It alternates between two stages: prediction, which propagates the previous state estimate forward through a motion model, and update, which corrects that prediction with incoming sensor data, weighted by the Kalman gain (second sketch below). For non-linear systems, the Extended Kalman Filter applies the same predict/update cycle after linearizing the motion and measurement models around the current estimate using their Jacobians (third sketch below).

These techniques are prevalent throughout robotics, including IMU and GPS fusion for navigation, visual-inertial navigation, and pose estimation using multiple sensor inputs.
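As a minimal sketch of the inverse-variance weighting described above (the standard rule for fusing two independent Gaussian estimates of the same quantity; `fuse_gaussians` is an illustrative name, not from the text):

```python
import numpy as np

def fuse_gaussians(mu1, sigma1, mu2, sigma2):
    """Fuse two independent Gaussian measurements of the same quantity.

    The fused mean is the inverse-variance weighted average; the fused
    variance is always smaller than either input variance.
    """
    w1, w2 = 1.0 / sigma1**2, 1.0 / sigma2**2
    mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)
    sigma = np.sqrt(1.0 / (w1 + w2))
    return mu, sigma

# Camera: 1.2 m +/- 0.1 m, LiDAR: 1.3 m +/- 0.3 m (values from the text)
mu, sigma = fuse_gaussians(1.2, 0.1, 1.3, 0.3)
print(f"fused: {mu:.3f} m +/- {sigma:.3f} m")  # ~1.210 m +/- 0.095 m
```

Because the camera's variance (0.01 m²) is nine times smaller than the LiDAR's (0.09 m²), it receives nine times the weight, which is exactly the "prioritization" the Bayesian view formalizes.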
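The second sketch shows one predict/update cycle of a linear Kalman Filter, assuming a 1-D constant-velocity motion model with position-only measurements; the matrices F, H, Q, and R below are illustrative choices, not values from the text:

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
H = np.array([[1.0, 0.0]])              # we observe position only
Q = 0.01 * np.eye(2)                    # process noise covariance (assumed)
R = np.array([[0.1**2]])                # measurement noise covariance (assumed)

x = np.array([[0.0], [0.0]])            # state estimate [position; velocity]
P = np.eye(2)                           # state covariance

def predict(x, P):
    # Prediction: propagate the state forward and grow the uncertainty.
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    # Update: correct the prediction with a new measurement z.
    y = z - H @ x                        # innovation (measurement residual)
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

for z in [0.11, 0.25, 0.32]:            # synthetic position readings (m)
    x, P = predict(x, P)
    x, P = update(x, P, np.array([[z]]))
print(x.ravel())                         # estimated [position, velocity]
```

The Kalman gain K plays the same role as the inverse-variance weights above: it balances trust between the prediction and the measurement according to their covariances.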
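Finally, a sketch of the Extended Kalman Filter's linearization step, assuming a 2-D position state observed through a non-linear range sensor; the measurement model, noise values, and readings are illustrative:

```python
import numpy as np

# EKF measurement update for a non-linear range sensor.
# State: 2-D position [px, py]; measurement: range to the origin.
# The EKF linearizes h(x) around the current estimate via its Jacobian.

def h(x):
    return np.array([[np.hypot(x[0, 0], x[1, 0])]])   # non-linear measurement

def H_jacobian(x):
    r = np.hypot(x[0, 0], x[1, 0])
    return np.array([[x[0, 0] / r, x[1, 0] / r]])     # dh/dx at the estimate

x = np.array([[3.0], [4.0]])     # prior state estimate (assumed)
P = 0.5 * np.eye(2)              # prior covariance (assumed)
R = np.array([[0.1**2]])         # range-sensor noise (assumed)
z = np.array([[5.2]])            # observed range (synthetic)

Hj = H_jacobian(x)               # Jacobian replaces the fixed H of the KF
y = z - h(x)                     # innovation uses the true non-linear model
S = Hj @ P @ Hj.T + R
K = P @ Hj.T @ np.linalg.inv(S)
x = x + K @ y
P = (np.eye(2) - K @ Hj) @ P
print(x.ravel())                 # corrected position estimate
```

Note that the innovation is computed with the exact non-linear model h(x); only the covariance propagation uses the Jacobian, which is what distinguishes the EKF update from the linear case.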