This study addresses the impact of adverse weather, specifically fog, on the perception systems of autonomous vehicles, which are critical for detecting and responding to traffic scenarios. An object recognition model was trained on more than 10,000 images using Roboflow and YOLOv8, and fog disturbances were generated with generative adversarial networks (GANs). The research simulates various traffic scenarios and compares system performance under clear and foggy conditions. Results show that training models on a more diverse range of conditions improves detection accuracy, underscoring the importance of diverse training data for safe autonomous vehicle operation. This work offers insights for improving perception systems in autonomous vehicles.
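
To illustrate the kind of clear-versus-fog comparison described above, the following minimal sketch (not the authors' code) runs a pretrained YOLOv8 detector on a clear image and on a synthetically fogged copy and reports detection counts and mean confidence. It assumes the `ultralytics` package and a hypothetical sample file `clear_road.jpg`; the simple alpha-blended fog is only a stand-in for the GAN-generated disturbances used in the study.

```python
from PIL import Image
from ultralytics import YOLO


def add_synthetic_fog(image: Image.Image, density: float = 0.5) -> Image.Image:
    """Blend the image toward a uniform light-gray layer to imitate fog."""
    fog_layer = Image.new("RGB", image.size, (220, 220, 220))
    return Image.blend(image.convert("RGB"), fog_layer, density)


def summarize(results) -> tuple[int, float]:
    """Return (number of detections, mean confidence) for one image's results."""
    boxes = results[0].boxes
    if len(boxes) == 0:
        return 0, 0.0
    return len(boxes), float(boxes.conf.mean())


if __name__ == "__main__":
    model = YOLO("yolov8n.pt")             # pretrained checkpoint; stand-in for the study's model
    clear = Image.open("clear_road.jpg")   # hypothetical sample image
    foggy = add_synthetic_fog(clear, density=0.6)

    n_clear, conf_clear = summarize(model(clear, verbose=False))
    n_foggy, conf_foggy = summarize(model(foggy, verbose=False))

    print(f"clear: {n_clear} objects, mean confidence {conf_clear:.2f}")
    print(f"foggy: {n_foggy} objects, mean confidence {conf_foggy:.2f}")
```

Comparing the two summaries typically shows fewer detections and lower confidences on the fogged image, which is the degradation the study quantifies and mitigates through more diverse training data.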