Facial recognition technology, while pervasive, remains unreliable at accurately identifying faces, particularly those of people of color, raising the risk that innocent individuals will be wrongly flagged for criminal activity. Studies have revealed significant racial bias: because AI models are disproportionately trained on photos of white men, they are less accurate at identifying individuals from other racial and gender groups. The technology's deployment by major landlords and retailers such as Walmart and Rite Aid has sparked privacy and discrimination concerns. As the debate around facial recognition continues, addressing its flawed accuracy and potential for racial disparities remains paramount.