Toxic data in AI training opens loopholes for system manipulation
Data poisoning is a cyberattack in which attackers inject malicious or misleading data into an AI training dataset. The goal is to corrupt the model's behavior so that it produces distorted, biased, or harmful results. A related danger is the creation of backdoors that allow malicious exploitation of AI/ML systems. These attacks pose a significant challenge to developers.
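To make the mechanism concrete, here is a minimal sketch of a label-flipping poisoning attack using only the Python standard library. The toy dataset and 1-nearest-neighbor classifier are illustrative assumptions, not a real training pipeline: an attacker who flips a few training labels corrupts the model's predictions even though the test data is unchanged.

```python
# Illustrative sketch (assumed toy data, not a real system): a 1-nearest-neighbor
# classifier is "trained" on a clean dataset and on a poisoned copy in which an
# attacker has flipped some labels. Evaluating both on the same clean test set
# exposes the corrupted behavior.

def predict(train_data, x):
    # 1-NN: return the label of the training point closest to x.
    return min(train_data,
               key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))[1]

def accuracy(train_data, test_data):
    return sum(predict(train_data, x) == y for x, y in test_data) / len(test_data)

# Ten labeled points: class 0 clusters near the origin, class 1 near (10, 10).
clean = [((i, i), 0) for i in range(5)] + \
        [((10 + i, 10 + i), 1) for i in range(5)]

# The attacker poisons the training set by flipping three labels.
poisoned = [(x, 1 - y) if x[0] in (2, 3, 11) else (x, y) for x, y in clean]

print(accuracy(clean, clean))     # model trained on clean data: 1.0
print(accuracy(poisoned, clean))  # model trained on poisoned data: 0.7
```

Real poisoning attacks are subtler (e.g. backdoor triggers that leave clean-data accuracy untouched), but the principle is the same: the attacker never touches the model directly, only the data it learns from.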