AI Definitions: Data Poisoning

Data Poisoning – An attack on a machine-learning system in which malicious actors insert incorrect or misleading information into the data set used to train an AI model in order to pollute the results. It can also serve as a defensive tool, helping creators reassert some control over how their work is used. AI’s growing role in military operations has created particular opportunities and vulnerabilities around the poisoning of AI systems involved in decision-making, reconnaissance, and targeting. A minimal illustrative sketch of one common poisoning technique appears below.
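
The following is a minimal, hypothetical sketch of one simple poisoning technique, label flipping, in which an attacker corrupts a fraction of the training labels. The toy dataset, the `flip_labels` helper, and the flip fractions are all assumptions for illustration using scikit-learn; they are not drawn from any specific real-world attack.

```python
# Minimal sketch of label-flipping data poisoning (illustrative only).
# Assumes a toy binary-classification setup built with scikit-learn.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Clean synthetic dataset and a held-out clean test set.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

def flip_labels(labels, fraction, rng):
    """Return a copy of `labels` with a random fraction flipped (the 'poison')."""
    poisoned = labels.copy()
    n_flip = int(fraction * len(poisoned))
    idx = rng.choice(len(poisoned), size=n_flip, replace=False)
    poisoned[idx] = 1 - poisoned[idx]  # flip 0 <-> 1
    return poisoned

# Train on increasingly poisoned labels and evaluate on clean test data.
for fraction in (0.0, 0.1, 0.3):
    y_poisoned = flip_labels(y_train, fraction, rng)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_poisoned)
    print(f"poisoned fraction={fraction:.0%}  test accuracy={model.score(X_test, y_test):.3f}")
```

In this sketch, test accuracy typically degrades as the poisoned fraction grows, which is the "polluted results" effect described in the definition above.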

More AI definitions here