Understanding AI Package Hallucination: The latest dependency security threat

Presented by Mackenzie Jackson, Developer Advocate at GitGuardian

About this talk

In this video, we explore AI package hallucination: a threat that arises when AI code-generation tools hallucinate open-source packages or libraries that don't exist. We look at why this happens and show a demo of ChatGPT recommending multiple packages that don't exist. We also explain why this is a serious threat and how malicious hackers could harness it for evil. It is the next evolution of typosquatting.

Chapters:
Introduction: 0:00
What is AI package hallucination: 0:12
Sacrifice to the YouTube Gods: 0:33
How AI models find relationships: 0:45
Lawyer uses hallucinated legal cases: 1:18
How we use open-source packages: 1:39
How ChatGPT promotes packages: 2:17
Example of AI package hallucination: 2:51
Why is package hallucination a security risk: 3:46
How many packages are hallucinated? 5:37
Protection measures against AI package hallucination: 6:18
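The practical defense starts with a simple habit: never install a package an AI tool suggests without confirming it actually exists and has a history on the registry. As a minimal sketch (not from the talk, and assuming Python with the requests library installed), the script below queries PyPI's public JSON API, which returns 404 for names that were never published:

```python
import sys
import requests

# PyPI's public JSON metadata endpoint; returns 404 for unpublished names.
PYPI_JSON_URL = "https://pypi.org/pypi/{name}/json"

def check_package(name: str) -> None:
    """Flag package names that do not resolve on PyPI."""
    resp = requests.get(PYPI_JSON_URL.format(name=name), timeout=10)
    if resp.status_code == 404:
        # The name does not exist on PyPI: it may be hallucinated, and an
        # attacker could register it and wait for victims to pip-install it.
        print(f"[!] {name}: not found on PyPI -- possibly hallucinated")
        return
    resp.raise_for_status()
    info = resp.json()["info"]
    # Existing packages still warrant scrutiny: check the release history,
    # maintainers, and linked source repository before trusting them.
    print(f"[+] {name}: exists (latest version {info['version']})")

if __name__ == "__main__":
    for pkg in sys.argv[1:]:
        check_package(pkg)
```

Run as, for example, python check_package.py flask some-suggested-name to get one line per package; anything flagged deserves a manual look before it goes anywhere near pip install, since a missing name is exactly the gap a malicious registration would fill.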

About GitGuardian
Learn how software-driven organizations use GitGuardian to strengthen their overall security posture and comply with application security frameworks and standards. GitGuardian, founded in 2017, has become the leader in automated secrets detection and is now focused on providing a comprehensive code security platform. It has raised $56M from top investors, including co-founders of GitHub and Docker. Its policy engine helps security teams monitor and enforce rules across all their VCS, DevOps tools, and infrastructure-as-code configurations. GitGuardian offers Secrets Detection, Infra as Code Security, and Honeytoken capabilities all in one platform.