Always-sensing cameras are emerging in smartphones, home appliances, and other consumer devices, much like the always-listening Siri and Google voice assistants. Always-on technologies enable a more natural and seamless user experience, supporting features such as automatic locking and unlocking of the device or display adjustment based on the user's gaze. However, camera data raises quality, richness, and privacy concerns that require specialized Artificial Intelligence (AI) processing, and existing system processors are ill-suited for always-sensing applications. Without careful attention to Neural Processing Unit (NPU) design, an always-sensing sub-system will consume excessive power, suffer from excessive latency, or risk the user's privacy, all leading to an unsatisfactory user experience. To process always-sensing data in a power-, latency-, and privacy-friendly manner, OEMs are turning to specialized “LittleNPU” AI processors. In this webinar, we’ll explore the architecture of always-sensing, discuss use cases, and provide tips for how OEMs, chipmakers, and system architects can successfully evaluate, specify, and deploy an NPU in an always-on camera sub-system.