Security Cameras with AI will be the smart city norm for applications such as intelligent traffic management and preventive threat detection.
SINGAPORE, April 1, 2021 /PRNewswire/ — According to global tech market advisory firm, ABI Research, the global installed base of smart cameras with an Artificial Intelligence (AI) chipset will reach over 350 million in 2025.
The inclusion of AI chipsets in security cameras in smart cities will become the norm, with over 65% of cameras shipped in 2025 expected to come with at least one AI chipset.
These cameras will feature deep learning (DL) models to automate and augment decision making in applications such as intelligent traffic management, autonomous assets, pedestrian flow monitoring and management, physical and perimeter security, and preventive threat detection.
“More and more city and county governments around the world are actively looking to utilize AI. This has led to a boom in the adoption of smart cameras with edge AI chipsets,” says Lian Jye Su, Principal Analyst of AI & Machine Learning at ABI Research.
Aside from low latency, data privacy concerns have also driven the adoption of AI at the edge, as information can be processed without being sent to the cloud. Ambarella, HiSilicon, Intel, NVIDIA, Qualcomm, and Xilinx are some of the key AI chipset suppliers in the smart city space.
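As a rough sketch of that privacy argument, the snippet below runs a generic ONNX detection model directly on the camera and publishes only an anonymized count, so raw footage never leaves the device. The model file, input shape, and output layout are illustrative assumptions, not any particular vendor's pipeline.

```python
import json

import cv2                     # OpenCV for frame capture and preprocessing
import numpy as np
import onnxruntime as ort

MODEL_PATH = "person_detector.onnx"   # hypothetical pre-trained detector
CONF_THRESHOLD = 0.5

# Load the model once; inference runs entirely on the camera's edge chipset.
session = ort.InferenceSession(MODEL_PATH)
input_name = session.get_inputs()[0].name

cap = cv2.VideoCapture(0)             # the camera's local sensor feed

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Preprocess: resize/normalize to the model's assumed 640x640 NCHW input.
    blob = cv2.dnn.blobFromImage(frame, scalefactor=1 / 255.0, size=(640, 640))

    # Run detection locally; raw video never leaves the device.
    outputs = session.run(None, {input_name: blob})[0]

    # Assumed output layout: rows of [x1, y1, x2, y2, score, class_id].
    scores = np.asarray(outputs).reshape(-1, 6)[:, 4]
    summary = {"person_count": int((scores >= CONF_THRESHOLD).sum())}

    # Only this anonymized metadata would be published (e.g., over MQTT).
    print(json.dumps(summary))

cap.release()
```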
Increasingly, TinyML vendors are expected to play a key role in enabling always-on machine vision through battery-operated cameras, lidar, infrared, and other sensors.
At the moment, most of these workloads are performed either by DL models hosted in the cloud, offered by video analytics vendors such as SenseTime, Ipsotek, iCetana, and Sentry AI, or by DL inference in smart cameras and network video recorders from vendors such as Hikvision and Dahua.
Both deployment methods come with their own strengths and weaknesses. Two technology trends will likely further catalyze the deployment of DL-based machine vision. “The first one is edge computing. Instead of deploying specific DL models on smart cameras that are multiple times more expensive than legacy cameras, city and county governments can host DL models on gateways and on-premise servers. This allows data to be processed and stored at the edge, providing faster response times than relying on cloud infrastructure. The second is 5G. While network slicing will not be commercially ready until 2023, 5G’s network slicing capability allows communication service providers to offer dedicated network resources to host microservices, six-nines reliability service assurance, seamless device connectivity, and onboarding to support DL-based machine vision in the smart city,” Su explains.
“The shift to edge computing opens new market opportunities, especially for AI chipset startups that focus on processing at gateways and standard servers, such as Blaize, Hailo, and Kneron.”
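To make the edge-computing scenario concrete, here is a minimal sketch of a DL model hosted on an on-premise gateway behind a plain HTTP endpoint, so low-cost cameras can offload inference without a cloud round trip. The endpoint, model file, and input shape are assumptions made for illustration, not a specific product's interface.

```python
import cv2
import numpy as np
import onnxruntime as ort
from flask import Flask, jsonify, request

app = Flask(__name__)

# Load the shared model once on the gateway; cameras stay inexpensive and
# simply forward frames over the local network.
session = ort.InferenceSession("traffic_model.onnx")   # hypothetical model
input_name = session.get_inputs()[0].name

@app.route("/infer", methods=["POST"])
def infer():
    # Cameras POST a JPEG frame in the request body; decode it to BGR.
    buffer = np.frombuffer(request.data, dtype=np.uint8)
    frame = cv2.imdecode(buffer, cv2.IMREAD_COLOR)
    if frame is None:
        return jsonify({"error": "invalid image"}), 400

    # Preprocess to the model's assumed 640x640 NCHW input and run inference.
    blob = cv2.dnn.blobFromImage(frame, scalefactor=1 / 255.0, size=(640, 640))
    outputs = session.run(None, {input_name: blob})[0]

    # A real deployment would post-process detections and store results
    # on-premise; here we just confirm the inference ran at the edge.
    return jsonify({"output_shape": list(np.asarray(outputs).shape)})

if __name__ == "__main__":
    # Serve on the premises LAN; frames and results never leave the site.
    app.run(host="0.0.0.0", port=8080)
```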
Nonetheless, significant headwinds abound. Public trust and regulations surrounding the use of AI in public cameras are a major challenge facing implementation. The public and human rights advocates around the world are wary of misuse and have been pushing back against the adoption of facial recognition technologies.
Several city councils in the United States have even banned the use of facial recognition. As one of the key markets for DL-based machine vision technology, China has come under heavy scrutiny over its nationwide rollout of facial recognition technology to maintain national security and public order, particularly in ethnic minority territories.
“Trust is a critical component in public safety technologies. ABI Research encourages developers, vendors, authorities, and the general public to focus on constant dialogue, the introduction of a common technology platform for transparency, and AI ethics and governance frameworks that can minimize biases. Moving forward, the technology vendors that succeed in the smart city will be those able to demonstrate transparent and explainable DL models and those willing to embrace open and common standards and ethical frameworks,” concludes Su.