AI at the edge: Three tips to consider before deploying
As artificial intelligence (AI) matures, adoption continues to increase. According to recent research, 35% of organizations are using AI and 42% are exploring its potential. While AI is well understood and widely deployed in the cloud, it is still nascent at the edge and faces some unique challenges.
Many people use artificial intelligence throughout the day, from navigating their cars to tracking their steps to talking to digital assistants. Even though users access these services on mobile devices, the computation itself still happens in the cloud. More specifically, a person requests information, the request is processed by a central model in the cloud, and the results are sent back to the person's local device.
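As a minimal sketch of that round trip (the endpoint URL, payload shape, and response format below are all hypothetical):

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical cloud inference endpoint; in practice this would be a
# managed service or a model server you operate yourself.
CLOUD_ENDPOINT = "https://example.com/v1/models/assistant:predict"

def ask_cloud_model(user_request: str) -> str:
    """Send a request to a central model in the cloud and return the result."""
    # 1. The local device sends the raw request over the network...
    response = requests.post(CLOUD_ENDPOINT, json={"input": user_request}, timeout=10)
    response.raise_for_status()
    # 2. ...and the cloud-side model's answer travels back to the device.
    return response.json()["output"]

if __name__ == "__main__":
    print(ask_cloud_model("How many steps have I taken today?"))
```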
Edge AI is less well understood and less frequently deployed than cloud AI. From the beginning, AI algorithms and innovations have relied on a fundamental assumption: that all data can be sent to a central location, where algorithms have full access to it. This allows the algorithm to build its intelligence like a brain or central nervous system, with full access to computation and data.
But AI at the edge is different. It distributes intelligence across all cells and nerves. By pushing intelligence to the edge, we empower these edge devices with agency. This is critical in many applications and fields such as healthcare and industrial manufacturing.
Reasons to deploy AI at the edge
There are three main reasons for deploying AI at the edge.
Protecting Personally Identifiable Information (PII)
First, some organizations that handle PII or sensitive intellectual property (IP) prefer to keep the data at its source: in imaging machines in hospitals, or in manufacturing machines on factory floors. This reduces the risk of "drift" or "leakage" that can occur when data is transmitted over the network.
Minimizing bandwidth usage
The second reason is bandwidth. Sending large amounts of data from the edge to the cloud can clog the network and is impractical in some cases. It is not uncommon for imaging machines in healthcare environments to generate files so large that transferring them to the cloud is impossible or takes days to complete.
It is simply more effective to process the data at the edge, especially when the insights are aimed at improving proprietary machines. In the past, computing was much harder to move and maintain, so data had to be moved to where the computation lived. That paradigm is being challenged now that data is often more valuable and harder to manage, leading to use cases that warrant moving computation to the location of the data.
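As a toy illustration of moving computation to the data: summarize raw sensor readings on the edge device and transmit only the compact result. The field names and sample values here are invented:

```python
from statistics import mean, pstdev

def summarize_on_edge(raw_readings: list[float]) -> dict:
    """Reduce a large stream of raw readings to a small summary payload.

    Shipping every sample to the cloud costs bandwidth roughly proportional
    to len(raw_readings); the summary sent upstream is constant-size.
    """
    return {
        "count": len(raw_readings),
        "mean": mean(raw_readings),
        "stdev": pstdev(raw_readings),
        "min": min(raw_readings),
        "max": max(raw_readings),
    }

# One hour of 1 kHz vibration samples would be 3.6 million floats;
# the summary is just five numbers.
samples = [0.02, 0.03, 0.01, 0.04, 0.02]  # stand-in for a real sensor stream
print(summarize_on_edge(samples))
```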
Avoiding latency
The third reason for deploying AI at the edge is latency. The internet is fast, but it is not real time. In situations where milliseconds matter, such as robotic arms assisting in surgery or time-sensitive production lines, organizations may decide to run AI at the edge.
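The decision often comes down to simple arithmetic: a single network round trip can consume the entire latency budget before any inference happens. A rough illustration in Python, with made-up timings standing in for real measurements:

```python
import time

LATENCY_BUDGET_MS = 10.0  # e.g. a robotic-arm control loop (illustrative)

def measure_ms(fn) -> float:
    """Return the wall-clock time of fn() in milliseconds."""
    start = time.perf_counter()
    fn()
    return (time.perf_counter() - start) * 1000.0

def local_inference():
    time.sleep(0.002)   # stand-in for a 2 ms on-device model

def cloud_round_trip():
    time.sleep(0.080)   # stand-in for an 80 ms request over the internet

for name, fn in [("edge", local_inference), ("cloud", cloud_round_trip)]:
    elapsed = measure_ms(fn)
    verdict = "OK" if elapsed <= LATENCY_BUDGET_MS else "misses budget"
    print(f"{name}: {elapsed:.1f} ms -> {verdict}")
```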
Challenges of AI at the edge and how to solve them
Despite these benefits, there are still some unique challenges associated with deploying AI at the edge. Here are some tips you should consider to help manage these challenges.
The imbalance between good and bad training examples
Most AI technologies use large amounts of data to train models. However, in industrial edge use cases this often becomes more difficult, because most manufactured products are defect-free and are therefore labeled or annotated as good. The resulting imbalance between "good" and "bad" examples makes it harder for the model to learn to identify problems.
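One common way around this imbalance is anomaly detection: train only on the plentiful "good" examples and flag anything that deviates. A minimal sketch using scikit-learn's IsolationForest, with made-up measurements (this is one possible approach, not necessarily what any particular factory uses):

```python
from sklearn.ensemble import IsolationForest

# Nearly all factory output is defect-free, so labeled defects are scarce.
# Train an anomaly detector on "good" measurements only.
good_parts = [[10.0, 0.51], [10.1, 0.49], [9.9, 0.50], [10.0, 0.50],
              [10.2, 0.52], [9.8, 0.48], [10.1, 0.50], [10.0, 0.49]]

detector = IsolationForest(contamination=0.01, random_state=0)
detector.fit(good_parts)

# At inspection time, predict() returns 1 for "looks normal", -1 for "anomaly".
new_parts = [[10.0, 0.50],   # in line with training data
             [12.5, 0.90]]   # far outside it -> likely defect
print(detector.predict(new_parts))
```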
Pure AI solutions that rely on data classification without contextual information are often difficult to create and deploy, due to the lack of labeled data and the rarity of failure events. Adding context to AI, otherwise known as a data-centric approach, often improves the accuracy and scalability of the final solution. The truth is that while AI can often take over mundane tasks performed manually by humans, it benefits greatly from human insight when building models, especially when there isn't much data to work with.
Get a commitment from experienced subject matter experts to work closely with the data scientists who build the algorithms; their domain knowledge can jumpstart the model's learning.
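One lightweight way to encode that human insight in a deployed system is to combine an expert-defined rule with the model's own score. Everything in this sketch (the temperature window, the threshold, and the model_score stand-in) is a hypothetical illustration:

```python
def model_score(measurement: dict) -> float:
    """Stand-in for a trained model's anomaly score in [0, 1]."""
    return 0.2  # placeholder value for illustration

def expert_rule(measurement: dict) -> bool:
    """Domain knowledge from a subject matter expert: parts processed
    outside this temperature window are suspect even if the model is calm."""
    return not (180.0 <= measurement["temp_c"] <= 220.0)

def flag_for_review(measurement: dict, threshold: float = 0.8) -> bool:
    # Either signal alone is enough to escalate; the expert rule covers
    # rare failure modes the data-starved model may never have seen.
    return expert_rule(measurement) or model_score(measurement) >= threshold

print(flag_for_review({"temp_c": 250.0}))  # True: rule fires despite low model score
```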
AI cannot magically solve every problem
There are usually many steps that contribute to a final output. For example, a factory floor may have many workstations that depend on one another: humidity in one area of the factory during one process may affect the results of another process carried out later, in a different area of the production line.
People often assume that artificial intelligence can magically piece together all of these relationships. While that is possible in many cases, it can require large amounts of data and a long collection period, and it tends to produce very complex algorithms that are hard to interpret and update.
Artificial intelligence cannot live in a vacuum. Capturing these interdependencies is what pushes a solution beyond a simple one to one that can scale over time and across different deployments.
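In practice, capturing those interdependencies often starts with something unglamorous: joining upstream measurements to downstream outcomes so a model can see them at all. A toy sketch, with invented station names and fields:

```python
# Build one training row per finished unit, joining measurements from
# upstream and downstream workstations so a model can learn cross-station
# effects such as "humidity at station A degrades the coating applied
# later at station C".
station_a = {"unit-1": {"humidity": 0.62}, "unit-2": {"humidity": 0.35}}
station_c = {"unit-1": {"coating_defect": 1}, "unit-2": {"coating_defect": 0}}

rows = []
for unit_id, downstream in station_c.items():
    upstream = station_a[unit_id]
    rows.append({
        "unit_id": unit_id,
        "a_humidity": upstream["humidity"],     # upstream context feature
        "label": downstream["coating_defect"],  # downstream outcome
    })

print(rows)
```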
Lack of stakeholder support limits the scale of AI
It's difficult to scale AI throughout an organization if groups within it are skeptical of its benefits. The best (and perhaps only) way to gain widespread support is to start with a high-value, difficult problem and solve it with AI.
Audi, for example, considered tackling the question of how often welding gun electrodes should be replaced. But electrodes are cheap, so automating that decision would not eliminate any mundane work humans were doing. Instead, the company chose a welding process that is universally recognized across the industry as a difficult problem, and significantly improved the quality of that process through artificial intelligence. This sparked the imagination of engineers across the company, who began looking at how AI could be used in other processes to improve efficiency and quality.
Balancing the benefits and challenges of edge AI
Deploying AI at the edge can help organizations and their teams. It has the potential to transform facilities into intelligent edges, improve quality, optimize manufacturing processes, and inspire developers and engineers across the organization to explore new AI use cases, including predictive analytics, recommendations for improving efficiency, and anomaly detection. But it also brings new challenges. As an industry, we must be able to deploy it while reducing latency, increasing privacy, protecting IP, and keeping the network running smoothly.