The adoption of internet-connected devices has grown exponentially over the past few years, and the trend shows no sign of slowing. Gartner estimates that the average CIO will be responsible for more than three times as many endpoints in 2023 as in 2018. Supporting such an increase by simply scaling up cloud infrastructure and network capacity, however, may not be financially feasible.
Edge computing can be a viable alternative in such situations because it brings the necessary resources, such as storage, compute, and networking, closer to the data source.
Edge computing is gaining popularity across industries because it delivers immediate, actionable insight. Its benefits are well known; I highlighted some of its uses and advantages in a previous article.
It’s only a matter of time before edge computing becomes mainstream. This was evident in a recent IDC survey, in which 73% of respondents viewed edge computing as a strategic investment. All stakeholders, including cloud providers and telecom service providers, are working together to strengthen the edge computing ecosystem and accelerate its adoption.
Web application developers should take advantage of these tailwinds and develop an edge adoption strategy that increases their agility and maximizes the edge’s potential to improve user engagement.
Beyond near-real-time insight and low latency, edge computing reduces the bandwidth consumed by cloud servers. For web applications, an edge-computing architecture can increase productivity, reduce costs, save bandwidth, and open up new revenue streams.
I believe there are four key enablers of edge computing that can help architects and web developers get started.
Multiple components make up the edge ecosystem, including devices, gateways, and edge servers. Depending on peak load and availability requirements, web applications should be able to run on any of these components.
Some specific use cases, however, such as drones detecting poaching activity in dense forests with little or no connectivity, require applications that run natively on edge devices and gateways.
Adopting edge computing begins with a cloud-native architecture, whether for the application or for the service. Containers, service meshes, and microservices best illustrate this approach: they enable loosely coupled systems that are resilient, manageable, and observable, and they allow engineers to make frequent, high-impact changes with minimal effort.
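As a minimal sketch of what such a cloud-native building block might look like, the stateless Node.js service below (a hypothetical example, using only Node’s built-in http module) exposes a health endpoint that a container orchestrator can probe and reads its port from the environment, so the same image can be deployed to a cloud region or an edge location without changes.

```javascript
// Minimal sketch of a stateless, container-ready microservice (illustrative only).
const http = require('http');

// Port comes from the environment so the same container image runs anywhere.
const PORT = process.env.PORT || 8080;

const server = http.createServer((req, res) => {
  if (req.url === '/healthz') {
    // Health endpoint lets an orchestrator (e.g., Kubernetes) probe the service.
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ status: 'ok' }));
    return;
  }
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from an edge-ready service\n');
});

server.listen(PORT, () => console.log(`listening on ${PORT}`));
```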
Cloud Service Providers, or CSPs, offer computing and storage services that are local to a particular region or zone. These act as mini or regional data centers managed by the CSPs, and the infrastructure can easily be used to deploy applications and services that follow the “develop once, deploy everywhere” principle.
AWS (Outposts, Snowball), Azure (Edge Zones), GCP (Anthos), and IBM (Cloud Satellite) are examples of CSPs that have extended their fully managed services to on-premises settings. Enterprises and startups in growth stages can use these hybrid cloud solutions to deploy edge solutions quickly and with greater security, as long as they can afford them.
New 5G cellular technology can deliver significant latency benefits for applications that run on mobile phones and rely on cellular connectivity. CSPs are also deploying compute and storage resources closer to telecom carriers’ networks, which lets mobile apps such as gaming and virtual reality take advantage of the improved connectivity.
Content Delivery Networks (CDNs) have established Points of Presence (PoPs) to help web applications serve their content faster. Many PoPs now include JavaScript (V8) language runtimes, which allow program logic to execute closer to the edge. Moving client-side program logic to the edge also improves security.
With these services, web applications such as online shopping portals can offer a better customer experience at lower latency. An application can, for example, move cookie-manipulation logic to the CDN edge rather than hitting the origin server, which is especially useful during high-traffic events such as Black Friday and Cyber Monday.
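As a rough illustration, a fetch handler like the one below, written in the style of service-worker-based edge runtimes such as Cloudflare Workers (the cookie name and responses are placeholders), can inspect cookies and answer anonymous visitors directly from the edge, forwarding only known users to the origin.

```javascript
// Sketch of cookie handling at a CDN edge worker (service-worker-style API).
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  const cookies = request.headers.get('Cookie') || '';
  const hasSession = cookies.includes('session_id=');

  // Anonymous visitors get a response straight from the edge, no origin hit.
  if (!hasSession) {
    return new Response('Welcome! Please sign in.', {
      headers: { 'Content-Type': 'text/plain' },
    });
  }

  // Known users fall through to the origin server.
  return fetch(request);
}
```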
This method can also be used to run A/B tests: a fixed share of users can be served the experimental version while the rest receive the standard one.
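A sticky-bucket assignment at the edge might look like the following sketch, wired into the same kind of fetch handler shown above; the 10% split, cookie name, and variant path are illustrative assumptions.

```javascript
// Hypothetical A/B split at the edge: assign a sticky bucket via cookie and
// route a fixed share of users to the experimental variant.
async function abTest(request) {
  const cookies = request.headers.get('Cookie') || '';
  let bucket = cookies.match(/ab_bucket=(control|experiment)/)?.[1];

  // First visit: put roughly 10% of users into the experiment.
  const isNewVisitor = !bucket;
  if (isNewVisitor) {
    bucket = Math.random() < 0.1 ? 'experiment' : 'control';
  }

  const url = new URL(request.url);
  if (bucket === 'experiment') {
    url.pathname = '/experiment' + url.pathname; // variant path is illustrative
  }

  const response = await fetch(url.toString(), request);
  const result = new Response(response.body, response);
  if (isNewVisitor) {
    // Make the assignment sticky for subsequent requests.
    result.headers.append('Set-Cookie', `ab_bucket=${bucket}; Path=/; Max-Age=2592000`);
  }
  return result;
}
```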
The past few years have seen a rapid increase in the number of neural network models and frameworks, and developers now need to share models across a wide range of tools, frameworks, runtimes, and compilers. Before these models can run on different edge devices, developers and entrepreneurs need a standardized AI/ML model format.
Open deep-learning formats such as the Open Neural Network Exchange (ONNX) are emerging as a solution. ONNX is interoperable with commonly used deep-learning frameworks, so models trained in those frameworks can be exported to the ONNX format. ONNX Runtime is also available in JavaScript and other languages, and both the runtime and the models run on a variety of platforms, including low-power edge devices.
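To give a flavor of how this works in the browser, here is a minimal inference sketch using the onnxruntime-web package; the model file name, input/output names, and tensor shape are assumptions that depend on the exported model.

```javascript
// Sketch of client-side inference with onnxruntime-web; 'model.onnx', 'input',
// and 'output' are placeholders for the actual exported model's names.
import * as ort from 'onnxruntime-web';

async function classify(pixels) {
  // Load an ONNX model exported from any supported training framework.
  const session = await ort.InferenceSession.create('./model.onnx');

  // Wrap the raw input data in a tensor matching the model's expected shape.
  const input = new ort.Tensor('float32', Float32Array.from(pixels), [1, 3, 224, 224]);

  // Run inference entirely on the client; feed keys must match the model's input names.
  const results = await session.run({ input });
  return results.output.data;
}
```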
The traditional approach to machine learning is to train AI/ML models in a compute-intensive cloud environment and then use the trained model for inference. With AI/ML JavaScript frameworks, browser-based applications can run inference locally, and some of these frameworks even allow models to be trained in the browser or in a JavaScript backend.
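As a simple sketch of in-browser training, the snippet below uses TensorFlow.js, one such AI/ML JavaScript framework; the toy data and model architecture are illustrative.

```javascript
// Sketch of training a tiny model entirely on the user's device with TensorFlow.js.
import * as tf from '@tensorflow/tfjs';

async function trainAndPredict() {
  // A minimal model: learn y = 2x - 1 from a handful of points.
  const model = tf.sequential();
  model.add(tf.layers.dense({ units: 1, inputShape: [1] }));
  model.compile({ loss: 'meanSquaredError', optimizer: 'sgd' });

  const xs = tf.tensor2d([-1, 0, 1, 2, 3, 4], [6, 1]);
  const ys = tf.tensor2d([-3, -1, 1, 3, 5, 7], [6, 1]);

  // Training happens in the browser; no round trip to a cloud GPU.
  await model.fit(xs, ys, { epochs: 200 });

  const prediction = model.predict(tf.tensor2d([10], [1, 1]));
  prediction.print(); // should approach 19
}
```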
I’ve worked with many startups and found that early adoption of emerging technologies such as edge computing can make a real difference to business outcomes.
Adopting any new technology successfully takes planning and preparation. By following these enablers, you can integrate edge computing into your web applications seamlessly and sustainably.