April 23, 2024

APIs at the Edge of Modern Cloud Apps

So much of our day-to-day experience with the internet happens invisibly – we simply trust our services to function seamlessly and efficiently at scale with very little awareness of how each piece works in context. One of the most incredible elements of the ecosystem, however, is also one of the most misunderstood.

The cloud might be a popular talking point, but the edge of the cloud – where APIs and distributed computing intersect – is the true unsung hero of modern cloud application development and deployment. Today, we’re going to look at APIs at the edge of modern cloud apps, and dive into how they function, how they can be secured, and what lessons we can learn from examples of poor execution in the market.

Understanding Modern Cloud Applications

Modern cloud applications might look like a singular service, but in actuality, they are often driven by an ecosystem of services, applets, and systems that make up a cohesive offering. To fully understand how to secure these systems, it helps to start with some basic definitions.

What Is Cloud Computing?

Firstly, we should consider what it means to be “in the cloud.” When we refer to the cloud, what we are really referring to is a collection of computational and storage resources that are physically located outside the service in question. Whereas traditional offerings often had local servers – sometimes in the same building! – computing and storing data, cloud computing has moved this nexus of storage and compute power away from the requester and the provider to a remote location.

Cloud applications leverage this technology through two distinct design concepts – the frontend and the backend. The frontend is the client's sole entry point into the system: it is where they make requests, log in, and so on. The backend, owned by the provider, contains the business logic and code that orchestrates cloud execution and resource allocation.
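
To make that split concrete, here is a minimal sketch in TypeScript – the `/api/login` route, port, and token payload are all illustrative stand-ins, not any particular provider's API:

```typescript
import { createServer } from "node:http";

// Backend: holds the business logic. The client never sees this code,
// only the HTTP surface it exposes.
const server = createServer((req, res) => {
  if (req.method === "POST" && req.url === "/api/login") {
    // Orchestration and resource allocation would happen here.
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ token: "example-session-token" }));
    return;
  }
  res.writeHead(404);
  res.end();
});
server.listen(3000);

// Frontend: its sole entry point into the system is the exposed endpoint.
async function login(username: string, password: string) {
  const res = await fetch("http://localhost:3000/api/login", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ username, password }),
  });
  return res.json();
}
```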

For the typical workflow, this is good enough. But what happens when that centralization becomes a blocker?

Defining Edge Computing

In some cases, processing this data in the cloud is desired, but the centralization of resources in a remote location is problematic. This is a consistent problem for services that require real-time or near real-time processing – when milliseconds make the difference, shipping blobs of data to a remote server for processing can be nearly as limiting as the traditional monolithic structure. The question then becomes where this data processing can actually happen so that it both meets the need and preserves the benefits of cloud computing and storage.

Edge computing solves this problem by distributing the infrastructure that handles the data, locating storage and processing closer to where the requests arise. In other words, much of edge computing eschews the traditional cloud computing approach of a centralized warehouse, instead storing and processing data on systems at the edge of the user's network. Put another way, edge computing is kind of like a corner store, while cloud computing is a bit like a grocery delivery service – centralized and low friction, but delayed by distance.

The Role of APIs in Cloud and Edge Applications

In order to facilitate this new paradigm of data service, providers have once again turned to the everyday hero of modern data and services – the API.

APIs drive cloud computing solutions, with RESTful APIs, WebSockets, and other technologies efficiently transmitting data, triggering new resource creation, orchestrating on-demand servers and serverless deployments, and much more! The API is the workhorse of cloud computing, serving as the infrastructural solution for both transport and orchestration.
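
As a rough sketch of that dual role – both moving data and orchestrating resources – consider the following, where the base URL, endpoints, and payloads are hypothetical stand-ins rather than any real provider's API:

```typescript
// Hypothetical control-plane endpoints: real providers expose
// equivalents through their own REST APIs and SDKs.
const BASE = "https://api.example-cloud.test/v1";

// Transmitting data: a plain RESTful write.
async function uploadObject(bucket: string, key: string, data: Uint8Array) {
  await fetch(`${BASE}/buckets/${bucket}/objects/${key}`, {
    method: "PUT",
    headers: { "Content-Type": "application/octet-stream" },
    body: data,
  });
}

// Triggering new resource creation: the same API style can stand up
// on-demand servers or serverless functions.
async function provisionFunction(name: string, runtime: string) {
  const res = await fetch(`${BASE}/functions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name, runtime }),
  });
  return res.json(); // e.g. { id, status: "provisioning" }
}
```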

Where these APIs play an especially important role is in edge computing. Edge computing is, by its nature, a sort of microservice-oriented take on cloud computing. Instead of centralizing everything into a single body on a remote cloud server, edge computing distributes these resources, relying on APIs to manage the flow of information, the collation of responses, and the overall organization and orchestration of the greater system to effectively serve requests.
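
One common shape this takes is an edge handler that answers from resources near the user and only reaches back to the central cloud when it must. Here is a minimal sketch using the standard Request/Response web APIs shared by most edge runtimes – the origin URL and in-memory Map are placeholders for a real backend and edge-local store:

```typescript
const ORIGIN = "https://origin.example.test"; // placeholder central backend
const localCache = new Map<string, Response>(); // stand-in for edge-local storage

export async function handleRequest(req: Request): Promise<Response> {
  const key = new URL(req.url).pathname;

  // Serve from the edge node itself when possible: no round trip to the cloud.
  const cached = localCache.get(key);
  if (cached) return cached.clone();

  // Otherwise fall back to the centralized backend and remember the answer.
  const upstream = await fetch(`${ORIGIN}${key}`, {
    method: req.method,
    headers: req.headers,
  });
  if (upstream.ok && req.method === "GET") {
    localCache.set(key, upstream.clone());
  }
  return upstream;
}
```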

But deploying APIs in this way comes with both benefits and challenges.

Benefits of Edge APIs

Edge APIs are located closer to where the resources and requests live, and because of this, they typically offer much better performance and reduced latency at scale. This improvement in response time matters enormously in a field where gains are often measured in milliseconds, offering high efficiency and throughput regardless of where requests originate.
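
If you want to see that gap for yourself, a rough timing sketch like the one below works against any pair of endpoints – both URLs here are placeholders:

```typescript
// Rough round-trip timing: a real benchmark would average many samples
// and separate DNS/TLS setup time from request time.
async function timeRequest(url: string): Promise<number> {
  const start = performance.now();
  await fetch(url, { method: "HEAD" });
  return performance.now() - start;
}

async function main() {
  const edgeMs = await timeRequest("https://edge.example.test/ping");     // placeholder
  const originMs = await timeRequest("https://origin.example.test/ping"); // placeholder
  console.log(`edge: ${edgeMs.toFixed(1)} ms, origin: ${originMs.toFixed(1)} ms`);
}

main();
```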

These systems also deliver increased scalability and flexibility, as they decentralize resource demands away from a central environment and instead locate them where the requests and interactions live. This makes standing up a new function much simpler: resources are allocated near where they are likely to be used, reducing overhead and the “wait time” of idle, unused resources.

Ultimately, this also opens up new opportunities for security and privacy by keeping data closer to its originator – instead of sending data out to be processed in some distant data warehouse, it can be processed local to the requester, in theory reducing the number of potential prying eyes!
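
One sketch of what that can look like in practice: aggregate or strip sensitive fields at the edge, so only an anonymized summary ever travels to the central cloud. The event shape and upstream endpoint below are hypothetical:

```typescript
interface RawEvent {
  userId: string;   // sensitive: never leaves the edge in this sketch
  location: string; // sensitive
  latencyMs: number;
}

// Process data close to where it originates: aggregate locally and
// forward only the anonymized summary to the central cloud.
async function flushEvents(events: RawEvent[]) {
  const summary = {
    count: events.length,
    avgLatencyMs:
      events.reduce((sum, e) => sum + e.latencyMs, 0) / Math.max(events.length, 1),
  };
  await fetch("https://origin.example.test/metrics", { // placeholder endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(summary), // raw userId/location stay on the edge node
  });
}
```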

Drawbacks of Edge APIs

All of that said, edge APIs do carry some significant risks and drawbacks. First and foremost, they are much more complicated to build and run – they require development that accommodates a wide variety of environments and network conditions, and when those conditions are inconsistent, the codebase can grow complex.

This also introduces a much greater need for visibility. Poor visibility into edge services is all but a death knell: it can quickly erase all of the benefits above while leaving a wider attack surface that is harder to see and thus harder to mitigate.

Case Studies

One of the best examples of edge computing on the modern web can be found in Meta’s game streaming efforts. Game streaming is a complex undertaking, requiring on-device rendering, server-side streaming, cloud compute, and more. Meta has leveraged edge computing to deliver stability and high performance in their gaming products, dramatically reducing latency and improving quality on device:

"The best way for our cloud gaming infrastructure to provide great latency for players is to bring it as close to them as possible in terms of network distance. Meta’s data centers alone cannot provide the level of ultra-low latency we require for cloud gaming. So we rely on edge computing, where we bring the cloud gaming infrastructure closer to players by deploying in edges that are in metropolitan areas close to large populations.

As we increase the number of edge computing sites, we can also improve latency for players.

Our goal within each edge computing site is to have a unified hosting environment to make sure we can run as many games as possible as smoothly as possible. Today’s games are designed for GPUs, so we partnered with NVIDIA to build a hosting environment on top of NVIDIA Ampere architecture-based GPUs. As games continue to become more graphically intensive and complex, GPUs will provide us with the high fidelity and low latency we need for loading, running, and streaming games.

To run games themselves, we use Twine, our cluster management system, on top of our edge computing operating system. We build orchestration services to manage the streaming signals and use Twine to coordinate the game servers on edge."

This approach means a better user experience with higher quality, but it also introduces more complexity and moving parts that edge providers often have limited control over. Meta notes this in their documentation, pointing towards improvements in edge and cloud computing that are outside their control:

"As we scale our cloud gaming platform, we’re continuously working to upgrade and improve our cloud gaming infrastructure. [...] We’re also working with mobile network operators and carriers to significantly improve the latency in their access network. On the hardware end, we’re working with chipset makers to improve latency in user devices. We’re also working on new container technologies to provide better streaming efficiency. And, of course, there will always be a continuous push on security as the system grows and improves."

Best Practices for Implementing Edge APIs

Security is obviously a prime consideration in this field – luckily, there are some best practices that you can implement today to ensure the health and future of your edge and cloud APIs.

  1. Design with edge and cloud computing in mind. It’s no longer good enough to make an internet-connected product that is “sort of cloud connected” – if you’re going to do it, dive in headfirst and invest in a proper support apparatus. By approaching development as cloud-first or edge-first, you commit to the model end to end and set yourself up for success.
  2. Consider optimization. Deploying low-latency frameworks and high-efficiency data solutions helps shore up the parts of the system still subject to latency and throughput limits, even in a cloud and edge model. These systems are the sum of their parts, so target the weakest pieces first to see the biggest gains across the implementation as a whole.
  3. Keep in mind that distributed does not mean inherently secure. APIs must still be developed with a security mindset. Ensure encryption at rest and in transit, and ensure that you are using adequate authentication and authorization schemes to keep your environment safe and reliable. Utilize trusted solutions and continually audit to stay compliant.
  4. Invest in observability and visibility. Edge and cloud computing are, by their fundamental natures, heavily impacted by visibility issues, so find a trusted partner that can give you audit trails, visibility, deep context, and other systems to actually understand your holistic security posture. This will only become more important as emerging technologies such as LLMs, high-powered mobile devices, and Internet of Things devices become ubiquitous. Develop for the future, not the present, and you will surely be ahead of the curve. A minimal sketch combining this point with the previous one follows below.
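
As promised, here is a minimal sketch of points 3 and 4 together: an edge handler that authenticates every request and emits a structured audit record. The token check is a placeholder – a real deployment would verify signed JWTs or mTLS identities and ship logs to a real pipeline:

```typescript
const VALID_TOKENS = new Set(["example-token"]); // stand-in for a real verifier

export async function secureHandler(req: Request): Promise<Response> {
  const started = Date.now();
  const token = req.headers.get("Authorization")?.replace("Bearer ", "");
  const authorized = token !== undefined && VALID_TOKENS.has(token);

  const res = authorized
    ? await handleBusinessLogic(req)
    : new Response("Unauthorized", { status: 401 });

  // Structured audit trail: one JSON line per request, ready for
  // whatever log pipeline gives you visibility across edge nodes.
  console.log(JSON.stringify({
    ts: new Date(started).toISOString(),
    method: req.method,
    path: new URL(req.url).pathname,
    authorized,
    status: res.status,
    durationMs: Date.now() - started,
  }));
  return res;
}

async function handleBusinessLogic(_req: Request): Promise<Response> {
  return new Response(JSON.stringify({ ok: true }), {
    headers: { "Content-Type": "application/json" },
  });
}
```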

The Bottom Line

Understanding edge and cloud computing is vital to understanding cloud applications. Thankfully, with a few proven approaches and a firm understanding of the underlying technology, you can implement these solutions at scale in a secure, reliable, and resilient manner.

It’s important to find a partner you can trust to deliver visibility, the most valuable asset in cloud and edge computing. Thankfully, FireTail can deliver this, unlocking massive visibility, auditing, security, and overall context in edge and cloud applications. To find out how you can secure your APIs and cloud applications with FireTail, schedule a 30-minute demo with us today!