Building modern internet services securely in the age of APIs
The modern web is more connected than ever before. With so many services connecting to one another to provide functionality we could barely imagine a few decades ago, securing these systems has often fallen by the wayside in the rush to ship the newest and brightest features. Now, with so many of these services becoming a fundamental part of our day-to-day lives, security – and, relatedly, privacy – has become just as important to the average consumer as product purpose and fit.
Today, we’ll look at how this system came to be, and begin to define the current state of interconnected systems. We’ll look at some specific considerations for securing these systems, and then offer some best practices for securing modern internet services now and into the future.
In the early days of the internet, everything operated within a small set of servers. Scientists and researchers had created an interesting messaging protocol, but these systems were akin to city-states – they existed on their own, with every service holding its own resources and its own systems in order to serve a specific purpose.
When the number of people using the nascent internet could be counted in the hundreds, this was an acceptable way of doing things. As this number grew, however, the systems serving the growing populace of connected end users became more complex. All of a sudden, services that were only ever designed to serve tens of people were asked to serve thousands.
Throughout the development of the early internet – and the related World Wide Web that would use the internet to connect users across the world – these systems came to be defined by a singular term.
The “monolith” was everything you could ever want under the same service – a “one stop shop” where everything is centralized and packaged as a single offering.
This approach granted incredible control, allowing the process owner to control a singular body of services and systems. What it did not grant, however, was extensibility and scalability.
The benefit of the monolith was having everything in one package. The great drawback of the monolith, however, was – well, having everything in one package. When you utilize a monolithic architecture, you package everything together in an interconnected system. What this means in practice is that any new development, any new demand, any new iteration, all of these must occur across the entire system simultaneously.
If a developer wanted to add a new function to the collection of services, every service had to be aware of it. If they wanted to add a new server, all systems had to know of the server, or at least know how to route requests to it. Development crawled along, and with increased complexity came increased demands and inefficiencies in the base approach.
The solution was simple – break the functions out into smaller, individual components of the monolith, each self-contained and able to reference the others, so that new systems could be added, scaled, or removed as needed.
This approach flipped the paradigm of the monolith on its head. Whereas the monolith was centralized and systemically linked to a singular system, the microservice architecture was distributed, pushing for additional services and systems instead of a singular owned monolithic body of code and service.
This change was much more than technical, however – it fundamentally altered the relationship of systems to other systems. The monolith was siloed – everything was a city-state, an isolated, walled-off kingdom that operated on its own without regard to other services. While this allowed for a great amount of control, it resulted in a system that was always constrained by the walls of the monolith. Everything was siloed into itself, and communication with other systems was poor by default.
The microservice architecture, on the other hand, was connected, not siloed. Microservices don’t just communicate well – they communicate out of necessity. Because nothing gets done in the microservice realm without lines of communication, these systems communicate externally just as well as internally, leveraging standards and protocols to generalize and standardize these interactions.
What we got out of this was something akin to a “metaservice” – once microservices could connect to other services, they formed a sort of ecosystem of services.
While this brought many benefits, it also introduced some new risks.
One important element of the microservice is its relation to the API. In modern microservice design, APIs are not just pieces of the greater whole – they are the fundamental building block that makes the service possible. Without APIs, microservices are just separate parts, neither complete enough to offer a full service nor connected enough to be made useful. With APIs, the whole is greater than the sum of its parts.
This is vitally important to remember – APIs are not just "nice to have" infrastructure, but are instead the very lifeblood of the system. Accordingly, they demand a certain amount of caution, awareness, respect, and understanding to truly be useful and aligned with the business case.
In the same way that a city-state comes with its own heightened security protocols and guardrails, the monolith had a certain built-in security due to the relative seclusion of its siloed systems. When microservices became more popular, and interconnected and interdependent services more common, there were suddenly many more ways to be attacked, many more avenues to secure, and many more ways for failures to cascade.
Consider an API microservice that provides booking information for a hotel. In a typical request, this microservice might touch user provisioning APIs, multiple hotel APIs, geolocation APIs, advertising APIs, and much more. In this flow, there could be hundreds of potential attack vectors, to say nothing of out-of-date security standards, bad configurations, and so on.
Accordingly, interconnected systems come with a substantial trade-off – the attack surface is greater than the monolith’s (in theory, at least – the monolith also put all of its eggs in one basket, so to speak), but so are the benefits.
The need to secure interconnected systems should be thought of as an absolute requirement for the modern web rather than a “nice to have.”
With that in mind, let’s look at some best practices for securing modern internet services. We’ll start with the most obvious ones.
Authentication and authorization are a huge part of securing any system, but with interconnected microservices, a higher level of quality and trust is required. Interconnected systems are only as secure as their weakest link, so these authentication and authorization systems must be battle-tested, well-designed, and limited in scope to prevent privilege escalation attacks from succeeding.
Remember that authorization and authentication are very different things. Authentication proves that somebody is who they say they are; authorization proves that they have the right to access or do something. Using one as the other, or blending the two use cases, makes for insecure processes – accordingly, keep this division in mind and plan your design ahead of time to segment domains of interest, secure vital systems, and intelligently prevent attacks.
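To make the distinction concrete, here is a minimal sketch in Python – all of the names (the token lookup, the roles, the permission strings) are illustrative assumptions, not taken from any particular framework. Authentication answers “who are you?”, authorization answers “may you do this?”, and the two run as separate, ordered checks:

```python
# Minimal sketch: authentication and authorization as two distinct checks.
# All names here (known_tokens, ROLE_PERMISSIONS, the permission strings)
# are illustrative, not from any real framework.

ROLE_PERMISSIONS = {
    "guest": {"read:listings"},
    "manager": {"read:listings", "write:listings"},
}

def authenticate(token: str) -> dict:
    """Authentication: prove the caller is who they claim to be.
    Faked here with a lookup; a real system would verify a signed
    credential such as a session token or JWT."""
    known_tokens = {"abc123": {"user": "alice", "role": "manager"}}
    identity = known_tokens.get(token)
    if identity is None:
        raise PermissionError("authentication failed: unknown token")
    return identity

def authorize(identity: dict, permission: str) -> None:
    """Authorization: prove the already-authenticated caller may
    perform this specific action."""
    granted = ROLE_PERMISSIONS.get(identity["role"], set())
    if permission not in granted:
        raise PermissionError(f"authorization failed: missing {permission}")

# The two checks run in order, and neither substitutes for the other.
identity = authenticate("abc123")       # who are you?
authorize(identity, "write:listings")   # may you do this?
```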
Your security posture is ultimately a snapshot of a current state. Security can wax and wane according to new threats, requiring security experts to stay current and aware. Staying up to date on new threat vectors, updating systems to their more secure variants, and being aware of what is out there is a huge part of the equation.
Security is like a strong fence. Fences ultimately fall without constant maintenance – boards break, nails and screws shear, and so on. Just as a fence must be constantly maintained, so too must the staff and systems underpinning your posture.
Encryption is an incredibly strong option for securing interconnected systems. Encrypting data both at rest (meaning in storage) and in transit (as it is sent or received) secures it against external eyes, mitigating threats no matter how exposed the data is.
A big caveat here is that your encryption is only as good as your protocol. Just as a physical key is only as good as its cut – the teeth, dimples, and profile – encryption is only as strong as the protocols and implementations behind it. Accordingly, choosing the right systems will determine the effectiveness of your encryption approach.
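As a concrete illustration, here is a minimal at-rest encryption sketch in Python. It assumes the third-party cryptography package, whose Fernet construction is a vetted, authenticated scheme – exactly the kind of proven protocol choice described above. In-transit encryption is usually delegated to TLS at the connection layer rather than written by hand:

```python
# At-rest encryption sketch using the third-party "cryptography" package
# (pip install cryptography). Fernet is an authenticated, vetted scheme;
# in-transit encryption is usually handled by TLS, not application code.
from cryptography.fernet import Fernet

# In production, the key comes from a managed secrets store, never code.
key = Fernet.generate_key()
fernet = Fernet(key)

ciphertext = fernet.encrypt(b"guest booking record")  # what gets stored
plaintext = fernet.decrypt(ciphertext)                # what gets read back
assert plaintext == b"guest booking record"
```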
Simple steps can be taken during the planning stages to deploy more secure systems. For instance, choosing adequate rate limiting systems, separating data domains clearly, or even deciding what data is useful to store or collect can pay huge long-term dividends in the cost, efficiency, and effectiveness of your security apparatus.
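To illustrate one of these planning-stage choices, here is a minimal token-bucket rate limiter sketch in Python, using only the standard library. It is a single-process illustration – a production deployment would typically keep this state in a shared store such as Redis so every instance of a service enforces the same limit:

```python
import time

class TokenBucket:
    """Minimal in-memory token-bucket rate limiter. A production
    deployment would usually keep this state in a shared store
    (e.g. Redis) so every service instance enforces one limit."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=5, capacity=10)  # ~5 requests/sec, bursts of 10
if not bucket.allow():
    print("429 Too Many Requests")
```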
Notably, this planning must be forward-thinking. A secure solution that works for today is only rated for today – you must constantly audit and revisit this process to ensure that you are developing for the reality of the system rather than its past.
Relatedly, consider a principle such as Zero Trust. Zero Trust is essentially an agreement that nothing, no matter where it comes from, is to be trusted unless it can be validated and verified. In such a system, every request is tested and validated, every auth token is checked, everything is suspect, and nothing is accepted at face value.
What this grants is several layers of security, enforcing double checks and trust validation. There is some inefficiency in such a process, but the gains in security far outweigh the loss in efficiency.
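As a sketch of what “validate every request” can look like in code, consider the following Python example. It assumes the third-party PyJWT package, and the handler name and claim checks are illustrative – the point is simply that the token is verified on every call, regardless of where the request originated:

```python
# Zero Trust sketch using the third-party PyJWT package (pip install pyjwt).
# The handler name and claim checks are illustrative assumptions.
import jwt

SECRET = "replace-with-a-managed-secret"  # from a secrets store in practice

def handle_request(token: str, payload: dict) -> dict:
    # No request is trusted on arrival: validate the token every time,
    # even for callers inside the network perimeter.
    try:
        claims = jwt.decode(token, SECRET, algorithms=["HS256"])
    except jwt.InvalidTokenError:
        raise PermissionError("request rejected: invalid or expired token")
    # Further checks (scopes, audience, issuer) would follow here.
    return {"caller": claims.get("sub"), "accepted": payload}
```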
You must be more aware of your system – and to get to that point, you must use effective tracking and monitoring solutions. Being able to define what you are tracking, and then track the resultant data across the lifecycle of the product, will give you unprecedented clarity over the entire system. This clarity is invaluable, and should be treated as a primary source of future development, iteration, and security opportunities.
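As a small illustration of the idea, here is a Python sketch of structured request logging using only the standard library. The event fields (method, path, status, duration) are illustrative choices, and a real deployment would ship these events to a central monitoring pipeline:

```python
# Structured request logging with only the standard library. The event
# fields (method, path, status, duration_ms) are illustrative choices.
import json
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("api.audit")

def logged(handler):
    """Decorator that records one audit event per request, even when
    the handler raises."""
    def wrapper(method: str, path: str, **kwargs):
        start = time.monotonic()
        status = 500  # assume failure until the handler says otherwise
        try:
            status, body = handler(method, path, **kwargs)
            return status, body
        finally:
            log.info(json.dumps({
                "method": method,
                "path": path,
                "status": status,
                "duration_ms": round((time.monotonic() - start) * 1000, 2),
            }))
    return wrapper

@logged
def get_booking(method, path, **kwargs):
    return 200, {"booking": "example"}

get_booking("GET", "/bookings/42")
```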
You must have a firm understanding of your systems – and to get that, you need to effectively inventory your APIs. Failing to document and track APIs will result in a security posture that is only effective for a minuscule part of your ecosystem. Accordingly, deploy an effective inventory management system to track APIs, endpoints, and services.
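What such an inventory tracks can start very simply. The Python sketch below shows a minimal hand-rolled registry – the fields and the unauthenticated-endpoint check are illustrative assumptions, and dedicated tooling would typically discover endpoints automatically:

```python
# Minimal hand-rolled API inventory. Dedicated tooling usually discovers
# endpoints automatically; these fields are illustrative assumptions about
# the minimum worth tracking per endpoint.
from dataclasses import dataclass, field

@dataclass
class ApiEndpoint:
    name: str
    method: str
    path: str
    owner: str                  # team accountable for the endpoint
    auth_required: bool = True
    tags: list[str] = field(default_factory=list)

inventory = [
    ApiEndpoint("list bookings", "GET", "/bookings", owner="booking-team"),
    ApiEndpoint("health check", "GET", "/healthz", owner="platform",
                auth_required=False, tags=["internal"]),
]

# An unauthenticated endpoint nobody knows about is a classic blind spot,
# so flag anything in the inventory that skips authentication.
for ep in inventory:
    if not ep.auth_required:
        print(f"review: {ep.method} {ep.path} has no auth requirement")
```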
Thankfully, much of this can be achieved by choosing and building a strong partnership with a security partner. FireTail is one such partner, providing features that can help any organization be more secure.
API security can be achieved with a trusted partner. If you’re interested in gaining a partner with proven success in the industry, reach out and request a free demo today!