Platform Update: Build the Future on Akamai
Today, technology is infused into nearly everything we do. The data behind personalized recommendations, connected devices, and wearables has changed how we engage with the world around us — whether we’re driving to a new destination, purchasing from a new retailer, or monitoring our health.
This means that businesses depend on computer-driven processes — and, more specifically, on the developers who design, maintain, and innovate on their infrastructure and applications — to engage ever-wider audiences and compete for market share. Developers are building the future.
The applications developers are building are increasingly distributed, intelligent, and complex. Modern experiences are more social and interactive — incorporating audio, video, augmented reality, and virtual reality. These new architectures require massive amounts of compute power, scale, and intelligence. As increasingly real-time, visceral, individualized experiences are created, the amount of data that must be processed has grown exponentially. But it’s not just the compute power that is novel; it is the need to distribute that power from the core to the edge that Akamai is looking to address.
Quality of experience is critical to capturing and retaining users. Lyn Cantor, CEO of Sandvine, describes it this way: “App Quality of Experience is a critical part of service providers’ brands, as consumers and enterprises care most about how their apps are performing.” Akamai’s expertise has long been in combining network optimization and application performance with edge computing and cybersecurity solutions. As the company that powers and protects life online, we appreciate that in order to create the best experience, businesses must balance resilience, speed, and security.
As we evolve Akamai to manage the applications of the future, there are two critical capabilities that we believe will attract developers to build on Akamai. The first will be a continuum of compute, built on the simple, accessible compute platform that Linode has developed, distributed across Akamai’s global network, and integrated with our serverless edge computing technology. The second will be an intelligent app optimization fabric that will seamlessly connect Akamai’s compute offerings with users, devices, and the cloud.
Today, 37% of enterprises say their annual cloud spend exceeds $12 million, and 80% report that it exceeds $1.2 million. That spend is tied up in core compute resources that still leave a performance gap with users — one that widens the farther users are from centralized cloud data centers. Alternative cloud providers have been creating ways for developers to save on some of these costs, but they struggle to deliver the scale and full complement of support and service resources required by business-critical applications.
Not surprisingly, cloud cost savings is the top initiative organizations report, and skills gaps continue to threaten competitiveness and future growth as organizations migrate more workloads to the cloud. Those trends have created a significant market opportunity for Akamai. Linode has developed a compelling offering, built on open source, with excellent price performance. This means that more developers can build on Linode without the burden of proprietary implementations or the vendor lock-in they create. And as we distribute it across Akamai’s global network, the massive scalability, globally distributed points of presence, and integrations with tier 1 transit providers will provide unparalleled reach for the apps built on Akamai.
Beyond distribution, developers will increasingly require intelligence to remain agile. Akamai is building a capability to intelligently optimize applications based on intent data. The fabric will allow developers to chain services in order to seamlessly blend edge and cloud across functions, containers, and virtual machines. The service will connect to a business's existing service registries, discover and dynamically update applications, and learn the performance patterns of each. So, if a use case requires acceleration to maximize performance, or localized data management to comply with data sovereignty requirements, Akamai will intelligently apply optimizations to meet business needs.
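To make the idea of intent-driven optimization concrete, here is a minimal sketch of how a placement decision like the ones described above might work. This is purely illustrative: Akamai has not published an API for this fabric, and every name here (`Intent`, `Location`, `place`) is a hypothetical stand-in, not a real product interface.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical sketch of intent-driven placement. None of these names
# come from a published Akamai API; they only illustrate the concept of
# matching a workload's stated intent against available locations.

@dataclass
class Intent:
    max_latency_ms: Optional[int] = None  # acceleration: latency budget
    data_region: Optional[str] = None     # sovereignty: required region

@dataclass
class Location:
    name: str
    region: str
    latency_ms: int  # observed latency from this location to the user

def place(intent: Intent, locations: List[Location]) -> Location:
    """Return the lowest-latency location that satisfies every
    constraint stated in the intent."""
    for loc in sorted(locations, key=lambda l: l.latency_ms):
        if intent.data_region and loc.region != intent.data_region:
            continue  # sovereignty: data must stay in the required region
        if intent.max_latency_ms and loc.latency_ms > intent.max_latency_ms:
            continue  # acceleration: latency budget exceeded
        return loc
    raise ValueError("no location satisfies the stated intent")

edge = Location("edge-fra", region="eu", latency_ms=8)
core = Location("core-us-east", region="us", latency_ms=95)

# A latency-sensitive workload is steered to the edge location,
# while a US data-residency requirement pins it to the core region.
assert place(Intent(max_latency_ms=50), [core, edge]) is edge
assert place(Intent(data_region="us"), [core, edge]) is core
```

In a real fabric the `locations` list would come from the service registries mentioned above, and `latency_ms` from the learned performance patterns of each application, rather than hard-coded values.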
These capabilities will allow more developers to take advantage of compute capabilities from the cloud to the edge to attack “unicorn” problems. This was one of the reasons that Macrometa chose Linode, and why their CEO is excited about the distributed compute that we are building. The combination of Akamai’s scale and network has helped us create two fast-growing compute and security businesses. As we continue to evolve Akamai into the world’s most-distributed compute platform, we are excited to see developers build the future on Akamai.