

API analytics: an essential tool kit for creating value, improving performance

Success with almost any API effort hinges on several factors. Our 2013 Technology Vision lays out a “design” plan based on sound principles.



Analytics provide the means to scale a business, innovate and even strengthen a brand.

This also applies to APIs, or application programming interfaces: the sets of commands, functions and protocols programmers use when building software for specific operating systems.

Analytics help ensure the success of APIs by providing insights to assess value, measure success and improve performance.

In our 2013 Technology Vision, we identified a “Design for Analytics” approach to uncover critical business insights. Our researchers found that businesses frequently lacked the data needed to answer critical questions.

Designing for analytics starts with asking business managers the right questions. We then find answers by gathering the relevant data and feeding it into analytics.



Usage is the core driver of success for any API program. Accordingly, API Analytics treats APIs as products, a key mindset to adopt.

This raises a few related questions:

  • Is it the right product for the purpose?

  • Is it functioning the way it should?

  • Are people using it the way it was designed and intended?

  • Is it easy to use and differentiated from other API products?

For answers, our API Analytics goes beyond traditional trend and volume reports, which divulge only what happened. We dig deeper into more mission-critical questions: what is happening, and why?

For example, consider early detection of low API volumes. Then, add visibility into whether the problem stems from IT issues or from confusion due to poor documentation.

Such a combination allows for a speedier and more efficient resolution before the problem becomes widespread.
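As a rough illustration of that triage logic, the sketch below classifies a window of API calls by volume and error class. The log format, function name and thresholds are all hypothetical assumptions for the example, not features of any particular analytics platform.

```python
from collections import Counter

# Hypothetical request records: (endpoint, status_code) tuples parsed
# from an API gateway log. Thresholds are illustrative, not prescriptive.
LOW_VOLUME_THRESHOLD = 100   # calls per observation window
ERROR_RATE_THRESHOLD = 0.10  # 10% of calls

def diagnose(requests):
    """Classify a low-volume window as a likely IT issue (server-side
    5xx errors) or likely documentation/usability issue (client-side
    4xx errors)."""
    total = len(requests)
    if total >= LOW_VOLUME_THRESHOLD:
        return "healthy volume"
    # Bucket responses by status class: 2 -> success, 4 -> client, 5 -> server.
    statuses = Counter(status // 100 for _, status in requests)
    if statuses[5] / max(total, 1) > ERROR_RATE_THRESHOLD:
        return "low volume: probable IT/backend issue (5xx spike)"
    if statuses[4] / max(total, 1) > ERROR_RATE_THRESHOLD:
        return "low volume: probable integration confusion (4xx spike)"
    return "low volume: investigate demand or discoverability"

# A 4xx-heavy window points at client-side confusion rather than an outage.
calls = [("/orders", 200)] * 50 + [("/orders", 400)] * 20
print(diagnose(calls))
```

The point is the combination: volume alone flags that something is wrong, while the error-class breakdown hints at why, which is what shortens the resolution path.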

Analytics delivers the insights to improve an API product in terms of design, functionality, support and other factors, helping pave the way to success.


What constitutes a good API product varies. It often depends on perspectives along the “value chain,” which includes the professionals who design and operate APIs and the application developers who use the products.

Let’s break this chain down into a few key “links.”

  • API product managers—Oversee an API product and need to meet certain business goals. The goals vary, but a common thread is that success hinges on making the API product appealing to consumers.

  • API operations side—IT focuses on keeping APIs running and meeting service-level agreements. This requires real-time insights to troubleshoot performance and security problems, as well as long-term insights for capacity planning.

  • Application developers—The direct consumers of an API product. They determine whether to use an API within their applications by evaluating functionality, reliability and ease of use. If one or more of these factors fail the test, application developers may switch to another option or forgo the API altogether.


For better insight, application developers and API providers need to help each other.

Consider differentiating an API product by offering application developers direct visibility into performance and availability, along with explicit notification of issues or degradation.

Such insight helps application developers rule out provider issues when troubleshooting their applications. In return, application developers can collect error logs, communicate use cases and embed tracing capability so an individual API call can be followed from the deployed application.
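One common way to make a single call traceable end to end is to propagate a correlation ID on every request. The sketch below assumes a hypothetical `X-Correlation-ID` header convention; the header name and function are illustrative, not mandated by any specific API platform.

```python
import uuid

def outgoing_headers(existing=None):
    """Attach a correlation ID to an outbound API request, reusing any
    ID already present so the whole call chain shares one trace key."""
    headers = dict(existing or {})
    headers.setdefault("X-Correlation-ID", str(uuid.uuid4()))
    return headers

# First call in a transaction mints a fresh ID.
first = outgoing_headers()
# A downstream call within the same transaction reuses that ID, letting
# provider and developer line up their logs for a single request.
second = outgoing_headers(first)
assert first["X-Correlation-ID"] == second["X-Correlation-ID"]
```

With a shared ID in both parties' logs, the provider can answer "what happened to this specific call?" rather than only "what happened in aggregate?".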

Working together allows the API provider to offer better support and quicker resolution of issues. Consider a few nuggets of insight awaiting discovery:

  • Availability and performance monitoring

  • Volumes and trends

  • Access patterns

  • Visibility into the API end-to-end conversation
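To make those categories concrete, here is a minimal sketch that derives each one from a hypothetical in-memory request log. The record layout (hour, endpoint, status, latency) and the sample values are illustrative assumptions, not a real data model.

```python
from collections import Counter
from statistics import mean

# Hypothetical log records: (timestamp_hour, endpoint, status, latency_ms).
log = [
    (9, "/orders", 200, 42), (9, "/orders", 200, 55),
    (9, "/items", 500, 900), (10, "/orders", 200, 40),
    (10, "/items", 200, 48), (10, "/items", 200, 51),
]

# Availability: share of calls that did not fail server-side.
availability = sum(1 for _, _, s, _ in log if s < 500) / len(log)

# Volume trend: calls per hour.
volume_by_hour = Counter(hour for hour, *_ in log)

# Access patterns: which endpoints dominate traffic.
top_endpoints = Counter(ep for _, ep, *_ in log).most_common()

# Performance: mean latency per endpoint.
latency = {ep: mean(l for _, e, _, l in log if e == ep)
           for ep in {e for _, e, *_ in log}}

print(f"availability={availability:.2%}")
print("volume by hour:", dict(volume_by_hour))
print("top endpoints:", top_endpoints)
```

Each metric is a one-line aggregation once the log exists; the design work lies upstream, in deciding which fields to capture so these questions can be answered at all.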

The key takeaway: do not mistake access to metrics for a sufficient means of measuring and evolving an API program. Instead, start by identifying the questions, then design the analytics to obtain the right data for the answers.

Businesses getting started with API enablement may not see the immediate value of analytics. When the number of APIs is small and the use cases are well-known, analytics may not seem necessary.

But as use opens up and more APIs are created, organizations will quickly require a more industrialized model for analytics to properly measure and ensure success.


As maturity increases, API Analytics leverages a greater abundance of data and automation to pluck out relevant insights. This can answer a few questions:

  • Is the program meeting the defined goals?

  • How should we factor in changes in usage?

  • How can we effectively allocate resources and introduce updates?

We recommend a tiered maturity model of API Industrialization for Analytics:

  1. Ad-hoc—Initial API forays often start with ad-hoc development and siloed use within parts of the organization. There is no standard use of analytics or measures.

  2. Organized—Driven by an API strategy and business case with an identifiable business outcome, this stage requires a standard definition of the metrics and measures needed to quantify success.

  3. Tactical—A common organization directs API projects, working out a standard cost-benefit analysis along with the end-to-end instrumentation required to generate visualization and reporting.

  4. Critical—Mission-critical services are implemented via a mature API design, development process and platform. API analytics are integrated with IT systems for the insight-to-action automation needed to support and evolve them.

  5. Industrial—APIs serve as the fabric of business operations and the connected enterprise. These programs leverage predictive models and visibility beyond the API program to understand what is happening and why.

Ultimately, API Analytics becomes a proactive tool for forging a path to success. It’s an ongoing process, not a one-time assessment. And it all starts with a “design.”