Microservices is the prevailing architecture style for building software products from reusable APIs. The organizational and technical approaches it champions are a wiser version of the Service Oriented Architecture (SOA) of years past, one that also incorporates the Representational State Transfer (REST) architecture style, the bar by which the industry assesses the quality of API design.

A future blog post, The Failings of SOA, the Microservices Architecture Style as Successor, will tell an interesting story and provide the context for how we build software from reusable components today, and why.

Another related and forthcoming blog post is REST Architecture Style Adaptations for Practical API Design, a story advocating for “impure” adoption of REST out of necessity.

Just as with SOA, the microservices architecture style is broad in scope, recommending many organizational, architectural, and developmental best practices. This makes it a bit challenging to understand and even harder to achieve as an enterprise organizational competency. The purpose of the Microservices Scorecard is to distill this architecture style down to its essence as a means for introducing it to new adopters, and to provide a framework for assessing the degree to which an enterprise and its products adhere to the style.

The scorecard is based on the foundational work of James Lewis and Martin Fowler, Microservices: A Definition of This New Architectural Term.

Scorecard Philosophy

The scorecard enables a ballpark estimate of architecture adherence using easy-to-understand criteria that are organized and structured to facilitate gap analysis and the elaboration of remediation plans. The scorecard is purposefully straightforward to use: it avoids being overly prescriptive about implementation, does not imply there is a correct sequence in which to mature an architecture, and does not say which criteria of the architecture are most important to achieve. Past experience with SOA provides a good indication that the goals of organizations adopting an architecture style vary considerably, and it naturally follows that it is reasonable, in fact recommended, to customize an architecture based on unique organizational needs.

Key Terminology

Without being exhaustive, or exhausting to the reader, let us baseline a few terms. Terms are defined specifically for this scorecard, but you will recognize parts of their descriptions from the broader definitions found in industry for REST, software reuse, and so on.

  • Service. A logical grouping of APIs that are functionally cohesive, operating on or in the context of resource state that is informationally (data) cohesive. A service is a unit of ownership on the provider side and a unit of onboarding on the consumer side.
  • Resource. A collection of informationally cohesive data typically describing a noun, from a key business domain of an organization, as a hierarchical set of attributes. Resources are typically allocated to one-and-only-one service, with their ownership and onboarding shadowing that of their service.
  • API Contract. The technical documentation, possibly machine-readable, that informs client developers how to make an API call including request requirements, behavior to expect, and the range of responses returned. A good contract hides API implementation details from client developers.
  • API. The implementation of an API contract, remotely invoked over HTTP(S) by client-side software, with behavior semantics set by the HTTP verb and its point of application set by a resource URL, that is, the identity of the resource on which the API operates. An API is a unit of software reuse that is remotely accessed and executed on a host process outside of its caller (see the sketch following this list).
  • Software Library. A unit of software reuse that is typically invoked within the caller’s process and built from the same technology as its caller.
  • Practice Area. A logical grouping of tasks performed by an organization whose execution improves over time as the organization becomes more competent and mature.
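To make these terms concrete, below is a minimal sketch in TypeScript of a hypothetical customer resource and one API of the service that owns it. The resource shape, URL, and function name are illustrative assumptions, not part of the scorecard.

```typescript
// Hypothetical "customer" resource owned by a hypothetical Customer Service.
// All names and URLs below are illustrative only.

// Resource: informationally cohesive data describing a business noun,
// expressed as a hierarchical set of attributes.
interface Customer {
  id: string;
  name: { given: string; family: string };
  address: { street: string; city: string; postalCode: string };
}

// API contract, as a client developer would read it:
//   GET /customers/{id} -> 200 with a Customer body, or 404 if none exists.
// The HTTP verb sets the behavior semantics; the resource URL identifies
// the resource the API operates on.
async function getCustomer(id: string): Promise<Customer | undefined> {
  const response = await fetch(`https://api.example.com/customers/${id}`, {
    method: "GET",
    headers: { Accept: "application/json" },
  });
  if (response.status === 404) return undefined; // resource does not exist
  if (!response.ok) throw new Error(`Unexpected status ${response.status}`);
  return (await response.json()) as Customer;
}
```

The client developer codes only against the contract (the verb, the URL, and the response shapes); how the provider implements the lookup stays hidden behind it.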

Scorecard

The scorecard organizes criteria around practice areas akin to past SOA maturity models. For those unfamiliar with microservices architecture, practice areas provide a useful high-level understanding of its scope. Practice areas are purposefully ordered with those at the top being more intrinsic to the definition of microservices than those at the bottom, which tend to be better understood as new complementary practices of the day. Each practice area includes criteria that further characterize the properties of a microservices architecture. An organization can score the degree to which it meets each criterion, as the tallying sketch after the list illustrates:

  • Criterion Not Assessed (un-scored). The microservices initiative has not been assessed with respect to this criterion.
  • Criterion Met (2). The microservices initiative has largely or completely met this criterion.
  • Criterion Partially Met (1). The microservices initiative has made some noteworthy progress toward fulfilling this criterion but still has more work to do.
  • Criterion Not Met (0). The microservices initiative has made little to no progress meeting this criterion.
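To show how these levels could be rolled up into a score, here is a minimal sketch in TypeScript. The criterion names come from the scorecard itself, but the tallying scheme is an illustrative assumption rather than something the scorecard prescribes.

```typescript
// Score levels mirror the definitions above: 2 met, 1 partially met,
// 0 not met; un-scored criteria are excluded from the tally.
type Score = 0 | 1 | 2 | "not-assessed";

interface Criterion {
  name: string;
  score: Score;
}

interface PracticeArea {
  name: string;
  criteria: Criterion[];
}

// Sum points earned and points possible, skipping un-scored criteria.
function tally(areas: PracticeArea[]): { earned: number; possible: number } {
  let earned = 0;
  let possible = 0;
  for (const area of areas) {
    for (const criterion of area.criteria) {
      if (criterion.score === "not-assessed") continue;
      earned += criterion.score;
      possible += 2;
    }
  }
  return { earned, possible };
}

// Example: one practice area with two scored criteria and one un-scored.
const result = tally([
  {
    name: "Granularity and Reuse",
    criteria: [
      { name: "APIs are the primary means of software reuse", score: 2 },
      { name: "An open source contribution model enables reuse", score: 1 },
      { name: "Software libraries are shared as a secondary means of reuse", score: "not-assessed" },
    ],
  },
]);
console.log(`${result.earned} of ${result.possible} points`);
```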


Granularity and Reuse

A microservices architecture decomposes software into reusable APIs whose granularity facilitates building a variety of experiences through the mixing-and-matching of alternative API combinations.

  • APIs are more coarse-grained than in-process methods. APIs generally offer more behavior per invocation than would a typical local object-oriented method call. APIs do not require chatty interactions with client software.
  • API granularity enables client-side composition. Consumers like the size and functionality offered by APIs, finding them efficient building blocks for constructing their applications (see the sketch after this list).
  • APIs are the primary means of software reuse. Software reuse across the enterprise is achieved primarily through the consumption of APIs built once and used everywhere.
  • Software libraries are shared as a secondary means of reuse. Software libraries are utilized to achieve reuse as well, albeit at the lower level of abstraction found inside API implementations. These libraries are reused across organizational boundaries in good measure, with reuse facilitated by a developer portal, as is done for services.
  • An open source contribution model enables reuse. APIs are developed using an open source contribution model where code is contributed by members of the (internal and/or external) developer community at large.
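As a way to picture coarse-grained APIs and client-side composition, here is a minimal sketch in TypeScript. The order and shipping-estimate endpoints are hypothetical; the point is that each call returns a complete, useful chunk of behavior and the client composes a small number of such calls rather than making many chatty ones.

```typescript
// Hypothetical coarse-grained APIs; URLs and shapes are illustrative only.
interface OrderSummary {
  orderId: string;
  status: string;
  lines: { sku: string; quantity: number; unitPrice: number }[];
}

interface ShippingEstimate {
  orderId: string;
  estimatedDays: number;
}

// One coarse-grained call returns the whole order, line items included,
// rather than requiring a separate call per line item.
async function getOrder(orderId: string): Promise<OrderSummary> {
  const res = await fetch(`https://api.example.com/orders/${orderId}`);
  if (!res.ok) throw new Error(`Unexpected status ${res.status}`);
  return (await res.json()) as OrderSummary;
}

async function getShippingEstimate(orderId: string): Promise<ShippingEstimate> {
  const res = await fetch(`https://api.example.com/shipping-estimates/${orderId}`);
  if (!res.ok) throw new Error(`Unexpected status ${res.status}`);
  return (await res.json()) as ShippingEstimate;
}

// Client-side composition: two APIs are mixed into one experience, an
// order page showing the order and its delivery estimate.
async function buildOrderPage(orderId: string) {
  const [order, shipping] = await Promise.all([
    getOrder(orderId),
    getShippingEstimate(orderId),
  ]);
  return { ...order, estimatedDays: shipping.estimatedDays };
}
```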

Independence

In line with the “design for change” principle, a microservices architecture decomposes software into independent modules that enable graceful (less expensive) software evolution.

  • An API is independently releasable. Any API can be deployed to a production environment on its own, without a coordinated release of other APIs.
  • An API is independently replaceable. Any API can be changed or updated in a production environment without having to change or update other APIs.
  • An API is independently scalable. An API can service an increasing number of requests through software scaling that does not change the scaling of other APIs.
  • An API can be built from its own unique technology stack. The provider of an API is not subject to any technical limitations that constrain using their technology stack of choice.

Service Architecture

A microservices architecture carefully decomposes software functionality into APIs whose implementation is hidden from consumers and follows a layered architecture that further separates and encapsulates software according to service-oriented best practices.

  • Composition across API calls is transactionless. Distributed transactions are not used to manage rollback across API calls.
  • Each layer of the architecture follows a service-oriented decomposition. API implementations follow standard SOA best practices, including the allocation of software responsibilities to each layer SOA prescribes.
  • Communication mechanisms are lightweight. Minimal business logic is implemented in communications middleware. Instead, business logic is placed predominantly in one or more API implementation layers.
  • Software encapsulation minimizes the impact of change. Throughout the API implementation, software encapsulation is used to localize the impact of change, particularly for client software consuming APIs.
  • APIs have an explicit contract. A technical API contract written for client software developers is always available.
  • API contracts are designed first. API contracts are designed in advance of implementation. Contract design considers the service strategy and the interdependencies the API has within the portfolio of services an organization intends to offer.
  • API contracts are designed for change. API providers anticipate future requirements and the range of changes an API may need, then proactively design API contracts that can be evolved through backward-compatible changes (a sketch of such an evolution follows this list).
  • API contracts hide implementation details. API contracts provide an abstract interface for client software to consume that is API implementation independent. The underlying API implementation technology can be changed without impacting client software.
  • Features are added to API contracts only at the moment when there is a true client need. API features are introduced only as needed to avoid software “gold-plating” and prematurely committing to designs that might jeopardize backward compatibility in the future.
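To illustrate what a backward-compatible evolution can look like, here is a minimal sketch in TypeScript. The payment-account resource and its fields are hypothetical assumptions; the pattern of interest is adding an optional field rather than renaming or removing one that clients already depend on.

```typescript
// Version 1 of the response body the contract promises to clients.
interface PaymentAccountV1 {
  accountId: string;
  displayName: string;
  currency: string;
}

// The evolved contract adds an OPTIONAL field. Existing clients that
// ignore unknown fields keep working; new clients can opt in to it.
interface PaymentAccount extends PaymentAccountV1 {
  // Added later; optional so older payloads remain valid.
  billingAddress?: {
    street: string;
    city: string;
    postalCode: string;
  };
}

// A breaking change, by contrast, would rename or remove a field that
// clients already depend on (for example, dropping displayName),
// forcing every consumer to change at once.
```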

Alignment

Much as with SOA, a microservices architecture identifies and defines a set of services to build that are strategic to the enterprise. There is a clear linkage of services to their customer value proposition and to the value of building once and reusing.

  • Services are built around business capabilities. The functional domains of the business are used to organize and guide the decomposition of software into services.
  • Services are developed and delivered as products. The APIs of a service are treated as products in that (1) they are distinct offerings with a compelling value proposition for consumers, and (2) they are managed as first-class strategic corporate assets to develop and deliver.
  • API contract design is driven by consumer needs. API contract design focuses on meeting the requirements of consumer use cases, avoiding the inclusion of unnecessary functionality.

Ownership

Much as with SOA, a microservices architecture prescribes clear and consistent ownership of services.

  • Service team ownership is explicit and clear. The design and implementation responsibilities for the APIs of a service are allocated to a single team at any given time, affording the development organization a clear charter for completing software engineering and providing clear service provider accountability to its API consumers.
  • Service ownership is consistent over time. Service ownership is sufficiently stable over time, enabling teams to become world-class experts in the functional domains of the services they own, to be known service providers for these domains, and to take pride in development that leads to higher service quality.
  • Service ownership applies across the full lifecycle. Teams own services across their full lifecycle, from conception to implementation to operation to retirement, and all of the phases in between. The same team does the work required for each lifecycle phase of the services it owns.

Automation

A microservices architecture advocates for significant use of automation throughout development and software operation.

  • API builds are automated. The process used to turn source code and supporting artifacts into executable software is automated.
  • API testing is automated. The process used to validate API implementations is automated (a small test sketch follows this list).
  • API deployments are automated. The process used to deploy the executable software for an API is automated.
  • API failures are detected. The operational infrastructure detects API failures.
  • API failure recovery is automated. The operational infrastructure can automatically respond to an API failure, fixing or removing problems, returning API execution back to a nominal state.
  • API execution logging is robust. Logging provides sufficient visibility into the key events that happen during API execution.
  • API execution monitoring is robust. There are sufficient visualizations and/or other forms of monitoring that provide operational insight into API execution.
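As a flavor of what automated API testing can look like in a build pipeline, here is a minimal sketch in TypeScript. The health endpoint, its URL, and the expected response body are assumptions for illustration; the scorecard does not prescribe a particular test framework or endpoint.

```typescript
// Assumed: a CI job runs this script on every build and fails on a
// non-zero exit code. The endpoint and response shape are hypothetical.
import assert from "node:assert/strict";

async function testHealthEndpoint(): Promise<void> {
  const res = await fetch("https://api.example.com/health");

  // Contract expectations: a successful status and a JSON body that
  // reports the service as available.
  assert.equal(res.status, 200, "health endpoint should return 200");
  const body = (await res.json()) as { status?: string };
  assert.equal(body.status, "UP", "service should report itself as UP");
}

testHealthEndpoint()
  .then(() => console.log("health contract test passed"))
  .catch((err) => {
    console.error("health contract test failed:", err);
    process.exit(1); // non-zero exit fails the automated build
  });
```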

Governance

A microservices architecture advocates for balanced governance where a central authority and individual service providers share the responsibility for conceiving of and doing the right things for their customers and their enterprise.

  • Centralized and federated governance is optimally balanced. Cross-enterprise standards enable more successful business unit software development. Central governance delegates the right degree of standard-setting responsibility to business units, enabling business units to set best practices tuned to their unique needs, thereby allowing them more opportunity to own their own success than if standards were heavily dictated to them.
  • Federated governance of the design space enables local optimization. Design decisions are made at each level of the software stack by organizations best positioned to make them, driving up the suitability and quality of architecture and detailed designs.
  • Best practice adherence is achieved through tooling more than written standards. The enterprise invests sufficiently in tooling that ensures best practices are followed, preferring this automation to employees reading and following published standards. Wherever best practice adherence can be automated, the enterprise has the needed tooling (a small sketch of such a check follows this list).
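As one example of adherence enforced by tooling, here is a minimal sketch in TypeScript of a build-time check. The rule, that every API path must carry a version segment such as /v1/, is a hypothetical standard chosen for illustration, as are the paths being checked.

```typescript
// Hypothetical cross-enterprise standard: API paths begin with a version
// segment (for example /v1/). A build-pipeline check like this would run
// against each service's API contract and fail the build on violations.
const VERSIONED_PATH = /^\/v\d+\//;

function findUnversionedPaths(paths: string[]): string[] {
  return paths.filter((p) => !VERSIONED_PATH.test(p));
}

// Illustrative input: one compliant path and one that would be flagged.
const violations = findUnversionedPaths([
  "/v1/customers/{id}",
  "/orders/{id}", // missing version segment; would be flagged
]);

if (violations.length > 0) {
  console.error("Unversioned API paths:", violations);
  process.exit(1); // non-zero exit fails the automated build
}
```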