There are numerous advantages of using microservices over the monolithic application structure.
Unlike monoliths, though, microservices don't yet have established development patterns.
Many problems remain unsolved, and we have yet to witness the emergence of de-facto standards for "the microservices way" of development.
Testing is no exception. For monoliths there is unit testing, component testing, and integration testing. The boundaries are clear, and so is the way to write the tests.
What about microservices?
Say you use REST over HTTP(S) as the communication layer between your microservices.
In a typical application a (micro)service has a set of dependencies, often other (micro)services.
As in unit testing, the first idea that comes to mind is mocking.
But what's a good way to mock microservices?
Or should you always run the real instance of the dependency with the test data (or fixtures) to support testing?
We thought of another way.
For the microservices reference application [1] we defined multiple levels of tests:
1. Unit tests. This is the familiar unit testing of the application code; there is not much to say here, as the details depend on the implementation language.
2. Service tests. Test the service without its external dependencies, using data fixtures in their place (a brief sketch follows this list).
3. Container tests. Test the service as a container. This includes controlled injection of (mocked) dependencies and testing the behaviour of the service under different circumstances, as well as testing the exposed API.
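To make the second level more concrete, here is a minimal sketch of such a service-level test. The service class, repository interface and fixture data are illustrative rather than taken from the reference application; the point is that the external dependency is swapped for a stub serving fixture data.

import unittest
from unittest.mock import Mock

# Illustrative fixture data; not taken from the reference application.
ACCOUNT_FIXTURES = [
    {"id": "1", "username": "alice"},
    {"id": "2", "username": "bob"},
]

class AccountsService:
    """Minimal stand-in for a service that depends on a data store."""
    def __init__(self, repository):
        self.repository = repository

    def list_accounts(self):
        return self.repository.find_all()

class AccountsServiceTest(unittest.TestCase):
    def test_list_accounts_returns_fixture_data(self):
        # Replace the real database-backed repository with a stub that serves
        # fixture data, so no external dependency has to be running.
        repository = Mock()
        repository.find_all.return_value = ACCOUNT_FIXTURES

        service = AccountsService(repository)
        self.assertEqual(len(service.list_accounts()), 2)

if __name__ == "__main__":
    unittest.main()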
API specification and testing endpoints
If you are serious about continuous integration of your microservice zoo, you should consider writing a specification for each API.
Having a specification allows you to establish a contract between the API producer and its consumers.
This is an essential step towards maintainability and continuous integration.
We chose OpenAPI (Swagger) to describe our microservices.
Now that we have the spec, the first logical step is to integrate automated API testing into our testing workflow.
For this, we chose an outstanding tool, Dredd [2].
Dredd is simple and effective.
It takes your Swagger (or API Blueprint) specification and an endpoint that serves an API which should comply with that specification.
It then runs the tests against this endpoint and makes sure that it acts exactly the way the specification describes.
This is crucial: now we have a way to automatically validate our APIs.
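Before wiring this into containers, the idea is easy to try locally: point the dredd CLI at the spec and at a running instance of the service. A minimal sketch (the spec path and port are illustrative):

import subprocess

# Minimal sketch: validate a locally running service against its Swagger spec.
# The spec path and endpoint URL are illustrative.
result = subprocess.run(["dredd", "accounts/accounts.json", "http://localhost:8080"],
                        capture_output=True, text=True)
print(result.stdout)
# Dredd reports "0 failing" and "0 errors" when the API matches the specification
assert "0 failing" in result.stdout and "0 errors" in result.stdout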
We use containers for running our microservices, and also for running our test suite.
Each level of testing is a directory containing the set of tests for that level.
Let's take a look at the container-level API test:
def setUp(self):
    # Start the accounts service as a detached container before each test
    command = ['docker', 'run',
               '-d',
               '--name', AccountsContainerTest.container_name,
               '-h', AccountsContainerTest.container_name,
               'weaveworksdemos/accounts:' + self.TAG]
    Docker().execute(command)
    [...]

def test_api_validated(self):
    # Run Dredd against the running container and check that the report
    # shows no failing or erroring transactions
    out = Dredd().test_against_endpoint(
        "accounts/accounts.json",
        AccountsContainerTest.container_name,
        "http://accounts/",
        "mongodb://accounts-db:27017/data",
        self.mongo_container_name)
    self.assertGreater(out.find("0 failing"), -1)
    self.assertGreater(out.find("0 errors"), -1)
    print(out)
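Not shown above is the cleanup. A tearDown along these lines (a sketch reusing the same Docker helper; the actual test class may differ) removes the service container after each test:

def tearDown(self):
    # Stop and remove the service container so every test starts from a clean slate
    Docker().kill_and_remove(AccountsContainerTest.container_name)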
In test_api_validated we run Dredd against the API endpoint. The Dredd helper looks like this:
class Dredd:
    image = 'weaveworksdemos/openapi'

    # Start the testing container and run it against the endpoint
    def test_against_endpoint(self, json_spec, container_name, api_endpoint,
                              database_url, database_container_name):
        command = ['docker', 'run',
                   '-h', 'openapi',
                   '--name', 'openapi',
                   # make the service and its database reachable from the Dredd container
                   '--link', container_name,
                   '--link', database_container_name,
                   # hand the database location to the hooks (the variable name is illustrative)
                   '-e', 'MONGO_ENDPOINT=' + database_url,
                   Dredd.image,
                   "/usr/src/app/{0}".format(json_spec),
                   api_endpoint,
                   "-f",
                   "/usr/src/app/hooks.js"]
        out = Docker().execute(command)
        Docker().kill_and_remove('openapi')
        return out
The routine starts the Dredd container, giving it the location of the spec and the endpoint of the running API.
Dredd is supplied with the hooks.js file that seeds the database with fixtures for the service.
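Our hooks file is written in JavaScript, but to give a flavour of what the seeding step does, here is a hypothetical equivalent using Dredd's Python hooks support (the dredd-hooks package) and pymongo; the connection string and document are illustrative:

import dredd_hooks as hooks
from pymongo import MongoClient

# Hypothetical illustration: the reference application ships a JavaScript hooks.js;
# this is roughly what an equivalent hook written in Python could look like.
@hooks.before_all
def seed_database(transactions):
    # Insert a known document so the accounts endpoints have data to respond with.
    # The connection string and document shape are illustrative.
    db = MongoClient("mongodb://accounts-db:27017/")["data"]
    db.accounts.insert_one({"username": "testuser", "email": "testuser@example.com"})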
You can learn more about the Dredd Docker image we created in our repository [3] and read more about Dredd hooks in the documentation [4].
With this workflow we have defined testing levels for our microservices and integrated API endpoint testing against the specification into our continuous integration pipeline.
There's a lot more work to do.
For example, it would be nice to introduce versioning for the API.
Also, we currently have to write and update the specification manually, which gets tedious fast. But it's a necessary evil: we use different technologies for our microservices and haven't achieved full automation yet.
But it's a good start and gives us more confidence as we keep deploying our services.
[1] http://thenewstack.io/start-socks-towards-cloud-native-reference-application/
[2] https://github.com/apiaryio/dredd
[3] https://github.com/microservices-demo/microservices-demo/tree/master/openapi
[4] https://dredd.readthedocs.io/en/latest/hooks/