Writing unit tests is good: it forms the basis of high-quality development, ensuring that current and future code does not regress. But being able to analyse your unit tests is even better. That is the aim of Pytest-Monitor, a plugin for the Pytest framework that lets you analyse the resources consumed on the machine executing the tests, together with its companion project, Monitor-Server-API. At present, three resources are monitored and their history logged by this extension: run time, CPU usage, and memory consumption.
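As a sketch of what "logging the history" looks like in practice: by default the plugin writes its results to a SQLite file named `.pymon` in the project root. The table and column names used below (`TEST_METRICS`, `ITEM`, `TOTAL_TIME`, `MEM_USAGE`, `CPU_USAGE`) follow the plugin's documented schema, but check them against your own `.pymon` file; to keep the example self-contained, it builds an in-memory stand-in instead of opening a real results file.

```python
import sqlite3

# Build an in-memory stand-in for pytest-monitor's .pymon database.
# (In real use you would open sqlite3.connect(".pymon") instead.)
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE TEST_METRICS ("
    "ITEM TEXT, TOTAL_TIME REAL, MEM_USAGE REAL, CPU_USAGE REAL)"
)
conn.executemany(
    "INSERT INTO TEST_METRICS VALUES (?, ?, ?, ?)",
    [
        ("test_fast", 0.01, 12.5, 0.4),
        ("test_slow", 2.30, 150.0, 0.9),
    ],
)

# Query the most expensive tests by wall-clock time.
rows = conn.execute(
    "SELECT ITEM, TOTAL_TIME, MEM_USAGE FROM TEST_METRICS "
    "ORDER BY TOTAL_TIME DESC LIMIT 5"
).fetchall()
for item, total, mem in rows:
    print(f"{item}: {total:.2f}s, {mem:.1f} MB")
```

Because the history is plain SQL, spotting a regression is often just a matter of comparing the same query across two sessions.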
Because each result is attached to a runtime environment, it is easy to compare the impact of the hardware used to run your tests. With Pytest-Monitor, developers can directly measure the performance impact of a contribution, or check that upgrading a dependency does not change the program's footprint; integrators can even measure the impact of a change of hardware.
During this talk, we will:
- explain the origin of this plugin and its open-sourcing by CFM
- demonstrate how easily it can be set up
- give a quick demonstration based on a real case
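To illustrate that ease of setup: once the plugin is installed (`pip install pytest-monitor`), an ordinary test suite is monitored when you run `pytest`, with no code changes. The file below is plain pytest; the function names are illustrative.

```python
# test_example.py — an ordinary pytest test. With pytest-monitor installed,
# simply running `pytest` records this test's run time, CPU usage and
# memory footprint; no decorators or configuration are required.

def build_squares(n):
    """Allocate a list whose memory footprint would show up in the metrics."""
    return [i * i for i in range(n)]

def test_build_squares():
    data = build_squares(100_000)
    assert len(data) == 100_000
    assert data[3] == 9
```

The monitoring is opt-out rather than opt-in, which is what makes retrofitting it onto an existing suite cheap.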
The presentation will also showcase Monitor-Server-API, which goes a step further by collecting and centralising the history of analysis results for one or more projects using Pytest-Monitor, and enabling comparisons between them. We will show real-world examples of queries, both through a REST API and through a library that expresses complex queries in a few lines of code.
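The snippet below sketches the kind of query such a centralised service enables. The endpoint path and the JSON field names are assumptions made for this example, not the actual Monitor-Server-API contract; consult the project's documentation for the real routes and payloads.

```python
import json
from urllib.parse import urlencode

# Hypothetical endpoint and response shape — for illustration only.

def sessions_url(base, component):
    """Build a query URL for one component's sessions (hypothetical route)."""
    return f"{base}/api/v1/sessions?{urlencode({'component': component})}"

def mean_memory(payload):
    """Average the MEM_USAGE field across metrics in an (assumed) JSON response."""
    metrics = json.loads(payload)["metrics"]
    return sum(m["MEM_USAGE"] for m in metrics) / len(metrics)

url = sessions_url("http://monitor.example.com", "my-project")
sample = '{"metrics": [{"MEM_USAGE": 100.0}, {"MEM_USAGE": 140.0}]}'
print(url)
print(mean_memory(sample))  # 120.0
```

Aggregations like this, run server-side across projects and machines, are what make cross-environment comparisons practical.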