Defining and implementing metrics across a large set of product teams in multiple institutions is a complicated problem, both to conceptualise and to put into practice. Within the EMI project, the integration of more than one build system and several bug trackers further complicates the generation of an understandable set of metrics.
The quality assurance group endeavours to obtain unbiased metrics by examining all available data, in the same format, from every software code and change management tracker. This is made possible by strictly controlled XML schemas, adhered to by each product team according to policy guidelines set down by the quality assurance group.
Defining strict policies, implemented through these schemas, paved the way for building a metrics framework that takes as input change management information, static analyser output and product-to-tracker category mapping functions, and produces meaningful results per product. The results are presented using dashboards and automatically generated plots.
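The effect of a common exchange format is that every tracker export can be consumed by one parser. The sketch below shows the idea in Python; the element and attribute names are illustrative, not the actual EMI schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical record in a common change-management exchange format;
# element/attribute names are invented for illustration.
RECORD = """\
<tickets product="data-transfer">
  <ticket id="1042" tracker="savannah">
    <type>defect</type>
    <priority>immediate</priority>
    <status>open</status>
  </ticket>
</tickets>
"""

def parse_tickets(xml_text):
    """Parse a common-format export into plain dictionaries,
    one per ticket, tagged with the owning product."""
    root = ET.fromstring(xml_text)
    product = root.get("product")
    tickets = []
    for t in root.findall("ticket"):
        tickets.append({
            "product": product,
            "id": t.get("id"),
            "tracker": t.get("tracker"),
            "type": t.findtext("type"),
            "priority": t.findtext("priority"),
            "status": t.findtext("status"),
        })
    return tickets

print(parse_tickets(RECORD))
```

Because all trackers emit the same structure, the metric framework never needs tracker-specific parsing code, only tracker-specific exporters.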
The planning process for implementing metrics for a set of software products is laid down in the ISO/IEC 25000 (SQuaRE) series of standards. However, because EMI already comprised over 40 products at the beginning of the project, only the subset of ISO 9126 related to evaluation proved a meaningful standard to apply. Additionally grouping the metrics according to the McCall software quality factors simplified the planning process further and reduced some of its complexity. A number of key metrics were prioritised in the first year of the project.
In the case of the change management trackers, information was assembled from many defect and request trackers, something few projects have accomplished. One of the more notable previous works, presented in a book by Rex Black, is Entomology by Matt Barringer, a tracker client supporting Mantis, Bugzilla and Trac. In EMI, however, more than three bug trackers had to be integrated, including RT, SourceForge, Savannah, StoRM, Trac and Bugzilla. The newly introduced framework and strict schemas now make it possible to integrate many more trackers.
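Integrating heterogeneous trackers largely reduces to mapping each tracker's own vocabulary onto the common schema. A minimal sketch of such per-tracker mapping functions is shown below; the mapping tables are invented examples, not the actual EMI mappings.

```python
# Per-tracker priority vocabularies mapped onto a common scale.
# The raw values below are illustrative, not the real tracker exports.
PRIORITY_MAPS = {
    "savannah": {"1 - Later": "low", "5 - Normal": "medium", "9 - Immediate": "immediate"},
    "bugzilla": {"P5": "low", "P3": "medium", "P1": "immediate"},
    "trac":     {"trivial": "low", "major": "medium", "blocker": "immediate"},
}

def normalise_priority(tracker, raw_value):
    """Map a tracker-specific priority onto the common scale."""
    try:
        return PRIORITY_MAPS[tracker][raw_value]
    except KeyError:
        # Unknown trackers or values are flagged rather than silently
        # dropped, so that schema deviations surface during validation.
        return "unmapped"

print(normalise_priority("bugzilla", "P1"))  # immediate
```

Adding a new tracker then means supplying one more mapping table rather than modifying the framework itself.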
Attempting to assess the usefulness of individual analysers per product is of little value in the overall scheme of a large project. For this reason, a number of overarching analysers were generated to interpret the static analyser results produced for each product. This was achieved quite successfully and materialised as daily generated reports yielding comparative and overall project performance metrics.
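An overarching analyser of this kind aggregates per-product results into comparable figures. The sketch below computes warning density (warnings per kLOC) so that products of different sizes can be compared; product names and numbers are invented.

```python
from collections import defaultdict

# Illustrative daily static-analyser results; each record names a
# product, a hypothetical analyser, its warning count and code size.
daily_results = [
    {"product": "A", "analyser": "findbugs", "warnings": 12, "kloc": 30},
    {"product": "B", "analyser": "findbugs", "warnings": 3,  "kloc": 10},
    {"product": "A", "analyser": "pylint",   "warnings": 8,  "kloc": 30},
]

def project_report(results):
    """Sum warnings per product and normalise by size (warnings/kLOC)."""
    totals = defaultdict(int)
    kloc = {}
    for r in results:
        totals[r["product"]] += r["warnings"]
        kloc[r["product"]] = r["kloc"]
    # Density rather than raw counts makes products comparable.
    return {p: totals[p] / kloc[p] for p in totals}

print(project_report(daily_results))
```

A daily run of such a report gives both the comparative view (per product) and, by summing further, the overall project figure.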
A Java framework was developed to assemble and interpret data from a verification/test dashboard in XML format, the common change management files per tracker in XML format, analysers of the daily build results, and mapping functions handling the differences between tracker and product content. The results were presented as plots and data to a number of customers.
The metric reports are generated using a mixture of HTML/PHP for the dashboards and Java/Python for exposing the metric values. The structure for producing these metrics has proved both resilient and stable over the third year of the EMI project.
The definition of a common XML schema and exchange format resulted in the implementation of a dashboard used every week in the executive management team (EMT) meeting, where the open immediate- and high-priority defects and feature requests can be assembled in seconds using a simple user interface.
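Once all tickets share the common format, the EMT view reduces to a simple filter over the aggregated records. A minimal sketch, using the same illustrative field names assumed earlier:

```python
# Select open tickets at immediate or high priority from aggregated
# common-format records; field names are illustrative, not the real schema.
def emt_view(tickets):
    wanted = {"immediate", "high"}
    return [t for t in tickets
            if t["status"] == "open" and t["priority"] in wanted]

sample = [
    {"id": "1042", "status": "open",   "priority": "immediate"},
    {"id": "1043", "status": "closed", "priority": "high"},
    {"id": "1044", "status": "open",   "priority": "low"},
]
print([t["id"] for t in emt_view(sample)])  # ['1042']
```

The speed reported for the dashboard follows from this simplicity: no per-tracker queries are needed at meeting time, only a filter over pre-assembled data.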
The schema for the common change management exchange format defined in EMI could serve as the basis of a new standardisation document on standard practice for exporting change management data to other change management systems.
The metric values and plots produced by the quality assurance group are immediately useful to a number of customers within the EMI project. They provide product developers with plots and statistics highlighting static analyser results and the number of defects/requests per product. They provide the release manager with a snapshot of the most urgently pending tickets, discussed on a weekly basis. They provide the quality control group with density metrics, regression/functional test results and analyser plots.
Within the quality assurance group, the metrics must be continuously monitored for deviations from the accepted schemas. Exceptions and errors are pinpointed by failed validations of the XML files generated daily by the ETICS build system and of the common change management exchange format files.
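Such monitoring can be approximated by a daily check over each export. The sketch below only verifies well-formedness and the presence of a few assumed required elements; a full deployment would validate against the XML Schema itself with an XSD-aware library, which this stdlib-only sketch does not do.

```python
import xml.etree.ElementTree as ET

# Illustrative subset of required schema fields, not the real EMI schema.
REQUIRED = ("type", "priority", "status")

def validation_errors(xml_text):
    """Return a list of deviations found in a daily export.

    Checks well-formedness, then the presence of required elements
    in each ticket; an empty list means the export passed.
    """
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as exc:
        return ["not well-formed: %s" % exc]
    errors = []
    for ticket in root.findall("ticket"):
        for field in REQUIRED:
            if ticket.find(field) is None:
                errors.append("ticket %s missing <%s>" % (ticket.get("id"), field))
    return errors
```

Run daily over every product's export, a non-empty result pinpoints exactly which team's data has drifted from the agreed schema.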