
Metrics


All metrics used in the maturity assessment process are described below, with useful information and references. They are classified according to their source.
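
Each metric below includes a Scale line of the form 1 < t1 ≤ 2 < t2 ≤ 3 < t3 ≤ 4 < t4 ≤ 5. Assuming this reads as four thresholds mapping the raw measure onto a maturity score from 1 to 5 (with descending thresholds for metrics where lower raw values are better), a minimal sketch of that mapping follows; the helper is purely illustrative and is not part of the assessment tooling.

    # Hypothetical helper: map a raw metric value onto the 1-5 maturity scale,
    # assuming the 'Scale' lines list four cut points in the order given.
    def scale_score(value, thresholds):
        # e.g. thresholds = [10, 15, 20, 30] for COMMENT_LINES_DENSITY (higher is better)
        #      thresholds = [50, 40, 30, 10] for DUPLICATED_LINES_DENSITY (lower is better)
        ascending = thresholds[0] <= thresholds[-1]
        score = 1
        for t in thresholds:
            if (value >= t) if ascending else (value <= t):
                score += 1
        return score

    scale_score(18, [10, 15, 20, 30])   # comment rate of 18%  -> 3
    scale_score(35, [50, 40, 30, 10])   # duplication of 35%   -> 3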



  • Average number of attributes ( ATTRS )

    Provided by: SonarQube

    Used by: ALL_DATA

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Average number of attributes defined in a class. This can be compared to the total number of operands (TOPD) considered at the class level, and is representative of the data complexity of the class (not including the complexity of methods).

  • Percentage of branches covered by tests ( BRANCH_COVERAGE )

    Provided by: SonarQube

    Used by: TST_COV

    Scale: 1 < 0 ≤ 2 < 15 ≤ 3 < 36.5 ≤ 4 < 73.3 ≤ 5

    The percentage of branches that are exercised by the tests. The SonarQube measure used for this purpose is branch_coverage.

    On each line of code containing boolean expressions, branch coverage answers the following question: has each boolean expression been evaluated both to true and to false during the execution of the unit tests?

  • Comment rate ( COMMENT_LINES_DENSITY )

    Provided by: polarsys_sonarqube

    Used by: CODE_DOC

    Scale: 1 < 10 ≤ 2 < 15 ≤ 3 < 20 ≤ 4 < 30 ≤ 5

    The ratio of comment lines to the total number of lines (source code plus comments), expressed as a percentage.

    Density of comment lines = Comment lines / (Lines of code + Comment lines) * 100. With such a formula, 50% means that the number of lines of code equals the number of comment lines and 100% means that the file only contains comment lines. The SonarQube measure used for this purpose is comment_lines_density.

    See also SonarQube's page on comment lines metrics: http://docs.sonarqube.org/display/SONAR/Metrics+-+Comment+lines.
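
    As an illustration of the formula above, a minimal sketch computing the density from the two underlying counts (a hypothetical helper, not the SonarQube implementation):

        def comment_lines_density(comment_lines, lines_of_code):
            # Density of comment lines = Comment lines / (Lines of code + Comment lines) * 100
            return comment_lines / (lines_of_code + comment_lines) * 100

        comment_lines_density(200, 800)   # 200 comment lines, 800 lines of code -> 20.0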

  • Number of downloads on the web site ( DL_REPO_1M )

    Provided by: Unknown

    Used by: DOWNLOADS

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The number of downloads from the project's web site (download page) during the last three months. Example of a download page (for CDT): http://www.eclipse.org/downloads/.

  • Number of downloads on update site ( DL_UPDATE_SITE_1M )

    Provided by: Unknown

    Used by: DOWNLOADS

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The number of downloads from the project's update site during the last three months. It is computed using successful and failed installs reported by the Marketplace.

  • Cloning density ( DUPLICATED_LINES_DENSITY )

    Provided by: polarsys_sonarqube

    Used by: CODE_CLONE

    Scale: 1 < 50 ≤ 2 < 40 ≤ 3 < 30 ≤ 4 < 10 ≤ 5

    The number of duplicated lines in the code, divided by the total number of lines, expressed as a percentage. The SonarQube metric used for this purpose is duplicated_lines_density.

    See also SonarQube's definition for duplications: http://docs.sonarqube.org/display/SONAR/Metric+definitions#Metricdefinitions-Duplications. There is a longer description of the code cloning algorithm here: http://docs.sonarqube.org/display/SONAR/Duplications.

  • Average Cyclomatic Complexity ( FUNCTION_COMPLEXITY )

    Provided by: polarsys_sonarqube

    Used by: CPX

    Scale: 1 < 5.3 ≤ 2 < 3.5 ≤ 3 < 2.8 ≤ 4 < 2.3 ≤ 5

    The average number of execution paths found in functions.

    Basically, the counter is incremented every time the control flow of the function splits, with any function having at least a cyclomatic number of 1. The sum of cyclomatic numbers for all functions is then divided by the number of functions. In SonarQube the measure is function_complexity.

    The cyclomatic number is a measure borrowed from graph theory and was introduced to software engineering by McCabe in [McCabe1976]. It is defined as the number of linearly independent paths through the program. To preserve testability and maintainability, McCabe recommends that no program module (or function or method, in the case of Java) should exceed a cyclomatic number of 10. It is primarily defined at the function level and is summed up for higher-level artefacts.

    See also Wikipedia's entry on cyclomatic complexity: http://en.wikipedia.org/wiki/Cyclomatic_complexity.

    See also SonarQube's definition for code complexity: http://docs.sonarqube.org/display/SONAR/Metrics+-+Complexity. There is also a discussion about its meaning here: http://www.sonarqube.org/discussing-cyclomatic-complexity/

    See also the Maisqual wiki for more details on complexity measures: http://maisqual.squoring.com/wiki/index.php/Category:Complexity_Metrics.
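
    As an illustration of the averaging described above, a minimal sketch under the assumption that per-function complexities are already known (SonarQube computes function_complexity itself):

        def average_cyclomatic_complexity(per_function_splits):
            # Each function starts with a cyclomatic number of 1 and gains 1
            # every time its control flow splits (if, loop, case, && / ||, ...).
            complexities = [1 + splits for splits in per_function_splits]
            return sum(complexities) / len(complexities)

        average_cyclomatic_complexity([0, 2, 4])   # (1 + 3 + 5) / 3 -> 3.0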

  • IP log code coverage ( IP_LOG_COV )

    Provided by: Unknown

    Used by: IP_LOG

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Percentage of contributions covered by an appropriate IP log.

    The Eclipse Foundation has a very strict IP policy, so this measure should always be 100%.

  • ITS authors ( ITS_AUTH_3M )

    Provided by: polarsys_grimoire

    Used by: ITS_USAGE

    Scale: 1 < 2 ≤ 2 < 4 ≤ 3 < 9.75 ≤ 4 < 80 ≤ 5

    Number of distinct identities updating tickets during the last three months, in the issue tracking system.

    The subset of tickets considered from the issue tracking system are those specified in the project documentation, for example by specifying a tracker or a project name appearing in ticket data. Update means an operation on a ticket changing its state, or adding information. Opening and closing a ticket are considered as updates. Identities of updaters are the character strings found in the corresponding field in the ticket information. Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).
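
    As an illustration of the time range convention used by the metrics on this page, a minimal sketch computing the analysis window for a given retrieval date (a hypothetical helper, not part of the Grimoire tooling):

        # pip install python-dateutil (for calendar-month arithmetic)
        from datetime import date, timedelta
        from dateutil.relativedelta import relativedelta

        def analysis_window(retrieval_date):
            # Period of three calendar months ending the day before retrieval,
            # both bounds included.
            return retrieval_date - relativedelta(months=3), retrieval_date - timedelta(days=1)

        analysis_window(date(2015, 2, 3))   # -> (date(2014, 11, 3), date(2015, 2, 2))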

  • Defect density ( ITS_BUGS_DENSITY )

    Provided by: polarsys_grimoire

    Used by: ITS_REL

    Scale: 1 < 3.36826347305 ≤ 2 < 0.0755602867593 ≤ 3 < 0.0335380213652 ≤ 4 < 0.009367219332 ≤ 5

    Ratio of the total number of tickets in the issue tracking system to the size of the source code in KLOC, at the time of the data retrieval.

    The subset of tickets considered from the issue tracking system are those specified in the project documentation, for example by specifying a tracker or a project name appearing in ticket data. Source code is all source code found in the source code management repository.
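
    A minimal sketch of the ratio described above (an illustrative helper, not the Grimoire implementation):

        def defect_density(total_tickets, lines_of_code):
            # Tickets per thousand lines of code (KLOC).
            return total_tickets / (lines_of_code / 1000.0)

        defect_density(120, 250000)   # 120 tickets on 250 KLOC -> 0.48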

  • Number of open bugs ( ITS_BUGS_OPEN )

    Provided by: polarsys_grimoire

    Used by: ITS_REL

    Scale: 1 < 500 ≤ 2 < 200 ≤ 3 < 100 ≤ 4 < 50 ≤ 5

    Number of tickets marked as still open at the time of the data retrieval, in the issue tracking system.

    The subset of tickets considered from the issue tracking system are those specified in the project documentation, for example by specifying a tracker or a project name appearing in ticket data. Open means the state in which further actions are usually expected until the ticket is closed.

  • Median time to fix bug ( ITS_FIX_MED_3M )

    Provided by: polarsys_grimoire

    Used by: ITS_REL

    Scale: 1 < 1001.13 ≤ 2 < 40.2625 ≤ 3 < 11.62 ≤ 4 < 3.82 ≤ 5

    Median period from when a ticket is opened to when it is closed, for all tickets closed during the last three months, in the issue tracking system.

    The subset of tickets considered from the issue tracking system are those specified in the project documentation, for example by specifying a tracker or a project name appearing in ticket data. Closed means the state in which no further action is usually performed on the ticket. Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included). Unit is days.
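
    A minimal sketch of the computation described above (an illustrative helper with hypothetical dates, not the Grimoire implementation):

        from datetime import datetime
        from statistics import median

        def median_fix_time_days(tickets):
            # tickets: (opened, closed) datetime pairs for tickets closed
            # during the last three months.
            durations = [(closed - opened).total_seconds() / 86400.0
                         for opened, closed in tickets]
            return median(durations)

        median_fix_time_days([
            (datetime(2015, 1, 5), datetime(2015, 1, 8)),    # 3 days
            (datetime(2015, 1, 10), datetime(2015, 1, 30)),  # 20 days
            (datetime(2015, 1, 2), datetime(2015, 1, 7)),    # 5 days
        ])                                                   # -> 5.0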

  • ITS updates ( ITS_UPDATES_3M )

    Provided by: polarsys_grimoire

    Used by: ITS_USAGE

    Scale: 1 < 4 ≤ 2 < 13 ≤ 3 < 37 ≤ 4 < 596 ≤ 5

    Number of updates to tickets during the last three months, in the issue tracking system.

    The subset of tickets considered from the issue tracking system are those specified in the project documentation, for example by specifying a tracker or a project name appearing in ticket data. Update means an operation on a ticket changing its state, or adding information. Opening and closing a ticket are considered as updates. Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).

  • License identification ( LIC_IDENT )

    Provided by: Unknown

    Used by: LIC

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Is there a list of the different licences used in the product?

    This measure may use an external tool like Ninka (ninka.turingmachine.org) to identify the licence of components used in the project. Another way would be to look for a file named LICENCES in the root folder.

  • Percentage of lines of code covered by tests ( LINE_COVERAGE )

    Provided by: SonarQube

    Used by: TST_COV

    Scale: 1 < 0 ≤ 2 < 20 ≤ 3 < 48.89 ≤ 4 < 87.7 ≤ 5

    The percentage of source lines of code that are exercised by the tests. The SonarQube measure used for this purpose is line_coverage.

  • Depth of Inheritance Tree ( MDIT )

    Provided by: SonarQube

    Used by: CPX

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The maximum depth of inheritance tree of a class within the inheritance hierarchy is defined as the maximum length from the considered class to the root of the class hierarchy tree and is measured by the number of ancestor classes. In cases involving multiple inheritance, the MDIT is the maximum length from the node to the root of the tree [Chidamber1994].

    A deep inheritance tree makes the object-oriented architecture difficult to understand. Well-structured OO systems have a forest of classes rather than one large inheritance lattice. The deeper a class is within the hierarchy, the greater the number of methods it is likely to inherit, making it more complex to predict its behaviour and, therefore, more fault-prone [Chidamber1994]. However, the deeper a class is in the tree, the greater the potential reuse of inherited methods [Chidamber1994].

    See also SonarQube's page on the depth in tree: http://docs.sonarqube.org/display/SONAR/Metrics+-+Depth+in+Tree

  • Average number of methods ( METHS )

    Provided by: SonarQube

    Used by: ALL_DATA

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Average number of methods defined in the class. This can be compared to the total number of operators (TOPT) considered at the class level.

  • Number of favourites on the Marketplace ( MKT_FAV )

    Provided by: Marketplace

    Used by: MKT_FEEDBACK

    Scale: 1 < 10 ≤ 2 < 30 ≤ 3 < 70 ≤ 4 < 300 ≤ 5

    The number of favourites on the Eclipse Marketplace for the project.

    This measure uses the Marketplace REST API to retrieve the number of registered users who marked this project as a favourite. If the project is not registered on the Marketplace, the metric should be zero.

  • Number of successful installs on the Marketplace ( MKT_INSTALL_SUCCESS_1M )

    Provided by: Marketplace

    Used by: INSTALL

    Scale: 1 < 100 ≤ 2 < 500 ≤ 3 < 1000 ≤ 4 < 10000 ≤ 5

    The number of successful installs for the project as reported on the Marketplace during the last three months.

    This measure uses the list of successful installs available on the Marketplace for every registered project (see, for example, a listing's log page, metrics tab). If the project is not registered on the Marketplace, the metric should be zero.

  • Developer ML authors ( MLS_DEV_AUTH_3M )

    Provided by: polarsys_grimoire

    Used by: MLS_DEV_DIVERSITY

    Scale: 1 < 1.5 ≤ 2 < 3 ≤ 3 < 7.5 ≤ 4 < 26 ≤ 5

    Number of distinct senders for messages dated during the last three months, in developer mailing list archives.

    Developer mailing list is the list or lists considered as 'for developers' in the project documentation. The date used is the mailing list server date, as stamped in the message. Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included). Distinct senders are those with distinct email addresses. Email addresses used are the strings found in 'From:' fields in messages.

    Note that developer communications are measured through the mailing list, and user communications are measured through the forums.

  • Developer ML response ratio ( MLS_DEV_RESP_RATIO_3M )

    Provided by: polarsys_grimoire

    Used by: MLS_DEV_SUPPORT

    Scale: 1 < 0.833333333333 ≤ 2 < 1.24285714286 ≤ 3 < 2.5125 ≤ 4 < 10 ≤ 5

    Average number of messages per thread, minus one, for all threads of messages dated during the last three months, in developer mailing list archives.

    Threads are identified using 'In-Reply-To' message headers. Developer mailing list is the list or lists considered as 'for developers' in the project documentation. The date used is the mailing list server date, as stamped in the message. Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).

    Note that developer communications are measured through the mailing list, and user communications are measured through the forums.
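
    A minimal sketch of the ratio defined above (an illustrative helper, not the Grimoire implementation); the same computation applies to the user mailing list variant MLS_USR_RESP_RATIO_3M:

        def response_ratio(thread_sizes):
            # thread_sizes: number of messages in each thread dated during the
            # last three months; the opening post of a thread is not a response.
            responses = [size - 1 for size in thread_sizes]
            return sum(responses) / len(responses)

        response_ratio([1, 3, 5])   # (0 + 2 + 4) / 3 -> 2.0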

  • Developer ML response time ( MLS_DEV_RESP_TIME_MED_3M )

    Provided by: polarsys_grimoire

    Used by: MLS_DEV_RESPONSIVENESS

    Scale: 1 < 42.93 ≤ 2 < 0.94 ≤ 3 < 0.26 ≤ 4 < 0.0425 ≤ 5

    Median period from the first message in a thread to the second message in the thread, for all threads with at least two messages dated during the last three months, in developer mailing list archives.

    Threads are identified using 'In-Reply-To' message headers. Developer mailing list is the list or lists considered as 'for developers' in the project documentation. The date used is the mailing list server date, as stamped in the message. Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included). Unit is days.

    Note that developer communications are measured through the mailing list, and user communications are measured through the forums.

  • Developer ML subjects ( MLS_DEV_SUBJ_3M )

    Provided by: polarsys_grimoire

    Used by: MLS_DEV_SUPPORT

    Scale: 1 < 1 ≤ 2 < 5 ≤ 3 < 15 ≤ 4 < 30 ≤ 5

    Number of threads of messages dated during the last three months, in developer mailing list archives.

    Threads are identified using 'In-Reply-To' message headers. Developer mailing list is the list or lists considered as 'for developers' in the project documentation. The date used is the mailing list server date, as stamped in the message. Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).

    Note that developer communications are measured through the mailing list, and user communications are measured through the forums.

  • Developer ML posts ( MLS_DEV_VOL_3M )

    Provided by: polarsys_grimoire

    Used by: MLS_DEV_ACTIVITY

    Scale: 1 < 2 ≤ 2 < 6 ≤ 3 < 22 ≤ 4 < 103 ≤ 5

    Number of messages dated during the last three months, in developer mailing list archives.

    Developer mailing list is the list or lists considered as 'for developers' in the project documentation. The date used is the mailing list server date, as stamped in the message. Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).

    Note that developer communications are measured through the mailing list, and user communications are measured through the forums.

  • User ML authors ( MLS_USR_AUTH_3M )

    Provided by: polarsys_grimoire

    Used by: MLS_USR_DIVERSITY

    Scale: 1 < 1.5 ≤ 2 < 3 ≤ 3 < 7.5 ≤ 4 < 26 ≤ 5

    Number of distinct senders for messages dated during the last three months, in user mailing list archives.

    User mailing list is the list or lists considered as 'for users' in the project documentation. The date used is the mailing list server date, as stamped in the message. Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included). Distinct senders are those with distinct email addresses. Email addresses used are the strings found in 'From:' fields in messages.

    Note that developer communications are measured through the mailing list, and user communications are measured through the forums.

  • User ML response ratio ( MLS_USR_RESP_RATIO_3M )

    Provided by: polarsys_grimoire

    Used by: MLS_USR_SUPPORT

    Scale: 1 < 0.833333333333 ≤ 2 < 1.24285714286 ≤ 3 < 2.5125 ≤ 4 < 10 ≤ 5

    Average number of messages per thread, minus one, for all threads of messages dated during the last three months, in user mailing list archives.

    Threads are identified using 'In-Reply-To' message headers. User mailing list is the list or lists considered as 'for users' in the project documentation. The date used is the mailing list server date, as stamped in the message. Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).

    Note that developer communications are measured through the mailing list, and user communications are measured through the forums.

  • User ML response time ( MLS_USR_RESP_TIME_MED_3M )

    Provided by: polarsys_grimoire

    Used by: MLS_USR_RESPONSIVENESS

    Scale: 1 < 42.93 ≤ 2 < 0.94 ≤ 3 < 0.26 ≤ 4 < 0.0425 ≤ 5

    Median period from the first message in a thread to the second message in the thread, for all threads with at least two messages dated during the last three months, in user mailing list archives.

    Threads are identified using 'In-Reply-To' message headers. User mailing list is the list or lists considered as 'for users' in the project documentation. The date used is the mailing list server date, as stamped in the message. Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included). Unit is days.

    Note that developer communications are measured through the mailing list, and user communications are measured through the forums.

  • User ML subjects ( MLS_USR_SUBJ_3M )

    Provided by: polarsys_grimoire

    Used by: MLS_USR_SUPPORT

    Scale: 1 < 1 ≤ 2 < 5 ≤ 3 < 15 ≤ 4 < 30 ≤ 5

    Number of threads of messages dated during the last three months, in user mailing list archives.

    Threads are identified using 'In-Reply-To' message headers. User mailing list is the list or lists considered as 'for users' in the project documentation. The date used is the mailing list server date, as stamped in the message. Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).

    Note that developer communications are measured through the mailing list, and user communications are measured through the forums.

  • User ML posts ( MLS_USR_VOL_3M )

    Provided by: polarsys_grimoire

    Used by: MLS_USR_ACTIVITY

    Scale: 1 < 2 ≤ 2 < 6 ≤ 3 < 22 ≤ 4 < 103 ≤ 5

    Number of messages dated during the last three months, in user mailing list archives.

    User mailing list is the list or lists considered as 'for users' in the project documentation. The date used is the mailing list server date, as stamped in the message. Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).

    Note that developer communications are measured through the mailing list, and user communications are measured through the forums.

  • Average number of non-conformities for analysability ( NCC_ANA_IDX )

    Provided by: polarsys_rules

    Used by: PRACTICES_ANA

    Scale: 1 < 1 ≤ 2 < 0.5 ≤ 3 < 0.3 ≤ 4 < 0.1 ≤ 5

    The total number of violations of rules that impact analysability in the source code, divided by the number of thousands of lines of code.

    See also the page on practices for a list of checked rules and their associated impact on quality characteristics.

  • Average number of non-conformities for changeability ( NCC_CHA_IDX )

    Provided by: polarsys_rules

    Used by: PRACTICES_CHA

    Scale: 1 < 3 ≤ 2 < 1 ≤ 3 < 0.5 ≤ 4 < 0.3 ≤ 5

    The total number of violations of rules that impact changeability in the source code, divided by the number of thousands of lines of code.

    See also the page on practices for a list of checked rules and their associated impact on quality characteristics.

  • Average number of non-conformities for reliability ( NCC_REL_IDX )

    Provided by: polarsys_rules

    Used by: PRACTICES_REL

    Scale: 1 < 1 ≤ 2 < 0.5 ≤ 3 < 0.3 ≤ 4 < 0.1 ≤ 5

    The total number of violations of rules that impact reliability in the source code, divided by the number of thousands of lines of code.

    See also the page on practices for a list of checked rules and their associated impact on quality characteristics.

  • Average number of non-conformities for reusability ( NCC_REU_IDX )

    Provided by: polarsys_rules

    Used by: PRACTICES_REU

    Scale: 1 < 3 ≤ 2 < 1 ≤ 3 < 0.5 ≤ 4 < 0.3 ≤ 5

    The total number of violations of rules that impact reusability in the source code, divided by the number of thousands of lines of code.

    See also the page on practices for a list of checked rules and their associated impact on quality characteristics.

  • Source Lines Of Code ( NCLOC )

    Provided by: polarsys_sonarqube

    Used by: CODE_SIZE

    Scale: 1 < 915828 ≤ 2 < 198290.25 ≤ 3 < 97961.5 ≤ 4 < 38776.25 ≤ 5

    Number of physical lines that contain at least one character which is neither a whitespace nor a tabulation nor part of a comment. This is mapped to SonarQube's ncloc metric.

    See also SonarQube's definition for size metrics: http://docs.sonarqube.org/display/SONAR/Metric+definitions#Metricdefinitions-Size. More details can be seen here: http://docs.sonarqube.org/display/SONAR/Metrics+-+Lines+of+code

  • Number of milestones ( PLAN_MILESTONES_VOL )

    Provided by: polarsys_pmi

    Used by: PLAN_RELEASES

    Scale: 1 < 1 ≤ 2 < 5 ≤ 3 < 10 ≤ 4 < 20 ≤ 5

    The number of milestones that occurred during the last five releases.

    Milestones are retrieved from the PMI file and are counted whatever their target release is. Milestones are useful to assess the maturity of the release and improve the predictability of the project's output, in terms of quality and time.

  • Project is on time for next milestone ( PLAN_NEXT_MILESTONE )

    Provided by: PMI

    Used by: PLAN_ON_TIME

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Is the project on time for the next milestone?

  • Reviews success rate ( PLAN_REVIEWS_SUCCESS_RATE )

    Provided by: polarsys_pmi

    Used by: PLAN_RELEASES

    Scale: 1 < 20 ≤ 2 < 40 ≤ 3 < 60 ≤ 4 < 80 ≤ 5

    What is the percentage of successful reviews over the past 5 releases?

    The reviews are retrieved from the project management infrastructure record, and are considered successful if the entry is equal to success. If fewer than 5 releases are defined, the percentage is computed on those only.

  • Public API ( PUBLIC_API )

    Provided by: polarsys_sonarqube

    Used by: PUBLIC_DATA

    Scale: 1 < 48347 ≤ 2 < 14018 ≤ 3 < 6497.5 ≤ 4 < 2420.75 ≤ 5

    Number of public Classes + number of public Functions + number of public Properties. The SonarQube measure used for this purpose is public_api.

    See also SonarQube's definition for size metrics: http://docs.sonarqube.org/display/SONAR/Metric+definitions#Metricdefinitions-Size. More details can be seen here: http://docs.sonarqube.org/display/SONAR/Metrics+-+Public+API

  • Number of publications ( PUB_CONF_VOL )

    Provided by: Manual

    Used by: PUB_CONF

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The number of papers published in conferences, journals, and public web sites.

    This metric is not yet active because its retrieval is being debated.

  • ITS information ( PUB_ITS_INFO_PMI )

    Provided by: polarsys_pmi

    Used by: PUB_ITS_INFO

    Scale: 1 < 2 ≤ 2 < 3 ≤ 3 < 4 ≤ 4 < 5 ≤ 5

    Is the bugzilla info correctly filled in the PMI records?

    The project management infrastructure file holds information about one or more bugzilla instances. This test checks that at least one bugzilla instance is defined, with a product identifier, a create_url to enter a new issue, and a query_url to fetch all the issues for the project.

  • SCM information ( PUB_SCM_INFO_PMI )

    Provided by: polarsys_pmi

    Used by: PUB_SCM_INFO

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Is the source_repo info correctly filled in the PMI records?

    The project management infrastructure file holds information about one or more source repositories. This test checks that at least one source repository is defined and accessible.

  • Adherence to analysability rules ( ROKR_ANA )

    Provided by: RuleChecking

    Used by: PRACTICES_ANA

    Scale: 1 < 10 ≤ 2 < 30 ≤ 3 < 50 ≤ 4 < 75 ≤ 5

    Ratio of conformant analysability practices to the number of checked analysability practices.

  • Adherence to changeability rules ( ROKR_CHA )

    Provided by: RuleChecking

    Used by: PRACTICES_CHA

    Scale: 1 < 10 ≤ 2 < 30 ≤ 3 < 50 ≤ 4 < 75 ≤ 5

    Ratio of conformant changeability practices to the number of checked changeability practices.

  • Adherence to reliability rules ( ROKR_REL )

    Provided by: RuleChecking

    Used by: PRACTICES_REL

    Scale: 1 < 10 ≤ 2 < 30 ≤ 3 < 50 ≤ 4 < 75 ≤ 5

    Ratio of conformant reliability practices to the number of checked reliability practices.

  • Adherence to reusability rules ( ROKR_REU )

    Provided by: RuleChecking

    Used by: PRACTICES_REU

    Scale: 1 < 10 ≤ 2 < 30 ≤ 3 < 50 ≤ 4 < 75 ≤ 5

    Ratio of conformant reusability practices to the number of checked reusability practices.

  • SCM Commits ( SCM_COMMITS_3M )

    Provided by: polarsys_grimoire

    Used by: SCM_ACTIVITY , SCM_USAGE

    Scale: 1 < 2 ≤ 2 < 5 ≤ 3 < 13 ≤ 4 < 121 ≤ 5

    Total number of commits in source code management repositories dated during the last three months.

    Source code management repositories are those considered as such in the project documentation. Commits in all branches are considered. Date used for each commit is 'author date' (when there is a difference between author date and committer date). Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).

  • Files committed ( SCM_COMMITTED_FILES_3M )

    Provided by: polarsys_grimoire

    Used by: SCM_ACTIVITY

    Scale: 1 < 3 ≤ 2 < 19 ≤ 3 < 95.75 ≤ 4 < 2189 ≤ 5

    Total number of files touched by commits in source code management repositories dated during the last three months.

    Source code management repositories are those considered as such in the project documentation. Commits in all branches are considered. A file is 'touched' by a commit if its content or its path is modified by the commit. Date used for each commit is 'author date' (when there is a difference between author date and committer date). Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).

  • SCM committers ( SCM_COMMITTERS_3M )

    Provided by: polarsys_grimoire

    Used by: SCM_DIVERSITY , SCM_USAGE

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 18 ≤ 5

    Total number of identities found as authors of commits in source code management repositories dated during the last three months.

    Source code management repositories are those considered as such in the project documentation. Commits in all branches are considered. Date used for each commit is 'author date' (when there is a difference between author date and committer date). An identity is considered as author if it appears as such in the commit record (for systems logging several identities related to the commit, the authoring identity is considered). Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).

  • File stability index ( SCM_STABILITY_3M )

    Provided by: polarsys_grimoire

    Used by: SCM_STABILITY

    Scale: 1 < 3.79487179487 ≤ 2 < 1.4430952381 ≤ 3 < 1.14285714286 ≤ 4 < 1 ≤ 5

    Average number of commits touching each file in source code management repositories dated during the last three months.

    Source code management repositories are those considered as such in the project documentation. Commits in all branches are considered. A file is 'touched' by a commit if its content or its path is modified by the commit. Date used for each commit is 'author date' (when there is a difference between author date and committer date). Time range is measured as a period of three calendar months ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).
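
    A minimal sketch of the averaging described above (an illustrative helper, not the Grimoire implementation):

        from collections import Counter

        def file_stability(commit_file_lists):
            # commit_file_lists: for each commit of the last three months,
            # the list of file paths it touched.
            touches = Counter(path for files in commit_file_lists for path in files)
            return sum(touches.values()) / len(touches)

        file_stability([["a.c", "b.c"], ["a.c"], ["a.c", "c.h"]])   # 5 touches on 3 files -> ~1.67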

  • Installation survey ( SURVEY_INSTALL )

    Provided by: Manual

    Used by: INSTALL

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    A survey stating how much the software has been installed: number of users, geography (are the teams co-located or distributed?), etc.

  • Test success density ( TEST_SUCCESS_DENSITY )

    Provided by: SonarQube

    Used by: TST_SUCCESS

    Scale: 1 < 50 ≤ 2 < 65 ≤ 3 < 80 ≤ 4 < 95 ≤ 5

    The percentage of tests that passed during the last execution of the test plan. The SonarQube measure used for this purpose is test_success_density.

    Computation is as follows: Test success density = (Unit tests - (Unit test errors + Unit test failures)) / Unit tests * 100, where Unit test errors is the number of unit tests that have failed and Unit test failures is the number of unit tests that have failed with an unexpected exception.
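
    A minimal sketch of that formula (an illustrative helper, not the SonarQube implementation):

        def test_success_density(unit_tests, errors, failures):
            # Test success density = (Unit tests - (errors + failures)) / Unit tests * 100
            return (unit_tests - (errors + failures)) / unit_tests * 100

        test_success_density(200, 4, 6)   # 190 of 200 tests passed -> 95.0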

  • Number of tests relative to the code size ( TST_VOL_IDX )

    Provided by: SonarQube

    Used by: TST_VOL

    Scale: 1 < 2 ≤ 2 < 5 ≤ 3 < 7 ≤ 4 < 10 ≤ 5

    The total number of test cases for the product, divided by the number of thousands of SLOC.

    The metric is computed from SonarQube's tests and ncloc metrics.
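
    A minimal sketch of this computation (an illustrative helper; the actual counts come from SonarQube's tests and ncloc measures):

        def test_volume_index(tests, ncloc):
            # Number of test cases per thousand source lines of code.
            return tests / (ncloc / 1000.0)

        test_volume_index(450, 90000)   # 450 tests on 90 KSLOC -> 5.0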