The Overview page's Key Metrics and Detailed Analysis section includes six cumulative charts that highlight significant trends and patterns in your analytics data.
The cumulative charts let you compare different metrics on a single chart. This comparative analysis helps you identify relationships and draw meaningful conclusions.
The six cumulative charts provide quick snapshots of the analytical data, while the detailed analysis charts support deeper analysis.
The Issue Metric measures the number of issues reported and tracked within a specified period. It compares the number of issues opened, the number of issues closed, and the total commits for the selected time period.
The metric is based on the following activity types:
issues-closed
issues-opened
The analytics tool employs a combined chart (a line chart and bar charts) on its dashboard to analyze the Issue Metric. The line on the chart connects the data points, allowing you to observe trends and patterns over time.
The dashboard shows the issues (opened + closed) in a snapshot and a detailed chart (open, closed, and total issues).
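The monthly series behind this chart can be sketched as follows. The record fields (`opened_at`, `closed_at`) are hypothetical placeholders for whatever schema the tool actually uses, and the running "total issues" line accumulates opened plus closed events, as the snapshot tile describes:

```python
from collections import defaultdict

def issue_metric(issues, start, end):
    """Aggregate monthly opened/closed counts plus a cumulative total.

    `issues` is a hypothetical list of dicts with ISO "opened_at" and
    optional "closed_at" timestamps; `start`/`end` are "YYYY-MM" strings.
    """
    opened = defaultdict(int)
    closed = defaultdict(int)
    for issue in issues:
        month = issue["opened_at"][:7]          # e.g. "2024-03"
        if start <= month <= end:
            opened[month] += 1
        if issue.get("closed_at"):
            month = issue["closed_at"][:7]
            if start <= month <= end:
                closed[month] += 1
    total = 0
    series = []
    for month in sorted(set(opened) | set(closed)):
        total += opened[month] + closed[month]  # cumulative "total issues" line
        series.append((month, opened[month], closed[month], total))
    return series
```

Feeding two sample issues through it yields one row per month with the opened, closed, and cumulative totals, mirroring the tooltip shown on hover.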
On the Overview page, select the project and repositories (1) for which you want to see the data.
Select the specific time period using the filter option (2).
The high-level tile (3) shows you the total issues (open + closed) for the selected time range.
The detailed analysis chart shows you the open issues, closed issues, and the cumulative count of total issues for the selected period. On the left side, the chart shows the chart trend summary (4).
Hover over the chart (5) to see the open issues, closed issues, and total issues for the selected month.
This interactive download feature (6) enables you to download the chart in CSV and PNG file formats.
Issue Tracking and Management: Visualizing the data on a line chart makes it easier to identify increases or decreases in issue activity, allowing for effective resource allocation and prioritization.
Performance Evaluation: The Issue Metric helps in evaluating the performance of the development team and the project as a whole. Changes in issue count over time indicate improvements in software quality, bug-fixing efficiency, or the impact of development efforts.
Community Engagement: A higher number of reported issues indicates the active participation and involvement of the community in the open source project.
The Star Metric measures and analyzes the number of stars a project receives on a code hosting platform like GitHub.
The metric gives you a real-time analysis of your project's popularity, community engagement, and overall visibility.
Stars represent a way for you to bookmark or indicate your interest in and appreciation for a particular project. Each star serves as a measure of the project's popularity.
To analyze the Star Metric, the analytics tool employs a line chart on its dashboard. The line connecting the data points on the chart showcases the trend and changes in the number of stars over time.
When you hover over a specific point on the line chart, detailed information about the number of stars for that particular month within the selected period is displayed.
The metric helps you analyze your project's popularity. A higher number of stars generally suggests a widely recognized and appreciated project, potentially attracting more contributors.
The Fork Metric measures and analyzes the number of times a project has been forked by other developers.
Forking is the process of creating a copy of a project's source code repository to either modify and enhance the project or use it as a starting point for a new project.
The bar chart on the dashboard represents the analysis, displaying the number of forks over time. Hover over a specific bar to access the detailed fork information for that particular month within the selected period.
The interactive download feature (Icon) enables you to download the chart in CSV and PNG file formats.
How popular is the project? The Fork Metric provides insights into the popularity of your project. A higher number of forks generally indicates that developers find your project useful and valuable enough to build on it or adapt it to their specific needs.
Code Reuse: By analyzing the Fork Metric, you can get data on code reuse and identify potential opportunities for improvement.
Community Engagement: A growing number of forks indicates an active and involved community, contributing to the project's growth.
Project Evolution: By monitoring forks over time, you can identify significant milestones.
The Commits metric refers to the analysis of contributors' code commits within a specified timeframe. A code commit represents a unit of change to the software's source code repository.
Each commit includes the following:
committed-commit
("Default Branch" only)
In this chart, only commits are counted, not roles. Each commit with a unique commit SHA is counted as one commit, regardless of the contributor's role.
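The deduplication rule above can be sketched in a couple of lines. The `(sha, role)` record shape is an assumption for illustration; only the SHA matters for the count:

```python
def count_commits(activities):
    """Count unique commits by SHA, ignoring contributor roles.

    `activities` is a hypothetical list of (sha, role) records; the same
    SHA reported under several roles still counts as a single commit.
    """
    return len({sha for sha, _role in activities})
```

So a commit that appears once with an author role and once with a committer role contributes exactly one to the total.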
The dashboard shows the commits snapshot and a detailed chart. The detailed chart is a combined chart (line chart and bar chart) that shows new commits vs. total commits.
On the Overview page, select the project and repositories (1) for which you want to see the data.
Select the specific period using the filter option (2).
The high-level tile (3) shows you the total commits for the selected time range.
The detailed analysis chart shows you the New commits and the count of total commits for the selected period. On the left side, the chart shows the chart trend summary (4).
Hover over the chart (5) to see the new commits and the total commits for the selected month.
This interactive download feature (6) enables you to download the chart in CSV and PNG file formats.
The metric enables project maintainers and stakeholders to gain valuable insights into code changes and progress within a specified period.
It provides insights into the volume and frequency of code changes made by contributors. By visualizing commit data in a bar chart, you can track the progress of development efforts over time.
Changes in commit counts reveal periods of intense development, periods of slower activity, or the impact of specific events or milestones on the project.
A Contributor Leaderboard visualization displays the contributions made by individual contributors to an open source project. It ranks contributors based on the number of code commits, pull requests, issues closed, or other metrics and visually represents their relative activity levels and impact on the project.
This chart displays individual identities, not merged contributors. Even if certain identities are combined into one contributor in CM, they still appear as separate entities in the Insights V3 leaderboard charts. This distinction exists because merging contributors combines the contributions of a single individual working under different accounts or identities. The separation of identities in Insights is maintained for privacy reasons and GDPR compliance.
Recognition and Motivation: The Contributor Leaderboard recognizes and acknowledges the efforts of individual contributors. It highlights their contributions, encourages ongoing engagement, and motivates contributors to continue their valuable work.
Community Engagement: It creates a sense of community and healthy competition, encouraging collaboration and inspiring others to contribute and improve their ranking on the leaderboard.
Collaboration Opportunities: The leaderboard helps project maintainers and community members identify potential collaborators or subject-matter experts within the project. It will be easier to identify the most active contributors and connect with them.
ID: analytics
Project websites provide some web analytics.
This check passes if:
A Google Analytics 3 (Universal Analytics) Tracking ID is found in the source of the website configured in GitHub. Regexps used:
A Google Analytics 4 Measurement ID is found in the source of the website configured in GitHub. Regexps used:
The HubSpot tracking code is found in the source of the website configured in GitHub. Regexps used:
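As a rough illustration of how such a check works, the sketch below searches page source for ID-shaped patterns. The regexes here are illustrative assumptions based on the public ID formats (GA4 IDs look like `G-XXXXXXXXXX`, Universal Analytics IDs like `UA-12345678-1`); they are not the actual regexps the check uses:

```python
import re

# Illustrative patterns only; the check's real regexps are not reproduced here.
GA4_RE = re.compile(r"G-[A-Z0-9]{8,12}")
UA_RE = re.compile(r"UA-\d{4,10}-\d{1,4}")

def has_analytics(page_source: str) -> bool:
    """Return True if the page source appears to contain an analytics ID."""
    return bool(GA4_RE.search(page_source) or UA_RE.search(page_source))
```

In practice the check fetches the website configured in GitHub and applies its patterns to the returned HTML.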
ID: artifacthub_badge
Projects can list their content on Artifact Hub to improve their discoverability.
This check passes if:
An Artifact Hub badge is found in the repository’s README file. Regexps used:
ID: cla
The CLA defines the conditions under which intellectual property is contributed to a business or project.
This check passes if:
A CLA check is found in the latest merged PR on GitHub. Regexps used:
This check will be automatically marked as exempt if the DCO check passes but this one does not.
ID: community_meeting
Community meetings are often held to engage community members, hear more voices, and get more viewpoints.
This check passes if:
A reference to the community meeting is found in the repository’s README file. Regexps used:
ID: dco
Mechanism for contributors to certify that they wrote or have the right to submit the code they are contributing.
This check passes if:
The last commits in the repository have the DCO signature (Signed-off-by). Merge pull request and merge branch commits are ignored for this check.
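The commit-signature part of this check can be sketched as follows, under the assumption that commits are given as plain message strings (the real check inspects repository history directly):

```python
def dco_check_passes(commits):
    """Pass if every non-merge commit message carries a Signed-off-by line.

    `commits` is a hypothetical list of commit message strings. Merge pull
    request and merge branch commits are skipped, as the check describes.
    """
    relevant = [
        msg for msg in commits
        if not msg.startswith(("Merge pull request", "Merge branch"))
    ]
    return all("Signed-off-by:" in msg for msg in relevant)
```

A repository whose only unsigned commits are merge commits would still pass under this rule.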
A DCO check is found in the latest merged PR on GitHub. Regexps used:
This check will be automatically marked as exempt if the CLA check passes, but this one does not.
ID: github_discussions
Projects should enable GitHub discussions in their repositories.
This check passes if:
A discussion that is less than one year old is found on GitHub.
ID: openssf_badge
The Open Source Security Foundation (OpenSSF) Best Practices badge is a way for Free/Libre and Open Source Software (FLOSS) projects to show that they follow best practices.
This check passes if:
An OpenSSF (CII) badge is found in the repository’s README file. Regexps used:
ID: openssf_scorecard_badge
Scorecard assesses open source projects for security risks through a series of automated checks. For more information about the Scorecard badge please see https://github.com/marketplace/actions/ossf-scorecard-action#scorecard-badge.
This check passes if:
An OpenSSF Scorecard badge is found in the repository’s README file. Regexps used:
ID: recent_release
The project should have released at least one version in the last year.
This check passes if:
A release that is less than one year old is found on GitHub.
ID: slack_presence
Projects should have presence in the CNCF Slack or Kubernetes Slack.
This check passes if:
A reference to the CNCF Slack or Kubernetes Slack is found in the repository’s README file. Regexps used:
ID: binary_artifacts
This check determines whether the project has generated executable (binary) artifacts in the source repository. For more details, see the check documentation.
ID: code_review
This check determines whether the project requires code review before pull requests (merge requests) are merged. For more details, see the check documentation.
ID: dangerous_workflow
This check determines whether the project’s GitHub Action workflows have dangerous code patterns. For more details, see the check documentation.
ID: dependency_update_tool
This check tries to determine if the project uses a dependency update tool, specifically dependabot or renovatebot. For more details, see the check documentation.
ID: maintained
This check determines whether the project is actively maintained. For more details, see the check documentation.
ID: sbom
List of components in a piece of software, including licenses, versions, etc.
This check passes if:
The latest release on GitHub includes an asset whose name contains sbom. Regexps used:
The repository’s README file contains an SBOM section that explains where the SBOMs are published, the format used, etc. Regexps used to locate the title header:
ID: security_policy
Documented security processes explaining how to report security issues to the project.
This check passes if:
A security policy file is found in the repository. Globs used:
A security policy reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
A security policy file is found in the default community health files repository.
ID: signed_releases
This check tries to determine if the project cryptographically signs release artifacts. For more details, see the check documentation.
ID: token_permissions
This check determines whether the project’s automated workflows tokens are set to read-only by default. For more details, see the check documentation.
ID: adopters
List of organizations using this project in production or at stages of testing.
This check passes if:
An adopters file is found in the repository. Globs used:
An adopters reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
ID: changelog
A curated, chronologically ordered list of notable changes for each version.
This check passes if:
A changelog file is found in the repository. Globs used:
A changelog reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
A changelog reference is found in the last GitHub release content body. Regexps used:
ID: code_of_conduct
Adopt a code of conduct to establish community standards, promote an inclusive and welcoming initiative, and outline procedures for handling abuse.
This check passes if:
A code of conduct file is found in the repository. Globs used:
A code of conduct reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
A code of conduct file is found in the default community health files repository.
ID: contributing
A contributing file in your repository provides potential project contributors with a short guide to how they can help with your project.
This check passes if:
A contributing file is found in the repository. Globs used:
A contributing reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
A contributing file is found in the default community health files repository.
ID: governance
Document that explains how the governance and committer process works in the repository.
This check passes if:
A governance file is found in the repository. Globs used:
A governance reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
ID: maintainers
The maintainers file contains a list of the current maintainers of the repository.
This check passes if:
A maintainers file is found in the repository. Globs used:
A maintainers reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
ID: readme
The readme file introduces and explains a project. It contains information that is commonly required to understand what the project is about.
This check passes if:
A readme file is found in the repository. Globs used:
ID: roadmap
Defines a high-level overview of the project’s goals and deliverables ideally presented on a timeline.
This check passes if:
A roadmap file is found in the repository. Globs used:
A roadmap reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
ID: summary_table
The Projects Summary Table is a CNCF Business Value Subcommittee initiative to supplement the CNCF Landscape and include further information about CNCF projects for the wider Cloud Native community.
This check passes if:
At least one of the summary_* fields has been set in the project's extra section in the Landscape YAML file.
ID: website
A url that users can visit to learn more about your project.
This check passes if:
A website url is configured in the GitHub repository.
Calculating a global best practice score for an open-source project involves evaluating various aspects of the project against predefined best practices and assigning weights to those aspects based on their importance. Let's understand this with a sample example.
Define a set of best practices that are important for the success and quality of the open-source project. Each category should have a set of criteria that can be evaluated.
Assign weights to each category based on their relative importance. These weights should add up to 100%. The weights reflect how much each category contributes to the overall quality of the project.
Evaluate Each Criterion
For each criterion within a category, evaluate the project and assign a score.
Use a numerical scale (0–10) or any other suitable scale.
Code of conduct: 8
Governance: 9
Maintainer: 8
Website: 7
Analytics: 9
GitHub Discussion: 10
Community meetings: 8
Binary Artifacts: 8
Dangerous Workflow: 9
Approved Licenses: 9
Each category score is calculated as the average criterion score multiplied by the category weight.
Documentation: ((8+9+8+7)/4) * 0.40 = 3.2
Standards: ((9+10+8)/3) * 0.30 = 2.7
Security: ((8+9)/2) * 0.20 = 1.7
Legal: 9 * 0.10 = 0.9
Calculate Global Score
Sum up the category scores to obtain the global score for the best practice score of the open-source project.
Documentation + Standards + Security + Legal = 3.2 + 2.7 + 1.7 + 0.9 = 8.5
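The arithmetic above can be reproduced with a short script; the category names, criterion scores, and weights come straight from the example:

```python
# Worked example from the text: category averages times their weights.
weights = {"Documentation": 0.40, "Standards": 0.30, "Security": 0.20, "Legal": 0.10}
scores = {
    "Documentation": [8, 9, 8, 7],  # code of conduct, governance, maintainer, website
    "Standards":     [9, 10, 8],    # analytics, GitHub discussions, community meetings
    "Security":      [8, 9],        # binary artifacts, dangerous workflow
    "Legal":         [9],           # approved licenses
}

category_scores = {
    cat: (sum(vals) / len(vals)) * weights[cat] for cat, vals in scores.items()
}
global_score = sum(category_scores.values())  # 3.2 + 2.7 + 1.7 + 0.9 = 8.5
```

On the 0-10 scale used here, the project's global best practice score comes out to 8.5.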
On a regular basis, a number of checks are performed on each repository listed in the database.
Checks are grouped into check sets.
One or more check sets are applied to a single repository, and each check set determines the set of checks that will be run on that repository.
The check’s file must declare the following information:
ID
: check identifier.
WEIGHT
: weight of this check, used to calculate scores.
CHECK_SETS
: check sets this new check belongs to.
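The declaration above can be sketched as a small metadata record. The concrete values below (the weight and the check set names) are illustrative assumptions, not the project's actual configuration:

```python
# Hypothetical sketch of the metadata a check file declares; the field
# names mirror the ones listed above, while the values are made up.
CHECK = {
    "ID": "security_policy",              # check identifier
    "WEIGHT": 2,                          # weight used to calculate scores
    "CHECK_SETS": ["code", "community"],  # check sets this check belongs to
}
```

A repository assigned the "community" check set would then have this check run against it, with its result weighted accordingly in the score.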
The data visualization on the overview page shows real-time data on the total number of contributors and the total number of active contributors across all monitored repositories during the selected time period.
On the Overview page, select the project and repositories (1) for which you want to see the data.
Select the specific time period using the filter option (2).
The high-level tile (3) shows you the total unique contributors (calculated based on their member ID) for the selected time range.
The detailed analysis chart shows you the active contributors and the cumulative count of total contributors for the selected period. On the left side, the chart shows the chart trend summary (4).
Hover over the chart (5) to see the number of active contributors and the total contributors for the selected month.
This interactive download feature (6) enables you to download the chart in CSV and PNG file formats.
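A minimal sketch of how the active and total contributor counts could be derived, assuming hypothetical (member ID, month) contribution records rather than the tool's real schema:

```python
def contributor_counts(events, month):
    """Active vs. cumulative contributors, keyed by a hypothetical member ID.

    `events` is a list of (member_id, "YYYY-MM") contribution records.
    "Active" counts members who contributed in `month`; "total" counts the
    unique members seen up to and including that month.
    """
    active = {mid for mid, m in events if m == month}
    total = {mid for mid, m in events if m <= month}
    return len(active), len(total)
```

Because contributors are deduplicated by member ID, someone contributing in several months still counts once in the cumulative total.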
When you want to see the health of your open source project, the Contributor Chart is a crucial project performance indicator.
Visualizing the number of contributors over time makes it easier to identify trends, patterns, and overall community interest. The trend helps project maintainers and other stakeholders act based on the charts.
Tracking the number of contributors can provide insights into the health and vitality of your project.
By analyzing changes in the contributor count, project managers can gain insights into the effectiveness of their community outreach and development strategies.
Contributor Dependency measures and analyzes the dependencies or relationships between different contributors within a project. It explores how contributors rely on each other, collaborate, and interact in terms of code contributions, reviews, and other collaborative activities.
Contributor dependency shows the relationship between contributors or entities within a project, where the actions or outputs of one contributor depend on the inputs or outputs of another.
This chart displays individual identities, not merged contributors. Even if certain identities are combined into one contributor in CM, they still appear as separate entities in the Insights leaderboard charts. This distinction exists because merging contributors combines the contributions of a single individual working under different accounts or identities. The separation of identities in Insights is maintained for privacy reasons and GDPR compliance.
Each activity is considered a separate contribution. For example, opening a PR and closing a PR are counted as two distinct contributions.
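The counting rule above can be sketched with a simple tally. The `(identity, activity_type)` record shape is a hypothetical stand-in for the tool's real data model:

```python
from collections import Counter

def rank_contributions(activities):
    """Rank identities by activity count.

    Each activity counts separately, so opening a PR and later closing it
    yields two contributions for the same identity.
    """
    return Counter(identity for identity, _type in activities).most_common()
```

For example, an identity that opens and closes one PR outranks an identity that only opened one issue.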
Collaboration: It identifies which contributors frequently interact, exchange ideas, review each other's work, and collaborate on code changes.
Knowledge Sharing and Expertise: Understanding these dependencies can help project maintainers identify subject matter experts, encourage knowledge sharing, and allocate resources effectively.
Project Health and Sustainability: By analyzing Contributor Dependency, project maintainers can evaluate the health and sustainability of the project. Dependencies that are concentrated around a few contributors may pose risks if those contributors become less active or leave the project.
The Organization Leaderboard ranks organizations based on their contributions to the project. The leaderboard provides insights into organizations' collective efforts to drive your project's success and growth.
It helps you determine if your project has a healthy contribution from multiple organizations and if new organizations are coming to contribute to the project.
This chart displays individual identities, not merged contributors. Even if certain identities are combined into one contributor in CM, they still appear as separate entities in the Insights leaderboard charts. This distinction exists because merging contributors combines the contributions of a single individual working under different accounts or identities. The separation of identities in Insights is maintained for privacy reasons and GDPR compliance.
Each activity is considered a separate contribution. For example, opening a PR and closing a PR are counted as two distinct contributions.
Recognition: The Organization Leaderboard recognizes and showcases the contributions made by various organizations.
Project Sustainability: The Organization Leaderboard evaluates the involvement of organizations and assesses the project's long-term sustainability and growth potential.
Trust and Credibility: When organizations are actively engaged in your projects and their contributions are recognized through the leaderboard, it enhances the overall trust and credibility of the project.
The Pull Requests Metric measures and analyzes the three key activities related to pull requests:
Pull requests opened
Pull requests closed
Pull requests merged
Pull requests are a mechanism for proposing changes to a codebase, allowing developers to collaborate, review, and merge code changes into the project.
Analyzing the high-level tile (1) representing unique pull requests (opened, closed, and merged) provides valuable insights into the health of the codebase.
The detailed chart displays data related to pull requests opened, closed-unmerged, closed-merged, and the total cumulative pull requests over the selected time period. On the left side, the chart shows the chart trend summary (4).
Collaboration and Code Review: It provides insights into the active participation of developers and the effectiveness of the code review process. If the number of pull requests opened is high, you can complement this data with other pull request metrics, such as Time to First Review and Pull Request Cycle Time, to find out why many pull requests remain open without being acted upon, closed, or merged.
Community Engagement: A higher number of pull requests indicates an engaged community that actively contributes to the project.
Quality and Maintenance: By analyzing the number of pull requests opened, closed, and merged, you can assess the health of the codebase, identify areas that need attention, and ensure timely reviews and merging of contributions.
ID: license_approved
Whether the repository uses an approved license or not.
This check passes if:
The license identified matches any of the following:
ID: license_scanning
License scanning software scans and automatically identifies, manages, and addresses open source licensing issues.
This check passes if:
A FOSSA or Snyk link is found in the repository’s README file. Regexps used:
ID: Apache_2.0
A permissive license whose main conditions require preserving copyright and license notices. Contributors provide an express grant of patent rights. Licensed works, modifications, and larger works may be distributed under different terms and without source code.
ID: trademark_disclaimer
Project sites should have the Linux Foundation trademark disclaimer.
This check passes if:
The Linux Foundation trademark disclaimer is found in the content of the website configured in GitHub. Regexps used:
A best practice score visualization is a tool that helps project leads and managers assess the overall health and quality of an open source software project.
It typically evaluates the project against a set of best practices or standards for software development, such as the categories Documentation, Standards, Security, and Legal.
It generates a score or rating based on how well the project meets these criteria.
On the Overview page, select the project and repositories for which you want to see the best practice score.
Select the specific time period using the filter option.
Scroll down to find the best practice score dashboard.
You can see the aggregated score (3) and each category's score on the dashboard.
Click the Download icon to download the dashboard.
Click on any category to see the expanded page where you can see the detailed analysis for each repository.
Click the Create Issue button to create an issue for each repository.
The Active Days metric measures the number of days a contributor has made at least one contribution to a project. It counts the number of days on which a contributor has been actively engaged in the project's development.
The Active Days chart also displays two bars, one for the current period and one for the previous period, allowing you to compare them.
The Active Days dashboard provides you with the following insights:
You can monitor progress and identify trends. This information can be used to set goals and benchmarks for the project and measure success.
The active days metric provides a quick snapshot of the project's activity level. It helps determine whether the project is actively maintained or not.
The visualization can be used to quickly assess the activity level of a repository. A repository with a high number of active days is likely to be more active and healthy than one with a low number of active days.
By highlighting the importance of active days, project managers can encourage new contributors to become more involved in the project.
The Organization Dependency Metric shows how much a project's contributions depend on or are associated with different organizations.
With Organization Dependency Metrics, you can assess which organizations are significantly contributing to your project.
Each activity is considered a separate contribution. For example, opening a PR and closing a PR are counted as two distinct contributions.
Engagement Assessment: For organizations involved in the project, this metric helps assess their level of engagement and impact. It can encourage healthy competition among contributors, resulting in greater involvement.
Risk Management: Dependency on a single organization for contributions can be risky. If that organization reduces its involvement, the project might face challenges.
Insights incorporates a Geographical Distribution metric to provide insights into the locations from which contributions originate.
Geographical Distribution analyzes and visualizes the contributions made by contributors across different regions around the world. It provides a breakdown of the top regions based on the total number of contributors, providing a clear understanding of the project's global engagement and scope.
Hover over the chart to view the number of contributors for each region during the selected period. This information provides a more granular view of contributor activity within specific regions.
Global Impact: Geographical Distribution allows you to assess the global impact of the open source project by providing insights into the regions where contributions are coming from.
Regional Comparison: Compare the contribution numbers across different regions to identify any notable variations. Assess whether certain regions show consistent contribution levels or if there are fluctuations that require further investigation.
Top Contributing Regions: It helps to identify the top five regions based on the total number of contributions. These regions represent areas where the project has significant engagement and impact.
The metric provides insights into the technical contribution breakdown across weekdays and weekends. It shows when most contributions happen so that you can plan for maximum participation in the project. Each day is categorized by its level of activity, from low to high contribution levels.
Commits are recorded in the individual contributors' local time zone.
Only commit data is used for this dashboard. Each Commit is counted only once in this metric.
Commits: authored commit
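Since commits are recorded in each contributor's local time zone, the bucketing behind this chart can be sketched as follows; the ISO-8601 timestamp input format is an assumption for illustration:

```python
from collections import Counter
from datetime import datetime

def weekday_distribution(commit_times):
    """Bucket authored commits by the contributor's local weekday.

    `commit_times` is a hypothetical list of ISO-8601 timestamps that
    already carry the author's local offset, e.g. "2024-03-04T09:30:00+05:30".
    The offset is kept as-is, so no conversion to UTC happens.
    """
    counts = Counter()
    for ts in commit_times:
        local = datetime.fromisoformat(ts)   # parses the offset, keeps local time
        counts[local.strftime("%A")] += 1    # weekday name, e.g. "Monday"
    return counts
```

The same idea extends to hour-of-day buckets by keying on `local.hour` instead of the weekday name.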
Activity Level Assessment: Work Time Distribution allows you to assess the technical activities across different days of the week. By analyzing the chart, project managers can identify contribution patterns and trends, such as peak activity days or days with lower participation.
Productivity Monitoring: Work Time Distribution helps you to monitor contributors' productivity and engagement. By analyzing the breakdown of contributions, you can identify periods of high productivity and low engagement.
Work Optimization: By understanding the distribution of contributions across weekdays and weekends, project managers can identify potential collaboration challenges due to varying availability.
Weekday vs. Weekend Contributions: Compare the contribution levels between weekdays and weekends. Assess significant differences in activity, and identify any patterns or preferences in contributor engagement during these periods.
Maximum Participation: As an Executive Director or Maintainer, when you want to set up a community call for your project, you can view the times when most contributions happen so that you can schedule it for maximum participation.