Date: February 29, 2024
Insights V3 is an open-source analytical tool that provides insights from analyzing open source software (OSS) projects.
Insights V3 helps project leads and technical managers understand their team members' engagement and participation in open source projects, and identify the most active and productive contributors.
No new features have been added in this release.
UI Enhancements
Text changes:
Renamed the column on organization-related tables from 'Name' to 'Organization'.
Renamed the column in the Foundation overview -> Project Velocity table from 'Authors' to 'Contributors'.
Changed the card description for the Contributions Outside Work Hours component.
Fixed column alignment issues on leaderboard tables.
Enabled filtering of the table in the Project Velocity section on the Foundation overview page.
Fixed issues with downloading PNGs from the Reports -> Activities page components.
Fixed the issue with French Polynesia not showing up in the Geographical Distribution metric on the Activities page.
Some of the projects are still onboarding, so the data may not be correct.
The date range filter in the dashboard module may not always display the correct data when selecting custom date ranges.
You may encounter occasional inconsistencies in data synchronization between Insights V3 and external data sources, leading to discrepancies in reporting.
Insights is an open source analytical tool that provides insights from analyzing open source software (OSS) projects.
Insights helps project leads and technical managers understand their team members' engagement and participation in open source projects, and identify the most active and productive contributors.
The tool has many features that can be very useful in several ways, such as:
Data Visualization and Reporting: Insights will use data visualization techniques to make it easier to understand the metrics and reports generated by the software.
Code Analysis: Insights analyzes the project's codebase for metrics such as code complexity, and code quality that help project leads and managers identify potential areas for improvement.
Development metrics: Metrics such as commit frequency, pull request acceptance rates, and time-to-resolution for issues can provide insights into the project's development process and help project leads and managers identify areas for improvement.
Integration with other tools: Integration with other tools commonly used in software development, such as Git, GitHub, or Jira, can provide a more complete view of the project's development and make it easier for contributors and managers to track progress.
Customization and flexibility: Allowing users to customize the analytics tool to fit their specific needs and workflows can increase its usefulness and adoption.
Insights is the perfect tool for you if you:
Want to track the performance of open source projects in real time.
Want to analyze data quickly.
Are looking for an online reporting tool.
Want to download reports in CSV or other formats.
Want to compare reports for a selected time period.
Want to measure the project's growth and the team's performance.
Want to track historical data to identify trends and patterns.
Want to detect potential issues early and take corrective actions.
Date: April 08, 2024
Insights is an open-source analytical tool that provides insights from analyzing open source software (OSS) projects.
Insights helps project leads and technical managers understand their team members' engagement and participation in open source projects, and identify the most active and productive contributors.
No new features have been added in this release.
Backend Performance: Strengthened backend performance for Organization Dependency, Contributor Dependency, and Leaderboard tables on the Overview page by integrating DBT Platinum models, significantly improving data processing times.
UI Enhancements: Initiated a clean-up of the plugin's interface to streamline user experience.
Project Metrics: Enhanced the efficiency of fetching key project metrics data, ensuring more accurate and timely insights.
UI Enhancements
Active Days Chart Width and Sizing Adjustments: Modified to align with the sizing of other bar charts for visual consistency.
New Sidebar Navigation: Implemented an enhanced sidebar navigation system for improved user experience.
Progress Bar Styling Updates: Revised progress bar designs to match our latest stylistic preferences.
UI Card Alignment: Updated user interface cards to adhere to the new LFX Style Guide, ensuring a cohesive look and feel.
Resolved issues with exporting PNG files in the Geographical Distribution section, mailing list components, and velocity charts.
Fixed pagination bugs in the Organization and Contributor Dependency charts.
Resolved issues where the Active Contributors chart wasn't loading in the Reports -> Contributors section.
Fixed an error in the Best Practices flyout sidebar.
Some of the projects are still onboarding, so the data may not be correct.
The date range filter in the dashboard module may not always display the correct data when selecting custom date ranges.
You may encounter occasional inconsistencies in data synchronization between Insights and external data sources, leading to discrepancies in reporting.
Date: February 22, 2024
Insights V3 is an open-source analytical tool that provides insights from analyzing open-source software (OSS) projects.
Insights V3 helps project leads and technical managers understand their team members' engagement and participation in open-source projects, and identify the most active and productive contributors.
No new features have been added in this release.
Optimized Individual Project Card Data Retrieval: The data fetching for individual project cards has been significantly optimized, resulting in quicker load times and an enhanced user experience.
Enhanced Loading Performance for Foundation -> Projects Page: Loading times for the Foundation > Projects page have been drastically reduced through the implementation of advanced loading techniques, ensuring smoother user navigation.
Integration of DBT Models: All data within Insights V3 now utilizes DBT (Data Build Tool) models, offering more scalable and robust data handling capabilities.
Model rendering and performance optimization with Cube Cloud: All dbt models are mapped to cubes and views with defined pre-aggregations in Cube Cloud’s semantic layer, optimizing load times.
Overall System Performance Improvements: A series of system optimizations have been carried out, leading to noticeable improvements in the tool's performance and reliability.
UI Enhancements
Enhanced Responsiveness: We have improved the alignment of cards across various screen sizes for a seamless viewing experience.
Report Filters Redesign: Filters in the Reports section that were previously considered "extra" have now been integrated into the top filter box for easier access.
New Default Period: The default period for viewing data has been updated to the Last 12 months, allowing for more relevant insights.
Color Palette Refresh: The site now follows a new color palette to maintain consistency and enhance visual appeal.
UI Consistency Improvements: Adjustments in alignment, padding, and margins across the platform ensure a unified and more polished look for all cards.
Project Status Indicator: Projects still in the onboarding phase will now be highlighted with a subtle red color for better visibility.
Updated Navigation: Clicking the LFX Logo will now redirect users to the Insights landing page, streamlining navigation.
Clarified Chart Descriptions: The description for the "Contribution outside working hours" chart has been refined for clarity.
Rectified the Best Practices category mismatch issue.
Updated the tooltip data for the Reports -> Active Contributors chart to accurately reflect the Last 10 Years time range.
Ensured the presence of the Hyperledger Foundation logo on the Hyperledger card.
Addressed the rounding-off error in the Software Value calculation.
Fixed the pagination bug on the Foundation -> Projects page.
Eliminated the 100-row loading limit for sub-projects.
Some of the projects are still onboarding, so the data may not be correct.
The date range filter in the dashboard module may not always display the correct data when selecting custom date ranges.
You may encounter occasional inconsistencies in data synchronization between Insights V3 and external data sources, leading to discrepancies in reporting.
The Landing Page provides all the important analytics about your foundations and projects. It is designed to give you a quick overview of your data and help you navigate the tool easily.
This page focuses on the Foundation Cards and the individual Project Cards, which serve as the core navigational elements, presenting the key data metrics.
Select the Projects and the Foundations: The search box at the top of the main menu helps you find a particular project or repository.
Foundation Cards: Foundation Cards are like summary cards that provide key insights into different open source foundations. When you click on one of these cards, you will be redirected to a Foundation Overview dashboard specifically dedicated to that foundation.
When you click on a foundation card that has only one project, you will be redirected to the Project Overview page.
Project Cards: On the main page, you will see the project cards. Each card represents an individual open source project. When you click on a project card, it takes you to a dedicated Overview Page for that project. These cards show you real-time data about each project, such as important numbers and updates.
To use the new Insights user interface, follow these steps:
Visit the Insights web URL. You will be redirected to the Insights home page.
The Insights Dashboard is the default dashboard.
You can see all the foundation and project cards on the main page. Alternatively, search the project or a foundation using the Search Bar.
Beta Version
Welcome to Insights, an open-source project analytics tool that empowers you with valuable data-driven insights.
Disclaimer: Our affiliation data is based on current information and may be approximate. While we aim for accuracy, some data may not be entirely precise. Our team is actively improving this data, but minor inaccuracies might remain due to the complexity of the task.
Important: Insights is currently in the beta phase, which means it is actively being developed and refined to provide you with the best experience possible.
Insights is a pre-release software version made available to users for testing and feedback. This means you may encounter occasional changes to the user interface, features, and functionality.
We encourage you to actively participate in the Insights community by providing feedback, reporting bugs, and suggesting improvements.
Your feedback plays a crucial role in shaping the direction of feature development and prioritization.
The tool fetches data from various sources, including your open-source project repositories, and updates this information regularly.
You may see some delays in real-time data because the tool fetches data from various sources.
Note: The Community Management tool focuses exclusively on publicly available GitHub, Git, or Gerrit repositories. Forks and certain repositories are purposely excluded from monitoring to streamline the data integration process.
Insights has a user-friendly interface that is easy to navigate. The tool is designed to be intuitive, which means that you can quickly learn how to use it and start gaining insights from your data.
The Foundation Overview page in Insights provides a comprehensive snapshot of your open source foundation, enabling you to gain valuable insights into your projects' performance and growth.
Disclaimer: Not all foundations need to follow the same maturity level categorization, so the Foundation Overview page may look different for your foundation.
At the top of the page, you will find the header section, which includes the following elements:
The name of your foundation is displayed prominently at the top of the page, providing clear identification.
This feature allows you to search for specific projects within your foundation, making finding and accessing project information easy. Select a project and go to the project overview page.
At the top, you will see the following four high-level metrics:
Projects: The metric shows the total projects within the foundation.
Contributors: It shows the total number of contributors among all the projects within the foundation.
Lines of Code: Displays the total lines of code written for all the projects within the foundation.
Organizations: The metric shows the total number of organizations that have contributed to the projects within the foundation.
Using this search box, you can select another foundation or a project.
Each activity is considered a separate contribution. For example, opening a PR and closing a PR are counted as two distinct contributions.
Here's the table with more detailed descriptions for each activity type:
| ACTIVITY_CATEGORY | ACTIVITY_TYPE | Description |
| --- | --- | --- |
On the Foundation Overview page, you will find the Project Ecosystem Metrics. This section includes two informative charts.
Project Ecosystem Metrics in an open source foundation represent quantitative measurements that provide insights into the health, growth, and diversity of projects within the foundation's ecosystem. These metrics involve data analysis across various dimensions, such as:
Project Maturity Levels: Categorization of projects based on their developmental stage, community engagement, and stability.
Growth Trends: Analysis of the number of projects being accepted over time, indicating the expansion and attraction of the foundation's ecosystem.
Diversity Indices: Evaluation of the diversity within projects and their communities, assessing the inclusiveness and global reach of the foundation's ecosystem.
Sustainability Indicators: Insights into the long-term viability of projects, including funding, resource allocation, and project continuity plans.
Disclaimer: Not all foundations follow the same maturity level categorization, so the Foundation Overview page may look different for your foundation.
You will see the total number of projects of the foundation as per their maturity level.
The chart enables visualization of growth and acceptance patterns for new projects.
Hovering over the chart reveals the count of projects accepted during specific time frames.
It presents a historical trend of project acceptances into your foundation over time.
Provides analysis of acceptance rates to identify periods of high or low project acceptance.
The subsequent chart illustrates the trend of projects approved by your foundation.
Distribution of projects based on maturity level and rating
The metric categorizes your projects based on their maturity level and rating.
This helps you see how projects are distributed across different maturity levels and ratings, allowing you to make informed decisions about resource allocation and project management.
At each maturity level, projects are further segregated as per rating.
For example: Click on the pie chart under the Incubating Projects card to see the projects' categorization as per the ratings.
Click on a Foundation Card from the Landing Page or search (2) for the foundation using the search box at the top.
Scroll down to see all the listed foundations and projects.
On the Projects page, you can see the project cards of the selected foundation with their project maturity tags.
You can filter the project cards using the Maturity Level, Rating, and Accepted filter options.
On the landing page, the project cards are designed to show you real-time data and key metrics related to each project.
A project card displays the following key metrics:
Key metrics on a project card may vary as per the data sources. Projects with Git data sources will have fewer metrics.
When you click on a project card, it opens up an overview page dedicated to that specific project. This overview page provides more detailed information about the project, such as in-depth analytics, charts, and other relevant data.
The Project Card has the following details:
GitHub Icon: Click the (1) GitHub icon to go to the GitHub repositories of the project.
Aggregated Data: Shows the real-time data of contributions, commits, PRs, issues, stars, and forks for the project.
Info Icon: Shows the date and the time when the Best Practice Score was last updated.
Software Value: The Constructive Cost Model (COCOMO) is a procedural cost estimate model for software projects.
On the landing page, the foundation cards are designed to show you real-time data and key metrics related to the foundation and its projects.
A foundation card displays the following key metrics:
When you click on a foundation card, it opens up a page dedicated to that foundation. Here, you will find more detailed information about the foundation's contributions to the open source community.
A foundation card has the following details:
Click on the GitHub icon to open the GitHub page of the foundation.
Click on the icon to open the foundation's webpage.
Hover over the icon to see the inception year of the foundation.
It shows the key metrics of the foundation.
On the top of each card, you can see the icons of the integrated data source. For example, on the above card, GitHub is an integrated data source.
From the left navigation pane, click the icon to return to the Landing Page.
Download Icon: Click the icon (2) to download the project card.
Download Icon: Click the icon (2) to download the foundation card.
Software Value: The Constructive Cost Model (COCOMO) is a procedural cost estimate model for software projects.
| ACTIVITY_CATEGORY | ACTIVITY_TYPE | Description |
| --- | --- | --- |
| Commit | authored-commit | Commits where the user is the primary author of the code changes. |
| Commit | co-authored-commit | Commits where the user has contributed alongside other authors. |
| Commit | committed-commit | Commits where the user has applied changes on behalf of another author. |
| Issue | issues-closed | Issues that were resolved and closed by the user. |
| Issue | issues-opened | New issues that were created and opened by the user. |
| Patch Set | patchset-created | Collections of changes (patch sets) that were created by the user. |
| Patch Set | patchset_comment-created | Comments made by the user on patch sets. |
| Pull Request | changeset-abandoned | Changesets that were started but later abandoned by the user. |
| Pull Request | changeset-created | New changesets initiated by the user. |
| Pull Request | changeset-merged | Changesets that were successfully merged into the main codebase. |
| Pull Request | pull_request-closed | Pull requests that were closed by the user without merging. |
| Pull Request | pull_request-merged | Pull requests that were reviewed and merged by the user. |
| Pull Request | pull_request-opened | New pull requests that were opened by the user. |
| Review | patchset_approval-created | Approvals given by the user to changes in patch sets. |
| Review | pull_request-comment | Comments added by the user on pull requests. |
| Review | pull_request-review-thread-comment | Threaded comments added by the user in pull request reviews. |
| Review | pull_request-reviewed | Pull requests that were reviewed and provided feedback on by the user. |
| NULL | issue-comment | Comments made by the user on issues. |
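Because each activity record counts as one contribution, aggregate contribution numbers are simple tallies over these records. A minimal sketch of that counting, using hypothetical records shaped like the rows above (the field names are assumptions):

```python
from collections import Counter

# Hypothetical activity records shaped like the table above.
activities = [
    {"category": "Commit", "type": "authored-commit", "member": "alice"},
    {"category": "Pull Request", "type": "pull_request-opened", "member": "alice"},
    {"category": "Pull Request", "type": "pull_request-merged", "member": "bob"},
    {"category": "Issue", "type": "issues-opened", "member": "bob"},
]

# Each record is one contribution, so opening and merging a PR
# count as two distinct contributions.
by_category = Counter(a["category"] for a in activities)
by_member = Counter(a["member"] for a in activities)
print(by_category)  # Counter({'Pull Request': 2, 'Commit': 1, 'Issue': 1})
print(by_member)    # Counter({'alice': 2, 'bob': 2})
```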
The target audience for Insights can vary depending on its specific features and functionalities. Here are some of the key target audience groups that can benefit from this tool:
The Overview page's Key Metrics and Detailed Analysis section includes six cumulative charts highlighting significant trends and patterns within your analytics data.
The cumulative charts help you compare different metrics on single charts. This comparative analysis helps identify relationships and draw meaningful conclusions.
The six cumulative charts give quick snapshots of the analytical data, while the detailed analysis chart helps you perform a deeper analysis.
Constructive Cost Model
The COCOMO (Constructive Cost Model) is a widely used model that estimates the effort, time, and cost associated with software development projects.
The model takes into account factors such as project size, complexity, team experience, and development environment.
The COCOMO model consists of three different levels or modes:
Basic COCOMO: This mode is used for early-stage project estimates and focuses on estimating effort based on lines of code (LOC). It uses a simple formula to calculate the effort required for a project, taking into account the project size in KLOC (thousands of lines of code).
Insights V3 uses the basic model to calculate the software estimates for the selected open source projects.
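For illustration, here is a minimal sketch of the Basic COCOMO calculation. It uses the standard published coefficients for the three classic project types; the constants Insights actually applies are stored in its database and may differ.

```python
# Basic COCOMO effort and schedule estimate (illustrative sketch only).
# Standard published coefficients; the constants Insights stores in its DB may differ.
COEFFICIENTS = {
    # project_type: (a, b, c, d)
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(loc: int, project_type: str = "organic") -> tuple[float, float]:
    """Return (effort in person-months, schedule in months) for a given code size."""
    a, b, c, d = COEFFICIENTS[project_type]
    kloc = loc / 1000              # the model works in thousands of lines of code
    effort = a * kloc ** b         # person-months
    schedule = c * effort ** d     # calendar months
    return effort, schedule

effort, months = basic_cocomo(250_000)  # a 250 KLOC project
print(f"~{effort:.0f} person-months over ~{months:.0f} months")
```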
Constants based on Software Project Types (stored in the DB):
For more information, see:
Within Insights, the "Filter the Date" feature allows you to customize your analytics view based on specific date ranges. This feature provides flexibility and control over the period for which data is displayed.
Follow these steps to utilize the date-filtering feature:
On the right side of the analytics dashboard, locate the "Date Filter" section.
Click on the "Date Filter" section to expand the options.
Choose from the predefined date range options.
Select the desired option by clicking on it.
Click the Bots checkbox to hide the bots' data from the analytics.
To specify a custom date range, click on the Custom option within the date range selection menu.
Select the start and end dates for your custom range on the calendar widget.
The analytics dashboard will automatically update to display data within the selected custom date range.
After selecting a predefined date range or setting a custom date range, click the Apply button to apply the date filter.
The analytics dashboard will refresh to reflect the chosen date range, displaying data only for the selected period.
Click on the Clear Dates button to display data for the entire available range and remove the date filter.
Project velocity in open source projects refers to the rate at which development tasks are completed and features are delivered. It measures the amount of work completed in a specific amount of time.
A higher velocity suggests increased efficiency and progress, while a lower velocity may indicate challenges or bottlenecks.
Monitoring project velocity helps teams assess their performance and plan future tasks accordingly, ensuring steady project advancement.
The Project Velocity chart displays data from the last calendar year.
On the Y-axis, there's a logarithmic scale representing PRs and Issues.
On the X-axis, there's a logarithmic scale representing commits.
The chart visualizes the correlation between code changes and collaboration.
To further understand the project's velocity, create a leaderboard. This ranks projects based on their commit numbers and provides a comparative view of their commits, PRs, and issues. This leaderboard can help in identifying the most active projects at a glance.
Review the top projects based on their commit numbers.
Compare their commit count, PRs, and issues in a single view.
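The leaderboard itself is essentially a sort over per-project totals. A minimal sketch, using hypothetical counts:

```python
# Hypothetical per-project totals for the last calendar year.
projects = [
    {"name": "project-a", "commits": 1200, "prs": 340, "issues": 210},
    {"name": "project-b", "commits": 4800, "prs": 990, "issues": 620},
    {"name": "project-c", "commits": 300,  "prs": 75,  "issues": 40},
]

# Rank projects by commit count, showing PRs and issues side by side.
for rank, p in enumerate(sorted(projects, key=lambda p: p["commits"], reverse=True), 1):
    print(f'{rank}. {p["name"]}: {p["commits"]} commits, {p["prs"]} PRs, {p["issues"]} issues')
```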
The data visualization on the overview page shows real-time data on the number of active contributors and the cumulative total of contributors across all monitored repositories during the selected time period.
On the Overview page, select the project and repositories (1) for which you want to see the data.
Select the specific time period using the filter option (2).
The high-level tile (3) shows you the total unique contributors (calculated based on their member ID) for the selected time range.
The detailed analysis chart shows you the active contributors and the cumulative count of total contributors for the selected period. On the left side, the chart shows the chart trend summary (4).
Hover over the chart (5) to see the number of active contributors and the total contributors for the selected month.
This interactive download feature (6) enables you to download the chart in CSV and PNG file formats.
When you want to see the health of your open source project, the Contributor Chart is a crucial project performance indicator.
Visualizing the number of contributors over time makes it easier to identify trends, patterns, and overall community interest. The trend helps project maintainers and other stakeholders act based on the charts.
Tracking the number of contributors can provide insights into the health and vitality of your project.
By analyzing changes in the contributor count, project managers can gain insights into the effectiveness of their community outreach and development strategies.
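Conceptually, the chart tracks two series: the unique member IDs active in each month, and the cumulative set of all member IDs seen so far. A minimal sketch of that calculation, using hypothetical (month, member ID) pairs:

```python
from collections import defaultdict

# Hypothetical (month, member_id) pairs derived from activity records.
activity = [
    ("2024-01", "m1"), ("2024-01", "m2"),
    ("2024-02", "m2"), ("2024-02", "m3"),
    ("2024-03", "m1"), ("2024-03", "m4"),
]

active_by_month = defaultdict(set)
for month, member in activity:
    active_by_month[month].add(member)

seen: set[str] = set()
for month in sorted(active_by_month):
    seen |= active_by_month[month]
    # Active = unique member IDs this month; total = cumulative unique IDs so far.
    print(month, "active:", len(active_by_month[month]), "total:", len(seen))
```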
The Issue Metric measures the number of issues reported and tracked within a specified period. It compares the number of issues opened, the number of issues closed, and the total commits for the selected time period.
The metric is based on the following activity types:
issues-closed
issues-opened
The analytics tool employs a combined chart (a line chart and bar charts) on its dashboard to analyze the Issue Metric. The line on the chart connects the data points, allowing you to observe trends and patterns over time.
The dashboard shows the issues (opened + closed) in a snapshot and a detailed chart (open, closed, and the total issues).
On the Overview page, select the project and repositories (1) for which you want to see the data.
Select the specific time period using the filter option (2).
The high-level tile (3) shows you the total issues (open + closed) for the selected time range.
The detailed analysis chart shows you the open issues, closed issues, and the count of total issues for the selected period. On the left side, the chart shows the chart trend summary (4).
Hover over the chart (5) to see the open issues, closed issues, and total issues for the selected month.
This interactive download feature (6) enables you to download the chart in CSV and PNG file formats.
Issues Tracking and Management: By visualizing the data on a line chart, it becomes easier to identify the increase or decrease in issue activity, allowing for effective resource allocation and prioritization.
Performance Evaluation: The Issue Metric helps in evaluating the performance of the development team and the project as a whole. Changes in issue count over time indicate improvements in software quality, bug-fixing efficiency, or the impact of development efforts.
Community Engagement: A higher number of reported issues indicates the active participation and involvement of the community in the open source project.
The Star Metric measures and analyzes the number of stars a project receives on a code hosting platform like GitHub.
The metric gives you a real-time data analysis of projects' popularity, community engagement, and overall project visibility.
Stars represent a way for you to bookmark or indicate your interest in and appreciation for a particular project. Each star serves as a measure of the project's popularity.
To analyze the Star Metric, the analytics tool employs a line chart on its dashboard. The line connecting the data points on the chart showcases the trend and changes in the number of stars over time.
When you hover over a specific point on the line chart, detailed information about the number of stars for that particular month within the selected period is displayed.
The metric helps you analyze your project's popularity. A higher number of stars generally suggests a widely recognized and appreciated project, potentially attracting more contributors.
A Contribution Leadership board visualization displays the contributions made by individual contributors to an open source project. It ranks contributors based on the number of code commits, pull requests, issues closed, or other metrics and visually represents their relative activity levels and impact on the project.
This chart displays individual identities; it does not merge contributors as the Community Management (CM) tool does. Even if certain identities are combined into one contributor in CM, they will still appear as separate entities in the Insights V3 leaderboard charts. This distinction exists because merging contributors combines the contributions of a single individual working under different accounts or identities. The separation of identities in Insights is maintained for privacy reasons and GDPR compliance.
Recognition and Motivation: The Contributor Leaderboard recognizes and acknowledges the efforts of individual contributors. It highlights their contributions, encourages ongoing engagement, and motivates contributors to continue their valuable work.
Community Engagement: It creates a sense of community and healthy competition, encouraging collaboration and inspiring others to contribute and improve their ranking on the leaderboard.
Collaboration Opportunities: The leaderboard helps project maintainers and community members identify potential collaborators or subject-matter experts within the project. It will be easier to identify the most active contributors and connect with them.
ID: analytics
Project websites provide some web analytics.
This check passes if:
A Google Analytics 3 (Universal Analytics) Tracking ID is found in the source of the website configured in GitHub. Regexps used:
A Google Analytics 4 Measurement ID is found in the source of the website configured in GitHub. Regexps used:
The HubSpot tracking code is found in the source of the website configured in GitHub. Regexps used:
ID: artifacthub_badge
Projects can list their content on Artifact Hub to improve their discoverability.
This check passes if:
An Artifact Hub badge is found in the repository’s README file. Regexps used:
ID: cla
The CLA defines the conditions under which intellectual property is contributed to a business or project.
This check passes if:
A CLA check is found in the latest merged PR on GitHub. Regexps used:
This check will be automatically marked as exempt if the DCO check passes but this one does not.
ID: community_meeting
Community meetings are often held to engage community members, hear more voices, and get more viewpoints.
This check passes if:
A reference to the community meeting is found in the repository’s README file. Regexps used:
ID: dco
Mechanism for contributors to certify that they wrote or have the right to submit the code they are contributing.
This check passes if:
The last commits in the repository have the DCO signature (Signed-off-by). Merge pull request and merge branch commits are ignored for this check.
A DCO check is found in the latest merged PR on GitHub. Regexps used:
This check will be automatically marked as exempt if the CLA check passes but this one does not.
ID: github_discussions
Projects should enable GitHub discussions in their repositories.
This check passes if:
A discussion that is less than one year old is found on GitHub.
ID: openssf_badge
The Open Source Security Foundation (OpenSSF) Best Practices badge is a way for Free/Libre and Open Source Software (FLOSS) projects to show that they follow best practices.
This check passes if:
An OpenSSF (CII) badge is found in the repository’s README file. Regexps used:
ID: openssf_scorecard_badge
Scorecard assesses open source projects for security risks through a series of automated checks. For more information about the Scorecard badge please see https://github.com/marketplace/actions/ossf-scorecard-action#scorecard-badge.
This check passes if:
An OpenSSF Scorecard badge is found in the repository’s README file. Regexps used:
ID: recent_release
The project should have released at least one version in the last year.
This check passes if:
A release that is less than one year old is found on GitHub.
ID: slack_presence
Projects should have presence in the CNCF Slack or Kubernetes Slack.
This check passes if:
A reference to the CNCF Slack or Kubernetes Slack is found in the repository’s README file. Regexps used:
ID: binary_artifacts
This check determines whether the project has generated executable (binary) artifacts in the source repository. For more details, see the check documentation.
ID: code_review
This check determines whether the project requires code review before pull requests (merge requests) are merged. For more details, see the check documentation.
ID: dangerous_workflow
This check determines whether the project’s GitHub Action workflows have dangerous code patterns. For more details, see the check documentation.
ID: dependency_update_tool
This check tries to determine if the project uses a dependency update tool, specifically dependabot or renovatebot. For more details, see the check documentation.
ID: maintained
This check determines whether the project is actively maintained. For more details, see the check documentation.
ID: sbom
List of components in a piece of software, including licenses, versions, etc.
This check passes if:
The latest release on GitHub includes an asset whose name contains sbom. Regexps used:
The repository’s README file contains an SBOM section that explains where they are published to, the format used, etc. Regexps used to locate the title header:
ID: security_policy
Documented security processes explaining how to report security issues to the project.
This check passes if:
A security policy file is found in the repository. Globs used:
A security policy reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
A security policy file is found in the default community health files repository.
ID: signed_releases
This check tries to determine if the project cryptographically signs release artifacts. For more details, see the check documentation.
ID: token_permissions
This check determines whether the project’s automated workflows tokens are set to read-only by default. For more details, see the check documentation.
Calculating a global score for the best practice score of an open-source project involves evaluating various aspects of the project against predefined best practices and assigning weights to those aspects based on their importance. Let's understand this with a sample example.
Define the following set of best practices that are important for the success and quality of the open-source project. Each category should have a set of criteria that can be evaluated.
Assign weights to each category based on their relative importance. These weights should add up to 100%. The weights reflect how much each category contributes to the overall quality of the project.
Evaluate Each Criterion
For each criterion within a category, evaluate the project and assign a score.
Use a numerical scale (0–10) or any other suitable scale.
Code of conduct: 8
Governance: 9
Maintainer: 8
Website: 7
Analytics: 9
GitHub Discussion: 10
Community meetings: 8
Binary Artifacts: 8
Dangerous Workflow: 9
Approved Licenses: 9
Each category score is calculated as the average of its criterion scores multiplied by the category weight:
Documentation: ((8 + 9 + 8 + 7) / 4) × 0.40 = 3.2
Standards: ((9 + 10 + 8) / 3) × 0.30 = 2.7
Security: ((8 + 9) / 2) × 0.20 = 1.7
Legal: 9 × 0.10 = 0.9
Calculate Global Score
Sum up the category scores to obtain the global score for the best practice score of the open-source project.
Documentation + Standards + Security + Legal = 3.2 + 2.7 + 1.7 + 0.9 = 8.5
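The same worked example expressed as a short sketch (the scores and weights are taken from above):

```python
# Criterion scores (0-10) grouped by category, from the example above.
scores = {
    "Documentation": [8, 9, 8, 7],  # code of conduct, governance, maintainer, website
    "Standards":     [9, 10, 8],    # analytics, GitHub discussions, community meetings
    "Security":      [8, 9],        # binary artifacts, dangerous workflow
    "Legal":         [9],           # approved licenses
}
weights = {"Documentation": 0.40, "Standards": 0.30, "Security": 0.20, "Legal": 0.10}

# Category score = average criterion score x category weight.
category_scores = {
    cat: round(sum(vals) / len(vals) * weights[cat], 2) for cat, vals in scores.items()
}
global_score = sum(category_scores.values())
print(category_scores)         # {'Documentation': 3.2, 'Standards': 2.7, 'Security': 1.7, 'Legal': 0.9}
print(round(global_score, 2))  # 8.5 on a 0-10 scale
```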
The Fork Metric measures and analyzes the number of times a project has been forked by other developers.
Forking is the process of creating a copy of a project's source code repository to either modify and enhance the project or use it as a starting point for a new project.
The bar chart on the dashboard represents the analysis, displaying the number of forks over time. Hover over a specific bar to access the detailed fork information for that particular month within the selected period.
The interactive download feature (Icon) enables you to download the chart in CSV and PNG file formats.
How popular is the project? The Fork Metric provides insights into the popularity of your project. A higher number of forks generally indicates that developers find your project useful and valuable enough to build on it or adapt it to their specific needs.
Code Reuse: By analyzing the Fork Metric, you can get data on code reuse and identify potential opportunities for improvement.
Community Engagement: A growing number of forks indicates an active and involved community, contributing to the project's growth.
Project Evolution: By monitoring forks over time, you can identify significant milestones.
The Commits metric refers to the analysis of contributors' code commits within a specified timeframe. A code commit represents a unit of change to the software's source code repository.
The metric is based on the following activity type:
committed-commit ("Default Branch" only)
In this chart, only commits are counted, not the Roles. Each commit with a unique Commit SHA is counted as one Commit. The roles do not matter here.
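In code terms, counting commits reduces to counting unique SHAs. A minimal sketch, using hypothetical records:

```python
# Hypothetical commit activity records; one commit can appear under several roles.
records = [
    {"sha": "a1b2c3", "role": "authored-commit"},
    {"sha": "a1b2c3", "role": "committed-commit"},   # same commit, different role
    {"sha": "d4e5f6", "role": "co-authored-commit"},
]

# Count each unique Commit SHA once; roles are ignored for this chart.
total_commits = len({r["sha"] for r in records})
print(total_commits)  # 2
```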
The dashboard shows the commits snapshot and a detailed chart. The detailed chart is a combined chart (line chart and bar chart) that shows new commits vs. total commits.
On the Overview page, select the project and repositories (1) for which you want to see the data.
Select the specific period using the filter option (2).
The high-level tile (3) shows you the total commits for the selected time range.
The detailed analysis chart shows you the New commits and the cumulative count of total commits for the selected period. On the left side, the chart shows the chart trend summary (4).
Hover over the chart (5) to see the new commits and the total commits for the selected month.
This interactive download feature (6) enables you to download the chart in CSV and PNG file formats.
The metric enables project maintainers and stakeholders to gain valuable insights into code changes and progress within a specified period.
It provides insights into the volume and frequency of code changes made by contributors. By visualizing commit data in a bar chart, you can track the progress of development efforts over time.
Changes in commit counts provide periods of intense development, periods of slower activity, or the impact of specific events or milestones on the project.
The overview page should provide a high-level summary of the project's activity, contributors, and performance metrics, including:
The number of contributors and their distribution by location or organization.
The total number of commits, pull requests, and issues.
The average time to resolve issues and merge pull requests.
The overall health of the codebase, including code quality and security vulnerabilities.
The level of community engagement, such as the number of comments on pull requests.
The analytics tools on the overview page provide a range of features and visualizations that can help you gain insights into the project's performance, identify areas for improvement, and make informed decisions about development and collaboration.
The primary data sources for Insights V3 are the code repositories and the publicly available GitHub and Git databases. Refer to Integrations to learn more about data connectors.
Contributor Dependency measures and analyzes the dependencies or relationships between different contributors within a project. It explores how contributors rely on each other, collaborate, and interact in terms of code contributions, reviews, and other collaborative activities.
Contributor dependency shows the relationship between contributors or entities within a project, where the actions or outputs of one contributor depend on the inputs or outputs of another.
This chart displays individual identities; it does not merge contributors as the Community Management (CM) tool does. Even if certain identities are combined into one contributor in CM, they will still appear as separate entities in the Insights leaderboard charts. This distinction exists because merging contributors combines the contributions of a single individual working under different accounts or identities. The separation of identities in Insights is maintained for privacy reasons and GDPR compliance.
Each activity is considered a separate contribution. For example, opening a PR and closing a PR are counted as two distinct contributions.
Collaboration: It identifies which contributors frequently interact, exchange ideas, review each other's work, and collaborate on code changes.
Knowledge Sharing and Expertise: Understanding these dependencies can help project maintainers identify subject matter experts, encourage knowledge sharing, and allocate resources effectively.
Project Health and Sustainability: By analyzing Contributor Dependency, project maintainers can evaluate the health and sustainability of the project. Dependencies that are concentrated around a few contributors may pose risks if those contributors become less active or leave the project.
On a regular basis, a number of checks are performed on each repository listed in the database.
Checks are grouped into check sets.
One or more check sets are applied to a single repository, and each check set specifies which checks will be performed on the repository.
The check’s file must declare the following information:
ID: check identifier.
WEIGHT: weight of this check, used to calculate scores.
CHECK_SETS: check sets this new check belongs to.
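Purely as an illustration, a declaration carrying those three fields might look like the sketch below; the structure and the example values are assumptions, and only the field names come from the list above.

```python
# Hypothetical declaration of a new check; only the three field names
# (ID, WEIGHT, CHECK_SETS) come from the documentation above -- the shape
# and the example values are assumptions for illustration.
CHECK = {
    "ID": "security_policy",              # check identifier
    "WEIGHT": 5,                          # weight used to calculate scores
    "CHECK_SETS": ["code", "community"],  # check sets this check belongs to
}
```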
The Pull Requests Metric measures and analyzes the three key activities related to pull requests:
Pull requests opened
Pull requests closed
Pull requests merged
Pull requests are a mechanism for proposing changes to a codebase, allowing developers to collaborate, review, and merge code changes into the project.
Analyzing the high-level tile (1) representing unique pull requests (opened, closed, and merged) provides valuable insights into the health of the codebase.
The detailed chart displays data related to pull requests opened, closed-unmerged, closed-merged, and the total cumulative pull requests over the selected time period. On the left side, the chart shows the chart trend summary (4).
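The chart's series follow directly from each pull request's state. A minimal sketch of that classification, assuming hypothetical state and merged fields on each record:

```python
def classify(pr: dict) -> str:
    """Map a pull request record onto the chart's series (hypothetical fields)."""
    if pr["state"] == "open":
        return "opened"
    return "closed-merged" if pr["merged"] else "closed-unmerged"

prs = [
    {"id": 1, "state": "open",   "merged": False},
    {"id": 2, "state": "closed", "merged": True},
    {"id": 3, "state": "closed", "merged": False},
]
for pr in prs:
    print(pr["id"], classify(pr))  # 1 opened / 2 closed-merged / 3 closed-unmerged
```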
Collaboration and Code Review: It provides insights into the active participation of developers and the effectiveness of the code review process. If the number of pull requests opened is high, you can complement this data with other pull request metrics, such as Time to First Review and Pull Request Cycle Time, to find out why many pull requests remain open without being acted upon, closed, or merged.
Community Engagement: A higher number of pull requests indicates an engaged community that actively contributes to the project.
Quality and Maintenance: By analyzing the number of pull requests opened, closed, and merged, you can assess the health of the codebase, identify areas that need attention, and ensure timely reviews and merging of contributions.
The Organization Leaderboard ranks organizations based on their contributions to the project. The leaderboard provides insights into organizations' collective efforts to drive your projects' success and growth.
It helps you determine if your project has a healthy contribution from multiple organizations and if new organizations are coming to contribute to the project.
This chart displays individual identities; it does not merge contributors as the Community Management (CM) tool does. Even if certain identities are combined into one contributor in CM, they will still appear as separate entities in the Insights leaderboard charts. This distinction exists because merging contributors combines the contributions of a single individual working under different accounts or identities. The separation of identities in Insights is maintained for privacy reasons and GDPR compliance.
Each activity is considered a separate contribution. For example, opening a PR and closing a PR are counted as two distinct contributions.
Recognition: The Organization Leaderboard recognizes and showcases the contributions made by various organizations.
Project Sustainability: The Organization Leaderboard evaluates the involvement of organizations and assesses the project's long-term sustainability and growth potential.
Trust and Credibility: When organizations are actively engaged in your projects and their contributions are recognized through the leaderboard, it enhances the overall trust and credibility of the project.
ID: adopters
List of organizations using this project in production or at stages of testing.
This check passes if:
An adopters file is found in the repository. Globs used:
An adopters reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
ID: changelog
A curated, chronologically ordered list of notable changes for each version.
This check passes if:
A changelog file is found in the repository. Globs used:
A changelog reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
A changelog reference is found in the last GitHub release content body. Regexps used:
ID: code_of_conduct
Adopt a code of conduct to establish community standards, promote an inclusive and welcoming initiative, and outline procedures for handling abuse.
This check passes if:
A code of conduct file is found in the repository. Globs used:
A code of conduct reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
ID: contributing
A contributing file in your repository provides potential project contributors with a short guide to how they can help with your project.
This check passes if:
A contributing file is found in the repository. Globs used:
A contributing reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
ID: governance
Document that explains how the governance and committer process works in the repository.
This check passes if:
A governance file is found in the repository. Globs used:
A governance reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
ID: maintainers
The maintainers file contains a list of the current maintainers of the repository.
This check passes if:
A maintainers file is found in the repository. Globs used:
A maintainers reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
ID: readme
The readme file introduces and explains a project. It contains information that is commonly required to understand what the project is about.
This check passes if:
A readme file is found in the repository. Globs used:
ID: roadmap
Defines a high-level overview of the project’s goals and deliverables ideally presented on a timeline.
This check passes if:
A roadmap file is found in the repository. Globs used:
A roadmap reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
ID: summary_table
The Projects Summary Table is a CNCF Business Value Subcommittee initiative to supplement the CNCF Landscape and include further information about CNCF projects for the wider Cloud Native community.
This check passes if:
ID: website
A URL that users can visit to learn more about your project.
This check passes if:
A website URL is configured in the GitHub repository.
ID: license_approved
Whether the repository uses an approved license or not.
This check passes if:
The license identified matches any of the following:
ID: license_scanning
License scanning software scans and automatically identifies, manages, and addresses open source licensing issues.
This check passes if:
A FOSSA or Snyk link is found in the repository’s README file. Regexps used:
ID: Apache_2.0
A permissive license whose main conditions require preserving copyright and license notices. Contributors provide an express grant of patent rights. Licensed works, modifications, and larger works may be distributed under different terms and without source code.
ID: trademark_disclaimer
Project sites should have the Linux Foundation trademark disclaimer.
This check passes if:
The Linux Foundation trademark disclaimer is found in the content of the website configured in GitHub. Regexps used:
A best practice score visualization is a tool that helps project leads and managers assess the overall health and quality of an open source software project.
It typically evaluates the project against a set of best practices or standards for software development, such as the categories Documentation, Standards, Security, and Legal.
It generates a score or rating based on how well the project meets these criteria.
On the Overview page, select the project and repositories for which you want to see the best practice score.
Select the specific time period using the filter option.
Scroll down to find the best practice score dashboard.
You can see the aggregated score (3) and each category's score on the dashboard.
Click the Download icon to download the dashboard.
Click on any category to see the expanded page where you can see the detailed analysis for each repository.
Click the Create Issue button to create an issue for each repository.
The Active Days metric measures the number of days a contributor has made at least one contribution to a project. It counts the number of days on which a contributor has been actively engaged in the project's development.
The Active Days chart also displays two bars for the current data and the previous data, allowing you to compare them.
The Active Days dashboard provides you with the following insights:
You can monitor progress and identify trends. This information can be used to set goals and benchmarks for the project and measure success.
The active days metric provides a quick snapshot of the project's activity level. It helps determine whether the project is actively maintained or not.
The visualization can be used to quickly assess the activity level of a repository. A repository with a high number of active days is likely to be more active and healthy than one with a low number of active days.
By highlighting the importance of active days, project managers can encourage new contributors to become more involved in the project.
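At its core, the metric counts distinct calendar days with at least one contribution. A minimal sketch, using hypothetical activity timestamps:

```python
from datetime import datetime

# Hypothetical activity timestamps for one contributor.
timestamps = [
    "2024-03-01T09:15:00", "2024-03-01T17:40:00",  # two activities, one day
    "2024-03-03T11:05:00",
    "2024-03-07T08:30:00",
]

# A day counts once, no matter how many contributions happened on it.
active_days = {datetime.fromisoformat(t).date() for t in timestamps}
print(len(active_days))  # 3
```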
Organization Dependency Metric shows the analysis of how much a project's contributions depend on or are associated with different organizations.
With Organization Dependency Metrics, you can assess which organizations are significantly contributing to your project.
Each activity is considered a separate contribution. For example, opening a PR and closing a PR are counted as two distinct contributions.
Engagement Assessment: For organizations involved in the project, this metric helps assess their level of engagement and impact. It can encourage healthy competition among contributors, resulting in greater involvement.
Risk Management: Dependency on a single organization for contributions can be risky. If that organization reduces its involvement, the project might face challenges.
A code of conduct file is found in the , for example.
A contributing file is found in the .
At least one of the has been set in the project's extra section in the file.
A link pointing to the license scanning results is provided in the metadata file.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
The Organization Leaderboard ranks organizations based on their activity types on Confluence pages for the selected date range. These activity types include new pages, blog posts, attachments, total pages, and page comments.
This chart displays individual identities; it does not merge contributors as the Community Management (CM) tool does. Even if certain identities are combined into one contributor in CM, they will still appear as separate entities in the Insights V3 leaderboard charts. This distinction exists because merging contributors combines the contributions of a single individual working under different accounts or identities. The separation of identities in Insights is maintained for privacy reasons and GDPR compliance.
The leaderboard ranks organizations on the Confluence platform according to the following criteria:
New Pages: The number of new pages created by the organization.
Blog Posts: The frequency and quality of blog posts published.
Attachments: The number of files and documents attached to pages and posts.
Page Comments: The level of engagement demonstrated by comments on pages.
The Contributor Leaderboard on the Confluence Dashboard displays a ranking of users based on their contributions to Confluence activities within a specified date range.
Contributors are ranked based on metrics including new pages, comments, attachments uploaded, and blog posts on the platform.
The leaderboard provides valuable insights into user engagement and productivity within the Confluence environment.
To access the Contributor Leaderboard in Confluence, follow these steps:
On the Overview page, select the project for which you want to see the data.
On the left navigation pane, click Overview > Confluence.
Select the specific date range using the filter option (2).
Scroll down and you will see the Contributors Leaderboard widget.
Use the drop-down menu to filter the leaderboard based on specific Confluence activities such as page edits, comments, attachments, and blog posts.
The leaderboard will dynamically update to display rankings based on the selected date range and activity filter.
The leaderboard serves several purposes:
Enhancing Engagement: Motivates users to participate more actively.
Tracking Productivity: Offers insights into who the most active contributors are.
Identifying Knowledge Leaders: Helps in recognizing contributors who are pivotal in spreading knowledge and expertise across the organization.
By effectively utilizing the Confluence Contributor Leaderboard, organizations can foster a more engaged and productive community, driving the collective success of their projects.
This metric shows the distribution of activities on your Confluence pages throughout the week. It helps you identify trends in page creation, updates, and engagement over different days.
When you select a date range, this metric aggregates the data by day of the week across that period. For instance, if you select the last year as the date range, the metric shows the consolidated documentation activities for each day of the week.
Analyze the Key Metrics
Total Documentation Activities: This represents the total number of activities logged across community documents, aggregated by each day of the week.
Bar Chart: Each bar corresponds to a day of the week, with the length of the bar representing the total activities for that day.
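A minimal sketch of the underlying aggregation, using hypothetical activity dates:

```python
from collections import Counter
from datetime import date

# Hypothetical documentation activity dates within the selected range.
activity_dates = [date(2024, 5, 6), date(2024, 5, 7), date(2024, 5, 7),
                  date(2024, 5, 9), date(2024, 5, 14), date(2024, 5, 15)]

# Consolidate activities by day of the week across the whole period.
by_weekday = Counter(d.strftime("%A") for d in activity_dates)
print(by_weekday)  # Counter({'Tuesday': 3, 'Monday': 1, 'Thursday': 1, 'Wednesday': 1})
```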
Utilize Insights
Scheduling: Plan important meetings, updates, or content releases on high-activity days (Tuesday, Wednesday, and Thursday) to maximize engagement.
Resource Allocation: Allocate resources and support more effectively by focusing on the peak activity days.
Trend Analysis: Regularly monitor these trends to adapt strategies and improve engagement based on activity patterns.
This metric lists the organizations that were active in the past 6 months but have been inactive in the last 3 months.
This metric is not impacted by the date range selection.
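Conceptually, the metric is a set comparison between two activity windows. A minimal sketch, using hypothetical per-organization activity dates:

```python
from datetime import date, timedelta

today = date(2024, 6, 1)
three_months = today - timedelta(days=90)
six_months = today - timedelta(days=180)

# Hypothetical activity dates per organization.
org_activity = {
    "org-a": [date(2024, 1, 10), date(2024, 2, 2)],  # active 6 months ago, quiet since
    "org-b": [date(2024, 5, 20)],                    # still active
}

flagged = [
    org for org, dates in org_activity.items()
    # Active somewhere in the 6-month window, but nothing in the last 3 months.
    if any(six_months <= d < three_months for d in dates)
    and not any(d >= three_months for d in dates)
]
print(flagged)  # ['org-a']
```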
The Confluence Data Analytics Dashboard available in the open-source analytics application, Insights, provides a comprehensive suite of tools designed for deep analysis of user interactions, collaboration patterns, and content efficiency within the Confluence platform. Here is an overview of its features:
Track peak times for user engagement and contributions.
User Interaction Graphs: Visualize the network of collaborations among users.
Page Views and Edits Tracking: Monitor the popularity and evolution of content over time.
Most Engaged Content: Identify the content that receives the most views, comments, and shares.
Team Performance Metrics: Evaluate the productivity and collaboration levels of different teams.
Individual Contribution Insights: Assess the input of individual team members in the collaborative process.
Flexible Filtering: Create custom reports by applying filters based on users, time frames, and content types.
Export Features: Export reports in various formats for sharing or further analysis.
The dashboard integrates seamlessly with Confluence, leveraging its API to pull real-time data. This enables teams to make data-driven decisions, enhance collaboration, and improve content quality on the Confluence platform.
The Most Popular Pages metric shows the top-ranked pages based on their activities.
The ranking is calculated based on the number of interactions each page receives during the selected period.
The metric provides insights into which pages are most engaging and popular among team members, helping you identify areas of interest and focus on improving content quality.
The top of the leaderboard indicates the percentage change in activities compared to the previous period, giving a general idea of the engagement trend.
Breakdown of the Table
Rank: Indicates the popularity rank of each page based on activities.
Name: The title or name of the page or event.
Activities: The number of activities recorded on that page during the selected period.
Change: The change in the number of activities compared to the previous period (e.g., +19).
Total: The percentage of total activities that the page contributes.
Utilize Insights
Decision Making: Use these metrics to understand which pages or meetings are driving engagement.
Content Focus: Focus more on high-engagement topics for future content or meetings.
Trend Analysis: Regularly monitor these metrics to identify trends and adjust strategies accordingly.
A mailing list is a communication platform where subscribers exchange messages via email. It acts as a centralized hub for discussions, announcements, and collaborations among a specific group sharing common interests or working towards a common project.
Groups.io is an online platform that offers mailing list management and hosting facilities. It provides features for creating, managing, and archiving mailing lists, and for facilitating email communication. To learn more, see https://groups.io/.
The Mailing Lists Dashboard within Insights, integrated with Groups.io, provides data insights into project communications. Its primary objectives and goals include:
Objective: Centralizing and analyzing communication data from Groups.io mailing lists.
Goals:
Communication Analysis: Understand the frequency, nature, and trends of interactions within mailing lists.
Engagement Measurement: Measure user engagement levels, message frequency, and active contributors.
Community Insights: Identify contributors and organizations involved, fostering collaboration and understanding community dynamics.
The Geographical Distribution chart is a feature in the Mailing Lists dashboard of the open-source analytics application, Insights V3. This interactive chart provides a visual representation of the geographical locations of contributors to your mailing lists.
It allows you to see where your contributors are located and the extent of their contributions during a selected period.
With this chart, you can:
View the distribution of contributors globally.
Understand the extent of contributions from different regions.
Filter data by a specified period to analyze trends.
To access this feature:
Navigate to the Insights V3 dashboard.
Click on the Mailing Lists section.
Locate and select the Geographical Distribution chart.
This tool is invaluable for community managers looking to understand and grow their global contributor base.
Velocity Dashboard in Insights
Overview
The Velocity dashboard in Insights provides a visual representation of your development team's productivity and progress. It measures the team's velocity, which is the rate at which they complete work or deliver features over a specific period.
Key Metrics
The dashboard displays the following key metrics:
Performance Metrics
Lead Time
Average Review Time
Average Wait Time for First Review
Code Engagement
Benefits
A Velocity dashboard helps:
Project managers estimate the team's capacity, progress, and performance over time.
Development teams assess their performance and productivity, track progress, and identify areas for improvement.
Stakeholders gain visibility into the team's progress and productivity, making informed decisions based on real-time data.
Who Can Benefit
The Velocity dashboard is useful for:
Project managers
Leads
Development teams
Stakeholders involved in the open source software development process.
This chart ranks public mailing lists based on their overall activity, considering total messages, unique authors, and contributions from different organizations. It highlights the most active and engaged mailing lists within the project for the selected period.
The Leaderboard provides a snapshot of the most vibrant mailing lists within your project for a selected time period. Here's how to interpret the information:
Ranking: Indicates the position of each mailing list based on activity levels, with #1 being the most active.
Name: The name of the mailing list.
Threads: The count of discussion threads initiated in the mailing list.
Messages: Total number of messages posted in all threads.
Subscribers: The number of individuals subscribed to receive updates from the mailing list.
Contributors: Unique individuals who have posted at least one message to the mailing list.
Organization: The entities (like companies or institutions) that contributors are affiliated with.
Each category also displays a change (+/-) compared to the previous period, helping you see trends like growth or reduction in activity.
Gerrit data connector is a tool that allows you to connect Gerrit, a web-based code review system for Git repositories, with other data sources or systems. This connector enables you to extract, transform, and load data between Gerrit and Insights.
Following are the different Gerrit activity types:
The metric provides insights into the technical contribution breakdown across weekdays and weekends. It shows the time of day when most contributions happen, helping you schedule for maximum participation in the project. Each day is categorized by its level of activity, indicating low to high contribution levels.
Commits are recorded in the individual contributors' local time zone.
Only commit data is used for this dashboard. Each commit is counted only once in this metric.
Commits: authored commit
Activity Level Assessment: Work Time Distribution allows you to assess the technical activities across different days of the week. By analyzing the chart, project managers can identify contribution patterns and trends, such as peak activity days or days with lower participation.
Productivity Monitoring: Work Time Distribution helps you to monitor contributors' productivity and engagement. By analyzing the breakdown of contributions, you can identify periods of high productivity and low engagement.
Work Optimization: By understanding the distribution of contributions across weekdays and weekends, project managers can identify potential collaboration challenges due to varying availability.
Weekday vs. Weekend Contributions: Compare the contribution levels between weekdays and weekends. Assess significant differences in activity, and identify any patterns or preferences in contributor engagement during these periods.
Maximum Participation: As an Executive Director or Maintainer, when you want to set up a community call for your project, you can view when most contributions happen and schedule the call for maximum participation.
Insights often incorporates a Geographical Distribution metric to provide insights into the locations from which contributions originate.
Geographical Distribution analyzes and visualizes the contributions made by contributors across different regions around the world. It provides a breakdown of the top regions based on the total number of contributors, giving a clear picture of the project's global engagement and scope.
Hover over the chart to view the number of contributors for each region during the selected period. This information provides a more granular view of contributor activity within specific regions.
Global Impact: Geographical Distribution allows you to assess the global impact of the open source project by providing insights into the regions where contributions are coming from.
Regional Comparison: Compare the contribution numbers across different regions to identify any notable variations. Assess whether certain regions show consistent contribution levels or if there are fluctuations that require further investigation.
Top Contributing Regions: It helps to identify the top five regions based on the total number of contributions. These regions represent areas where the project has significant engagement and impact.
The Activities Breakdown chart provides a detailed breakdown of various activities on your Confluence pages, including new pages, blog posts, attachments, total pages, and page comments.
To access the Activities Breakdown, follow these steps:
On the Overview page, select the project for which you want to see the data.
On the left navigation pane, click Overview > Confluence.
Select the specific date range using the filter option (2).
Scroll down and you will see the Activities Breakdown widget.
Analyze the distribution of activities to understand user engagement patterns and trends over time.
Click the Download icon to download the chart in PNG or CSV format.
The ranking is based on the number of messages or contributions from newly onboarded contributors to the mailing lists during a specific period.
This dashboard provides insights into new contributors' activity for the selected projects. It integrates data from Groups.io to help you understand how new contributors are engaging with your project's mailing lists.
On the Overview page, select the project and repositories for which you want to see the new contributor leaderboard.
From the left navigation, click Mailing Lists.
Look for the date filter in the top-right corner of the dashboard.
Click on the calendar icon and select the timeframe you want to analyze. You can choose predefined options like "Last year" or "This year," or set a custom date range.
Scroll down the Mailing Lists dashboard to see the leaderboard.
The leaderboard displays the top contributors based on their message count during the chosen period.
Each entry shows:
Rank: Position on the leaderboard based on message count.
Contributor Name: Username of the contributor.
Messages: Total number of messages posted.
Last Message Date: Date of the most recent message posted.
Click Show More to expand the list.
Click the Download icon to download the leaderboard.
Identify Potential Contributors: Discover individuals who are actively engaging and might be interested in contributing to other project areas.
Measure Community Growth: Track the rate at which new people are joining the mailing list, which can indicate overall community health.
Regularly review the leaderboard to identify any trends or patterns in new contributor activity.
This leaderboard ranks contributors by their engagement level across all mailing lists within the selected timeframe, showcasing the most engaged and active contributors.
The leaderboard integrates data from Groups.io to help you understand how contributors are engaging with your project's mailing lists.
On the Overview page, select the project and repositories for which you want to see the most active contributors leaderboard.
From the left navigation, click Mailing Lists.
Look for the date filter in the top-right corner of the dashboard.
Click on the calendar icon and select the timeframe you want to analyze. You can choose predefined options like "Last year" or "This year," or set a custom date range.
Scroll down the Mailing Lists dashboard to see the most active contributors leaderboard.
The leaderboard displays the most active contributors based on their message count during the chosen period.
Each entry shows:
Rank: Position on the leaderboard based on message count.
Contributor Name: Username of the contributor.
Messages: Total number of messages posted.
Last Message Date: Date of the most recent message posted.
Click Show More to expand the list.
Click the Download icon to download the leaderboard.
The Key Metrics and Detailed Analysis section includes four high-level tiles with charts highlighting significant trends and patterns within your analytics data.
The cumulative charts help you compare different metrics on single charts. This comparative analysis helps identify relationships and draw meaningful conclusions.
Mailing Lists: Indicates the total number of mailing lists associated with the project for the selected period.
Messages: Displays the total count of messages exchanged within the selected mailing list(s) for the selected period.
Contributors: Shows the total number of contributors actively participating in discussions.
Organizations: Highlights the organizations or entities contributing to the discussions within the mailing lists.
On the Overview page, select the project and repositories (1) for which you want to see the data.
From the left navigation, click Mailing List.
Select the specific period using the filter option (2).
Click the high-level tile (3), which shows the total number of mailing lists, messages, contributors, or organizations for the selected time range.
The detailed analysis charts show the following details:
Mailing Lists: Shows active mailing lists vs. total mailing lists.
Messages: Shows new messages vs. total messages for the selected period.
Contributors: Shows new contributors vs. total contributors for the selected period.
Organizations: Shows new organizations vs. total organizations for the selected period.
The ranking is based on the number of messages or contributions from newly onboarded organizations to the mailing lists during a selected period.
It highlights the engagement level of these organizations by showcasing their rankings derived from the quantity of messages contributed across the entire spectrum of mailing lists.
On the Overview page, select the project and repositories for which you want to see the new organizations leaderboard.
From the left navigation, click Mailing Lists.
Look for the date filter in the top-right corner of the dashboard.
Click on the calendar icon and select the timeframe you want to analyze. You can choose predefined options like "Last year" or "This year," or set a custom date range.
Scroll down the Mailing Lists dashboard to see the leaderboard.
The leaderboard displays the new organizations based on their message count during the chosen period.
Each entry shows:
Rank: Position on the leaderboard based on message count.
Name: Name of the organization.
Messages: Total number of messages posted.
Last Message Date: Date of the most recent message posted.
Click the Download icon to download the leaderboard.
The Performance Metrics section provides key indicators for your project's pull requests (PRs). Understanding these metrics is crucial for managing and improving the efficiency of your development process. The primary focus here is the Average Time to Merge (TTM).
The dashboard presents this information using a bar chart, allowing you to visualize and analyze these performance metrics over a selected period.
The Average Time to Merge (TTM) is a crucial metric representing the average duration it takes for a pull request to be merged from the time it is opened. This metric helps you understand the efficiency of your review and merge processes.
The dashboard presents this information using a bar chart, allowing project managers to visualize and analyze these performance metrics over a selected period.
Key Elements:
Y-axis: The vertical axis represents the time in hours.
X-axis: The horizontal axis represents the time period, usually divided into months.
Bars: Each bar in the graph indicates the average TTM for that specific month.
Increase in TTM: An increase in TTM can signify delays in the review process, possibly due to a high volume of pull requests or insufficient reviewers.
Decrease in TTM: A decrease in TTM indicates an improvement in the efficiency of the review and merge process, suggesting that pull requests are being handled more promptly.
Identifying Bottlenecks:
High TTM: A consistently high TTM indicates bottlenecks in the review process. Identifying the root cause, such as limited reviewer availability or complex pull requests, is essential for addressing these delays.
Variability in TTM: Significant fluctuations in TTM from month to month may suggest inconsistent review processes or varying workload levels.
Automate Tests: Implement automated testing to catch issues early and reduce the manual effort required during reviews.
Timely Reviews: Encourage prompt and thorough code reviews to prevent backlogs and reduce TTM.
Manage PR Size: Promote smaller, more manageable pull requests to facilitate quicker reviews and merges.
This ranking displays the most recent discussions based on the timing of the last messages posted. It highlights the latest and most active discussions within the mailing lists for the selected period.
The table below provides a snapshot of the most active discussions within our mailing lists, ranked by the timing of the last message posted. This ensures you are always informed about the freshest and most relevant conversations.
The table lists each discussion with three columns: Rank, Discussion Topic, and Last Message Date.
Join these discussions to share your insights and contribute to our community's knowledge.
A check icon indicates which projects have Gerrit as a data source.
Get a comprehensive overview of the Productivity dashboard, a key feature in the open source project analytics tool, and learn how to use it to enhance efficiency and measure productivity.
The Productivity page in the open source project analytics tool provides a centralized hub for monitoring project progress, identifying bottlenecks, and optimizing workflows. This dashboard offers a visual representation of essential metrics, including:
Commits per active day: Track the number of commits made by contributors on a daily basis.
New contributors: Identify new contributors and their level of engagement.
Drifting away contributors: Monitor contributors who have become less active in the project.
Engagement gap: Analyze the gap between contributors' expected and actual engagement levels.
Work time distribution impact: Understand how contributors spend their time on the project.
Effort by pull request size: Visualize the effort required for each pull request based on its size.
Key Benefits:
Real-time information: Stay up-to-date with the latest project metrics and trends.
Actionable analytics: Make informed decisions to improve collaboration and deliver high-quality software.
Data-driven decisions: Empower your open source project with data-driven insights to drive success.
Next Steps:
Explore the Productivity dashboard to improve your project's performance.
Use the metrics and insights provided to identify areas for improvement and optimize your workflows.
Learn about the New Contributors Dashboard, a valuable feature in the open source project analytics tool, and discover why it's essential for the long-term sustainability and success of your project.
What is the New Contributors Dashboard?
The New Contributors Dashboard is a powerful tool that analyzes the participation of new contributors in your open source project. It provides a leaderboard that ranks new contributors based on their contributions over a selected period, giving you a clear picture of their involvement and impact.
Why is this metric important?
The New Contributors Dashboard is crucial for several reasons:
Sustainability and Succession Planning: As existing contributors move on or take on different responsibilities, new contributors help fill the gaps, ensuring the long-term sustainability of your project.
Fresh Perspectives and Ideas: New contributors bring diverse skill sets, innovative solutions, and fresh ideas to the project, contributing to its evolution and growth.
Key Benefits:
Identify and recognize new contributors who are making significant contributions to the project.
Understand the impact of new contributors on the project's growth and development.
Make informed decisions to encourage and retain new contributors, ensuring the project's long-term success.
Next Steps:
Explore the New Contributors Dashboard to analyze the participation of new contributors in your project.
Use the insights gained to develop strategies for retaining and engaging new contributors, ensuring the project's continued success.
When the project health score is low for an open-source project like CNCF or Kubernetes, here are some steps a user can take:
Review the project's health metrics: Understand what factors contribute to the low health score. Is it due to a lack of contributors, outdated documentation, or poor code quality? Knowing the root cause will help you focus on the right areas.
Join the community: Engage with the project's community by participating in discussions on forums, GitHub issues, or mailing lists. This will help you understand the project's dynamics, identify potential issues, and potentially find opportunities to contribute.
Contribute to the project: With a low health score, there may be opportunities to contribute to the project. This could be in the form of:
Code contributions: Fixing bugs, improving code quality, or implementing new features.
Documentation updates: Ensuring that the documentation is up-to-date, accurate, and easy to understand.
Testing and validation: Helping to identify and fix issues, ensuring that the project is stable and reliable.
Community outreach: Helping to attract new contributors, promoting the project, and building a stronger community.
Identify potential roadblocks: Be aware of potential roadblocks that might hinder your contributions, such as:
Complexity: The project may be too complex for you to contribute to, especially if you're new to the area.
Time commitment: Contributing to an open-source project requires a significant time commitment, which may not be feasible for everyone.
Consider forking the project: If the project is not actively maintained or has a low health score, you can consider forking the project to create a new, improved version. This can be a significant undertaking, but it can also provide an opportunity to create a more sustainable and maintainable project.
Raise awareness: If you're unable to contribute directly, you can still raise awareness about the project's low health score. Share your concerns with the community, and encourage others to contribute or help address the issues.
Explore alternative projects: If the project is too far gone, it might be time to explore alternative projects with a higher health score. This will ensure that you are contributing to a project that is well-maintained, stable, and has a strong community.
Onboarding a project in Insights starts with integrating a data source in Community Management.
To onboard a data source into CM, you need CM Manager Access within CM for managing the onboarding process, along with admin-level permissions for the data source you're integrating.
For example, GitHub Onboarding requires admin access to your GitHub organization because the onboarding process involves installing the CM GitHub application, which is essential for CM to collect data from GitHub. To learn more, see GitHub Integration.
Once you successfully connect the data source, it will take 24-48 hours to fully onboard a project on Insights depending on the volume of data.
We have separate identities for GitHub and Git to distinguish between the two platforms.
GitHub Identity: GitHub identity encompasses a user's GitHub profile, including their name, GitHub ID, logo, and public details like company information. It is used for authentication and accessing GitHub-specific data like repositories, issues, and pull requests.
Git Identity: Git identity is more basic, typically consisting of just the name and email address as they appear in the Git log. It is used for general Git operations such as cloning repositories, pushing changes, and managing branches.
In Insights, we opted to display usernames instead of real names due to:
Privacy Protection: Safeguarding user identity by allowing anonymity.
Enhanced Security: Minimizing personal information exposure to reduce security risks.
Platform Consistency: Aligning with common digital norms for user identification.
This approach reflects our dedication to ensuring a secure, private, and user-centric experience.
It takes up to 48 hours.
Follow these steps to troubleshoot the issue:
Check Data Source:
Verify that the data source is correctly configured and is actively sending data to Insights V3.
Refresh Data:
Try manually refreshing the data within the application to ensure you have the most up-to-date information.
Check Connectivity:
Ensure that there are no connectivity issues preventing data from being retrieved. Check network connections and any relevant settings.
Contact Support:
If the issue persists, contact our support team for further assistance.
Our testing involves both automated and manual strategies. We use a QA automation system to automatically check the data from most sources, such as GitHub. When we need to examine data closely, particularly from Groups.io, we test manually to ensure the data is accurate and complete.
This data is not shown in order to maintain privacy and security; however, it is still known which organizations are contributing.
A: The Contributors Diversification metric shows the distribution of contributions from individuals and organizations over time, helping you understand how your project's contributor base is evolving and identify trends in contribution patterns.
A: You can use the metric to track changes in contributor activity over time, identify areas where the project may need more contributors or support, and inform decisions about project governance and contributor management.
The Reports Dashboard gives you a comprehensive view of the project's performance through four primary metrics. The dashboard utilizes intuitive data visualizations such as charts, graphs, and tables. These visual representations make it easier to interpret complex data and identify patterns.
The Reports Dashboard enables you to generate comprehensive reports based on the selected metrics and filters. These reports can be exported in various formats, such as PDF or CSV, making sharing the insights with team members or external stakeholders convenient.
At the core of the Reports Dashboard are the following four major metrics:
The Project Health dashboard provides a comprehensive overview of your open source project's health and activity. It offers insights into key metrics such as contributor engagement, code quality, and issue tracking, allowing you to make data-driven decisions to improve your project's success. With the Project Health dashboard, you can:
Track contributor activity and identify trends in engagement
Monitor code quality and identify areas for improvement
Analyze issue tracking and resolve bottlenecks
Make informed decisions to drive project growth and success
By providing a clear and actionable view of your project's health, the Project Health dashboard empowers you to take control of your project's success and make a positive impact on the open source community.
The Contributors Diversification metric in the Project Health dashboard provides insights into the distribution of contributions from individuals and organizations over time. This metric helps you understand how your project's contributor base is evolving and identify trends in contribution patterns.
The Individual Contributors chart displays the distribution of commits over time, broken down into three categories:
Total Commits: The total number of commits made by all contributors.
Top 21 Contributors: The top 21 contributors who have made the most commits.
All Others: The remaining contributors who have made commits, but are not in the top 21.
The chart shows the total commits percentage on the right Y-axis and the total number of commits on the left Y-axis, with time on the X-axis.
The Organization chart displays the distribution of commits over time, broken down into two categories:
Top Organizations: The top organizations that have made the most commits.
All Others: The remaining organizations that have made commits, but are not in the top organizations.
The chart shows the total commits percentage on the Y-axis and time on the X-axis.
By analyzing the charts, you can:
Identify trends in contributor activity over time.
Determine if the project's contributor base is becoming more or less diverse.
Identify top contributors and organizations, and understand their contribution patterns.
Use the metric to track changes in contributor activity over time.
Identify areas where the project may need more contributors or support.
Use the metric to inform decisions about project governance and contributor management.
Code Review Engagement
Understanding Code Review Engagement
Code Review Engagement is a key metric in Insights that measures the level of involvement and participation in code review activities. This metric helps you assess the effectiveness of your code review process and identify areas for improvement.
Factors Considered in Code Review Engagement
The following factors are considered in the Pull Request review process to calculate Code Review Engagement:
Number of Pull Request Participants: The number of developers, reviewers, and other stakeholders involved in the review process.
Pull Requests Reviewed: The total number of Pull Requests reviewed, including those that were approved, rejected, or pending.
Review Comments for Pull Request: The number of comments and feedback provided on each Pull Request, indicating the level of engagement and participation.
Code Reviews: The number of code reviews conducted, including the types of reviews (e.g., manual, automated) and the frequency of reviews.
Code Review Engagement is crucial for ensuring the quality and reliability of your codebase. By analyzing this metric, you can:
Improve Code Quality: Identify areas where code quality may be compromised and take corrective action.
Enhance Collaboration: Foster a culture of collaboration and open communication among developers and reviewers.
Reduce Defects: Catch defects and issues early in the development process, reducing the risk of downstream problems.
Best Practices for Improving Code Review Engagement
To maximize the benefits of Code Review Engagement, follow these best practices:
Establish Clear Review Guidelines: Define clear expectations and guidelines for code reviews, including the types and the frequency of reviews.
Encourage Feedback: Foster a culture of open feedback and communication among developers and reviewers.
Monitor and Analyze Code Review Data: Regularly review and analyze code review data to identify trends and areas for improvement.
By understanding and improving Code Review Engagement, you can ensure your codebase quality, reliability, and maintainability.
This ranking showcases the top messages based on the number of responses generated by these messages. It highlights the most engaging and widely discussed topics within the mailing lists for the selected period.
On the Overview page, select the project and repositories (1) for which you want to see the data.
From the left navigation, click Mailing List.
Select the specific period using the filter option (2).
Scroll down the page to the leaderboard.
The leaderboard shows the rankings based on the number of messages. Look through the list to see which messages or topics are most engaging.
The leaderboard includes a feature to compare the increase or decrease of messages over a selected period.
Click Show More to expand the list.
If you want to download the list, click the Download icon.
The Commits per Active Day Dashboard provides insights into code commit frequency on active development days. It measures the average number of code commits contributors make on active development days.
To calculate the Commits Per Active Day metric:
Identify Active Days: Count the number of days within a specified period where at least one commit was made.
Count Total Commits: Sum the total number of commits made within that period.
Calculate Average: Divide the total number of commits by the number of active days.
For example, if in a month there are 300 commits made over 20 active days, the Commits Per Active Day would be:
Commits Per Active Day = 300/20 = 15
This means, on average, there were 15 commits made on each day that had at least one commit.
Early Issue Detection: A higher number of commits per active day increases the likelihood of early issue detection. Regular code commits provide more opportunities for contributors to identify potential issues or bugs during the development process.
Code Quality and Stability: A consistent number of commits indicates ongoing code enhancements and maintenance, leading to improved code quality over time.
Productivity Assessment: A higher number of commits per active day suggests that contributors are actively working on code changes, implementing new features, fixing bugs, and making improvements.
Insights V3 uses the Average Review Time by Pull Request Time metric to provide insights into the duration it takes for pull requests to be reviewed.
Average Review Time by Pull Request Time refers to the average duration it takes for pull requests to be reviewed by peers or project maintainers. It measures the time span between the creation of a pull request and when it receives thorough review feedback.
The chart consists of 5 bars, each of a different color. Each bar displays the average review time in hours or days for pull requests based on the size of the request.
We have 5 buckets of Pull Request Sizes. They are:
1-9 lines
10-49 Lines
50-99 Lines
100-499 Lines
500+ Lines
Pull Request Size is computed by lines changed. Lines changed could be lines of code added, deleted, or updated.
The length of the color inside the bar is determined by the average review time: the longer the review takes, the longer the bar.
Code Quality Assurance: The metric helps you monitor the speed at which pull requests are reviewed. By minimizing the average review time, you can enhance the chances of identifying and resolving code issues promptly, resulting in higher code quality and overall project success.
Collaboration and Engagement: Prompt review feedback encourages active collaboration among contributors. It helps to maintain a responsive and interactive process. When pull requests receive timely reviews, contributors can address feedback and iterate on their code changes faster.
Project Velocity: Timely code reviews contribute to higher project velocity. The Average Review Time metric provides insights into the responsiveness of the review process, identifying areas for improvement. Minimizing review times helps ensure that code changes are integrated swiftly, allowing projects to deliver new features or updates faster.
The Engagement Gap metric measures the difference between expected and actual levels of contributor engagement. The dashboard shows the gap between the contributor who comments the most on PRs and the contributor who comments the least.
To illustrate the importance of the Engagement Gap Metric, consider the following example:
For an open-source project with 100 contributors but only 20 actively engaged users (e.g., responding to issues, contributing code, or participating in discussions), there is an engagement gap of 80%. This may indicate a lack of community involvement.
Performance Assessment: The Engagement Gap metric enables you to assess the project's overall engagement level by comparing it to the expected or desired level. It provides a quantitative measure of how actively contributors are participating and helps identify any gaps between the expected and actual engagement levels.
Community Health: The Engagement Gap metric provides valuable insights into the health and dynamics of the project community. Large engagement gaps may indicate potential challenges, such as communication issues, a lack of mentorship, or unclear contribution guidelines.
Targeted outreach: By analyzing the engagement gap, project maintainers can focus their outreach efforts on users who are already engaged with the project, increasing the likelihood of retaining their interest and encouraging continued participation.
How to Improve Engagement Gap?
Set Clear Expectations: Clearly define the expected engagement levels for each project to provide a benchmark for comparison.
Encourage Collaboration: Foster a culture of collaboration within your team by encouraging open communication and sharing of ideas.
Provide Feedback: Regularly review the Engagement Gap metrics with your team and provide feedback on areas that need improvement.
Recognize Achievements: Acknowledge and reward team members who actively contribute to reducing the Engagement Gap, motivating others to follow suit.
Insights uses the Average Lead Time by Pull Request Time metric to provide insights into the time it takes for pull requests to be completed.
Average Lead Time by Pull Request Time refers to the average duration it takes for pull requests to progress from opening to merging. It measures the period between the creation of a pull request and its successful inclusion into the project's codebase.
The chart consists of five bars, each of a different color. Each bar displays the average lead time in hours/days for pull requests based on the pull request size.
We have five buckets of Pull Request Sizes. They are:
1-9 lines
10-49 Lines
50-99 Lines
100-499 Lines
500+ Lines
Pull Request Size is computed by Lines Changed. Lines changed could be lines of code added, deleted, or updated.
The length of the color inside the bar is determined by the average lead time: the longer it takes, the longer the bar.
Because this is the "Average" Lead Time, the lead times of all PRs in a given size bucket are averaged and displayed in minutes, hours, or days.
Workflow Efficiency: This metric provides valuable insights into the efficiency of the pull request workflow. Optimizing the lead time results in faster integration of code changes and promotes collaboration among contributors.
Collaboration and Feedback: It reflects the speed at which contributors receive feedback on their code changes. A shorter lead time indicates a more responsive review process, encouraging contributors to engage actively.
Project Velocity: Monitoring the average lead time enables project managers to assess the overall project velocity. A shorter lead time helps maintain a high project velocity, ensuring rapid innovation and faster delivery of software features.
Select the project from the landing page or from the foundation page.
From the main navigation, select Reports and then click Activities Dashboard.
In the top-right corner, you will find the date filter option.
Select the desired start and end dates for the data you want to analyze.
Click Apply to update the dashboard with the selected date range.
Find the Platforms drop-down menu.
Choose a data source from the available options (e.g., GitHub or Git).
It shows the total number of activities performed by contributors for the project. Hover over the data points to view specific activity counts for each date.
Explore the Activities Today, Activities This Week, and Activities This Month charts. These charts are independent of the date filter and show real-time data.
Click View to expand the list on the right side and see the list of activities.
Download the list in CSV format for analysis.
Move to the Activities chart section.
Use the drop-down to select Daily, Weekly, or Monthly data.
Gain insights into the growth of activities over time.
Navigate to the Activities by Platform chart.
You will see the overall percentage distribution across the different platforms.
Click the > icon to see the detailed distribution of activities across the different platforms.
Explore the Leaderboard section for activities by type.
Review the types of activities (e.g., code commits, issues) and their corresponding counts.
The Effort By Pull Request Batch Size metric analyzes the relationship between the size of pull requests (measured by lines of code changed) and the time contributors spend reviewing and merging them.
Here are ways you can interact with the chart to gain deeper insights:
Filter by Date Range: This allows users to analyze the metrics across different periods to observe how the trends have evolved.
Compare Trends: The chart compares the current period with previous periods, enabling you to spot differences or improvements.
The metric shows the distribution of pull requests across different size categories, from "Very Small" to "Gigantic."
This information can help identify potential bottlenecks or areas where the team may need additional support or process improvements.
The metric tracks the average time required for reviewing and merging pull requests in each size category.
Longer review and merge times for larger pull requests may indicate a need for better code organization, more thorough review processes, or additional resources.
The metric includes information on the number of participants and comments associated with each pull request size category.
Higher numbers of participants and comments for larger pull requests suggest increased collaboration and coordination efforts, which can be both positive (better code quality) and negative (potential delays or inefficiencies).
Development Cycle Time: The Effort By Pull Request Batch Size metric provides insights into the overall development cycle time. By analyzing the relationship between batch size and effort, you can identify trends that affect the time taken to review and merge pull requests.
Review Efficiency: The Effort By Pull Request Batch Size metric helps project managers evaluate the efficiency of the pull request review process. By analyzing the effort required for different batch sizes, you can identify patterns and trends that impact the speed and quality of reviews.
Q: What does the Effort By Pull Request Batch Size metric measure?
A: It measures the relationship between the size of pull requests, in terms of lines of code changed, and the amount of time contributors spend reviewing and merging them.
Q: Why is it important to analyze the Effort By Pull Request Batch Size metric?
A: Understanding this metric helps optimize the review process by identifying the most efficient batch sizes for pull requests, thus reducing review time and improving workflow efficiency.
Q: How can I interact with the chart to get more insights?
A: You can filter the analysis by date range to observe trends over time and compare these trends with previous periods to identify improvements or regressions in efficiency.
Q: Can analyzing this metric reveal trends over specific periods?
A: Yes, by filtering the data by specific date ranges, it's possible to observe how the trends in pull request batch sizes and review times have evolved, helping teams adapt their strategies accordingly.
Select the project from the landing page or the foundation page.
From the main navigation, select Reports, and click Retention Dashboard.
On the top-right corner, locate the date filter option.
Click on it to open.
Choose your desired duration to focus on specific data.
Click Apply to update the dashboard with your selected date range.
Find the Platforms drop-down menu.
Choose a data source from the available options (e.g., GitHub or Git).
The dashboard updates to display insights relevant to your selected platform.
Locate the Cohort Size drop-down menu.
Choose either Weekly or Monthly to define the cohort size for analysis.
The dashboard adjusts to display data based on your selected cohort size.
The retention rate in open-source projects measures the percentage of contributors who continue to be actively engaged in the project over a specified period.
For a specific time period, the retention rate for contributors is calculated by dividing the number of contributors who remain active during the current and previous time period by the total number of contributors who were active in the previous time frame.
Explore the Retention Rate Chart section.
Observe the chart that visualizes the retention rate of contributors over time.
This chart shows the percentage of contributors who continue to engage with your project over the defined cohort period.
This metric provides insights into how consistently contributors are involved in your open source project over a specific period.
Review the Average Number of Contributor Activities metric.
Gain insights into the average level of contributor activity within the defined cohort.
This chart helps you understand the retention of contributors over time. It provides insights into how many new contributors joined in a specific month and how many of them remained actively engaged in subsequent months.
The Work Time Distribution Impact Dashboard analyzes how contributors allocate their work time and the impact of different activities on project progress.
The chart shows commit trends and looks for patterns in which long hours, non-business hours, or weekends have contributed to burnout. Burnout can appear as a lower number of commits over a long period that follows a stretch of heightened activity (commits).
The purpose of the chart is to find out if there is a risk of burnout among contributors due to long hours.
The chart displays the distribution of work time among contributors, highlighting the impact of different activities on project progress. The chart shows the following:
Commits: The number of commits made by contributors over a specified period.
Time of Day: The time of day when commits were made, categorized into business hours, non-business hours, and weekends.
Vertical Bar Chart: The chart displays a vertical bar chart showing the number of commits made during different time periods.
The Work Time Distribution metric is calculated by analyzing the time of day when commits were made. The metric is calculated as follows:
Time categorization: Commits are categorized into business hours, non-business hours, and weekends.
Counting commits: The number of commits made during each time period is counted.
Calculating percentage: The percentage of commits made during each time period is calculated based on the total number of commits.
Workload Distribution: Monitoring the Work Time Distribution Impact helps identify potential workload imbalances among contributors. If one or a few contributors are consistently spending a disproportionate amount of time on specific activities, it can lead to burnout or reduced productivity.
Performance Evaluation: The Work Time Distribution Impact metric can contribute to performance evaluation and feedback processes. By analyzing how contributors allocate their work time, project managers can identify patterns of efficiency or areas that require improvement.
Lead Time is a key performance indicator (KPI) that measures the average time for a Pull Request (PR) to move from creation to merge. It provides a comprehensive view of the entire lifecycle of a PR, covering the following stages:
PR Raised: The moment when a developer submits a PR to the repository. This is the starting point of the Lead Time calculation.
Review Started: The time when the review process begins, indicating that the PR is being evaluated by team members or automated tools. This stage typically starts after the PR is raised and can take anywhere from a few minutes to several hours or even days, depending on the complexity of the code changes and the number of reviewers.
PR Accepted: The stage when the PR is approved and ready for merge. This typically happens after the review process is complete, and the PR has been deemed acceptable by the reviewers. The duration of this stage can vary depending on the complexity of the changes and the availability of the reviewers.
PR Merged: The final stage is when the PR is successfully merged into the main branch. This is the endpoint of the Lead Time calculation.
The Lead Time metric can be effectively visualized using box plots. Box plots can provide a visual representation of the distribution of lead times.
Project Efficiency: Analyzing the complete PR review cycle through Lead Time gives you a measure of efficiency in the software development process. By analyzing the time it takes for code changes to move through the development pipeline, project managers can identify delays or inefficiencies.
Quality Assurance: Lead time can provide insights into the quality assurance process. Longer lead times may indicate delays in testing or quality assurance activities, potentially leading to issues and bugs reaching production.
Best Practices for Optimizing Lead Time
Streamline Your Review Process: Implement automated testing and review tools to reduce the time spent on manual reviews.
Set Realistic Expectations: Establish clear communication channels and deadlines for PR reviews to avoid delays.
Continuously Monitor and Improve: Regularly track Lead Time metrics and identify areas for improvement to optimize your development workflow.
Insights V3 incorporates the Average Wait Time for First Review metrics to provide insights into the duration it takes for pull requests to receive their first review.
The Average Wait Time for First Review refers to the average time it takes for pull requests to receive their first review after being opened. It measures the time span between the creation of a pull request and when it receives its initial feedback or review.
In the vertical bar chart, each bar displays the Average Wait Time for First Review, with the x-axis showing the date and the y-axis showing time in hours.
The Average Wait Time for a selected time period is computed by summing the time it took for the first review across all PRs, dividing by the number of pull requests, and displaying the result in minutes, hours, or days.
Each data point on the chart represents the average wait time for the first review during that specific time period.
Code Quality and Bug Resolution: Longer wait times may delay the identification and resolution of code issues or bugs, potentially affecting the overall quality of the software.
Faster Development Cycle: Reducing the wait time for the first review contributes to a faster development cycle. This allows projects to deliver new features, bug fixes, or improvements promptly, increasing the project's overall efficiency.
Collaboration and Iteration: The Average Wait Time for First Review metric directly impacts collaboration and iteration among contributors. Timely feedback on pull requests allows contributors to address issues or make improvements promptly.
The Organizations' reports provide insights into the organizations that have contributed to the project. The report gives you key metrics that you can use to assess healthy contribution from multiple organizations.
Select the project from the landing page or from the foundation page.
On the left navigation pane, click Reports > Organizations.
Select the repositories from the drop-down menu to analyze the data for the particular repository.
In the top-right corner, you will find the date filter option.
Click on the date filter to open a calendar.
Select the desired start and end dates for the data you want to analyze.
Click Apply to update the dashboard with the selected date range.
Locate the drop-down menu for data source selection. For more information, see Data source platforms.
Locate the Display Only New Organizations toggle button.
Toggle it on to display data only for new organizations within the selected period.
Toggle it off to view data for all organizations.
Explore the Total Organizations chart section. When you toggle the New Organizations button, it displays the total number of new organizations.
Hover over the data points to view specific counts for that time.
Explore the Active Organizations Today, Active Organizations This Week, or Active Organizations This Month charts. These charts are independent of the date filter and show real-time data.
Click View to expand the list on the right side and see the list of organizations.
The chart shows new organizations when you toggle on the Display Only New Organizations Button.
Navigate to the Leaderboard section.
You will find a list of the most active organizations.
Review their names and corresponding activity levels.
The Drifting Away Contributors metric focuses on identifying contributors who were once active in an open-source project but have gradually become less engaged over time.
This chart is not impacted by time filter changes; the data is always shown with respect to "today".
Drifting Away Contributors are:
Users who made at least 5 code contributions to the project overall.
At least one of those contributions must have been made in the last 6 months.
The contributor has made no contributions over the last 3 months.
The Drifting Away Contributors metric is essential for maintaining a healthy and active contributor community. By identifying contributors who are gradually becoming less engaged, project managers can take proactive measures to understand their reasons for disengagement and find ways to re-engage them.
Project popularity can be measured using various metrics, including project forks and GitHub stars. This guide provides an overview of how to analyze project popularity using these metrics.
A project fork occurs when someone creates a copy of a repository (project) on GitHub.
Forks can be used to create a modified version of the original project or to contribute to the original project by sending pull requests.
A GitHub Star is a way for users to show their appreciation for a project by clicking the star button on the project's repository page.
Stars indicate that a user wants to keep track of a project's updates.
To analyze project popularity, follow these steps:
Fork and Star Count: Compare the number of forks and stars a project has over time.
Fork and Star Growth Rate: Calculate the rate at which the number of forks and stars is increasing or decreasing (see the sketch after this list).
Fork and Star Distribution: Analyze the distribution of forks and stars among different contributors or organizations.
Metrics for Project Popularity
The following metrics can be used to show project popularity within a selected period:
The metrics available for GitHub projects versus general Git projects differ mainly because GitHub provides a web-based platform with additional features and integrations that are not inherently part of the Git version control system. Here are the key differences:
GitHub offers a range of metrics that leverage its platform's capabilities, including social features, automation, and integration with other tools.
Git, as a version control system, focuses more on the raw data related to code changes and development history. The metrics available from Git are typically obtained using Git commands and require manual analysis or additional tooling.
Here is a comparison of metrics available for GitHub versus general Git projects:
Platform-Specific Metrics: GitHub provides additional metrics related to community engagement (stars, forks, watchers) and integrated project management tools (issues, pull requests, milestones, project boards).
Ease of Access: GitHub metrics are easily accessible via the web interface, while Git metrics typically require manual extraction and analysis.
Community and Engagement Data: Available in GitHub but not in standard Git.
Integrated Tools for Additional Insights: GitHub offers insights into code quality and dependency management, which are not inherently part of Git.
The Contributors' reports provide insights into the individuals who have contributed to the project. This dashboard allows for the selection of data sources and provides specialized insights.
Select the project from the landing page or from the foundation page.
On the left navigation pane, click Reports > Contributors.
In the top-right corner, you will find the date filter option.
Click on the date filter to open a calendar.
Select the desired start and end dates for the data you want to analyze.
Click Apply to update the dashboard with the selected date range.
For more information, see Filter Date Range.
Locate the drop-down menu for data source selection.
Choose the desired data source from the available options.
The dashboard will update to display data specific to the selected source.
Currently, both GitHub and Git are selected as the data sources by default.
Locate the Display Only New Contributors toggle button.
Toggle it on to display data only for new contributors within the selected period.
Toggle it off to view data for all contributors.
Explore the Total Contributors chart section. When you toggle the New Contributors button, it displays the total new contributors.
Hover over the data points to view specific counts for that time.
Explore the Active Contributors Today, Active Contributors This Week, or Active Contributors This Month charts. These charts are independent of the date filter and show real-time data.
Click View to expand the list on the right side and see the list of contributors.
The chart shows new contributors when you toggle on the Display Only New Contributors Button.
Move to the "Active Contributors vs. Returning Contributors" chart.
Click the drop-down to filter the data.
Understand the comparison between contributors who are active for the first time and those who have returned.
Navigate to the "Leaderboard" section.
You will find a list of the most active contributors.
Review their names and corresponding activity levels.
| Metric | Description |
|---|---|
| Fork Count | Total number of forks a project has received over a specific period. |
| Star Count | Total number of stars a project has received over a specific period. |
| Fork Growth Rate | Rate at which the number of forks is increasing or decreasing over a specific period. |
| Star Growth Rate | Rate at which the number of stars is increasing or decreasing over a specific period. |
| Fork/Star Ratio | Ratio of forks to stars, indicating the project's popularity and engagement. |
| Category | Metric | GitHub | Git |
|---|---|---|---|
| Repository Statistics | Commits | Yes | Yes |
| | Contributors | Yes | Yes |
| | Forks | Yes | No |
| | Stars | Yes | No |
| | Issues | Yes (open and closed) | No |
| | Pull Requests | Yes (open, closed, and merged) | No |
| | Releases | Yes | No |
| Traffic Metrics | Page Views | Yes | No |
| | Clones | Yes | No |
| Community Metrics | Watchers | Yes | No |
| | Discussions | Yes | No |
| | Contributor Activity | Yes | No |
| Code Quality and Analysis | Code Review Statistics | Yes | No |
| | Dependency Insights | Yes | No |
| Project Management | Milestones | Yes | No |
| | Project Boards | Yes | No |
| Commit History | Commit Count | Yes | Yes |
| | Commit Messages | Yes | Yes |
| | Author Statistics | Yes | Yes |
| | Branch Activity | Yes | Yes |
| Code Changes | Lines of Code (LoC) | Yes | Yes |
| | File Changes | Yes | Yes |
| Merge and Conflict Metrics | Merge Frequency | Yes | Yes |
| | Conflict Resolution | Yes | Yes |
| Temporal Metrics | Commit Frequency | Yes | Yes |
| | Active Development Periods | Yes | Yes |