Insights gives you complete visibility into project performance and ecosystem trends. Understand your contributor community and make informed decisions with our analytics and reporting tools.
The new release of Insights is now live with a refreshed user interface and new dashboards that make it easier to navigate and find the information that matters most.
Date: February 22, 2024
Insights V3 is an open-source analytical tool that provides insights from analyzing open-source software (OSS) projects.
Insights V3 helps project leads and technical managers understand their team members' engagement and participation in open-source projects, and identify the most active and productive contributors.
No new features have been added in this release.
Optimized Individual Project Card Data Retrieval: The data fetching for individual project cards has been significantly optimized, resulting in quicker load times and an enhanced user experience.
Enhanced Loading Performance for Foundation -> Projects Page: Loading times for the Foundation > Projects page have been drastically reduced through the implementation of advanced loading techniques, ensuring smoother user navigation.
Integration of dbt Models: All data within Insights V3 now utilizes dbt (Data Build Tool) models, offering more scalable and robust data handling capabilities.
Model rendering and performance optimization with Cube Cloud: All dbt models are mapped to cubes and views with defined pre-aggregations in Cube Cloud’s semantic layer, optimizing load times.
Overall System Performance Improvements: A series of system optimizations have been carried out, leading to noticeable improvements in the tool's performance and reliability.
UI Enhancements
Enhanced Responsiveness: We have improved the alignment of cards across various screen sizes for a seamless viewing experience.
Report Filters Redesign: Filters in the Reports section that were previously considered "extra" have now been integrated into the top filter box for easier access.
New Default Period: The default period for viewing data has been updated to the Last 12 months, allowing for more relevant insights.
Color Palette Refresh: The site now follows a new color palette to maintain consistency and enhance visual appeal.
UI Consistency Improvements: Adjustments in alignment, padding, and margins across the platform ensure a unified and more polished look for all cards.
Project Status Indicator: Projects still in the onboarding phase will now be highlighted with a subtle red color for better visibility.
Updated Navigation: Clicking the LFX Logo will now redirect users to the Insights landing page, streamlining navigation.
Clarified Chart Descriptions: The description for the "Contribution outside working hours" chart has been refined for clarity.
Rectified the Best Practices category mismatch issue.
Updated the tooltip data for the Reports -> Active Contributors chart to accurately reflect the Last 10 Years time range.
Ensured the Hyperledger Foundation logo appears on the Hyperledger card.
Fixed the rounding error in the Software Value calculation.
Fixed the pagination bug on the Foundation -> Projects page.
Removed the 100-row loading limit for sub-projects.
Some of the projects are still onboarding, so the data may not be correct.
The date range filter in the dashboard module may not always display the correct data when selecting custom date ranges.
You may encounter occasional inconsistencies in data synchronization between Insights V3 and external data sources, leading to discrepancies in reporting.
Date: February 29, 2024
Insights V3 is an open-source analytical tool that provides insights from analyzing open source software (OSS) projects.
Insights V3 helps project leads and technical managers understand their team members' engagement and participation in open source projects, and identify the most active and productive contributors.
No new features have been added in this release.
UI Enhancements
Text changes:
Renamed column headers on organization-related tables from 'Name' to 'Organization'.
Renamed the 'Authors' column in the Foundation overview -> Project Velocity table to 'Contributors'.
Updated the card description for the Contributions Outside Work Hours component.
Fixed column alignment issues on leaderboard tables.
Enabled filtering of the table in the Project Velocity section on the Foundation overview page.
Fixed issues with downloading PNGs from the Reports -> Activities page components.
Fixed the issue with French Polynesia not showing up in the Geographical Distribution activity metrics.
Some of the projects are still onboarding, so the data may not be correct.
The date range filter in the dashboard module may not always display the correct data when selecting custom date ranges.
You may encounter occasional inconsistencies in data synchronization between Insights V3 and external data sources, leading to discrepancies in reporting.
Beta Version
Welcome to Insights V3, an open-source project analytics tool that empowers you with valuable data-driven insights.
Important: Insights V3 is currently in the beta phase, which means it is actively being developed and refined to provide you with the best experience possible.
Insights V3 is a pre-release software version made available to users for testing and feedback. This means you may encounter occasional changes to the user interface, features, and functionality.
We encourage you to actively participate in the Insights V3 community by providing feedback, reporting bugs, and suggesting improvements.
Your feedback plays a crucial role in shaping the direction of feature development and prioritization.
The tool fetches data from various sources, including your open-source project repositories, and updates this information regularly.
You may see some delays in real-time data because the tool fetches data from various sources.
Note: The Community Management tool focuses exclusively on publicly available GitHub repositories. Forks and certain repositories are purposely excluded from monitoring to streamline the data integration process.
Insights V3 is an open source analytical tool that provides insights from analyzing open source software (OSS) projects.
Insights V3 helps project leads and technical managers understand their team members' engagement and participation in open source projects, and identify the most active and productive contributors.
The tool has many features that can be very useful in a number of ways, such as:
Data Visualization and Reporting: Insights uses data visualization techniques to make it easier to understand the metrics and reports generated by the software.
Code Analysis: Insights V3 analyzes the project's codebase for metrics such as code complexity, and code quality that help project leads and managers identify potential areas for improvement.
Development metrics: Metrics such as commit frequency, pull request acceptance rates, and time-to-resolution for issues can provide insights into the project's development process and help project leads and managers identify areas for improvement.
Integration with other tools: Integration with other tools commonly used in software development, such as Git, GitHub, or Jira, can provide a more comprehensive view of the project's development and make it easier for contributors and managers to track progress.
Customization and flexibility: Providing users with the ability to customize the analytics tool to fit their specific needs and workflows can increase its usefulness and adoption.
Insights V3 has a user-friendly interface that is easy to navigate. The tool is designed to be intuitive, which means that you can quickly learn how to use it and start gaining insights from your data.
To use the new Insights user interface, follow these steps:
Visit the Insights web URL. You will be redirected to the Insights home page.
The Insights Dashboard is the default dashboard.
You can see all the foundation and project cards on the main page. Alternatively, search the project or a foundation using the Search Bar.
Insights V3 is the perfect tool for you if you want to:
Track the performance of open source projects in real time.
Analyze data quickly.
Use an online reporting tool.
Download reports in CSV or other formats.
Compare reports for a selected time period.
Measure the project's growth and the team's performance.
Track historical data to identify trends and patterns.
Detect potential issues early and take corrective actions.
The target audience for Insights V3 can vary depending on its specific features and functionalities. Here are some of the key target audience groups that can benefit from this tool:
The Landing Page provides all the important analytics about your foundations and projects. It is designed to give you a quick overview of your data and help you navigate the tool easily.
This page focuses on the Foundation Cards and the individual Project Cards, which serve as the core navigational elements, presenting the key data metrics.
Select the Projects and the Foundations: The search box at the top of the main menu helps you find a particular project or repository.
Foundation Cards: Foundation Cards are like summary cards that provide key insights into different open source foundations. When you click on one of these cards, you will be redirected to a Foundation Overview dashboard specifically dedicated to that foundation.
When you click on a foundation card that has only one project, you will be redirected to the Project Overview page.
Project Cards: On the main page, you will see the project cards. Each card represents an individual open source project. When you click on a project card, it takes you to a dedicated Overview Page for that project. These cards show you real-time data about each project, such as important numbers and updates.
To see the detailed definition, click > to expand.
An identity can be claimed by a user. A single user can have multiple identities in different data sources, or even multiple identities of the same data source type.
A user is an LFX profile. The user will have an LFID and can claim identities.
A contributor is defined as someone who makes a contribution.
A contributor is either an unclaimed identity or a set of claimed identities (under a single user).
A new contributor is someone who makes their first contribution in a specific time range, and can also be scoped to a particular project.
This means that a given contributor should not have any contributions before the current time range's start date (for a specific project, we consider that project's new contributors).
An identity is virtually affiliated with an organization when the identity's email address domain matches the organization domain defined in a special org_domain table.
Click on a Foundation Card from the Landing Page or search (2) for the foundation using the search box at the top.
Scroll down to see all the listed foundations and projects.
The Foundation Overview page in Insights V3 provides a comprehensive snapshot of your open source foundation, enabling you to gain valuable insights into your projects' performance and growth.
Disclaimer: Not all foundations follow the same maturity-level categorization, so the Foundation Overview page may look different for your foundation.
At the top of the page, you will find the header section, which includes the following elements:
The name of your foundation is displayed prominently at the top of the page, providing clear identification.
This feature allows you to search for specific projects within your foundation, making it easy to find and access project information quickly. Select a project to go to the project overview page.
At the top, you will see the following four high-level metrics:
Projects: The metric shows the total projects within the foundation.
Contributors: It shows the total number of contributors among all the projects within the foundation.
Lines of Code: Displays the total lines of code written for all the projects within the foundation.
Organizations: The metric shows the total number of organizations that have contributed to the projects within the foundation.
Using this search box, you can select another foundation or a project.
On the landing page, the foundation cards are designed to show you real-time data and key metrics related to the foundation and its projects.
A foundation card displays the following key metrics:
When you click on a foundation card, it opens up a Foundation Dashboard dedicated to that foundation. Here, you will find more detailed information about the foundation's contributions to the open source community.
A foundation card has the following details:
It shows the key metrics of the foundation.
Software Value: The Constructive Cost Model (COCOMO) is a procedural cost-estimation model for software projects.
On the Foundation Overview page, you will find the Project Ecosystem Metrics. This section includes two informative charts.
Project Ecosystem Metrics in an open source foundation represent quantitative measurements that provide insights into the health, growth, and diversity of projects within the foundation's ecosystem. These metrics involve data analysis across various dimensions, such as:
Project Maturity Levels: Categorization of projects based on their developmental stage, community engagement, and stability.
Growth Trends: Analysis of the number of projects being accepted over time, indicating the expansion and attraction of the foundation's ecosystem.
Diversity Indices: Evaluation of the diversity within projects and their communities, assessing the inclusiveness and global reach of the foundation's ecosystem.
Sustainability Indicators: Insights into the long-term viability of projects, including funding, resource allocation, and project continuity plans.
Disclaimer: Not all foundations follow the same maturity-level categorization, so the Foundation Overview page may look different for your foundation.
You will see the foundation's total number of projects, grouped by maturity level.
The chart enables visualization of growth and acceptance patterns for new projects.
Hovering over the chart reveals the count of projects accepted during specific time frames.
It presents a historical trend of project acceptances into your foundation over time.
Provides analysis of acceptance rates to identify periods of high or low project acceptance.
The subsequent chart illustrates the trend of projects approved by your foundation.
Distribution of projects based on maturity level and rating
The metric categorizes your projects based on their maturity level and rating.
This helps you see how projects are distributed across different maturity levels and ratings, allowing you to make informed decisions about resource allocation and project management.
At each maturity level, projects are further segregated as per rating.
For example: Click on the pie chart under the Incubating Projects card to see the projects' categorization as per the ratings.
From the left navigation pane, click the icon to return to the Landing Page.
Click on the icon to open the GitHub page of the foundation.
Click on the icon to open the foundation's webpage.
Hover over the icon to see the inception year of the foundation.
Download Icon: Click the icon (2) to download the foundation card.
Project velocity in open source projects refers to the rate at which development tasks are completed and features are delivered. It measures the amount of work completed in a specific amount of time.
A higher velocity suggests increased efficiency and progress, while a lower velocity may indicate challenges or bottlenecks.
Monitoring project velocity helps teams assess their performance and plan future tasks accordingly, ensuring steady project advancement.
The Project Velocity chart displays data from the last calendar year.
On the Y-axis, there's a logarithmic scale representing PRs and Issues.
On the X-axis, there's a logarithmic scale representing commits.
The chart visualizes the correlation between code changes and collaboration.
To further understand the project's velocity, create a leaderboard. This ranks projects based on their commit numbers and provides a comparative view of their commits, PRs, and issues. This leaderboard can help in identifying the most active projects at a glance.
Review the top projects based on their commit numbers.
Compare their commit count, PRs, and issues in a single view.
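The ranking described above can be sketched as a simple sort by commit count. The project names and numbers below are made up for illustration; they are not real Insights data.

```python
# Hypothetical sketch of a project-velocity leaderboard: rank projects by
# commit count and show commits, PRs, and issues side by side.
projects = [
    {"name": "project-a", "commits": 1200, "prs": 340, "issues": 210},
    {"name": "project-b", "commits": 4500, "prs": 980, "issues": 650},
    {"name": "project-c", "commits": 800,  "prs": 120, "issues": 90},
]

# Sort descending by commits to get the leaderboard order.
leaderboard = sorted(projects, key=lambda p: p["commits"], reverse=True)
for rank, p in enumerate(leaderboard, start=1):
    print(f'{rank}. {p["name"]}: {p["commits"]} commits, '
          f'{p["prs"]} PRs, {p["issues"]} issues')
```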
On the Projects page, you can see the project cards of the selected foundation with their project maturity tags.
You can filter the project cards using the Maturity Level, Rating, and Accepted filter options.
On the landing page, the project cards are designed to show you real-time data and key metrics related to each project.
A project card displays the following key metrics:
Key metrics on a project card may vary as per the data sources. Projects with Git data sources will have fewer metrics.
When you click on a project card, it opens up an overview page dedicated to that specific project. This overview page provides more detailed information about the project, such as in-depth analytics, charts, and other relevant data.
The Project Card has the following details:
GitHub Icon: Click the (1) GitHub icon to go to the GitHub repositories of the project.
Aggregated data: It shows the real-time data of contributions, commits, PRs, issues, stars, and forks for the project.
Info Icon: Shows the date and time when the Best Practice Score was last updated.
Software Value: The Constructive Cost Model (COCOMO) is a procedural cost-estimation model for software projects.
Download Icon: Click the icon (2) to download the project card.
Constructive Cost Model
The COCOMO (Constructive Cost Model) is a widely used model that estimates the effort, time, and cost associated with software development projects.
The model takes into account factors such as project size, complexity, team experience, and development environment.
The COCOMO model consists of three different levels or modes:
Basic COCOMO: This mode is used for early-stage project estimates and focuses on estimating effort based on lines of code (LOC). It uses a simple formula to calculate the effort required for a project, taking into account the project size in KLOC (thousands of lines of code).
Insights V3 uses the basic model to calculate the software estimates for the selected open source projects.
Constants based on Software Project Types (stored in the DB):
For more information, see:
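As a sketch of the Basic COCOMO calculation: the defaults below are Boehm's published constants for "organic" projects (a=2.4, b=1.05, c=2.5, d=0.38), which may differ from the project-type constants Insights stores in its DB.

```python
# Basic COCOMO: effort and schedule estimated from project size in KLOC.
# Default constants are the published organic-mode values; pass different
# a/b/c/d for semi-detached or embedded projects.
def basic_cocomo(kloc: float, a: float = 2.4, b: float = 1.05,
                 c: float = 2.5, d: float = 0.38) -> tuple[float, float]:
    """Return (effort in person-months, development time in months)."""
    effort = a * kloc ** b   # person-months
    time = c * effort ** d   # months
    return effort, time

effort, months = basic_cocomo(100)  # a hypothetical 100 KLOC organic project
print(f"{effort:.0f} person-months over {months:.1f} months")
```

Multiplying the estimated effort by an assumed loaded salary per person-month yields a dollar figure of the kind shown as Software Value.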
The Commits metric refers to the analysis of contributors' code commits within a specified timeframe. A code commit represents a unit of change to the software's source code repository.
Each commit includes the following:
committed-commit
("Default Branch" only)
In this chart, only commits are counted, not the Roles. Each commit with a unique Commit SHA is counted as one Commit. The roles do not matter here.
The dashboard shows the commits snapshot and a detailed chart. The detailed chart is a combined chart (line chart and bar chart) that shows new commits vs. total commits.
On the Overview page, select the project and repositories (1) for which you want to see the data.
Select the specific period using the filter option (2).
The high-level tile (3) shows you the total commits for the selected time range.
The detailed analysis chart shows you the New commits and the cumulative count of total commits for the selected period. On the left side, the chart shows the chart trend summary (4).
Hover over the chart (5) to see the new commits and the total commits for the selected month.
This interactive download feature (6) enables you to download the chart in CSV and PNG file formats.
The metric enables project maintainers and stakeholders to gain valuable insights into code changes and progress within a specified period.
It provides insights into the volume and frequency of code changes made by contributors. By visualizing commit data in a bar chart, you can track the progress of development efforts over time.
Changes in commit counts reveal periods of intense development, periods of slower activity, or the impact of specific events or milestones on the project.
The overview page should provide a high-level summary of the project's activity, contributors, and performance metrics, including:
The number of contributors and their distribution by location or organization.
The total number of commits, pull requests, and issues.
The average time to resolve issues and merge pull requests.
The overall health of the codebase, including code quality and security vulnerabilities.
The level of community engagement, such as the number of comments on pull requests.
The analytics tools on the overview page provide a range of features and visualizations that can help you gain insights into the project's performance, identify areas for improvement, and make informed decisions about development and collaboration.
The primary data sources for Insights V3 are the code repositories and the publicly available GitHub and Git databases. Refer to Integrations to learn more about data connectors.
The data visualization on the overview page shows real-time data on the total number of contributors and the total number of active contributors across all monitored repositories during the selected time period.
On the Overview page, select the project and repositories (1) for which you want to see the data.
Select the specific time period using the filter option (2).
The high-level tile (3) shows you the total unique contributors (calculated based on their member ID) for the selected time range.
The detailed analysis chart shows you the active contributors and the cumulative count of total contributors for the selected period. On the left side, the chart shows the chart trend summary (4).
Hover over the chart (5) to see the number of active contributors and the total contributors for the selected month.
This interactive download feature (6) enables you to download the chart in CSV and PNG file formats.
When you want to see the health of your open source project, the Contributor Chart is a crucial project performance indicator.
Visualizing the number of contributors over time makes it easier to identify trends, patterns, and overall community interest. The trend helps project maintainers and other stakeholders act based on the charts.
Tracking the number of contributors can provide insights into the health and vitality of your project.
By analyzing changes in the contributor count, project managers can gain insights into the effectiveness of their community outreach and development strategies.
The Overview page's Key Metrics and Detailed Analysis section includes six cumulative charts highlighting significant trends and patterns within your analytics data.
The cumulative charts help you compare different metrics on single charts. This comparative analysis helps identify relationships and draw meaningful conclusions.
The six cumulative charts show quick snapshots of the analytical data, and the detailed analysis chart helps you with deeper analysis.
The Fork Metric measures and analyzes the number of times a project has been forked by other developers.
Forking is the process of creating a copy of a project's source code repository to either modify and enhance the project or use it as a starting point for a new project.
The bar chart on the dashboard represents the analysis, displaying the number of forks over time. Hover over a specific bar to access the detailed fork information for that particular month within the selected period.
The interactive download feature (Icon) enables you to download the chart in CSV and PNG file formats.
How popular is the project? The Fork Metric provides insights into the popularity of your project. A higher number of forks generally indicates that developers find your project useful and valuable enough to build on it or adapt it to their specific needs.
Code Reuse: By analyzing the Fork Metric, you can get data on code reuse and identify potential opportunities for improvement.
Community Engagement: A growing number of forks indicates an active and involved community, contributing to the project's growth.
Project Evolution: By monitoring forks over time, you can identify significant milestones.
Within Insights V3, the "Filter the Date" feature allows you to customize your analytics view based on specific date ranges. This feature provides flexibility and control over the time period for which data is displayed.
Follow these steps to utilize the date-filtering feature:
On the right side of the analytics dashboard, locate the "Date Filter" section.
Click on the "Date Filter" section to expand the options.
Choose from the predefined date range options: Today, Last Week, Last 30 days, Last Quarter, Last Year, Last Two Years, or All Time.
Select the desired option by clicking on it.
Click the Bots checkbox if you want to hide the bots' data from the analytics.
The filter has This Year as the default time period.
To specify a custom date range, click on the Custom option within the date range selection menu.
On the calendar widget, select the start and end dates for your custom range.
The analytics dashboard will automatically update to display data within the selected custom date range.
After selecting a predefined date range or setting a custom date range, click the Apply button to apply the date filter.
The analytics dashboard will refresh to reflect the chosen date range, displaying data only for the selected period.
To display data for the entire available range and remove the date filter, click on the Clear Dates button.
A Contribution Leadership board visualization displays the contributions made by individual contributors to an open source project. It ranks contributors based on the number of code commits, pull requests, issues closed, or other metrics and visually represents their relative activity levels and impact on the project.
This chart displays individual identities, not merged contributors as in the Community Management tool. Even if some identities are combined in CM, they will still show up as separate identities in the Insights V3 leaderboard.
Merging contributors refers to combining the contributions of a single individual who may have been working under different user accounts.
This can happen when a contributor has multiple git configurations or submits pull requests from different email addresses.
Merging these accounts into a single identity ensures the accuracy of the leaderboard and gives a true representation of an individual’s contributions.
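A minimal sketch of what merging means in practice, assuming claimed identities are linked by a common member ID. The field names (member_id, commits) and the data are illustrative, not the actual Insights schema.

```python
# Illustrative sketch of merging contributor identities: several identities
# (different emails or git configs) claimed under one member ID have their
# contribution counts aggregated per member.
from collections import defaultdict

identities = [
    {"email": "jane@redhat.com",    "member_id": "m-1", "commits": 40},
    {"email": "jane.doe@gmail.com", "member_id": "m-1", "commits": 15},
    {"email": "bob@example.org",    "member_id": "m-2", "commits": 30},
]

merged = defaultdict(int)
for ident in identities:
    merged[ident["member_id"]] += ident["commits"]

print(dict(merged))  # {'m-1': 55, 'm-2': 30}
```

Without merging, the first two rows would appear as two separate contributors on the leaderboard; with merging, they count as one person with 55 commits.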
Recognition and Motivation: The Contributor Leaderboard recognizes and acknowledges the efforts of individual contributors. It highlights their contributions, encourages ongoing engagement, and motivates contributors to continue their valuable work.
Community Engagement: It creates a sense of community and healthy competition, encouraging collaboration and inspiring others to contribute and improve their ranking on the leaderboard.
Collaboration Opportunities: The leaderboard helps project maintainers and community members identify potential collaborators or subject-matter experts within the project. It will be easier to identify the most active contributors and connect with them.
The Issue Metric measures the number of issues reported and tracked within a specified period. It compares the number of issues opened, the number of issues closed, and the total commits for the selected time period.
The metric is based on the following activity types:
issues-closed
issues-opened
The analytics tool employs a combined chart (a line chart and bar charts) on its dashboard to analyze the Issue Metric. The line on the chart connects the data points, allowing you to observe trends and patterns over time.
The dashboard shows the issues (opened + closed) in a snapshot and a detailed chart (open, closed, and the total issues).
On the Overview page, select the project and repositories (1) for which you want to see the data.
Select the specific time period using the filter option (2).
The high-level tile (3) shows you the total issues (open + closed) for the selected time range.
The detailed analysis chart shows you the open issues, closed issues, and the cumulative count of total issues for the selected period. On the left side, the chart shows the chart trend summary (4).
Hover over chart (5) to see the open issues, closed issues, and total issues for the selected month.
This interactive download feature (6) enables you to download the chart in CSV and PNG file formats.
Issues Tracking and Management: By visualizing the data on a line chart, it becomes easier to identify the increase or decrease in issue activity, allowing for effective resource allocation and prioritization.
Performance Evaluation: The Issue Metric helps in evaluating the performance of the development team and the project as a whole. Changes in issue count over time indicate improvements in software quality, bug-fixing efficiency, or the impact of development efforts.
Community Engagement: A higher number of reported issues indicates the active participation and involvement of the community in the open source project.
On a regular basis, a number of checks are performed on each repository listed in the database.
Checks are grouped into check sets.
One or more check sets are applied to a single repository, and each check set specifies the set of checks that will be performed on the repository.
The check’s file must declare the following information:
ID: check identifier.
WEIGHT: weight of this check, used to calculate scores.
CHECK_SETS: check sets this new check belongs to.
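While the exact scoring formula is not documented here, the following sketch shows how per-check WEIGHT values could roll up into a repository score: each passed check contributes its weight, and the score is the passed weight as a percentage of the total weight in the check set. The weights and pass/fail results below are invented for illustration.

```python
# Hedged sketch of weighted check scoring (weights and results are made up).
checks = [
    {"id": "adopters",        "weight": 5,  "passed": True},
    {"id": "changelog",       "weight": 10, "passed": False},
    {"id": "code_of_conduct", "weight": 10, "passed": True},
    {"id": "readme",          "weight": 50, "passed": True},
]

total = sum(c["weight"] for c in checks)                    # 75
passed = sum(c["weight"] for c in checks if c["passed"])    # 65
score = round(100 * passed / total)
print(f"score: {score}/100")
```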
ID: adopters
List of organizations using this project in production or at stages of testing.
This check passes if:
An adopters file is found in the repository. Globs used:
An adopters reference is found in the repository’s README file. This is in the form of a title header or a link. Regexps used:
ID: changelog
A curated, chronologically ordered list of notable changes for each version.
This check passes if:
A changelog file is found in the repository. Globs used:
A changelog reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
A changelog reference is found in the last GitHub release content body. Regexps used:
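The file-or-README logic of this check can be sketched as follows. The glob and regexp patterns shown are placeholders standing in for the real ones (which are not reproduced here), and the function name is hypothetical.

```python
# Illustrative sketch of the changelog check: pass if a changelog file
# matches a glob pattern, or the README references a changelog as a title
# header or link. Patterns below are assumptions, not the actual ones.
import re
from pathlib import Path

CHANGELOG_GLOBS = ["CHANGELOG*", "changelog*", "docs/CHANGELOG*"]
README_REGEXP = re.compile(
    r"^#+.*changelog|\[.*changelog.*\]\(",
    re.IGNORECASE | re.MULTILINE,
)

def changelog_check(repo: Path) -> bool:
    # 1. A changelog file is found in the repository.
    for pattern in CHANGELOG_GLOBS:
        if any(repo.glob(pattern)):
            return True
    # 2. A changelog reference (title header or link) is found in the README.
    readme = repo / "README.md"
    if readme.exists() and README_REGEXP.search(readme.read_text()):
        return True
    return False
```

The other file-based checks (adopters, code of conduct, contributing, governance, maintainers, roadmap) follow the same pattern with their own globs and regexps.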
ID: code_of_conduct
Adopt a code of conduct to establish community standards, promote an inclusive and welcoming initiative, and outline procedures for handling abuse.
This check passes if:
A code of conduct file is found in the repository. Globs used:
A code of conduct reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
A code of conduct file is found in the default community health files repository, for example.
ID: contributing
A contributing file in your repository provides potential project contributors with a short guide to how they can help with your project.
This check passes if:
A contributing file is found in the repository. Globs used:
A contributing reference is found in the repository’s README
file. This can be in the form of a title header or a link. Regexps used:
A contributing file is found in the default community health files repository.
ID: governance
Document that explains how the governance and committer process works in the repository.
This check passes if:
A governance file is found in the repository. Globs used:
A governance reference is found in the repository’s README
file. This can be in the form of a title header or a link. Regexps used:
ID: maintainers
The maintainers file contains a list of the current maintainers of the repository.
This check passes if:
A maintainers file is found in the repository. Globs used:
A maintainers reference is found in the repository’s README
file. This can be in the form of a title header or a link. Regexps used:
ID: readme
The readme file introduces and explains a project. It contains information that is commonly required to understand what the project is about.
This check passes if:
A readme file is found in the repository. Globs used:
ID: roadmap
Defines a high-level overview of the project’s goals and deliverables ideally presented on a timeline.
This check passes if:
A roadmap file is found in the repository. Globs used:
A roadmap reference is found in the repository’s README
file. This can be in the form of a title header or a link. Regexps used:
ID: summary_table
The Projects Summary Table is a CNCF Business Value Subcommittee initiative to supplement the CNCF Landscape and include further information about CNCF projects for the wider Cloud Native community.
This check passes if:
At least one of the summary_* fields has been set in the project's extra section in the Landscape YAML file.
ID: website
A URL that users can visit to learn more about your project.
This check passes if:
A website URL is configured in the GitHub repository.
ID: analytics
Project websites provide some web analytics.
This check passes if:
A Google Analytics 3 (Universal Analytics) Tracking ID is found in the source of the website configured in GitHub. Regexps used:
A Google Analytics 4 Measurement ID is found in the source of the website configured in GitHub. Regexps used:
The HubSpot tracking code is found in the source of the website configured in GitHub. Regexps used:
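As a rough illustration of what this detection looks like, the sketch below matches the general shape of GA3 tracking IDs (UA-XXXXXXXX-N) and GA4 measurement IDs (G-XXXXXXXXXX). The check's actual regexps are not reproduced here; these patterns are approximations.

```python
import re

# Approximate ID shapes; the check's real regexps are assumptions here.
GA3_RE = re.compile(r"\bUA-\d{4,10}-\d{1,4}\b")   # Universal Analytics
GA4_RE = re.compile(r"\bG-[A-Z0-9]{6,14}\b")      # GA4 measurement ID

def has_web_analytics(page_source):
    """Pass if a GA3 tracking ID or GA4 measurement ID appears in the page."""
    return bool(GA3_RE.search(page_source) or GA4_RE.search(page_source))
```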
ID: artifacthub_badge
Projects can list their content on Artifact Hub to improve their discoverability.
This check passes if:
An Artifact Hub
badge is found in the repository’s README
file. Regexps used:
ID: cla
The CLA defines the conditions under which intellectual property is contributed to a business or project.
This check passes if:
A CLA check is found in the latest merged PR on GitHub. Regexps used:
This check will be automatically marked as exempt if the DCO check passes but this one does not.
ID: community_meeting
Community meetings are often held to engage community members, hear more voices, and get more viewpoints.
This check passes if:
A reference to the community meeting is found in the repository’s README
file. Regexps used:
ID: dco
Mechanism for contributors to certify that they wrote or have the right to submit the code they are contributing.
This check passes if:
The last commits in the repository have the DCO signature (Signed-off-by). Merge pull request and merge branch commits are ignored for this check.
A DCO check is found in the latest merged PR on GitHub. Regexps used:
This check will be automatically marked as exempt if the CLA check passes, but this one does not.
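The first DCO condition, checking recent commit messages for a Signed-off-by trailer while skipping merge commits, can be sketched as follows (a minimal illustration, not the tool's implementation):

```python
# Sketch of the first DCO condition described above: recent commits must
# carry a Signed-off-by trailer; merge commits are ignored.
def dco_check(commit_messages):
    relevant = [
        m for m in commit_messages
        if not m.startswith(("Merge pull request", "Merge branch"))
    ]
    return all("Signed-off-by:" in m for m in relevant)
```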
ID: github_discussions
Projects should enable GitHub discussions in their repositories.
This check passes if:
A discussion that is less than one year old is found on GitHub.
ID: openssf_badge
The Open Source Security Foundation (OpenSSF) Best Practices badge is a way for Free/Libre and Open Source Software (FLOSS) projects to show that they follow best practices.
This check passes if:
An OpenSSF
(CII) badge is found in the repository’s README
file. Regexps used:
ID: openssf_scorecard_badge
Scorecard assesses open source projects for security risks through a series of automated checks. For more information about the Scorecard badge please see https://github.com/marketplace/actions/ossf-scorecard-action#scorecard-badge.
This check passes if:
An OpenSSF
Scorecard badge is found in the repository’s README
file. Regexps used:
ID: recent_release
The project should have released at least one version in the last year.
This check passes if:
A release that is less than one year old is found on GitHub.
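A minimal sketch of this condition, assuming "one year" means 365 days and naive timestamps:

```python
from datetime import datetime, timedelta

def recent_release_check(release_dates, now=None):
    """Pass if any release is less than one year (assumed 365 days) old."""
    now = now or datetime.now()
    return any(now - d < timedelta(days=365) for d in release_dates)
```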
ID: slack_presence
Projects should have a presence in the CNCF Slack or Kubernetes Slack.
This check passes if:
A reference to the CNCF Slack or Kubernetes Slack is found in the repository’s README
file. Regexps used:
Calculating a global best practice score for an open-source project involves evaluating various aspects of the project against predefined best practices and assigning weights to those aspects based on their importance. Let's walk through a sample calculation.
Define the following set of best practices that are important for the success and quality of the open-source project. Each category should have a set of criteria that can be evaluated.
Assign weights to each category based on their relative importance. These weights should add up to 100%. The weights reflect how much each category contributes to the overall quality of the project.
Evaluate Each Criterion
For each criterion within a category, evaluate the project and assign a score.
Use a numerical scale (0–10) or any other suitable scale.
Code of conduct: 8
Governance: 9
Maintainer: 8
Website: 7
Analytics: 9
GitHub Discussion: 10
Community meetings: 8
Binary Artifacts: 8
Dangerous Workflow: 9
Approved Licenses: 9
Each category score is calculated as the category's average criterion score multiplied by its weight:
Documentation: ((8+9+8+7)/4) × 0.40 = 3.2
Standards: ((9+10+8)/3) × 0.30 = 2.7
Security: ((8+9)/2) × 0.20 = 1.7
Legal: 9 × 0.10 = 0.9
Calculate Global Score
Sum up the category scores to obtain the global score for the best practice score of the open-source project.
Documentation + Standards + Security + Legal = 3.2 + 2.7 + 1.7 + 0.9 = 8.5
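The whole calculation can be reproduced in a few lines. The grouping of criteria into categories below is inferred from the arithmetic shown above; the weights and criterion scores are the sample values.

```python
# Sample values from the walkthrough above; the assignment of criteria to
# categories is inferred from the arithmetic shown in the example.
CATEGORIES = {
    "Documentation": {"weight": 0.40, "scores": [8, 9, 8, 7]},
    "Standards":     {"weight": 0.30, "scores": [9, 10, 8]},
    "Security":      {"weight": 0.20, "scores": [8, 9]},
    "Legal":         {"weight": 0.10, "scores": [9]},
}

def global_score(categories):
    """Sum of (mean criterion score) * (category weight) per category."""
    return sum(
        (sum(c["scores"]) / len(c["scores"])) * c["weight"]
        for c in categories.values()
    )
```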
Contributor Dependency measures and analyzes the dependencies or relationships between different contributors within a project. It explores how contributors rely on each other, collaborate, and interact in terms of code contributions, reviews, and other collaborative activities.
Contributor dependency shows the relationship between contributors or entities within a project, where the actions or outputs of one contributor depend on the inputs or outputs of another.
Collaboration: It identifies which contributors frequently interact, exchange ideas, review each other's work, and collaborate on code changes.
Knowledge Sharing and Expertise: Understanding these dependencies can help project maintainers identify subject matter experts, encourage knowledge sharing, and allocate resources effectively.
Project Health and Sustainability: By analyzing Contributor Dependency, project maintainers can evaluate the health and sustainability of the project. Dependencies that are concentrated around a few contributors may pose risks if those contributors become less active or leave the project.
A best practice score visualization is a tool that helps project leads and managers assess the overall health and quality of an open source software project.
It typically evaluates the project against a set of best practices or standards for software development, such as the categories Documentation, Standards, Security, Legal, and Reliance.
It generates a score or rating based on how well the project meets these criteria.
On the Overview page, select the project and repositories for which you want to see the best practice score.
Select the specific time period using the filter option.
Scroll down to find the best practice score dashboard.
You can see the aggregated score (3) and each category's score on the dashboard.
Click on any category to see the expanded page where you can see the detailed analysis for each repository.
Click the Create Issue button to create an issue for each repository.
ID: license_approved
Whether the repository uses an approved license or not.
This check passes if:
The license identified matches any of the following:
ID: license_scanning
License scanning software scans and automatically identifies, manages, and addresses open source licensing issues.
This check passes if:
A FOSSA
or Snyk
link is found in the repository’s README
file. Regexps used:
ID: Apache_2.0
A permissive license whose main conditions require preserving copyright and license notices. Contributors provide an express grant of patent rights. Licensed works, modifications, and larger works may be distributed under different terms and without source code.
ID: trademark_disclaimer
Project sites should have the Linux Foundation trademark disclaimer.
This check passes if:
The Linux Foundation trademark disclaimer is found in the content of the website configured in GitHub. Regexps used:
Insights V3 often incorporates a Geographical Distribution metric to provide insights into the locations from which contributions originate.
Geographical Distribution analyzes and visualizes the contributions made by contributors across different regions around the world. It provides a breakdown of the top regions based on the total number of contributors, providing a clear understanding of the project's global engagement and scope.
Hover over the chart to view the number of contributors for each region during the selected period. This information provides a more granular view of contributor activity within specific regions.
Global Impact: Geographical Distribution allows you to assess the global impact of the open source project by providing insights into the regions where contributions are coming from.
Regional Comparison: Compare the contribution numbers across different regions to identify any notable variations. Assess whether certain regions show consistent contribution levels or if there are fluctuations that require further investigation.
Top Contributing Regions: It helps to identify the top five regions based on the total number of contributions. These regions represent areas where the project has significant engagement and impact.
ID: code_review
ID: dangerous_workflow
ID: dependency_update_tool
ID: maintained
ID: sbom
List of components in a piece of software, including licenses, versions, etc.
This check passes if:
The latest release on GitHub includes an asset whose name contains "sbom". Regexps used:
The repository’s README
file contains an SBOM section that explains where SBOMs are published, the format used, etc. Regexps used to locate the title header:
ID: security_policy
Clearly documented security processes explaining how to report security issues to the project.
This check passes if:
A security policy file is found in the repository. Globs used:
A security policy reference is found in the repository’s README
file. This can be in the form of a title header or a link. Regexps used:
ID: signed_releases
ID: token_permissions
Click the Download icon to download the dashboard.
For more information, see .
A link pointing to the license scanning results is provided in the metadata file.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
This check determines whether the project has generated executable (binary) artifacts in the source repository. For more details, see the
This check determines whether the project requires code review before pull requests (merge requests) are merged. For more details, see the
This check determines whether the project’s GitHub Actions workflows have dangerous code patterns. For more details, see the .
This check tries to determine if the project uses a dependency update tool, specifically or . For more details, see the
This check determines whether the project is actively maintained. For more details, see the
A security policy file is found in the
This check tries to determine if the project cryptographically signs release artifacts. For more details, see the
This check determines whether the project’s automated workflow tokens are set to read-only by default. For more details, see the
Organization Dependency Metric shows the analysis of how much a project's contributions depend on or are associated with different organizations.
With Organization Dependency Metrics, you can assess which organizations are significantly contributing to your project.
Engagement Assessment: For organizations involved in the project, this metric helps assess their level of engagement and impact. It can encourage healthy competition among contributors, resulting in greater involvement.
Risk Management: Dependency on a single organization for contributions can be risky. If that organization reduces its involvement, the project might face challenges.
The Active Days metric measures the number of days a contributor has made at least one contribution to a project. It counts the number of days on which a contributor has been actively engaged in the project's development.
The Active Days chart also displays two bars, one for the current period and one for the previous period, allowing you to compare them.
The Active Days dashboard provides you with the following insights:
You can monitor progress and identify trends. This information can be used to set goals and benchmarks for the project and measure success.
The active days metric provides a quick snapshot of the project's activity level. It helps determine whether the project is actively maintained or not.
The visualization can be used to quickly assess the activity level of a repository. A repository with a high number of active days is likely to be more active and healthy than one with a low number of active days.
By highlighting the importance of active days, project managers can encourage new contributors to become more involved in the project.
The Confluence Data Analytics Dashboard available in the open-source analytics application, Insights V3, provides a comprehensive suite of tools designed for deep analysis of user interactions, collaboration patterns, and content efficiency within the Confluence platform. Here is an overview of its features:
Track peak times for user engagement and contributions.
User Interaction Graphs: Visualize the network of collaborations among users.
Page Views and Edits Tracking: Monitor the popularity and evolution of content over time.
Most Engaged Content: Identify the content that receives the most views, comments, and shares.
Team Performance Metrics: Evaluate the productivity and collaboration levels of different teams.
Individual Contribution Insights: Assess the input of individual team members in the collaborative process.
Flexible Filtering: Create custom reports by applying filters based on users, time frames, and content types.
Export Features: Export reports in various formats for sharing or further analysis.
The dashboard integrates seamlessly with Confluence, leveraging its API to pull real-time data. This enables teams to make data-driven decisions, enhance collaboration, and improve content quality on the Confluence platform.
The Organization Leaderboard ranks organizations based on their activity types on Confluence pages for the selected date range. These activity types include new pages, blog posts, attachments, total pages, and page comments.
The leaderboard ranks organizations on the Confluence platform according to the following criteria:
New Pages: The number of new pages created by the organization.
Blog Posts: The frequency and quality of blog posts published.
Attachments: The number of files and documents attached to pages and posts.
Page Comments: The level of engagement demonstrated by comments on pages.
The Contributor Leaderboard on the Confluence Dashboard displays a ranking of users based on their contributions to Confluence activities within a specified date range.
Contributors are ranked based on metrics including new pages, comments, attachments uploaded, and blog posts on the platform.
The leaderboard provides valuable insights into user engagement and productivity within the Confluence environment.
To access the Contributor Leaderboard in Confluence, follow these steps:
On the Overview page, select the project and repositories (1) for which you want to see the data.
On the left navigation pane, click Data Integrations > Confluence.
Select the specific date range using the filter option (2).
Scroll down and you will see the Contributors Leaderboard widget.
Use the drop-down menu to filter the leaderboard based on specific Confluence activities such as page edits, comments, attachments, and blog posts.
The leaderboard will dynamically update to display rankings based on the selected date range and activity filter.
The leaderboard serves several purposes:
Enhancing Engagement: Motivates users to participate more actively.
Tracking Productivity: Offers insights into who the most active contributors are.
Identifying Knowledge Leaders: Helps in recognizing contributors who are pivotal in spreading knowledge and expertise across the organization.
By effectively utilizing the Confluence Contributor Leaderboard, organizations can foster a more engaged and productive community, driving the collective success of their projects.
The Organization Leaderboard ranks organizations based on their contributions to the project. The leaderboard provides insights into the collective efforts of organizations to drive the success and growth of your projects.
It helps you determine if your project has a healthy contribution from multiple organizations and if new organizations are coming to contribute to the project.
Recognition: The Organization Leaderboard recognizes and showcases the contributions made by various organizations.
Project Sustainability: The Organization Leaderboard evaluates the involvement of organizations and assesses the project's long-term sustainability and growth potential.
Trust and Credibility: When organizations are actively engaged in your projects and their contributions are recognized through the leaderboard, it enhances the overall trust and credibility of the project.
The metric provides insights into the technical contribution breakdown across weekdays and weekends. It shows the times of day when most contributions happen, so that you can plan for maximum participation in the project. Each day is categorized by its level of activity, indicating low to high contribution levels.
Commits are recorded in the individual contributors' local time zone.
Only commit data is used for this dashboard.
The commit types counted are:
Committed commits
Co-authored commits
Authored commits
Activity Level Assessment: Work Time Distribution allows you to assess the technical activities across different days of the week. By analyzing the chart, project managers can identify contribution patterns and trends, such as peak activity days or days with lower participation.
Productivity Monitoring: Work Time Distribution helps you to monitor contributors' productivity and engagement. By analyzing the breakdown of contributions, you can identify periods of high productivity and low engagement.
Work Optimization: By understanding the distribution of contributions across weekdays and weekends, project managers can identify potential collaboration challenges due to varying availability.
Weekday vs. Weekend Contributions: Compare the contribution levels between weekdays and weekends. Assess significant differences in activity, and identify any patterns or preferences in contributor engagement during these periods.
Maximum Participation: As an Executive Director or Maintainer, when you want to set up a community call for your project, you can view the time when most contributions happen and schedule the call for maximum participation.
The Activities Breakdown chart provides a detailed breakdown of various activities on your Confluence pages, including new pages, blog posts, attachments, total pages, and page comments.
To access the Activities Breakdown, follow these steps:
On the Overview page, select the project and repositories (1) for which you want to see the data.
On the left navigation pane, click Data Integrations > Confluence.
Select the specific date range using the filter option (2).
Scroll down and you will see the Activities Breakdown widget.
Analyze the distribution of activities to understand user engagement patterns and trends over time.
Click the Download icon to download the chart in PNG or CSV format.
A mailing list is a communication platform where you can sign up to communicate messages via email. It acts as a centralized hub for discussions, announcements, and collaborations among a specific group sharing common interests or working towards a common project.
Groups.io is an online platform that offers mailing list management and hosting facilities. It offers features for creating, managing, archiving mailing lists, and facilitating email communication. To learn more, see https://groups.io/.
The Mailing Lists Dashboard within Insights V3, integrated with Groups.io, provides data insights into project communications. Its primary objectives and goals include:
Objective: Centralizing and analyzing communication data from Groups.io mailing lists.
Goals:
Communication Analysis: Understand the frequency, nature, and trends of interactions within mailing lists.
Engagement Measurement: Measure user engagement levels, message frequency, and active contributors.
Community Insights: Identify contributors and organizations involved, fostering collaboration and understanding community dynamics.
This leaderboard ranks contributors by their engagement level across all mailing lists within the specific timeframe, showcasing the most engaged and active contributors.
The leaderboard integrates data from Groups.io to help you understand how new contributors are engaging with your project's mailing lists.
On the Overview page, select the project and repositories for which you want to see the new contributor leaderboard.
From the left navigation, click Mailing Lists.
Look for the date filter in the top-right corner of the dashboard.
Click on the calendar icon and select the timeframe you want to analyze. You can choose predefined options like "Last year" or "This year", or set a custom date range.
Scroll down the Mailing Lists dashboard to see the most active contributors leaderboard.
The leaderboard displays the most active contributors based on their message count during the chosen period.
Each entry shows:
Rank: Position on the leaderboard based on message count.
Contributor Name: Username of the contributor.
Messages: Total number of messages posted.
Last Message Date: Date of the most recent message posted.
Click Show More to expand the list.
Click the Download icon to download the leaderboard.
The ranking is based on the number of messages or contributions from newly onboarded contributors to the mailing lists during a specific period.
This dashboard provides insights into the new contributor's activity for the selected projects. It integrates data from Groups.io to help you understand how new contributors are engaging with your project's mailing lists.
On the Overview page, select the project and repositories for which you want to see the new contributor leaderboard.
From the left navigation, click Mailing Lists.
Look for the date filter in the top-right corner of the dashboard.
Click on the calendar icon and select the timeframe you want to analyze. You can choose predefined options like "Last year" or "This year", or set a custom date range.
Scroll down the Mailing Lists dashboard to see the leaderboard.
The leaderboard displays the top contributors based on their message count during the chosen period.
Each entry shows:
Rank: Position on the leaderboard based on message count.
Contributor Name: Username of the contributor.
Messages: Total number of messages posted.
Last Message Date: Date of the most recent message posted.
Click Show More to expand the list.
Identify Potential Contributors: Discover individuals who are actively engaging and might be interested in contributing to other project areas.
Measure Community Growth: Track the rate at which new people are joining the mailing list, which can indicate overall community health.
Regularly review the leaderboard to identify any trends or patterns in new contributor activity.
Click the Download icon to download the leaderboard.
The Geographical Distribution chart is a feature in the Mailing Lists dashboard of the open-source analytics application, Insights V3. This interactive chart provides a visual representation of the geographical locations of contributors to your mailing lists.
It allows you to see where your contributors are located and the extent of their contributions during a selected period.
With this chart, you can:
View the distribution of contributors globally.
Understand the extent of contributions from different regions.
Filter data by a specified period to analyze trends.
To access this feature:
Navigate to the Insights V3 dashboard.
Click on the Mailing Lists section.
Locate and select the Geographical Distribution
chart.
This tool is invaluable for community managers looking to understand and grow their global contributor base.
The Key Metrics and Detailed Analysis section includes four high-level tiles with charts highlighting significant trends and patterns within your analytics data.
The cumulative charts help you compare different metrics on single charts. This comparative analysis helps identify relationships and draw meaningful conclusions.
Mailing Lists: Indicates the total number of mailing lists associated with the project for the selected period.
Messages: Displays the total count of messages exchanged within the selected mailing list(s) for the selected period.
Contributors: Shows the total number of contributors actively participating in discussions.
Organizations: Highlights the involved organizations or entities contributing to the discussions within the mailing lists.
On the Overview page, select the project and repositories (1) for which you want to see the data.
From the left navigation, click Mailing List.
Select the specific period using the filter option (2).
Click the high-level tile (3), which shows the total number of mailing lists, messages, contributors, or organizations for the selected time range.
The detailed analysis charts show the following details:
Mailing Lists: shows active mailing lists vs. total mailing lists.
Messages: shows new messages vs. the total messages for the selected period.
Contributors: shows new contributors vs. the total contributors for the selected period.
Organizations: shows new organizations vs. total organizations for the selected period.
The ranking is based on the number of messages or contributions from newly onboarded organizations to the mailing lists during a selected period.
It highlights the engagement level of these organizations by showcasing their rankings derived from the quantity of messages contributed across the entire spectrum of mailing lists.
On the Overview page, select the project and repositories for which you want to see the new contributor leaderboard.
From the left navigation, click Mailing Lists.
Look for the date filter in the top-right corner of the dashboard.
Click on the calendar icon and select the timeframe you want to analyze. You can choose predefined options like "Last year" or "This year", or set a custom date range.
Scroll down the Mailing Lists dashboard to see the leaderboard.
The leaderboard displays the new organizations based on their message count during the chosen period.
Each entry shows:
Rank: Position on the leaderboard based on message count.
Name: Name of the organization.
Messages: Total number of messages posted.
Last Message Date: Date of the most recent message posted.
Click the Download icon to download the leaderboard.
This ranking displays the most recent discussions based on the timing of the last messages posted. It highlights the latest and most active discussions within the mailing lists for the selected period.
The table below provides a snapshot of the most active discussions within our mailing lists, ranked by the timing of the last message posted. This ensures you are always informed about the freshest and most relevant conversations.
Join these discussions to share your insights and contribute to our community's knowledge.
| Rank | Discussion Topic | Last Message Date |
|---|---|---|
Insights V3 incorporates a Performance Metric to provide insights into key performance indicators such as time to merge pull requests, build frequency, and build failure rate.
The dashboard presents this information using a bar chart, allowing you to visualize and analyze these performance metrics over a selected time period.
Time to Merge Pull Requests: Evaluate the average time taken to merge pull requests. Identify any significant variations or trends in the time-to-merge, which can indicate potential inefficiencies in the code review and merge processes.
Build Frequency: Assess the frequency of software builds. A higher build frequency signifies more frequent integration of code changes and adherence to continuous integration practices. A consistent and regular build schedule ensures rapid feedback and promotes collaboration among contributors.
Build Failure Rate: Analyze the percentage of build failures. Higher build failure rates indicate issues in the build process, such as compilation errors, test failures, or compatibility issues. Identifying and addressing these failures promptly ensures a more stable and reliable software product.
The Lead Time metric measures the average time from when a pull request is raised to when it is merged.
It shows the entire lifecycle of a pull request: PR raised > Review started > PR accepted > PR merged.
The Lead Time metric can be effectively visualized using box plots. Box plots can provide a visual representation of the distribution of lead times.
Project Efficiency: Analyzing the complete PR review cycle lead time reveals the efficiency of the software development process. By analyzing the time it takes for code changes to move through the development pipeline, project managers can identify delays or inefficiencies.
Quality Assurance: Lead time can provide insights into the quality assurance process. Longer lead times may indicate delays in testing or quality assurance activities, potentially leading to issues and bugs reaching production.
Insights V3 uses the Average Lead Time by Pull Request Time metric to provide insights into the time it takes for pull requests to be completed.
Average Lead Time by Pull Request Time refers to the average duration it takes for pull requests to progress from opening to merging. It measures the time span between the creation of a pull request and its successful inclusion into the project's codebase.
The chart consists of five bars, each of a different color. Each bar displays the average lead time in hours/days for pull requests based on the pull request size.
We have five buckets of Pull Request Sizes. They are:
1-9 lines
10-49 lines
50-99 lines
100-499 lines
500+ lines
Pull Request Size is computed by Lines Changed. Lines changed could be lines of code added, deleted, or updated.
The filled length of each bar is determined by the average lead time: the longer the lead time, the longer the bar.
Because this is an average, the lead times of all PRs in a given size bucket are averaged and displayed in minutes, hours, or days.
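The bucketing and averaging described above can be sketched as follows, with the size thresholds taken from the five buckets listed:

```python
# Sketch of the bucketing and averaging described above. Lead time is the
# time from PR open to merge, in hours; the five size buckets are as listed.
def size_bucket(lines_changed):
    if lines_changed < 10:
        return "1-9 lines"
    if lines_changed < 50:
        return "10-49 lines"
    if lines_changed < 100:
        return "50-99 lines"
    if lines_changed < 500:
        return "100-499 lines"
    return "500+ lines"

def average_lead_time_by_size(prs):
    """prs: list of (lines_changed, lead_time_hours) tuples.
    Returns {bucket: average lead time in hours}."""
    totals = {}
    for lines, hours in prs:
        totals.setdefault(size_bucket(lines), []).append(hours)
    return {bucket: sum(h) / len(h) for bucket, h in totals.items()}
```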
Workflow Efficiency: This metric provides valuable insights into the efficiency of the pull request workflow. Optimizing the lead time results in a faster integration of code changes and promotes collaboration among contributors.
Collaboration and Feedback: It reflects the speed at which contributors receive feedback on their code changes. A shorter lead time indicates a more responsive review process, encouraging contributors to engage actively.
Project Velocity: Monitoring the average lead time enables project managers to assess the overall project velocity. A shorter lead time helps maintain a high project velocity, ensuring rapid innovation and faster delivery of software features.
This ranking showcases the top messages based on the number of responses generated by these messages. It highlights the most engaging and widely discussed topics within the mailing lists for the selected period.
On the Overview page, select the project and repositories (1) for which you want to see the data.
From the left navigation, click Mailing List.
Select the specific period using the filter option (2).
Scroll down the page to the section where the leaderboard is hosted.
The leaderboard shows the rankings based on the number of messages. Look through the list to see which messages or topics are most engaging.
The leaderboard includes a feature to compare the increase or decrease of messages over a selected period.
Click Show More to expand the list.
If you want to download the list, click the download icon.
Insights V3 uses the Average Review Time by Pull Request Size metric to provide insights into how long pull requests take to be reviewed.
Average Review Time by Pull Request Size refers to the average duration it takes for pull requests to be reviewed by peers or project maintainers. It measures the time span between the creation of a pull request and when it receives thorough review feedback.
The chart consists of five bars, each of a different color. Each bar displays the average review time in hours or days for pull requests, based on the size of the request.
We have five buckets of Pull Request Sizes. They are:
1-9 lines
10-49 lines
50-99 lines
100-499 lines
500+ lines
Pull Request Size is computed from lines changed. Lines changed can be lines of code added, deleted, or updated.
The filled length of each bar is proportional to the average review time: the longer the review time, the longer the fill.
Code Quality Assurance: The metric helps you monitor the speed at which pull requests are reviewed. By minimizing the average review time, you can enhance the chances of identifying and resolving code issues promptly, resulting in higher code quality and overall project success.
Collaboration and Engagement: Prompt review feedback encourages active collaboration among contributors. It helps to maintain a responsive and interactive process. When pull requests receive timely reviews, contributors can address feedback and iterate on their code changes faster.
Project Velocity: Timely code reviews contribute to higher project velocity. The Average Review Time metric provides insights into the responsiveness of the review process, identifying areas for improvement. Minimizing review times helps ensure that code changes are integrated swiftly, allowing projects to deliver new features or updates faster.
A Velocity dashboard in Insights V3 is a visual representation that provides insights into the development team's velocity. Velocity refers to the rate at which the team completes work or delivers features over a specific period. This dashboard tracks and measures the team's productivity and progress.
It typically displays key metrics related to the team's velocity, such as Performance Metrics, Lead Time, Average Review Time, Average Wait Time for First Review, and Code Engagement.
A Velocity dashboard can help project managers and stakeholders understand the team's capacity and performance over time. The dashboard is useful to project managers, leads, development teams, and stakeholders in several ways:
The dashboard helps project managers estimate the team's capacity, progress, and performance over time. It provides insights into the team's historical productivity and supports better resource allocation.
The velocity dashboard allows development teams to assess their own performance and productivity. They can track their progress, identify patterns, and improve their estimation accuracy by comparing planned work with actual velocity.
The velocity dashboard provides stakeholders with visibility into the progress and productivity of the development team. It enables you to track the status of deliverables, understand the team's capacity, and make informed decisions based on real-time data.
Overall, a velocity dashboard is a useful tool for project management, performance evaluation, collaboration, and decision-making, benefiting all stakeholders involved in the open source software development process.
Insights V3 incorporates the Average Wait Time for First Review metrics to provide insights into the duration it takes for pull requests to receive their first review.
The Average Wait Time for First Review refers to the average time it takes for pull requests to receive their first review after being opened. It measures the time span between the creation of a pull request and when it receives its initial feedback or review.
In the vertical bar chart, each bar displays the Average Wait Time for First Review, with the x-axis showing the date and the y-axis showing time in hours.
The Average Wait Time for a selected time period is computed by summing the time to first review across all PRs, dividing by the number of pull requests, and displaying the result in minutes, hours, or days.
Each data point on the chart represents the average wait time for the first review during that specific time period.
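The computation above can be sketched as follows (a minimal illustration; the tuple layout is an assumption, not the Insights V3 data model):

```python
from datetime import datetime

def average_wait_for_first_review_hours(pull_requests):
    """pull_requests: iterable of (opened_at, first_review_at) pairs.
    PRs still awaiting their first review (first_review_at is None) are excluded."""
    waits = [(first - opened).total_seconds() / 3600
             for opened, first in pull_requests if first is not None]
    return sum(waits) / len(waits) if waits else 0.0

prs = [
    (datetime(2024, 3, 1, 9), datetime(2024, 3, 1, 13)),  # 4 h wait
    (datetime(2024, 3, 2, 9), datetime(2024, 3, 2, 21)),  # 12 h wait
    (datetime(2024, 3, 3, 9), None),                      # not yet reviewed
]
print(average_wait_for_first_review_hours(prs))  # 8.0
```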
Code Quality and Bug Resolution: Longer wait times may delay the identification and resolution of code issues or bugs, potentially affecting the overall quality of the software.
Faster Development Cycle: Reducing the wait time for the first review contributes to a faster development cycle. This allows projects to deliver new features, bug fixes, or improvements in a timely manner, increasing the project's overall efficiency.
Collaboration and Iteration: The Average Wait Time for First Review metric directly impacts collaboration and iteration among contributors. Timely feedback on pull requests allows contributors to address issues or make improvements promptly.
In Insights V3, the Code Review Engagement metric assesses the level of involvement and participation in code review activities.
The following factors are considered in the pull request review process:
Number of Pull Request Participants
Pull Requests reviewed
Review comments for Pull Request
Code reviews
Process Improvement: Tracking the Code Review Engagement metric over time allows you to assess the effectiveness of code review processes and identify areas for improvement. Continuous improvement of the code review process leads to higher-quality code and improved productivity.
Quality Assurance: Code review plays a vital role in ensuring code quality and identifying potential issues or bugs. By tracking this metric, managers can identify areas where additional attention or improvement may be needed to maintain high code quality standards.
A Productivity page provides consolidated insights to enhance efficiency and measure productivity in software development.
It offers visual representations of data, such as Commits per Active Day, New Contributors, Drifting Away Contributors, Engagement Gap, Work Time Distribution Impact, and Effort by Pull Request Size, allowing contributors and project managers to monitor progress, identify bottlenecks, and optimize workflows.
By providing real-time information and actionable analytics, the productivity dashboard empowers open source projects to make data-driven decisions, improve collaboration, and deliver high-quality software more effectively.
The Commits per Active Day Dashboard provides insights into code commit frequency on active development days. It measures the average number of code commits contributors make on active development days.
Early Issue Detection: A higher number of commits per active day increases the likelihood of early issue detection. Regular code commits provide more opportunities for contributors to identify potential issues or bugs during the development process.
Code Quality and Stability: A consistent number of commits indicates ongoing code enhancements and maintenance, leading to improved code quality over time.
Productivity Assessment: A higher number of commits per active day suggests that contributors are actively working on code changes, implementing new features, fixing bugs, and making improvements.
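The metric itself reduces to a simple ratio: total commits divided by the number of distinct days that saw at least one commit. A minimal sketch (the input format is an assumption):

```python
from datetime import date

def commits_per_active_day(commit_dates):
    """commit_dates: one date per commit. An active day is any
    calendar day with at least one commit."""
    if not commit_dates:
        return 0.0
    return len(commit_dates) / len(set(commit_dates))

commits = [date(2024, 5, 1), date(2024, 5, 1), date(2024, 5, 1),
           date(2024, 5, 3), date(2024, 5, 3), date(2024, 5, 7)]
print(commits_per_active_day(commits))  # 6 commits / 3 active days = 2.0
```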
The Effort By Pull Request Batch Size metric analyzes the relationship between the size of pull requests (measured by lines of code changed) and the time contributors spend reviewing and merging them.
Here are ways you can interact with the chart to gain deeper insights:
Filter by Date Range: This allows users to analyze the metrics across different periods to observe how the trends have evolved.
Compare Trends: The chart compares the current period with previous periods, enabling you to spot differences or improvements.
The metric shows the distribution of pull requests across different size categories, from "Very Small" to "Gigantic."
This information can help identify potential bottlenecks or areas where the team may need additional support or process improvements.
The metric tracks the average time required for reviewing and merging pull requests in each size category.
Longer review and merge times for larger pull requests may indicate a need for better code organization, more thorough review processes, or additional resources.
The metric includes information on the number of participants and comments associated with each pull request size category.
Higher numbers of participants and comments for larger pull requests suggest increased collaboration and coordination efforts, which can be both positive (better code quality) and negative (potential delays or inefficiencies).
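Assigning a pull request to a size category might look like the following sketch; note that the line-count cutoffs here are hypothetical, since the exact boundaries Insights V3 uses are not stated:

```python
# Hypothetical cutoffs for illustration only; Insights V3's exact
# boundaries for "Very Small" through "Gigantic" are not published here.
CATEGORIES = [("Very Small", 9), ("Small", 49), ("Medium", 99),
              ("Large", 499), ("Gigantic", float("inf"))]

def size_category(lines_changed):
    """Return the first category whose upper bound covers lines_changed."""
    for label, upper in CATEGORIES:
        if lines_changed <= upper:
            return label

print(size_category(4))     # Very Small
print(size_category(250))   # Large
print(size_category(1200))  # Gigantic
```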
Development Cycle Time: The Effort By Pull Request Batch Size metric provides insights into the overall development cycle time. By analyzing the relationship between batch size and effort, you can identify trends that affect the time taken to review and merge pull requests.
Review Efficiency: The Effort By Pull Request Batch Size metric helps project managers evaluate the efficiency of the pull request review process. By analyzing the effort required for different batch sizes, you can identify patterns and trends that impact the speed and quality of reviews.
Q: What does the Effort By Pull Request Batch Size metric measure? A: It measures the relationship between the size of pull requests, in terms of lines of code changed, and the amount of time contributors spend reviewing and merging them.
Q: Why is it important to analyze the Effort By Pull Request Batch Size metric? A: Understanding this metric helps optimize the review process by identifying the most efficient batch sizes for pull requests, thus reducing review time and improving workflow efficiency.
Q: How can I interact with the chart to get more insights? A: You can filter the analysis by date range to observe trends over time and compare these trends with previous periods to identify improvements or regressions in efficiency.
Q: Can analyzing this metric reveal trends over specific periods? A: Yes, by filtering the data by specific date ranges, it's possible to observe how the trends in pull request batch sizes and review times have evolved, helping teams adapt their strategies accordingly.
The Reports Dashboard gives you a comprehensive view of the project's performance through four primary metrics. The dashboard utilizes intuitive data visualizations such as charts, graphs, and tables. These visual representations make it easier to interpret complex data and identify patterns.
The Reports Dashboard enables you to generate comprehensive reports based on the selected metrics and filters. These reports can be exported in various formats, such as PDF or CSV, making sharing the insights with team members or external stakeholders convenient.
At the core of the Reports Dashboard are the following four major metrics:
The Contributors' reports provide insights into the individuals who have contributed to the project. This dashboard allows for the selection of data sources and provides specialized insights.
Select the project from the landing page or from the foundation page.
On the left navigation pane, click Reports > Contributors.
In the top-right corner, you will find the date filter option.
Click on the date filter to open a calendar.
Select the desired start and end dates for the data you want to analyze.
Click Apply to update the dashboard with the selected date range.
For more information, see Filter Date Range.
Locate the drop-down menu for data source selection.
Choose the desired data source from the available options.
The dashboard will update to display data specific to the selected source.
Currently, both GitHub and Git are selected as the data sources by default.
Locate the Display Only New Contributors toggle button.
Toggle it on to display data only for new contributors within the selected period.
Toggle it off to view data for all contributors.
Explore the Total Contributors chart section. When you toggle the New Contributors button, it displays the total new contributors.
Hover over the data points to view specific counts for that time.
Explore the Active Contributors Today, Active Contributors This Week, or Active Contributors This Month charts. These charts are independent of the date filter and show real-time data.
Click View to expand the list on the right side and see the list of contributors.
The chart shows new contributors when you toggle on the Display Only New Contributors button.
Move to the "Active Contributors vs. Returning Contributors" chart.
Click the drop-down to filter the data.
Understand the comparison between contributors who are active for the first time and those who have returned.
Navigate to the "Leaderboard" section.
You will find a list of the most active contributors.
Review their names and corresponding activity levels.
The Organizations' reports provide insights into the organizations whose members have contributed to the project. The report gives you key metrics that you can use to assess healthy contribution from multiple organizations.
Select the project from the landing page or from the foundation page.
On the left navigation pane, click Reports > Organizations.
Select the repositories from the drop-down menu to analyze the data for the particular repository.
In the top-right corner, you will find the date filter option.
Click on the date filter to open a calendar.
Select the desired start and end dates for the data you want to analyze.
Click Apply to update the dashboard with the selected date range.
Locate the drop-down menu for data source selection. For more information, see Data source platforms.
Locate the Display Only New Organizations toggle button.
Toggle it on to display data only for new organizations within the selected period.
Toggle it off to view data for all organizations.
Explore the Total Organizations chart section. When you toggle the New Organizations button, it displays the total number of new organizations.
Hover over the data points to view specific counts for that time.
Explore the Active Organizations Today, Active Organizations This Week, or Active Organizations This Month charts. These charts are independent of the date filter and show real-time data.
Click View to expand the list on the right side and see the list of organizations.
The chart shows new organizations when you toggle on the Display Only New Organizations button.
Navigate to the Leaderboard section.
You will find a list of the most active organizations.
Review their names and corresponding activity levels.
The Insights release provides two primary dashboard groups: global trends and analytics for all LF projects, and LF project-specific trends and metrics:
Global Trends displays project performance dashboards for all the projects on-boarded by the Linux Foundation.
Select an individual project from the All Projects page or the search bar. You will see Project Analytics and Community Analytics dashboards that provide project-specific analytical data.
The Organization Contribution Index and the Project Trends dashboards are accessible without signing in. However, you must sign in to the portal to view all other global and project-specific dashboards.
The following Global dashboards are applicable to all the projects onboarded to LFX Insights:
The following project-specific dashboards are displayed after you select a project from the All Projects list:
Project Analytics: Trends (project specific), technical contributors, code velocity, and code base.
Community Analytics: Navigate to Community Analytics > People to view the contributor management dashboard. Community managers can use this dashboard to view and make necessary changes to the profiles and identities of technical contributors within their community.
To onboard a data source into Community Management (CM), you need CM Manager access for managing the onboarding process, along with admin-level permissions for the data source you're integrating.
Once you successfully connect the data source, it will take 24-48 hours to fully onboard a project on Insights V3 depending on the volume of data.
We maintain separate identities for GitHub and Git to distinguish between the two platforms.
GitHub Identity: GitHub identity encompasses a user's GitHub profile, including their name, GitHub ID, logo, and public details like company information. It is used for authentication and accessing GitHub-specific data like repositories, issues, and pull requests.
Git Identity: Git identity is more basic, typically consisting of just the name and email address as they appear in the Git log. It is used for general Git operations such as cloning repositories, pushing changes, and managing branches.
In Insights V3, we opted to display usernames instead of real names for these reasons:
Privacy Protection: Safeguarding user identity by allowing anonymity.
Enhanced Security: Minimizing personal information exposure to reduce security risks.
Platform Consistency: Aligning with common digital norms for user identification.
This approach reflects our dedication to ensuring a secure, private, and user-centric experience.
It takes up to 48 hours.
Follow these steps to troubleshoot the issue:
Check Data Source:
Verify that the data source is correctly configured and is actively sending data to Insights V3.
Refresh Data:
Try manually refreshing the data within the application to ensure you have the most up-to-date information.
Check Connectivity:
Ensure that there are no connectivity issues preventing data from being retrieved. Check network connections and any relevant settings.
Contact Support:
If the issue persists, contact our support team for further assistance.
Our testing involves both automated and manual strategies. We use a QA automation system to automatically check the data from most sources, such as GitHub. When we need to examine data closely, particularly from Groups.io, we test manually to ensure the data is accurate and complete.
This data is not shown in order to maintain privacy and security; however, it is known which organizations are contributing.
You do not need to sign in to view the dashboard. You can search for a specific project and view the contribution details of that project.
This is the landing page for Insights. Sign-in is not required to view details.
Onboarding a project in Insights V3 starts with integrating a data source in Community Management (CM).
For example, GitHub Onboarding requires admin access to your GitHub organization because the onboarding process involves installing the CM GitHub application, which is essential for CM to collect data from GitHub. To learn more, see Integration.
Data sources are the collaboration tools or remote servers used to drive the development of a project. LFX Insights accesses these data sources, collects data for a project, and segregates the data into different sections, such as source control for code-related data, issue management for issue statuses, documentation for Confluence and wiki pages, CI/CD for Jenkins, and so on.
Currently Supported
Coming Soon
Insights 2.0 currently supports Git and GitHub sources for tracking and visualizing a project's source code analytics and issues.
The Linux Foundation is working towards supporting other data sources, such as Jira, Confluence, Bugzilla, Slack, GitLab, LinkedIn, Docker Hub, and many more.
Git and GitHub. The LFX Insights June 2022 release delivers support only for Git and GitHub (also called native connectors).
Organization OSS Index and Trends, under Global, are displayed publicly, which means you do not need to sign in to view these dashboards. However, to view all other global and project-specific dashboards, you must sign in to LFX Insights.
The following dashboards are newly added in Insights 2.0:
Organization OSS Index, under Global, provides analytics of organization performance, such as how many total commits are made by contributors from different organizations, how active or inactive an organization is in contributing code to the projects, and other details.
This is an enhancement of the previously existing Trends dashboard. This release supports only Technical Metrics, and displays aggregated performance data of all projects onboarded to Insights.
Event Analytics dashboard, under Global, provides analytics related to various LFX events. The analysis includes how many individuals registered for the events, how many individuals attended the events, how many individuals attended as speakers, how many organizations (individuals from different affiliated organizations) participated in the events, and so on.
Webinar Analytics dashboard, under Global, provides analytics of webinars conducted by the Linux Foundation. This includes data such as how many individuals registered for the webinars versus how many actually attended, attendees by geography, the most popular webinars, and so on.
Training and Certification Analytics dashboard, under Global, provides analytics of all the training and certification programs conducted by the Linux Foundation. It provides an insight into how popular the programs are and how much of an impact these programs have in the open source community.
Membership Analytics dashboard, under Global, provides an in-depth analysis of organizations' growth, new organizations joining the Linux Foundation as members, projects these organizations are contributing code to, and many more details.
Project Analytics provides analytics of an individual project or project group. This release supports the following dashboards:
Technical Contributors: Displays analytics of contributor strength and acquisition, unaffiliated contributors, participating organizations, and many more.
Code Velocity: Displays data related to commit analysis, PR pipeline, issue request pipeline, build, and release pipelines.
Code Base: Displays analytics of all the Git repositories.
The brand new Community Management tool provides comprehensive analytics of the project community and lets community managers manage their project communities.
The September 2022 release provides support for the technical contributor management dashboard under Community Management. This contributor management dashboard enables community managers to:
View profile details (affiliations and identities) and contribution details of the contributors within the community. Contribution details include code-related activity, such as PRs, commits, and GitHub issues created, submitted, reviewed, approved, or merged.
Make necessary changes, such as adding or deleting affiliations and identities of contributors.
New Navigation menu: Effective changes to the navigation menu to clearly reflect the analytics under each category. For example, a new menu option called Event Analytics displays data very specific to the events organized by the Linux Foundation during a selected time range.
Project Card View: You can still see the old project card view by clicking the All Projects tab or searching for the project using the Search Projects field.
Enhancement of project and platform landing pages: The platform (Insights 2.0) and individual project landing pages show the Trends dashboard when you either open Insights 2.0 or select an individual project from the project list. This makes it easier to view aggregated metrics data for all projects or individual projects.
Release Date | Connectors support
---|---
31 October 2022 | Gerrit, Jira, and Confluence
30 November 2022 | Slack, Groups.io, Google Groups, and Pipermail
16 December 2022 | Bugzilla, Jenkins, CircleCI, Docker Hub registry
30 December 2022 | Earned Media, Twitter
31 January 2023 | GitLab
This page lists some common terms (also known as jargon) used in this document, along with definitions to help you understand them better.
A contributor is an individual who has performed any code-related activity during a given time period. Code-related activity includes any activity associated with a commit, issue, pull request, or changeset, as well as documentation work.
Contributors are identified as unique based on their LF SSO accounts (also called LF IDs) or the source identities they contribute with. For example, if a contributor named Jon Snow has claimed 3 different identities, one each from GitHub, Gerrit, and Jira, they are counted as a single contributor when computing contributors for contributions coming from those sources and those 3 identities.
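The de-duplication rule can be sketched as follows (a simplified model for illustration; the actual Insights V3 identity resolution is more involved, and the identity-string format is an assumption):

```python
def unique_contributors(activity, identity_map):
    """activity: source identities observed in contributions,
       e.g. "github:jon-snow".
    identity_map: source identity -> LF ID that claimed it.
    Claimed identities collapse into one contributor per LF ID;
    each unclaimed identity counts as its own contributor."""
    seen = set()
    for ident in activity:
        seen.add(identity_map.get(ident, ident))
    return len(seen)

# Jon Snow claimed three identities under one LF ID; "git:arya" is unclaimed.
identity_map = {"github:jon-snow": "jsnow", "gerrit:jsnow": "jsnow",
                "jira:jon.snow": "jsnow"}
activity = ["github:jon-snow", "gerrit:jsnow", "jira:jon.snow", "git:arya"]
print(unique_contributors(activity, identity_map))  # 2
```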
An active contributor is someone who is actively contributing to the code activities during a given time period.
A contributor who has been active in the last year but has not performed any coding activity in the 6 months preceding any point within the given time period is called a drifting away contributor.
A contributor who contributed for the first time during the given time period.
A corporate contributor (also called an affiliated contributor) is an individual who is contributing code on behalf of an organization, i.e., who is affiliated with an organization other than Individual-No Account for the time period when the contribution was made. If a contributor has multiple organization affiliations, each organization will qualify as a contributing organization.
A contributor who has either chosen 'Individual-No Account' as their affiliation to the project (from the Individual Dashboard), or has been assigned the 'Individual-No Account' affiliation for the project by a Community Manager or LF staff (Community Management), is called an Independent Contributor.
A contributor whose organization affiliation to the project has either expired or currently defaults to 'Individual-No Account' (that is, the individual has not provided any affiliation) is termed an Unaffiliated Contributor.
An organization that is affiliated with an active contributor is called an Active Organization. During the selected time period, if a contributor has multiple affiliations, each organization will be treated as a contributing organization.
Members are organizations that join the Linux Foundation or any project of the Linux Foundation, such as Hyperledger, LFX Networking, CNCF, and so on. These organizations become LF members based on the membership tiers chosen by the organizations at the time of joining the Linux Foundation.
A dashboard is a digital tool that displays data and metrics in a visual format to provide a quick overview of key information. Dashboards are commonly used in business settings to monitor performance, track progress toward goals, and identify trends. They typically include charts, graphs, tables, and other visualizations that make it easy to understand complex data. Dashboards can be customized to show different types of information depending on the user's needs, and often include features like filters, alerts, and drill-down capabilities. By providing a clear and concise view of critical data, dashboards help users make informed decisions and take action based on real-time information.
The Pull Requests Metric measures and analyzes the three key activities related to pull requests:
Pull requests opened
Pull requests closed
Pull requests merged
Pull requests are a mechanism for proposing changes to a codebase, allowing developers to collaborate, review, and merge code changes into the project.
Analyzing the high-level tile (1) representing unique pull requests (opened, closed, and merged) provides valuable insights into the health of the codebase.
The detailed chart displays data related to pull requests opened, closed-unmerged, closed-merged, and the total cumulative pull requests over the selected time period. On the left side, the chart shows a trend summary (4).
Collaboration and Code Review: It provides insights into the active participation of developers and the effectiveness of the code review process. If many pull requests are opened but not acted upon, closed, or merged, you can complement this data with other pull request metrics, such as Time to First Review and Pull Request Cycle Time, to find the cause.
Community Engagement: A higher number of pull requests indicates an engaged community that actively contributes to the project.
Quality and Maintenance: By analyzing the number of pull requests opened, closed, and merged, you can assess the health of the codebase, identify areas that need attention, and ensure timely reviews and merging of contributions.
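Tallying the three pull request series from raw per-PR states might look like this sketch (the state labels follow the chart's closed-merged/closed-unmerged terminology; the input format is an assumption):

```python
from collections import Counter

def pull_request_tallies(states):
    """states: one state string per pull request in the period.
    Merged PRs are a subset of closed PRs, mirroring the chart series."""
    counts = Counter(states)
    return {
        "opened": sum(counts.values()),  # every PR in the period was opened
        "closed": counts["closed-merged"] + counts["closed-unmerged"],
        "merged": counts["closed-merged"],
    }

prs = ["open", "closed-merged", "closed-merged", "closed-unmerged", "open"]
print(pull_request_tallies(prs))  # {'opened': 5, 'closed': 3, 'merged': 2}
```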
The Star Metric measures and analyzes the number of stars a project receives on a code hosting platform like GitHub.
The metric gives you a real-time analysis of a project's popularity, community engagement, and overall visibility.
Stars represent a way for you to bookmark or indicate your interest in and appreciation for a particular project. Each star serves as a measure of the project's popularity.
To analyze the Star Metric, the analytics tool employs a line chart on its dashboard. The line connecting the data points on the chart showcases the trend and changes in the number of stars over time.
When you hover over a specific point on the line chart, detailed information about the number of stars for that particular month within the selected period is displayed.
The metric helps you analyze your project's popularity. A higher number of stars generally suggests a widely recognized and appreciated project, potentially attracting more contributors.
Date: April 08, 2024
Insights V3 is an open-source analytical tool that provides insights from analyzing open source software (OSS) projects.
Insights V3 helps project leads and technical managers understand their team members' engagement and participation in open source projects, and identify the most active and productive contributors.
No new features have been added in this release.
Backend Performance: Strengthened backend performance for Organization Dependency, Contributor Dependency, and Leaderboard tables on the Overview page by integrating DBT Platinum models, significantly improving data processing times.
UI Enhancements: Initiated a clean-up of the plugin's interface to streamline user experience.
Project Metrics: Enhanced the efficiency of fetching key project metrics data, ensuring more accurate and timely insights.
UI Enhancements
Active Days Chart Width and Sizing Adjustments: Modified to align with the sizing of other bar charts for visual consistency.
New Sidebar Navigation: Implemented an enhanced sidebar navigation system for improved user experience.
Progress Bar Styling Updates: Revised progress bar designs to match our latest stylistic preferences.
UI Card Alignment: Updated user interface cards to adhere to the new LFX Style Guide, ensuring a cohesive look and feel.
Resolved issues with exporting PNG files in the Geographical Distribution section, mailing list components, and velocity charts.
Fixed pagination bugs in the Organization and Contributor Dependency charts.
Resolved issues where the Active Contributors chart wasn't loading in the Reports Contributors section.
Fixed an error in the Best Practices flyout sidebar.
Some projects are still onboarding, so their data may be incomplete or inaccurate.
The date range filter in the dashboard module may not always display the correct data when selecting custom date ranges.
You may encounter occasional inconsistencies in data synchronization between Insights V3 and external data sources, leading to discrepancies in reporting.
This chart ranks public mailing lists based on their overall activity, considering total messages, unique authors, and contributions from different organizations. It highlights the most active and engaged mailing lists within the project for the selected period.
The Leaderboard provides a snapshot of the most vibrant mailing lists within your project for a selected time period. Here's how to interpret the information:
Ranking: Indicates the position of each mailing list based on activity levels, with #1 being the most active.
Name: The name of the mailing list.
Threads: The count of discussion threads initiated in the mailing list.
Messages: Total number of messages posted in all threads.
Subscribers: The number of individuals subscribed to receive updates from the mailing list.
Contributors: Unique individuals who have posted at least one message to the mailing list.
Organization: The entities (like companies or institutions) that contributors are affiliated with.
Each category also displays a change (+/-) compared to the previous period, helping you see trends like growth or reduction in activity.
The Work Time Distribution Impact Dashboard analyzes how contributors allocate their work time and the impact of different activities on project progress.
The chart shows commit trends and looks for patterns indicating that long hours, non-business hours, or weekend work have contributed to burnout. Burnout can be thought of as a sustained drop in commits following a period of heightened activity.
The purpose of the chart is to find out if there is a risk of burnout among contributors due to long hours.
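The burnout pattern described above, heightened activity followed by a sustained drop, can be sketched as a simple heuristic. This is an illustration only, not the product's actual algorithm; the window and threshold values are assumptions:

```python
def burnout_risk(weekly_commits, window=4, drop_ratio=0.25):
    """Illustrative heuristic (not Insights' exact algorithm): flag
    burnout risk when the average commits over the most recent `window`
    weeks fall below `drop_ratio` of the average over the preceding
    `window` weeks of heightened activity."""
    if len(weekly_commits) < 2 * window:
        return False  # not enough history to compare two windows
    recent = sum(weekly_commits[-window:]) / window
    before = sum(weekly_commits[-2 * window:-window]) / window
    return before > 0 and recent < drop_ratio * before

# Heightened activity (12, 15, 14, 13) followed by a sharp drop (2, 1, 0, 1)
print(burnout_risk([12, 15, 14, 13, 2, 1, 0, 1]))  # True
```

A steady commit cadence, by contrast, would not trip the threshold.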
Workload Distribution: Monitoring the Work Time Distribution Impact helps identify potential workload imbalances among contributors. If one or a few contributors are consistently spending a disproportionate amount of time on specific activities, it can lead to burnout or reduced productivity.
Performance Evaluation: The Work Time Distribution Impact metric can contribute to performance evaluation and feedback processes. By analyzing how contributors allocate their work time, project managers can identify patterns of efficiency or areas that require improvement.
The New Contributors Dashboard analyzes the participation of new contributors in the project. It provides a leaderboard of the number of new contributors and ranks them based on their contributions over a selected period.
Sustainability and Succession Planning: The continuous involvement of new contributors ensures the long-term sustainability of open-source projects. As existing contributors may move on or take up different responsibilities, new contributors play a vital role in filling those gaps.
Fresh Perspectives and Ideas: New contributors bring fresh perspectives, ideas, and diverse skill sets to the project. They may offer innovative solutions, identify areas for improvement, and contribute to the overall project evolution.
The Drifting Away Contributors metric focuses on identifying contributors who were once active in an open-source project but have gradually become less engaged over time.
This chart is not affected by time-filter changes; the data is always calculated relative to today.
Drifting Away Contributors are contributors who:
Made at least 5 code contributions to the project over its lifetime.
Made at least one of those contributions in the last 6 months.
Made no contributions in the last 3 months.
The Drifting Away Contributors metric is essential for maintaining a healthy and active contributor community. By identifying contributors who are gradually becoming less engaged, project managers can take proactive measures to understand their reasons for disengagement and find ways to re-engage them.
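The three criteria above can be expressed directly in code. This is a sketch of the stated rules; the data shape (a list of contribution dates per contributor) and the day counts used for "6 months" and "3 months" are assumptions:

```python
from datetime import date, timedelta

def is_drifting_away(contribution_dates, today=None):
    """Sketch of the stated criteria: at least 5 lifetime code
    contributions, at least one within the last 6 months, and
    none within the last 3 months."""
    today = today or date.today()
    six_months_ago = today - timedelta(days=182)    # approx. 6 months
    three_months_ago = today - timedelta(days=91)   # approx. 3 months
    if len(contribution_dates) < 5:
        return False
    recent_6m = [d for d in contribution_dates if d >= six_months_ago]
    recent_3m = [d for d in contribution_dates if d >= three_months_ago]
    return len(recent_6m) >= 1 and len(recent_3m) == 0

# Active until February 2024, then silent: drifting away as of June 2024
history = [date(2023, 1, 10), date(2023, 5, 2), date(2023, 9, 15),
           date(2024, 1, 20), date(2024, 2, 5)]
print(is_drifting_away(history, today=date(2024, 6, 1)))  # True
```

A contributor who is still committing, or who never reached 5 contributions, would not be flagged.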
The Engagement Gap metric measures the difference between expected and actual levels of contributor engagement. The dashboard shows the gap between the contributor who comments the most on pull requests and the contributor who comments the least.
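One plausible reading of this metric (the exact formula is not documented here, so treat this as an assumption) is the spread between the most and least active PR commenters, expressed as a fraction of the maximum:

```python
def engagement_gap(pr_comment_counts):
    """Hypothetical formula: the spread between the most and least
    active PR commenters, normalized by the maximum. A value near 1
    means engagement is concentrated in a few contributors."""
    most, least = max(pr_comment_counts), min(pr_comment_counts)
    return (most - least) / most if most else 0.0

# Four contributors' PR comment counts: one dominates the conversation
print(engagement_gap([40, 25, 10, 8]))  # 0.8
```

A uniform distribution of comments would give a gap of 0, indicating evenly spread engagement.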
Performance Assessment: The Engagement Gap metric enables you to assess the project's overall engagement level by comparing it to the expected or desired level. It provides a quantitative measure of how actively contributors are participating and helps identify any gaps between the expected and actual engagement levels.
Community Health: The Engagement Gap metric provides valuable insights into the health and dynamics of the project community. Large engagement gaps may indicate potential challenges, such as communication issues, a lack of mentorship, or unclear contribution guidelines.
How to Reduce the Engagement Gap?
Set Clear Expectations: Clearly define the expected engagement levels for each project to provide a benchmark for comparison.
Encourage Collaboration: Foster a culture of collaboration within your team by encouraging open communication and sharing of ideas.
Provide Feedback: Regularly review the Engagement Gap metrics with your team and provide feedback on areas that need improvement.
Recognize Achievements: Acknowledge and reward team members who actively contribute to reducing the Engagement Gap, motivating others to follow suit.
Select the project from the landing page or the foundation page.
From the main navigation, select Reports, and click Retention Dashboard.
On the top-right corner, locate the date filter option.
Click on it to open.
Choose your desired duration to focus on specific data.
Click Apply to update the dashboard with your selected date range.
Find the Platforms drop-down menu.
Choose a data source from the available options (e.g., GitHub or Git).
The dashboard updates to display insights relevant to your selected platform.
Locate the Cohort Size drop-down menu.
Choose either Weekly or Monthly to define the cohort size for analysis.
The dashboard adjusts to display data based on your selected cohort size.
The retention rate in open-source projects measures the percentage of contributors who continue to be actively engaged in the project over a specified period.
For a given time period, the retention rate is calculated by dividing the number of contributors who were active in both the previous and current periods by the total number of contributors who were active in the previous period.
Explore the Retention Rate Chart section.
Observe the chart that visualizes the retention rate of contributors over time.
This chart shows the percentage of contributors who continue to engage with your project over the defined cohort period.
This metric provides insights into how consistently contributors are involved in your open source project over a specific period.
Review the Average Number of Contributor Activities metric.
Gain insights into the average level of contributor activity within the defined cohort.
This chart helps you understand the retention of contributors over time. It provides insights into how many new contributors joined in a specific month and how many of them remained actively engaged in subsequent months.
Select the project from the landing page or from the foundation page.
From the main navigation, select Reports and then click Activities Dashboard.
In the top-right corner, you will find the date filter option.
Select the desired start and end dates for the data you want to analyze.
Click Apply to update the dashboard with the selected date range.
Find the Platforms drop-down menu.
Choose a data source from the available options (e.g., GitHub or Git).
It shows the total number of activities performed by contributors for the project. Hover over the data points to view specific activity counts for each date.
Explore the Activities Today, Activities This Week, and Activities Organization This Month charts. These charts are independent of the date filter and show real-time data.
Click View to expand the list on the right side and see the list of activities.
Download the list in CSV format for analysis.
Move to the Activities chart section.
Use the drop-down to select Daily, Weekly, or Monthly data.
Gain insights into the growth of activities over time.
Navigate to the Activities by Platform chart.
You will see the overall percentage distribution across the different platforms.
Click the > icon to see the detailed distribution of activities across the different platforms.
Explore the Leaderboard section for activities by type.
Review the types of activities (e.g., code commits, issues) and their corresponding counts.
Access LFX Insights
To use the new Insights user interface, follow these steps:
Once you are logged in, you will see Project Trends as the default landing page.
In the left-hand navigation, click a category to expand it, and then click the dashboard you want to open to see its analytics.
Visit the login page and log in if you have an existing LF account. For the next several weeks, you can still view your project in the former version of Insights and toggle between versions as you get your project teams engaged in the new version.
On the dashboard, click the Filters CTA to see the projects for the selected time period. For more information, see .
To create an account, see .
The new release of Insights is now live with a refreshed user interface and new dashboards that make it easier to navigate and find the information that matters most.
In the September 2022 release, Insights 2.0 supports Git and GitHub. Support for all other dashboards will be released in batches very soon.
With this new release, there’s even more data and insights into the health of your project.
All the code-related technical data, such as commit analysis, technical contributor analysis, and repository analysis, is displayed on the Trends and Project Analytics dashboards through interactive visualization reports.
Under Community Analytics and other service-specific dashboards, analytics of business-specific services, including Training and Certification, LFX Events, Webinars, LFX members, and many more, are shown.
A brand new Community Management application that provides complete visibility into community members' activities, affiliations, and identities (both individuals and organizations), and empowers community administrators to manage individuals and recognize their contributions.
The following are the key highlights of the new Insights 2.0 UI:
Do you want to know which organizations contribute most to our open-source projects? Check out our new Organization Contribution Index that displays technical contributions.
Are your contributors getting the most from their organization's LF Benefits? Does participation in your project require certifications? The Trainings & Certification Analytics dashboards give you a pulse on the Training and Certification landscape within the Linux Foundation. LFX Insights v2 global training analytics will allow us to understand the top training programs and more.
We define project health across the following categories:
Organization Contribution Index
Project Trends
Event Analytics
Webinar Analytics
Training & Certification
Insights 2.0 is a significantly improved version of Insights 1.0.
Insights 2.0 supports:
Data lake architecture that provides a highly scalable and performance-oriented platform. Data from different data sources can be brought into the data lake layer through various native and custom connectors. Project administrators can use the Project Control Center (PCC) to configure various data sources from which project-related data is brought into the data lake layer.
A new enhanced version of the Insights UI provides better visualization reports to analyze the data.
A brand new Community Management application that empowers community managers to better manage their project communities, for example, by adding and managing affiliations and identities of technical contributors within their communities.
Insights 1.0 uses Elasticsearch as the data store and embedded Kibana dashboards as the visualization layer, which restricts the scaling and configuration of data in certain areas. The new data lake architecture in Insights 2.0 solves this problem and provides massive improvements to the scaling and performance of the tool.
Yes. However, the primary URL will remain unchanged and will automatically point to Insights 2.0.
Our goal is to make Insights 2.0 available to all the LF projects so that every Open Source Software (OSS) community gets the same experience and makes full use of the features and analytics provided by Insights 2.0.
To scale the platform across hundreds of existing projects and support future growth, we had to completely re-architect our connector ecosystem. In upcoming releases, LFX will provide support for other native connectors, such as Jira, Confluence, Slack, Groups.io, Google Groups, Pipermail, Earned Media, Twitter, Docker Hub, and many more.
No. The LFX team will migrate the projects from Insights 1.0 to Insights 2.0, and you will be able to start using the new Insights 2.0 dashboards from day 1 of the product release.
No. Insights 1.0 (https://insights.lfx.dev) will remain up and running and will continue to regularly sync data for the data sources (or connectors) that are, for the time being, unavailable in v2. You can continue to use the v1 platform while we completely transition your project data to the v2 platform. Once we have migrated all our projects and data sources to v2, we will decommission the v1 platform.
Unfortunately not. Since the two systems are massively different in terms of APIs and UI, your existing v1 URLs will not work out of the box on the v2 platform. Please reach out to us via our community forum or support desk if you need any assistance in getting the data that you are looking for.