The Overview page's Key Metrics and Detailed Analysis sections include six cumulative charts that highlight significant trends and patterns in your analytics data.
The cumulative charts let you compare different metrics on a single chart. This comparative analysis helps you identify relationships and draw meaningful conclusions.
The six cumulative charts provide quick snapshots of the analytical data, while the detailed analysis charts support deeper analysis.
Within Insights, the "Filter the Date" feature allows you to customize your analytics view based on specific date ranges. This feature provides flexibility and control over the period for which data is displayed.
Follow these steps to utilize the date-filtering feature:
On the right side of the analytics dashboard, locate the "Date Filter" section.
Click on the "Date Filter" section to expand the options.
Choose from the predefined date range options.
Select the desired option by clicking on it.
Select the Bots checkbox to exclude bot activity from the analytics.
To specify a custom date range, click on the Custom option within the date range selection menu.
Select the start and end dates for your custom range on the calendar widget.
The analytics dashboard will automatically update to display data within the selected custom date range.
After selecting a predefined date range or setting a custom date range, click the Apply button to apply the date filter.
The analytics dashboard will refresh to reflect the chosen date range, displaying data only for the selected period.
Click on the Clear Dates button to display data for the entire available range and remove the date filter.
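Conceptually, the date filter narrows the displayed records to those that fall within the chosen range. A minimal sketch of that idea, using hypothetical record data rather than Insights' actual data model:

```python
from datetime import date

# Hypothetical sample records: (activity_date, activity_type)
records = [
    (date(2024, 1, 5), "issues-opened"),
    (date(2024, 2, 10), "issues-closed"),
    (date(2024, 3, 15), "committed-commit"),
]

def filter_by_date(records, start, end):
    """Keep only records whose date falls within [start, end]."""
    return [r for r in records if start <= r[0] <= end]

# A "custom date range" of Jan 1 - Feb 28 keeps the first two records.
filtered = filter_by_date(records, date(2024, 1, 1), date(2024, 2, 28))
print(len(filtered))  # 2
```

Clearing the filter is equivalent to passing the full available range, so all records are displayed again.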
The Issue Metric measures the number of issues reported and tracked within a specified period. It compares the number of issues opened, the number of issues closed, and the total issues for the selected period.
The metric is based on the following activity types:
issues-closed
issues-opened
The analytics tool employs a combined chart (a line chart and bar charts) on its dashboard to analyze the Issue Metric. The line on the chart connects the data points, allowing you to observe trends and patterns over time.
The dashboard shows a snapshot of the issues (opened + closed) and a detailed chart (open, closed, and total issues).
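A sketch of how opened and closed issue events could be rolled up into the monthly counts and cumulative totals the chart displays. The event names mirror the activity types listed above; the rollup logic is illustrative, not Insights' actual implementation:

```python
from collections import defaultdict

# Hypothetical issue events: (month, activity_type)
events = [
    ("2024-01", "issues-opened"),
    ("2024-01", "issues-closed"),
    ("2024-02", "issues-opened"),
]

def monthly_totals(events):
    """Count opened/closed issues per month, plus a running cumulative total."""
    per_month = defaultdict(lambda: {"opened": 0, "closed": 0})
    for month, kind in events:
        key = "opened" if kind == "issues-opened" else "closed"
        per_month[month][key] += 1
    cumulative = 0
    result = {}
    for month in sorted(per_month):
        counts = per_month[month]
        cumulative += counts["opened"] + counts["closed"]
        result[month] = {**counts, "total": cumulative}
    return result

print(monthly_totals(events)["2024-02"]["total"])  # 3
```

The per-month bars correspond to the opened/closed counts, and the line corresponds to the cumulative total.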
On the Overview page, select the project and repositories (1) for which you want to see the data.
Select the specific time period using the filter option (2).
The high-level tile (3) shows you the total issues (open + closed) for the selected time range.
The detailed analysis chart shows you the open issues, closed issues, and the cumulative count of total issues for the selected period. On the left side, the chart shows the chart trend summary (4).
Hover over chart (5) to see the open issues, closed issues, and total issues for the selected month.
This interactive download feature (6) enables you to download the chart in CSV and PNG file formats.
Issues Tracking and Management: By visualizing the data on a line chart, it becomes easier to identify the increase or decrease in issue activity, allowing for effective resource allocation and prioritization.
Performance Evaluation: The Issue Metric helps in evaluating the performance of the development team and the project as a whole. Changes in issue count over time indicate improvements in software quality, bug-fixing efficiency, or the impact of development efforts.
Community Engagement: A higher number of reported issues indicates the active participation and involvement of the community in the open source project.
The overview page should provide a high-level summary of the project's activity, contributors, and performance metrics, including:
The number of contributors and their distribution by location or organization.
The total number of commits, pull requests, and issues.
The average time to resolve issues and merge pull requests.
The overall health of the codebase, including code quality and security vulnerabilities.
The level of community engagement, such as the number of comments on pull requests.
The analytics tools on the overview page provide a range of features and visualizations that can help you gain insights into the project's performance, identify areas for improvement, and make informed decisions about development and collaboration.
The primary data sources for Insights V3 are the code repositories and the publicly available GitHub and Git databases. Refer to Integrations to learn more about data connectors.
The Commits metric refers to the analysis of contributor's code commits within a specified timeframe. A code commit represents a unit of change to the software's source code repository.
Each commit includes the following:
committed-commit ("Default Branch" only)
This chart counts commits, not roles. Each commit with a unique commit SHA is counted once, regardless of the contributor's role.
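Counting by unique commit SHA, as described above, can be sketched as follows (the records and field names are hypothetical):

```python
# Hypothetical commit records: (sha, role) -- roles are ignored for this metric
commits = [
    ("a1b2c3", "maintainer"),
    ("a1b2c3", "reviewer"),    # same SHA seen again under a different role
    ("d4e5f6", "contributor"),
]

def count_commits(commits):
    """Each unique commit SHA counts once, regardless of associated roles."""
    return len({sha for sha, _role in commits})

print(count_commits(commits))  # 2
```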
The dashboard shows the commits snapshot and a detailed chart. The detailed chart is a combined chart (line chart and bar chart) that shows new commits vs. total commits.
On the Overview page, select the project and repositories (1) for which you want to see the data.
Select the specific period using the filter option (2).
The high-level tile (3) shows you the total commits for the selected time range.
The detailed analysis chart shows you the New commits and the cumulative count of total commits for the selected period. On the left side, the chart shows the chart trend summary (4).
Hover over the chart (5) to see the new commits and the total commits for the selected month.
This interactive download feature (6) enables you to download the chart in CSV and PNG file formats.
The metric enables project maintainers and stakeholders to gain valuable insights into code changes and progress within a specified period.
It provides insights into the volume and frequency of code changes made by contributors. By visualizing commit data in a bar chart, you can track the progress of development efforts over time.
Changes in commit counts reveal periods of intense development, periods of slower activity, or the impact of specific events or milestones on the project.
The data visualization on the overview page shows real-time data on the total number of contributors and the total number of active contributors across all monitored repositories during the selected time period.
On the Overview page, select the project and repositories (1) for which you want to see the data.
Select the specific time period using the filter option (2).
The high-level tile (3) shows you the total unique contributors (calculated based on their member ID) for the selected time range.
The detailed analysis chart shows you the active contributors and the cumulative count of total contributors for the selected period. On the left side, the chart shows the chart trend summary (4).
Hover over the chart (5) to see the number of active contributors and the total contributors for the selected month.
This interactive download feature (6) enables you to download the chart in CSV and PNG file formats.
When you want to see the health of your open source project, the Contributor Chart is a crucial project performance indicator.
Visualizing the number of contributors over time makes it easier to identify trends, patterns, and overall community interest. These trends help project maintainers and other stakeholders take action based on what the charts show.
Tracking the number of contributors can provide insights into the health and vitality of your project.
By analyzing changes in the contributor count, project managers can gain insights into the effectiveness of their community outreach and development strategies.
The Star Metric measures and analyzes the number of stars a project receives on a code hosting platform like GitHub.
The metric gives you a real-time analysis of the project's popularity, community engagement, and overall visibility.
Stars represent a way for you to bookmark or indicate your interest in and appreciation for a particular project. Each star serves as a measure of the project's popularity.
To analyze the Star Metric, the analytics tool employs a line chart on its dashboard. The line connecting the data points on the chart showcases the trend and changes in the number of stars over time.
When you hover over a specific point on the line chart, detailed information about the number of stars for that particular month within the selected period is displayed.
The metric helps you analyze your project's popularity. A higher number of stars generally suggests a widely recognized and appreciated project, potentially attracting more contributors.
A Contribution Leadership board visualization displays the contributions made by individual contributors to an open source project. It ranks contributors based on the number of code commits, pull requests, issues closed, or other metrics and visually represents their relative activity levels and impact on the project.
This chart displays individual identities, not merged contributors as in the Community Management tool. Even if certain identities are combined into one contributor in CM, they still appear as separate entities in the Insights V3 leaderboard charts. This distinction exists because merging contributors combines the contributions of a single individual working under different accounts or identities. The separation of identities in Insights is maintained for privacy reasons and GDPR compliance.
Recognition and Motivation: The Contributor Leaderboard recognizes and acknowledges the efforts of individual contributors. It highlights their contributions, encourages ongoing engagement, and motivates contributors to continue their valuable work.
Community Engagement: It creates a sense of community and healthy competition, encouraging collaboration and inspiring others to contribute and improve their ranking on the leaderboard.
Collaboration Opportunities: The leaderboard helps project maintainers and community members identify potential collaborators or subject-matter experts within the project. It will be easier to identify the most active contributors and connect with them.
The Pull Requests Metric measures and analyzes the three key activities related to pull requests:
Pull requests opened
Pull requests closed
Pull requests merged
Pull requests are a mechanism for proposing changes to a codebase, allowing developers to collaborate, review, and merge code changes into the project.
Analyzing the high-level tile (1) representing unique pull requests (opened, closed, and merged) provides valuable insights into the health of the codebase.
The detailed chart displays data related to pull requests opened, closed-unmerged, closed-merged, and the total cumulative pull requests over the selected time period. On the left side, the chart shows the chart trend summary (4).
Collaboration and Code Review: It provides insights into the active participation of developers and the effectiveness of the code review process. If the number of pull requests opened is high, you can complement this data with other pull request metrics, such as Time to First Review and Pull Request Cycle Time, to find out why many pull requests are opened but not acted upon, closed, or merged.
Community Engagement: A higher number of pull requests indicates an engaged community that actively contributes to the project.
Quality and Maintenance: By analyzing the number of pull requests opened, closed, and merged, you can assess the health of the codebase, identify areas that need attention, and ensure timely reviews and merging of contributions.
The Fork Metric measures and analyzes the number of times a project has been forked by other developers.
Forking is the process of creating a copy of a project's source code repository to either modify and enhance the project or use it as a starting point for a new project.
The bar chart on the dashboard represents the analysis, displaying the number of forks over time. Hover over a specific bar to access the detailed fork information for that particular month within the selected period.
The interactive download feature (Icon) enables you to download the chart in CSV and PNG file formats.
How popular is the project? The Fork Metric provides insights into the popularity of your project. A higher number of forks generally indicates that developers find your project useful and valuable enough to build on it or adapt it to their specific needs.
Code Reuse: By analyzing the Fork Metric, you can get data on code reuse and identify potential opportunities for improvement.
Community Engagement: A growing number of forks indicates an active and involved community, contributing to the project's growth.
Project Evolution: By monitoring forks over time, you can identify significant milestones.
ID: analytics
Project websites provide some web analytics.
This check passes if:
A Google Analytics 3 (Universal Analytics) Tracking ID is found in the source of the website configured in GitHub. Regexps used:
A Google Analytics 4 Measurement ID is found in the source of the website configured in GitHub. Regexps used:
The HubSpot tracking code is found in the source of the website configured in GitHub. Regexps used:
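The website checks above boil down to running regular expressions against the page source. A rough illustration follows; note these patterns are simplified stand-ins, not the exact regexps the check uses:

```python
import re

# Illustrative patterns only -- the check's actual regexps may differ.
GA3_RE = re.compile(r"UA-\d{4,10}-\d{1,4}")  # Universal Analytics tracking ID
GA4_RE = re.compile(r"G-[A-Z0-9]{8,12}")     # GA4 measurement ID

def detect_analytics(page_source):
    """Return which analytics providers appear in a page's HTML source."""
    found = []
    if GA3_RE.search(page_source):
        found.append("GA3")
    if GA4_RE.search(page_source):
        found.append("GA4")
    return found

html = '<script>gtag("config", "G-ABCD1234EF");</script>'
print(detect_analytics(html))  # ['GA4']
```

The check passes if any of the configured patterns matches the fetched page source.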
ID: artifacthub_badge
Projects can list their content on Artifact Hub to improve their discoverability.
This check passes if:
An Artifact Hub badge is found in the repository’s README file. Regexps used:
ID: cla
The CLA defines the conditions under which intellectual property is contributed to a business or project.
This check passes if:
A CLA check is found in the latest merged PR on GitHub. Regexps used:
This check will be automatically marked as exempt if the DCO check passes but this one does not.
ID: community_meeting
Community meetings are often held to engage community members, hear more voices, and get more viewpoints.
This check passes if:
A reference to the community meeting is found in the repository’s README file. Regexps used:
ID: dco
Mechanism for contributors to certify that they wrote or have the right to submit the code they are contributing.
This check passes if:
The last commits in the repository have the DCO signature (Signed-off-by). Merge pull request and merge branch commits are ignored for this check.
A DCO check is found in the latest merged PR on GitHub. Regexps used:
This check will be automatically marked as exempt if the CLA check passes, but this one does not.
ID: github_discussions
Projects should enable GitHub discussions in their repositories.
This check passes if:
A discussion that is less than one year old is found on GitHub.
ID: openssf_badge
The Open Source Security Foundation (OpenSSF) Best Practices badge is a way for Free/Libre and Open Source Software (FLOSS) projects to show that they follow best practices.
This check passes if:
An OpenSSF (CII) badge is found in the repository’s README file. Regexps used:
ID: openssf_scorecard_badge
This check passes if:
An OpenSSF Scorecard badge is found in the repository’s README file. Regexps used:
ID: recent_release
The project should have released at least one version in the last year.
This check passes if:
A release that is less than one year old is found on GitHub.
ID: slack_presence
Projects should have presence in the CNCF Slack or Kubernetes Slack.
This check passes if:
A reference to the CNCF Slack or Kubernetes Slack is found in the repository’s README file. Regexps used:
On a regular basis, a number of checks are performed on each repository listed in the database.
Checks are grouped into check sets. One or more check sets can be applied to a repository, and each check set determines the checks that will be performed on that repository.
The check’s file must declare the following information:
ID: check identifier.
WEIGHT: weight of this check, used to calculate scores.
CHECK_SETS: check sets this new check belongs to.
Scorecard assesses open source projects for security risks through a series of automated checks. For more information about the Scorecard badge please see .
ID: adopters
List of organizations using this project in production or at various stages of testing.
This check passes if:
An adopters file is found in the repository. Globs used:
An adopters reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
ID: changelog
A curated, chronologically ordered list of notable changes for each version.
This check passes if:
A changelog file is found in the repository. Globs used:
A changelog reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
A changelog reference is found in the last GitHub release content body. Regexps used:
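The file-matching side of checks like this one generally tests repository paths against a set of globs. A minimal sketch, with illustrative globs that may differ from the actual patterns used:

```python
import fnmatch

# Illustrative globs only -- the check's actual patterns may differ.
CHANGELOG_GLOBS = ["changelog*", "CHANGELOG*", "docs/changelog*"]

def has_changelog(repo_files):
    """Pass if any file path in the repository matches a changelog glob."""
    return any(
        fnmatch.fnmatch(path, glob)
        for path in repo_files
        for glob in CHANGELOG_GLOBS
    )

print(has_changelog(["README.md", "CHANGELOG.md"]))  # True
print(has_changelog(["README.md"]))  # False
```

The README-reference and release-body variants of the check work the same way, but with regexps applied to file contents instead of globs applied to paths.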
ID: code_of_conduct
Adopt a code of conduct to establish community standards, promote an inclusive and welcoming initiative, and outline procedures for handling abuse.
This check passes if:
A code of conduct file is found in the repository. Globs used:
A code of conduct reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
A code of conduct file is found in the default community health files repository.
ID: contributing
A contributing file in your repository provides potential project contributors with a short guide to how they can help with your project.
This check passes if:
A contributing file is found in the repository. Globs used:
A contributing reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
A contributing file is found in the default community health files repository.
ID: governance
Document that explains how the governance and committer process works in the repository.
This check passes if:
A governance file is found in the repository. Globs used:
A governance reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
ID: maintainers
The maintainers file contains a list of the current maintainers of the repository.
This check passes if:
A maintainers file is found in the repository. Globs used:
A maintainers reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
ID: readme
The readme file introduces and explains a project. It contains information that is commonly required to understand what the project is about.
This check passes if:
A readme file is found in the repository. Globs used:
ID: roadmap
Defines a high-level overview of the project’s goals and deliverables ideally presented on a timeline.
This check passes if:
A roadmap file is found in the repository. Globs used:
A roadmap reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
ID: summary_table
The Projects Summary Table is a CNCF Business Value Subcommittee initiative to supplement the CNCF Landscape and include further information about CNCF projects for the wider Cloud Native community.
This check passes if:
At least one of the summary_* fields has been set in the project's extra section in the Landscape YAML file.
ID: website
A URL that users can visit to learn more about your project.
This check passes if:
A website URL is configured in the GitHub repository.
ID: binary_artifacts
This check determines whether the project has generated executable (binary) artifacts in the source repository. For more details, see the check documentation.
ID: code_review
This check determines whether the project requires code review before pull requests (merge requests) are merged. For more details, see the check documentation.
ID: dangerous_workflow
This check determines whether the project’s GitHub Action workflows have dangerous code patterns. For more details, see the check documentation.
ID: dependency_update_tool
This check tries to determine if the project uses a dependency update tool, specifically dependabot or renovatebot. For more details, see the check documentation.
ID: maintained
This check determines whether the project is actively maintained. For more details, see the check documentation.
ID: sbom
List of components in a piece of software, including licenses, versions, etc.
This check passes if:
The latest release on GitHub includes an asset whose name contains sbom. Regexps used:
The repository’s README file contains an SBOM section that explains where the SBOMs are published, the format used, etc. Regexps used to locate the title header:
ID: security_policy
Documented security processes explaining how to report security issues to the project.
This check passes if:
A security policy file is found in the repository. Globs used:
A security policy reference is found in the repository’s README file. This can be in the form of a title header or a link. Regexps used:
A security policy file is found in the default community health files repository.
ID: signed_releases
This check tries to determine if the project cryptographically signs release artifacts. For more details, see the check documentation.
ID: token_permissions
This check determines whether the project’s automated workflows tokens are set to read-only by default. For more details, see the check documentation.
Calculating a global best practice score for an open-source project involves evaluating various aspects of the project against predefined best practices and assigning weights to those aspects based on their importance. Let's walk through a sample example.
Define the following set of best practices that are important for the success and quality of the open-source project. Each category should have a set of criteria that can be evaluated.
Assign weights to each category based on their relative importance. These weights should add up to 100%. The weights reflect how much each category contributes to the overall quality of the project.
Evaluate Each Criterion
For each criterion within a category, evaluate the project and assign a score.
Use a numerical scale (0–10) or any other suitable scale.
Code of conduct: 8
Governance: 9
Maintainer: 8
Website: 7
Analytics: 9
GitHub Discussion: 10
Community meetings: 8
Binary Artifacts: 8
Dangerous Workflow: 9
Approved Licenses: 9
Each category score is calculated as the category's average criterion score multiplied by its weight:
Documentation: ((8+9+8+7)/4) × 0.40 = 3.2
Standards: ((9+10+8)/3) × 0.30 = 2.7
Security: ((8+9)/2) × 0.20 = 1.7
Legal: 9 × 0.10 = 0.9
Calculate Global Score
Sum up the category scores to obtain the global score for the best practice score of the open-source project.
Documentation + Standards + Security + Legal = 3.2 + 2.7 + 1.7 + 0.9 = 8.5
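The worked example above can be reproduced in a few lines. The weights and criterion groupings below are taken directly from the example; they are sample values, not a fixed scoring scheme:

```python
# Category weights from the example; they must sum to 1.0.
weights = {"documentation": 0.40, "standards": 0.30, "security": 0.20, "legal": 0.10}

# Criterion scores on a 0-10 scale, grouped by category as in the example.
scores = {
    "documentation": [8, 9, 8, 7],  # code of conduct, governance, maintainer, website
    "standards": [9, 10, 8],        # analytics, GitHub discussions, community meetings
    "security": [8, 9],             # binary artifacts, dangerous workflow
    "legal": [9],                   # approved licenses
}

def global_score(scores, weights):
    """Weighted sum of each category's average criterion score."""
    return sum(
        (sum(vals) / len(vals)) * weights[cat]
        for cat, vals in scores.items()
    )

print(round(global_score(scores, weights), 2))  # 8.5
```

On a 0-10 scale the project in the example scores 8.5 overall.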
ID: license_approved
Whether the repository uses an approved license or not.
This check passes if:
The license identified matches any of the following:
ID: license_scanning
License scanning software scans and automatically identifies, manages, and addresses open source licensing issues.
This check passes if:
A FOSSA or Snyk link is found in the repository’s README file. Regexps used:
A link pointing to the license scanning results is provided in the .clomonitor.yml metadata file.
ID: Apache_2.0
A permissive license whose main conditions require preserving copyright and license notices. Contributors provide an express grant of patent rights. Licensed works, modifications, and larger works may be distributed under different terms and without source code.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
ID: trademark_disclaimer
Project sites should have the Linux Foundation trademark disclaimer.
This check passes if:
The Linux Foundation trademark disclaimer is found in the content of the website configured in GitHub. Regexps used:
The Contributor Leaderboard on the Confluence Dashboard displays a ranking of users based on their contributions to Confluence activities within a specified date range.
Contributors are ranked based on metrics including new pages, comments, attachments uploaded, and blog posts on the platform.
The leaderboard provides valuable insights into user engagement and productivity within the Confluence environment.
To access the Contributor Leaderboard in Confluence, follow these steps:
On the Overview page, select the project for which you want to see the data.
On the left navigation pane, click Overview > Confluence.
Select the specific date range using the filter option (2).
Scroll down and you will see the Contributors Leaderboard widget.
Use the drop-down menu to filter the leaderboard based on specific Confluence activities such as page edits, comments, attachments, and blog posts.
The leaderboard will dynamically update to display rankings based on the selected date range and activity filter.
The leaderboard serves several purposes:
Enhancing Engagement: Motivates users to participate more actively.
Tracking Productivity: Offers insights into who the most active contributors are.
Identifying Knowledge Leaders: Helps in recognizing contributors who are pivotal in spreading knowledge and expertise across the organization.
By effectively utilizing the Confluence Contributor Leaderboard, organizations can foster a more engaged and productive community, driving the collective success of their projects.
The Gerrit data connector is a tool that connects Gerrit, a web-based code review system for Git repositories, with other data sources or systems. This connector enables you to extract, transform, and load data between Gerrit and Insights.
Following are the different Gerrit activity types:
A best practice score visualization is a tool that helps project leads and managers assess the overall health and quality of an open source software project.
It typically evaluates the project against a set of best practices or standards for software development, such as the categories Documentation, Standards, Security, and Legal.
It generates a score or rating based on how well the project meets these criteria.
On the Overview page, select the project and repositories for which you want to see the best practice score.
Select the specific time period using the filter option.
Scroll down to find the best practice score dashboard.
You can see the aggregated score (3) and each category's score on the dashboard.
Click the Download icon to download the dashboard.
Click on any category to see the expanded page where you can see the detailed analysis for each repository.
Click the Create Issue button to create an issue for each repository.
Insights often incorporates a Geographical Distribution metric to provide insights into the locations from which contributions originate.
Geographical Distribution analyzes and visualizes the contributions made by contributors across different regions around the world. It provides a breakdown of the top regions based on the total number of contributors, providing a clear understanding of the project's global engagement and scope.
Hover over the chart to view the number of contributors for each region during the selected period. This information provides a more granular view of contributor activity within specific regions.
Global Impact: Geographical Distribution allows you to assess the global impact of the open source project by providing insights into the regions where contributions are coming from.
Regional Comparison: Compare the contribution numbers across different regions to identify any notable variations. Assess whether certain regions show consistent contribution levels or if there are fluctuations that require further investigation.
Top Contributing Regions: It helps to identify the top five regions based on the total number of contributions. These regions represent areas where the project has significant engagement and impact.
The Organization Leaderboard ranks organizations based on their contributions to the project. The leaderboard provides insights into organizations' collective efforts to drive your projects' success and growth.
It helps you determine if your project has a healthy contribution from multiple organizations and if new organizations are coming to contribute to the project.
This chart displays individual identities, not merged contributors as in the Community Management tool. Even if certain identities are combined into one contributor in CM, they still appear as separate entities in the Insights leaderboard charts. This distinction exists because merging contributors combines the contributions of a single individual working under different accounts or identities. The separation of identities in Insights is maintained for privacy reasons and GDPR compliance.
Each activity is considered a separate contribution. For example, opening a PR and closing a PR are counted as two distinct contributions.
Recognition: The Organization Leaderboard recognizes and showcases the contributions made by various organizations.
Project Sustainability: The Organization Leaderboard evaluates the involvement of organizations and assesses the project's long-term sustainability and growth potential.
Trust and Credibility: When organizations are actively engaged in your projects and their contributions are recognized through the leaderboard, it enhances the overall trust and credibility of the project.
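The counting rule described above, where each activity is a separate contribution, can be sketched as follows using hypothetical organizations and activity types:

```python
from collections import Counter

# Hypothetical activity stream: (organization, activity_type).
# Each activity is a separate contribution -- opening and then closing
# a PR counts as two contributions for the same organization.
activities = [
    ("Acme Corp", "pr-opened"),
    ("Acme Corp", "pr-closed"),
    ("Initech", "issues-opened"),
]

def organization_leaderboard(activities):
    """Rank organizations by total contribution count, highest first."""
    counts = Counter(org for org, _ in activities)
    return counts.most_common()

print(organization_leaderboard(activities))  # [('Acme Corp', 2), ('Initech', 1)]
```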
The metric provides insights into the technical contribution breakdown across weekdays and weekends. It shows when most contributions happen, so you can plan for maximum participation in the project. Each day is categorized by its level of activity, indicating low to high contribution levels.
Commits are recorded in the individual contributors' local time zone.
Only commit data is used for this dashboard. Each Commit is counted only once in this metric.
Commits
authored commit
Activity Level Assessment: Work Time Distribution allows you to assess the technical activities across different days of the week. By analyzing the chart, project managers can identify contribution patterns and trends, such as peak activity days or days with lower participation.
Productivity Monitoring: Work Time Distribution helps you to monitor contributors' productivity and engagement. By analyzing the breakdown of contributions, you can identify periods of high productivity and low engagement.
Work Optimization: By understanding the distribution of contributions across weekdays and weekends, project managers can identify potential collaboration challenges due to varying availability.
Weekday vs. Weekend Contributions: Compare the contribution levels between weekdays and weekends. Assess significant activities differences, and identify any patterns or preferences in contributor engagement during these periods.
Maximum Participation: As an Executive Director or Maintainer, when you want to set up a community call for your project, you view the time when most of the contributions happen so that you can have maximum participation.
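Bucketing commits by weekday in each contributor's local time zone, as this metric does, can be sketched like this (the timestamps are hypothetical and assumed to already be local):

```python
from collections import Counter
from datetime import datetime

# Hypothetical commit timestamps, already in each contributor's local time zone.
commit_times = [
    datetime(2024, 5, 6, 10, 30),  # Monday
    datetime(2024, 5, 6, 14, 0),   # Monday
    datetime(2024, 5, 11, 9, 15),  # Saturday
]

def weekday_distribution(commit_times):
    """Count commits per weekday name; each commit is counted exactly once."""
    return Counter(t.strftime("%A") for t in commit_times)

print(weekday_distribution(commit_times))  # Counter({'Monday': 2, 'Saturday': 1})
```

The same grouping by hour of day (`t.hour`) would yield the time-of-day breakdown used to find peak contribution hours.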
The Organization Leaderboard ranks organizations based on their activity types on Confluence pages for the selected date range. These activity types include new pages, blog posts, attachments, total pages, and page comments.
This chart displays individual identities, not merged contributors as in the Community Management tool. Even if certain identities are combined into one contributor in CM, they still appear as separate entities in the Insights V3 leaderboard charts. This distinction exists because merging contributors combines the contributions of a single individual working under different accounts or identities. The separation of identities in Insights is maintained for privacy reasons and GDPR compliance.
The leaderboard ranks organizations on the Confluence platform according to the following criteria:
New Pages: The number of new pages created by the organization.
Blog Posts: The frequency and quality of blog posts published.
Attachments: The number of files and documents attached to pages and posts.
Page Comments: The level of engagement demonstrated by comments on pages.
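The criteria above combine into a simple ranking. The sketch below uses invented organization names and activity counts (not real data) and ranks organizations by total activities, roughly as the leaderboard does:

```python
# Hypothetical per-organization Confluence activity counts for a
# selected date range; the real dashboard computes these from
# Confluence data.
orgs = [
    {"name": "Acme",    "new_pages": 12, "blog_posts": 3, "attachments": 7, "page_comments": 20},
    {"name": "Globex",  "new_pages": 5,  "blog_posts": 8, "attachments": 2, "page_comments": 30},
    {"name": "Initech", "new_pages": 9,  "blog_posts": 1, "attachments": 4, "page_comments": 10},
]

# Rank by total activities across all tracked activity types.
for org in orgs:
    org["total"] = (org["new_pages"] + org["blog_posts"]
                    + org["attachments"] + org["page_comments"])

leaderboard = sorted(orgs, key=lambda o: o["total"], reverse=True)
for rank, org in enumerate(leaderboard, start=1):
    print(rank, org["name"], org["total"])
# 1 Globex 45
# 2 Acme 42
# 3 Initech 24
```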
The Organization Dependency metric analyzes how much a project's contributions depend on, or are associated with, different organizations.
With this metric, you can assess which organizations are contributing significantly to your project.
Each activity is considered a separate contribution. For example, opening a PR and closing a PR are counted as two distinct contributions.
Engagement Assessment: For organizations involved in the project, this metric helps assess their level of engagement and impact. It can encourage healthy competition among contributors, resulting in greater involvement.
Risk Management: Dependency on a single organization for contributions can be risky. If that organization reduces its involvement, the project might face challenges.
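The underlying idea can be sketched in a few lines: each organization's share of total contributions indicates how dependent the project is on it. The counts below are invented for illustration:

```python
# Hypothetical contribution counts per organization. Each activity
# (e.g. opening a PR, closing a PR) counts as a separate contribution.
contributions = {"Acme": 120, "Globex": 40, "Initech": 40}

total = sum(contributions.values())
shares = {org: count / total for org, count in contributions.items()}

# A single organization holding a large share signals dependency risk.
top_org, top_share = max(shares.items(), key=lambda kv: kv[1])
print(f"{top_org} accounts for {top_share:.0%} of contributions")
# Acme accounts for 60% of contributions
```

A project where one organization accounts for most contributions faces the risk described above if that organization reduces its involvement.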
The Active Days metric measures the number of days a contributor has made at least one contribution to a project. It counts the number of days on which a contributor has been actively engaged in the project's development.
The Active Days chart also displays two bars, one for the current period and one for the previous period, allowing you to compare them.
The Active Days dashboard provides you with the following insights:
You can monitor progress and identify trends. This information can be used to set goals and benchmarks for the project and measure success.
The Active Days metric provides a quick snapshot of the project's activity level. It helps determine whether the project is actively maintained.
The visualization can be used to quickly assess the activity level of a repository. A repository with a high number of active days is likely to be more active and healthy than one with a low number of active days.
By highlighting the importance of active days, project managers can encourage new contributors to become more involved in the project.
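The metric itself reduces to counting distinct days with at least one contribution. A minimal sketch with hypothetical dates:

```python
from datetime import date

# Hypothetical contribution dates for one contributor. A day counts as
# "active" if it has at least one contribution, no matter how many.
events = [
    date(2024, 5, 1),
    date(2024, 5, 1),   # second contribution on the same day
    date(2024, 5, 3),
    date(2024, 5, 10),
]

active_days = len(set(events))  # distinct days only
print(active_days)  # 3
```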
Look for the check icon to verify which projects have Gerrit as a data source.
This metric lists the organizations that were active in the past 6 months but have been inactive in the last 3 months.
This metric is not impacted by the date range selection.
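One way to express this selection, using invented last-activity dates and a fixed reference date (the actual computation in Insights may differ):

```python
from datetime import date, timedelta

today = date(2024, 6, 30)
six_months_ago = today - timedelta(days=182)
three_months_ago = today - timedelta(days=91)

# Hypothetical last-activity dates per organization.
last_activity = {
    "Acme": date(2024, 6, 20),    # still active -> excluded
    "Globex": date(2024, 2, 10),  # active 6 months ago, quiet since -> included
    "Initech": date(2023, 8, 1),  # inactive for over 6 months -> excluded
}

# Keep organizations whose last activity falls in the window
# [6 months ago, 3 months ago).
inactive = [
    org for org, last in last_activity.items()
    if six_months_ago <= last < three_months_ago
]
print(inactive)  # ['Globex']
```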
This metric shows the distribution of activities on your Confluence pages throughout the week. It helps you identify trends in page creation, updates, and engagement over different days.
When you select a date range, this metric aggregates the data by day of the week across that period. For instance, if you select last year as the date range, the metric shows consolidated documentation activities for each day of the week over the entire year.
Analyze the Key Metrics
Total Documentation Activities: This represents the total number of activities logged across community documents, aggregated by each day of the week.
Bar Chart: Each bar corresponds to a day of the week, with the length of the bar representing the total activities for that day.
Utilize Insights
Scheduling: Plan important meetings, updates, or content releases on high-activity days (Tuesday, Wednesday, and Thursday) to maximize engagement.
Resource Allocation: Allocate resources and support more effectively by focusing on the peak activity days.
Trend Analysis: Regularly monitor these trends to adapt strategies and improve engagement based on activity patterns.
The Most Popular Pages metric shows the top-ranked pages based on their activities.
The ranking is calculated based on the number of interactions each page receives during the selected period.
The metric provides insights into which pages are most engaging and popular among team members, helping you identify areas of interest and focus on improving content quality.
The top of the leaderboard indicates the % change in activities compared to the previous period. This gives a general idea of the engagement trend.
Breakdown of the Table
Rank: Indicates the popularity rank of each page based on activities.
Name: The title or name of the page or event.
Activities: The number of activities recorded on that page during the current period.
Change: The change in the number of activities compared to the previous period (e.g., +19).
Total: The percentage of total activities that each page contributes.
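The table's columns can be reproduced with a small sketch. The page names and counts below are invented; the Change and Total columns are derived as described above:

```python
# Hypothetical activity counts per page for the current and previous
# periods, as the Most Popular Pages table would aggregate them.
current = {"Getting Started": 119, "Release Notes": 80, "FAQ": 1}
previous = {"Getting Started": 100, "Release Notes": 90, "FAQ": 10}

total = sum(current.values())
leaderboard = []
for page, count in sorted(current.items(), key=lambda kv: kv[1], reverse=True):
    leaderboard.append({
        "name": page,
        "activities": count,
        "change": count - previous.get(page, 0),  # vs. previous period, e.g. +19
        "share": count / total,                   # fraction of total activities
    })

for rank, row in enumerate(leaderboard, start=1):
    print(rank, row["name"], row["activities"], f'{row["change"]:+d}')
```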
Utilize Insights
Decision Making: Use these metrics to understand which pages or meetings are driving engagement.
Content Focus: Focus more on high-engagement topics for future content or meetings.
Trend Analysis: Regularly monitor these metrics to identify trends and adjust strategies accordingly.
The Confluence Data Analytics Dashboard available in the open-source analytics application, Insights, provides a comprehensive suite of tools designed for deep analysis of user interactions, collaboration patterns, and content efficiency within the Confluence platform. Here is an overview of its features:
Track peak times for user engagement and contributions.
User Interaction Graphs: Visualize the network of collaborations among users.
Page Views and Edits Tracking: Monitor the popularity and evolution of content over time.
Most Engaged Content: Identify the content that receives the most views, comments, and shares.
Team Performance Metrics: Evaluate the productivity and collaboration levels of different teams.
Individual Contribution Insights: Assess the input of individual team members in the collaborative process.
Flexible Filtering: Create custom reports by applying filters based on users, time frames, and content types.
Export Features: Export reports in various formats for sharing or further analysis.
The dashboard integrates seamlessly with Confluence, leveraging its API to pull real-time data. This enables teams to make data-driven decisions, enhance collaboration, and improve content quality on the Confluence platform.
The Key Metrics and Detailed Analysis section includes four high-level tiles with charts highlighting significant trends and patterns within your analytics data.
The cumulative charts help you compare different metrics on single charts. This comparative analysis helps identify relationships and draw meaningful conclusions.
Mailing Lists: Indicates the total number of mailing lists associated with the project for the selected period.
Messages: Displays the total count of messages exchanged within the selected mailing list(s) for the selected period.
Contributors: Shows the total number of contributors actively participating in discussions.
Organizations: Highlights the organizations or entities contributing to the discussions within the mailing lists.
On the Overview page, select the project and repositories (1) for which you want to see the data.
From the left navigation, click Mailing List.
Select the specific period using the filter option (2).
Click the high-level tile (3), which shows the total number of mailing lists, messages, contributors, or organizations for the selected time range.
The detailed analysis charts show the following details:
Mailing Lists: Shows active mailing lists vs. total mailing lists.
Messages: Shows new messages vs. total messages for the selected period.
Contributors: Shows new contributors vs. total contributors for the selected period.
Organizations: Shows new organizations vs. total organizations for the selected period.
The ranking is based on the number of messages or contributions from newly onboarded contributors to the mailing lists during a specific period.
This dashboard provides insights into new contributors' activity for the selected projects. It integrates data from Groups.io to help you understand how new contributors are engaging with your project's mailing lists.
On the Overview page, select the project and repositories for which you want to see the new contributor leaderboard.
From the left navigation, click Mailing Lists.
Look for the date filter in the top-right corner of the dashboard.
Click the calendar icon and select the timeframe you want to analyze. You can choose a predefined option like "Last year" or "This year," or set a custom date range.
Scroll down the Mailing Lists dashboard to see the leaderboard.
The leaderboard displays the top contributors based on their message count during the chosen period.
Each entry shows:
Rank: Position on the leaderboard based on message count.
Contributor Name: Username of the contributor.
Messages: Total number of messages posted.
Last Message Date: Date of the most recent message posted.
Click Show More to expand the list.
Click the Download icon to download the leaderboard.
Identify Potential Contributors: Discover individuals who are actively engaging and might be interested in contributing to other project areas.
Measure Community Growth: Track the rate at which new people are joining the mailing list, which can indicate overall community health.
Regularly review the leaderboard to identify any trends or patterns in new contributor activity.
This leaderboard ranks contributors by their engagement level across all mailing lists within the specific timeframe, showcasing the most engaged and active contributors.
The leaderboard integrates data from Groups.io to help you understand how new contributors are engaging with your project's mailing lists.
On the Overview page, select the project and repositories for which you want to see the most active contributors leaderboard.
From the left navigation, click Mailing Lists.
Look for the date filter in the top-right corner of the dashboard.
Click the calendar icon and select the timeframe you want to analyze. You can choose a predefined option like "Last year" or "This year," or set a custom date range.
Scroll down the Mailing Lists dashboard to see the most active contributors leaderboard.
The leaderboard displays the most active contributors based on their message count during the chosen period.
Each entry shows:
Rank: Position on the leaderboard based on message count.
Contributor Name: Username of the contributor.
Messages: Total number of messages posted.
Last Message Date: Date of the most recent message posted.
Click Show More to expand the list.
Click the Download icon to download the leaderboard.
The Activities Breakdown chart provides a detailed breakdown of various activities on your Confluence pages, including new pages, blog posts, attachments, total pages, and page comments.
To access the Activities Breakdown, follow these steps:
On the Overview page, select the project for which you want to see the data.
On the left navigation pane, click Overview > Confluence.
Select the specific date range using the filter option (2).
Scroll down and you will see the Activities Breakdown widget.
Analyze the distribution of activities to understand user engagement patterns and trends over time.
Click the Download icon to download the chart in PNG or CSV format.
The ranking is based on the number of messages or contributions from newly onboarded organizations to the mailing lists during a selected period.
It highlights the engagement level of these organizations by showcasing their rankings derived from the quantity of messages contributed across the entire spectrum of mailing lists.
On the Overview page, select the project and repositories for which you want to see the new organization leaderboard.
From the left navigation, click Mailing Lists.
Look for the date filter in the top-right corner of the dashboard.
Click the calendar icon and select the timeframe you want to analyze. You can choose a predefined option like "Last year" or "This year," or set a custom date range.
Scroll down the Mailing Lists dashboard to see the leaderboard.
The leaderboard displays the new organizations based on their message count during the chosen period.
Each entry shows:
Rank: Position on the leaderboard based on message count.
Name: Name of the organization.
Messages: Total number of messages posted.
Last Message Date: Date of the most recent message posted.
Click the Download icon to download the leaderboard.
The Geographical Distribution chart is a feature in the Mailing Lists dashboard of the open-source analytics application, Insights V3. This interactive chart provides a visual representation of the geographical locations of contributors to your mailing lists.
It allows you to see where your contributors are located and the extent of their contributions during a selected period.
With this chart, you can:
View the distribution of contributors globally.
Understand the extent of contributions from different regions.
Filter data by a specified period to analyze trends.
To access this feature:
Navigate to the Insights V3 dashboard.
Click on the Mailing Lists section.
Locate and select the Geographical Distribution chart.
This tool is invaluable for community managers looking to understand and grow their global contributor base.
This ranking displays the most recent discussions based on the timing of the last messages posted. It highlights the latest and most active discussions within the mailing lists for the selected period.
The table below provides a snapshot of the most active discussions within our mailing lists, ranked by the timing of the last message posted. This ensures you are always informed about the freshest and most relevant conversations.
Rank | Discussion Topic | Last Message Date
---|---|---
Please ensure to join these discussions to share your insights and contribute to our community's knowledge.
This chart ranks public mailing lists based on their overall activity, considering total messages, unique authors, and contributions from different organizations. It highlights the most active and engaged mailing lists within the project for the selected period.
The Leaderboard provides a snapshot of the most vibrant mailing lists within your project for a selected time period. Here's how to interpret the information:
Ranking: Indicates the position of each mailing list based on activity levels, with #1 being the most active.
Name: The name of the mailing list.
Threads: The count of discussion threads initiated in the mailing list.
Messages: Total number of messages posted in all threads.
Subscribers: The number of individuals subscribed to receive updates from the mailing list.
Contributors: Unique individuals who have posted at least one message to the mailing list.
Organization: The entities (like companies or institutions) that contributors are affiliated with.
Each category also displays a change (+/-) compared to the previous period, helping you see trends like growth or reduction in activity.
Contributor Dependency measures and analyzes the dependencies or relationships between different contributors within a project. It explores how contributors rely on each other, collaborate, and interact in terms of code contributions, reviews, and other collaborative activities.
Contributor dependency shows the relationship between contributors or entities within a project, where the actions or outputs of one contributor depend on the inputs or outputs of another.
This chart displays individual identities, not merged contributors as in the Community Management (CM) tool. Even if certain identities are combined into one contributor in CM, they will still appear as separate entities in the Insights leaderboard charts. This distinction exists because merging contributors combines the contributions of a single individual working under different accounts or identities. The separation of identities in Insights is maintained for privacy reasons and GDPR compliance.
Each activity is considered a separate contribution. For example, opening a PR and closing a PR are counted as two distinct contributions.
Collaboration: It identifies which contributors frequently interact, exchange ideas, review each other's work, and collaborate on code changes.
Knowledge Sharing and Expertise: Understanding these dependencies can help project maintainers identify subject matter experts, encourage knowledge sharing, and allocate resources effectively.
Project Health and Sustainability: By analyzing Contributor Dependency, project maintainers can evaluate the health and sustainability of the project. Dependencies that are concentrated around a few contributors may pose risks if those contributors become less active or leave the project.
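One way such dependencies could be derived is by counting interactions between contributor pairs. The sketch below uses invented review events (author, reviewer); the actual Insights computation may weigh other activity types as well:

```python
from collections import Counter

# Hypothetical review events as (author, reviewer) pairs. Contributor
# names are invented for illustration.
reviews = [
    ("alice", "bob"),
    ("alice", "bob"),
    ("carol", "bob"),
    ("alice", "carol"),
]

# Count interactions per unordered contributor pair; frozenset makes
# (a, b) and (b, a) count as the same relationship.
pair_counts = Counter(frozenset(pair) for pair in reviews)

# The strongest pairwise dependency in this sample.
strongest = pair_counts.most_common(1)[0]
print(strongest)  # (frozenset({'alice', 'bob'}), 2)
```

Highly concentrated pairs (or a single contributor appearing in most pairs) point to the sustainability risks described above.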
A mailing list is a communication platform where you can sign up to send and receive messages via email. It acts as a centralized hub for discussions, announcements, and collaborations among a specific group sharing common interests or working towards a common project.
Groups.io is an online platform that offers mailing list management and hosting facilities. It offers features for creating, managing, archiving mailing lists, and facilitating email communication. To learn more, see https://groups.io/.
The Mailing Lists Dashboard within Insights, integrated with Groups.io, provides data insights into project communications. Its primary objectives and goals include:
Objective: Centralizing and analyzing communication data from Groups.io mailing lists.
Goals:
Communication Analysis: Understand the frequency, nature, and trends of interactions within mailing lists.
Engagement Measurement: Measure user engagement levels, message frequency, and active contributors.
Community Insights: Identify contributors and organizations involved, fostering collaboration and understanding community dynamics.
This ranking showcases the top messages based on the number of responses generated by these messages. It highlights the most engaging and widely discussed topics within the mailing lists for the selected period.
On the Overview page, select the project and repositories (1) for which you want to see the data.
From the left navigation, click Mailing List.
Select the specific period using the filter option (2).
Scroll down the page to the leaderboard.
The leaderboard shows the rankings based on the number of messages. Look through the list to see which messages or topics are most engaging.
The leaderboard includes a feature to compare the increase or decrease of messages over a selected period.
Click Show More to expand the list.
To download the list, click the Download icon.