The primary goal of this Power BI dashboard is to empower stakeholders with a clear, data-driven comparison between two user experiences: a Control group using a standard submit button and a Test group using a free quote button. By synthesizing visitor logs, the dashboard provides actionable insights into which variation yields higher engagement, lower bounce rates, and ultimately, greater revenue, guiding final implementation decisions.
The datasets used in this project (Advanced_AB_Test_Data.csv and Realistic_Advanced_AB_Test_Data.csv) were generated entirely by AI for demonstration purposes. They simulate realistic web traffic logs, capturing fields such as visit dates, device types, traffic sources, bounce flags, and revenue generated per user.
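To make the expected log structure concrete, here is a minimal pandas sketch of rows shaped like the fields described above. The column names (`visit_date`, `group`, `device`, `traffic_source`, `bounced`, `converted`, `revenue`) are illustrative assumptions, not the actual headers of the CSV files:

```python
import pandas as pd

# Hypothetical rows mirroring the kind of fields the visitor logs capture;
# the real column names in Advanced_AB_Test_Data.csv may differ.
sample = pd.DataFrame({
    "visit_date": pd.to_datetime(["2024-01-01", "2024-01-01", "2024-01-02"]),
    "group": ["Control", "Test", "Test"],
    "device": ["Desktop", "Mobile", "Desktop"],
    "traffic_source": ["Organic", "Paid Social", "Organic"],
    "bounced": [1, 0, 0],          # 1 = left without interacting
    "converted": [0, 1, 1],        # 1 = clicked the button
    "revenue": [0.0, 12.5, 20.0],  # revenue generated per user
})
print(sample.dtypes)
```

Loading the real files is then a matter of `pd.read_csv("Advanced_AB_Test_Data.csv")` and adjusting the column names to match.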
This dashboard is built with a focus on modern, minimalist design to keep the attention strictly on the data. Each visualization serves a specific analytical function:
- High-Level Key Performance Indicator Cards: To provide an immediate snapshot of core metrics—Total Visitors, Overall Conversion Rate, Average Revenue Per User, and Total Revenue—allowing for a rapid assessment of overall performance.
- Conversion Rate by Test Group: To directly compare the primary success metric, conversion rate, between the Control and Test variations.
- Revenue and Engagement Over Time: To track daily performance trends. This ensures that the overall results are consistent and not skewed by a single anomalous day, weekend drop-off, or external event.
- Segmentation by Device and Traffic Source: To break down the results by user categories. This helps identify whether the Test variation performs exceptionally well on specific platforms, such as mobile versus desktop, or acquisition channels, such as organic search versus paid social media.
- Bounce Rate Comparison: To evaluate if the new button format is inadvertently driving users away without interaction, serving as a necessary balance to the conversion rate.
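The KPI cards and per-group comparisons above boil down to a handful of aggregations. In the dashboard these are DAX measures, but the same logic can be prototyped in pandas. This is a sketch over a toy log; the column names (`group`, `converted`, `bounced`, `revenue`) are assumptions, not the real CSV headers:

```python
import pandas as pd

# Toy visitor log standing in for the real dataset (assumed columns).
df = pd.DataFrame({
    "group":     ["Control"] * 4 + ["Test"] * 4,
    "converted": [0, 1, 0, 0,      1, 1, 0, 1],
    "bounced":   [1, 0, 1, 0,      0, 0, 1, 0],
    "revenue":   [0.0, 10.0, 0.0, 0.0, 15.0, 5.0, 0.0, 20.0],
})

# High-level KPI card values.
total_visitors  = len(df)
conversion_rate = df["converted"].mean()  # share of visitors who clicked
arpu            = df["revenue"].mean()    # average revenue per user
total_revenue   = df["revenue"].sum()

# Per-group comparison backing the conversion and bounce rate visuals.
by_group = df.groupby("group").agg(
    conversion_rate=("converted", "mean"),
    bounce_rate=("bounced", "mean"),
    revenue=("revenue", "sum"),
)
print(total_visitors, conversion_rate, arpu, total_revenue)
print(by_group)
```

The bounce rate column acts as the counterweight described above: a variation that lifts conversions but also lifts bounces would warrant closer scrutiny before rollout.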
Based on the analysis of the simulated dataset, the dashboard reveals the following key outcomes:
- Conversion Improvement: The Test group demonstrated a notable increase in the overall conversion rate compared to the Control group.
- Revenue Impact: The increased engagement in the Test group directly correlated with higher Total Revenue and an improved average revenue per user.
- Segment Variations: While the Test group performed better overall, segmentation analysis indicates that the improvement was most significant among Desktop users. Mobile performance remained relatively flat between the two groups, highlighting an area for future mobile-specific interface optimization.
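The device-level finding above, a strong Desktop lift with flat Mobile performance, can be reproduced with a simple pivot. This sketch uses fabricated toy rows chosen to mimic that pattern; the column names are assumptions:

```python
import pandas as pd

# Toy log shaped to mimic the Desktop-driven lift described above.
df = pd.DataFrame({
    "group":  ["Control", "Control", "Control", "Control",
               "Test", "Test", "Test", "Test"],
    "device": ["Desktop", "Desktop", "Mobile", "Mobile"] * 2,
    "converted": [0, 0, 1, 0,  1, 1, 1, 0],
})

# Conversion rate per device per group, plus the Test-minus-Control lift.
seg = df.pivot_table(index="device", columns="group",
                     values="converted", aggfunc="mean")
seg["lift"] = seg["Test"] - seg["Control"]
print(seg)
```

A near-zero `lift` row for Mobile alongside a large Desktop lift is exactly the signal that points toward mobile-specific interface optimization as the next step.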
- Advanced_AB_Test_Data.csv - The primary AI-generated dataset containing A/B testing visitor logs.
- Realistic_Advanced_AB_Test_Data.csv - A secondary dataset variant including detailed timestamps for deeper time-series analysis.
- AB Testing Results Dashboard.pbix - The complete Power BI project file containing the data model, measures, and visualizations.
- AB Testing Results Dashboard.png - A static screenshot of the final dashboard layout.
- Clone this repository to your local machine.
- Open the AB Testing Results Dashboard.pbix file using Power BI Desktop.
- If necessary, refresh the data connections to point to the local paths of the included csv files.
- Important for version control: If your .gitignore excludes CSV files, remove that rule so the datasets can be committed and tracked alongside the .pbix and dashboard files.
Benchz Sobrepeña