Enhancing the Design Process with Azure ML–Powered A/B Test

Experimented with Azure Machine Learning to build a modular A/B test pipeline that predicts UI performance and accelerates iteration for B2B products

Personal Project

Azure Machine Learning

A/B Test Framework

OVERVIEW /

Curious about how machine learning can inform product decisions, I ran a self-initiated experiment using Microsoft Azure Machine Learning to build a predictive A/B test pipeline from real Amazon web-experiment data comparing two UI versions.

Context

Project Goal

Explore how machine learning can accelerate product validation and decision-making in data-limited B2B environments, helping teams test faster with credibility.

Why this matters

Traditional A/B tests are costly, slow, and data-scarce. Enterprise tools often cost $5K–$10K per month, a single test can take 2+ weeks, and most B2B products lack the traffic volume needed for statistical confidence.

For designers, this means great ideas stall before they’re proven, and iteration becomes a cycle of waiting instead of discovery.
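As a rough illustration of the traffic problem, a quick power calculation shows how many users a conventional A/B test needs before a modest lift becomes statistically detectable. This is a minimal sketch using statsmodels; the 10% vs. 11% conversion rates are assumed example values, not figures from this project.

```python
# Rough sample-size estimate for a classic two-proportion A/B test.
# Assumed numbers: baseline conversion 10%, hoping to detect a lift to 11%.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline = 0.10   # assumed control conversion rate
variant = 0.11    # assumed treatment conversion rate (a 10% relative lift)

effect_size = proportion_effectsize(variant, baseline)
n_per_variant = NormalIndPower().solve_power(
    effect_size=effect_size,
    alpha=0.05,           # 5% false-positive rate
    power=0.80,           # 80% chance of detecting the lift if it exists
    alternative="two-sided",
)

print(f"Users needed per variant: {n_per_variant:,.0f}")
# Several thousand users per variant; for a low-traffic B2B product,
# that can translate into weeks of waiting for a single test.
```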

Design Opportunity

I explored how predictive modeling could generate early engagement signals without waiting for full experiments, combining data science and product design to accelerate iteration and reduce guesswork.

Executed Project

Using sample A/B test data from Amazon e-commerce sites, I built a predictive testing pipeline in Azure Machine Learning Studio, training a Boosted Decision Tree Regression model to forecast engagement and conversion between two UI variants.
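Before wiring anything into Azure, it helps to look at what the raw experiment data contains. Below is a minimal sketch of that first inspection step using pandas; the file name `ab_test_sample.csv` and the column names are assumptions standing in for the actual Amazon sample dataset.

```python
# Quick look at the imported A/B sample data before building the pipeline.
# "ab_test_sample.csv" and the column names are placeholders for the real dataset.
import pandas as pd

df = pd.read_csv("ab_test_sample.csv")

# Expected shape (assumed): one row per session, with a UI variant label,
# behavioral signals, and the observed conversion rate.
print(df.head())
print(df["Variant"].value_counts())                      # sessions that saw Version A vs. B
print(df.groupby("Variant")["Conversion Rate"].mean())   # raw conversion by variant
```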

Process

Reframing A/B Testing as a Design System: Architected a predictive pipeline that lets designers forecast outcomes before running full experiments

  1. I mapped the end-to-end pipeline architecture to visualize how data would flow through the A/B test pipeline. This helped me clarify at which points machine learning could add support.

  2. I configured the workflow in Azure Machine Learning Studio, turning that architecture into a working prototype that connected imported CSV sample data, model training, and evaluation.

  3. I designed for balance and reliability by allocating 70% of the data for training and 30% for testing, ensuring the model learned effectively while still validating accuracy (this split is mirrored in the sketch after this list).

  4. I trained the model on user behavior signals, labeling “Conversion Rate” as the target variable so the system could learn which UI factors drive engagement.

  5. Once the pipeline was ready, I ran the job and validated its performance by reviewing the model-scoring results and confirming a successful, failure-free execution.

  6. The completed run produced the results below from the “Score Model” component, including the predicted engagement (Scored Labels) for the Conversion Rate column.
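To sanity-check the pipeline logic outside the Studio canvas, here is a minimal local analogue of steps 3–6, assuming a DataFrame like the one in the earlier sketch. It uses scikit-learn's GradientBoostingRegressor as a stand-in for Azure ML's Boosted Decision Tree Regression component; the feature column names are assumptions.

```python
# Local analogue of the Azure ML designer flow: Split Data -> Train Model -> Score Model.
# GradientBoostingRegressor stands in for the Boosted Decision Tree Regression component.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("ab_test_sample.csv")            # placeholder file name

# Assumed behavioral signals; "Conversion Rate" is the labeled target variable.
features = ["Variant", "Clicks", "Time on Page", "Pages per Session"]
X = pd.get_dummies(df[features], columns=["Variant"])   # encode the A/B label
y = df["Conversion Rate"]

# Split Data: 70% for training, 30% held out for testing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, random_state=42
)

# Train Model: boosted decision trees learning which UI factors drive engagement.
model = GradientBoostingRegressor(random_state=42)
model.fit(X_train, y_train)

# Score Model: predicted engagement ("Scored Labels") for the held-out 30%.
scored_labels = model.predict(X_test)
print("MAE on held-out data:", mean_absolute_error(y_test, scored_labels))
```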

Result

Machine Learning-Powered A/B Testing Confirms Version A Wins

+9.8% higher

Version A Conversion Rate
Ver A (0.1108) vs. Ver B (0.1009)

+19.6% higher

Version A Predicted Engagement
Ver A (0.1270) vs. Ver B (0.1062)

I connected the Azure ML output to Power BI (a data visualization tool) to visualize the model results and translate the raw numbers into design insights.
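The handoff to Power BI can be as simple as exporting the scored results to a flat CSV that Power BI imports as a data source. Below is a minimal sketch of that export, assuming the `model`, `X_test`, and `df` objects from the previous sketch; the output file name is hypothetical.

```python
# Export scored results in a flat, Power BI-friendly shape.
# Assumes `model`, `X_test`, and `df` from the previous sketch.
results = X_test.copy()
results["Scored Labels"] = model.predict(X_test)                    # predicted engagement
results["Conversion Rate"] = df.loc[X_test.index, "Conversion Rate"]
results["Variant"] = df.loc[X_test.index, "Variant"]

# One row per UI variant: mean observed conversion vs. mean predicted engagement.
summary = (
    results.groupby("Variant")[["Conversion Rate", "Scored Labels"]]
    .mean()
    .reset_index()
)

summary.to_csv("ab_model_results_for_powerbi.csv", index=False)
print(summary)
```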

The visualization clearly showed that Version A outperformed Version B in both predicted engagement and conversion rate, revealing which UI direction resonated more with users and demonstrating how data can inform design decisions before launch.

Power BI (data visualization tool) - Model Results as Data Insights

Reflection

Exploring Emerging Tools to Redefine the Design Process

This project reflects how I approach design with curiosity and systems thinking: constantly exploring how emerging tools can expand what’s possible in the design process.

By experimenting hands-on with Azure ML and Power BI, I discovered how integrating data, AI, and workflow design can unlock faster, more informed ways to test and iterate—delivering product decisions with both speed and quality.

As a next step, I plan to deploy this as a prototype tool and run evaluative testing with designers, gathering feedback to refine its efficiency and understand how it can best support real-world product design workflows.