31 Active Experiments
67 Experiments Completed
24 Failed Experiments

SKY Labs

Pure experiments with transparent results. Early-stage AdSense tests, subdomain experiments, and pattern recognition in small datasets from the SKY ecosystem.

Lab Principles

No fake numbers • No revenue screenshots
Focus on process, learning, and observations
Hedged language only: early-stage, directional, initial, observational

SKY Labs is our public experiment log where we test hypotheses, document failures, and share learnings from early-stage experiments across the SKY ecosystem. No conclusions—just transparent observations.

SKY Labs Experiments

How We Run Public Experiments

Ongoing • Methodology

Our transparent methodology for running and documenting public experiments across the SKY ecosystem.

Methodology • Transparency
View Methodology

Early-Stage AdSense Experiments Explained

Completed • 60 days

Testing AdSense layouts, placements, and formats on low-traffic SKY ecosystem sites.

AdSense • Monetization
View Results

What Low Traffic Data Can (and Can't) Tell You

Ongoing • Analytics

Understanding the limits and possibilities of data analysis with 100-1000 monthly visitors.

Analytics • Data
Read Analysis

Experiment Log: First 30 Days of a New Subdomain

Completed • 30 days

Day-by-day documentation of what happens when launching a new TrainWithSKY subdomain.

SEO • Subdomains
View Log

Why Some Experiments Fail Completely

Completed • Learnings

Detailed analysis of 24 failed experiments and what we learned from each.

Failure • Learning
Read Analysis

Testing Without Scale: Is It Still Useful?

Ongoing • Methodology

The value (and limitations) of running experiments with small sample sizes.

Methodology • Testing
Read Analysis
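To make the limits of small-sample testing concrete, here is a minimal sketch (the function name and numbers are illustrative, not a SKY Labs tool) that computes a Wilson score interval for a conversion rate. At typical small-site traffic, the uncertainty around an observed rate is often wider than the effect you hoped to detect:

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score confidence interval for a proportion."""
    if n == 0:
        return (0.0, 1.0)
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (centre - half, centre + half)

# Hypothetical example: 9 conversions from 300 visitors (3% observed).
# The plausible range runs from roughly 1.6% to 5.6% -- so a "2% vs 3%"
# difference between two variants is well inside the noise.
low, high = wilson_interval(successes=9, n=300)
print(f"3% observed, plausible range: {low:.1%} to {high:.1%}")
```

This is why small-traffic results are best treated as directional observations rather than conclusions.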

How We Document Experiments Honestly

Completed • Documentation

Our template and process for transparent experiment documentation.

Documentation • Transparency
View Template

When to Stop an Experiment Early

Completed • Decision Making

Criteria and signals for stopping experiments before planned completion.

Decision Making • Methodology
Read Guidelines

Pattern Recognition in Small Datasets

Ongoing • Data Analysis

Techniques for identifying meaningful patterns in limited data.

Data Analysis • Patterns
View Techniques

Lessons Learned from Failed Tests

Completed • Learnings

Compilation of key insights from 24 failed experiments across the SKY ecosystem.

Learnings • Failure
View Lessons

Experiments Across SKY Ecosystem

All experiments are conducted on real SKY platforms with actual users:

SKY TTS

AI voice experiments, user retention studies, TTS optimization tests

Visit Platform

SKY ConverterTools

UX experiments, conversion rate tests, tool performance optimization

Visit Platform

TrainWithSKY

18 subdomain experiments, content strategy tests, learning engagement studies

Visit Platform

Our 5-Step Experiment Methodology

Every SKY Labs experiment follows this transparent process:

1. Define Clear Hypothesis

"If we change X, we expect Y to happen because Z." Specific, testable, and measurable.

2. Set Success Criteria

What data would support or reject our hypothesis? Defined before starting, never changed mid-experiment.

3. Run Controlled Test

Change one variable at a time. Control group where possible. Document all setup details.

4. Document Everything

Setup, timeline, unexpected events, all data points (even contradictory ones). No cherry-picking.

5. Share Learnings Publicly

What worked, what failed, what surprised us. No fake numbers, no revenue screenshots.
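The five steps above can be sketched as a simple record structure. This is an illustrative Python sketch, assuming field names of my own choosing, not SKY Labs' actual documentation template:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Experiment:
    hypothesis: str             # 1. "If we change X, we expect Y because Z."
    success_criteria: str       # 2. Defined before starting, never changed.
    variable_changed: str       # 3. One variable at a time.
    log: list = field(default_factory=list)  # 4. Everything, even contradictions.
    outcome: str = ""           # 5. Shared publicly once the run ends.
    started: date = field(default_factory=date.today)

    def record(self, note: str) -> None:
        """Append a dated observation. No cherry-picking: entries are never removed."""
        self.log.append(f"{date.today().isoformat()}: {note}")

# Hypothetical usage mirroring the navigation-menu experiment mentioned above.
exp = Experiment(
    hypothesis="If we simplify the nav menu, engagement rises because choice overload drops.",
    success_criteria="Directional lift in pages/session over 30 days.",
    variable_changed="Navigation layout only",
)
exp.record("Day 1: baseline captured")
```

Locking `success_criteria` in before the first data point arrives is what keeps the write-up honest at step 5.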

Failed Experiments Archive

24 experiments that didn't work as expected, but taught us valuable lessons:

  • Social Media Auto-Posting: Automated cross-posting reduced engagement by 42% compared to platform-native content.
  • Early Exit Popups: Popups at 5 seconds increased email list growth but decreased page engagement by 31%.
  • AI Content Detection: No consistent pattern found in distinguishing AI vs human-written content on early-stage sites.
  • Complex Navigation Menus: Advanced mega-menus decreased user engagement by 28% compared to simple navigation.
View All Failed Experiments

SKY Labs Principles

No Fake Numbers

We show actual data, even when it's unimpressive or contradicts popular beliefs.

Transparent Process

We document setup, methodology, and unexpected events—not just final results.

Failure Documentation

Failed experiments are often more valuable than successful ones. We share both.

Small Data Focus

We work with realistic traffic levels (100-1000 visitors/month), not millions.

Why Transparent Experiments Matter

Builds Real Trust

Sharing failures and small wins builds more credibility than only showcasing successes.

Realistic Benchmarks

Most creators have small traffic. Our experiments reflect that reality, not enterprise-scale data.

Learning Through Doing

Experiments force us to be specific about hypotheses and measurement.

Community Learning

Shared experiments help others avoid our mistakes and build on our learnings.