Tests & Results

Configure, run, and analyze UX tests.

Creating a Test

Every test starts with a URL and a task description. Here are the key configuration options (a configuration sketch follows the list):

  • URL — The page where personas will start their session. For public sites, use the full URL. For localhost or VPN sites, use the Chrome Extension.
  • Task description — A natural language instruction telling personas what to do (e.g. "Sign up for a free account").
  • Success criteria — How the persona knows the task is complete (e.g. "You see a welcome dashboard").
  • Panel size — The number of personas to run (1–20 depending on your plan).
  • Execution mode — Cloud (CrowdRunner servers) or Your Browser (Chrome Extension).
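
The dashboard is the primary way to configure a test, but the options above translate directly into a script if you automate test creation. The sketch below is illustrative only: the endpoint, field names, and API key header are assumptions, not a documented CrowdRunner API.

import json
import urllib.request

# Hypothetical test configuration; the field names mirror the options above
# but are assumptions, not the documented CrowdRunner API.
test_config = {
    "url": "https://example.com/signup",
    "task": "Sign up for a free account",
    "success_criteria": "You see a welcome dashboard",
    "panel_size": 5,               # 1-20 depending on your plan
    "execution_mode": "cloud",     # or "browser" for the Chrome Extension
}

# Placeholder endpoint and auth header; both are assumptions.
req = urllib.request.Request(
    "https://api.crowdrunner.example/v1/tests",
    data=json.dumps(test_config).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer YOUR_API_KEY",
    },
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))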

Test Modes

Task Mode

The default mode. Personas follow your task description and try to complete it. Best for testing specific user flows like checkout, sign-up, or search.

Exploration Mode

Personas browse your site freely without a specific goal. Useful for discovering general usability issues and understanding how different users naturally navigate your interface.

Chaos Mode

Personas deliberately try unusual interactions — edge cases, unexpected inputs, rapid navigation. Great for stress-testing your UI and finding error handling gaps.
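
If you build test configurations in code, the mode is just another field on the test. The identifiers below are assumptions chosen for illustration, not documented values.

# Hypothetical mode identifiers; the real values may differ.
MODES = {
    "task": "Follow the task description and try to complete it.",
    "exploration": "Browse freely with no fixed goal.",
    "chaos": "Probe edge cases, odd inputs, and rapid navigation.",
}

def build_test(url: str, task: str, mode: str = "task") -> dict:
    """Assemble a test configuration dict (a sketch, not the official API)."""
    if mode not in MODES:
        raise ValueError(f"unknown mode: {mode!r}")
    return {"url": url, "task": task, "mode": mode}

print(build_test("https://example.com/checkout", "Buy the cheapest mug", mode="chaos"))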

Understanding Results

Session Timeline

Each persona's session is recorded as a step-by-step timeline with screenshots. You can click through each step to see exactly what the persona saw, what action they took, and what they were thinking.
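
If you export or post-process session data, each timeline step can be treated as a small record of what the persona saw, did, and thought. The shape below is an assumed illustration, not a documented export format.

# Illustrative shape of one timeline step (an assumption, not an export schema).
step = {
    "step": 3,
    "screenshot": "session_42/step_03.png",
    "action": "Clicked the 'Create account' button",
    "thinking": "The button label matches the task, so this looks like the next step.",
}

def summarize(s: dict) -> str:
    """Render a single step as a one-line summary for a log or report."""
    return f"Step {s['step']}: {s['action']} ({s['screenshot']})"

print(summarize(step))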

Screenshots & Comparison

The comparison view shows screenshots from multiple personas side by side at each step. This makes it easy to spot where different user profiles diverge in their experience.

Friction Points

CrowdRunner automatically identifies moments where personas got confused, encountered errors, or struggled to complete an action. Each friction point includes the step number, a description of the issue, and a severity rating.
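
When triaging results in a script, it can help to sort friction points by severity before reading them step by step. The records and severity values below are assumptions that mirror the fields described above.

# Assumed friction-point records; the fields mirror the description above.
friction_points = [
    {"step": 4, "issue": "Coupon field rejected a valid code", "severity": "high"},
    {"step": 2, "issue": "Persona hesitated over ambiguous nav labels", "severity": "low"},
]

# Triage the most severe issues first.
severity_order = {"high": 0, "medium": 1, "low": 2}
for fp in sorted(friction_points, key=lambda f: severity_order[f["severity"]]):
    print(f"[{fp['severity'].upper()}] step {fp['step']}: {fp['issue']}")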

Reports

After all sessions complete, CrowdRunner generates an AI synthesis report that summarizes the following (a small sketch after the list shows how the completion figure is derived):

  • Overall task completion rates across the panel
  • Common friction points and their severity
  • Demographic patterns — did certain user profiles struggle more than others?
  • Actionable recommendations for improvement
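
As a rough illustration of where the headline completion number comes from, the sketch below aggregates assumed per-session outcomes; the synthesis report does this for you.

# Assumed per-session outcomes; the report aggregates figures like these
# automatically. This sketch only shows where the number comes from.
sessions = [
    {"persona": "Retired teacher, low tech familiarity", "completed": False},
    {"persona": "College student, heavy mobile user", "completed": True},
    {"persona": "Small-business owner", "completed": True},
]

completion_rate = sum(s["completed"] for s in sessions) / len(sessions)
print(f"Task completion: {completion_rate:.0%} across {len(sessions)} personas")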

A/B Testing

Compare two versions of a page or flow by creating an A/B test. Navigate to A/B Tests in the dashboard, enter two URLs, and CrowdRunner runs the same cohort against both. The comparison report highlights differences in task completion, satisfaction, and friction points.
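
For teams that keep test definitions in version control, an A/B test boils down to one configuration with two URLs. The keys below are illustrative assumptions, not a documented schema.

# Hypothetical A/B test configuration; the keys are illustrative assumptions.
ab_test = {
    "name": "Checkout redesign",
    "variant_a": "https://example.com/checkout",
    "variant_b": "https://staging.example.com/checkout-v2",
    "task": "Buy any item and reach the order confirmation page",
    "panel_size": 10,  # the same persona cohort runs against both variants
}
print(ab_test)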

Scheduling Tests

Set up recurring tests to monitor your site over time. Go to Schedules, configure a test with a cron expression (e.g. every Monday at 9am), and CrowdRunner will run it automatically. This is useful for catching regressions after deployments.
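
A cron expression has five fields (minute, hour, day of month, month, day of week). The sketch below spells out the "every Monday at 9am" example; the configuration keys around it are assumptions, not a documented schedule format.

# "Every Monday at 9am" as a standard five-field cron expression:
#   minute hour day-of-month month day-of-week
schedule = "0 9 * * 1"

# Assumed schedule configuration; the field names are illustrative.
recurring_test = {
    "test_id": "signup-flow",  # hypothetical identifier of an existing test
    "cron": schedule,
    "notify": ["ux-team@example.com"],
}
print(recurring_test)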

Sharing Results

Generate a shareable link for any test result to share with teammates or stakeholders. Shared links provide read-only access to the full results without requiring a CrowdRunner account.
