Evaluat is in private access. Demos are open through May. Book a slot
Performance Testing

Performance testing that runs real browsers.

Push your site to a target concurrent user count. Measure what real browsers actually see. Get a per-session report on every one of those users, with Web Vitals, network, console, and video. Not a request script's guess at it.

Performance Testing is live now. Testing Suite and Monitoring are coming.

Performance Test config

What Evaluat calls performance testing is what other tools call load tests, stress tests, or spike tests. Different traffic shapes, same configuration. We use the umbrella term because the differences are knobs on one test, not separate features. Set the ramp-up, the steady-state, the ramp-down. Pick the scenarios. Hit go.


Configure the test

All the dials you'd expect. None you wouldn't.

Every performance test in Evaluat is built from the same six pieces. Same controls on Starter as on Enterprise.

  • Load mode — Duration (fixed time, target concurrency) or Sessions (target completed runs).
  • Users and duration — concurrent target and runtime. Minimum ramp rates kept realistic.
  • Traffic shape — ramp-up, steady-state, ramp-down. Each phase configured separately.
  • Scenarios with weights — multiple journeys in one test, distribute 100% across them.
  • Region, timezone, locale, viewport — match the customer segment you're modelling.
  • Popup handler toggles — pick which auto-dismiss rules apply for this run.
See the configurable conditions
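A plan built from those six pieces might be sketched as follows. This is purely an illustration: Evaluat tests are configured through the UI, and every field name here is invented for the sketch, not Evaluat's format.

```python
# Hypothetical sketch of a performance-test plan covering the six
# pieces above. All field names are invented for illustration.
plan = {
    "load_mode": "duration",       # or "sessions" (target completed runs)
    "users": 1000,                 # concurrent target
    "traffic_shape": {             # each phase configured separately
        "ramp_up_s": 300,
        "steady_state_s": 1800,
        "ramp_down_s": 300,
    },
    "scenarios": [                 # weights must distribute 100% across journeys
        {"name": "browse-and-checkout", "weight": 70},
        {"name": "search", "weight": 30},
    ],
    "region": "eu-west",
    "timezone": "Europe/Amsterdam",
    "locale": "nl-NL",
    "viewport": "1380x1035",
    "popup_handlers": ["cookie-banner"],  # auto-dismiss rules for this run
}

def validate(plan):
    """Check the one invariant the text states: weights sum to 100%."""
    total = sum(s["weight"] for s in plan["scenarios"])
    if total != 100:
        raise ValueError(f"scenario weights sum to {total}, expected 100")
    return True
```

The point of the sketch is the shape, not the syntax: a load/stress/spike test differs only in the numbers you put in `traffic_shape`, which is why the page treats them as one feature.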
Test Plan configuration
Build it once. Run it everywhere.

Scenarios are reusable.

A scenario you build for a 1,000-user performance test runs as a deployment smoke test in CI and as a 5-minute production monitor. Same definition, same configuration UI, three different lifecycles. The maintenance burden is one place, not three.

Parameterise the scenario with datasets so virtual users follow different paths. Wire a project-level popup handler once so the cookie banner doesn't break every script.

More on scenarios
Browser showing a cookie consent dialog handled by an Evaluat popup handler
What you get back

Five views over the same run.

Every test produces a five-view report. Aggregate enough for the executive summary. Detailed enough to find why something broke for 14 users out of 42,000.

  • Overview — active users, sessions completed, failed sessions, time-series Web Vitals.
  • URL performance — every URL hit, per-URL Web Vitals, status distribution, timing breakdown.
  • Sessions — every virtual user's session, individually addressable, with video.
  • Console logs — every browser message, deduplicated and counted across the run.
  • Network logs — every HTTP request, filterable across millions of rows.

Reports have stable URLs and can be shared read-only with people outside the team.

Engineer presenting an Evaluat performance test report with charts

What teams use it for

From peak-day readiness to "what just broke?".

The patterns we hear most often on demo calls. Different shapes of the same test, with the same forensic detail in the report.

Capacity rehearsals

Push critical paths to 5x, 10x, 20x normal traffic. Find the third-party tag, the slow query, or the cache miss that breaks first.

Release validation

Compare two test runs side-by-side. If LCP regressed by 400ms after this deploy, you'll see it before customers do.
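A side-by-side comparison ultimately reduces to a per-metric delta between two runs. A minimal sketch of that idea, with invented names (this is not Evaluat's API):

```python
# Hypothetical run comparison -- names invented for illustration.
def regression(before, after, metric="lcp_ms"):
    """Delta between two runs for one metric; positive means the
    deploy made things slower."""
    return after[metric] - before[metric]

# A run whose LCP went from 1800ms to 2200ms shows a 400ms regression.
delta = regression({"lcp_ms": 1800}, {"lcp_ms": 2200})
```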

Web Vitals budgets

Set thresholds on LCP, INP, CLS. The test fails when the budget breaks. Stop shipping regressions.
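As an illustration of how a budget gate behaves: the run's measured vitals are checked against fixed thresholds, and any breach fails the test. The thresholds below are Google's published "good" boundaries for the Core Web Vitals; the function and field names are invented for this sketch, not Evaluat's API.

```python
# Hypothetical budget gate -- not Evaluat's API. Thresholds are
# Google's "good" boundaries: LCP <= 2.5s, INP <= 200ms, CLS <= 0.1.
BUDGETS = {"lcp_ms": 2500, "inp_ms": 200, "cls": 0.1}

def budget_failures(measured, budgets=BUDGETS):
    """Return the metrics that broke their budget; the gate fails the
    test when this list is non-empty."""
    return [m for m, limit in budgets.items() if measured.get(m, 0) > limit]
```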

Third-party tag analysis

Analytics, A/B testing, consent banners. Measure exactly how much each one costs you in INP under load.

Regional performance

Run from multiple regions with the right timezone and locale. See what customers in Amsterdam actually experience.

Forensic debugging

Something failed for 12 sessions out of 40,000. Open one. Watch the video. Root cause in an hour, not a week.

Get a demo

See it run on your site.

30 minutes. We build a scenario on your real checkout, run a small test, and walk you through the report views with your data in them.

Session replay preview