Building Orbit: A Dashboard That Actually Works
Orbit is the analytics dashboard demo on our site. It has real charts, date filtering, sortable tables, and CSV export. Not a screenshot. Not a mockup. A working tool, built in one day.
Dashboards are a good test for speed-tier development because everyone has an opinion about what they should look like. The challenge is not building charts. It is building an interface where charts, KPIs, tables, and filters all react to the same state, feel cohesive, and handle edge cases without breaking.
The brief
The brief was one paragraph: an internal analytics dashboard for a SaaS company. Show MRR, signups, churn, and active users. Make the charts interactive. Include a data table with sorting and export. It should feel like a tool someone would actually use, not a portfolio piece.
We wrote the spec in about 30 minutes. It covered: the four KPI cards with period-over-period comparisons, date range filtering (7d, 30d, 90d, 12 months), an MRR line chart with tooltips, a revenue-by-plan bar chart, a sortable data table, and CSV export. We also specified how the fake data should behave: 365 days of SaaS metrics with realistic growth, variance, a churn spike at month 6, and a signup boost at month 9.
Data generation
The data was the most important decision. A dashboard with flat lines and round numbers looks fake immediately. We specified a seeded pseudo-random generator that produces consistent values across renders but with enough variance to look like real business data.
MRR grows from roughly 52,000 to 200,000 SEK over the year, with month-over-month variance of 8 to 15 percent. Revenue splits across three plans: Starter at 30%, Pro at 50%, Enterprise at 20%. The churn spike at day 180 and the signup boost at day 270 give the charts a story to tell, not just a line going up and to the right.
The agent implemented this data layer correctly on the first pass. Specifying the exact parameters (growth range, split percentages, event timing) in the spec eliminated any ambiguity.
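The shape of that data layer can be sketched as follows. This is an illustrative reconstruction, not the demo's actual code: the `mulberry32` PRNG and all function names are assumptions; the growth range, variance band, and event timings come from the spec described above.

```typescript
// Sketch of a seeded data generator (illustrative; names are assumptions).
type DayMetrics = { day: number; mrr: number; signups: number };

// mulberry32: a tiny deterministic PRNG, so every render sees identical data.
function mulberry32(seed: number): () => number {
  return () => {
    seed = (seed + 0x6d2b79f5) | 0;
    let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

function generateYear(seed = 42): DayMetrics[] {
  const rand = mulberry32(seed);
  const days: DayMetrics[] = [];
  for (let day = 0; day < 365; day++) {
    // Smooth growth from ~52,000 to ~200,000 SEK across the year...
    const base = 52000 + (200000 - 52000) * (day / 364);
    // ...plus 8-15% variance so the line looks like real revenue.
    const noise = 1 + (rand() - 0.5) * 2 * (0.08 + rand() * 0.07);
    let mrr = base * noise;
    if (day >= 180 && day < 195) mrr *= 0.92; // churn spike dents MRR at month 6
    let signups = Math.round(8 + rand() * 6);
    if (day >= 270) signups += 5; // signup boost at month 9
    days.push({ day, mrr: Math.round(mrr), signups });
  }
  return days;
}
```

Seeding is the key design choice: the dashboard can recompute everything on every render and still show the same numbers, which keeps charts, KPIs, and exports consistent.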
Recharts and interactivity
We chose Recharts for the chart layer. The line chart shows MRR over time with an area fill and hover tooltips that display the exact value and date. The bar chart breaks down revenue by plan. Both charts respond to the date range filter, which updates every element on the page: KPI cards recalculate, charts redraw, and the data table filters to the selected period.
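The "one filter updates everything" behavior falls out naturally if every widget derives from a single filtered slice of the dataset. A minimal sketch of that selector, with hypothetical names:

```typescript
// Hypothetical sketch: one filtered slice drives every widget on the page.
type Range = "7d" | "30d" | "90d" | "12m";

const RANGE_DAYS: Record<Range, number> = { "7d": 7, "30d": 30, "90d": 90, "12m": 365 };

// KPI cards, both charts, and the table all derive from this one function,
// so changing the range recomputes every element consistently.
function filterByRange<T>(series: T[], range: Range): T[] {
  return series.slice(-RANGE_DAYS[range]);
}
```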
The trickiest part was the tooltip formatting. The agent produced functional tooltips on the first pass, but the number formatting (Swedish locale, SEK currency) needed one correction. We specified the exact format string and it was right on the next iteration.
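The Swedish-locale currency formatting is a one-liner with `Intl.NumberFormat`; a sketch of what the corrected formatter likely looks like (the function name is an assumption):

```typescript
// Sketch of the tooltip value formatter; sv-SE/SEK per the spec,
// the helper name is an assumption.
const sekFormatter = new Intl.NumberFormat("sv-SE", {
  style: "currency",
  currency: "SEK",
  maximumFractionDigits: 0, // whole kronor in tooltips
});

function formatTooltipValue(value: number): string {
  // sv-SE renders SEK with a trailing "kr" and non-breaking group separators.
  return sekFormatter.format(value);
}
```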
The data table
The sortable data table is where the demo separates itself from a typical portfolio dashboard. Click any column header to sort ascending or descending. The table shows the top 20 rows of the filtered dataset and updates live as you change the date range. CSV export downloads whatever you are looking at.
The agent handled the sort logic and CSV generation without any correction. These are well-defined, deterministic features that map directly from a spec to an implementation. No aesthetic judgment required.
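Both features reduce to small pure functions. A sketch under assumed names, showing the deterministic shape that makes them easy to specify:

```typescript
// Sketch of the table's sort and export helpers (names are assumptions).
type Row = Record<string, string | number>;

function sortRows(rows: Row[], key: string, dir: "asc" | "desc"): Row[] {
  const sign = dir === "asc" ? 1 : -1;
  return [...rows].sort((a, b) => {
    const av = a[key], bv = b[key];
    if (typeof av === "number" && typeof bv === "number") return (av - bv) * sign;
    return String(av).localeCompare(String(bv)) * sign;
  });
}

// Export exactly what the user is looking at: the filtered, sorted rows.
function toCsv(rows: Row[]): string {
  if (rows.length === 0) return "";
  const headers = Object.keys(rows[0]);
  const escape = (v: string | number) => {
    const s = String(v);
    // Quote fields containing commas, quotes, or newlines (RFC 4180 style).
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const lines = rows.map(r => headers.map(h => escape(r[h])).join(","));
  return [headers.join(","), ...lines].join("\n");
}
```

Feeding `toCsv` the already-filtered, already-sorted rows is what makes the export match the screen for free.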
KPI cards and period comparison
Each of the four KPI cards (MRR, New Signups, Churn Rate, Active Users) shows the current value and a comparison to the previous period. Select “Last 30 days” and each card shows the 30-day value alongside the change from the prior 30 days, with green or red arrows.
This required the agent to implement the comparison logic correctly for each metric type: absolute values for MRR and signups, percentage for churn, and count for active users. One correction was needed: the agent initially showed the percentage change for all four, but churn rate should display the absolute difference in percentage points, not the relative change. A small but important distinction.
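The distinction is easy to encode once named. A hedged sketch of the comparison logic (the types and function names are assumptions, not the demo's code):

```typescript
// Sketch of per-metric period comparison. The key distinction from the
// correction above: rate metrics (churn) compare in percentage points,
// everything else compares as relative change.
type MetricKind = "absolute" | "rate";

function periodDelta(current: number, previous: number, kind: MetricKind): string {
  if (kind === "rate") {
    // Churn: 3.1% -> 2.4% should read "-0.7 pp", not "-22.6%".
    const pp = current - previous;
    return `${pp >= 0 ? "+" : ""}${pp.toFixed(1)} pp`;
  }
  // MRR, signups, active users: relative change vs. the prior period.
  const pct = ((current - previous) / previous) * 100;
  return `${pct >= 0 ? "+" : ""}${pct.toFixed(1)}%`;
}
```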
Timeline
Total time from blank spec to deployed demo: about 7 hours. Spec writing took 30 minutes. The agent ran for approximately 5 hours across two sessions. Human review, corrections, and deployment took about 1.5 hours. The final output: roughly 2,200 lines of TypeScript across 12 files.
A comparable dashboard built traditionally by a developer familiar with the stack would take 3 to 5 days. The primary time savings came from the data generation layer and the chart integration, both of which the agent produced from spec with minimal correction.
What we learned
Interactivity is where AI-native development shines brightest. The filtering, sorting, and state management that connects charts to tables to KPI cards is tedious to write by hand but straightforward for an agent to generate from a clear spec. The spec says "date range filter updates all elements" and the agent wires the state correctly.
The correction rate on Orbit was lower than on Pulse. We attribute this to the spec being more precise: exact data parameters, exact metric types, exact interaction behaviors. Dashboards are inherently more specifiable than landing pages because every element has a defined function rather than an aesthetic goal.
The live demo is at /demos/dashboard. Every chart is interactive. Every number is computed from the underlying data. It is a tool, not a picture of one.