Today, they operate two flagship products: Tito, a platform for selling tickets online, and Vito, a platform for taking events and communities online, with a focus on the developer community and a better user experience. Despite the team's relatively small size, they were never afraid to embrace new technologies, and they invited Evil Martians to guide them through this process.
Online events have a distinctive load profile: long stretches of quiet time interspersed with very busy periods during the events themselves. To tackle this, the team trusted us to do a full migration to Kubernetes. Today, the platform runs on a K8s cluster on AWS EKS, with databases on RDS and cache servers on ElastiCache.
Beyond the Kubernetes migration itself, our SRE work included squeezing more performance out of the existing infrastructure, solving scaling problems, and improving load balancing and testing. As a result of our efforts, deployment time was reduced by 50%.
We also noticed that many testing and deployment services had the Vito team on overly costly plans, and surges in load drove the average bill up even further. We recommended some cost optimizations, and monthly infrastructure expenses decreased by 30%.
Initially, the real-time communication features of Vito's Ruby on Rails platform (chat, notifications, and live content updates) were built with Action Cable, but the team's first big event resulted in some serious scaling headaches: handling a few thousand simultaneous connections required larger Heroku dyno sizes, with all of them maxed out. The Vito team seemed to be hitting a limit before they had even properly gotten started.
The Vito platform needed something extremely efficient out of the box: able to handle thousands of concurrent users, all receiving live updates simultaneously. Further, they needed a solution that was Action Cable compatible and could be adopted without major changes to their Ruby app. AnyCable fit all of the above, so the team was naturally drawn to it.
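That compatibility means the application's channel classes stay untouched; only the WebSocket transport underneath changes. Here is a minimal sketch of the channel pattern that keeps working as-is (all names are hypothetical, and the stub stands in for the `ApplicationCable::Channel` base class a real Rails app would provide):

```ruby
# Stub standing in for ApplicationCable::Channel, so this sketch runs
# outside a Rails app; a real channel inherits the Rails-provided base.
class ChannelStub
  attr_reader :streams

  def initialize
    @streams = []
  end

  # Record a subscription to a named broadcast stream.
  def stream_from(name)
    @streams << name
  end
end

# Hypothetical channel: this application code needs no changes when
# the transport is switched from Action Cable to AnyCable.
class EventUpdatesChannel < ChannelStub
  def subscribed(event_id)
    stream_from "event_updates:#{event_id}"
  end
end

channel = EventUpdatesChannel.new
channel.subscribed(42)
puts channel.streams.first # => "event_updates:42"
```

In a real app, the switch is then largely a matter of pointing the WebSocket endpoint at the AnyCable server instead of the built-in Action Cable one.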
We also guided the Vito team through some basic performance optimizations and, before their first major event, ran stress tests in which k6 flooded the servers to determine their maximum capacity under various configurations.
This migration helped Vito withstand peak loads of thousands of concurrent users, scale up fast, and remain cost-effective during quiet periods. AnyCable Pro now powers Vito's notifications and live-updating features: every piece of content on Vito updates live over WebSockets, without requiring a refresh. Vito also includes a WebSocket-powered live-chat feature backed by AnyCable.
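The fan-out behind those live updates can be sketched as a tiny in-memory pub/sub (all names here are hypothetical; in a Rails app the equivalent call is `ActionCable.server.broadcast`, which AnyCable then delivers at scale):

```ruby
# Minimal in-memory sketch of the broadcast fan-out pattern behind
# Action Cable / AnyCable live updates.
class BroadcastServer
  def initialize
    @subscribers = Hash.new { |hash, key| hash[key] = [] }
  end

  # Register a connection's callback for a named stream.
  def subscribe(stream, &handler)
    @subscribers[stream] << handler
  end

  # Push one payload to every connection subscribed to the stream.
  def broadcast(stream, payload)
    @subscribers[stream].each { |handler| handler.call(payload) }
  end
end

server = BroadcastServer.new
received = []
3.times { |i| server.subscribe("chat:42") { |msg| received << [i, msg] } }

# One broadcast reaches every subscribed connection.
server.broadcast("chat:42", "hello")
puts received.length # => 3
```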
To make Vito even better equipped to handle load, we threw imgproxy into the mix; this dedicated image processing server cut loading times for media-rich content and further optimized the platform for high traffic.
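imgproxy processes images on the fly, driven entirely by the request URL: the path encodes the resize options and the source image, and is signed with an HMAC so clients cannot request arbitrary transformations. A minimal sketch of imgproxy's documented URL-signing scheme in plain Ruby (the key, salt, host, and image URL below are placeholders):

```ruby
require "openssl"
require "base64"

# Placeholder hex-encoded key and salt; real values come from the
# IMGPROXY_KEY and IMGPROXY_SALT environment variables.
key  = ["736563726574"].pack("H*")
salt = ["68656c6c6f"].pack("H*")

# Processing options (fill-resize to 300x200) plus the source image URL.
path = "/rs:fill:300:200/plain/https://example.com/speaker.jpg"

# Sign salt + path with HMAC-SHA256, base64url-encoded without padding.
digest    = OpenSSL::HMAC.digest("sha256", key, salt + path)
signature = Base64.urlsafe_encode64(digest, padding: false)

puts "https://imgproxy.example.com/#{signature}#{path}"
```

The app only builds URLs like this; imgproxy fetches, resizes, and serves the image itself, keeping that work off the Rails servers.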
By 2022, Vito had already hosted dozens of online conferences and continued to serve customers such as HalfStack, UX London, Smashing Magazine, TSConf (official TypeScript conference), NG Conf, ElixirConf, and more.