After we launched Bunny Stream in 2021, we did something daring. In a world where companies existed solely to sell video transcoding, we decided to change the status quo by introducing a video service that provides a full end-to-end solution for video delivery, where the only fees you pay are the bandwidth and storage costs.
Our users loved it, and Bunny Stream took off. With all that growth, the infrastructure had to scale as well. Today, after hundreds of millions of uploaded videos, Bunny Stream operates a massive cluster of over 500 servers working away 24/7 and continues to grow every month.
Free shouldn’t mean subpar
We launched free encoding not just to help users reduce costs, but also to provide a seamless end-to-end experience without all of the traditional headaches that usually come with video delivery at scale. In a world where “free” often comes with compromises, we were determined to deliver a service that remained exceptional without cutting corners on quality.
Yet, despite all of the new hardware we kept adding, we were regularly reminded that encoding video is simply a very resource-intensive process, as we occasionally ran into long processing queues. To address that, we worked year after year on new optimizations. From smart video queuing and encoding optimizations to parallel processing and better error handling, we gradually pushed performance forward.
But today, as the title implies, we’re excited to share something much bigger.
Twice the performance, zero the cost
Thanks to two major updates that we released in September and October, we have just doubled the transcoding capacity of Bunny Stream! These updates include a significant overhaul of our hardware and a crucial optimization of our video storage system. Together, they’ve resulted in a massive improvement in the video processing pipeline.

For you, this means your videos now process twice as fast as they did just a few months ago, and best of all, this comes at the usual price of free.
Here’s how we did it.
Distributed storage load balancing
For a while, we noticed that transcoding speeds weren’t scaling linearly with newly installed hardware. It turned out that with all of the computing power at our disposal, we were able to produce such high amounts of data that the bottleneck became the video storage system itself.
Bunny Storage, similarly to encoding, runs in large clusters of hundreds of servers per region. While uploading a file, these massive clusters work together to find a good destination for the upload. During this process, we take many factors into account, including server load, connection count, and more, which allows us to efficiently write tens of Gbits of data per second across the cluster, but even that wasn’t quite enough for the loads Bunny Stream was producing.
When debugging slow areas of video encoding, we realized some encoded chunks were uploading slower than others and sometimes literally slowed to a crawl. After a lengthy investigation, we found that the load-balancing logic had one major flaw while trying to account for all these metrics: all of the servers were performing this logic independently. In some situations, this led to many API servers deciding to write to the same optimal servers at the same time, which quickly slowed things down.
To address this, we introduced a new, distributed load-balancing system that runs on each of our storage servers. Originally developed for close-to-realtime global security information synchronization on our CDN, this system is now built into storage as well. It allows us to share real-time data across every Bunny Storage API server and communicates routing decisions to every other API server, significantly improving the load-balancing process. With this update, the cluster now operates as a single distributed load balancer, making significantly better decisions than tens of servers acting independently.
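To make the idea a little more concrete, here is a minimal sketch of the concept rather than our production code. The node names, scoring weights, and the `ClusterView`/`PickDestination` helpers are invented for illustration; the point is simply that each routing decision is recorded in a shared cluster-wide view, so the next upload (on any API server) no longer piles onto the same “optimal” node.

```go
// Illustrative sketch only: choosing an upload destination from a shared,
// cluster-wide view of real-time metrics instead of each API server
// deciding in isolation. All names and weights are hypothetical.
package main

import (
	"fmt"
	"sync"
)

// NodeMetrics is a hypothetical snapshot of one storage server's state,
// kept in sync across API servers in near real time.
type NodeMetrics struct {
	Name          string
	Load          float64 // normalized server load, 0..1
	Connections   int     // active write connections
	PendingWrites int     // writes already routed there but not yet finished
}

// ClusterView holds the shared picture of the cluster. In a real system this
// would be maintained by a replication/gossip layer; here it is an in-memory
// map guarded by a mutex.
type ClusterView struct {
	mu    sync.Mutex
	nodes map[string]*NodeMetrics
}

// score ranks a node: lower is better. The weights are made up for the sketch.
func score(n *NodeMetrics) float64 {
	return n.Load*100 + float64(n.Connections) + float64(n.PendingWrites)*10
}

// PickDestination chooses the best node and immediately records the routing
// decision, so the next caller sees the updated picture instead of every
// API server converging on the same node at the same time.
func (c *ClusterView) PickDestination() *NodeMetrics {
	c.mu.Lock()
	defer c.mu.Unlock()
	var best *NodeMetrics
	for _, n := range c.nodes {
		if best == nil || score(n) < score(best) {
			best = n
		}
	}
	if best != nil {
		best.PendingWrites++ // share the decision, not just the metrics
	}
	return best
}

func main() {
	view := &ClusterView{nodes: map[string]*NodeMetrics{
		"storage-01": {Name: "storage-01", Load: 0.20, Connections: 40},
		"storage-02": {Name: "storage-02", Load: 0.30, Connections: 25},
		"storage-03": {Name: "storage-03", Load: 0.10, Connections: 62},
	}}

	// Four uploads arriving back to back: because each pick also updates the
	// shared view, the writes spread across nodes rather than all hitting the
	// single lowest-scoring server.
	for i := 0; i < 4; i++ {
		fmt.Println("upload", i, "->", view.PickDestination().Name)
	}
}
```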
This change in load balancing resulted in up to a 10X improvement in file writes during stressful situations and massively improved not just Bunny Stream, but the ability of Bunny Storage to write data fast. Really fast.
Consequently, this had a significant impact on the whole encoding pipeline, almost instantly increasing encoding throughput by as much as 2X once the update went into production.

Switching to the latest Intel-based CPUs
The second piece of the performance puzzle was the encoding hardware.
Historically, Bunny Stream was powered by AMD Ryzen 5 3600 CPUs. Thanks to our optimized hardware stack, these CPUs offered an excellent price-to-performance ratio and the ability to encode in a highly parallelized environment. However, as technology continues to evolve, we’re constantly evaluating different hardware configurations in order to continually hop ahead.
In October, we made the switch. After rigorous testing and a major hardware investment, we kickstarted an ambitious new project to completely overhaul our transcoding cluster and move to the latest Intel i5-13500 CPUs. While this took a significant amount of resources, we doubled down with the goal of offering an exceptional service without relying on compromises.
The result? A staggering 60% increase in transcoding speed.

Building a better way to deliver online video
We built Bunny Stream as a platform to provide a better way to deliver online video, and for us, that mission never ends. We’re excited to be able to share a glimpse into how we’re making Bunny Stream better every day, continually pushing the boundaries of performance to provide exceptional user experiences without all of the complexities.
We have exciting plans ahead for Bunny Stream, and if you’re passionate about video and streaming and would like to help make them a reality, make sure to check out our careers page.
Dejan Grofelnik Pelzel