Show HN: I took back Video.js after 16 years and we rewrote it to be 88% smaller
What do you do when private equity buys your old company and fires the maintainers of the popular open source project you started over a decade ago? You reboot it, and bring along some new friends to do it.

Video.js is used by billions of people every month, on sites like Amazon.com, LinkedIn, and Dropbox, and yet it wasn't in great shape. A skeleton crew of maintainers was doing its best with a dated architecture, but it needed more. So Sam from Plyr, Rahim from Vidstack, and Wes and Christian from Media Chrome jumped in to help me rebuild it better, faster, and smaller.

It's in beta now. Please give it a try and tell us what breaks.
407 points by Heff - 78 comments
Video handling on the web is still surprisingly painful in 2026 -- between codec fragmentation, adaptive bitrate, and accessibility requirements. Having a maintained, lightweight player that handles the hard parts is genuinely valuable. Looking forward to trying this on a couple of projects where I am currently using a bloated custom setup.
I had one question I couldn't answer reading the site: what makes this different from the native html video element?
AFAICT just the transport controls?
Generally, the video tag is great and has come a very long way from when Video.js was first created. If the way you think about video is basically an image with a play button, then the video tag works well. If at some point you need Video.js, it'll become obvious pretty quick. Notable differences include:
* Consistent, stylable controls across browsers (browsers each change their native controls over time)
* Advanced features like analytics, ABR, ads, DRM, 360 video (not all of those are in the new version yet)
* Configurable features (with browsers UIs you mostly get what you get)
* A common API to many streaming formats (mp4/mp3, HLS, DASH) and services (YouTube, Vimeo, Wistia)
Of course many of those things are doable with the video tag itself, because (aside from the iframe players) video.js uses the video tag under the hood. But to add those features you're going to end up building something like video.js.
(And why does that matter? Dynamic bitrate adjustment. The chunks are slightly easier to cache as well.)
P.S. I built a movie streaming and TV broadcasting player for the country of Georgia, supporting environments from 2009 LG Smart TVs to modern browsers.
That means when you're encoding the downscaled variants, the encoder wants to know the segment duration so it can insert those IDR frames at segment boundaries. Therefore it's common to do the encoding and segmentation in a single step (e.g. with ffmpeg's "dash" formatter).
You can have variable-duration or fixed-duration segments. Supposedly some decoders are happier with fixed-duration segments, but it can be fiddly to get the ffmpeg settings just right, especially if you want the audio and video to have exactly the same segment size (here's a useful little calculator for that: https://anton.lindstrom.io/gop-size-calculator/)
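To make the fixed-duration case concrete, here's a sketch of the arithmetic such a calculator performs (the parameters are my assumptions: 30 fps video and 48 kHz AAC audio, where each AAC frame carries 1024 samples; nothing here is tied to a specific encoder):

```javascript
// Find the shortest segment duration that contains a whole number of
// both video frames and audio frames, so fixed-duration segments can
// align exactly across the audio and video tracks.

const gcd = (a, b) => (b === 0 ? a : gcd(b, a % b));

function alignedSegmentSeconds(fps, sampleRate, samplesPerFrame) {
  // Video frame duration is 1/fps s; audio frame duration is
  // samplesPerFrame/sampleRate s. We need the LCM of the two fractions:
  // lcm(a/b, c/d) = lcm(a, c) / gcd(b, d), with fractions reduced first.
  const g = gcd(samplesPerFrame, sampleRate);
  const [cr, dr] = [samplesPerFrame / g, sampleRate / g];
  const lcmNum = (1 * cr) / gcd(1, cr);
  const gcdDen = gcd(fps, dr);
  return lcmNum / gcdDen; // in seconds
}

const unit = alignedSegmentSeconds(30, 48000, 1024); // 8/15 s ≈ 0.533 s
// Pick the multiple of that unit closest to a ~2 s target segment:
const segment = Math.round(2 / unit) * unit; // 32/15 s ≈ 2.133 s
const gopFrames = Math.round(segment * 30);  // 64 video frames per segment
console.log(unit, segment, gopFrames);
```

So with these assumed parameters you'd configure a GOP (keyframe interval) of 64 frames and segments of ~2.133 s, and both tracks land on exact segment boundaries.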
For hosting, a typical setup would be to start with a single high-quality video file, have an encoder/segmenter pipeline that generates a bunch of video and audio chunks and DASH (.mpd) and/or HLS (.m3u8) manifests, and put all the chunks and manifests on S3 or similar. As long as all the internal links are relative they can be placed anywhere. The video player will start with the top-level manifest URL and locate everything else it needs from there.
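For illustration, a minimal HLS top-level (multivariant) manifest with relative links might look like this (paths and bitrates are made up):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
720p/index.m3u8
```

Each variant's `index.m3u8` then lists its segments relative to itself, so the whole tree can be dropped onto any static host and the player only ever needs the top-level URL.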
The simplest option is to use some basic object storage service and it'll usually work well out of the box (I use DO Spaces with built-in CDN, that's basically it).
1. No playback rates under 1
2. No volume rocker on mobile
3. Would appreciate having seek buttons on mobile too
4. No (easily apparent) way to add an accent color, stuck with boring monochrome
5. Docs lacked clear example/demo/playground so I wasn't sure what it would look like until implemented
We learned some tough lessons with media-chrome[1] and Mux Player, where we tried to just write web components. The React side of things was a bit of a thorn, so we created React shims that provided a more idiomatic React experience and rendered the web components...which was mostly fine, but created a new set of issues. The reason we chose web components was to not have to write framework-specific code, and then we found ourselves doing both anyway.
With VJS 10 I think we've landed on a pretty reasonable middle ground. The core library is "headless," and then the rendering layer sits on top of it. The benefit is true React components and nice web components.
[1] https://github.com/muxinc/media-chrome
If you mean "why do I need React / any kind of bundling; why can't I just include the minified video.js library as a script tag / ES6 module import?" — I'm guessing you can, but nobody should really want to, since half the point here is that the player JS backing the custom elements is now way smaller, because it gets tree-shaken down to just the JS required for the particular combination of custom elements you happen to use on your site. Doing that requires that, at "compile time", the tree-shaking logic can trace the references from your views into the components of the player library. That's currently possible when your views are React components, but not yet possible (AFAIK) when your view is ordinary HTML containing HTML Custom Elements.
I guess you could say, if you want to think of it this way, that your buildscript / asset pipeline here ends up acting as a web-component factory to generate the final custom-tailored web-component for your website?
Hope this new iteration is exceptionally successful.
I hope the plugin directory gets an overhaul too, and a prominent place on the webpage. The plugin ecosystem was a huge benefit of Video.js for me.
Even though some of them are outdated, they were a good source of inspiration.
https://github.com/videojs/v10/discussions
Some background: our store[1] which was inspired by Zustand[2] is created and passed down via context too. This is the central state management piece of our library and where we imagine most devs will build on for extending and customizing to their needs.
Updates are handled via simple store actions like `store.play()`, `store.setVolume(10)`, etc. Those actions are generally called in response to DOM events.
On the events side of things, rather than registering event listeners directly, in v10 you'd subscribe to the store instead. Something like `store.subscribe(callback)`, or in React you'd use our `usePlayer`[3] hook. The store is the single source of truth, so rather than listening to the underlying media element directly, you're observing state changes.
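For those following along, the pattern above can be sketched as a minimal Zustand-style store. This is a hand-rolled illustration: the `play`, `setVolume`, and `subscribe` names mirror the ones mentioned above, but the real v10 store's shape may differ.

```javascript
// Minimal sketch of a Zustand-style media store: state lives in one
// object, actions update it, and UI code subscribes to state changes
// rather than listening to the <video> element directly.
function createMediaStore() {
  let state = { paused: true, volume: 1 };
  const listeners = new Set();

  const setState = (partial) => {
    state = { ...state, ...partial };
    listeners.forEach((fn) => fn(state));
  };

  return {
    getState: () => state,
    subscribe: (fn) => {
      listeners.add(fn);
      return () => listeners.delete(fn); // returns an unsubscribe function
    },
    // Actions, typically called in response to DOM events:
    play: () => setState({ paused: false }),
    pause: () => setState({ paused: true }),
    setVolume: (v) => setState({ volume: v }),
  };
}

// Usage: observe store state instead of wiring up media element listeners.
const store = createMediaStore();
store.subscribe((s) => console.log('paused:', s.paused, 'volume:', s.volume));
store.play();
store.setVolume(0.5);
```

The store being the single source of truth is what makes the rest composable: any component (React or web component) renders from the same state and dispatches the same actions.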
---
So far with v10 we haven't been thinking about "plugins" in the traditional sense either. If I had to guess at what it would look like, it'd be three things:
1. Custom store slices[4] so plugins can extend the store with their own state and actions
2. A middleware layer that plugs into the store's action pipeline so a plugin could intercept or react to actions before or after they're applied, similar to Zustand middleware, or even in some ways like Video.js v8 middleware[5]
3. UI components that plugins can ship which use our core primitives for accessing the store, subscribing to state, etc.
I believe that'd cover the vast majority of what plugins needed in v8. We haven't nailed down the exact API yet but that's the direction we're leaning towards. We're still actively working on both the library and our docs so I don't have somewhere I can link to for these just yet (sadly)! We're likely targeting sooner, but GA (end of June) is the deadline.
I should also add... one thing we prototyped early on that may return: tracking end-to-end requests through the store. A DOM event triggers a store action like play, which calls `video.play()`, which then waits for the media event response (play, error, etc.). It worked really well and lines up nicely with the middleware direction.
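A toy sketch of the slice idea from point 1, following Zustand's slices pattern (to be clear, the `createStore` shape here is entirely hypothetical, not a v10 API):

```javascript
// Hypothetical: a plugin contributes a "slice" of state and actions
// that gets merged into the core store alongside the built-in slices.
function createStore(...slices) {
  let state = {};
  const listeners = new Set();
  const setState = (partial) => {
    state = { ...state, ...partial };
    listeners.forEach((fn) => fn(state));
  };
  const api = {
    getState: () => state,
    subscribe: (fn) => (listeners.add(fn), () => listeners.delete(fn)),
  };
  // Each slice seeds its own state and returns its actions.
  for (const slice of slices) {
    Object.assign(api, slice(setState, api.getState));
  }
  return api;
}

// A core playback slice.
const playbackSlice = (set) => {
  set({ paused: true });
  return { play: () => set({ paused: false }) };
};

// A plugin's slice: e.g. an analytics plugin tracking play counts.
const analyticsSlice = (set, get) => {
  set({ playCount: 0 });
  return { recordPlay: () => set({ playCount: get().playCount + 1 }) };
};

const store = createStore(playbackSlice, analyticsSlice);
store.play();
store.recordPlay();
console.log(store.getState()); // { paused: false, playCount: 1 }
```

The middleware idea in point 2 would then wrap `setState` (or the actions themselves) so a plugin can observe or intercept every action as it flows through.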
[1]: https://github.com/videojs/v10/tree/main/packages/store
[2]: https://github.com/pmndrs/zustand
[3]: https://videojs.org/docs/framework/react/reference/use-playe...
[4]: https://zustand.docs.pmnd.rs/learn/guides/slices-pattern#sli...
[5]: https://legacy.videojs.org/guides/middleware/
Granted, my knowledge on the matter is rather limited, but I had some long-running streams (weeks), and with HLS the playlist became quite large while with DASH the MPD stayed as small as it gets.
HLS also has newer features that address the growing manifest issues you were seeing. [2]
All that said, I think a lot of people would feel more comfortable if the industry's adaptive streaming standard wasn't completely controlled by Apple.
[1] https://caniuse.com/http-live-streaming
[2] https://www.mux.com/blog/low-latency-hls-part-2
There are no immediate plans to deprecate React Player and I think it holds a special place in the ecosystem, but there will be overlap with video.js v10 and if there's specific features you care about or feel are missing, or if you think we're doing a bad job, please voice it here.
It was a similar story with Vidstack and Plyr, with Mux first sponsoring the projects. That's how I met Rahim and Sam, and how we got talking about a shared vision for the future of players.
We’re taking a new approach to the library with a lot of new concepts, so your feedback would help us a ton during Beta as we figure out what’s working well and what isn’t.
Basically a few kB for CSS and a few kB for a thin "framework" layer that manages attr-to-prop mapping, a simple lifecycle, context, and so on.
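As a rough sketch of what that attr-to-prop layer does (illustrative only, not the actual video.js code): HTML attributes are always strings (or absent), so a thin layer coerces them into typed props according to a small schema.

```javascript
// A declared prop schema: attribute name -> expected prop type.
// The schema and helper name are made up for illustration.
const schema = { muted: 'boolean', volume: 'number', src: 'string' };

function attrToProp(name, value) {
  switch (schema[name]) {
    case 'boolean':
      // Boolean attributes are true when present (even if empty),
      // false when absent (attribute value is null).
      return value !== null;
    case 'number':
      return value === null ? undefined : Number(value);
    default:
      return value === null ? undefined : value;
  }
}

console.log(attrToProp('muted', ''));     // true  (attribute present)
console.log(attrToProp('muted', null));   // false (attribute absent)
console.log(attrToProp('volume', '0.5')); // 0.5
console.log(attrToProp('src', 'a.mp4'));  // a.mp4
```

In a real custom element this kind of helper would run inside `attributeChangedCallback`, keeping the element's typed properties in sync with its markup.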
I'm a one-man operation. On the order of hundreds of videos served a week. All I want is control over my own destiny. If this and a VPS can do that, that'll be amazing. Thank you for doing this.
Might need to consider bandwidth and the usual mitigation against scrapers if you're serving video unauthenticated.
We'll be moving to videojs 10 when it hits GA.
In the meantime, we’re hoping our custom elements will act as a good stopgap. Most frameworks including Svelte support them well, and we’re pouring love into the APIs so they feel good to use regardless of which framework.
If you’re interested in peeking under the hood, architecturally we’re taking a similar approach to TanStack and separating out a shared core from the beginning, but with one added step of splitting out the DOM as well to aid in supporting RN one day.
Did the private equity firm buy the domain videojs.org (i.e. did it take control of the project and you somehow regained control after selling), or was this domain (and the project) always under your control?
https://videojs.org/docs/framework/react/concepts/presets
Throws `Uncaught (in promise) TypeError: AbortSignal.any is not a function` in volume-slider-data-attrs.BOpj3NK1.js
[1]: https://github.com/videojs/v10/issues/1120
https://github.com/Qbix/Platform/blob/main/platform/plugins/...
We already use video.js, and our framework is used all over the place, so we'd be the perfect use case for you guys.
How would we use video.js 10 instead, and for what? We would like to load a small video player, for videos, but which ones? Only mp4 files, or can we somehow stream chunks via HTTP without setting up ridiculous streaming servers like Wowza or Red5 in 2026?
What are you supporting today that requires Wowza or Red5? The short answer is Video.js is only the front-end so it won't help the server side of live streaming much. I'm of course happy to recommend services that make that part easier though.
[1] https://github.com/muxinc/media-elements
So I'm just wondering whether we can do streaming that way, and video.js can "just work" to play the video as we fetch chunks ahead of it ("buffering" without streaming servers, just basic HTTP range requests or similar).