Because Google pretty much forces that. YouTube is a notorious example of them "optimizing" the site in a way that makes it take longer to load in Firefox.
Like most things, it's a bit more nuanced than that.
Google tried getting people excited about Web Components and started shipping a draft spec in Chrome.
Other browser makers didn't like that version of Web Components, but the data and feedback gathered from shipping the early version in Chrome let them work together on a far better version of Web Components.
Unfortunately, YouTube did a big site redesign in the middle of this and ended up using the Chrome-only early version, expecting that other browsers would follow suit and that a polyfill would be good enough in the meantime. (Polyfills are never really good enough, ftr.)
Since then, the library that the YouTube redesign used (Polymer) got updated to use the new Web Components spec everyone likes (and which Firefox implemented), so everyone is currently happy. I think?
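For context, the standardized flavor (Custom Elements v1 plus Shadow DOM v1, which is what Firefox shipped) looks roughly like this; a minimal sketch with a made-up element name, not anything YouTube actually uses:

```typescript
// Minimal sketch of a standard (v1) custom element; "user-card" is a made-up name.
class UserCard extends HTMLElement {
  constructor() {
    super();
    // Shadow DOM v1: attach an isolated DOM subtree to this element.
    const shadow = this.attachShadow({ mode: "open" });
    shadow.innerHTML = "<p>Hello, <slot name='username'>anonymous</slot>!</p>";
  }

  // Standard lifecycle callback: runs when the element is inserted into the document.
  connectedCallback() {
    console.log("user-card connected");
  }
}

// Custom Elements v1 registration; all the major browsers ship this API today.
customElements.define("user-card", UserCard);
```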
Web standardization is a fairly messy process. I think it was after this that Google stopped shipping draft specs to end users and instead started requiring web developers to register with Google for origin trials (you need a special token now to use experimental features). So this situation won't ever repeat (in this way).
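For the curious, an origin trial works by registering your origin with Google, getting a token back, and serving that token to the browser via a meta tag or an Origin-Trial response header. A rough sketch of the header approach in Node; the token value is a placeholder and the tiny server is purely illustrative:

```typescript
import { createServer } from "node:http";

// Placeholder token; real ones come from registering the origin with
// Chrome's origin trial program.
const ORIGIN_TRIAL_TOKEN = "PLACEHOLDER_TOKEN_FROM_REGISTRATION";

createServer((_req, res) => {
  // Chrome looks for this response header (or an equivalent
  // <meta http-equiv="origin-trial"> tag) before enabling the
  // experimental feature for this origin.
  res.setHeader("Origin-Trial", ORIGIN_TRIAL_TOKEN);
  res.setHeader("Content-Type", "text/html");
  res.end("<!doctype html><title>origin trial demo</title>");
}).listen(8080);
```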
Additionally, everyone has finally (thankfully) woken up to the fact that user agent strings are busted as fuck (and also a privacy-leaking nightmare), and is planning on throwing that trash in the bin.

Instead of rejecting entire browsers wholesale, the idea is that web apps will be required to use feature detection (a best practice anyway).
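In practice that means probing for the API itself rather than sniffing the browser name. A minimal sketch (the "user-card" element and the fallback markup are just illustrative):

```typescript
// Feature detection: ask the browser what it supports instead of who it claims to be.
const supportsWebComponents =
  "customElements" in window && "attachShadow" in Element.prototype;

if (supportsWebComponents) {
  // Native Web Components are available, so use them directly.
  document.body.appendChild(document.createElement("user-card"));
} else {
  // Degrade gracefully (or dynamically load a polyfill here) instead of
  // blocking the whole browser based on its user agent string.
  document.body.insertAdjacentHTML("beforeend", "<p>Hello, anonymous!</p>");
}
```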