Why Standards?
The web standards maintained by the W3C and ECMA International are a critical piece of the web's infrastructure. Having a standard for how HTML, CSS, and JavaScript should be interpreted by a web browser means that every browser can display the same webpage in the same way, letting you choose your favorite browser without worrying that your browsing experience will suffer.
This was not always the case. In the early days of the World Wide Web, during a period known as the Browser Wars, Netscape Navigator and Internet Explorer each added features to their browsers that were not defined in the standards, and web developers had to decide which browser to build against.
Even after that point, Microsoft’s Internet Explorer did not fully adopt the standards, forcing web developers to build their web pages to display in a standards-compliant browser (like Firefox, Chrome, or Safari), then modify their design to work within Internet Explorer. Thankfully, even Microsoft grew frustrated with maintaining the non-compliant Internet Explorer, and replaced it with Microsoft Edge.
The web standards also provide a mechanism for adding new features. The process involves all stakeholders, and allows browser manufacturers to ship “experimental” features so that new ideas can be tried out before being standardized. In fact, browser manufacturers often implement different approaches to a potential feature, allowing the approaches to be compared before a final standard is adopted, as the sketch below illustrates.
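A well-known historical example is rounded corners: before the CSS border-radius property was standardized, the major rendering engines shipped their own vendor-prefixed, experimental versions, and developers wrote all of them alongside the eventual standard property. A minimal sketch (the .card class name is purely illustrative):

```css
/* Rounded corners before and after standardization. */
.card {
  -webkit-border-radius: 8px; /* experimental prefix in early WebKit (Safari, Chrome) */
  -moz-border-radius: 8px;    /* experimental prefix in early Gecko (Firefox) */
  border-radius: 8px;         /* the property as finally standardized */
}
```

Each browser simply ignores the properties it does not recognize, so the same stylesheet works across engines while the experiment runs.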
All standards maintained by the W3C are voluntary; there is no mechanism to force a browser manufacturer to support a particular feature. Some features may take a long time to be adopted by all browsers, or may never be adopted at all. For each feature, the MDN Web Docs provide a compatibility table identifying which version (if any) of each major web browser has adopted it.
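Until a feature is supported everywhere, a page can test for it in the browser itself and fall back gracefully. A minimal sketch using a CSS @supports feature query (the .gallery class and the specific layout values are illustrative assumptions):

```css
/* Fallback layout for browsers that lack CSS Grid support. */
.gallery {
  display: flex;
  flex-wrap: wrap;
}

/* Applied only where the browser reports support for display: grid. */
@supports (display: grid) {
  .gallery {
    display: grid;
    grid-template-columns: repeat(3, 1fr);
  }
}
```

Older browsers that do not understand @supports skip the block entirely, so they render the flexbox fallback instead.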