Android added basic support for high dynamic range (HDR) video with the release of Android 7.0 back in 2016. In 2022, you’d be hard pressed to find a phone or TV above the mid-range class without some form of HDR support.

Of course, since there are multiple HDR formats and hardware differences among devices that claim to support HDR, your experience with HDR video may not be the same as everyone else’s. Regardless of your device’s display and chipset capabilities, though, if it runs Android, it’ll have the same problem as every other Android phone: HDR and SDR content don’t blend well together.

If it sounds like I’m making up a problem that doesn’t exist, I’m not. When you’re watching HDR content, most of the time it’s taking up your entire screen. Android, however, has a lot of UI elements that you can interact with while an HDR video is playing, whether that be the status bar, a heads-up notification, a screenshot overlay, in-app captions, etc. Sometimes those UI elements are way brighter than most of the pixels in whatever HDR video you’re watching.

It’s hard to demonstrate this issue through text, so here’s a little experiment: open this YouTube video on a device that supports HDR, enter full screen, and then pull down the status bar. You’ll hopefully see what I’m talking about. If you don’t notice it, then try opening this YouTube video in full screen and turning on subtitles/closed captions. The captions aren’t hard-coded into the video but are instead overlaid on top, and they probably look a lot brighter than the video itself. You’d especially notice this if you turn on subtitles while watching an HDR film with a lot of dark scenes on a screen that can hit 1,000 nits of peak brightness. Maybe you’ve noticed this issue before, but I wouldn’t be surprised if most of you didn’t until I pointed it out.

Despite how niche this problem may seem, Google appears to be tackling it with a new feature they’re calling “SDR dimming.” This feature has been in development for over a year, and it isn’t clear exactly how it works or when it’ll launch. Still, I’ll attempt to explain what I think it does with help from my former colleague Dylan Raga, Display Reviewer for XDA-Developers.

Following the release of Android 12’s source code last year, I discovered a code change titled “make sdr white point do a thing.” The title and description are not very descriptive, to say the least, but a comment in the updated code fortunately explains what the feature is intended to do. When SDR dimming is enabled, “SDR layers should be dimmed to the desired SDR white point instead of being treated as native display brightness,” according to the comment.

To understand what SDR dimming is trying to solve, it’s important to understand how the problem arises in the first place. One of the key benefits of HDR over SDR is the increased dynamic range of the luminance component of a video, hence the “high” in high dynamic range. While most SDR content is mastered for 100 nits of peak luminance, a lot of HDR content is mastered for 1,000 nits of peak luminance or more! However, the average luminance of most HDR content is still well below 100 nits, and it turns out to be very similar to that of most SDR content.

The goal isn’t to make HDR content as a whole way brighter than SDR content, but rather to make certain parts of the video really stand out. You can think of HDR content as being divided into two segments: the SDR segment, which covers between 0 nits and 100 nits, and the highlights segment, corresponding to values above 100 nits.

Thanks for reading Android Dessert Bites, a weekly column that dives deep into the Android platform topics that matter to system engineers, app developers, and power users.
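To make the idea concrete, here is a minimal sketch of what dimming an SDR layer to an SDR white point could look like. This is not Android’s actual implementation; the function and parameter names (`dim_sdr_pixel`, `sdr_white_nits`, `display_peak_nits`) are hypothetical, and the simple 2.2 gamma curve stands in for whatever transfer functions the real compositor uses. The key idea from the code comment is that SDR white (an encoded value of 1.0) gets mapped to the desired SDR white point, e.g. 100 nits, instead of the display’s native peak brightness.

```python
# Hypothetical sketch of SDR dimming during mixed HDR/SDR composition.
# A real compositor works in a linear color space; the gamma 2.2
# decode/encode below is a stand-in for the actual transfer functions.

def srgb_to_linear(v: float) -> float:
    return v ** 2.2  # simple gamma approximation, not the exact sRGB curve

def linear_to_srgb(v: float) -> float:
    return v ** (1 / 2.2)

def dim_sdr_pixel(value: float,
                  sdr_white_nits: float = 100.0,
                  display_peak_nits: float = 1000.0) -> float:
    """Scale an encoded SDR pixel (0.0-1.0) so that SDR white (1.0)
    lands at sdr_white_nits instead of the display's peak brightness."""
    linear = srgb_to_linear(value)
    # Without dimming, SDR white would be rendered at display peak.
    # With dimming, it is scaled down to the desired SDR white point.
    dim_ratio = sdr_white_nits / display_peak_nits
    return linear_to_srgb(linear * dim_ratio)

# A white subtitle pixel on a 1,000-nit display ends up at roughly a
# third of the encoded output range instead of full brightness.
print(round(dim_sdr_pixel(1.0), 3))  # ~0.351
```

Under these assumptions, UI elements like subtitles and the status bar would no longer blast out at the display’s full HDR peak while the video’s own average luminance sits well below 100 nits.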