Subject: Re: [PATCH 0/1] [RFC] drm/fourcc: Add new unsigned R16_UINT/RG1616_UINT formats
On Wed, 29 Jun 2022 14:53:49 +0000
Simon Ser <contact@emersion.fr> wrote:

> On Wednesday, June 29th, 2022 at 16:46, Dennis Tsiang <dennis.tsiang@arm.com> wrote:
>
> > Thanks for your comments. This is not intended to be used for KMS, where
> > indeed there would be no difference. This proposal is for other graphics
> > APIs such as Vulkan, which requires the application to be explicit
> > upfront about how it will interpret the data, whether that be UNORM,
> > UINT, etc. We want to be able to import dma_bufs to create a VkImage
> > with a "_UINT" VkFormat. However, there is currently no explicit mapping
> > between DRM fourcc + modifier combos and "_UINT" VkFormats. One
> > solution is to encode that into the fourccs, which is what this RFC is
> > proposing.
>
> As a general comment, I don't think it's reasonable to encode all of the
> VkFormat information inside DRM FourCC. For instance, VkFormat has SRGB/UNORM
> variants which describe whether pixel values are electrical or optical
> (IOW, EOTF-encoded or not). Moreover, other APIs may encode different
> information in their format enums.

Yeah, do not add any of that information to the DRM pixel format codes.

There is *so much* other metadata you would also need to define beyond
what's already been mentioned, and which bits you need depend entirely
on the API at hand. After the API has defined some parts of the
metadata, the API user has to take care of the remaining parts, like
dynamic range or color space, in other ways.

Besides, when you deal with dmabuf, you already need to pass a lot of
metadata explicitly: the pixel format, width, height, stride, modifier,
etc. It's better to add more of those (like we will be doing in
Wayland, and not specific to dmabuf even) than to try to turn pixel
formats into a huge mess through combinatorial explosion and sometimes
partial, sometimes conflicting image metadata.
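
Purely as an illustration of how much already travels alongside the fd
(an assumed sketch, not any real kernel or winsys struct; all names here
are made up), the importer-side metadata looks roughly like this:

/* Hypothetical sketch of the per-buffer metadata a dmabuf importer
 * already receives explicitly, alongside the fd itself. Illustrative
 * only; mirrors the kind of attributes winsys import paths ask for.
 */
#include <stdint.h>

#define SKETCH_MAX_PLANES 4

struct dmabuf_import_sketch {
        uint32_t drm_fourcc;        /* DRM_FORMAT_* code */
        uint64_t modifier;          /* DRM format modifier */
        uint32_t width, height;     /* in pixels */
        uint32_t num_planes;
        struct {
                int      fd;        /* dmabuf fd for this plane */
                uint32_t offset;    /* bytes from start of buffer */
                uint32_t stride;    /* bytes per row */
        } planes[SKETCH_MAX_PLANES];
        /* Color space, range, EOTF, etc. would have to travel as more
         * explicit metadata like the above, not inside the fourcc.
         */
};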

You can get a glimpse of how much metadata there could be by reading
https://gitlab.freedesktop.org/pq/color-and-hdr/-/blob/main/doc/pixels_color.md

Compare Vulkan formats to e.g.
https://docs.microsoft.com/en-us/windows/win32/api/dxgicommon/ne-dxgicommon-dxgi_color_space_type
and you'll see that while the DXGI color space enumeration is mostly
about other things, I think it also overlaps with Vulkan formats, at
least on the SRGB vs. non-SRGB part.

Btw. practically all buffers you see in use, especially 8 bpc ones, are
almost guaranteed to be "SRGB" non-linearly encoded, but do you ever
see that fact being explicitly communicated?

Then there is the question: if you have an SRGB-encoded buffer, do you
want to read out SRGB-encoded or linear values? That depends on what
you are doing with the buffer, so if you always mapped dmabufs to
Vulkan SRGB formats (or always to non-SRGB formats), then you would
need some other way in Vulkan for the app to say whether to sample
encoded or linear (electrical or optical) values, and whether texture
filtering is done in encoded or linear space, because that makes a
difference too.
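
As a hedged Vulkan sketch of that choice (assuming a 'device' and an
'image' that already exist, and that the image was created with
VK_IMAGE_CREATE_MUTABLE_FORMAT_BIT so views of another format are
allowed): the *same* pixel data can be given a UNORM view or an SRGB
view, and the consumer's pick decides whether sampling and filtering
happen on encoded or linear values.

#include <vulkan/vulkan.h>

static VkImageView make_view(VkDevice device, VkImage image,
                             VkFormat format)
{
        VkImageViewCreateInfo info = {
                .sType = VK_STRUCTURE_TYPE_IMAGE_VIEW_CREATE_INFO,
                .image = image,
                .viewType = VK_IMAGE_VIEW_TYPE_2D,
                .format = format,
                .subresourceRange = {
                        .aspectMask = VK_IMAGE_ASPECT_COLOR_BIT,
                        .levelCount = 1,
                        .layerCount = 1,
                },
        };
        VkImageView view = VK_NULL_HANDLE;
        vkCreateImageView(device, &info, NULL, &view);
        return view;
}

/* Same bytes, two interpretations chosen by the user of the buffer:
 * sampling through the UNORM view returns the stored (sRGB-encoded)
 * values and filters them as-is; sampling through the SRGB view has
 * the hardware decode sRGB to linear and filter in linear space.
 *
 * VkImageView encoded = make_view(device, image, VK_FORMAT_R8G8B8A8_UNORM);
 * VkImageView linear  = make_view(device, image, VK_FORMAT_R8G8B8A8_SRGB);
 */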

IOW, there are cases where the format mapping depends on the user of the
buffer and not only on the contents of the buffer.

Therefore you simply cannot create a static mapping table between two
format definition systems when the two systems are fundamentally
different, like Vulkan and DRM fourcc.
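
If you still tried, the best you could do is a one-to-many table where
the final pick is left to the buffer's user. A hypothetical sketch
(this table does not exist anywhere, it only shows the ambiguity):

#include <stdint.h>
#include <drm_fourcc.h>      /* from libdrm */
#include <vulkan/vulkan.h>

/* Hypothetical, illustrative only: one DRM fourcc admits several
 * VkFormats, and only the importer knows which interpretation it
 * wants (raw, EOTF-decoded, integer, ...).
 */
struct fourcc_vk_candidates {
        uint32_t drm_fourcc;
        VkFormat candidates[3];
};

static const struct fourcc_vk_candidates table[] = {
        { DRM_FORMAT_ABGR8888, { VK_FORMAT_R8G8B8A8_UNORM,
                                 VK_FORMAT_R8G8B8A8_SRGB,
                                 VK_FORMAT_R8G8B8A8_UINT } },
        { DRM_FORMAT_R16,      { VK_FORMAT_R16_UNORM,
                                 VK_FORMAT_R16_UINT,
                                 VK_FORMAT_R16_SFLOAT } },
};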


Thanks,
pq