Subject: Re: [RFC PATCH 00/16] drm/rockchip: Rockchip EBC ("E-Book Controller") display driver
On Wed, 13 Apr 2022 17:19:00 -0500
Samuel Holland <samuel@sholland.org> wrote:

[...]
> Waveform Selection From Userspace
> =================================
> EPDs use different waveforms for different purposes: high-quality
> grayscale vs. monochrome text vs. dithered monochrome video. How can
> userspace select which waveform to use? Should this be a plane property?
>
Or does userspace rather want to select a QoS, like low latency vs. high
quality? Or something that will not change for a longer time, like always
doing full refreshes?

> It is also likely that userspace will want to use different waveforms at
> the same time for different parts of the screen, for example a fast
> monochrome waveform for the drawing area of a note-taking app, but a
> grayscale waveform for surrounding UI and window manager.
>

> I believe the i.MX6 EPDC supports multiple planes, each with their own
> waveform choice. That seems like a good abstraction, but the EBC only
> supports one plane in hardware. So using this abstraction with the EBC
> would require blending pixels and doing waveform lookups in software.
>
The i.MX6 EPDC has one working buffer containing the old+new state of
each pixel, at 16bpp. Then for each update you can specify a rectangle
in an independent 8bpp buffer as a source. For now I am just using a
single buffer, but yes, that construction could be used to do some
multi-plane stuff.
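Just to illustrate the old+new packing I mean, something like the
following. This is not the real EPDC layout, only a sketch with
invented names:

	/* Sketch only: invented layout, two bytes per working-buffer
	 * pixel holding the previous and the target 4-bit gray level. */
	struct wb_pixel {
		u8 old_y4;
		u8 new_y4;
	};

	/* Merge an update rectangle from an independent 8bpp source
	 * buffer into the working buffer (pitches are in pixels here). */
	static void wb_update_rect(struct wb_pixel *wb, unsigned int wb_pitch,
				   const u8 *src, unsigned int src_pitch,
				   unsigned int x, unsigned int y,
				   unsigned int w, unsigned int h)
	{
		unsigned int i, j;

		for (j = 0; j < h; j++) {
			for (i = 0; i < w; i++) {
				struct wb_pixel *p =
					&wb[(y + j) * wb_pitch + x + i];

				p->old_y4 = p->new_y4;
				p->new_y4 = src[j * src_pitch + i] >> 4;
			}
		}
	}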

> Blitting/Blending in Software
> =============================
> There are multiple layers to this topic (pun slightly intended):
> 1) Today's userspace does not expect a grayscale framebuffer.
> Currently, the driver advertises XRGB8888 and converts to Y4
> in software. This seems to match other drivers (e.g. repaper).
>
> 2) Ignoring what userspace "wants", the closest existing format is
> DRM_FORMAT_R8. Geert sent a series[4] adding DRM_FORMAT_R1 through
> DRM_FORMAT_R4 (patch 9), which I believe are the "correct" formats
> to use.
>
Hmm, R = red? That sounds strange. I am unsure whether doing things at
lower bit depths actually helps that much.
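For reference, the software conversion in question is essentially a
luma computation plus nibble packing, roughly like this. The
coefficients are the usual BT.601-style approximation; the function
name and the nibble order are my guesses, not necessarily what the
driver does:

	/* Sketch only: convert XRGB8888 to packed Y4, two pixels per
	 * byte. Assumes an even width; coefficients sum to 256. */
	static void xrgb8888_to_y4(u8 *dst, const u32 *src,
				   unsigned int width, unsigned int height)
	{
		unsigned int x, y;

		for (y = 0; y < height; y++) {
			for (x = 0; x < width; x += 2) {
				u32 p0 = src[y * width + x];
				u32 p1 = src[y * width + x + 1];
				u8 y0 = (((p0 >> 16) & 0xff) * 77 +
					 ((p0 >> 8) & 0xff) * 151 +
					 (p0 & 0xff) * 28) >> 8;
				u8 y1 = (((p1 >> 16) & 0xff) * 77 +
					 ((p1 >> 8) & 0xff) * 151 +
					 (p1 & 0xff) * 28) >> 8;

				/* high nibble = left pixel (a guess) */
				*dst++ = (y0 & 0xf0) | (y1 >> 4);
			}
		}
	}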

> 3) The RK356x SoCs have an "RGA" hardware block that can do the
> RGB-to-grayscale conversion, and also RGB-to-dithered-monochrome
> which is needed for animation/video. Currently this is exposed with
> a V4L2 platform driver. Can this be inserted into the pipeline in a
> way that is transparent to userspace? Or must some userspace library
> be responsible for setting up the RGA => EBC pipeline?

Hmm, we have other drivers with a hardware block doing rotation, but in
those cases it is not exposed as a V4L2 mem2mem device.

On the i.MX6 there is also the PXP, which does RGB-to-grayscale and
rotation and is exposed as a V4L2 device. But it can also be used for
undocumented stuff, like writing to the 16bpp working buffer, so it is
basically similar. I would do those things in a second step and just
get the basic stuff upstreamed first.
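For completeness, the gray-to-dithered-monochrome step that the RGA or
PXP would do in hardware is simple enough to sketch in C as well. This
is a toy 4x4 ordered dither; the matrix, threshold scaling and bit
packing are arbitrary choices here, not what any of those hardware
blocks actually implement:

	static const u8 bayer4[4][4] = {
		{  0,  8,  2, 10 },
		{ 12,  4, 14,  6 },
		{  3, 11,  1,  9 },
		{ 15,  7, 13,  5 },
	};

	/* Toy ordered dither from 8-bit gray to 1bpp monochrome. */
	static void y8_to_mono(u8 *dst, const u8 *src,
			       unsigned int width, unsigned int height)
	{
		unsigned int x, y;

		for (y = 0; y < height; y++) {
			for (x = 0; x < width; x++) {
				/* spread thresholds roughly over 8..248 */
				u8 threshold = (bayer4[y & 3][x & 3] << 4) | 8;
				unsigned int idx = y * width + x;

				if (src[idx] > threshold)
					dst[idx / 8] |= 1 << (7 - (idx & 7));
				else
					dst[idx / 8] &= ~(1 << (7 - (idx & 7)));
			}
		}
	}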

Regards,
Andreas
