Subject: Re: [RFC PATCH 02/14] ASoC: qcom: qdsp6: Introduce USB AFE port to q6dsp

>>> The QC ADSP is able to support USB playback and capture, so that the
>>> main application processor can be placed into lower CPU power modes.
>>> This adds the required AFE port configurations and port start command
>>> to start an audio session.
>>
>> It would be good to clarify what sort of endpoints can be supported. I
>> presume the SOF-synchronous case is handled, but how would you deal with
>> async endpoints with feedback (be it explicit or implicit)?
>>
>
> Sure, both types of feedback endpoints are expected to be supported by
> the audio DSP, as well as sync eps.  We have the logic there to modify
> the audio sample size accordingly.

Did you mean modifying the number of samples per USB frame (or uframe),
so as to change the pace at which data is transferred? If yes, it'd be
the same for Intel.
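
For reference, a minimal sketch (hypothetical helper, not from the
patch) of what that pacing usually boils down to for an async endpoint
with a UAC2-style Q16.16 feedback value: carry the fractional remainder
across packets so the long-term average matches the sink's rate.

#include <linux/types.h>

struct fb_state {
	u32 remainder;	/* accumulated fractional samples, Q16.16 */
};

/* feedback_q16: sink-reported rate in samples per (u)frame, Q16.16.
 * Returns the number of whole samples to queue in the next packet.
 */
static u16 samples_for_next_packet(struct fb_state *s, u32 feedback_q16)
{
	u32 total = s->remainder + feedback_q16;

	s->remainder = total & 0xffff;	/* keep the fraction for later */
	return total >> 16;		/* whole samples this (u)frame */
}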

>>>     static const struct snd_soc_dapm_route q6afe_dapm_routes[] = {
>>> +    {"USB Playback", NULL, "USB_RX"},
>>
>> ... but here RX means playback?
>>
>> I am not sure I get the convention on directions and what is actually
>> supported?
>>
>
> The notation is based on the direction in which the audio data is
> sourced or pushed onto the DSP.  So in playback, the DSP is receiving
> audio data to send, and in capture, it is transmitting audio data it
> has received.  (Although we do not support capture yet.)

ok, it'd be good to add a comment on this convention. Usually RX/TX is
bus-centric.
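
Something along these lines above the routes table would do (sketch
only, exact wording up to you):

/*
 * Note: the RX/TX suffixes are DSP-centric, not bus-centric.  USB_RX
 * is the port on which the DSP receives audio data for USB playback;
 * USB_TX (capture, not yet supported) would be the port on which it
 * transmits audio data received from the USB device.
 */
{"USB Playback", NULL, "USB_RX"},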

>
>>> +struct afe_param_id_usb_cfg {
>>> +/* Minor version used for tracking USB audio device configuration.
>>> + * Supported values: AFE_API_MINOR_VERSION_USB_AUDIO_CONFIG
>>> + */
>>> +    u32                  cfg_minor_version;
>>> +/* Sampling rate of the port.
>>> + * Supported values:
>>> + * - AFE_PORT_SAMPLE_RATE_8K
>>> + * - AFE_PORT_SAMPLE_RATE_11025
>>> + * - AFE_PORT_SAMPLE_RATE_12K
>>> + * - AFE_PORT_SAMPLE_RATE_16K
>>> + * - AFE_PORT_SAMPLE_RATE_22050
>>> + * - AFE_PORT_SAMPLE_RATE_24K
>>> + * - AFE_PORT_SAMPLE_RATE_32K
>>> + * - AFE_PORT_SAMPLE_RATE_44P1K
>>> + * - AFE_PORT_SAMPLE_RATE_48K
>>> + * - AFE_PORT_SAMPLE_RATE_96K
>>> + * - AFE_PORT_SAMPLE_RATE_192K
>>> + */
>>> +    u32                  sample_rate;
>>> +/* Bit width of the sample.
>>> + * Supported values: 16, 24
>>> + */
>>> +    u16                  bit_width;
>>> +/* Number of channels.
>>> + * Supported values: 1 and 2
>>
>> That aligns with my feedback on the cover letter: if you connect a
>> device that can support more than 2 channels, should the DSP even
>> expose this DSP-optimized path?
>>
>
> My assumption is that I programmed the DAIs w/ PCM formats supported by
> the DSP, so I think the ASoC core should not allow userspace to choose
> that path if the hw params don't fit/match.

Right, but the point I was trying to make is that if the device can do
more, why create this DSP path at all?
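
(For reference, the matching the reply relies on falls out of the DAI
declaration; a sketch below with illustrative values only, not the
patch's actual tables. If the offload DAI advertises exactly what the
DSP supports, soc-pcm derives the runtime hw constraints from it, so
e.g. a 4-channel or 32-bit stream fails hw_params on this path.)

#include <sound/soc.h>

static struct snd_soc_dai_driver usb_rx_dai_example = {
	.name = "USB_RX",
	.playback = {
		.stream_name	= "USB Playback",
		/* only what the DSP can actually consume */
		.formats	= SNDRV_PCM_FMTBIT_S16_LE |
				  SNDRV_PCM_FMTBIT_S24_LE,
		.rates		= SNDRV_PCM_RATE_8000_192000,
		.channels_min	= 1,
		.channels_max	= 2,
	},
};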

>
>> Oh and I forgot, what happens if there are multiple audio devices
>> connected, can the DSP deal with all of them? If not, how is this
>> handled?
>>
>
> This is one topic that we left pretty open-ended.  At least in our
> implementation, only one audio device can be supported at a time.  We
> choose the latest device that was plugged in or discovered by the USB
> SND class driver.

Similar case for Intel. I have to revisit this; I don't recall the details.
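
If it helps compare notes, a "latest device wins" policy presumably
reduces to something like this in the vendor hook into USB SND
(hypothetical names throughout):

#include <linux/mutex.h>

static struct snd_usb_audio *offload_chip;	/* current offload target */
static DEFINE_MUTEX(offload_lock);

/* Hypothetical connect/disconnect callbacks: the newest device takes
 * over the single offload path; disconnect only clears the target if
 * that device is still the current one.
 */
static void offload_connect_cb(struct snd_usb_audio *chip)
{
	mutex_lock(&offload_lock);
	offload_chip = chip;
	mutex_unlock(&offload_lock);
}

static void offload_disconnect_cb(struct snd_usb_audio *chip)
{
	mutex_lock(&offload_lock);
	if (offload_chip == chip)
		offload_chip = NULL;
	mutex_unlock(&offload_lock);
}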

>>> +    u32                  dev_token;
>>> +/* endianness of this interface */
>>> +    u32                   endian;
>>
>> Is this a USB concept? I can't recall seeing anything in the USB
>> audio class spec saying that the data can be big- or little-endian.
>>
>
> No, this is probably just something our audio DSP uses on the AFE
> commands that it receives.

ok.
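
For completeness, a sketch of how the quoted config might be filled
from hw_params (field names taken from the patch where quoted; the
ones marked "assumed" are guesses, including the endian encoding):

#include <sound/pcm_params.h>

static void q6usb_fill_cfg_example(struct afe_param_id_usb_cfg *cfg,
				   struct snd_pcm_hw_params *params)
{
	cfg->cfg_minor_version = AFE_API_MINOR_VERSION_USB_AUDIO_CONFIG;
	/* assumes AFE_PORT_SAMPLE_RATE_* are plain Hz values */
	cfg->sample_rate = params_rate(params);
	cfg->bit_width = params_width(params);		/* 16 or 24 */
	cfg->num_channels = params_channels(params);	/* assumed name */
	cfg->endian = 0;	/* assumed: 0 = little-endian to the AFE */
}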

