Hi Pavan,
see my answers below.
>The questions here are sort of general if android happens to use ALSA
>for its HAL (which I agree is not always the case...)
>So most FM chipsets on linux would/should have a V4L2 interface for
>the controls and alsa PCM devices for the audio.
This FM radio proposal does not make any assumptions about whether
ALSA or V4L interfaces are used in the lower layers.
Whether ALSA is used or not is hidden beneath the
AudioHardwareInterface (indicated by the arrow between AudioFlinger
and AudioSource/AudioSink in the figure), and is a design selection by
the underlying platform vendor (through the implementation of the
AudioSource/AudioSink).
Whether V4L is used or not is hidden beneath the interface below the
FM radio handler, and is a design selection by the underlying
platform vendor (through the implementation of the FM radio plugin).
>1. When you say a plugin/fm radio handler, does it mean it would use a
> V4L2 interface?
The reason for the design with an FM radio handler/plugin is to allow
vendors to use different implementations for controlling their FM
radio HW. How it is implemented is abstracted within the FM radio
plugin, so a V4L interface is one possibility, but any other
interface could be used. The interface that the plugin must implement
is the one exposed by the FM radio handler, which is essentially a
mirror of the FM radio API. This interface would then be defined
within Android, similar to the AudioHardwareInterface (if this
proposal becomes part of Android).
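Just to illustrate the idea (all names below are my own invention, not
part of the actual proposal), a vendor plugin interface that mirrors a
minimal FM radio API could look roughly like this, with a V4L-backed
implementation as one possible realization:

```java
// Hypothetical sketch of a vendor plugin interface mirroring an FM radio API.
// All names here are illustrative assumptions, not code from the proposal.
public interface FmRadioPlugin {
    void powerOn();
    void powerOff();
    void setFrequency(int frequencyKHz);   // e.g. 97500 for 97.5 MHz
    int  getFrequency();
    void setMuted(boolean muted);
}

// A vendor could back this with V4L, a proprietary driver, or anything else;
// the layers above only ever see the FmRadioPlugin interface.
class V4lFmRadioPlugin implements FmRadioPlugin {
    private int frequencyKHz;
    private boolean muted = true;

    @Override public void powerOn()  { /* e.g. open /dev/radio0 and issue V4L ioctls */ }
    @Override public void powerOff() { /* release the radio device */ }
    @Override public void setFrequency(int f) { frequencyKHz = f; }
    @Override public int  getFrequency() { return frequencyKHz; }
    @Override public void setMuted(boolean m) { muted = m; }
}
```

The point is only that the handler defines the contract; how the
control path reaches the HW is entirely up to the plugin author.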
>2. Is this interface extendible to designs where we don't have a
>separate pcm device for a FM device, but control using certain mixer
>settings on the default device itself?
The AudioHardwareInterface handles separate devices and does not know
anything about mixer settings. However, the interface is transparent
to how the devices are handled in the layers below it, so in that
sense it is extendible.
>As in what constitutes a Audio Source/Sink ?
The Audio Source/Sink shown in the figures represents the device
realizations which the AudioFlinger communicates with via the
AudioHardwareInterface. They are part of the vendor specific
underlying platform, and their realization is vendor specific.
>3. Is this interface proposal, involve different types of FM on
>android platform architectures such as FM audio data not being routed
>to apps processor at all?
>(kind of related to question 2 & may eliminate recording scenario,
>thereby having sort of a NULL stream on the AudioSource ... )
In this proposal the FM audio data is routed via the Media Player /
AudioFlinger, which means that it is indeed routed to the apps
processor.
It is not directly extendible to an architecture where the audio data
is not routed via the apps processor. However, that does not rule out
the possibility that it can be done. A discussion on how that could be
done is welcome.
>4. Also does this proposal have extensions where I can playback while
>recording, features which phones generally don't have but media
>players tend to have?
This proposal will support simultaneous playback and recording. To
achieve this, we have added a stream-splitting service in
AudioFlinger, so that the Media Player and the Media Recorder can
independently pull data from the same source.
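Conceptually (this is only a sketch in my own words, not AudioFlinger
code), the splitting service just fans each PCM buffer out to every
registered consumer, so a player and a recorder both see the same data:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Illustrative sketch of a stream splitter: several consumers (e.g. a
// player and a recorder) receive the same PCM buffers from one source.
class StreamSplitter {
    private final List<Consumer<short[]>> consumers = new ArrayList<>();

    void addConsumer(Consumer<short[]> c) {
        consumers.add(c);
    }

    // Called with each PCM buffer pulled from the FM audio source.
    void onBuffer(short[] pcm) {
        for (Consumer<short[]> c : consumers) {
            c.accept(pcm);   // every consumer sees the same buffer
        }
    }
}
```

In the real service there would of course be per-consumer buffering and
flow control, but the idea is the same: one pull from the source feeds
both paths.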
>5. FM Tx cases have also been seen in very rare cases to take in
>analog input, as in cases where data is not stored on a music file,
>but analog input which can be from a mic, or an input which is
>connected to another analog output, so those cases also tend to be
>difficult if we imagine AudioSource/Sink to be PCMs right?
I think this question relates to question 3. Because this proposal
routes the FM audio data via the Media Player/AudioFlinger as a PCM
stream, the case you mention is indeed difficult.
Regards,
Ulf
> > https://docs.google.com/leaf?id=0Bx0OR2cNmeUrNmU2MTg4ZjAtNGNmMS00N2Ex...
> > https://docs.google.com/fileview?id=0Bx0OR2cNmeUrM2M5ZTNjMGQtMDk0My00...