doc: format the objects design page

Mostly replacing the lists so they actually render as lists
Peter Hutterer 2021-08-02 12:29:48 +10:00 committed by Wim Taymans
parent 25d15765b7
commit c2fef6caa2


@@ -39,8 +39,8 @@ A **port** is attached on a **node** and provides an interface for input
or output of media on the node. A node may have multiple ports.
A port always has a direction, input or output:
-* Input: it allows media input into the node (in other terms, it is a _sink_)
-* Output: it outputs media out of the node (in other terms, it is a _source_)
+- Input: it allows media input into the node (in other terms, it is a _sink_)
+- Output: it outputs media out of the node (in other terms, it is a _source_)
In an analogy to GStreamer, a _port_ is similar (but not equal) to a
GStreamer _pad_.
@@ -76,22 +76,22 @@ An **endpoint** is a session management object that provides a representation
of user-conceivable places where media can be routed to/from.
Examples of endpoints associated with hardware on a desktop-like system:
-* Laptop speakers
-* USB webcam
-* Bluetooth headset microphone
-* Line out stereo jack port
+- Laptop speakers
+- USB webcam
+- Bluetooth headset microphone
+- Line out stereo jack port
Examples of endpoints associated with hardware in a car:
-* Speakers amplifier
-* Front right seat microphone array
-* Rear left seat headphones
-* Bluetooth phone voice gateway
-* Hardware FM radio device
+- Speakers amplifier
+- Front right seat microphone array
+- Rear left seat headphones
+- Bluetooth phone voice gateway
+- Hardware FM radio device
Examples of endpoints associated with software:
-* Desktop screen capture source
-* Media player application
-* Camera application
+- Desktop screen capture source
+- Media player application
+- Camera application
In most cases an endpoint maps to a node on the media graph, but this is not
always the case. An endpoint may be backed by several nodes or no nodes at all.
@@ -129,14 +129,14 @@ a _use case_.
For example, the "Speakers amplifier" endpoint in a car might have the
following streams:
-* _Music_: a path to play music;
+- _Music_: a path to play music;
the implementation will output this to all speakers, using the volume
that has been configured for the "Music" use case
-* _Voice_: a path to play a voice message, such as a navigation message or
+- _Voice_: a path to play a voice message, such as a navigation message or
feedback from a voice assistant; the implementation will output this
to the front speakers only, lowering the volume of the music (if any)
on these speakers at the same time
-* _Emergency_: a path to play an emergency situation sound (a beep,
+- _Emergency_: a path to play an emergency situation sound (a beep,
or equivalent); the implementation will output this on all speakers,
increasing the volume to a factory-defined value if necessary (to ensure
that it is audible) while muting audio from all other streams at the
@@ -144,10 +144,10 @@ following streams:
In another example, a microphone that can be used for activating a voice
assistant might have the following streams:
-* _Capture_: a path to capture directly from the microphone; this can be used
+- _Capture_: a path to capture directly from the microphone; this can be used
by an application that listens for the assistant's wake-word in order
to activate the full voice recognition engine
-* _CaptureDelayed_: a path to capture with a constant delay (meaning that
+- _CaptureDelayed_: a path to capture with a constant delay (meaning that
starting capturing now will actually capture something that was spoken
a little earlier); this can be used by the full voice recognition engine,
allowing it to start after the wake-word has been spoken while capturing
@@ -157,12 +157,12 @@ Endpoint streams may be mutually exclusive or they may used simultaneously,
depending on the implementation.
Endpoint streams may be implemented in many ways:
-* By plugging additional nodes in the media graph that link to the device node
+- By plugging additional nodes in the media graph that link to the device node
(ex. a simple buffering node linked to an alsa source node could implement
the _CaptureDelayed_ stream in the above microphone example)
-* By using a different device node (ex. different ALSA device on the same card)
+- By using a different device node (ex. different ALSA device on the same card)
that has a special meaning for the hardware
-* By triggering switches on the hardware (ex. modify ALSA controls on the
+- By triggering switches on the hardware (ex. modify ALSA controls on the
same device)
### Endpoint Link
@@ -185,11 +185,11 @@ When this is done, the session manager is asked to create the link using the
provided information.
This mechanism allows stream implementations:
-* to prepare for linking, adjusting hardware paths if necessary
-* to check for stream linking compatibility; not all streams can be connected
+- to prepare for linking, adjusting hardware paths if necessary
+- to check for stream linking compatibility; not all streams can be connected
to all others (ex. streams with media flow in the hardware cannot be linked
to streams that are backed by nodes in the media graph)
-* to provide implementation-specific information for linking; in the standard
+- to provide implementation-specific information for linking; in the standard
case this is going to be a list of _ports_ to be linked in the media graph,
but in a hardware-flow case it can be any kind of hardware-specific detail