Neither of these two articles is about opening the back and front cameras at the same time.
Regarding the first article, please refer to the "Logical and physical cameras" section. A logical camera is a grouping of two or more physical cameras. For example, a phone might want to support a bokeh function. The application needs image depth information to apply the bokeh effect to the output image, and that depth information might be calculated from the output images of two RGB cameras with the same lens facing. These two RGB cameras are the physical cameras. The logical camera on top of them lets the application get the resulting bokeh image without having to care about the detailed bokeh implementation. I believe the physical cameras that belong to a logical camera are usually located on the same side of the device, so the logical camera design is not about opening the back and front cameras at the same time. Also, CameraX does not yet support configuring the output configuration to receive images from the individual physical cameras.
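To illustrate the logical/physical camera relationship, here is a minimal sketch using the Camera2 API (available since API level 28), which does expose the grouping. It enumerates the cameras, checks which ones advertise the logical multi-camera capability, and lists the physical camera IDs underneath each. `context` is an assumed `Context` reference; this is a sketch, not part of the articles:

```kotlin
// Sketch: finding logical cameras and their underlying physical cameras
// with the Camera2 API (API 28+). Assumes `context` is available.
val cameraManager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
for (cameraId in cameraManager.cameraIdList) {
    val chars = cameraManager.getCameraCharacteristics(cameraId)
    val capabilities = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES)
    val isLogical = capabilities?.contains(
        CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_LOGICAL_MULTI_CAMERA) == true
    if (isLogical) {
        // The physical cameras grouped under this logical camera.
        // They normally share the same LENS_FACING value.
        val physicalIds = chars.physicalCameraIds
        Log.d("Camera", "Logical camera $cameraId -> physical cameras $physicalIds")
    }
}
```

This also shows why the grouping is one-sided in practice: the physical IDs reported for a logical camera share the same lens facing.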
The second article explains that multiple streams can be output simultaneously. It mentions the following examples:
Video recording: one stream for preview, another being encoded and saved into a file.
Barcode scanning: one stream for preview, another for barcode detection.
Computational photography: one stream for preview, another for face/scene detection.
When an application supports a camera function, it usually needs multiple streams, and these streams all come from one camera. So this is not about opening the back and front cameras at the same time, either. The streams mentioned above basically map to CameraX's use case types: Preview, ImageCapture, ImageAnalysis and VideoCapture (VideoCapture is still under implementation).
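As a sketch of how those streams map to CameraX use cases, the barcode-scanning example above would bind a Preview and an ImageAnalysis use case to the same camera. Names such as `previewView`, `lifecycleOwner` and `MyBarcodeAnalyzer` are placeholders I am assuming here, not anything from the articles:

```kotlin
// Sketch: two simultaneous streams (preview + analysis) from one camera
// with CameraX. `previewView`, `lifecycleOwner` and `MyBarcodeAnalyzer`
// are assumed to exist in the app.
val cameraProviderFuture = ProcessCameraProvider.getInstance(context)
cameraProviderFuture.addListener({
    val cameraProvider = cameraProviderFuture.get()

    // Stream 1: preview shown on screen.
    val preview = Preview.Builder().build().also {
        it.setSurfaceProvider(previewView.surfaceProvider)
    }

    // Stream 2: frames delivered to an analyzer, e.g. for barcode detection.
    val imageAnalysis = ImageAnalysis.Builder()
        .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
        .build().also {
            it.setAnalyzer(ContextCompat.getMainExecutor(context), MyBarcodeAnalyzer())
        }

    // Both use cases are bound to the same (back) camera.
    cameraProvider.bindToLifecycle(
        lifecycleOwner, CameraSelector.DEFAULT_BACK_CAMERA, preview, imageAnalysis)
}, ContextCompat.getMainExecutor(context))
```

Swapping `imageAnalysis` for an ImageCapture use case would give the photo-taking combination instead; the point is that all bound use cases draw from one camera.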