I don’t really know where to post this question, but I hope I can get some answers from the developers.
From what I’ve learned by doing some research, popular CDN-based live streaming platforms (e.g., twitch.tv) provide recommended encoder settings (resolution, bitrate, fps) for broadcasters who use an advanced software encoder (e.g., OBS, XSplit).
Before going live, the broadcaster is expected to test their upload bandwidth and select one of the recommendations. Once the encoder settings are selected, they can’t be changed during the live stream.
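To make that concrete, here is a rough sketch of what I imagine the selection step looks like; the preset numbers and the 70% headroom are my own assumptions, not Twitch’s or OBS’s actual logic:

# Rough sketch (my own assumption, not any platform's documented logic):
# measure upload throughput once before going live, then pick the highest
# recommended preset that fits into ~70% of the measured bandwidth.

# Hypothetical preset table, loosely modeled on typical recommendations.
PRESETS = [
    {"name": "1080p60", "video_kbps": 6000, "audio_kbps": 160},
    {"name": "720p60",  "video_kbps": 4500, "audio_kbps": 160},
    {"name": "720p30",  "video_kbps": 3000, "audio_kbps": 160},
    {"name": "480p30",  "video_kbps": 1500, "audio_kbps": 128},
]

def pick_preset(measured_upload_kbps: float, headroom: float = 0.7) -> dict:
    """Return the best preset whose total bitrate fits within the headroom."""
    budget = measured_upload_kbps * headroom
    for preset in PRESETS:                      # ordered from best to worst
        if preset["video_kbps"] + preset["audio_kbps"] <= budget:
            return preset
    return PRESETS[-1]                          # fall back to the lowest preset

print(pick_preset(5000)["name"])   # a 5 Mbps uplink -> "720p30"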
On the delivery side, however, adaptive bitrate streaming (DASH, HLS) is widely used to cope with the heterogeneous bandwidth conditions of viewers.
[CDN based live streaming architecture]

<---------------- Ingest Side ---------------->  <----- Delivery Side ----->

                     RTMP                                HLS
[Broadcaster] ----------------> [Media Server] ---> [CDN] ----> viewer 1 (720p)
 constant bitrate                    ABR                  |---> viewer 2 (360p)
                                                          |---> viewer 3 (240p)
My question is:
why don’t live streaming platforms like Twitch provide any bitrate adaptation during ingest to the media server?
Or do they have bitrate control only on their mobile apps?
In my opinion, adaptively changing the bitrate according to the publisher’s bandwidth seems necessary and reasonable in the case of a bad network or bandwidth fluctuations.
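For example, I imagine something along these lines on the publisher side; the encoder and RTMP session objects and their methods are placeholders I made up, not a real library API:

# Purely hypothetical sketch of ingest-side adaptation: if the RTMP send queue
# keeps growing, the uplink can't keep up, so lower the encoder bitrate; if it
# stays nearly empty, slowly probe back up. `encoder` and `rtmp_session` are
# placeholder objects with made-up methods.

MIN_KBPS, MAX_KBPS = 800, 6000

def adapt_bitrate(encoder, rtmp_session, target_kbps: int) -> int:
    queued_ms = rtmp_session.send_queue_duration_ms()   # made-up getter
    if queued_ms > 1000:
        # Congested: back off quickly (drop the target by 25%).
        target_kbps = max(MIN_KBPS, int(target_kbps * 0.75))
    elif queued_ms < 100:
        # Healthy: probe upward slowly (+100 kbps per step).
        target_kbps = min(MAX_KBPS, target_kbps + 100)
    encoder.set_bitrate_kbps(target_kbps)                # made-up setter
    return target_kbps

# Imagined usage, e.g. called once per second from the streaming loop:
#     target = adapt_bitrate(encoder, session, target)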
Is there any bitrate adaptation on the live ingest side that I don’t know of?
I know that real-time video systems (e.g., WebRTC, Hangouts) have their own control logic to deal with congestion and packet loss.
Therefore, I assume that the mobile streaming apps for YouTube Live and twitch.tv have their own bitrate control logic as well.
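My (very simplified) understanding of how that kind of control reacts to loss is roughly the following; the thresholds are from my memory of the WebRTC congestion control drafts, so treat them as approximate:

# Simplified, from-memory paraphrase of a WebRTC-style loss-based controller:
# raise the send rate gently while loss is negligible, hold it under moderate
# loss, and cut it sharply (proportionally to the loss) when loss gets high.

def loss_based_rate(current_bps: float, loss_fraction: float) -> float:
    if loss_fraction > 0.10:
        return current_bps * (1.0 - 0.5 * loss_fraction)   # multiplicative decrease
    if loss_fraction < 0.02:
        return current_bps * 1.05                          # gentle increase
    return current_bps                                     # hold the rate

print(loss_based_rate(2_000_000, 0.15))   # 15% loss -> cut to 1,850,000 bps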
However, I couldn’t find any docs or information about it, nor anything covering the case where broadcasters use an advanced encoder for higher-quality live streaming.