annihalator
loss of 40fps is not that bad - you will still get well over 100fps
keep in mind - games like Apex you play at 1080p, 100% render scale and low settings
if your CPU has to handle more stuff, it can't handle as much of Apex - and you are streaming on top of that
that's why I use NotCPUCores: the game keeps running on the physical cores while OBS with x264 runs on the "weaker" SMT cores (sketched below)
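Not the author's tool, just an illustration: a minimal Python sketch (using psutil) of what that kind of core pinning boils down to. The process names and the even/odd core split are assumptions for a typical 8-core/16-thread CPU; NotCPUCores itself may well do it differently.

import psutil

GAME_EXE = "r5apex.exe"   # hypothetical process name for Apex Legends
OBS_EXE = "obs64.exe"     # hypothetical process name for OBS

physical = list(range(0, 16, 2))  # one logical CPU per physical core (assumption)
smt = list(range(1, 16, 2))       # the SMT sibling threads (assumption)

for proc in psutil.process_iter(attrs=["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(physical)   # game keeps the "full" cores
    elif proc.info["name"] == OBS_EXE:
        proc.cpu_affinity(smt)        # OBS/x264 is pushed onto the SMT threads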
btw - you don't want to stream Apex with AMD AMF (settings for StreamFX provided - StreamFX uses FFmpeg for hardware-accelerated AMD AMF). You still can't play the game on ultra, because AMF also needs some resources, AND the stream quality is 10-15% worse than with x264. So fast, high-motion games like Apex or Counter-Strike should always be streamed with x264 (and maybe NVENC on an RTX 2000 series card or better).
why use StreamFX and not AMD AMFenc directly from OBS?
StreamFX and AMD AMFenc for OBS were both coded by Xaymar.
StreamFX has the benefit that it uses FFmpeg and allows custom settings - with the settings I provided you squeeze out about 5% better quality compared to "standard" AMF.
but as I said - DON'T USE AMF for games like Apex - the bitrate twitch.tv allows is not big enough to make it look good
When I tested -usage=2 with AMF through FFmpeg I lost 10-30fps. And what about this parameter: is -g 0 correct, or does it need to be -g=0? (There is a command-line sketch after this post.) Many people don't want to use x264 because then you also have to use NotCPUCores or Process Lasso, and not all games work correctly with those tools. In Rust, for example, I got MORE CPU load when I used NotCPUCores, lol, kekw.

What a relief to say it: I will never buy an AMD GPU again, and I would not recommend one to others if they can't fix the encoder after all these years. An encoder without B-frame support, OMG. Many people look at which video cards streamers have and buy the same ones - that's good free advertising, but AMD doesn't seem to care. When prices stabilize I will buy myself an RTX and never buy an AMD product again. With the processors it's the same: prices for the new generation are too high! I don't see the point in buying a processor with fewer fps at the same price as Intel. We'll wait for normal prices, sell our AMD GPUs, buy NVIDIA products and never buy AMD again. Thanks, AMD, and bye.
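On the -g question: on a plain ffmpeg command line an option and its value are separate arguments, so it is "-g 0", not "-g=0"; the key=value spelling (e.g. "g=0") is, as far as I know, what StreamFX's custom-settings box expects instead. A minimal sketch of the command-line form with the h264_amf encoder - the input/output paths and the bitrate are made-up placeholders, and the usage value is simply the one quoted above, not a recommendation:

import subprocess

cmd = [
    "ffmpeg",
    "-i", "input.mp4",    # placeholder source
    "-c:v", "h264_amf",   # AMD AMF H.264 encoder in FFmpeg
    "-usage", "2",        # the value the poster tested; option and value are separate arguments
    "-b:v", "6000k",      # placeholder bitrate
    "-g", "0",            # GOP size: "-g 0" on the CLI, "g=0" in a key=value settings field
    "output.mp4",
]
subprocess.run(cmd, check=True)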
Man, AMD GPUs for streaming = trash!!! My friend has an RTX! The quality is very, very nice, no lag - he plays and streams on the same PC, no lag, no stutters!!! And for his streams he uses chat software etc.: the CPU works on the game and the chat tools while the GPU encodes the stream - all fine, all smooth. No B-frames on the new generation of AMD GPUs, haha. AMD GPUs suck! Check Twitch - all the top streamers are on NVIDIA because the AMD encoder really sucks. In 2021 it's not normal to stream games with x264 if you have a single PC. NVIDIA makes a great encoder. If someone asks me whether or not to buy an AMD GPU, my answer is 100% don't buy this trash! Next time I'll pay $50-100 more, but I get a nice encoder! And about drivers: all fine with NVIDIA. All new modern games support NVIDIA more than AMD - Call of Duty etc. I don't see a reason to buy a trash AMD GPU product in 2021. And video editors like Movavi and others don't like trash AMD GPUs either. That's all.
If NVENC eats 1% more CPU and gives me better quality, that's OK! On my desktop I use Windows because I like playing modern games and don't like using Wine etc. on Linux, which eats more CPU, RAM and so on. On VPSes I only use Debian and BSD, so what? I mean, if AMD can't give its customers a decent encoder in 2021, then the quality AMD offers is disrespectful to the viewer. AMD, please stop producing trash GPUs without a proper encoder and make toasters instead, so that people can at least enjoy breakfast.
I've been playing with the settings in OBS for better quality on Twitch, to absolutely no avail. My issue isn't so much game quality, it's the webcam quality. The game usually looks okay no matter what settings I use, but when I encode using the hardware AVC/AMF encoder, my webcam is completely and totally pixelated whenever there is any slight movement in the game. I've gone through and tried the settings you've given to others and they don't work. I usually stream in 720p 60fps.
The reason I use these x264 options is that a few posts ago someone with a 3900X asked for your help. I have adopted those settings for now and tested them myself. A few posts earlier, however, you said that all your recommendations are adapted to the respective hardware.
Do you have a recommendation for me? (Games: currently no FPS titles, rather games like Satisfactory, Anno 1800, The Witcher 3.) However, I do have one request: can you explain why the settings should be used? I am trying, to some extent, to understand why certain settings are applied.
Then I have two more questions:
1) I also tested your suggestions regarding 720p50 / 720p60 with a 4500 bitrate and it didn't look good to me; I may have set something up wrong there. On Twitch I see many people streaming at a 6000 bitrate in 720p60 and the picture looks fine (see the rough calculation below). By the way, I know the Twitch encode settings page, but I wouldn't have guessed that the bitrates given there are to be taken as the exact bitrate for the respective resolution. Am I perhaps misunderstanding something here?
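Just to put rough numbers on that, here is a quick back-of-the-envelope Python snippet comparing the two bitrates as "bits per pixel" - a crude yardstick, not a quality guarantee:

def bits_per_pixel(bitrate_kbps, width, height, fps):
    # bitrate divided by the number of pixels pushed per second
    return bitrate_kbps * 1000 / (width * height * fps)

for kbps in (4500, 6000):
    print(f"720p60 @ {kbps} kbps -> {bits_per_pixel(kbps, 1280, 720, 60):.3f} bpp")

# roughly 0.081 vs 0.109 bits per pixel - the 6000 stream carries about a third more data per pixel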
Diagram of the bandwidth of 720p50. The H-axis shows the height of the resolution, the W-axis the width, and the T-axis the frames per second (or Hertz). Progressive scan is symbolized by the rectangular shape. Everything is in relation to the maximum television bandwidth of 1080p60.
Maybe because I watch everything on my computer (and I assume people who do this are a minority of digital TV watchers) it isn't as good for me, and having double the frame rate is better for TV, but it seems they are using a lot of bandwidth to create an image nearly identical to the one before it.
Also, with Channel 7 and SBS, why do people refer to their HD channel as being 576p and the SD channel as interlaced? New programs such as Lost or Desperate Housewives are all broadcast in progressive scan, and if you take a frame from SD and compare it to a frame from HD, the HD one looks heaps better.
Take for example a game of football. When Chris Judd runs with the ball, because he is running so fast, at 25fps he is a blur. However, at 50fps he appears less blurred and you can see what amazing skill he is showing off. Note that SDTV and 1080i are 50Hz interlaced, whereas 720p is 50Hz progressive.
You're confusing shutter speed and frame rate. At 25fps, if there is enough light, you can wind the shutter speed up to 1/250 or 1/500, and Juddy bursting through a pack and slotting a goal is tack sharp - well, if the camera operator can pull focus quickly enough.
ABC HD doesn't look great during movement (fades count as this) because of the low bitrate, not because of the broadcast frame rate or resolution. AFAIK 720p25 is not an acceptable broadcast resolution/framerate?
When done properly, film-based programming would consist of repeated frames (25 unique frames per second, but each repeated once, so 50 frames per second in total), and since MPEG-2 compresses based on the differences between frames, there is minimal loss. ABC HD at the moment uses bob deinterlacing to convert from 1080i50 to 720p50, so there is some inefficiency, but adjacent frames are still relatively similar. Rest assured, they're doing their best.
Though of course, the first few episodes of Season 1 of Spicks & Specks actually did use 25fps (using a 'drop field effect' ... I should really update the Wikipedia entries that use the wrong terminology).
Whether 25fps is better than 50fps or vice versa really comes down to a director's decision in the end. News and sport are almost always 50fps (or 60fps); films and dramas are almost always 25fps (or 24fps).
It's just that the film industry has such a long tradition of 24fps (plus the fact that it's technically easier to use low frame rates) that higher frame rates haven't yet become commonplace in HD/digital production. There's a very good BBC document that talks about the 'film look' being partly to do with low frame rates, but more importantly about lots of other factors.
You are missing the point. To give 720p50 the images it needs to work, you have to shoot or acquire at 50p. If you take anything shot on SD/HD at 25p or 50i, you will not get the benefit of 720p50; it will only (at best) give you 720p25 (converted). This has nothing to do with shutter speed, but with the electronic frame rate of transmission. No TV station in Australia can shoot, edit and transmit at a true 720p50.
In other words, transmitting a progressive frame derived from a true interlaced source (i.e. not interlaced fields of a film frame, but interlaced fields from different instants in time) is a kludge, and nothing but a kludge.
Actually, I don't think you have paid attention to what I have been saying. I was only replying to the comment that in fast-action sport a player will look less blurred at a higher frame rate, which is false. A higher frame rate results in smoother motion. That is why, when you slow footage down to 50%, the image doesn't blur unless the shutter speed isn't high enough.
Really? That would be interesting. However, it might have zero effect for many things, like dramas shot on film (whether from the US, UK or Australia) or even outside material like sport ... I wonder if these will continue to have all the interlace shimmer they have now. Do you think this 720p50 pipeline means anything beyond studio material?
And why follow the Yanks? We already have a PM who's Bush's lapdog. We're Australian; we should do things our way. It wasn't long ago that if you told the Yanks that PAL is better than NTSC, they'd tell you all this BS about 60Hz.