My ASUS ROG Swift PG27AQ 4K monitor says it is only running at 1080p at 60 Hz. I have an Xbox One X and I am trying to enable 4K mode. The Xbox says that this monitor is not capable of running 4K at 60 Hz, but my monitor can. Does anybody know how to fix this and make my monitor run at 4K? I have tried going into the monitor settings.
You can also choose (which is most probably what you want) to have it run scaled as "Looks like 1080p". This means that the signal sent to the monitor is actually a 4K signal, but user interface elements are scaled to the same physical size as if you were running 1080p. This ensures the system is usable without requiring extraordinarily good eyesight. Note that text is still drawn at the full 4K resolution, which ensures crisp, sharp text that is easier to read. Images and video are also displayed at the higher resolution (if available), which means you get better picture quality on the 4K monitor compared to a 1080p monitor.
In addition to the two scenarios outlined in your question and above, you also have a third option: running the monitor at the 4K non-scaled resolution. This means you'll have a very large amount of screen real estate (i.e. room for lots of windows, content, etc.), but everything will be rendered at a quarter of the physical size, so this is usually only a good choice if you buy a really large monitor (for example when using a 50-inch TV as a monitor).
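The three modes above differ only in the logical resolution the UI is laid out in; the panel always receives the full native signal. A minimal sketch of that relationship (the helper function and the 2x factor are illustrative assumptions, not any OS API):

```python
# Sketch of HiDPI scaling on a 4K panel: the monitor always gets the
# native 3840x2160 signal; scaling only changes the logical (point)
# resolution that UI elements are sized against.

def logical_resolution(native_w, native_h, scale):
    """Return the logical resolution UI elements are laid out in.

    scale = 1 -> non-scaled 4K: maximum real estate, tiny UI
    scale = 2 -> "Looks like 1080p": UI sized as on a 1080p screen,
                 but text/images still rendered with 4x the pixels
    """
    return native_w // scale, native_h // scale

print(logical_resolution(3840, 2160, 2))  # "Looks like 1080p" mode
print(logical_resolution(3840, 2160, 1))  # native, non-scaled mode
```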
I have an RX 580 8GB running Windows 10.
I can choose the 1920x1080 resolution in Windows, but the display goes to 1080i instead of 1080p. It's really weird, because in games it properly runs at 1080p; only on the Windows desktop does it stay at 1080i.
OK, I've found out why I wasn't able to output at 60 fps: I was using HDMI scaling at 4%. But without it the image is larger than my display and I lose the borders.
And no, there's no option to turn off overscan on this TV, and I never needed it on any other device running at 1080p.
That's a freaking nightmare, Jesus Christ.
Question: Would I be better off running the monitor at 1080p (exactly 1/4 the native resolution of 4K), or would I be better off running it at native 4K with the highest render scale the system can tolerate? I have no idea what that is, but I've heard it might be able to do up to the equivalent of 1440p.
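The "exactly 1/4" part of the question checks out arithmetically, and it is why 1080p scales cleanly on a 4K panel: each 1080p pixel maps to an exact 2x2 block of physical pixels, whereas 1440p is a non-integer 1.5x ratio that needs filtering. A quick check:

```python
# 4K UHD has exactly four times the pixels of 1080p, with a clean
# 2x integer ratio on each axis (so 1080p upscales without blur
# when integer scaling is used).
uhd = (3840, 2160)
fhd = (1920, 1080)

assert uhd[0] * uhd[1] == 4 * fhd[0] * fhd[1]     # 4x total pixel count
assert uhd[0] // fhd[0] == uhd[1] // fhd[1] == 2  # 2x per axis -> 2x2 blocks

# 1440p, by contrast, is a non-integer ratio on a 4K panel:
print(3840 / 2560, 2160 / 1440)
```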
Hey Brian!
Thanks, buddy, this is running great. Apart from this, I have two questions. It would be great if you could help me out.
I am working on a college project in which I have to use a 3.5" LCD screen to show different animation clips that are shuffled by input from a PIR sensor.
1. Is it possible to play videos (not necessarily HD) through the RCA/composite video output using omxplayer? (One option I can think of is using an HDMI-to-RCA converter, but let me know if it's possible without it.)
2. How can I use the signal from the sensor to toggle between videos?
For example, if I have video1 and video2, I want video1 to play by default and video2 when the PIR is in the high state.
thanks,
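For question 2, the sensor-driven toggle can be sketched roughly as below. This is not a tested project implementation, just a minimal sketch assuming the RPi.GPIO library with the PIR on BCM pin 4, omxplayer on the PATH (available on Raspberry Pi OS up to Buster), and placeholder file names `video1.mp4`/`video2.mp4`:

```python
import subprocess
import time

# Placeholder clip names - substitute your own files.
VIDEOS = {"idle": "video1.mp4", "triggered": "video2.mp4"}

def select_video(pir_high):
    """Pick which clip should be playing for the given PIR state."""
    return VIDEOS["triggered"] if pir_high else VIDEOS["idle"]

def play(path):
    # omxplayer renders to whatever display output the Pi is configured
    # for (including composite/RCA out), so no extra flag is needed here.
    return subprocess.Popen(["omxplayer", "--loop", path])

if __name__ == "__main__":
    # Hardware loop - only runs on a Pi where RPi.GPIO is installed.
    try:
        import RPi.GPIO as GPIO
    except ImportError:
        GPIO = None
    if GPIO is not None:
        PIR_PIN = 4  # assumed wiring; adjust to your setup
        GPIO.setmode(GPIO.BCM)
        GPIO.setup(PIR_PIN, GPIO.IN)
        current, proc = None, None
        while True:
            choice = select_video(bool(GPIO.input(PIR_PIN)))
            if choice != current:       # state changed: swap clips
                if proc:
                    proc.terminate()
                proc = play(choice)
                current = choice
            time.sleep(0.2)             # poll the sensor ~5x per second
```

Killing and restarting omxplayer on each state change is crude but simple; for seamless switching you would need two players and layer/alpha control, which is beyond this sketch.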
In a perfect world, you want to run all of your games at your monitor's native resolution. I started gaming back when we hooked bulky CRT TVs up to our computers (C-64), and we were happy to play at 320x200. These days, I have multiple 4K and ultrawide monitors, and the difference in graphics quality is amazing. Still, there are plenty of games where even the fastest current hardware simply isn't capable of running a new game at 4K, maximum quality, and 60 fps. Look no further than Control, Gears of War 5, and Borderlands 3 if you want examples.
Depending on the game, it might be possible to play at 4K with a lower quality setting, and the difference between ultra and high settings is often more of a placebo than something you'd actually notice without comparing screenshots or running benchmarks. Dropping from ultra to medium, on the other hand, might be too much of a compromise for some. There are also games like Rage 2 where even going from maximum to minimum quality will only improve frame rates by 50 percent or so.
Back when we all used CRTs, running below your monitor's maximum resolution was commonplace. However, CRTs were inherently less precise and always a bit blurry, so we didn't really notice. I hated dealing with pincushion distortion, trapezoidal distortion, and the other artifacts caused by CRT technology far more than the potential blurriness of running below the maximum resolution.
Consider a simple example: a 160x90 display with a diagonal black stripe running through it. Now try to stretch that image to 256x144. Since 256/160 = 1.6, source pixels no longer map one-to-one onto destination pixels, so the image can't be scaled cleanly, and there are different techniques for dealing with that. One option is nearest-neighbor interpolation; alternatively you can do bilinear or bicubic interpolation. There are pros and cons to each, but all of them look worse than the native image.
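The nearest-neighbor option is simple enough to show in a few lines. A pure-Python sketch (no image library; the tiny 4x4-to-6x6 demo stands in for the 160x90-to-256x144 case, which uses the same math):

```python
def nearest_neighbor_scale(src, dst_w, dst_h):
    """Scale a 2D list of pixels with nearest-neighbor sampling.

    Each destination pixel copies the nearest source pixel. With a
    non-integer ratio like 1.6x, some source pixels get duplicated
    more than others, so a clean diagonal comes out with uneven
    "stair steps" - exactly the artifact described above.
    """
    src_h, src_w = len(src), len(src[0])
    return [
        [src[y * src_h // dst_h][x * src_w // dst_w] for x in range(dst_w)]
        for y in range(dst_h)
    ]

# Tiny demo: a 4x4 image with a diagonal stripe, scaled to 6x6.
img = [[1 if x == y else 0 for x in range(4)] for y in range(4)]
for row in nearest_neighbor_scale(img, 6, 6):
    print("".join("#" if p else "." for p in row))
```

Bilinear and bicubic filters average neighboring pixels instead of copying one, which trades the stair-stepping for softness; either way the result is worse than rendering at the native resolution.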
Additional info: I'm running unRaid v6.1.9 with the Plex application in Docker. Parity + 5 data drives, no cache drive. One of the data drives is entirely for Docker (set to 200GB) and Plex, the rest contain my media.
Media playback is a complicated process for most devices, because there are many codecs for audio, video, and captioning/subtitles. What one media handler prefers may differ from another, especially across the various playback devices. And there's the connection speed and reliability to work around. If the playback device can't handle the specific formats the media file uses, then the server has to convert, or transcode, it: a CPU-intensive task. And if the connection can't support the amount of data, the server will also have to transcode it down to a lower resolution. Clearly your server is not up to the task, or the software it's running is not correctly using the available hardware acceleration.
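The decision the server makes can be sketched as a toy function. The codec names and the bitrate threshold here are illustrative assumptions, not Plex's actual logic:

```python
def playback_plan(media_codec, media_bitrate_mbps, client_codecs, link_mbps):
    """Decide between direct play and server-side transcoding.

    Direct play is cheap (the server just streams bytes); either an
    unsupported codec or an over-budget bitrate forces a CPU-intensive
    transcode on the server.
    """
    if media_codec in client_codecs and media_bitrate_mbps <= link_mbps:
        return "direct play"
    if media_codec not in client_codecs:
        return "transcode (codec not supported by client)"
    return "transcode (bitrate exceeds connection)"

print(playback_plan("h264", 8, {"h264", "aac"}, 25))
print(playback_plan("hevc", 8, {"h264", "aac"}, 25))
print(playback_plan("h264", 40, {"h264", "aac"}, 25))
```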
Hey there! Just got a new Dell 4K monitor (U2720Q). It's currently connected via USB-C to my 2020 13" MacBook Pro. It appears to be running in 1080p by default, which was a bit of a surprise to me. I figured the default would be, yaknow, 4K.
I got an LG 4K TV. I want Kodi to run at 1080p@60Hz in the menus. I set that in the settings, and I also adjust it to the movie I'm watching: if I play a 23.976 Hz movie I switch to that, and then back to 60 Hz when I stop.
You'd better have a serious gaming rig if you want to run 4K, which I would suggest if you are using anything bigger than 32". There is no point in running 1080p on a big screen; you might as well run 1080p on a smaller screen closer to you.
I was able to briefly run Zwift with the same setup above using a Dell PC, also with the same i7 CPU but with a 1060 GPU. It was certainly better than the MBP, but I was still getting some stuttering when running Ultra mode. When I switched to the 1070 Ti, Ultra mode cranks along at 60 fps with absolutely no stuttering of the graphics on the same 4K TV. Seemed pretty noticeable.
The GTX 1650 Super brings GDDR6 to the table versus the non-Super's GDDR5. The new memory sits on the same 128-bit bus, using four 32-bit memory controllers. This update significantly improves memory bandwidth, from 128 GB/s to 192 GB/s. Memory capacity remains the same at 4GB, which today can be a concern for some titles running ultra settings. Running out of VRAM can significantly affect performance, so users will have to be careful with memory-heavy games, especially as time goes on and VRAM requirements increase.
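The bandwidth jump follows directly from the per-pin data rate (8 Gbps for the GDDR5 card, 12 Gbps for the Super's GDDR6) multiplied across the shared 128-bit bus:

```python
def memory_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Peak memory bandwidth: per-pin data rate x bus width, in bytes/s."""
    return data_rate_gbps * bus_width_bits / 8

print(memory_bandwidth_gbs(8, 128))   # GTX 1650, 8 Gbps GDDR5
print(memory_bandwidth_gbs(12, 128))  # GTX 1650 Super, 12 Gbps GDDR6
```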
Rebel Galaxy by Double Damage Games debuted in October on PC to great reviews (including our own), and it's almost ready to launch on PlayStation 4 and Xbox One too. The developers announced this yesterday via Twitter, while an earlier tweet published a few days ago suggested that the PS4 version of Rebel Galaxy would run at 1080p/30fps.