Recently I replaced the GPU in my PC and had a lot of CTDs until I figured out what was wrong.
After switching GPUs, the game would CTD a few seconds after the main menu popped up, with a renderer error.
I had to delete "user.ltx" to be able to play the game again, and with DX9 everything ran fine.
It sounds strange, because DX11 is not a smart thing to enable with this game, but with my rig it helped.
With DX10 I was unable to get a stable game; only the vanilla default "static lighting" renderer worked.
Now, the above settings may or may not run stable on your PC, but I posted them here for reference.
To find good settings for yourself, first set the Sun/shadow options and the SSAO options to off.
Then turn on one setting at a time, check that the game runs stable with it, then turn on another setting, check again that the game is stable, and so on.
This kind of error can also be caused by faulty driver/game settings. If that happens, you will have to delete the "user.ltx" file and start the game with default settings. A drawback of deleting "user.ltx" is that your in-game brightness may end up too high: metal surfaces can reflect too much light. It is a good idea to back up your "user.ltx" before deleting it so you can compare the values with the new "user.ltx" file.
What you need to edit to get normal brightness settings back is explained in Yasti's sticky thread.
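The backup step is easy to script. A minimal sketch in Python; the path in the example comment is a placeholder assumption, substitute your own _appdata_ location:

```python
import shutil
from pathlib import Path

def backup_user_ltx(appdata_dir):
    """Copy user.ltx to user.ltx.bak so the old values can be compared later."""
    appdata = Path(appdata_dir)
    src = appdata / "user.ltx"
    if not src.exists():
        raise FileNotFoundError(f"no user.ltx in {appdata}")
    dst = appdata / "user.ltx.bak"
    shutil.copy2(src, dst)  # copy2 keeps timestamps, handy for telling versions apart
    return dst

# Example (path is a placeholder -- substitute your own install location):
# backup_user_ltx(r"C:\Games\Stalker Call of Pripyat\_appdata_")
```

Run it once before deleting user.ltx; afterwards you can open the .bak next to the freshly generated file and compare values line by line.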
Alright, I don't know why this would fix it, but it seems to be fixed for me, for the time being.
I selected the lowest settings in the config menu, used static lighting, and played. No artifacting. Then I tried maximum graphics with static lighting and got no artifacting either.
Then I tried full DX11 lighting with max graphics and there is still no artifacting. I will update if anything changes.
EDIT:
Just a note about unlimited FPS in the game's main menu. It has been mentioned many times in various forum threads that this X-Ray engine issue can overheat GPUs. The latest generations of cards seem to suffer from this problem more than older ones.
Jasper34 mentioned this in a few posts below and in the bugs thread as well, but it went unnoticed by some people.
To prevent your GPU from overheating, install a third-party application like MSI Afterburner, Xprecision or Nvidia Inspector that limits the frame rate to the maximum your monitor supports, usually around 60, since most LCDs nowadays have a 60 Hz refresh rate.
EDIT:
Jasper34 wrote some good tips to improve game performance if you have a powerful PC with a lot of RAM. You can use that extra RAM to set up a RAM drive, which will outperform any SSD and will save your SSD from degradation over time. More info in his post below.
EDIT:
User deldero also had a CTD on the main menu. It was unrelated to anything mentioned above; in his case it was due to a corrupted save file somewhere inside the savegame folder. The solution is to move or delete all savegame files and try to run the game. More about the whole story can be found in the bugs thread.
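Moving the saves aside (rather than deleting them outright) can also be scripted, so you can later copy files back a few at a time to find the corrupted one. A sketch; the paths in the example comment are assumptions:

```python
import shutil
from pathlib import Path

def quarantine_saves(savegame_dir, backup_dir):
    """Move every save file out of the folder so the game starts with a clean slate.

    The corrupted save can then be found by moving files back a few at a time."""
    src = Path(savegame_dir)
    dst = Path(backup_dir)
    dst.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in src.iterdir():
        if f.is_file():
            shutil.move(str(f), str(dst / f.name))
            moved.append(f.name)
    return moved

# Example (paths are placeholders -- adjust to your install):
# quarantine_saves(r"C:\Games\Stalker Call of Pripyat\_appdata_\savedgames",
#                  r"C:\Games\Stalker Call of Pripyat\_appdata_\savedgames_backup")
```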
Hi, I have problems with graphical settings. I have tried many combinations. Buildings, NPC stalkers, mutants and other non-floral objects look good, but trees are very fuzzy. I don't know what's wrong.
Some AtmosFear effects cause distant objects to look blurred, mimicking a distant fog effect that way. Another reason trees become blurred is if your actor hasn't slept for a while; that is part of the gameplay.
If neither of those is your case, then I really don't have a clue. You could search Google for some advanced settings in the user.ltx file, but I can't say for sure whether anything can be done.
Using most of the above settings I played around and settled on grass at minimum, sun shadows off, anisotropic filtering half-way, switch distance 500, and everything else pretty much maxed out on DX11, and I am running a very steady 60 FPS. I did install a limiter (Nvidia Inspector) to get rid of the 1500 FPS I was pulling on the menu screens and set my max to 60. I'm running super quiet and cool with virtually no stuttering.
While trying to set up a new GPU yesterday I nailed down a couple of things to help new GPU owners and first-time Misery stalkers get their rigs set up. If you want to start fresh with the least demanding and arguably most compatible set of video settings, go to R:\SteamLibrary\SteamApps\common\Stalker Call of Pripyat\_appdata_ (substitute your particular install location) and delete tmp.ltx and user.ltx. When you load the game you will have the base low video options set.

Each time you want to try out a new set of parameters, do the following:
1. Select the Medium, High, or Extreme base settings and apply.
2. Before exiting and restarting the game, adjust anything else you want changed, then apply again.
3. Exit the game, go to the directory above, and delete tmp.ltx.
4. Copy user.ltx into the SAME directory; it will create a second copy.
5. Click once on the second copy and rename it to tmp.
6. Start the game and try your new settings.

Yes, you will have to rebind all your keys and change your non-video options back after deleting both files, but only once. I don't know why, but if you just change options and don't replace tmp.ltx with the changed user.ltx, some or all of your changes will display in game but won't actually be applied.

After doing a bit more research into HDAO vs HBAO I found the following: HDAO (High Definition Ambient Occlusion) is an AMD/ATI technique, and their products, including the ATI-branded CoP, will use HDAO more efficiently than Nvidia products (with less performance drop). DX11 also has a performance efficiency advantage with HDAO.
HBAO is a lower-definition process that works on both brands. Neither is required, and they may be turned off if your performance is inadequate.
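The delete-tmp.ltx / copy-user.ltx steps above lend themselves to a small script you can run after each settings change. A sketch; the Steam path in the example comment is the one from the post and should be adjusted:

```python
import shutil
from pathlib import Path

def sync_tmp_from_user(appdata_dir):
    """Replace tmp.ltx with an exact copy of user.ltx so the applied settings
    match the saved ones."""
    appdata = Path(appdata_dir)
    user = appdata / "user.ltx"
    if not user.exists():
        raise FileNotFoundError(f"no user.ltx in {appdata}")
    tmp = appdata / "tmp.ltx"
    tmp.unlink(missing_ok=True)   # delete the stale tmp.ltx (Python 3.8+)
    shutil.copyfile(user, tmp)    # copy user.ltx under the name tmp.ltx
    return tmp

# Example (substitute your install location):
# sync_tmp_from_user(r"R:\SteamLibrary\SteamApps\common\Stalker Call of Pripyat\_appdata_")
```

Run it with the game closed, after you have applied and saved the settings you want to test.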
I get better performance on DX11 than DX10, plus DX11 doesn't crash here, while DX10 is absolutely unplayable. It crashes every time: sometimes it takes several minutes, sometimes I can play for a couple of hours, but on DX10 it always crashes.
I really don't get why that performance hit happens with Misery and some other mods. Complete is the only one with smooth performance (95% of the time I can get 60 FPS with almost no stuttering), while every other mod has FPS drops when you look toward certain areas, plus stuttering. A lot of stuttering. Almost unplayable.
The performance hit has nothing to do with alife.ltx, because even if you change it to the original or lower values, the game still takes a big performance hit (drops to 40 or 50 FPS). You only get better results for the stuttering.
But there's an obvious CPU bottleneck. I have a Core i5, but the X-Ray engine always tries to push my first CPU core to 100% before starting to use the others. Even with this bottleneck, I don't think the CPU is the problem, because the CPU spikes and the performance hit don't always happen at the same time. It's random.
What I did notice is low GPU usage, all the time. It's very hard to see 100% GPU usage in Misery, though I don't remember how this behaves in Complete or vanilla. Maybe it's the same, I just don't remember. In fact, it's very common to see 0% GPU usage. It happens a lot!
So it's a mystery why Misery's performance is so bad. I don't think 40 FPS is good; for an FPS game, 60 FPS is the target, all the time. And even if you put the graphics on low, you will not hit 60 FPS in the Misery mod.
You do know the X-Ray engine only uses one core, period? The MDT squeezed in a ton of extra features as well as lots of new items and variants, and these all use resources. Brute single-core CPU speed is critical. The faster your hard drive, the better your performance. A higher alife switch_distance coupled with an objects_per_update of ONE will give you drastically reduced stuttering and shorter stutter durations. See my thread on Moddb.com for easy video setup. My R9 290 runs flat-out maxed with only grass density turned down (just because I like it more barren). The first thing to lose if your performance drops off is Sun Rays/Sun Shadows. I found HBAO (low) was much smoother than HDAO; even though our AMD cards run HDAO more efficiently, it's still more load than HBAO. The default alife switch distance in Misery is double that of vanilla/Complete, and double the default is better still if your CPU can keep up. Complete was pretty, but had virtually no new features or difficulty. Even my old GTX 660 ran a smooth 60-80 FPS limited, until Nvidia finally got its drivers to self-limit at 60 FPS. It's really about getting your rig tuned for the game.
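For reference, the two alife values mentioned above live in the game's configs\alife.ltx. A sketch of the relevant fragment; the numbers are illustrative assumptions, not recommendations, and defaults differ between vanilla and Misery, so back the file up before editing:

```ini
; configs\alife.ltx -- illustrative values only
[alife]
switch_distance    = 300  ; radius (m) at which offline NPCs/mutants are switched online
objects_per_update = 1    ; fewer objects processed per update -> shorter, rarer stutters
```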
jasper34 wrote: You do know the X-Ray engine only uses one core, period? [...]
I don't think it's the graphical settings either. Even if you run it at low settings, you will get serious framerate drops sooner or later. Even if you disable SSAO, lower the grass density, turn off sun rays, the sun, anything. It's impossible to get 60 FPS in Misery in my experience, at least with my new R9 285 and my old GTX 660.
What is your CPU core speed? I was consistently able to run limited (as in I would otherwise run higher) anywhere I set Xprecision to cap me, from 80 FPS down to 60, on my GTX 660. Turning things down or disabling advanced features is pointless unless you follow the procedure in my thread; whatever the reason, the changes aren't correctly applied otherwise.

Also, look for a bogus user.ltx inside your Stalker Call of Pripyat folder. Deleting it gets rid of a lot of stuttering too. There should only be the one inside your _appdata_ directory and a duplicate named tmp.ltx in the same place; the two should be exactly the same for proper results.

I was over at my in-laws' looking at one of my legacy hand-me-down systems with only a 3.2 GHz CPU, an old HD 4850 GPU and an SSD. I installed the game and patch, set switch_distance to 500 and objects_per_update to 1, and maxed everything out on static lighting (for some reason the menu didn't offer DX11; maybe the card wasn't capable). The only things turned down were anisotropic filtering and antialiasing. I had to tick the 60 Hz and vertical sync options for it to be compatible with the old LCD monitor. The game runs dead smooth at 75 FPS, dropping to no less than 45-50 when looking at a lot of distant trees. Now I have something to do at my in-laws'.
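Those two checks, whether user.ltx and tmp.ltx really are identical, and whether a stray user.ltx is sitting in the game's root folder, are quick to automate. A sketch; the folder layout follows the post above, and the example path is a placeholder:

```python
import filecmp
from pathlib import Path

def check_ltx_files(game_dir):
    """Report whether _appdata_/user.ltx and tmp.ltx match, and flag a bogus
    user.ltx in the game's root folder."""
    game = Path(game_dir)
    appdata = game / "_appdata_"
    user, tmp = appdata / "user.ltx", appdata / "tmp.ltx"
    report = {"bogus_root_user_ltx": (game / "user.ltx").exists()}
    if user.exists() and tmp.exists():
        # shallow=False compares actual file contents, not just size/mtime
        report["user_tmp_identical"] = filecmp.cmp(user, tmp, shallow=False)
    else:
        report["user_tmp_identical"] = False
    return report

# Example (path is a placeholder -- substitute your own install location):
# print(check_ltx_files(r"C:\Games\Stalker Call of Pripyat"))
```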