I don’t like this line of thinking, especially now that new games seem to constantly suffer from performance issues: it lowers the bar these developers feel they need to meet in the experience they’re offering.
I think the minimum standard should be at least 60fps. On the Steam Deck and other low-end hardware concessions must of course be made, so you either lower graphics settings or live with lower framerates.
But there’s no reason a new game should be suffering poor framerates on modern desktop hardware (looking at you, Dragon's Dogma 2).
Lower frame rates can be perfectly fine, I find I’m far more bothered by inconsistent frametimes.
The main reason 40fps feels fine on the Deck is that the display can drop to that same refresh rate and operate in lockstep.
I’ll take consistent 60 over hitchy 165 most of the time, though VRR means you can occupy kind of a middle ground. But even there frametime inconsistencies can make for a shit experience.
My point is that game developers should aim to deliver games that render at similar framerates throughout.
So many of these recent games do hit decent framerates, but then there’s that one in-game location, enemy type, player ability, or particle effect, that just makes the framerate completely shit itself.
It’s like these studios are designing each element with a given GPU budget, pushing things right up to the limit, and then doing a surprised Pikachu face when things run like shit once they try to put more than one of these elements together to make an actual game.
165 that dips to 100 is unquestionably better than 60 with no dips, especially with GSync.
165 that dips below 60 is very bad.
Yea I was about to say. I play games that stay around 120 on my hardware and dip to maybe 80 sometimes. It’s not that noticeable, especially during action and if the dips aren’t super sudden drops. But 45-60 is noticeable.
That depends. VRR works beautifully when you walk through in-game locations and the framerate smoothly shifts up and down.
What it’s absolutely shit at dealing with is VFX that cause frametimes to spike by an order of magnitude for just a frame or two, something that’s common in games with a lot of player ability and enemy attack effects going off.
In these cases I will actually just turn VRR off, and play at a lower framerate that is consistently achievable.
VRR is nice, and I absolutely do use it most of the time, but its very nature means the latency in the hand-eye-coordination loop becomes inconsistent. When it’s working smoothly, with the framerate shifting gradually, it’s fine.
But the kind of hitching I’m talking about isn’t the kind where the overall framerate shifts, but the kind where just a couple of frames take orders of magnitude longer to render, and that interferes with my hand-eye coordination. I would have been in better shape to pull off a turn, shot, movement or whatever had the game been running at that framerate the whole time.
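The distinction being drawn here — a gradual framerate shift versus a couple of frames taking orders of magnitude longer — can be made concrete. A minimal sketch (function and threshold are my own, not from any profiling tool): flag frames whose frametime blows past a multiple of the median.

```python
from statistics import median

def find_hitches(frametimes_ms, spike_factor=3.0):
    # Flag frames whose frametime exceeds spike_factor times the median --
    # the single-frame spikes described above, as opposed to a gradual
    # shift in the overall framerate (which VRR handles well).
    baseline = median(frametimes_ms)
    return [i for i, ft in enumerate(frametimes_ms)
            if ft > spike_factor * baseline]

# A steady ~6 ms (~165 fps) trace with one 60 ms spike:
print(find_hitches([6.1, 6.0, 6.2, 60.0, 6.1, 6.0]))  # [3]
```

A trace that merely drifts between, say, 8 ms and 12 ms flags nothing, which matches the point: VRR copes with drift, not spikes.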
My point is that game developers should aim to deliver games that render at similar framerates throughout.
Scenes in most games vary a lot in complexity, so the way you’d achieve that is by getting a baseline quite a bit higher than your target FPS, and then capping FPS at that target. This way the game won’t utilize near 100% of the GPU most of the time, but peaks in scene complexity won’t cause FPS to drop below the cap.
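The cap described above is just a frame limiter: render, then sleep off whatever budget is left. A toy sketch (names are my own, not from any engine; `render_frame` is a hypothetical stand-in for the game's variable-cost work):

```python
import time

def run_capped(render_frame, target_fps=60, frames=3):
    # Minimal frame-limiter sketch: do the frame's work, then sleep away
    # the leftover budget so every frame takes roughly the same
    # wall-clock time, even when scene complexity varies.
    budget = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                    # variable-cost work
        elapsed = time.perf_counter() - start
        if elapsed < budget:              # headroom left this frame
            time.sleep(budget - elapsed)  # idle instead of racing ahead

run_capped(lambda: None)  # paces 3 trivial "frames" at a steady 60 fps
```

As long as the heaviest frame stays under the budget, the cadence the player sees never changes — which is exactly the consistency argument being made.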
This is how it works, or at least used to work, for a lot of games on console. On PC, you almost always have to make the choice yourself (which is a good thing if you ask me).
For many games with a lot of changing scenery I have to target around 45 FPS with graphics settings to even have a chance of achieving somewhat consistent 30 FPS/33.33ms on the Deck.
On the one hand the Deck is heavily underpowered compared to even lower-end PCs. On the other hand tests show that the Z1 Extreme/7840U isn’t much faster at these lower wattages (10-15 watts TDP), so there hasn’t been a lot of progress yet.
But it’s also that many games don’t scale so well anymore. I feel like half the settings in many modern games don’t affect performance to any noticeable degree, and even fewer settings affect CPU usage. And if there are low settings, the game often looks unrecognizable, because the lower-setting models, textures and lighting/shadows are simply generated by the engine SDK and rarely given a second thought.
Tech like Nanite brings the potential of maybe solving that variability. But even before that, LODs, detail render distance limits etc. already allow frame rates to be leveled out, if utilized.
And I would consider 30 and 45 within that “similar” range. I’m not asking the framerate to stay within even 10% of an average at all times. But games are getting a lot worse than that.
A recent game even my desktop has been struggling with is Forbidden West, where I tuned the settings to achieve 80-100 fps, yet in some locations (larger settlements) it will chug down to 20-30.
Some newer games aren’t losing just 33% of their fps between best and worst case, but more like 70%. At that point you end up having to target 200fps just to never drop below 60, and that’s tricky even on high-end desktops.
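The arithmetic behind that 200fps figure is simple headroom math — a back-of-the-envelope sketch (function name is my own):

```python
def required_target(floor_fps, worst_drop):
    # Best-case framerate you must sustain so that losing worst_drop
    # (as a fraction of fps) in the heaviest scene still leaves floor_fps.
    return floor_fps / (1.0 - worst_drop)

print(round(required_target(60, 0.70)))  # 200 -- the figure above
print(round(required_target(60, 0.33)))  # 90, a far easier ask
```

A 33% worst-case drop asks for roughly 90fps of headroom; a 70% drop demands 200fps, which is why the wider swings in newer games are so punishing.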
Eh, not caring too much about frame rates can be healthy in terms of how long you go between upgrades. That could have a knock-on effect of forcing devs to consider weaker builds more often, as people don’t upgrade as frequently.
Just depends on how you approach it. If you have relatively new parts, then yeah, you should expect at least 60 FPS. If you have an older system, then the only thing that matters is that it’s still enjoyable
Now that they don’t have to optimize for last-gen console hardware anymore, that’s going to be even rarer for any triple-A game. Even a well-optimized PS5 game is going to seriously struggle to run on the Deck, as even if you reduce the graphical settings, the PS5 essentially has an 8-core version of the 4-core CPU in the Deck.
Combine that with the 15W shared TDP limit and the game would basically have to be able to run using only roughly 25% of the CPU resources.
Yup.
And in the wider PC space, VRAM is a big issue as well.
Everybody keeps acting like every single big game that comes out is “unoptimised” because it uses more than 8GB of VRAM when playing at 1440p or above at ultra settings, and people with 8GB or lower GPUs are struggling.
Are all these games unoptimised? Or is it simply that they’re done targeting consoles with 8GB RAM, are now exclusively focusing on ones with 16GB, and because of Nvidia being notoriously stingy with VRAM, lots of people on PC are suddenly finding their cards falling short?
Thank fuck Valve had the sense to put 16GB of unified memory in the Deck. Could you imagine if they only went with 8GB, or even 12GB?
I’m still a snob and just play games on my steam deck that are able to maintain 60fps. As soon as they offer a 120fps model I’ll probably jump.
Yeah it’ll be Steam Deck 2 or whatever.
It’s because of the screen. Low frame rates are much less impactful on a small screen. When the screen takes up most of your field of view, you notice the frame rate more.
It’s not you, it’s the screen.
It also depends on the type of game. Quick turns at low framerates in an FPS will feel very different from low framerates in a side-scrolling platformer. Third-person games often feel smoother at the same framerate than first-person games, because the camera turns differently.
I think a second factor is a lowered expectation of immersion that comes with playing elsewhere. The author specifically says they can play in the pub waiting for friends or on the train or whatever. You’re not as invested in those situations, so a game is more of a distraction than an experience.
Joke’s on you, I have ADHD so my attention is always laser focused on my dopamine cow, no matter the screen size haha
I agree. Context matters!
- The bigger/closer the screen is, the more you’ll be discomforted by low fps.
- Different genres of games feel different. (Fixed-camera games are less likely to make you dizzy than first-person shooters.)
- I found that input method plays a role too. M&K or gyro input will feel sluggish at low fps, while pure stick input might be fine.
I also don’t have any problems with a consistent 30 fps on my 65"
I’ve been playing some PS1 classics on my PC, and I can safely say, playing FF7 at 240 fps does not meaningfully improve the experience over what it was back in nineteen ninety eight.
back in nineteen ninety eight.
The Undertaker threw Mankind off Hell In A Cell, and plummeted 16 ft through an announcer’s table
I’ve never needed high frame rates. What is more important is consistent frame rate. I’d rather have a consistent 30 than a range of 40-60.
Consistency is nice for all games, but some just don’t play well at low frame rates.
I struggled to get into Dark Souls, but after installing DSfix the effect was transformative and I was able to read the game a lot better.
Same here. I went through applying a 60 fps patch for dark souls 1 (cause I do prefer it) and once I hit a listed bug of getting flung off a ladder I unapplied that shit immediately. It’s not worth it.
Just so people understand: there are bugs that simply make you die, because 60fps and ladders don’t agree with each other.
I just play it at 30 because of this.
I’m getting kinda tired of the slideshow snobs, telling everyone how 30 FPS is enough. The games are supposed to be fun, and not cause nausea. I’m willing to compromise on the former, but not the latter.
I’m not a snob. I’ve just never had a PC good enough to run most games at 60 fps, so I never got accustomed to that level of comfort 😛
It must be impossible for you to watch TV or any movie as they are all recorded at 24 FPS.
Some people easily get motion sickness and it can be aggravated by many factors, including low and/or irregular framerates.
I’d be interested to know if people complaining about motion sickness at low fps have that issue with all games, or only FPS/TPS. And if they have the same issue with “first person” segments in movies (which are pretty damn rare in the first place, and basically always at a very consistent but low framerate)
You don’t control a movie, those two aren’t comparable
I can control a movie. Play, pause, fast forward , rewind…
You must know you are being purposely disagreeable, right? If not, I would love a clear explanation on how you think that is comparable. Educate me.
Sorry, figured it was an obvious joke, but sometimes that doesn’t come across well in text form.
But to look at it from a different angle: GTA 5 on the Xbox 360 and the PS3 sold millions, and typically ran at mid-20s FPS, same as a TV. I don’t recall there being an issue or outcry about it causing motion sickness, and yet with millions in sales, it would have been played by enough people.
Why weren’t such issues claimed for it? Or were there reports and claims of it causing issues that I missed?
There are numerous factors to this. First off, the natural motion blur of film allows the brain to track the information better and gives an illusion of fluidity. Games, on the other hand, render images statically, one by one, often inconsistently. And depending on the motion of the in-game camera, the next frame may be dramatically different. (This is partially why some games can look smooth at ~24fps, while others look choppy even at and past 60fps.)
And while you are right that folks who played GTA IV, and other games that rendered at a usually smooth 24-30 fps, didn’t often complain about motion sickness, this is a biased sample. The reality is that we know frame rates and frame times are linked to motion sickness. This has been a very prevalent problem with VR headsets, in which the proximity to the screen exacerbates any issues. But folks playing GTA IV at the time were not likely to be part of the group that was susceptible to the motion sickness induced by low but consistent frame rates.
Compare that to now, though, where it’s very possible to run games at higher frame rates, which means people who would experience motion sickness at lower frame rates can join everyone else in this glorious hobby. Also, if you’re getting low frame rates on a PC nowadays, they’re more likely to be paired with inconsistent frame times, increasing the choppy feeling.
Fwiw, just Googling “GTA IV causes motion sickness” and adjusting the search date range to '08 to '13 brings up no end of results, including this forum post about GTA IV causing motion sickness for at least one gamer.
To be fair, after getting an OLED TV, I can’t stand 24 FPS content at all. With LCD, the blur between frames is just enough to mask the issue, but on OLED movement gets extremely stuttery, and if you get distracted focusing on it, you can even see the steps in each individual frame. It’s nauseating.
I had to do the unthinkable and enable the less intrusive motion smoothing option on my TV, otherwise I’d straight up get a headache. This does not happen at any higher framerates. And I’m not talking about gaming at all, I mean TV and movie content.
Probably has something to do with the screen size. 30 fps on a small screen is way less exhausting on the eyes than sitting in front of a 27" or bigger screen.
I love the portability of the Steam Deck. Years ago I had a desktop gaming PC that over time I used less and less, because the last thing I wanted to do after sitting at a desk all day was sit at a desk some more.
I bring my deck with me when I travel for work, go on vacation, when my wife and I spend the afternoon at my parents. I’ll even bring it with me if I have to bring my car to the shop and I have to wait. I play games in my recliner, and in bed. I’ve never been a frame rate snob so the steamdeck suits me just fine.
There are a lot of factors that impact tolerance, for example motion blur. I personally hate a lot of post-processing effects, so having motion blur off makes low-fps gameplay pretty jarring.
Currently I’m playing through Pokémon Violet, not on my Switch but via emulation, and the 30fps is really rough, and the 60fps mod has tradeoffs. If I’m complaining about it in emulation on a relatively high-end PC, I can’t even imagine how bad the performance was on native hardware, given that it was one of the game’s biggest complaints.
This depends on the game and the viewing distance plus screen size for me. 30 fps in a TBS game like Civilization is perfectly fine for me, but too slow for an RTS like Total War. 60 fps in Total War works for me on the big living-room TV, but I find it too slow on a desktop monitor. I expect shooters with jumping and fast turns to benefit even more from higher fps than my RTS games, but it’s been years since I played one.
Never really had the privilege of being a frame rate snob. My teenage years were pretty much spent playing games my computer didn’t meet the minimum requirements to run.
Now I am a Steam Deck owner and I am very happy. My only complaints are when games have really buggy UIs, or when controllers have nonsensical key binds you can’t change, like L3 + left D-pad (Path of Exile).
I’ve intentionally kept myself desensitized to higher frame rates over the years. I’ll occasionally go up to 60fps on some titles that really need it, but usually if I play most everything at 30fps I never am bothered by it.
Same, having only budget-build computers tends to make you not pathologically chase high frame rates.
I’ve sort of done the same thing. Most console games are optimized around their ‘quality’ game mode. Many games will have the quality mode be a solid locked 30 fps while performance will be a low res, blurry mess running at an unlocked frame rate between 40-55 fps. I’ll happily play at 30 fps to avoid those issues.
Do not, my friends, become addicted to 60 fps. It will take hold of you, and you will resent its absence!
Don’t get addicted to 60? That’s the standard. We’ve moved on to “don’t get addicted to 144” because that is both easily achievable with moderately affordable hardware and widely supported now.
It’s a play on a Mad Max: Fury Road quote.
I find framerate more noticeable when I’m closer to the screen. Which is why I don’t mind the framerates of my phone/Steam Deck, but on a desktop monitor 60 Hz is not enough
I’ve been solid in my belief that framerates are for suckers. In my experience anything running at 60 fps is fine, probably 30 even. I have to stress myself to really take note of fps in any meaningful way.
I still love my high frame rates, but playing with a controller is much more forgiving in that regard, and I mostly play games with no camera controls on the Deck, so it is perfectly fine.