I’m thinking of moving from my desktop PC to a Framework Laptop 16. I currently have a GeForce RTX 3060, and from what I’ve seen looking at 3DMark Results, the RTX 3060 and RX 7700S have similar performance outside of ray tracing. I’ve been using dual 1080p 60hz monitors and really want to get full use out of the Laptop 16’s 165hz display (if I get one), but I’m concerned that the higher resolution of the Laptop 16’s display will cause games to run at a much lower frame rate if I keep the same graphics settings. I was thinking that I could just drop the resolution back down to 1080p, but I’m seeing online that a lot of monitors struggle with lower resolutions. Does the Laptop 16’s display struggle with that?
The display itself won’t struggle with anything, but there will be interpolation, and the resulting image will be blurrier than at native resolution. How much blur you can tolerate depends on your own visual acuity. Pixel density on the FW16 is almost the same as on the FW12 (slightly higher: 188.7 PPI on the FW16 vs 185.6 PPI on the FW12), so setting a FW12 to 1440x900 gives about the same scaling experience as putting 1920x1200 (the 1080p equivalent if you preserve the aspect ratio) on a FW16. Lines do look noticeably fuzzier, but fonts are still plenty readable (and I set my fonts quite small by most people’s standards). Within a game environment I’d reckon you won’t notice much difference. I used to play Borderlands 2 on a ThinkPad W541, which had a 2880x1620 screen, at 1280x720, and the fuzziness of scaling never bothered me in the heat of a firefight. Also, consider that two 1080p monitors are the equivalent of a single 2560x1620 screen in terms of raw pixel count, so slightly larger than the FW16’s screen.
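If you want to sanity-check those numbers yourself, here's a throwaway Python sketch. The resolutions come from the post above; the 16" and 12.2" panel diagonals are my assumption about the FW16/FW12 screens, not something stated in the thread:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch from resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Panel diagonals (16.0" and 12.2") are my assumption, not from the post.
print(round(ppi(2560, 1600, 16.0), 1))   # 188.7 (FW16)
print(round(ppi(1920, 1200, 12.2), 1))   # 185.6 (FW12)

# Raw pixel counts: two 1080p monitors vs the FW16 panel.
dual_1080p = 2 * 1920 * 1080   # 4,147,200 pixels
fw16_panel = 2560 * 1600       # 4,096,000 pixels
print(dual_1080p > fw16_panel)  # True: dual 1080p is slightly more pixels
print(dual_1080p / 2560)        # 1620.0 -> "equivalent to a 2560x1620 screen"
```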
I played many, many hours of AAA games over 1.5 years on an R7 7840HS with the 7700S.
Consider that AMD has two great technologies to make games look and feel better in GPU-bound scenarios:
RSR, Radeon Super Resolution: gives you driver-level upscaling in ANY game (no support list, no bullshit: any game, new, old, or emulated, will work). You set it in Adrenalin and the game will render at (for example) 1600x1000 or 1080p but will be upscaled on the fly to look closer to native 2560x1600. It doesn’t look as good as native, but way better than just playing at low res. It takes almost no GPU power and gives a big FPS boost in games such as Helldivers 2.
AFMF, AMD Fluid Motion Frames: gives you 2x frame generation in, again, any game: new, old, emulated, etc. This is great because it roughly doubles the FPS of your game. The first versions had big artifacts; it works way better now, with only some minor visual weirdness around fast-moving HUD elements, which you get used to. Overall it works very, very well.
If you combine them, you can take Helldivers 2 on medium settings, render at 1920x1080, upscale with RSR to 2560x1600, turn on AFMF, and get ~140 fps.
It’s the universal “patch” for games without FSR/DLSS; you gain performance essentially for free. Of course, if in-game FSR is available, it’ll look better.
Again, not as good as native res/native FPS, but for a 100W GPU it’s a miracle.
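To make the math behind that ~140 fps claim concrete: in a purely GPU-bound scenario, frame rate scales very roughly with the inverse of the rendered pixel count. This is a crude back-of-envelope model, and the 55 fps baseline below is a made-up illustrative number, not a benchmark of the 7700S:

```python
# Crude GPU-bound model: FPS scales ~inversely with rendered pixel count.
# The 55 fps baseline is illustrative only, not a real measurement.

def estimated_fps(baseline_fps, native_res, render_res):
    """Scale baseline_fps by the ratio of native to rendered pixels."""
    native_px = native_res[0] * native_res[1]
    render_px = render_res[0] * render_res[1]
    return baseline_fps * native_px / render_px

native = (2560, 1600)       # FW16 panel
rsr_render = (1920, 1080)   # what RSR actually renders before upscaling

base = estimated_fps(55, native, rsr_render)
print(round(base))       # 109: fps before frame generation, per this model
print(round(base * 2))   # 217: with an *ideal* 2x AFMF (real gains are lower)
```

In practice scaling is never perfectly linear in pixel count (CPU limits, memory bandwidth, fixed per-frame costs), so treat this as an upper-bound estimate.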
Without getting into the whole conversation about integer scaling: if you want higher framerates, the screen looks fine at 1080p, especially in games with built-in upscaling and to an untrained eye. It’s lower quality, yes, but it’s fine unless you teach yourself to spot the differences. I personally run at native resolution, though, because I find the performance in most games is quite adequate. If you’ve been using 60hz this whole time, I think you’ll have a good experience either way. I have a 42” 144hz 4k monitor hooked up to my desktop with an RTX 3090, but it’s a VA panel, so I’ll often play on my Framework 16 anyway because the FW16’s panel refreshes its pixels so much faster.
Side note: performance-wise, the FW16 seems to get about the same framerate at 1600p as my 3090 does at 4k. That’s not a benchmark score, that’s just me playing games.
Frame generation is not a straightforward 2x of frame rate. First, if you select 2x frame generation, your resulting frame rate will be somewhere in the range of 40-80% higher, not 100%. Second, it only increases the visual smoothness of motion, and at the cost of dropping your base frame rate: the rate at which the world is rendered and your input is processed. You’re basically trading higher input latency for higher perceived smoothness. Also, one side effect of these frame-generation techniques is that they force triple buffering, which is usually something you can enable independently to get lower input latency (though in some scenarios triple buffering can actually increase it).
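The base-rate vs displayed-rate distinction above is easier to see with frame times. A quick sketch with illustrative numbers (the 70→60 fps overhead figure is hypothetical, not a measurement of AFMF):

```python
# Back-of-envelope numbers for the frame-generation tradeoff.
# All figures are illustrative, not measurements.

def frame_time_ms(fps):
    return 1000 / fps

base_fps = 70           # rendered frame rate before enabling frame gen
base_after = 60         # framegen overhead typically drops the base rate a bit
displayed = base_after * 2  # 2x generation applied to that lower base

print(round(frame_time_ms(base_fps), 1))    # 14.3 ms per real frame before
print(round(frame_time_ms(base_after), 1))  # 16.7 ms per real frame after
print(displayed)                            # 120 fps shown on screen
# Input is still sampled at the (now lower) base rate, so the game *looks*
# like 120 fps but *feels* closer to 60 fps, plus any buffering latency.
```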
Seconding everything here. I’ll add that, from personal experience, as long as your starting FPS is high enough (at least 60-100), the latency hit is fine and the perceived smoothness increase does “feel” better.
Sorry, yes. I didn’t mention that your original FPS MUST already be around 60fps or you start getting diminishing returns.
AFMF-wise, in my experience you can’t set it to 2x or 3x or 4x like Nvidia MFG; it’s either on or off (with some minor settings, Search Mode etc.), but it delivers very close to 2x the original FPS. Yes, you get a bit more latency, but Adrenalin’s own overlay used to show me around 10-15ms of extra added latency. Well worth it IMHO.
Of course, I wouldn’t run AFMF on competitive games (CS, Valorant etc), but they are almost always light enough to run at high framerates anyway on the FW16.