Help! If anyone from FW is seeing this, we need more VRAM ;)

I asked the same thing some months ago, and fully agree ;). But note that the laptop would also crash on older titles (a couple of years old). One example is Far Cry 6 with HD textures.

If you want the foremost in graphics hardware, you shouldn’t be gaming on a laptop.

The current graphics card option has 8 GB of VRAM for a 2560x1600 display. Assuming 24-bit color depth with 8 bits for alpha that’s 32 bits = 4 bytes per pixel. 4 B x 2560 x 1600 = 16.384 MB per frame buffer. Assuming two back buffers (for triple buffering), that’s a grand total of 49.152 MB of VRAM that you’ll need for the frame buffers, or 0.6% of 8 GB. That leaves the programmer with 99.4% of those 8 GB for textures, meshes, lighting info, compiled shaders, and other things such as the depth buffer.
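
If you want to sanity-check that arithmetic, here’s a minimal sketch in Python using the same assumptions as above (2560x1600, 4 bytes per pixel, three buffers, decimal megabytes):

```python
# Back-of-the-envelope frame buffer math, using the figures from this post.
WIDTH, HEIGHT = 2560, 1600
BYTES_PER_PIXEL = 4          # 24-bit color + 8-bit alpha = 32 bits
BUFFERS = 3                  # front buffer + two back buffers
VRAM_BYTES = 8 * 1000**3     # 8 GB, decimal units

per_buffer_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 1000**2
total_mb = per_buffer_mb * BUFFERS
print(f"Per frame buffer: {per_buffer_mb:.3f} MB")              # 16.384 MB
print(f"All {BUFFERS} buffers: {total_mb:.3f} MB")              # 49.152 MB
print(f"Share of 8 GB: {total_mb * 1000**2 / VRAM_BYTES:.2%}")  # ~0.61%
```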

Texture size (“texture quality”) and mesh size (“model quality”) are commonly and easily adjusted in a game’s graphics settings. Lighting info size can be adjusted via “shadow/lighting quality”, but blows up if you turn on raytracing. An important thing to keep in mind is that if your card doesn’t have enough VRAM for some settings configuration, it probably doesn’t have enough compute power for those settings anyway. Gaming at 4K with raytracing enabled requires massive amounts of VRAM, but that’s not something this graphics card (or this laptop with its 2K display) was designed to do.

Future titles may very well require more than 8 GB of VRAM, but that’s why the expansion bay allows you to swap out for a newer card if/when the need arises ;) You’re never going to have a completely future-proof option with all the crap (raytracing, AI, etc.) that’s rapidly being foisted upon graphics cards, but 8 GB is plenty for a card in a laptop being delivered Q4 2023.

Go run your favorite game and check how much VRAM it’s actually using! You might be surprised by what a half-decent game developer can do with far less than 8 GB of VRAM. This article, written before the raytracing craze, has reasonable VRAM recommendations.
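
If you’re on Linux with an AMD card, one quick way to peek at this is the amdgpu driver’s sysfs nodes; here’s a minimal sketch, assuming the dGPU is card0 (on hybrid-graphics laptops it may be card1). NVIDIA users can get the same numbers from `nvidia-smi --query-gpu=memory.used,memory.total --format=csv`:

```python
# Sketch: read current VRAM usage from amdgpu's sysfs nodes on Linux.
# Assumption: the dGPU is card0; adjust the index for hybrid setups.
from pathlib import Path

dev = Path("/sys/class/drm/card0/device")
used = int((dev / "mem_info_vram_used").read_text())    # bytes
total = int((dev / "mem_info_vram_total").read_text())  # bytes
print(f"VRAM: {used / 1024**2:.0f} MiB used of {total / 1024**2:.0f} MiB")
```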

As a final note, crashes have nothing to do with VRAM capacity, and everything to do with poorly-written applications/drivers.

EDIT: How awkward… This was originally written in response to a low-quality shitpost and has since been moved over to this discussion, which has more nuance. I will say, however, that if you’re trying to do AI shenanigans, you should prolly look into a proper workstation. AI/ML by nature is a very power-hungry, brute-force way to solve problems, and that kind of thing clearly isn’t what you should design mobile devices for. People are also acting as if eGPUs don’t exist, but throwing unholy amounts of compute at something from a laptop is exactly the use case eGPUs cover. I’m pretty happy that neither Framework nor AMD had AI in mind when they designed the Framework 16’s graphics card – I’d be stuck paying for something I don’t need or want!

9 Likes

You might be overlooking the fact that GPUs in general, and dGPUs in particular, are not used only for gaming and graphics these days. ML tasks can be very VRAM hungry. For example, the recommended VRAM size for Stable Diffusion is 12 to 16 GB. And SD can run relatively efficiently with less memory compared to, say, transformer-based LLMs.
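
To put rough numbers on why ML chews through VRAM so fast: the weights alone cost parameter count times bytes per parameter, before you add activations, latents, or other intermediates. A toy estimate (the parameter counts below are ballpark assumptions for illustration, not exact figures):

```python
# Toy estimate: VRAM for model weights alone = params * bytes per param.
# Parameter counts are ballpark assumptions; real usage is higher once
# activations and intermediate tensors are included.
def weight_vram_gib(params: float, bytes_per_param: int = 2) -> float:
    """Weights-only footprint, assuming fp16 (2 bytes per param)."""
    return params * bytes_per_param / 1024**3

for name, params in [("~1B-param diffusion model", 1e9),
                     ("~7B-param LLM", 7e9)]:
    print(f"{name}: ~{weight_vram_gib(params):.1f} GiB in fp16, weights only")
```

That is roughly why an 8 GB card is workable for Stable Diffusion but painful for even mid-sized LLMs.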

This is also why making sure that the BIOS on the AMD models allows setting the 780M iGPU’s UMA frame buffer to sizes larger than 8 GB is hugely important for making the FW laptops not only a viable option for ML-related work, but easily a preferred one.

3 Likes

I’m not in the market for the 16" system, but I know that some people would like to see a higher end GPU option for it. That card would probably require a separate power supply for the GPU for maximum performance; that would allow the GPU to pull 240W (assuming USB-PD is used), or maybe even a bit more if it can supplement with power from the main system. (One possible way to squeeze out a tad more might be to power the GPU itself from the second supply, but to run the cooling system with system power.) It should also be designed to allow operation with reduced performance without the second power supply.

Although the market for such a card exists, it’s not large. I think Framework will do one eventually but it’s not their highest priority right now; they need to actually ship the 16" system first! They might also consider partnering with a graphics card manufacturer to make that rather than doing it in-house.

2 Likes

I’m completely aware of these details, and I severely disagree with your conclusion.
I’ve run into VRAM limitations multiple times in titles I played this year. I’ve had to upgrade my partner’s GTX 1080 to something with more VRAM just to make it stable in our titles without dropping the frame buffer size until it looks like sludge. The card had plenty of compute left in it, but VRAM is absolutely the reason for its obsolescence. I’ve run the games, scrutinized the frame pacing, and measured memory usage.

We can make assumptions and demands for hours about how developers will use modern APIs correctly, build in proper texture map settings for baked lighting, shadows, and geometry, and implement functioning LOD. But at the end of the day, games ship, developers build what they will, and all that matters is whether our hardware copes with the games we play.
And for that reason, having 8 GB today is unacceptable in a Windows gaming device that is expected to run all sorts of games today, and whatever comes tomorrow.

4 Likes

Go console. The rest of your post says that the hardware isn’t really the concern; it has more to do with software/coding (and I agree with you).

Consoles are such that the game HAS to run on that dedicated hardware, so coding, testing, and the gaming experience are all validated against that device.

3 Likes

That’s fine. I severely disagree with your conclusion.

The solution is to adjust your expectations to realistic levels and for games programmers to optimise their asset use.

1 Like

If you are really interested in the discussion, please also consider the previous thread with several arguments: Help! If anyone from FW is seeing this, we need more VRAM ;) - #19 by Philam

PS: Otherwise, you seem to agree that it’s not enough for high-end use cases (4K gaming with raytracing). Note that there is also lots of other fun stuff you can do, such as in-game resolution scaling or installing HD texture packs, which many people spending 2000 EUR would love to be able to do; it’s not only about raytracing or the general detail settings you mentioned. Raytracing itself has been developed over decades in the quest for pathtracing technology and was also critical to the movie industry, so I’m not sure it is “crap”; that seems to be a subjective opinion only. Finally, there are professional users of course, and many point out they need much more than 8 GB of VRAM.

In this discussion, we should also consider what type of VRAM we are talking about; note that, for example, GDDR6X or similar is much different from previous generations.

On the shipping date: you mentioned the FW16 laptop will be “delivered Q4 2023”. Sorry, but that’s plain wrong for most people. It is the case for some batches, but the vast majority of people who ordered will receive it throughout 2024.

Just some notes, but as I said, this discussion is wildly subjective. I personally think companies like Nvidia made huge mistakes building GPUs with massive computing power that their VRAM can’t keep up with. We have all seen awkward instances where the same GPUs (same name) came out again with added VRAM in the 30 series…

And on your point about game crashes: sorry, but my personal “experiment” was to play Far Cry 6 (HD texture pack installed) with 8 GB of VRAM (GDDR6X), and sorry to say, it crashed when I looked at the wrong mountain. When playing the same game at the same settings with more than 11 GB of VRAM, the game never crashed again. So it seems VRAM may have something to do with crashes in some cases.

And please, this is not meant as trolling or anything; as you mentioned, the card isn’t meant for high-end use. I just want to explain my own experiences and use cases, and for those I think 8 GB of VRAM is going to feel low pretty soon.

This is not even close to correct, for two reasons. First, many games this year (and even last year) have run like :poop: on consoles despite oPtImIzAtIoN. And consoles are generally worse performers than even mid-range dGPUs.

Second, consoles are far more locked down than even “normal laptop hardware”, much less Framework Laptops. If you think repairing a modern laptop is bad, well, consoles take that to the next level. I don’t think people who value right to repair should be looking at ANY modern console, given that even the best right-to-repair laws seem to exclude game consoles, and given how many of the console makers seem to screw over consumers in other ways too. That’s not so pro-consumer after all!

And I agree with Wrybill_Plover; high amounts of VRAM are not just for gaming; certain GPU-dependent workloads can make quick work of VRAM. Given AMD just announced a new laptop GPU (the Radeon 7900M), if they make an S version, I think Framework should 100% offer it. 8 GB of VRAM just isn’t enough nowadays for anything above entry-level configurations, for the many reasons above.

6 Likes

I’m not in the business of making and selling laptops. That said, the decision to go with 8 GB here was made for any number of specific reasons, to the advantage of the interested players here. I expect AMD has the final say, and didn’t want this laptop doing the work of a desktop workstation.
Really unfortunate, as I would have liked to future-proof my purchase at 16 GB, and also avoid the eventual e-waste that comes with buying an upgraded GPU module.
I ran my last laptop’s GPU into the ground before I replaced it (replacing an MXM card is a serious and ongoing nuisance). It was an Nvidia 880M, and it had 8 GB.
That was an 8 GB card, produced 10 years ago. And here we are.

Really important points, in my opinion. I honestly believe that more VRAM can be connected to sustainability; it could seriously be a useful priority for Framework as a company.

1 Like

My contributions here are also aimed at any gamer/professional reading this who is considering buying the notebook now. If you are, then you are running the risk of not actually receiving the notebook until May/June 2024. Be aware of your gaming/work expectations for a 2000+ Euro notebook: 8 GB of VRAM in May 2024 will be underwhelming out of the box.

1 Like

So which AMD mobile GPU that pulls up to 100 W, from the range available back in Q4 ’22/Q1 ’23, should Framework have used instead?

2 Likes

You have me at a loss. What does a GPU available a year ago have to do with the engineering and product-placement decisions for a GPU that will be delivered in the next few months? I’m not connecting the parts you’ve laid out here.
AMD could have put whatever RAM they wanted on the card. I’ve never seen a card manufacturer build a card with a quantity of RAM other than exactly what was sanctioned by AMD or Nvidia, so I have to assume that’s all part of the contract. They chose what they did for a reason. The reason, I feel, being that they didn’t want to cut into their desktop market, let alone into AI training. Also to sell an upgraded GPU sooner rather than later: like three, maybe four years out, instead of six or eight.
Look, it’s all good. I’m in for it. It’s time for a new laptop, and this is what I’m going with. For what it’s worth, I woulda paid more for 16 GB. And run it into the ground.

1 Like

Long story less long: It’s not possible with the RX7700S. It might and probably will be possible with expansion bay GPUs in later/different chipsets. If 8GB VRAM is going to cramp your style, FW16 may not be a good fit for you.

2 Likes

But they didn’t. Therefore we’ve got the 7700S with 8GB.

I notice you didn’t answer the question.

2 Likes
1. You just proved my point.
2. Your question was incoherent, and you’ve ignored my request for clarification.
3. This is boring. Take care. Have a nice life. :slightly_smiling_face:

Please stop the arguing or we’ll have to lock the thread.

The point’s been made anyway.

1 Like

If people can’t play certain titles because they won’t run on reasonable hardware, those titles shouldn’t be purchased! You act as if there’s nothing we can do to prevent devs from releasing unoptimized, Denuvo-“enhanced” trash, but it’s literally as simple as not paying for crap. I got Bugthesda’s Starfield free with my FW16 reservation and I wish I could return the game just to lower their engagement numbers. The fact that I shelled out ~$700 for a fucking 6800 XT only to be met with 60fps at 1440p is unacceptable. If we keep playing this game of “oh these moron devs can’t do their job and everyone knows it so we’ll just throw more hardware at the problem and call it a day”, we’ll lose. Over and over again. There are no winners in an arms race between shit code and over-built systems.

As a side note, while you correctly identify Windows as a niche operating system for videogames, you’d be surprised how well Proton takes even that remaining use case away from it.

I can’t find anywhere in the marketing that says the FW16 was designed for “all sorts of games today, and whatever comes tomorrow”, so I’d be interested if you could point that out.

1 Like

I can see both sides. I think 8 GB is enough for tons of games, including some newer games. I’m confident it would suit any needs I would have for a laptop. So it’s definitely a totally viable and sufficient option for some people.

On the other hand, I can absolutely understand wanting more than 8 GB of VRAM, either for the games that benefit or for use cases beyond gaming. There’s nothing wrong with putting it out there to Framework that there are people who would appreciate (and pay for) additional VRAM. It lets Framework know there is a market for potential future GPU options.

6 Likes