Help! If anyone from FW is seeing this, we need more VRAM ;)

Well, a vapor chamber is newer and sounds cool, so it must be better, right?!

2 Likes

It depends on the heat pipe thickness, etc. A vapor chamber has vastly higher conductance. However, I may be wrong about the ideal application here, since the Framework 16 has the GPU on a separate module, and a vapor chamber may be overkill for an APU. I only just thought about it, and that probably explains why one is not used.

1 Like

I did a bit of googling because I was curious, but couldn’t find any numbers for how many watts each solution can dissipate. All I found were some articles about how vapor chambers can conduct heat efficiently across their entire area, while heat pipes only conduct efficiently along the axis of their wick.

I’m not an expert on this by any means, so I could be missing the key info here, but I would be curious to see how a vapor chamber might improve performance… Maybe future versions of the Framework 16 mainboard could handle a sustained load higher than the currently quoted 45 W with upgraded heat dissipation.
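For what it’s worth, the back-of-the-envelope way to frame it: steady-state dissipation is roughly the allowed temperature rise divided by the total thermal resistance of the cooling path. Here’s a tiny Python sketch of that relation - the resistance values are purely made-up placeholders, not measured figures for the Framework 16 or for any specific heat pipe or vapor chamber:

```python
# Rough lumped-model estimate: sustainable power = allowed temperature rise
# divided by the total thermal resistance of the path (die -> heatsink -> air).
# The resistance values below are illustrative guesses, NOT measured data.

def max_sustained_watts(t_junction_max_c, t_ambient_c, r_thermal_c_per_w):
    """Return the steady-state power (W) a cooling path can move."""
    return (t_junction_max_c - t_ambient_c) / r_thermal_c_per_w

# Hypothetical numbers: 95 C junction limit, 30 C ambient,
# 1.0 C/W for a heat-pipe path vs 0.8 C/W for a vapor-chamber path.
for label, r in [("heat pipe (assumed 1.0 C/W)", 1.0),
                 ("vapor chamber (assumed 0.8 C/W)", 0.8)]:
    print(f"{label}: ~{max_sustained_watts(95, 30, r):.0f} W sustained")
```

So even a modest drop in thermal resistance buys a meaningful bump in sustainable watts, which is presumably why vapor chambers tend to show up in higher-TDP designs.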

edit: I just realized I’m off topic for this thread… second time I’ve done that in as many days on this forum. I suppose vapor chambers could also be useful for cooling more ambitious memory subsystems on the GPU module though, so maybe I’m not too far off the mark lol.

I think that since the mainboard does not have a discrete GPU, it simply does not need a vapor chamber. Maybe future revisions will be different, like you said.

I have been looking as well and didn’t find a lot, but here is at least some info from Dave2D, putting it at about the level of an RTX 4060: https://www.youtube.com/watch?v=Zy5MWFrb0Y8&t=3s

However, the graphs in the Dave2D video are unfortunately not ideal. They mention, for example, “1080p Ultra” for Far Cry 6, while we all know that Far Cry 6 would, if the HD texture pack is installed, greet the brand new Framework 16 with an error message about missing VRAM - so the FPS numbers are a bit misleading if you want really good graphics ;).

Some general arguments for more VRAM: it should be more sustainable, as you can simply use GPUs with more VRAM for longer; they stay competitive for a longer time, even in a second life as a Framework eGPU. In addition, there is of course productivity - I heard the Framework 16 is aiming to be a creator laptop as well. Not with 8 GB of VRAM - sorry.

4 Likes

It’s hard to justify purchasing a laptop, let alone one over $2000, that will struggle not to crash on modern games.

I asked the same thing some months ago and fully agree ;). But note that the laptop would also crash on older titles (a couple of years old). One example is Far Cry 6 with HD textures.

If you want the foremost in graphics hardware, you shouldn’t be gaming on a laptop.

The current graphics card option has 8 GB of VRAM for a 2560x1600 display. Assuming 24-bit color depth plus 8 bits for alpha, that’s 32 bits = 4 bytes per pixel. 4 B x 2560 x 1600 = 16.384 MB per frame buffer. Assuming two back buffers (for triple buffering), that’s a grand total of 49.152 MB of VRAM that you’ll need for the frame buffers, or 0.6% of 8 GB. That leaves the programmer with 99.4% of those 8 GB for textures, meshes, lighting info, compiled shaders, and other things such as the depth buffer.
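If you want to poke at other resolutions or buffer counts, here’s the same arithmetic as a throwaway Python snippet (the numbers are just the ones from the paragraph above):

```python
# Frame buffer VRAM for a 2560x1600 display at 32 bits (4 bytes) per pixel,
# with triple buffering (front buffer + two back buffers).
width, height = 2560, 1600
bytes_per_pixel = 4          # 24-bit color + 8-bit alpha
buffers = 3                  # triple buffering

per_buffer_mb = width * height * bytes_per_pixel / 1_000_000
total_mb = per_buffer_mb * buffers
print(f"{per_buffer_mb:.3f} MB per buffer, {total_mb:.3f} MB total")
print(f"{total_mb / 8_000:.2%} of 8 GB")  # ~0.6%
```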

Texture size (“texture quality”) and mesh size (“model quality”) are commonly and easily adjusted in a game’s graphics settings. Lighting info size can be adjusted via “shadow/lighting quality”, but blows up if you turn on raytracing. An important thing to keep in mind is that if your card doesn’t have enough VRAM for some settings configuration, it probably doesn’t have enough compute power for those settings anyways. Gaming at 4K with raytracing enabled requires massive amounts of VRAM, but that’s not something this graphics card (or this laptop with its 2K display) was designed to do.

Future titles may very well require more than 8 GB of VRAM, but that’s why the expansion bay allows you to swap in a newer card if/when the need arises ;) You’re never going to have a completely future-proof option with all the crap (raytracing, AI, etc.) that’s rapidly being foisted upon graphics cards, but 8 GB is plenty for a card in a laptop being delivered Q4 2023.

Go run your favorite game and check how much VRAM it’s actually using! You might be surprised by what a half-decent game developer can do with far less than 8 GB of VRAM. This article, written before the raytracing craze, has reasonable VRAM recommendations.
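On Linux with the amdgpu driver you can read the usage straight out of sysfs; a minimal sketch below, assuming the dGPU shows up as card1 (the index varies per machine). On Windows, Task Manager’s GPU tab shows the equivalent “Dedicated GPU memory” figure.

```python
# Quick check of VRAM usage on Linux with the amdgpu driver.
# Assumes the dGPU is exposed as /sys/class/drm/card1; adjust the index
# (card0, card1, ...) for your system.
from pathlib import Path

def read_mib(path: Path) -> float:
    """Read a sysfs byte counter and convert it to MiB."""
    return int(path.read_text()) / (1024 ** 2)

dev = Path("/sys/class/drm/card1/device")
used = read_mib(dev / "mem_info_vram_used")
total = read_mib(dev / "mem_info_vram_total")
print(f"VRAM: {used:.0f} MiB used of {total:.0f} MiB ({used / total:.1%})")
```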

As a final note, crashes have nothing to do with VRAM capacity, and everything to do with poorly-written applications/drivers.

EDIT: How awkward… This was originally written in response to a low-quality shitpost and has since been moved over to this discussion, which has more nuance. I will say, however, that if you’re trying to do AI shenanigans, you should probably look into a proper workstation. AI/ML is by nature a very power-hungry, brute-force way to solve problems, and that kind of thing clearly isn’t what you should design mobile devices for. People are also acting as if eGPUs don’t exist, but throwing unholy amounts of compute at something from a laptop is exactly the use case eGPUs cover. I’m pretty happy that neither Framework nor AMD had AI in mind when they designed the Framework 16’s graphics card; I’d be stuck paying for something I don’t need or want!

9 Likes

You might be overlooking the fact that GPUs in general, and dGPUs in particular, are not used only for gaming and graphics these days. ML tasks can be very VRAM hungry. For example, the recommended VRAM size for Stable Diffusion is 12 to 16 GB. And SD can run relatively efficiently with less memory compared to, say, transformer-based LLMs.
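To make that concrete, here is a rough sketch of the VRAM needed just to hold model weights. The parameter counts are approximate, commonly cited figures (treat them as assumptions), and activations plus framework overhead on top are usually what actually busts the budget:

```python
# Back-of-the-envelope VRAM needed just to hold model weights.
# Parameter counts below are rough, commonly cited figures (assumptions);
# activations and overhead come on top and typically push past 8 GB.
def weights_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """GB of VRAM for the weights alone (default fp16 = 2 bytes/param)."""
    return params_billion * 1e9 * bytes_per_param / 1e9

models = {
    "Stable Diffusion 1.x (~1B params total)": 1.0,
    "7B-parameter LLM": 7.0,
    "13B-parameter LLM": 13.0,
}
for name, b in models.items():
    print(f"{name}: ~{weights_gb(b):.1f} GB in fp16, "
          f"~{weights_gb(b, 4):.1f} GB in fp32")
```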

This is also why it is hugely important to make sure the BIOS on the AMD models allows setting the 780M iGPU’s UMA frame buffer to sizes larger than 8 GB - that would make the FW laptops not only a viable option, but easily a preferred one for ML-related work.

3 Likes

I’m not in the market for the 16" system, but I know that some people would like to see a higher-end GPU option for it. That card would probably require a separate power supply for the GPU for maximum performance; that would allow the GPU to pull 240 W (assuming USB-PD is used), or maybe even a bit more if it can supplement with power from the main system. (One possible way to squeeze out a tad more might be to power the GPU itself from the second supply, but to run the cooling system with system power.) It should also be designed to allow operation with reduced performance without the second power supply.

Although the market for such a card exists, it’s not large. I think Framework will do one eventually but it’s not their highest priority right now; they need to actually ship the 16" system first! They might also consider partnering with a graphics card manufacturer to make that rather than doing it in-house.

2 Likes

I’m completely aware of these details, and I severely disagree with your conclusion.
I’ve run into VRAM limitations multiple times in titles I played this year. I’ve had to upgrade my partner’s GTX 1080 to something with more VRAM just to make it stable in our titles without dropping the size of the frame buffer until it looks like sludge. The card had plenty of compute left in it, but VRAM is absolutely the reason for its obsolescence. I’ve run the games, scrutinized the frame pacing, measured memory usage.

We can make assumptions and demands for hours about how developers will use modern APIs correctly, build in proper texture map settings for baked lighting, shadows, and geometry, and implement functioning LOD. But at the end of the day, games ship, developers build what they will, and all that matters is whether our hardware copes with the games we play.
And for that reason, having 8 GB today is unacceptable in a Windows gaming device that is expected to run all sorts of games today, and whatever comes tomorrow.

4 Likes

Go console. The rest of your post talked about how hardware isn’t really the concern… it has more to do with software / coding… (and I agree with you)

Consoles are such that the game HAS to run on that dedicated hardware, so coding, testing, and the gaming experience are all tuned against that one device.

3 Likes

That’s fine. I severely disagree with your conclusion.

The solution is to adjust your expectations to realistic levels and for games programmers to optimise their asset use.

1 Like

If you are really interested in the discussion, please also consider the previous thread with several arguments: Help! If anyone from FW is seeing this, we need more VRAM ;) - #19 by Philam

PS: Otherwise, you seem to agree that it’s not enough for high-end use cases (4K gaming with raytracing). Note that there is also lots of other fun stuff you can do, such as in-game resolution scaling or installing HD texture packs, which many people spending 2000 EUR would love to be able to do - it’s not only about raytracing or the general detail settings you mentioned. Raytracing itself has been developed for decades in the quest for path-tracing technology and was also critical in the movie industry - I’m not sure it is “crap”, but that seems to be a subjective opinion. Finally, there are of course professional users, and many point out that they need much more than 8 GB of VRAM. In this discussion we should also consider what type of VRAM we are talking about; note that, for example, GDDR6X or similar is quite different from previous generations.

On the shipping date, you mentioned the FW16 laptop will be “delivered Q4 2023”. Sorry, but that’s plain wrong for most people - it is the case for some batches, but the vast majority of people who ordered will receive it throughout 2024.

Just some notes, but as I said, this discussion is wildly subjective. I personally think companies like Nvidia made a huge mistake building GPUs with massive computing power whose VRAM can’t keep up. We have all seen the awkward instances where the same GPUs (same name) came out again with added VRAM in the 30-series…

And on your point about game crashes: sorry, but my personal “experiment” was to play Far Cry 6 (HD texture pack installed) with 8 GB of VRAM (GDDR6X), and sorry to say it crashed when I looked at the wrong mountain. When playing the same game at the same settings with more than 11 GB of VRAM, the game never crashed again. So it seems VRAM may have something to do with crashes in some cases.

And please, this is not meant as trolling or anything - as you mentioned, the card is not meant for high-end use. I just want to explain my own experiences and use cases, and for those I think 8 GB of VRAM will start feeling low pretty soon.

This is not even close to correct, for two reasons. First, many games this year (and even last year) have run like :poop: on consoles despite oPtImIzAtIoN. And consoles are generally worse performers than even mid-range dGPUs.

Second, consoles are so locked down compared to even “normal” laptop hardware, much less Framework laptops. If you think repairing a modern laptop is bad, well, consoles take that to the next level. I don’t think people who value right to repair should be looking at ANY modern console, given that even the best right-to-repair laws seem to exclude game consoles, and given how many of the console makers seem to screw over consumers in other ways too. That’s not so pro-consumer after all!

And I agree with Wrybill_Plover; a high amount of VRAM is not just for gaming; certain GPU-dependent workloads can make quick work of VRAM. Given that AMD just announced a new laptop GPU (the Radeon 7900M), if they make an S version, I think Framework should 100% offer it. For many of the reasons above, 8 GB of VRAM just isn’t enough nowadays for anything beyond entry-level configurations.

6 Likes

I’m not in the business of making and selling laptops. That said, the decision to go with 8 GB here was made for any number of specific reasons, to the advantage of the interested players here. I expect AMD had the final say, and didn’t want this laptop to do the work of a desktop workstation.
Really unfortunate, as I would have liked to future-proof my purchase at 16 GB, and also avoid the eventual e-waste that comes with buying an upgraded GPU module.
I ran my last laptop’s GPU into the ground before I replaced it (replacing an MXM card is a serious and ongoing nuisance). It was an Nvidia 880M, and it had 8 GB.
That was an 8 GB card, produced 10 years ago. And here we are.

Really important points in my opinion. I honestly believe that more VRAM can be connected to sustainability - it could seriously be a useful priority for Framework as a company.

1 Like

My contributions here are also aimed at any gamer/professional reading this who is considering buying the notebook now. If you are, then you are running the risk of actually receiving the notebook in May/June 2024. Be aware of your gaming/work expectations for a 2000+ Euro notebook - 8 GB of VRAM in May 2024 will be underwhelming out of the box.

1 Like

So which AMD mobile GPU pulling up to 100 W, from the range that was available back in Q4 '22/Q1 '23, should Framework have used instead?

2 Likes

You have me at a loss. What does a GPU available a year ago have to do with the engineering and product-positioning decisions for a GPU that will be delivered in the next few months? I’m not connecting the parts you’ve laid out here.
AMD could have put whatever RAM they wanted on the card. I’ve never seen a card manufacturer build a card with a quantity of RAM other than exactly what was sanctioned by AMD or Nvidia, so I have to assume that’s all part of the contract. They chose what they did for a reason. The reason, I feel (IMHO), being that they didn’t want to cut into their desktop market, let alone into AI training. Also to sell an upgraded GPU sooner rather than later - like three, maybe four years out, instead of six or eight.
Look, it’s all good. I’m in for it. It’s time for a new laptop, and this is what I’m going with. For what it’s worth, I would have paid more for 16 GB. And run it into the ground.

1 Like