Long story less long: It’s not possible with the RX7700S. It might and probably will be possible with expansion bay GPUs in later/different chipsets. If 8GB VRAM is going to cramp your style, FW16 may not be a good fit for you.
But they didn’t. Therefore we’ve got the 7700S with 8GB.
I notice you didn’t answer the question.
- You just proved my point.
- Your question was incoherent, and you’ve ignored my request for clarification.
- This is boring. Take care. Have a nice life.
Please stop the arguing or we’ll have to lock the thread.
The point’s been made anyway.
Having 8 GB today is unacceptable in a Windows gaming device that is expected to run all sorts of games today, and whatever comes tomorrow.
If people can’t play certain titles because they won’t run on reasonable hardware, those titles shouldn’t be purchased! You act as if there’s nothing we can do to prevent devs from releasing unoptimized, Denuvo-“enhanced” trash, but it’s literally as simple as not paying for crap. I got Bugthesda’s Starfield free with my FW16 reservation and I wish I could return the game just to lower their engagement numbers. The fact that I shelled out ~$700 for a fucking 6800 XT only to be met with 60fps at 1440p is unacceptable. If we keep playing this game of “oh these moron devs can’t do their job and everyone knows it so we’ll just throw more hardware at the problem and call it a day”, we’ll lose. Over and over again. There are no winners in an arms race between shit code and over-built systems.
As a side note, while you correctly identify Windows as a niche operating system for videogames, you’d be surprised how well Proton takes that only remaining use case from it.
I can’t find anywhere in the marketing that says the FW16 was designed for “all sorts of games today, and whatever comes tomorrow”, so I’d be interested if you could point that out.
I can see both sides. I think 8GB is enough for tons of games, including some newer games. I’m confident it would suit any needs I would have for a laptop. So it’s definitely a totally viable and sufficient option for some people.
On the other hand, I can absolutely understand wanting more than 8GB of VRAM, either for the games that benefit or for use cases beyond gaming. Nothing wrong with putting it out there to Framework that there are people who would appreciate (and pay for) additional VRAM. It lets Framework know there is a market for potential, future GPU options.
If you are really interested in the discussion, please also consider the previous thread with several arguments: Help! If anyone from FW is seeing this, we need more VRAM - #19 by Philam
PS: Otherwise you seem to agree that it’s not enough for high-end use cases (4K gaming with raytracing). Note that there is also lots of other fun stuff you can do, such as in-game resolution scaling or installing HD texture packs, which many people spending 2000 EUR would love to be able to do – it’s not only about raytracing or the general detail you mentioned. Raytracing itself has been developed for decades in the quest for pathtracing technology and was also critical in the movie industry – not sure it is “crap”, but that seems to be a subjective opinion only. Finally, there are professional users of course; many point out they need much more than 8 GB of VRAM. In this discussion, we should also consider what type of VRAM we are talking about – GDDR6X, for example, is much different from previous generations.

On the shipping date, you mentioned the FW16 laptop will be “delivered Q4 2023”. Sorry, but that’s plain wrong for most people – it’s the case for some batches, but the vast majority of people who ordered will receive it throughout 2024.

Just some notes, but as I said, this discussion is wildly subjective. I personally think companies like Nvidia made huge mistakes building GPUs with massive computing power whose VRAM can’t keep up. We have all seen awkward instances where the same GPUs (same name) came out again with added VRAM in the 30 series…

And on your point about games crashing: sorry, but my personal “experiment” was to play Far Cry 6 (HD texture pack installed) with 8 GB of VRAM (GDDR6X), and sorry to say, it crashed when I looked at the wrong mountain. When playing the same game at the same settings with more than 11 GB of VRAM, the game never crashed again. So it seems VRAM may have something to do with it in some cases.
And please, this is not meant as trolling or anything; as you mentioned, it’s not meant for high-end use. Just to explain my own experiences and use cases, and for those I think 8 GB of VRAM is getting low pretty soon.
-
See edit. I wish the mods hadn’t moved my post over in its entirety without having asked me first :/
-
Fair enough about most people getting the laptop later. I just added that date to contextualize things a bit. Those receiving their devices later should probably save their expansion bay for later graphics card offerings.
-
I’m sorry, but I don’t see your point when it comes to high-end and professional use. With laptops, you pay for the portability and convenience. If you need performance for 2000 [currency units], you’re much better served by a desktop. Let me be clear: if the FW16 doesn’t have enough performance for you, don’t buy it. Buy a workstation or an eGPU instead.
-
Yeah, Nvidia gonna Nvidia. Corporate greed is a hell of a drug. Gamers seem to love supporting it, though!
-
Again, there are plenty of games that can run into an insufficient VRAM scenario without crashing. Far Cry is owned by Ubisoft, which is known for its incompetence. Even then, I wouldn’t give them too much crap for it – you’re modifying game files in a way the developers did not intend. Are you really so surprised to encounter crashes? The fact that it involved VRAM is almost irrelevant considering the heart of the issue here.
You might be overlooking the fact that GPUs in general, and dGPUs in particular, are not used only for gaming and graphics these days. ML tasks can be very VRAM hungry. For example, the recommended VRAM size for Stable Diffusion is 12 to 16 GB. And SD can run relatively efficiently with less memory compared to, say, transformer-based LLMs.
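As a rough illustration of why those recommended sizes matter (the parameter counts, byte widths, and the 1.2x activation/overhead factor below are ballpark assumptions for the sketch, not measured values), a back-of-the-envelope fit check looks like:

```python
def fits_in_vram(n_params: float, bytes_per_param: float,
                 vram_gib: float, overhead: float = 1.2) -> bool:
    """Rough fit check: model weights, scaled by a fudge factor for
    activations and working buffers, must fit in available VRAM (GiB)."""
    weights_gib = n_params * bytes_per_param / 2**30
    return weights_gib * overhead <= vram_gib

# A 7B-parameter model in fp16 (2 bytes/param) on an 8 GiB card:
print(fits_in_vram(7e9, 2, 8))    # weights alone are ~13 GiB -> False
# The same model quantized to 4-bit (0.5 bytes/param):
print(fits_in_vram(7e9, 0.5, 8))  # ~3.3 GiB of weights -> True
```

The point of the sketch is just that weight size alone can exceed an 8 GB card before any activations are allocated, which is why ML workloads hit the VRAM wall so much harder than games do.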
This is also why making sure that the BIOS on the AMD models allows setting the 780M iGPU’s UMA frame buffer to sizes larger than 8 GB is hugely important for making the FW laptops not only a viable, but easily a preferred, option for ML-related work.
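For anyone wanting to check what their machine actually reports, the amdgpu driver on Linux exposes the dedicated VRAM total and the GTT total (GPU-visible system RAM) through sysfs. A minimal sketch (the `card0` path is an assumption; the card index differs between systems):

```python
from pathlib import Path

def bytes_to_gib(n: int) -> float:
    """Convert a byte count to GiB."""
    return n / 2**30

def read_amdgpu_mem(card: str = "card0") -> dict:
    """Read dedicated VRAM and GTT totals (in GiB) from the amdgpu
    sysfs interface; returns an empty dict if the files are absent
    (non-AMD GPU, or not running Linux)."""
    base = Path(f"/sys/class/drm/{card}/device")
    sizes = {}
    for name in ("mem_info_vram_total", "mem_info_gtt_total"):
        f = base / name
        if f.is_file():
            sizes[name] = bytes_to_gib(int(f.read_text()))
    return sizes

if __name__ == "__main__":
    print(read_amdgpu_mem() or "no amdgpu sysfs entries found")
```

On an iGPU like the 780M, the GTT number is what a larger UMA frame buffer setting would grow at the expense of system RAM.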
Please see my edit. I don’t want the FW16 to be designed for ML stuff, and I don’t think the people who have already sold out 12 batches of it care for ML that much, either!
Consoles are such that the game HAS to run on that dedicated hardware, so coding, testing, and the gaming experience are all tuned against that one device.
Yeah, it’s pretty disappointing to see how far we’ve fallen in comparison to the early days of computing. Apollo engineers got us to the moon and back with what amounted to a fucking pocket calculator, and modern gamedevs can’t draw pixels to a screen without 17 layers of bloated game engine cruft.
I can see both sides. I think 8GB is enough for tons of games, including some newer games. I’m confident it would suit any needs I would have for a laptop. So it’s definitely a totally viable and sufficient option for some people.
On the other hand, I can absolutely understand wanting more than 8GB of VRAM, either for the games that benefit or for use cases beyond gaming. Nothing wrong with putting it out there to Framework that there are people who would appreciate (and pay for) additional VRAM. It lets Framework know there is a market for potential, future GPU options.
Best take in the thread.
- Your question was incoherent, and you’ve ignored my request for clarification.
There is no Navi 33-based GPU with more than 8GB – not even the workstation cards. You can’t have what AMD doesn’t have.
You have me at a loss. What does a GPU available a year ago have to do with the engineering and product placement decisions of a GPU that will be delivered in the next few months? I’m not connecting the parts you’ve laid out here.
If you cannot connect those two bits, then you have never been involved in the design and manufacture of an electronics product.
What was available back then would have been on the selection list for the new product, not the items that come to market a month or two before delivery. Decisions have to be made about what to use; manufacturing and delivery contracts have to be arranged, covering batch sizes and delivery numbers per month; and the chosen item has to be designed into the end product, with a considerable amount of regression testing over the development and testing period.
modern gamedevs can’t draw pixels to a screen without 17 layers of bloated game engine cruft.
Hahaha… the truth.
So which AMD mobile GPU that pulls up to 100W from the range that was available back in Q4 '22/Q1 '23 should Framework have used instead?
Why 100W?
The expansion bay connector is designed for 210 W on the 20 V line (plus 28.1 W on other lines) and the expansion bay itself was designed to allow the GPU module to be as big as it needed to be to allow for adequate cooling. So Framework potentially could have offered the AMD RX 6800M (which has 12 GB VRAM and a TDP of 145 W).
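Using only the figures quoted above (210 W on the 20 V line plus 28.1 W on the other lines), the headroom check is simple arithmetic; the transient-spike factor below is an illustrative assumption, not a Framework spec:

```python
# Expansion bay connector budget figures quoted in the thread.
BAY_20V_W = 210.0   # budget on the 20 V line (powers the GPU)
BAY_AUX_W = 28.1    # budget on the other lines
BAY_TOTAL_W = BAY_20V_W + BAY_AUX_W  # 238.1 W overall

def gpu_fits_budget(tdp_w: float, spike_factor: float = 1.0) -> bool:
    """Check whether a GPU's TDP, scaled by an optional factor for
    transient power spikes, fits within the 20 V line's budget."""
    return tdp_w * spike_factor <= BAY_20V_W

print(BAY_20V_W / 20.0)             # 10.5 A on the 20 V line at full budget
print(gpu_fits_budget(145.0))       # RX 6800M at 145 W TDP -> True
print(gpu_fits_budget(145.0, 1.5))  # with 1.5x spike headroom -> False
```

So a 145 W part fits the steady-state budget comfortably, though how much transient headroom the connector design actually reserves is not something this thread establishes.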
. . . I will say, however, that if you’re trying to do AI shenanigans, you should probably look into a proper workstation. AI/ML by nature is a very power-hungry, brute-force way to solve problems for people who can’t do real math. That kind of thing clearly isn’t what you should design mobile devices for. People are also acting as if eGPUs don’t exist, but throwing unholy amounts of compute at something from a laptop is exactly the use case eGPUs cover. I’m pretty happy that neither Framework nor AMD had AI in mind when they designed the Framework 16’s graphics card – I’d be stuck paying for something I don’t need or want!
Interesting perspective.
So, if a card with more VRAM was also offered - we see many video card options available for the desktop, for example - you would be “stuck paying” for that option even if you didn’t want it?
For me - and I really didn’t get the impression I was an exception here - the major appeal of the FW16 design is its modularity and the potential for having different options for components like the dGPU. How does asking for more options put those who don’t need them at a disadvantage?
I don’t want the FW16 to be designed for ML stuff, and I don’t think the people who have already sold out 12 batches of it care for ML that much, either!
That’s all right that you personally don’t want FW16 to be designed for ML (although I am not quite sure what part of such design you would specifically object to - arguably, FW16’s design is fully compatible with ML already). But, as someone who is a part of the 12 sold out batches, I do want the option of not only doing ML, but any other work that can benefit from GPU-based computations and large VRAM. And I want to be able to do that untethered from either an eGPU or the cloud. Is that not all right as well?
The main point here was simply that the discussion shouldn’t be limited to gaming needs. Whether one, as an individual, shares those other needs or not.
So, if a card with more VRAM was also offered - we see many video card options available for the desktop, for example - you would be “stuck paying” for that option even if you didn’t want it?
I do want the option of not only doing ML, but any other work that can benefit from GPU-based computations and large VRAM. And I want to be able to do that untethered from either an eGPU or the cloud. Is that not all right as well?
My apologies; I have been unclear. I’m completely happy that FW16 is modular, and I completely support the concept of having different expansion bay options for different use cases.
My responses were mainly addressing this idea that a card with an entire 8 GB of VRAM is a waste of time/money for a laptop. Some in this thread are acting as if any graphics card that comes out with less than 16 GB of VRAM is worthless, despite the fact that 8 GB of VRAM is more than enough for plenty of use cases.
With that in mind, I don’t want to be stuck paying for a single 16 GB offering, but I’m more than happy if they come out with an additional 16 GB offering. This response sums up my feelings quite nicely.
If AMD or Nvidia were to announce tomorrow that they’re working on a 48 GB expansion bay option, I’d be 1) very confused as to why they’d cannibalize their enterprise options, but 2) very happy for those in this thread that are so interested.
At the end of the day, everyone in this thread (should) have the same underlying goal: a repairable, modular laptop that can cover different use cases and stay out of the landfill for as long as possible.
At the end of the day, everyone in this thread (should) have the same underlying goal: a repairable, modular laptop that can cover different use cases and stay out of the landfill for as long as possible.
Well said.
- I’m sorry, but I don’t see your point when it comes to high-end and professional use. With laptops, you pay for the portability and convenience. If you need performance for 2000 [currency units], you’re much better served by a desktop. Let me be clear: if the FW16 doesn’t have enough performance for you, don’t buy it. Buy a workstation or an eGPU instead.
- Yeah, Nvidia gonna Nvidia. Corporate greed is a hell of a drug. Gamers seem to love supporting it, though!
- Again, there are plenty of games that can run into an insufficient VRAM scenario without crashing. Far Cry is owned by Ubisoft, which is known for its incompetence. Even then, I wouldn’t give them too much crap for it – you’re modifying game files in a way the developers did not intend. Are you really so surprised to encounter crashes? The fact that it involved VRAM is almost irrelevant considering the heart of the issue here.
Hi, thanks for the replies.
On 3: yes, desktops are better; still pursuing the dream of a laptop and eGPU.
On 4: I opted out and won’t upgrade at all until Nvidia somehow wakes up, or AMD saves us ;).
On 5: I agree. In fact, I was surprised I could even run it at all. I am not surprised it crashed; I just wanted to make the link with missing VRAM, which I think was the reason.
I can see both sides. I think 8GB is enough for tons of games, including some newer games. I’m confident it would suit any needs I would have for a laptop. So it’s definitely a totally viable and sufficient option for some people.
On the other hand, I can absolutely understand wanting more than 8GB of VRAM, either for the games that benefit or for use cases beyond gaming. Nothing wrong with putting it out there to Framework that there are people who would appreciate (and pay for) additional VRAM. It lets Framework know there is a market for potential, future GPU options.
This is my opinion to the letter. 8 GB of VRAM is solid for a lot of applications and games, a lot. But some enthusiasts (like me) wish for more in laptops ;). The reasons are turning on HD textures, upscaling in-game resolution to 200%, and yes, raytracing (although I’m more sceptical, as some have said).
So which AMD mobile GPU that pulls up to 100W from the range that was available back in Q4 '22/Q1 '23 should Framework have used instead?
Also, there is the RX 6850M XT from 2022; it was in the 2022 Legion 7 and was quite competitive with the RTX 3080 mobile GPU in some games like Red Dead Redemption 2, where it was the fastest laptop GPU until the RTX 4080 and 4090 mobile overtook it.
It had 12 GB of VRAM; still not amazing, but far more future-proof than 8 GB.
Given that there’s only one mobile Navi 31-based model so far, odds are that a 6850M replacement with more than 8GB of VRAM could be on the way.