I asked the same question some months ago, and fully agree ;). But note that the laptop would also crash on older titles (a couple of years old). One example is Far Cry 6 with HD textures.
If you want the foremost in graphics hardware, you shouldn't be gaming on a laptop.
The current graphics card option has 8 GB of VRAM for a 2560x1600 display. Assuming 24-bit color depth with 8 bits for alpha, that's 32 bits = 4 bytes per pixel. 4 B x 2560 x 1600 = 16.384 MB per frame buffer. Assuming two back buffers (for triple buffering), that's a grand total of 49.152 MB of VRAM that you'll need for the frame buffers, or 0.6% of 8 GB. That leaves the programmer with 99.4% of those 8 GB for textures, meshes, lighting info, compiled shaders, and other things such as the depth buffer.
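If you want to sanity-check that arithmetic yourself, here's a minimal sketch of the same frame buffer math in Python (the resolution, bytes-per-pixel, and buffer count are just the assumptions from above):

```python
# Back-of-the-envelope frame buffer math for a 2560x1600 panel.
WIDTH, HEIGHT = 2560, 1600
BYTES_PER_PIXEL = 4        # 24-bit color + 8-bit alpha = 32 bits
BUFFER_COUNT = 3           # front buffer + two back buffers (triple buffering)
VRAM_BYTES = 8 * 1000**3   # 8 GB of VRAM

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL  # 16,384,000 B = 16.384 MB
total_bytes = frame_bytes * BUFFER_COUNT        # 49,152,000 B = 49.152 MB
print(f"Frame buffers: {total_bytes / VRAM_BYTES:.1%} of VRAM")  # ~0.6%
```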
Texture size ("texture quality") and mesh size ("model quality") are commonly and easily adjusted in a game's graphics settings. Lighting info size can be adjusted via "shadow/lighting quality", but blows up if you turn on raytracing. An important thing to keep in mind: if your card doesn't have enough VRAM for some settings configuration, it probably doesn't have enough compute power for those settings anyway. Gaming at 4K with raytracing enabled requires massive amounts of VRAM, but that's not something this graphics card (or this laptop with its 2K display) was designed to do.
Future titles may very well require more than 8 GB of VRAM, but that's why the expansion bay allows you to swap in a newer card if/when the need arises ;) You're never going to have a completely future-proof option with all the crap (raytracing, AI, etc.) that's rapidly being foisted upon graphics cards, but 8 GB is plenty for a card in a laptop being delivered Q4 2023.
Go run your favorite game and check how much VRAM it's actually using! You might be surprised by what a half-decent game developer can do with far less than 8 GB of VRAM. This article, written before the raytracing craze, has reasonable VRAM recommendations.
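If you'd rather poll VRAM usage from a script than eyeball an overlay, a sketch like this works on NVIDIA cards via the nvidia-ml-py package (AMD users would need a different tool, e.g. radeontop or the driver's own overlay):

```python
# Print current VRAM usage on the first NVIDIA GPU (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # byte counts: .total / .used / .free
print(f"VRAM: {mem.used / 1024**2:.0f} MiB used of {mem.total / 1024**2:.0f} MiB")
pynvml.nvmlShutdown()
```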
As a final note, crashes have nothing to do with VRAM capacity and everything to do with poorly written applications/drivers.
EDIT: How awkward… This was originally written in response to a low-quality shitpost and has since been moved over to this discussion, which has more nuance. I will say, however, that if you're trying to do AI shenanigans, you should prolly look into a proper workstation. AI/ML by nature is a very power-hungry, brute-force way to solve problems, and that kind of thing clearly isn't what you should design mobile devices for. People are also acting as if eGPUs don't exist, but throwing unholy amounts of compute at something from a laptop is exactly the use case eGPUs cover. I'm pretty happy that neither Framework nor AMD had AI in mind when they designed the Framework 16's graphics card; otherwise I'd be stuck paying for something I don't need or want!
You might be overlooking the fact that GPUs in general, and dGPUs in particular, are not used only for gaming and graphics these days. ML tasks can be very VRAM-hungry. For example, the recommended VRAM size for Stable Diffusion is 12 to 16 GB, and SD can run relatively efficiently with less memory compared to, say, transformer-based LLMs.
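To put rough numbers on that: the weights alone scale linearly with parameter count, and it's the activations, attention buffers, and high-resolution decoding on top of them that push the working set toward that 12 to 16 GB recommendation. A back-of-the-envelope sketch (the parameter counts below are ballpark figures, not exact):

```python
def weights_vram_gib(params_millions: float, bytes_per_param: int = 2) -> float:
    """Rough VRAM needed just to hold model weights (fp16 = 2 bytes/param)."""
    return params_millions * 1e6 * bytes_per_param / 1024**3

# Stable Diffusion 1.x is roughly 1B parameters across UNet, VAE, and text encoder:
print(f"SD weights alone: ~{weights_vram_gib(1_000):.1f} GiB at fp16")   # ~1.9 GiB
# A 7B-parameter LLM needs far more, before counting any KV cache:
print(f"7B LLM weights:   ~{weights_vram_gib(7_000):.1f} GiB at fp16")   # ~13 GiB
```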
This is also why making sure that the BIOS on the AMD models allows setting the 780M iGPU's UMA frame buffer to sizes larger than 8 GB is hugely important for making the FW laptops not just a viable option for ML-related work, but easily a preferred one.
I'm not in the market for the 16" system, but I know that some people would like to see a higher-end GPU option for it. That card would probably require a separate power supply for the GPU for maximum performance; that would allow the GPU to pull 240 W (assuming USB-PD is used), or maybe even a bit more if it can supplement with power from the main system. (One possible way to squeeze out a tad more might be to power the GPU itself from the second supply but run the cooling system from system power.) It should also be designed to allow operation at reduced performance without the second power supply.
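For context on the 240 W figure: USB PD 3.1 EPR tops out at 48 V x 5 A. A trivial sketch of that budget (the top-up from main system power is my speculation, not anything Framework has announced):

```python
# USB PD 3.1 EPR ceiling for a hypothetical second supply.
volts, amps = 48, 5
epr_watts = volts * amps       # 240 W available to the GPU from the second supply
system_assist_watts = 40       # hypothetical extra headroom from the main supply
print(f"GPU budget: {epr_watts} W, ~{epr_watts + system_assist_watts} W with assist")
```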
Although the market for such a card exists, it's not large. I think Framework will do one eventually, but it's not their highest priority right now; they need to actually ship the 16" system first! They might also consider partnering with a graphics card manufacturer rather than doing it in-house.
I'm fully aware of these details, and I strongly disagree with your conclusion.
I've run into VRAM limitations multiple times in titles I played this year. I've had to upgrade my partner's GTX 1080 to something with more VRAM just to make it stable in our titles without dropping the frame buffer size until the image looks like sludge. The card had plenty of compute left in it, but VRAM is absolutely the reason for its obsolescence. I've run the games, scrutinized the frame pacing, and measured memory usage.
We can make assumptions and demands for hours about how developers will use modern APIs correctly, build in proper texture settings for baked lighting, shadows, and geometry, and implement functioning LOD. But at the end of the day, games ship, developers build what they will, and all that matters is whether our hardware copes with the games we play.
And for that reason, having 8 GB today is unacceptable in a Windows gaming device that is expected to run all sorts of games today, and whatever comes tomorrow.
Go console. The rest of your post argued that hardware isn't really the concern… it has more to do with software/coding… (and I agree with you).
Consoles are such that the game HAS to run on that dedicated hardware, so coding, testing, and the gaming experience are all tuned against that device.
PS: Otherwise you seem to agree that it's not enough for high-end use cases (4K gaming with raytracing). Note that there is also lots of other fun stuff you can do, such as in-game resolution scaling or installing HD texture packs, which many people spending 2000 EUR would love to be able to do - it's not only about raytracing or the general detail you mentioned. Raytracing itself has been developed for decades in the quest for pathtracing and was also critical in the movie industry - not sure it is "crap", but that seems to be a subjective opinion. Finally, there are of course professional users, many of whom point out that they need much more than 8 GB of VRAM.

In this discussion, we should also consider what type of VRAM we are talking about; GDDR6X or similar, for example, is much different from previous generations.

On the shipping date: you mentioned the FW16 laptop will be "delivered Q4 2023". Sorry, but that's plain wrong for most people - it's true for some batches, but the vast majority of people who ordered will receive it throughout 2024.

Just some notes, but as I said, this discussion is wildly subjective. I personally think companies like Nvidia made huge mistakes building GPUs whose massive computing power their VRAM can't keep up with. We have all seen awkward instances where the same GPUs (same name) came out again with added VRAM in the 30 series…

And on your point about game crashes: sorry, but my personal "experiment" was to play Far Cry 6 (HD texture pack installed) with 8 GB of VRAM (GDDR6X), and the game crashed when I looked at the wrong mountain. When playing the same game at the same settings with more than 11 GB of VRAM, it never crashed again. So it seems VRAM may have something to do with crashes in some cases.
And please, this is not meant as trolling or anything; as you mentioned, it's not meant for high-end use. Just to explain my own experiences and use cases, and for those I think 8 GB of VRAM will feel low pretty soon.
This is not even close to correct, for two reasons. First, many games this year (and even last year) have run poorly on consoles despite oPtImIzAtIoN, and consoles are generally worse performers than even mid-range dGPUs.
Second, consoles are locked down compared to even "normal laptop hardware", much less Framework Laptops. If you think repairing a modern laptop is bad, consoles take that to the next level. I don't think people who value right to repair should be looking at ANY modern console, given that even the best right-to-repair laws seem to exclude game consoles, and given how many of the console makers screw over consumers in other ways too. That's not so pro-consumer after all!
And I agree with Wrybill_Plover: high amounts of VRAM are not just for gaming; certain GPU-dependent workloads can make quick work of VRAM. Given AMD just announced a new laptop GPU (the Radeon 7900M), if they make an S version, I think Framework should 100% offer it. Above entry-level configurations, 8 GB of VRAM just isn't enough nowadays, for many of the reasons above.
I'm not in the business of making and selling laptops. That said, the decision to go with 8 GB here was made for any number of specific reasons, to the advantage of the interested parties. I expect AMD had the final say and didn't want this laptop to do the work of a desktop workstation.
Really unfortunate, as I would have liked to future-proof my purchase at 16 GB and avoid the eventual e-waste that comes with buying an upgraded GPU module.
I ran my last laptop's GPU into the ground before I replaced it (replacing an MXM card is a serious and ongoing nuisance). It was an Nvidia 880M, and it had 8 GB.
That was an 8 GB card, produced 10 years ago. And here we are.
Really important points, in my opinion. I honestly believe that higher VRAM is connected to sustainability - it could seriously be a useful priority for Framework as a company.
My contributions here are also aimed at any gamer/professional reading this who is considering buying the notebook now. If you are, you run the risk of actually receiving the notebook in May/June 2024. Beware of your gaming/work expectations for a 2000+ Euro notebook - 8 GB of VRAM in May 2024 will be underwhelming out of the box.
You have me at a loss. What does a GPU available a year ago have to do with the engineering and product-placement decisions for a GPU that will be delivered in the next few months? I'm not connecting the parts you've laid out here.
AMD could have put whatever RAM they wanted on the card. I've never seen a card manufacturer build a card with a quantity of RAM other than exactly what was sanctioned by AMD or Nvidia, so I have to assume that's all part of the contract. They chose what they did for a reason. The reason, I feel (IMHO), being that they didn't want to cut into their desktop market, let alone AI training. Also to sell an upgraded GPU sooner rather than later: like three, maybe four years out, instead of six or eight.
Look, it's all good. I'm in for it. It's time for a new laptop, and this is what I'm going with. For what it's worth, I woulda paid more for 16 GB. And run it into the ground.
Long story less long: it's not possible with the RX 7700S. It might, and probably will, be possible with expansion bay GPUs on later/different chipsets. If 8 GB of VRAM is going to cramp your style, the FW16 may not be a good fit for you.
If people can't play certain titles because they won't run on reasonable hardware, those titles shouldn't be purchased! You act as if there's nothing we can do to prevent devs from releasing unoptimized, Denuvo-"enhanced" trash, but it's literally as simple as not paying for crap. I got Bugthesda's Starfield free with my FW16 reservation and I wish I could return the game just to lower their engagement numbers. The fact that I shelled out ~$700 for a fucking 6800 XT only to be met with 60 fps at 1440p is unacceptable. If we keep playing this game of "oh, these moron devs can't do their job and everyone knows it, so we'll just throw more hardware at the problem and call it a day", we'll lose. Over and over again. There are no winners in an arms race between shit code and over-built systems.
As a side note, while you correctly identify videogames as Windows's remaining niche, you'd be surprised how well Proton takes even that use case away from it.
I can't find anywhere in the marketing where it says the FW16 was designed for "all sorts of games today, and whatever comes tomorrow", so I'd be interested if you could point that out.
I can see both sides. I think 8 GB is enough for tons of games, including some newer games. I'm confident it would suit any needs I would have for a laptop. So it's definitely a viable and sufficient option for some people.
On the other hand, I can absolutely understand wanting more than 8 GB of VRAM, either for the games that benefit or for use cases beyond gaming. Nothing wrong with putting it out there to Framework that there are people who would appreciate (and pay for) additional VRAM. It lets Framework know there is a market for potential future GPU options.