I think it’s interesting that Framework specifically mentioned that Cooler Master collaborated on the design of the cooling system in the expansion bay modules. Cooler Master also produced a third-party case for the Framework 13 mainboard. I could imagine a scenario where other partners like ASUS, or maybe a brand that doesn’t also make laptops like Zotac, could design a module for the system with a variety of GPU configurations, since the specs are openly available and based on open standards like PCIe.
I know it might sound crazy, but I really hope we see a Quadro for the 16.
Right now I have a couple of clients who use Quadro-equipped laptops, but like most laptops their serviceability is awful, and getting a commitment on longevity of servicing is very difficult. The idea of a laptop which can have a Quadro GPU dropped in would be brilliant, and it is a part of the market where the price would be less of an issue.
I know we are probably looking down the road quite a bit here, but I truly hope the Framework expansion bay will become a standard and we see third-party manufacturers releasing GPUs such as these.
For game development, AI, and simulation, Nvidia GPUs are key. It’s unfortunate, but the reality is that the RTX platform is miles ahead of AMD in software and support.
I run a 20+ person game dev shop that likes the flexibility of working on high-performance laptops — self-upgradable laptops would’ve been perfect. My team was excited about the Framework. But without Nvidia GPUs, Framework is a hard pass, unfortunately.
This thread seems to be veering off into the direction of another “what GPU do you want for FW16” discussion, like this one: [Poll] CPU and GPU combinations
Nothing wrong with that, but just to circle back to the original question:
Framework have put a lot of effort into creating or preserving user choice with every component they can. Their response to user feedback has been outstanding, in my opinion. I don’t know anything more than anyone else in this thread, but I think not eventually providing some kind of choice for this component would be somewhat contrary to the spirit of the project, and a bit of a departure from their excellent track record for responding to community feedback.
I think we just need to give them more time to work out the details, and more choices will emerge. After all, the laptop has not even been released yet!
While I agree that Nvidia tends to lead the pack, you’re talking about a company with a market cap of 1.13T USD vs AMD with a market cap of 171.3B USD, and let’s not even get into the actual corporate financials. Nvidia’s GPU division alone is larger than AMD in its entirety; it’s incredible that AMD is even as close as they are in the grand scheme of things.
And what do we get for it? Ever rising GPU prices, further market lock-in, more winning for Nvidia resulting in more winning, which results in more winning. You can see where I’m going. If you make stuff tailored for Nvidia, you lock everyone else out. I don’t doubt that there’s a reason why Unreal and modern Unity run like complete garbage on modern systems, and I have a sneaking suspicion that Nvidia is involved, given their shenanigans in the past.
Tailoring your stuff for AMD/Intel, though? It works great across all sorts of systems. Sure, AMD’s and Intel’s push for interoperability and open source is a marketing play, but at the end of the day, that’s what lets us, the consumers, win. Intel couldn’t even have the half-decent GPUs they have today without AMD and Valve helping them sidestep the problems Arc graphics have with older APIs.
As long as AMD and Intel have “good enough” graphics, I’m going to support them. I’ve been rolling AMD graphics since 2012; I’ve seen their bad products and skipped them. They’re on the upswing now, but it just takes time before they can properly compete against an industry giant that, unlike Intel, isn’t complacent with its position.
Perhaps get a FW16 laptop with a TB or OCuLink eGPU? And maybe in the future, Nvidia expansion cards will come out. Only if you’re really set on getting a FW16 this generation, though.
Just like with the FW13, FW had to start somewhere. They started with Intel. People asked for AMD (“hard pass if no AMD!” they said). So finally we have AMD. It took them three generations, but we’re here.
FW16 is probably going to be in the same boat. I’m actually surprised this time they started with AMD. I guess with FW13 + AMD, the door was open for a FW16 + AMD start.
I thought that with a FW13 + Intel start, the FW16 would already have a head start with Intel as well, so I’m not sure why an Intel option wasn’t available. Maybe cost? Did they decide to dip their toes into the 16-inch model with one partner first? With the clamor over AMD, FW may have decided this would be a better first choice than Intel.
Though, just like with the numpad situation, they may be learning that the Intel/AMD/Nvidia split is more even than expected.
Even so, this is how things are now, but looking at how the FW13 story played out, I feel the FW16 will also have multiple choices eventually.
It’s already amazing that a small company like FW can offer both Intel and AMD on a single product line. On a brand-new product line, I wouldn’t have expected multiple choices. However, with their track record, I have no doubt we’ll have the choices we wanted in time, just like with the 13-inch model.
However, I’d suspect the bus bandwidth would be severely limiting when the data is transmitted over Thunderbolt, no? I’m no hardware expert, so I’d have to ask how the bandwidth would compare. I doubt it would ever be as fast as a GPU plugged directly into the mainboard.
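For a rough sense of scale, here’s a back-of-the-envelope comparison of the nominal link bandwidths involved. This is just a sketch using published spec figures as assumptions (Thunderbolt 3/4 tunnels PCIe at roughly PCIe 3.0 x4 rates, OCuLink cabling is typically wired for PCIe 4.0 x4, and Framework lists the FW16 expansion bay interface as PCIe 4.0 x8); real-world throughput is lower due to protocol and tunneling overhead.

```python
# Nominal PCIe link bandwidth comparison (theoretical figures only;
# actual throughput is lower due to protocol overhead).
# PCIe gen 3 and later use 128b/130b line encoding.

def pcie_gbps(gt_per_s: float, lanes: int) -> float:
    """Usable line rate in Gbit/s for a PCIe gen-3+ link."""
    return gt_per_s * lanes * 128 / 130

links = {
    "Thunderbolt 3/4 PCIe tunnel (~PCIe 3.0 x4)": pcie_gbps(8, 4),
    "OCuLink (PCIe 4.0 x4, typical)": pcie_gbps(16, 4),
    "FW16 expansion bay (PCIe 4.0 x8)": pcie_gbps(16, 8),
}

for name, gbps in links.items():
    print(f"{name}: ~{gbps:.0f} Gbit/s")
```

So on paper the expansion bay has roughly 4x the bandwidth of a Thunderbolt eGPU link and 2x that of a typical OCuLink cable, which is why a directly attached module should comfortably outrun an external enclosure.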
Strictly speaking, the AMD partner programs do not actually preclude companies from working with other chip makers or graphics card brands, which is pretty cool, but from what I remember they heavily incentivize via pricing and software support… in this instance, hardware support, too!
I have only seen the terms of the partnership deal as a software developer (games), but yeah, they are very permissive. So it SHOULD be completely down to Intel and Nvidia whether they wish to make a Battlemage or RTX solution for Framework… fingers crossed they do! Because that would mean insanely awesome stuff for the entire laptop market, not just Framework… though it would probably benefit Framework A TON, of course, haha.
Why do you think that’s frightening and concerning?
You should actually be grateful.
Despite all the financial troubles and unfair competition AMD has faced so far, they emerged with an amazing product that consumers benefit from.
And unlike Intel, they didn’t resort to foul play. Here are a few examples of penalized foul play by Intel:
According to nrp, Framework started with Intel Processors due to technical limitations on AMD’s part.
I couldn’t find any issues with Nvidia besides their Linux drivers. Maybe their rumored B2B harassment is being shielded by NDAs.
Lately they’ve taken a half-hearted approach towards an open-source graphics driver (Nvidia takes first step toward open source Linux GPU drivers | Ars Technica), which one Linux developer summed up as:
[…] “a net win for practical purposes” since the blob of proprietary code can be sandboxed more readily. “But no freedom was gained, for people who care about that. […] [About] the same amount of code is closed [as before].”
The fact that one of their board partners (EVGA) straight up decided that making GPUs was no longer worth it, despite it being basically the majority of their business, tells me enough about Nvidia’s B2B practices.
I do hope an Intel GPU comes to the 16" eventually, though. I’ve never owned one (I’ve owned AMD only for a really long time), but I do think having competition, especially on a laptop where you can swap GPUs, is good for consumers.
Agreed. I have never bought an Nvidia GPU, and I never will. What happened with EVGA was just sad. I think eventually you will see other options, but I also think the Nvidia module will not be a direct Framework product, but instead a third-party product.
As for Nvidia being light years ahead: interesting things are happening on the ML front, and I see Nvidia taking a hard dive in the not-too-distant future. It’s what happens when you have been on top for too long.
Seeing how my 7900 XTX, and hundreds of other people’s, randomly cannot run certain games because of driver timeout errors that can crash the entire system, I don’t think Nvidia is at any risk. From this point onward, after Framework, I’m switching from AMD (25 years straight!) to Nvidia or Intel GPUs only.
Especially since AMD is ignoring hundreds of us, censored our threads on their reddit… made us go to the AMD support forum, and now ignore us there lol. I have very little faith AMD will even bother fixing these random crashes. Cyberpunk 2077 I can run maxed out in raytracing etc 1440P at 65 FPS, amazing. But Phantom Brigade I can play for as little as 2 minutes and my drivers CRATER. Darktide, Deus Ex Mankind Divided, such a random assortment of games. Unreal editor too, like I am a game developer, I need this for work and I can’t go into the UE editor lol.
That indeed does sound concerning and AMD needs to get reminded of the goal of their “AMD Advantage” program in that case.
I’ve had my issues with nVidia dGPUs mostly burning out due to failing coolers, so I switched to AMD APUs and still am happy with that choice, currently owning a 3500U and a 5600G.
But I’m not a hardcore gamer, so I’m aware that’s not an option for you.
I was unaware of these issues, as I generally don’t keep track of what is happening on the Windows side of things. On Linux, the 7900 XTX to my understanding works fine, and when I am talking about ML workloads… Windows does not even register at all.
Hope they figure out what is going on, as it is clearly either a hardware-combination issue (RAM incompatibility, most likely) or a software (driver) issue. On another note, this is literally (at least in the past) every Linux user’s experience with Nvidia: never knowing when an update is going to break everything related to your card. I have one hybrid laptop (work-issued), and it is still a fairly frequent issue these days. Good luck.
The thing though… people with dozens of different hardware combinations have the same issue and yeah it hits random applications, games, etc.
From G.Skill, Kingston, and Samsung to HyperX, from 16 GB to 64 GB, it seems squarely in the driver suite, and we’ve been hit with it for 6 months now. I ran an R9 290X for years, an HD 7970, an ATI HD 5600 (kept that one, haha); I’ve always preferred ATI/AMD because of how long they last, and despite small driver issues, they were usually fixed in a few weeks.
This has been eight months now with no response or even an acknowledgment, and they culled us from the Reddit, moving us to the forums where it’s less public, and they continue to ignore us :/.
I spent $1350 CAD for this 7900XTX on a good discount! And when it works… I love it. XFX did a good job making the cooler etc. But not being able to rely on it for work or my gameplay? Heck… it’s stressful. I don’t even know if AMD is addressing it, I think they are honestly just making sure AAA big releases are fine.
I have a 7900xtx, how would I trigger this, I wanna test if mine does that too?
The main issue I’ve had with this card so far (apart from the first one hard-crashing the whole computer and having to be RMA’d; the replacement was fine) was the high idle power, but that was massively reduced a while back and pretty much fixed with the latest drivers. Other than that, smooth sailing.