Introducing the Framework Desktop

I personally find the approach here disappointing.
The AI Max+ 395 128GB model seems to be oriented towards the AI market, yet although you can assign a large enough amount of VRAM (96GB in Windows; in Linux it seems you can use more), it lacks the computational power a 4090 card can offer (129 TOPS against 508 TOPS for the Nvidia).
A nice solution would be to offer a proper PCIe 4.0 x16 slot, as the processor is capable of supporting that. Unfortunately that is not the case, and thus this product has no market.

I would understand the power limitations in a laptop, but this is a desktop. No excuses.

Yeah… revolutionary for making a few people and companies very rich while taking away jobs from others, stealing IP and violating copyright, hallucinating information, being very bad at relatively simple math, having a negative impact on education and learning, etc etc.

But I guess you get a low effort JPEG after expending 33Wh, so I guess that’s great…

I also never said it was the biggest. But your locally running GPU isn’t the real problem.

$500 billion for AI companies, $0 for climate change mitigation.

That’s a lot of misconceptions.

But I guess you get a low effort JPEG after expending 33Wh, so I guess that’s great…

I was always told art is subjective, and I can get images I consider very high quality, and more importantly I can make images of what I have in mind for e.g. my D&D campaigns. It’s truly transformative.

E.g. Cobbe Giovatta is a dyslexic orc journalist who runs the most popular tabloid in Waterdeep. This image needed 349W for 39s. That’s 3.78Wh, or the equivalent of boiling 6.02g of water. You’d need a lot more energy if you did it with a traditional Photoshop workflow, say 200W for 3h, totaling 600Wh/955g of boiled water.

GenAI assist SAVES energy.

Compare with, e.g., running a hairdryer for 30 minutes at 1500W: that’s 750Wh, or the energy to boil 1194g of water. Your household activities use lots of energy.
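The watt-hour and boiled-water figures above can be reproduced with a quick back-of-the-envelope script (assuming “boiling” here means vaporizing water already at 100 °C, using a latent heat of vaporization of roughly 2260 J/g):

```python
# Back-of-the-envelope check of the energy figures quoted above.
# Assumption: "boiling" means vaporizing water already at 100 C,
# with a latent heat of vaporization of ~2260 J/g.
LATENT_HEAT_J_PER_G = 2260

def wh_used(watts, seconds):
    """Energy in watt-hours drawn at `watts` for `seconds`."""
    return watts * seconds / 3600

def grams_vaporized(wh):
    """Grams of water vaporized by `wh` watt-hours of energy."""
    return wh * 3600 / LATENT_HEAT_J_PER_G

image = wh_used(349, 39)             # one diffusion render
photoshop = wh_used(200, 3 * 3600)   # hypothetical 3 h manual workflow
hairdryer = wh_used(1500, 30 * 60)   # 30 min hairdryer at 1500 W

print(f"image:     {image:.2f} Wh ~ {grams_vaporized(image):.2f} g water")
print(f"photoshop: {photoshop:.0f} Wh ~ {grams_vaporized(photoshop):.0f} g water")
print(f"hairdryer: {hairdryer:.0f} Wh ~ {grams_vaporized(hairdryer):.0f} g water")
```

Running it gives ~3.78 Wh / 6.02 g for the render, 600 Wh / ~956 g for the manual workflow, and 750 Wh / ~1195 g for the hairdryer, matching the numbers in the post to rounding.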

Yeah… revolutionary for making a few people and companies very rich while taking away jobs from others, stealing IP and violating copyright, hallucinating information, being very bad at relatively simple math, having a negative impact on education and learning, etc etc.

GenAI assist steals NOTHING. My Cobbe Giovatta never existed before I diffused it. It’s my derivative artwork, and it stole nothing from anybody. I also believe in open source, so when I make good homebrews, I share them with other DMs so they can use them in their campaigns.

That’s how creativity works. E.g. anybody can look at Warhol’s artwork and sell derivative work inspired by it, and it’s perfectly legal. Everything humans make is derivative; that dragon you made is a remix of millennia worth of dragon lore.

Using a GenAI assist tool to make Warhol cats makes no difference. More to the point, art has a long, and I mean LONG, history of bullying new tools. Literally every new tool has been bullied before being considered mainstream art. Here is my favourite quote from the 1800s about photography, a machine where you click and it makes a copyrighted image of something you do not own:

Charles Baudelaire wrote, in a review of the Salon of 1859: “If photography is allowed to supplement art in some of its functions, it will soon supplant or corrupt it altogether, thanks to the stupidity of the multitude which is its natural ally.”

“At the other extreme, there was outright denial and hostility. One outraged German newspaper thundered, “To fix fleeting images is not only impossible … it is a sacrilege … God has created man in his image and no human machine can capture the image of God. He would have to betray all his Eternal Principles to allow a Frenchman in Paris to unleash such a diabolical invention upon the world”[12]. Baudelaire described photography as “art’s most mortal enemy” and as “that upstart art form, the natural and pitifully literal medium of expression for a self-congratulatory, materialist bourgeois class” [13]. Other reputed doom-laden predictions were that photography signified “the end of art” (J.M.W. Turner); and that painting would become “dead” (Delaroche) or “obsolete” (Flaubert) [14].”

1 Like

Thanks for registering an account in order to display your disappointment. It’s appreciated.

Not sure I follow your point about the 4090 comparison though: one has a lot of computational power but limited VRAM, the other has lots of VRAM and limited computational power.

Perhaps these are different tools for different problems? You know, the same way that it’s entirely possible for a Ferrari and a Jeep to exist. I’d love a Ferrari … but I’m not sure I’d go rock crawling in one. Similarly, I’d not spend a lot of time racing the Nürburgring in my Jeep.

As for your question about PCIe lanes … of course, 16 total PCIe lanes are available with this processor, so it is entirely possible to have all 16 dedicated to a GPU … and with the space saved by removing the now-useless NVMe slots, they could probably fit the x16 slot on there as well. Might have a spot of trouble booting it up, though.

Ah well, never mind.

1 Like

LOL, with smug comments like this I understand why people dislike this brand.

Right now the Minisforum AI X1 Pro looks like a better purchase at less than half the price and with better modularity. At least you can get it to run games with an eGPU.

Let that sink in.

When I consider getting similar performance using the X1, it is almost the same price.

$1089 (X1 Pro with 96GB RAM) + $99 (eGPU dock) + $450 (RTX 4060) + $100 (power supply for eGPU)

This total solution still has fewer cores and lower memory speed than the AI Max+ 395.
After all, that is not an apples-to-apples comparison.
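A quick tally of the quoted build (all prices are the figures from the post above, in USD, not a current price list):

```python
# Rough tally of the eGPU-based X1 Pro build quoted above (USD).
# Prices are the poster's figures, not current retail prices.
parts = {
    "X1 Pro (96GB RAM)": 1089,
    "eGPU dock": 99,
    "RTX 4060": 450,
    "eGPU power supply": 100,
}
total = sum(parts.values())
print(f"total: ${total}")  # -> total: $1738
```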

1 Like

Defining and quantifying the meaningfulness of art by the number of watts it cost to produce is the kind of rhetoric a lot of us are railing against.

2 Likes

LOL, the disinformation. The Minisforum machines already have an iGPU, the 890M, which is more than capable of playing games, at less than half the price, and they still offer more connectivity, being able to use eGPUs.
But I mention Minisforum because it’s already out; you can be sure more mini PCs are coming at better pricing with the same practical results.

As I said, there is absolutely no reason to buy this.

The Minisforum AI ones are comparable by specs to the new 13" AMD laptop.

So it’s pretty much in a different league than the Desktop.

It’s ok if it’s not for you, but there are people who would prefer something like this compared to a Mac Studio, for example.

For others it might be the perfect dev + gaming + local AI machine. At least that is what I am going to be using it for.

3 Likes

I’ve had a few jobs recently where I didn’t really need a laptop, as I rarely needed to work on the go, but a small, portable, yet powerful desktop that I could move between offices would have been great. I see this filling that niche nicely.

1 Like

Which messages are considered disinformation?

Although the HX370’s built-in 890m graphics deliver decent performance, they are far from the performance level of an RTX 4060. Not to mention, they are nowhere near sufficient for someone who genuinely wants to run 3D games. You are deliberately misleading people by attempting to compare two entirely different things. This machine may not be suitable for you, but that is not a reason for you to spread misinformation.

Moreover, this VRAM actually has enough capacity to accommodate a 70B AI model, providing more room for development.

As mentioned earlier, your example is not an apples-to-apples comparison.

5 Likes

What is the point of trying to create clusters of these things using USB4 and 5G Ethernet? What a dumb joke! The minimum should be 2x 10G Ethernet, which is not nearly fast enough but is at least equal to Apple… Tenstorrent’s RISC-V-based systems with unified memory pools support 2x 100G Ethernet per board, just so you know. That’s useful. I wish companies would stop and think more about this stuff before jumping on a hype train, claiming to run a decent model like a 70B when the reality is 5 tokens per second at best, and DeepSeek R1 671B at 2 tokens per second, which is totally unusable. Telling people to cluster using USB as a pipe is silly.

USB4 is a lot faster than 5 Gbit or even 2x 10 Gbit.

When it is pretty much the fastest I/O the device has, it isn’t the silliest thing ever.
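For a rough sense of scale, here are the nominal link rates being compared (theoretical maxima, not measured throughput; the 40 Gbit/s figure assumes USB4 at its top speed):

```python
# Nominal link rates in Gbit/s (theoretical maxima, not real-world throughput).
links = {
    "5GbE": 5,
    "2x 10GbE (aggregate)": 20,
    "USB4 (40 Gbit/s mode)": 40,
}
for name, gbps in sorted(links.items(), key=lambda kv: kv[1]):
    print(f"{name:>24}: {gbps} Gbit/s")
```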

2 Likes

Where would one be able to acquire those computers, and what amount of money would they have to pay per functional CPU?

What’s the difference between straight and diagonal tiles?

The lines on the tiles are straight (from side to side) vs the lines on the tiles are diagonal (from corner to corner).

Next year, the company should refresh with the next-generation Intel Core Ultra processors.

This is an extremely… misleading or misinformed way of measuring generative AI energy consumption. The AI data centers that train these models need massive amounts of energy. They want to get to 6GW and beyond per data center. Do you know how much that is? New Delhi, in India, with 20+ million people plus industry, in summer (45C during the day), sees a peak demand of 8.5GW. What justifies a similar consumption of energy for mediocre images of orcs wearing fedoras? We do not currently have a surplus of decarbonized energy to justify any of this.

Not to mention the demand for chips, DRAM, NAND, etc., which is also very intensive in terms of energy, water, and raw materials. And who pays for this JPEG? The average person, who sees their bills go up, and all of us globally as we march ever closer to the climate disaster that is coming towards us.

2 Likes

I literally measured it on my computer…

Data centers are such a small share of emissions as to be a rounding error. It’s always a good thing to go for efficiency, but you need to go after the big things if you want to make a difference, and data centers are a small thing. You mentioned New Delhi. You know their agriculture is such that the skies cloud over when they burn biomass for fertilization? Tens of thousands die each year because of that. They also use most of the water. That is the big problem; mechanizing their agriculture will make a huge difference. The data centers won’t.

There is a pattern forming here. More automation increases efficiency! This is how our civilization went from everybody needing to work the field, to almost nobody working the fields.

AI training is a one-off thing; its energy use as a percentage will trend towards zero relative to inference as models are used, and that energy use is already tiny compared to the things actually driving emissions.

If you want to mitigate climate change, the biggest actions are:

  • Eat less meat. Each steak is worth thousands of image generations.
  • Use less AC.
  • Travel less, especially by plane, and use more public transport.

Reducing AI assist will actually increase your footprint, because AI-assisted workflows that work use 1/10 to 1/100 of the energy of pure legacy workflows.

If you ever paid for a game, you also paid for JPEGs. Those are called assets, and game developers are paid to make them. Almost all art is a commodity that is part of a product.

AI assist allows smaller teams to do bigger, better projects for less cost and fewer resources.

The USA is notoriously terrible at infrastructure. The USA has had cities run out of water because the private company handling it couldn’t do it profitably, so the company just stopped delivering water. No AI needed.

The USA doesn’t have worker protection or environmental regulations, so companies can do terrible things to local communities, like placing noisy Bitcoin farms near residents or siting water-intensive industries in water-starved regions. It’s just a misalignment of incentives; shareholders there are valued above citizens’ needs.

This is not a problem with technology. Technology cannot solve societal issues.

The solution is for US citizens to demand slightly stronger worker and consumer protections. Vote for leaders that will give you that. When someone wants to build a datacenter, there should be an environmental analysis so it can only be built if it’s sustainable and doesn’t harm the community. Same with all other industries.

Singling out one datacenter workload is missing the forest for the trees. When someone like Musk installs extra turbines with no permits, the problem isn’t technology. The problem is the fines being less than the cost of doing it properly. US citizens deserve better. Demand better from your elected leaders. Make those fines bite, enforce them, and the problem will go away.

Many towns in the US can’t get water due to AI data centers. Increasing AI will increase the footprint.