ETA for Intel processor

Hi,
I know the FW16 is new.
I would like to buy a FW16, but I dislike AMD processors; I prefer Intel processors.
Can anyone tell me if there is an ETA for a FW16 with an Intel processor?

Thanks.

1 Like

Next week

3 Likes

We don't even have an ETA for Batch 1 shipping, so we can't really speak to what future configurations will include. But I'm curious: why do you dislike AMD processors and prefer Intel? I know there are good and bad processors on each side, but I can't really see a reason to dislike one brand as a whole.

Why?

No ETA has been announced. However, Framework previously ran a poll that concluded that 83% of participants preferred AMD CPUs over Intel, so I doubt bringing in Intel CPUs will be a high priority.

3 Likes

Same tbh, but that's because I have never had an AMD processor before, and AMD's processor naming is not as intuitive as Intel's. However, I think the FW16 will let you upgrade to better processors in the long run, so why bother? Just try AMD for a while.

Because he lacks information. Only such people can (and are willing to) afford Intel processors.

No ETA. No announcements.

1 Like

Please check this post and you will understand why I prefer Intel over AMD. :wink:

Thanks for all your comments, and have a great Christmas time.
Peace!
Sebastien

That article is talking exclusively about desktop CPUs.

In laptop CPUs, AMD's superior efficiency places them ahead in battery life and performance. So the "Content Creation/Productivity" category should be at least a tie, if not a win for AMD.

Some of the criticisms in it are also straight-up bad. For example, they give Intel a bonus point because "Intel offers the most overclocking headroom." In other words, AMD's CPUs are better tuned from the factory to offer maximum performance, which IMO is a good thing, but they're penalizing AMD for it.

They also criticized AMD in two separate categories for requiring DDR5. However, Intel's new laptop chips also require DDR5, so in an up-to-date comparison of laptop chips, either both or neither should be criticized for that.

That used to be a bigger criticism when 16 GB of DDR5 cost around $80, but now I frequently see it on sale for under $40, which reduces the price advantage of supporting DDR4. Furthermore, DDR5 tends to make a bigger difference in laptops than in desktops, so that criticism carries more weight for desktop CPUs.

So that’s 4 categories were they penalized AMD for things that I don’t think are accurate. If you eliminate those 4 categories then the conclusion of that changes from Intel winning 7-5 to AMD winning 5-3.

This is just from a quick skim of the article on my phone. There are probably other issues that I missed.

15 Likes

I can see there being a preference if that's the only processor brand someone has used. In my case, I've only ever owned Intel CPUs. So as much as I would like an Intel CPU, there are definitely some benefits to AMD, particularly for laptops, that I'm interested in trying. I was considering an AMD system for my desktop a few years ago; that desktop isn't getting an upgrade, though, and the FW16 will be replacing both my current laptop and the desktop.

This will also be my first AMD GPU, so we'll see how that goes. I'd still prefer an Nvidia GPU, and I would definitely like to see an Intel option for that.

The single point that makes me want to go with Ryzen is simply the process tech (5/7 nm structures), where Intel is still playing at 10 nm.
That alone reduces power consumption drastically, enabling AMD to use 8 cores + SMT where Intel needs to drop to 6 cores + HT. Then again, Intel has started using performance cores and efficiency cores because they can't create a single core that can handle both…

Using an AMD Zen 2 CPU in my server, I can definitely say that overall performance more than doubled… and that server's idle power went down from 52 W to 27 W just by switching the CPU tech/motherboard.

Then again, one thing Intel has always been good at is marketing and FUD. Were it not for Intel, we'd have ECC on all devices by now.
And where that leads in the end, we can all see with Windows 11…

2 Likes

The naming of process nodes is terrible and inconsistent between brands.

What you’re referring to as Intel 10 nm is actually physically similar in size to TSMC 7 nm (slightly smaller than original TSMC 7 nm, slightly larger than the refined version of TSMC 7 nm).

This is why Intel rebranded their 10 nm process to Intel 7. IMO that rebranding was reasonable, although some people (either misinformed or wanting to push the idea that Intel is further behind than they are) insist on continuing to refer to it as 10 nm.

What is annoying is that Intel rebranded what they originally called 7 nm to Intel 4. That makes it sound better than it is: in terms of physical size, it actually sits between TSMC 6 nm and TSMC 5 nm.

So realistically, if Intel were actually trying to do what they said (follow a naming scheme that makes it easy to compare Intel process nodes against TSMC's), they should've renamed their 7 nm process to Intel 5.5 instead of Intel 4.

AMD currently uses TSMC 4 nm for their laptop CPUs and a mixture of 5 nm and 6 nm for their desktop CPUs (the latest and greatest processes always cost a premium, so AMD sacrifices a bit of efficiency on desktop CPUs to use a cheaper process: the cores are TSMC 5 nm while the rest of the CPU is 6 nm).

Intel recently announced their next generation of laptop CPUs. They are using Intel 4 (roughly a TSMC 5.5 nm equivalent, per the comparison above) for the cores, TSMC 5 nm for the iGPU, and TSMC 6 nm for the rest of the CPU. So they are definitely a bit behind AMD, who is using TSMC 4 nm for all parts of their laptop CPUs.

I’ve been impressed with how good AMD’s Zen cores seem to be at seamlessly being able to do both very low power/efficiency when under a light load (only a couple watts) and also boosting up to high performance when under heavier loads (15w+).

Oddly, though, they seem to have a bit of a bad zone between around 5 W and 10 W with the latest architectures. In most use cases, loads are either below or above that range, but in low-power devices like handheld game consoles or fanless devices, loads are often limited to around that level. That is why AMD released a refined version of Zen 2 that matches their other chips in sub-5 W efficiency while beating them in 5-10 W efficiency. It is used in the Steam Deck and in some other CPUs like the 7520U (which Intel recently compared to snake oil because it's based on Zen 2).

6 Likes

Yeah, that article makes me mad.

  • DDR5 RAM prices have been in freefall over 2023.
  • Nobody buying a high-end system will pair DDR4 RAM with an i9 anyway, so that point is bogus. Pairing Intel with DDR4 can often make it worse than AMD, and people who can afford a Core i9 or Ryzen 9 CPU can usually also afford the now $50 - $100 for DDR5 and enjoy its performance advantages.
  • Arrow Lake will also drop DDR4 support, so you'll still be stuck buying DDR5 RAM, just later, and you'll have to buy a new motherboard as well, while AMD will keep supporting AM5 until the end of 2025, if not further.
  • The writer glossed over Intel's insane power usage. The Ryzen 9 7950X3D often uses close to HALF the power of the Intel i9-13900K, and that difference is not negligible. Not to mention the higher power bill and the need for a higher-end cooler.

That article (and the i9-13900K review) is very misleading and honestly should have been taken down. I almost bought that CPU, but because my local Micro Center was out of i9-13900K CPUs but not Ryzen 9 7950X CPUs (this was before the X3D CPUs were even announced), I ended up building my current system around AM5 and that Ryzen CPU, and I feel like I dodged a serious bullet, even though I had to go with an Nvidia RTX 4090 for the GPU (Radeon 7000 wasn't out yet, and I had forgotten that Ryzen 7000 has a basic iGPU I could have used while I waited). The power usage of Intel's 13th and 14th gen is freaking insane.

Given that I use Linux now, I really wish I could have waited for the 7900 XTX, since I couldn't use Wayland until Plasma 5.27.6 came out (5.27 was unusable until 5.27.6 due to screen issues that only appeared on Wayland), but sometimes you just have to buy then and can't wait… (Same goes for the Ryzen 7950X3D.)

5 Likes

The only thing I see Intel having over Ryzen on laptops is Thunderbolt certification on all 4 ports. But functionally, if you can deal with just TB3-class connectivity on the rear 2 ports (USB4), the Ryzen will be fine.

4 Likes

I think for the vast majority of users, having only two USB4 ports instead of four is just fine.

The main advantage of Intel's USB4 implementation is that it supports all approaches to multi-monitor setups through docking stations.

There are basically three ways for a docking station to operate a multi-monitor setup:

  1. The computer can send a single high-bandwidth display signal over USB-C, which the dock can then split and share between multiple monitors using DisplayPort MST (not supported by Apple; used by the vast majority of docking stations; see the sketch after this list for one way to spot MST outputs on Linux).
  2. The computer can send multiple separate lower-bandwidth display signals over USB-C (not supported by AMD or the Apple M1/M2/M3, though the Apple Pro/Max/Ultra SKUs do work; used by some docks intended for Mac computers, like the CalDigit docks).
  3. Some docks have their own separate chip (DisplayLink or InstantView) that is essentially a small GPU handling output of the signal to the display (i.e., the computer's main GPU still does 98% of the work, but the GPU in the dock drives the displays). This allows those docks to overcome limitations in what the computer would otherwise support, but it can result in artifacts and software bugs (in my experience it works a lot better than many people claim, but it's not perfect).
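
For the curious, here's a minimal sketch of how you could check which of these you're getting on a Linux machine. It assumes the kernel exposes display connectors under /sys/class/drm and that MST branch outputs show up as extra connectors with an additional numeric suffix (e.g. card0-DP-1-1 hanging off the physical card0-DP-1); connector naming varies by driver, so treat this as a heuristic, not a guarantee.

```python
#!/usr/bin/env python3
"""List DRM display connectors and flag likely DisplayPort MST branches.

Heuristic sketch: on most Linux systems, connectors appear under
/sys/class/drm as card<N>-<TYPE>-<M>, and MST branch outputs gain an
extra numeric suffix (e.g. card0-DP-1-1). Naming can vary by driver.
"""
from pathlib import Path

def list_connectors() -> None:
    for conn in sorted(Path("/sys/class/drm").glob("card*-*")):
        status_file = conn / "status"
        if not status_file.is_file():
            continue  # not a connector entry
        status = status_file.read_text().strip()  # "connected"/"disconnected"
        # DP MST branch connectors carry an extra "-<n>" suffix, so a DP
        # connector name with more than three dash-separated parts is
        # likely a monitor hanging off an MST hub in a dock.
        parts = conn.name.split("-")
        mst_note = " (likely MST branch)" if "DP" in parts and len(parts) > 3 else ""
        print(f"{conn.name}: {status}{mst_note}")

if __name__ == "__main__":
    list_connectors()
```

If you plug in an MST dock on a machine that supports the first approach, each downstream monitor should show up as one of those suffixed DP connectors.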

So if you have an Intel computer, you can use any type of docking station, whereas if you have an AMD system, you must use the first or third approach to get multiple monitors working properly. This especially impacts users who want to easily switch their setup between a Mac and an AMD computer.

8 Likes

Yup, that too. I have a CalDigit TS4, so it'll affect me, but I have a USB 3.0 DisplayLink adapter that may help. However, the CalDigit TS4 is mostly used with my work PC (Intel-based), so I'm mostly fine. If I plug my FW16 in and only get one monitor out, it's not critical; it'll mostly be for gaming and personal use.

I can see it being an issue for other people, though, so they'll definitely need to pair up with a dock that has an MST hub or (worst option) DisplayLink.

1 Like

I also have one, but I don't really need more than one external monitor for my laptop at a time. (My desktop has two 4K 144 Hz monitors and an ultrawide 1440p 175 Hz QD-OLED Samsung G8; the latter is also used as a TV in addition to gaming, and it's only going to get even better on KDE Plasma 6 when HDR gets preliminary support on Wayland.)

Also, doesn’t AMD have better battery life with their U and HS series of CPUs? And aren’t they also more efficient, so less heat and fan noise than a comparable Intel CPU?

I did check it, and I still don't understand. Maybe you haven't set your priorities straight.
For laptops, the following topics are essential:

  • Power consumption and heat: winner AMD
  • Security: winner AMD
  • Lithography: winner AMD

Most other topics seem quite biased to me, so here's my perspective on them:

Pricing and Value: Intel amassed huge amounts of money through illegal marketing practices (see my other posts for details) over the decades before those practices were finally brought to light, so of course they now have the financial power to disrupt the market with dumping prices.

Gaming Performance: Again, Intel's dumping prices play far too big a role in this review, rather than raw performance. Also, integrated graphics performance is far more important for laptops, which makes AMD the winner.

Productivity and Content Creation Performance: Due to the larger overall market share Intel gained over the years through its unfair marketing practices, software vendors optimize their products for Intel processors, so AMD simply has no chance to win this category.

Processor Specifications and Features: Again weighted unfairly due to Intel's dumping prices. The reviewers also forgot to mention that Intel likes to introduce proprietary technologies to force AMD (and formerly Apple) to pay licensing fees. It started with CPU features like MMX, and Thunderbolt was meant to become the current cash cow, but unlike in the past, AMD learned from its mistakes and didn't adopt it until it became an open standard through USB4.

Drivers and Software: Anything beyond the basic functions only matters to gamers. Intel's iGPUs play no role here, as they're far too weak and therefore usually paired with Nvidia chips, and Nvidia is a wholly different company, outside the scope of this review.

3 Likes

Guys, don’t be rude.

If you feel someone is mistaken, you can state why without being rude about it. And this was Sebastien_Boulianne’s first post here. Not the best welcome.

5 Likes

Agreed and I’ve also seen many people get downvoted to oblivion on the FW subreddit whenever they voice their opinion or ask a question that isn’t in-line with most of this community members’ thoughts/beliefs or if people think it’s a “stupid” question.
The last thing I would like is for this community to be full of gatekeepers and just drive away any potential future FW community members, let’s keep this friendly and welcoming/inclusive!