Audio Card Upgrade

Gotcha. I’m a little disappointed the controller isn’t part of a module in the laptop. If it were, they could just make a higher-quality option in the future and it would be a simple upgrade. It should be possible, but maybe it just takes too much on the cost and R&D side to separate it from the motherboard?

Well, Bluetooth quality isn’t really down to any audio chip or anything in the computer. It’s possible to have software signal processing applied to the digital signal before it is transmitted over Bluetooth, but other than that, the audio quality is going to come down to the codecs and transmission quality of the digital signal via the Bluetooth radio, and the quality of the radio and DAC in the headphones.

The same goes for anything plugged into USB. Any audio sent to any device that is plugged into a USB port is sent as-is, in digital form. From there, it’s going to be down to whatever hardware and software is in whatever device is connected.

The built-in audio chip/controller on the laptop only applies to the built-in speakers and the built-in headphone jack (if there is one). When it comes to USB audio and Bluetooth audio, the source files are just sent, digitally, to whatever device is on the other end, and the audio conversion and processing is all done at the other end.
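If it helps, here’s a toy sketch (Python, purely illustrative; the mapping is the point, not the code) of where the digital-to-analog conversion actually happens for each output path:

```python
# Purely illustrative: where the digital-to-analog conversion happens
# for each common output path on a laptop.
conversion_point = {
    "built-in speakers":       "internal audio chip on the mainboard",
    "built-in headphone jack": "internal audio chip on the mainboard",
    "USB audio device":        "DAC inside the USB device",
    "Bluetooth headphones":    "DAC inside the headphones",
}

for path, where in conversion_point.items():
    print(f"{path}: converted by the {where}")
```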

That was extremely informative. I thought it had to be processed through the DAC no matter what connection type is used. It really won’t matter in my use case, then, because I prefer to use BT with my laptop anyway. All of my video editing and whatnot gets done on the desktop, so as long as I can enjoy media with the laptop, I’m good. I would still like to see better audio support in the future, but that’s just me being a nerd about it.

The DAC only exists at the point where the signal is converted from digital (what the computer outputs) to analog (what speakers/headphones require).

Bluetooth carries a digital signal, meaning the conversion to analog has to happen on the far side of the Bluetooth link. So the DAC is in the headphones.

USB-C can carry analog audio, but that optional part of the USB-C spec is mostly only found on phones. In this case, USB-C (the connection between the laptop and the expansion card) carries digital audio, so the DAC must be in the expansion card.

Personally, I don’t see a big benefit to increasing the built-in audio quality beyond “good enough,” because there are always going to be trade-offs. If you want a high-quality, super powerful headphone output on the laptop, that takes larger components, more power, better isolation from noise-generating electronics, more heat, etc. An external DAC/AMP will sound exactly as good plugged into the USB port of a computer with crappy internal audio as it would plugged into a laptop with world-class internal audio, because at that point the internal audio of the laptop is totally out of the equation. The audio is simply sent to the external DAC/AMP via USB, and all the processing, conversion, and amplification happens in the external device.

In my opinion, Bluetooth is always a compromise, but as long as you have a Bluetooth module that supports modern, high-quality codecs, the audio can be pretty good. But again, other than the Bluetooth module, the quality will come down to the quality of the processing, conversion, and amplification going on in the headphones.

I can get a pretty darn decent external DAC/AMP for about $150-$200 that can be used on any desktop or laptop, and even most tablets and phones. So I’d just as soon have a laptop with “good enough” built-in audio, and have something external if I ever want better audio.

Again, that’s just me.

First, I want to make it clear that I’m not trying to bash Framework in any way. If I have come across that way, I apologize. I love Framework and what they are doing here. The quality of the 13" is truly impressive, especially for a company as young as FW. I simply think there are areas with room for improvement. The display (which they are already improving) and audio are the only real areas I see for enhancement. They aren’t problems, just opportunities for improvement.

Now, regarding the Bluetooth option: Audio quality depends on both the Bluetooth codec and the device’s internal audio processing capabilities. While Bluetooth supports higher quality codecs, if the internal audio chip doesn’t support those codecs or higher bit rates, the overall audio quality will be limited. So even with a capable Bluetooth module, the internal audio limitations can restrict you to 16-bit, 48,000 Hz.

Next, let’s talk about the audio chip itself. Upgrading the audio chip is generally straightforward. It involves selecting a higher-quality chip that is supported by the CPU. In many cases, this can be done without significant changes to the motherboard’s design. The existing motherboard traces might already be isolated enough, so it could be as simple as replacing the current chip with a better one.

There is some necessary programming work to ensure that the new chip’s firmware is compatible with the BIOS and other components, but this is standard for any motherboard design. Additionally, while the new audio chip might be slightly larger, any adjustments required would be minimal, typically involving shifting components by millimeters. This wouldn’t be substantial enough to change the overall dimensions of the motherboard or its compatibility with current Framework models.

As for external DACs, while they are great, I don’t want to carry around a DAC and wired headset or spend $150-$200 just to achieve the sound quality that is standard on most other out-of-the-box devices, including phones and laptops. Most modern devices support at least studio-quality audio (24-bit, 48,000 Hz) without the need for additional equipment. It seems a bit excessive to spend extra money just to meet a quality standard that is already included or exceeded in almost every other device on the market.

I fail to see why that would be the case.

When audio is transmitted over Bluetooth, the computer sends the encoded digital audio data to the Bluetooth device, and the audio processing is handled on that device. Only the OS, the Bluetooth chip, and the Bluetooth device should have any influence (the format used is limited to the best one supported by all three).
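A minimal sketch of that negotiation (the codec names and preference order are illustrative; actual rankings vary by Bluetooth stack):

```python
# Illustrative only: the link ends up on the best codec that the OS,
# the Bluetooth chip, and the headphones all support.
PREFERENCE = ["LDAC", "aptX HD", "aptX", "AAC", "SBC"]  # roughly best to worst

def negotiate(os_codecs, chip_codecs, device_codecs):
    common = set(os_codecs) & set(chip_codecs) & set(device_codecs)
    for codec in PREFERENCE:
        if codec in common:
            return codec
    return None  # SBC is mandatory in A2DP, so in practice this won't happen

print(negotiate({"SBC", "AAC", "aptX"},      # what the OS supports
                {"SBC", "AAC", "aptX HD"},   # what the Bluetooth chip supports
                {"SBC", "aptX HD", "LDAC"})) # what the headphones support
# -> "SBC": the chip and headphones share aptX HD, but the OS lacks it,
#    so everything falls back to the one codec all three support.
```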

One of my former desktop motherboards, the ASUS ROG Strix B350-I Gaming, has the audio circuitry on a separate daughterboard (along with one of the two M.2 slots). At one point, while trying to diagnose crashing issues (prior to the board completely killing itself), I removed that daughterboard and ran the system without it for a week, and my Bluetooth headphones still functioned without issue. If Bluetooth audio depended on the normal audio circuitry, why did my headphones work with that circuitry removed completely?

Right, apologies this was a misunderstanding on my part.

However, audio quality over Bluetooth is still locked at 16-bit, 48,000 Hz. I have no idea why, but across 3 different Bluetooth devices and 3 different PCs, they are all limited to 16-bit over Bluetooth. Perhaps this is an issue with Windows on the Bluetooth side? The FW is using the AAC codec and should allow 24-bit, but for some reason it only allows 16-bit.

Bluetooth supporting better audio quality would be the preferred workaround until Framework updates the internal chip. DACs are good too; I just don’t have the money, and carrying one around would be annoying to me.

Back on the Bluetooth topic: I’m using the AX210, so there should be no reason the module is the issue. The headsets all support aptX or AAC at minimum, so 24-bit, 48,000 Hz or better should be an option, but everything only shows 16-bit. This is really leading me to believe that Windows itself doesn’t have the codec support for Bluetooth? I’m just confused at this point lol
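For anyone who wants to check what their machine reports, here’s a rough probe I threw together (assumes the Python sounddevice package, `pip install sounddevice`; note that shared-mode Windows audio can resample behind the scenes, so “accepted” means the host API takes the format, not necessarily that the hardware runs it natively):

```python
# Rough probe of what the default output device will accept via PortAudio.
import sounddevice as sd

for rate in (44100, 48000, 96000):
    for dtype in ("int16", "int32", "float32"):
        try:
            sd.check_output_settings(samplerate=rate, dtype=dtype)
            result = "accepted"
        except Exception as err:
            result = f"rejected ({err})"
        print(f"{rate} Hz / {dtype}: {result}")
```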

Update to the main topic

It turns out the driver package from Framework isn’t complete or something? It never installed the Realtek HD audio driver. After installing that from Realtek’s website (which was an utter pain to track down; their website is an absolute mess), I now have the option for 24-bit, 48000 Hz audio on the speakers and headphones.

However, Bluetooth is still limited to 16-bit. I’m pretty convinced this is an issue with Windows 11 at this point, so I’ll continue digging into that.

Also, the Realtek Audio Console no longer works after installing the HD audio driver, perhaps because the driver is for Windows 7 to Windows 10. My workaround for this is Dolby Access. I care more about the actual quality of the audio than fancy settings I never actually use anyway, and Dolby Access gives you a decent EQ and good spatial audio. Now that I know the Framework laptop supports better audio, I’ll spend some time this week digging deeper and seeing if I’m missing something. Any input from Framework themselves would be helpful here too, if possible.

(edit: Since I am running Windows 11, I’m not sure whether the Windows 10 driver package from Framework makes the audio work properly. I just know that both of the FW 13s I have only showed options for 16-bit audio - until now, that is.)

(edit 2: if you use SteelSeries Sonar with the driver from Realtek’s website, you can get 8-channel, 24-bit, 96,000 Hz)

Bluetooth is an ancient protocol. You get high-ish quality audio playback until you start voice chatting; then the quality drops to 90s dial-up bitrates, regardless of what OS you use. (Playback uses the A2DP profile, but as soon as the microphone is needed, the link switches to the hands-free profile, which is far lower quality.)

Right, I don’t use Bluetooth for voice chat though; I’m looking at playback only. I get 24-bit on my phone, so there’s no reason it should be stuck at 16-bit on the computer, especially when I confirmed the same codec is being used.

@Shenanaguy

For listening, 16-bit is enough. I seriously doubt there is any music/audio mastered with anywhere near enough dynamic range to exceed the capabilities of 16-bit. And if there were, you’d either have to set your volume to literally ear-damaging levels to hope to hear the quieter parts, or turn it down and have no chance of hearing the quieter parts.

24-bit (or 32-bit in some specific cases) can be useful for recording/mixing/mastering, because you have more headroom to manipulate the levels of various parts without running into a noise floor. But once the audio is finalized, nobody would master it such that anything you actually want the listener to hear is more than 80 dB quieter than the loudest parts of the audio.
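The back-of-envelope math, for anyone who wants to check it (a rough sketch; “dynamic range” here means the ratio between the largest representable sample and one quantization step, roughly 6 dB per bit):

```python
import math

def dynamic_range_db(bits: int) -> float:
    # An N-bit sample can represent 2**N levels; the ratio between the
    # largest value and the smallest step is 20*log10(2**N) dB.
    return 20 * math.log10(2 ** bits)

print(f"16-bit: ~{dynamic_range_db(16):.1f} dB")  # ~96.3 dB
print(f"24-bit: ~{dynamic_range_db(24):.1f} dB")  # ~144.5 dB
```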

CDs sound very good. They carry uncompressed digital audio at 16-bit, 44.1 kHz.

Bluetooth is a low bandwidth connection, so the audio is sent in a compressed format. The codec used for this compression can have an impact on sound quality. 16 vs. 24 bit depth will not. In fact, since the bandwidth is limited, the extra data required for 24 bit could increase the required level of compression and, therefore, could potentially reduce audio quality. That’s hypothetical because I don’t know the details of the allowable bandwidth. But I certainly wouldn’t want to force higher compression by insisting on having extra, needless bit depth.

Remember, bit depth is not the same as bit rate. So while an MP3 at a bit “rate” of 96 kbps will likely sound worse than one at 256 kbps, the bit depth only affects the dynamic range. It doesn’t directly affect sound quality. However, it DOES affect the bit rate. A higher bit depth will necessitate a higher bit rate in order to maintain the same audio quality. And if the audio doesn’t have a dynamic range that extends outside the range of 16 bit depth, then it’s basically empty data.
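To make the bit depth vs. bit rate relationship concrete, here’s the plain PCM arithmetic (nothing codec-specific, just samples times bits times channels):

```python
def pcm_bitrate_kbps(sample_rate_hz: int, bit_depth: int, channels: int = 2) -> float:
    # Uncompressed PCM: samples per second * bits per sample * channels.
    return sample_rate_hz * bit_depth * channels / 1000

print(pcm_bitrate_kbps(44100, 16))  # 1411.2 kbps -- CD audio
print(pcm_bitrate_kbps(48000, 16))  # 1536.0 kbps
print(pcm_bitrate_kbps(48000, 24))  # 2304.0 kbps -- 50% more bits, same content
```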

Let’s look at a hypothetical scenario. Imagine you are dealing with a wireless transmission with a maximum data rate of 500 kbps. You play a 16 bit file that has a bit rate of 1000 kbps. It will require 2:1 compression in order to be transmitted across that wireless connection. Now you want to transmit a 24 bit file. But that 24 bit file has a bit rate of 1500 kbps in order to have the same audio quality as the 16 bit file. Since you are limited to 500 kbps of bandwidth, you now require 3:1 compression. So your audio quality actually dropped.
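The same scenario worked through in code (all the numbers are made up for the example, as above):

```python
link_kbps = 500  # hypothetical wireless bandwidth
for bits, source_kbps in ((16, 1000), (24, 1500)):
    ratio = source_kbps / link_kbps
    print(f"{bits}-bit source at {source_kbps} kbps -> {ratio:.0f}:1 compression")
# 16-bit needs 2:1, 24-bit needs 3:1 -- the extra bit depth forces heavier
# compression through the same pipe.
```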

I never thought you were attacking Framework. Even if you were criticizing them, there’s nothing wrong with voicing your opinion on areas where you’d like to see improvement. I’m only trying to help you focus your time and energy on things that might actually affect your audio quality, rather than getting hung up on things that won’t make a difference on the listening end. If Framework were going to install a better audio chip, I wouldn’t worry about ensuring it could output greater than 16-bit, 48 kHz. I would be more concerned with things like the noise floor/dynamic range of the amp, as well as the output power and frequency response. And when listening via Bluetooth, literally none of that matters, because the internal audio chip has nothing whatsoever to do with Bluetooth audio quality; it only affects wired headphone quality.

Rather than increasing the price of every mainboard for people who aren’t concerned with audio quality, they have released an audio expansion card, which has a better quality headphone out. That gives you better wired headphone quality, without increasing the cost of every machine. It’s a win-win, in my opinion.

Cheers!

For anyone who wants more information on why 16-bit / 48 kHz is enough for lossless audio over the entire range of human hearing: instead of trusting random people on this forum, I’d recommend these three sources:

  1. https://www.youtube.com/watch?v=FG9jemV1T7I
  2. https://people.xiph.org/~xiphmont/demo/neil-young.html
  3. https://www.youtube.com/watch?v=cIQ9IXSUzuM

I’ve acknowledged several times that 16-bit audio is good enough for most people. However, I personally notice a difference between 16-bit and 24-bit audio. Whether it’s due to my background in audio and computers, my video editing work, or the quality of my speakers and headphones, 24-bit sounds clearer and fuller to me. Maybe I’m crazy. I know some people say you can’t hear a difference, and some say you can. Well, I can.

I never said 16-bit wasn’t sufficient; I simply prefer 24-bit for its marginally better quality. This preference might be subjective, but it matters to me. I understand that most audio is mastered at 16-bit, and it’s fine for general use. However, for those who are particular about audio quality, the difference is noticeable.

Regarding the technical side, I’ve resolved my initial issue by downloading drivers directly from Realtek for the AMD boards. So, this thread’s main concern is no longer relevant.

(Also, just to mention, 24-bit audio has a lower noise floor, which means less background noise and more detail in quiet passages. While proper isolation might mitigate some issues, from a purely technical perspective, 24-bit is generally better and there’s no reason not to use it over 16-bit if given the option.)

Thanks for the input.
