
DaVinci Resolve Studio 20.2 on AMD Radeon 8060S (gfx1151 / Strix Halo) — My “Working” Setup on Linux

Platform: Framework Desktop (AMD Ryzen AI MAX+ 395 w/ Radeon 8060S)
GPU Architecture: gfx1151 (RDNA 3.5 / Strix Halo iGPU)
Host OS: Debian 13 Trixie (kernel 6.18)
Container: Distrobox + Podman (Arch Linux image)
Resolve version: DaVinci Resolve Studio 20.2.0.0013
Status:
:white_check_mark: GPU rendering, OpenCL, audio, project persistence
:double_exclamation_mark: LibProResRAW does not work


Background

I haven’t found much information online about getting DaVinci Resolve Studio to work with full GPU acceleration on gfx1151 (Strix Halo). As I found out after a few days of tinkering (and with the help of Claude), it requires solving several stacked issues.

If anyone wants to replicate this, I’ve documented everything I did below. I hope this helps.

The main blocker is a ROCm HSA runtime bug specific to gfx1151 that causes an immediate segfault on startup. On top of that, the container setup requires careful device passthrough configuration, and a CUDA-only library (libProResRAW.so) also needs to be stubbed out.


DISCLAIMER

Be aware that:

  • I am very new to Linux, so I may have made critical mistakes that I did not notice (please tell me if you find any);
  • I also don’t have much experience troubleshooting these kinds of problems, so I used Claude to help (which may have ignored other problems, or led me in the wrong direction).

So please (especially if you are, like me, new to all this) treat this cautiously.


Architecture Overview

I run Resolve inside a Distrobox container (Arch Linux) on the Debian host. This avoids dependency conflicts, makes the setup reproducible, and makes the whole thing easy to modify and experiment with. I really recommend this approach.

/data/media and /data/projects are directories I mount from the host so that media and projects persist outside the container. Feel free to replace them with whatever suits your needs.


Step 1 — Create the Container with GPU Passthrough

Careful (this fooled me a few times :sweat_smile:): the container must be created with explicit device flags. Without --device, the GPU is visible in /dev but not actually accessible to ROCm.

distrobox create \
  --name resolve-arch \
  --image docker.io/library/archlinux:latest \
  --additional-flags "--device /dev/kfd --device /dev/dri/card0 --device /dev/dri/renderD128 --group-add video --group-add render" \
  --volume /data/media:/data/media \
  --volume /data/projects:/data/projects

Note: If you already have a container without these flags, commit it first (podman commit resolve-arch localhost/resolve-arch-backup:latest), then recreate it with the command above, pointing --image at localhost/resolve-arch-backup:latest.
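Before going further, it’s worth verifying the passthrough from inside the container. A small Python sanity check (this helper is my own invention, not part of Distrobox or ROCm; the device list mirrors the flags used above, but node numbering can differ on multi-GPU systems):

```python
import os

# Device nodes ROCm needs, matching the --device flags above.
REQUIRED_DEVICES = ["/dev/kfd", "/dev/dri/card0", "/dev/dri/renderD128"]

def missing_devices(paths=REQUIRED_DEVICES):
    """Return the device paths that are not visible inside the container."""
    return [p for p in paths if not os.path.exists(p)]

if __name__ == "__main__":
    missing = missing_devices()
    if missing:
        print("Missing devices (recreate the container with --device):", missing)
    else:
        print("All GPU device nodes visible")
```

If anything is listed as missing, no amount of ROCm patching later will help; fix the container first.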


Step 2 — Install DaVinci Resolve and ROCm inside the Container

distrobox enter resolve-arch

# Install ROCm (opencl-amd provides libhsa-runtime64 + OpenCL)
# Use the AUR or install opencl-amd from unofficial repos
# Example via yay:
yay -S opencl-amd

# Install Resolve (run the official .run installer)
sudo bash DaVinci_Resolve_Studio_20.2_Linux.run


Step 3 — Fix the HSA Crash (gfx1151 VGPR bug) :warning: Critical

ROCm 7.1.x (and earlier) has a bug in libhsa-runtime64 that reports an incorrect VGPR count for gfx1151, resulting in an immediate SIGSEGV in any HSA-enabled application, including Resolve. The fix landed in ROCm nightly builds (TheRock) in December 2025 and will be part of ROCm 7.2+.

You can identify this crash with:

coredumpctl info | grep -A5 "Stack trace"
# You'll see: libhsa-runtime64.so.1.18.xxxxx

Fix: extract the patched libhsa-runtime64.so from TheRock nightly wheels

# Install pip if needed
sudo pacman -S python-pip

# Download the nightly wheel (check for newer versions at https://rocm.nightlies.amd.com/v2/gfx1151/)
pip download --no-deps -d /tmp/therock \
  --index-url https://rocm.nightlies.amd.com/v2/gfx1151/ \
  "rocm-sdk-core==7.11.0a20260101"

# Extract
cd /tmp/therock
unzip -q rocm_sdk_core-*.whl

# Backup and replace the broken library. The versioned filename
# (libhsa-runtime64.so.1.18.<build>) varies between ROCm builds, so resolve
# the glob into a variable instead of using it as a cp target.
HSA_LIB=$(ls /opt/rocm/lib/libhsa-runtime64.so.1.18.* | grep -v '\.bak$' | head -1)
sudo cp "$HSA_LIB" "$HSA_LIB.bak"
sudo cp /tmp/therock/_rocm_sdk_core/lib/libhsa-runtime64.so.1 "$HSA_LIB"

# Also copy the missing sysdeps libraries (required by rocminfo and HSA)
sudo cp /tmp/therock/_rocm_sdk_core/lib/rocm_sysdeps/lib/librocm_sysdeps_*.so* /opt/rocm/lib/
sudo ldconfig

Verify ROCm now sees the GPU:

/opt/rocm/bin/rocminfo | grep -E "Agent|Name|gfx|Device Type"
# Should show: Agent 2 — gfx1151 — GPU
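If you want to script this check (e.g. in a health check before launching Resolve), here is a minimal Python sketch that pulls the gfx target names out of rocminfo output. The regex and function name are mine, not part of any ROCm tool:

```python
import re
import subprocess

def gfx_targets(rocminfo_text):
    """Return the unique gfx target names (e.g. 'gfx1151') found in rocminfo output."""
    return sorted(set(re.findall(r"\bgfx\d+[0-9a-z]*\b", rocminfo_text)))

if __name__ == "__main__":
    try:
        out = subprocess.run(["/opt/rocm/bin/rocminfo"],
                             capture_output=True, text=True).stdout
        print("GPU targets:", gfx_targets(out) or "none (HSA still broken?)")
    except FileNotFoundError:
        print("rocminfo not found (is ROCm installed?)")
```

An empty list after the Step 3 fix means the HSA runtime still doesn't see the GPU; recheck the library replacement.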


Step 4 — Stub out libProResRAW.so (CUDA crash)

libProResRAW.so contains CUDA code that crashes on AMD GPUs. Since ProRes RAW is a format I won’t use anyway, and Resolve loads this library at startup, it needs to be replaced with a stub that exports all the same symbols but does nothing.

# Extract the names of all exported (text-section) functions
nm -D /opt/resolve/libs/libProResRAW.so | grep " T " | awk '{print $3}' > /tmp/proresraw_symbols.txt

# Generate stub C file (returning NULL for all functions)
python3 -c "
with open('/tmp/proresraw_symbols.txt') as f:
    syms = f.read().splitlines()
with open('/tmp/stub.c', 'w') as out:
    out.write('#include <stdlib.h>\n#include <stdint.h>\n\n')
    for s in syms:
        out.write(f'void* {s}() {{ return NULL; }}\n')
"

# Compile
gcc -shared -fPIC -o /tmp/libProResRAW_stub.so /tmp/stub.c \
    -Wno-implicit-function-declaration

# Replace
sudo mv /opt/resolve/libs/libProResRAW.so /opt/resolve/libs/libProResRAW.so.orig
sudo cp /tmp/libProResRAW_stub.so /opt/resolve/libs/libProResRAW.so
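For the record, the two steps above (parsing the nm output and generating the stub source) can be combined into one self-contained Python sketch. The nm line in the comment is illustrative, not copied from the real library:

```python
def exported_functions(nm_output):
    """Parse `nm -D` output and keep text-section (T) symbol names,
    mirroring the grep/awk pipeline above."""
    names = []
    for line in nm_output.splitlines():
        parts = line.split()
        # A typical line looks like: "0000000000001139 T SomeExportedFunction"
        if len(parts) == 3 and parts[1] == "T":
            names.append(parts[2])
    return names

def stub_source(symbols):
    """Generate C source where every exported function just returns NULL."""
    lines = ["#include <stdlib.h>", "#include <stdint.h>", ""]
    lines += [f"void* {s}() {{ return NULL; }}" for s in symbols]
    return "\n".join(lines) + "\n"
```

Filtering on the "T" column means weak (W) and undefined (U) symbols are skipped, which matches what the grep/awk pipeline does.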


Step 5 — Clear the GPU detection cache

Resolve caches GPU detection results in a binary file. If it was generated before the ROCm fix, it will contain stale/invalid data and cause a silent crash. Delete it so it regenerates on next launch:

rm -f ~/.local/share/DaVinciResolve/logs/gpudetect.bin


Step 6 — Fix Audio (ALSA → PipeWire bridge)

Resolve uses ALSA by default. On modern systems running PipeWire, you need the ALSA compatibility layer:

sudo pacman -S pipewire-alsa

Restart Resolve — audio will work automatically through PipeWire with ALSA output selected in Preferences → Video and Audio I/O.


Step 7 — Launch Command

rm -f /tmp/qtsingleapp-DaVinc*
DISPLAY=:1 XAUTHORITY=/run/user/1000/xauth_xxxxxx QT_QPA_PLATFORM=xcb /opt/resolve/bin/resolve

Replace xauth_xxxxxx with the actual file in /run/user/1000/xauth_* on your system (it changes per session).


Step 8 — Desktop Launcher (KDE / any DE)

Create the inner script inside the container in a persistent location (not /tmp, which is cleared on container restart):

distrobox enter resolve-arch -- bash -c "
mkdir -p ~/.local/bin
echo '#!/bin/bash' > ~/.local/bin/resolve-start.sh
echo 'pkill -f /opt/resolve/bin/resolve 2>/dev/null' >> ~/.local/bin/resolve-start.sh
echo 'sleep 0.5' >> ~/.local/bin/resolve-start.sh
echo 'rm -f /tmp/qtsingleapp-DaVinc*' >> ~/.local/bin/resolve-start.sh
echo 'XAUTH=\$(ls /run/user/1000/xauth_* 2>/dev/null | head -1)' >> ~/.local/bin/resolve-start.sh
echo 'DISPLAY=:1 XAUTHORITY=\$XAUTH QT_QPA_PLATFORM=xcb /opt/resolve/bin/resolve' >> ~/.local/bin/resolve-start.sh
chmod +x ~/.local/bin/resolve-start.sh
"

Create the launcher script on the host:

# /home/youruser/.local/bin/resolve-launch.sh
#!/bin/bash
exec > /tmp/resolve-launch.log 2>&1
echo "=== Start $(date) ==="

# Wake up the container
distrobox enter resolve-arch -- true 2>/dev/null
echo "Container ready"

# Run Resolve via the persistent inner script
distrobox enter resolve-arch -- ~/.local/bin/resolve-start.sh
echo "=== End ==="
Make it executable:

chmod +x ~/.local/bin/resolve-launch.sh

Note: The XAUTHORITY file (/run/user/1000/xauth_*) changes name at every login. The inner script detects it dynamically so this is handled automatically.

Create the .desktop file:

# ~/.local/share/applications/davinci-resolve.desktop
[Desktop Entry]
Version=1.0
Type=Application
Name=DaVinci Resolve Studio
GenericName=Video Editor
Comment=Professional video editing and color grading
Exec=/home/youruser/.local/bin/resolve-launch.sh
Icon=/home/youruser/.local/share/icons/DV_Resolve.png
Terminal=false
Categories=AudioVideo;Video;AudioVideoEditing;
StartupWMClass=resolve
StartupNotify=true
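KDE silently ignores malformed .desktop files, so a quick validation pass can save some head-scratching. A minimal sketch (my own helpers; it only checks the keys a launcher strictly needs, whereas `desktop-file-validate` from desktop-file-utils is the proper tool):

```python
REQUIRED_KEYS = {"Type", "Name", "Exec"}  # minimum for a launchable entry

def desktop_entry(text):
    """Return key/value pairs from the [Desktop Entry] section."""
    entries, in_section = {}, False
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("["):
            in_section = (line == "[Desktop Entry]")
        elif in_section and "=" in line and not line.startswith("#"):
            key, _, value = line.partition("=")
            entries[key.strip()] = value.strip()
    return entries

def missing_keys(text):
    """Return required keys absent from the [Desktop Entry] section."""
    return sorted(REQUIRED_KEYS - desktop_entry(text).keys())
```

Run `missing_keys(open(path).read())` on the file above; an empty list means the entry is at least launchable.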

Copy the icon from the container:

mkdir -p ~/.local/share/icons
distrobox enter resolve-arch -- cp /opt/resolve/graphics/DV_Resolve.png \
  ~/.local/share/icons/DV_Resolve.png
update-desktop-database ~/.local/share/applications/

Save Your Work

Once everything is working, snapshot the container so you never have to repeat this process:

podman commit resolve-arch localhost/resolve-arch-working:latest


What Works / What Doesn’t

| Feature | Status |
| --- | --- |
| GPU rendering (OpenCL) | :white_check_mark: Full acceleration |
| Color grading / Fusion | :white_check_mark: Working |
| H.264/H.265 decode (Nikon MOV, etc.) | :white_check_mark: Working |
| Audio playback | :white_check_mark: Working (PipeWire via ALSA) |
| Project save/reopen | :white_check_mark: Working |
| Media Pool thumbnails | :white_check_mark: Working |
| ProRes RAW decode | :cross_mark: Stubbed (CUDA-only, AMD unsupported) |
| CUDA-based plugins | :cross_mark: Not applicable on AMD |

Root Cause Summary

| Problem | Cause | Fix |
| --- | --- | --- |
| Segfault on startup | HSA VGPR bug in ROCm ≤7.1 for gfx1151 | Replace libhsa-runtime64 with TheRock nightly |
| librocm_sysdeps_* not found | Newer ROCm splits these into separate libs | Copy from TheRock wheel |
| GPU not accessible in container | Distrobox default: no device passthrough | Recreate container with --device flags |
| Segfault after HSA fix | libProResRAW.so CUDA code crashes on AMD | Stub the library |
| Silent crash / no window | Stale gpudetect.bin cache | Delete it |
| No audio | ALSA not bridged to PipeWire | Install pipewire-alsa |

References and other useful links


Incredible work. This helped me get things working again on a similar system (gfx1150 instead of gfx1151). I can only hope DaVinci on Linux with AMD gets more stable over time so these kinds of incredible hacks are no longer necessary. Stubbing out entire libraries is not ideal! (I did not need the HSA fix on my system.)


Very glad to hear it!
Also nice that my approach works on something other than gfx1151; I wasn’t expecting it to apply more broadly :sweat_smile:.

Regarding the libraries: yeah, I hope I don’t suddenly need that particular one :rofl:.

Also, yes, the HSA crash is specific to Strix Halo.

Again, very glad to see that the time spent on this is useful to someone else!