That seems odd, unless the baud is a lot more bits/sec than I think it is.
header + (288 pixels * 32 bits per pixel) = ~9216 bits per frame
At 60 fps: ~552,960 bits/sec, plus header overhead
If 115200 baud == 1 bit/signal: 115,200 bit/s, not going to work.
If 115200 baud == 4 bits/signal: 460,800 bit/s, not going to work.
If 115200 baud == 8 bits/signal: 921,600 bit/s, OK.
I was about to ask. One of my two modules is glitching like this. I’ll upload a video in a bit. I’m going to swap them around and try some things to confirm if it’s the one module or something else.
EDIT: Here is a video of it happening on mine:
The glitching gets much worse as the brightness is reduced. The right module appears fine. Note that this happens with any color effect, not just the one I’m demonstrating.
Ok, great. I’d think then it’s connected at Full Speed regardless of the baud rate set. It’s just that because it appears as a serial device to the OS, a baud rate makes sense to the applications trying to access it.
Using usbtop, I see about 14.4 kb/s at 10 fps, though I’m not sure whether that’s kbit or kbyte. kbyte/sec makes more sense, since the frame math above works out to roughly 11,500 byte/sec at 10 fps, and the remainder would be protocol overhead.
Ah, I see with the Experimental build in usbtop it’s looking a lot better. Before, the host seemed to be doing significant receiving as well as sending; now it’s roughly 48-49 kb/s to the device and 3.2-3.4 kb/s from the device (USB protocol acknowledgements).
So fundamentally the baud rate for the serial link is immaterial, since the actual under-the-hood implementation in the RP2040 hardware is USB Full Speed at 12 Mbit/s. There may still be a software layer on top of that at both ends that requires the value to match in order to function properly, though. Because 115200 baud, at 1 bit per signal, is 115 kbit/s, or about 14.1 KByte/sec, which isn’t enough bandwidth to handle 60 fps, which is what OpenRGB runs at by default.
Might still be nice to track down why the issue occurs with 0.9 vs Experimental. Might be something that’s masked in Experimental, and not fundamentally solved.
Hm. The only way I see to turn off the LEDs is to change the brightness to 0, since the LEDs just stay at the last settings they had when I turn off the effect and/or close OpenRGB, since it doesn’t send a blank signal.
And perhaps nitpicking, but do you have a MOSFET or something to fully cut power to the LEDs, to prevent the quiescent current draw? I know it’s really small, on the order of ~1 mA/pixel at least for full-size 5050 WS2812s; with 288 pixels that’s ~288 mA, which at 5 V is ~1.44 W, or about 1.7% of the battery capacity per hour.
I think I’ll have to do a real-world test: run something continuously and check battery usage with and without the module plugged in, even when nothing is running. Mostly for my own curiosity about whether, in the real world, it’s enough power to be aware of if I want to be unplugged for extended amounts of time.
The LEDs will power off after 60 seconds if no data is received.
// If no data received for an extended time, turn off all LEDs.
if (SERIAL_TIMEOUT != 0 && (t - lastByteTime) >= (uint32_t) SERIAL_TIMEOUT * 1000) {
    for (int i = 0; i < NUM_STRIPS; i++) {
        strips[i].clear();
        strips[i].show();
    }
    mode = Header;
    lastByteTime = t; // Reset counter
}
I don’t have MOSFETs to fully disable the LEDs, unfortunately. But, if you’re using the RGB LED modules, I figured power consumption wasn’t really that big of an issue. Although, I do acknowledge that I should have included MOSFETs.
One thing to check is to make sure there isn’t an OpenRGB service still running in the background sending black color data to the modules. Although, that would still use less power than having the LEDs lit.