The SPI decoder currently fails on input data whose samplerate is 0. This can happen e.g. when importing from CSV files without explicitly setting an assumed/fake samplerate during the import. Example file and discussion: https://forum.digilentinc.com/topic/4343-openscope-discovery-question/?tab=comments#comment-26156 With a samplerate set, the decoder shows all bytes correctly; with samplerate 0 it shows only the first byte and then aborts. On the command line you get:

srd: ZeroDivisionError: Protocol decoder instance spi-1: : float division by zero
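For illustration, here is a minimal sketch of the kind of guard such a decoder needs, assuming it computes a bitrate from a sample span (the function and parameter names here are hypothetical, not the actual SPI decoder code): when no samplerate is known, skip the time-based calculation instead of dividing by zero.

```python
def bitrate(num_bits, start_sample, end_sample, samplerate):
    """Bits per second for a span of samples, or None without a samplerate."""
    if not samplerate:
        # samplerate == 0 (e.g. CSV import without an assumed rate):
        # elapsed time is unknown, so report no bitrate instead of dividing.
        return None
    elapsed = (end_sample - start_sample) / samplerate  # seconds
    return num_bits / elapsed
```

With this guard the decoder can still emit all byte annotations and merely omit the bitrate annotation when the samplerate is unavailable.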
Created attachment 386 [details] Example file Adding the file from the original forum post so we can reproduce more easily. https://forum.digilentinc.com/applications/core/interface/file/attachment.php?id=4650
This was fixed in 956721de58552b05776c8613449f2907196e61e9, thanks!