I have been trying to receive a 62420-byte packet over a USB virtual serial port (an STM32 microcontroller in CDC mode). As such, data is transmitted at the maximum USB Full Speed rate, around 1 MB/s. The entire packet is sent via DMA in one go, once per second. I wrote a trivial program to test this and found that some bytes are randomly dropped. The core of the test app is:

```c
uint8_t buffer[65536];
for (;;) {
    int bytes_read = sp_blocking_read(port, buffer, sizeof(buffer), 500);
    if (bytes_read > 0)
        printf("Read %d\n", bytes_read);
}
```

I would expect this to read one entire packet. The output is as follows:

```
Read 60971
Read 60968
Read 61005
Read 60977
Read 60859
Read 60917
Read 60872
Read 60913
Read 60895
Read 60859
```

There are some potential timing issues here with the 500 ms timeout, but I have verified with 250 ms and 100 ms timeouts that in all cases at least some data is lost.

This only seems to affect Linux, which I am running on a Raspberry Pi. On Windows the same code works perfectly and I get:

```
Read 62420
Read 62420
Read 62420
```

So it appears that on Linux the OS is not buffering as much of the data. I tried using non-blocking reads instead:

```c
for (;;) {
    int bytes_read = sp_nonblocking_read(port, buffer, sizeof(buffer));
    if (bytes_read > 0)
        printf("Read %d\n", bytes_read);
}
```

The output was as follows:

```
...
Read 128
Read 127
Read 128
Read 128
Read 128
Read 84
```

It appears that occasionally bytes are dropped from whatever 128-byte buffer this is; probably one in the OS, as I can't find any reference to that size in libserialport. I'm not sure what the best resolution would be. Can the buffer be increased? 128 bytes is rather low for USB serial ports that aren't emulating the baud rate.
Additionally, I tested reading from the device directly with `read()` and did not drop any bytes: exactly 62420 bytes per second came through.