Hello,
I am using a TX2 with the libserial-dev package to communicate with a microcontroller over a serial connection. When I process the incoming data on the TX2, I notice that the most significant bit is sometimes set when it should be clear. I verified the output of the microcontroller with an oscilloscope, and the waveform matches the intended signal, so the problem only shows up once I read the data on the TX2.
For example,
0b0000 0000 (0) is sent from the microcontroller to the TX2.
The oscilloscope shows 0b 0 0000 0000 1, where the leading 0 and trailing 1 are the start and stop bits, respectively.
When I print the data after doing a serial read on the TX2, I get 0b1000 0000 (128).
At first I thought it might be a noisy signal, but the data looked clean on the scope, so I'm not sure what else could be causing this. Any ideas or suggestions on what the issue could be?
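In case my port configuration is part of the problem, here is roughly how I open and read the port. This is a simplified sketch, not my exact code: the device path and baud rate are placeholders, and the header/enum names follow the newer LibSerial 1.0-style API, which may differ from the libserial version installed on the TX2.

#include <libserial/SerialStream.h>   // header path may differ on older libserial versions
#include <cstdio>

int main()
{
    LibSerial::SerialStream serial;
    serial.Open("/dev/ttyTHS2");                                      // placeholder device node
    serial.SetBaudRate(LibSerial::BaudRate::BAUD_115200);             // placeholder baud rate
    serial.SetCharacterSize(LibSerial::CharacterSize::CHAR_SIZE_8);   // 8 data bits
    serial.SetParity(LibSerial::Parity::PARITY_NONE);                 // no parity
    serial.SetStopBits(LibSerial::StopBits::STOP_BITS_1);             // 1 stop bit
    serial.SetFlowControl(LibSerial::FlowControl::FLOW_CONTROL_NONE); // no flow control

    char byte = 0;
    while (serial.get(byte))  // SerialStream is a std::iostream, so get() reads one byte
    {
        std::printf("0x%02X\n", static_cast<unsigned char>(byte));
    }
    return 0;
}

The port is set up for 8N1 with no flow control, which matches the frame I see on the scope (start bit, 8 data bits, stop bit).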
Thanks in advance,
Dario