wifi_streaming example returns half-corrupted images
Posted: Thu Jan 14, 2021 2:03 pm
Hi!
I am working on the AIdeck and I am trying to make the wifi streaming example work.
I am using the latest Bitcraze commit and the latest gap_sdk, version 3.8.1.
My problem is that when I run the viewer.py script to visualize the streamed images, the bottom part of the image is either out of sync or black.
I debugged the GreenWaves streamer code thoroughly, but I didn't find the problem.
On the AIdeck side: after the camera acquisition I dump the raw image, and it looks correct, so the camera works.
If I run the test camera program (which writes the image to the PC via the debug interface) I always get the correct image. This again proves the camera is working.
I think the problem is either in the JPEG encoder, in the transmission, or a buffer-size issue on the receiving side.
In the viewer.py file I tried many buffer sizes (relative to the original value of 512). I went down to 100 bytes and up to 10000 bytes, but the same problem occurs.
Does anyone have the same problem that I do?
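One thing I am wondering about on the receiving side: recv() on a TCP socket returns arbitrary chunks regardless of the buffer size, so maybe the viewer has to accumulate bytes and split frames on the JPEG start/end markers instead of treating each recv() as one image. A minimal sketch of what I mean (the function name and reassembly logic are my own, not taken from viewer.py):

```python
# Sketch: reassemble complete JPEG frames from an accumulated TCP byte stream.
# Each recv() may return a partial frame or pieces of two frames, so we scan
# for the JPEG start-of-image and end-of-image markers and keep any leftover
# bytes for the next call. In entropy-coded JPEG data a raw 0xFF is stuffed
# as 0xFF00, so 0xFFD9 should only appear as the real end marker.

SOI = b"\xff\xd8"  # JPEG start-of-image marker
EOI = b"\xff\xd9"  # JPEG end-of-image marker

def extract_frames(buffer: bytes):
    """Return (complete_frames, leftover_bytes) from the accumulated buffer."""
    frames = []
    while True:
        start = buffer.find(SOI)
        if start < 0:
            # No frame start in sight; nothing worth keeping.
            return frames, b""
        end = buffer.find(EOI, start + 2)
        if end < 0:
            # Frame started but not finished yet; keep the partial bytes.
            return frames, buffer[start:]
        frames.append(buffer[start:end + 2])
        buffer = buffer[end + 2:]

# Usage in a receive loop (hypothetical, not the actual viewer.py code):
#   data += sock.recv(512)
#   frames, data = extract_frames(data)
#   for frame in frames:
#       display(frame)
```

With something like this the recv() buffer size should stop mattering, since frames are delimited by the markers rather than by chunk boundaries.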