How to save color images when using Crazyradio

Discussions about the AI-deck
noah2021
Beginner
Posts: 16
Joined: Thu Oct 07, 2021 8:48 pm

How to save color images when using Crazyradio

Post by noah2021 »

I want to know how I can save color images when using Crazyradio. I already switched to a color camera, and I flashed the test_camera code to GAP8. However, when I open viewer.py, the images have gridlines just like in this post (viewtopic.php?p=23386&hilit=color#p23386). I want to save the color images into .jpg format. Could you please give me some hints? Thank you!
gemenerik
Beginner
Posts: 19
Joined: Wed Apr 07, 2021 11:11 am

Re: How to save color images when using Crazyradio

Post by gemenerik »

Since the streamed image is JPEG compressed, I would recommend doing the demosaicing on the GAP8, so before streaming. There are examples of demosaicing in the AIdeck_examples repository, for example in the camera test. The demosaicing function is found in the img_proc files. Be aware that the output of this particular demosaicing function has the same spatial resolution as the input but three times the channels (grayscale -> RGB), resulting in an image buffer three times the size. You will have to change the frame_streamer_conf to accommodate this tripling in streamed data, I believe by changing the depth to 3.
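For the 324 x 244 frame used in the AI-deck camera test, that means the buffer grows from 324 * 244 = 79,056 bytes of grayscale data to 3 * 79,056 = 237,168 bytes of RGB888 data before JPEG compression.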

Since the images are JPEG compressed, viewer.py can automatically detect when it has received a full image. The fact that we are now dealing with a color image should also be encoded in the header of the JPEG file, so perhaps it picks up on this automatically. I can help you debug this when you get to that point!
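To give a rough idea of what detecting a complete frame can look like on the receiving side (whether viewer.py does exactly this would need to be checked): a baseline JPEG ends with the end-of-image marker 0xFF 0xD9, so a receiver can scan its buffer for that byte pair. A minimal sketch:

Code:

#include <stddef.h>
#include <stdint.h>

/* Return the index one past the JPEG end-of-image marker (0xFF 0xD9)
 * in a receive buffer, or -1 if the frame is not complete yet. */
static long jpeg_frame_end(const uint8_t *buf, size_t len)
{
    for (size_t i = 1; i < len; i++) {
        if (buf[i - 1] == 0xFF && buf[i] == 0xD9)
            return (long)(i + 1);
    }
    return -1;
}

Writing exactly the bytes of one complete frame to a file then already gives a valid .jpg.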

As a side note, in the WiFi streamer example you actually connect to a WiFi hotspot running on the ESP32 NINA (on the AI-deck); the Crazyradio does not come into play here.
noah2021
Beginner
Posts: 16
Joined: Thu Oct 07, 2021 8:48 pm

Re: How to save color images when using Crazyradio

Post by noah2021 »

gemenerik wrote: Tue Nov 16, 2021 9:48 am Since the streamed image is JPEG compressed, I would recommend doing the demosaicing on the GAP8, so before streaming.
Thank you gemenerik! I will let you know soon.
noah2021
Beginner
Posts: 16
Joined: Thu Oct 07, 2021 8:48 pm

Re: How to save color images when using Crazyradio

Post by noah2021 »

@gemenerik, I tried to modify the code, but viewer.py shows strange gray images. The color mode of the image is RGB565; I'm not sure if that is correct. I also changed streamer_conf.depth to 3 as you said. Could you help me? Thank you.


Here is my code:

Code:

#include "bsp/camera/himax.h"
#include "bsp/camera/mt9v034.h"
#include "bsp/transport/nina_w10.h"
#include "tools/frame_streamer.h"
#include "stdio.h"

#include "img_proc.h"
#include "gaplib/ImgIO.h"

#if defined(CONFIG_GAPUINO) || defined(CONFIG_AI_DECK)
#define CAM_WIDTH    324
#define CAM_HEIGHT   244
#else
#define CAM_WIDTH    320
#define CAM_HEIGHT   240
#endif

static pi_task_t task1;
static pi_task_t task2;
static unsigned char *imgBuff0;
static unsigned char *imgBuff1;
PI_L2 unsigned char *imgBuff0_demosaick;
static struct pi_device camera;
static struct pi_device wifi;
static frame_streamer_t *streamer1;
static frame_streamer_t *streamer2;
static pi_buffer_t buffer;
static pi_buffer_t buffer2;
static volatile int stream1_done;
static volatile int stream2_done;

static void streamer_handler(void *arg);


static void cam_handler(void *arg)
{
  pi_camera_control(&camera, PI_CAMERA_CMD_STOP, 0);

  stream1_done = 0;
  stream2_done = 0;

  demosaicking(imgBuff0, imgBuff0_demosaick, CAM_WIDTH, CAM_HEIGHT, 0);
  //WriteImageToFile("../../../img_color.ppm", CAM_WIDTH, CAM_HEIGHT, sizeof(uint32_t), imgBuff0_demosaick, RGB888_IO);
  frame_streamer_send_async(streamer1, &buffer2, pi_task_callback(&task1, streamer_handler, (void *)&stream1_done));

  return;
}



static void streamer_handler(void *arg)
{
  *(int *)arg = 1;
  if (stream1_done) // && stream2_done)
  {
    
    pi_camera_capture_async(&camera, imgBuff0, CAM_WIDTH*CAM_HEIGHT, pi_task_callback(&task1, cam_handler, NULL));
    
    pi_camera_control(&camera, PI_CAMERA_CMD_START, 0);
  }
}



static int open_pi_camera_himax(struct pi_device *device)
{
  struct pi_himax_conf cam_conf;

  pi_himax_conf_init(&cam_conf);

  cam_conf.format = PI_CAMERA_QVGA;
  

  pi_open_from_conf(device, &cam_conf);
  if (pi_camera_open(device))
    return -1;


    // rotate image
  pi_camera_control(&camera, PI_CAMERA_CMD_START, 0);
  uint8_t set_value=3;
  uint8_t reg_value;
  pi_camera_reg_set(&camera, IMG_ORIENTATION, &set_value);
  pi_time_wait_us(1000000);
  pi_camera_reg_get(&camera, IMG_ORIENTATION, &reg_value);
  if (set_value!=reg_value)
  {
    printf("Failed to rotate camera image\n");
    return -1;
  }
  pi_camera_control(&camera, PI_CAMERA_CMD_STOP, 0);
  
  pi_camera_control(device, PI_CAMERA_CMD_AEG_INIT, 0);

  return 0;
}



static int open_pi_camera_mt9v034(struct pi_device *device)
{
  struct pi_mt9v034_conf cam_conf;

  pi_mt9v034_conf_init(&cam_conf);

  cam_conf.format = PI_CAMERA_QVGA;
  

  pi_open_from_conf(device, &cam_conf);
  if (pi_camera_open(device))
    return -1;


  
  return 0;
}



static int open_camera(struct pi_device *device)
{
#ifdef CONFIG_GAPOC_A
  return open_pi_camera_mt9v034(device);
#endif
#if defined(CONFIG_GAPUINO) || defined(CONFIG_AI_DECK)
  return open_pi_camera_himax(device);
#endif
  return -1;
}


static int open_wifi(struct pi_device *device)
{
  struct pi_nina_w10_conf nina_conf;

  pi_nina_w10_conf_init(&nina_conf);

  nina_conf.ssid = "";
  nina_conf.passwd = "";
  nina_conf.ip_addr = "0.0.0.0";
  nina_conf.port = 5555;
  pi_open_from_conf(device, &nina_conf);
  if (pi_transport_open(device))
    return -1;

  return 0;
}


static frame_streamer_t *open_streamer(char *name)
{
  struct frame_streamer_conf frame_streamer_conf;

  frame_streamer_conf_init(&frame_streamer_conf);

  frame_streamer_conf.transport = &wifi;
  frame_streamer_conf.format = FRAME_STREAMER_FORMAT_JPEG;
  frame_streamer_conf.width = CAM_WIDTH;
  frame_streamer_conf.height = CAM_HEIGHT;
  frame_streamer_conf.depth = 3;
  frame_streamer_conf.name = name;

  return frame_streamer_open(&frame_streamer_conf);
}
static pi_task_t led_task;
static int led_val = 0;
static struct pi_device gpio_device;
static void led_handle(void *arg)
{
  pi_gpio_pin_write(&gpio_device, 2, led_val);
  led_val ^= 1;
  pi_task_push_delayed_us(pi_task_callback(&led_task, led_handle, NULL), 500000);
}

int main()
{
  printf("Entering main controller...\n");

  pi_freq_set(PI_FREQ_DOMAIN_FC, 150000000);

  pi_gpio_pin_configure(&gpio_device, 2, PI_GPIO_OUTPUT);

  pi_task_push_delayed_us(pi_task_callback(&led_task, led_handle, NULL), 500000);

  imgBuff0 = (unsigned char *)pmsis_l2_malloc((CAM_WIDTH*CAM_HEIGHT)*sizeof(unsigned char));
  imgBuff0_demosaick = (unsigned char *)pmsis_l2_malloc((CAM_WIDTH*CAM_HEIGHT*3)*sizeof(unsigned char));

  if (imgBuff0_demosaick == NULL) {
      printf("Failed to allocate Memory for Image \n");
      return 1;
  }
  printf("Allocated Memory for Image\n");

  if (open_camera(&camera))
  {
    printf("Failed to open camera\n");
    return -1;
  }
  printf("Opened Camera\n");



  if (open_wifi(&wifi))
  {
    printf("Failed to open wifi\n");
    return -1;
  }
  printf("Opened WIFI\n");



  streamer1 = open_streamer("camera");
  if (streamer1 == NULL)
    return -1;

  printf("Opened streamer\n");


  pi_buffer_init(&buffer, PI_BUFFER_TYPE_L2, imgBuff0);
  pi_buffer_init(&buffer2, PI_BUFFER_TYPE_L2, imgBuff0_demosaick);
  pi_buffer_set_format(&buffer, CAM_WIDTH, CAM_HEIGHT, 1, PI_BUFFER_FORMAT_GRAY);
  pi_buffer_set_format(&buffer2, CAM_WIDTH, CAM_HEIGHT, 3, PI_BUFFER_FORMAT_RGB565);

  pi_camera_control(&camera, PI_CAMERA_CMD_STOP, 0);
  pi_camera_capture_async(&camera, imgBuff0, CAM_WIDTH*CAM_HEIGHT, pi_task_callback(&task1, cam_handler, NULL));
  pi_camera_control(&camera, PI_CAMERA_CMD_START, 0);
  printf("6\n");

  while(1)
  {
    pi_yield();
  }

  return 0;
}
Attachment: 20211123235754.jpg
gemenerik
Beginner
Posts: 19
Joined: Wed Apr 07, 2021 11:11 am

Re: How to save color images when using Crazyradio

Post by gemenerik »

Hi Noah,

With the buffer format set to RGB565, you will want the streamed image to be in that same format. The default demosaicking function creates RGB888 images. I made some changes to my demosaicking function and managed to save the RGB565 image with WriteImageToFile. However, the Python viewer still interprets the images as grayscale. Fixing that is a bit less trivial than I thought. To be honest, I'm not sure where in the pipeline it goes wrong anymore; perhaps the frame_streamer on the GAP8 does not even support RGB. I'll try to get some help from those who set up the initial streamer.
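For reference, the standard RGB888-to-RGB565 packing looks roughly like this (a generic sketch with made-up helper names, not necessarily the exact change to the demosaicking function):

Code:

#include <stdint.h>

/* Pack an 8-bit-per-channel pixel into a 16-bit RGB565 word:
 * 5 bits red, 6 bits green, 5 bits blue. */
static inline uint16_t rgb888_to_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r & 0xF8) << 8) | ((g & 0xFC) << 3) | (b >> 3));
}

/* Convert a full RGB888 buffer (3 bytes/pixel) into an RGB565 buffer
 * (2 bytes/pixel), e.g. after demosaicking and before streaming. */
static void rgb888_buffer_to_rgb565(const uint8_t *rgb888, uint16_t *rgb565,
                                    int width, int height)
{
    for (int i = 0; i < width * height; i++) {
        rgb565[i] = rgb888_to_rgb565(rgb888[3*i], rgb888[3*i + 1], rgb888[3*i + 2]);
    }
}

Note that RGB565 is 2 bytes per pixel, so if the streamer's depth is interpreted as bytes per pixel, an RGB565 frame would need depth 2 rather than 3.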
noah2021
Beginner
Posts: 16
Joined: Thu Oct 07, 2021 8:48 pm

Re: How to save color images when using Crazyradio

Post by noah2021 »

gemenerik wrote: Fri Nov 26, 2021 2:31 pm With the buffer format set to RGB565, you will want the streamed image to be in that same format.
Thank you! Do you have any update?
gemenerik
Beginner
Posts: 19
Joined: Wed Apr 07, 2021 11:11 am

Re: How to save color images when using Crazyradio

Post by gemenerik »

There is some reference to color encoding in the gap_sdk, but I could not get it to work. We are now investigating streaming raw image data instead. We found that JPEG encoding takes a long time anyway, so it might even improve performance, especially if you do the demosaicking off-board.
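If you go the raw-streaming route, the off-board demosaicking itself can stay simple. Below is a minimal nearest-neighbor sketch, assuming a BGGR Bayer pattern (the actual pattern of the color Himax module should be double-checked); the function name and pattern choice are only illustrative:

Code:

#include <stdint.h>

/* Minimal nearest-neighbor demosaic for a raw Bayer frame, assuming a
 * BGGR pattern. Each 2x2 Bayer block is expanded into four identical
 * RGB888 pixels, so the output keeps the input width/height at
 * 3 bytes per pixel. */
static void demosaic_bggr_nn(const uint8_t *raw, uint8_t *rgb,
                             int width, int height)
{
    for (int y = 0; y < height - 1; y += 2) {
        for (int x = 0; x < width - 1; x += 2) {
            uint8_t b  = raw[y * width + x];
            uint8_t g1 = raw[y * width + x + 1];
            uint8_t g2 = raw[(y + 1) * width + x];
            uint8_t r  = raw[(y + 1) * width + x + 1];
            uint8_t g  = (uint8_t)((g1 + g2) / 2);

            /* Write the same RGB triplet to all four pixels of the block. */
            for (int dy = 0; dy < 2; dy++) {
                for (int dx = 0; dx < 2; dx++) {
                    uint8_t *out = &rgb[((y + dy) * width + (x + dx)) * 3];
                    out[0] = r;
                    out[1] = g;
                    out[2] = b;
                }
            }
        }
    }
}

It trades color resolution for simplicity, but it is enough to verify that the raw stream and the assumed Bayer pattern are correct before investing in a better interpolation.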
noah2021
Beginner
Posts: 16
Joined: Thu Oct 07, 2021 8:48 pm

Re: How to save color images when using Crazyradio

Post by noah2021 »

gemenerik wrote: Tue Dec 07, 2021 2:15 pm There is some reference to color encoding in the gap_sdk, but I could not get it to work.
Thanks for the update. Did you modify viewer.py to save the raw image data?
kimberly
Bitcraze
Posts: 1050
Joined: Fri Jul 06, 2018 11:13 am

Re: How to save color images when using Crazyradio

Post by kimberly »

Hi!

Sorry for the late reply, but it looked like this was a difficult question to answer. This actually means that the NINA WiFi module's firmware needs to be very heavily adjusted... We might think of a solution for this one, but it's not quite ready to share, so please keep an eye on the blog posts, as we will share a guide on how to do this once we have something running.