Raspberry Pi HQ Camera in Jetson Nano

flashsquirrel
I think your camera configs are in the wrong location of the pipe, those commands are part of nvarguscamerasrc.
try:
nvarguscamerasrc exposuretimerange="34000 34000" aelock=true ispdigitalgainrange="1 1" gainrange="4 4" ! all the other stuff.

also if you try your pipe in the terminal does it work with nvoverlaysink?
Such as:
gst-launch-1.0 nvarguscamerasrc exposuretimerange="34000 34000" wbmode=0 sensor_id=0 sensor-mode=1 ispdigitalgainrange="1 1" aelock=true gainrange="4 4" ! nvoverlaysink
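If the property names don't take, it can help to first confirm what your nvarguscamerasrc build actually exposes (property names vary between L4T releases). A sketch; the grep pattern is just a guess at relevant names:

```shell
# Print a command that lists nvarguscamerasrc properties matching the
# exposure/gain controls discussed here. Run the printed command on the
# Jetson itself (gst-inspect-1.0 ships with GStreamer).
CHECK='gst-inspect-1.0 nvarguscamerasrc | grep -iE "exposure|gain|aelock|wbmode"'
echo "$CHECK"
```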

@CarlosR92
@DavidSoto-RidgeRun,
@juan.cruz
@JerryChang

Hello, I hope the above is clear, and if you have questions feel free to ask.
I’m currently married to the Raspberry Pi HQ Camera for my project, and I hope I can get a long exposure time with the Jetson Nano (I chose it because I’m using two cameras). But if it’s not possible at the current stage of the driver, I’ll switch back to using two Raspberry Pi 4s.

I hope I’m not being rude for pinging all 4 of you, but I have reviewed your posts and it looks like you can provide advice on the future state of this project and capabilities.

Hi @siderskiy,

Sorry for the slow response. We have been pretty busy making sure the sensor works correctly on JP 4.5, which we hope to release in a week or two. Moreover, one of our engineers will start testing the exposure control as you described here, to see if we can replicate your issue. Since active customers are driving this work, I think we should have news within a week.

-David

Thanks. I just saw the post saying you updated the Wiki on how to install the driver.
I was using the Wiki instructions when I installed back then, so maybe I missed the installation of the ISP calibration file? Could that have caused the exposure time issue?
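A quick way to rule that out is to check whether the calibration file landed where the ISP looks for it. A sketch, assuming the usual Jetson override path (an assumption; check the Wiki for the exact install location):

```shell
# Hypothetical check: the default Jetson location for ISP camera overrides.
# If this reports "missing", the calibration step was likely skipped.
ISP_FILE=/var/nvidia/nvcam/settings/camera_overrides.isp
if [ -f "$ISP_FILE" ]; then
  status="present"
else
  status="missing"
fi
echo "ISP calibration file: $status"
```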

I have a question about the Arducam driver. Do you need to use Arducam’s IMX477 camera, or does the stock Raspberry Pi HQ camera work too?

I suspected that their PCB layout is different and is paired with their driver.

Or are you using the one camera (the Raspberry Pi HQ camera) and swapping between the drivers (and both work)? If so, how do you swap the drivers?

@siderskiy

I have Arducam’s IMX477, but currently I patch the kernel and build from source, since other changes need to be made for our use case as well. As far as I know, Arducam has made no changes; I use the RidgeRun patches directly.

The stock camera should work as well if modified, but I’m hesitant to do that since it would no longer work with my Pi boards. Besides, the Arducam ones are the same price or cheaper and just as good. They even have a version that can toggle the IR filter on and off.

If you’re US based, just search “Arducam IMX477” on Amazon and you’ll get lots of results.

Hi @siderskiy

Regarding the minimum exposure of 34000 ns (34 µs): this value is hardcoded in nvarguscamerasrc. I patched the element to allow a smaller value in case you want it (patch attached). With that modification you can set an exposure of 10000 ns (10 µs).
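For example, with the patched element installed, a 10 µs request should no longer be clamped to the stock 34 µs floor. A sketch (same pipeline shape as the stock examples in this thread):

```shell
# Sketch, assuming the patched nvarguscamerasrc is installed: request a
# 10 us exposure (10000 ns), below the stock 34 us minimum.
PIPELINE='nvarguscamerasrc exposuretimerange="10000 10000" wbmode=0 sensor_id=0 ispdigitalgainrange="1 1" aelock=true gainrange="4 4" ! nvoverlaysink'
echo "gst-launch-1.0 $PIPELINE"   # printed here; run the command on the Nano itself
```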

Regarding the maximum exposure values: as you said, for 60 fps the maximum exposure is 16.6 ms. If you try to set something beyond that, the exposure will instead be configured to 8.33 ms.
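The 16.6 ms figure follows from the frame period: the exposure cannot exceed 1/fps seconds, i.e. 1e9/fps nanoseconds (the unit exposuretimerange expects). A quick calculation:

```shell
# Max exposure is bounded by the frame period: 1/fps s = 1e9/fps ns.
fps=60
max_exposure_ns=$(awk -v f="$fps" 'BEGIN { printf "%d", 1e9 / f }')
echo "max exposure at ${fps} fps: ${max_exposure_ns} ns"   # ~16.6 ms at 60 fps
```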

I updated the way the driver is configuring the sensor for a desired exposure and will be uploading the update in the next couple of days.

An example pipeline that sets the exposure to 10ms is the following:

gst-launch-1.0 nvarguscamerasrc exposuretimerange="10000000 10000000" wbmode=0 sensor_id=0 ispdigitalgainrange="1 1" aelock=true gainrange="4 4" ! 'video/x-raw(memory:NVMM),width=1920,height=1080,format=NV12,framerate=60/1' ! queue ! nvoverlaysink sync=false

nvarguscamerasrc.patch (2.4 KB)

Hope this helps.

Hi everyone,

We just added the support for the new Jetpack 4.5 in our github repo: https://github.com/RidgeRun/NVIDIA-Jetson-IMX477-RPIV3

Please test and let us know how it goes.

Regards,
Carlos R

I see the GitHub issue was closed and fix implemented for the JetPack 4.5 version. I’ll test it when I can.

Hi CarlosR92,

I’ve tested the driver and it works!
Below is a comparison between the Arducam and RidgeRun drivers!

Arducam:

RidgeRun:

The RidgeRun driver seems to have colors closer to reality, but the whites in some areas start to tend towards pink:

Is it possible to do anything about it?

Thanks

I have also got it working (minimal testing!). I was able to use the original kernel from JP 4.5 and just change the device tree generated by your patch (so I don't have the changes to imx477.c, obviously).
As I have 2 cameras connected to my Nano, an IMX219 and an IMX477 (RasPi HQ v3), I also managed to combine the DT changes for IMX477 with the original files in such a way that each camera uses the appropriate driver. In this way I can get them both working at the same time. If anyone is interested in doing the same, please let me know and I’ll think about the best way to make the changes available.
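With both sensors on their own driver, each one gets its own pipeline selected by sensor-id. A sketch, assuming the IMX219 enumerates as sensor-id=0 and the IMX477 as sensor-id=1 (swap the ids if your boards enumerate the other way around):

```shell
# One pipeline per sensor; fakesink is just a capture smoke test, replace
# it with nvoverlaysink to actually view the streams. num-buffers limits
# the test to a short burst so the command exits on its own.
CAM0='gst-launch-1.0 nvarguscamerasrc sensor-id=0 num-buffers=100 ! fakesink'
CAM1='gst-launch-1.0 nvarguscamerasrc sensor-id=1 num-buffers=100 ! fakesink'
echo "$CAM0"
echo "$CAM1"
```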


Goad

This seems to be the case for the Raspberry Pi v2 camera as well. Yours is the most useful observation I have come across in the entire forum.

I ran an experiment setting the fps to 120 for an HD video, with gainrange and ispdigitalgainrange set to 1, and exposure times of 1/30, 1/60, and 1/120. All of the results were identical. So the camera is capping the exposure time at the FPS-based maximum, irrespective of what we provide.

I also ran the same thing with 1/10th of those exposure values (1/300, 1/600, and 1/1200). I could definitely see differences in the images, even though they are underexposed.

The third experiment I ran was at 90 fps (a non-standard FPS). I set exposure times of 1/90 and 1/120 with analog and digital gains set to 1. This showed differences.
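For anyone who wants to repeat the first sweep, the shutter times convert to the nanosecond values exposuretimerange expects (1/30 s, 1/60 s, 1/120 s). A sketch that prints one test command per exposure:

```shell
# Convert the shutter times used above to nanoseconds and emit one
# gst-launch command per value. Run the printed commands on the Jetson.
exposures=$(awk 'BEGIN { for (d = 30; d <= 120; d *= 2) printf "%d ", 1e9 / d }')
for exp_ns in $exposures; do
  echo "gst-launch-1.0 nvarguscamerasrc exposuretimerange=\"$exp_ns $exp_ns\" gainrange=\"1 1\" ispdigitalgainrange=\"1 1\" num-buffers=100 ! fakesink"
done
```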

Let me know if you find a way to go past the exposure-time limits defined in nvarguscamerasrc, and I hope my findings help you.

@Edison_F_A how can I change the driver for the Raspberry Pi v2 (IMX219) camera?

Thanks. Mind posting the commands you used too?

I’ll do another deep dive in a couple of weeks. Most likely we’ll have to use something other than nvarguscamerasrc.
@shravya.boggarapu1

Hi!

How did you manage to set a custom fps? I can't start the stream if I try to increase the FPS over 60.

I would also need to make the output stream a lot smaller. I’m looking to push the framerate as high as possible for high-speed projectile tracking.
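High framerates generally require picking a lower-resolution sensor mode and matching it in the caps filter. A sketch; the 1280x720@120 mode here is an assumption, so first check which modes your driver actually reports (for example with `v4l2-ctl -d /dev/video0 --list-formats-ext`) and substitute one of those:

```shell
# Hypothetical high-fps pipeline: a small-resolution mode with the
# framerate requested in the caps. Substitute a mode your driver reports.
CMD='gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! video/x-raw(memory:NVMM),width=1280,height=720,format=NV12,framerate=120/1 ! nvoverlaysink sync=false'
echo "$CMD"   # printed here; run it on the Nano
```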

Hello,

I’m sorry to distract from the main topic of this thread, “Driver support for the Raspberry Pi HQ Camera in Jetson Nano”.

But I’m excited about the metal housing for the HQ camera from the crowdfunding campaign.

P.S. I have no affiliation with this group. But I switched to the Nano because the Pi couldn’t do stereo on the HQ. This camera market is moving so fast! I’m interested in the Metal Housing for my Nano setup.

Hi @CarlosR92, do you know if your drivers are compatible with Jetpack 4.5.1 please?

I’m trying this setup for the first time with an Arducam 12MP IMX477 Mini on my Jetson Nano 4GB A02, but after installing .deb packages from the 4.5 folder, I’m not seeing any /dev/video* devices. When I try to run the streamer command, I receive the “No cameras available” error.
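When nvarguscamerasrc reports “No cameras available”, a couple of quick sanity checks can tell you whether the sensor driver probed at all. A sketch:

```shell
# Did the driver probe and create a video node?
nodes=$(ls /dev/video* 2>/dev/null || true)
if [ -n "$nodes" ]; then
  msg="video nodes: $nodes"
else
  msg="no /dev/video* nodes: the sensor driver did not probe"
fi
echo "$msg"
# Look for driver probe messages, if any (may need sudo on some setups).
dmesg 2>/dev/null | grep -i imx477 | tail -n 5 || true
```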

I will try 4.5 in the meantime and report back here what happens.

Thanks.

EDIT - I think it’s because the Arducam uses a different set of drivers (https://www.arducam.com/product/arducam-high-quality-camera-for-jetson-nano-and-xavier-nx-12mp-m12-mount/). I wrongly assumed the same IMX477 sensor in the Raspi HQ Cam and Arducam meant the drivers would be compatible.

Hello everyone,

I bought an Arducam IMX477. Unfortunately, it doesn’t seem to be the original, and so the circuit/schematics seem to be different. Does anybody have a hint where R8 is located? If anyone has a schematic, that would also be helpful.
Below you can find an image of the IMX477:

Best regards

Hello,

It looks like the Raspberry Pi now supports an external trigger for this camera, and FSTROBE can also be used to indicate the start of a frame. Is this support something you will add to the Nano?