multiple D455 camera sync issue #2323
Hi @snakehaihai I have seen the error 'mode doesn't support option 42' occur in a past case when inter_cam_sync_mode was set after launch during runtime, and the problem did not occur when the inter_cam_sync_mode values were defined before launch. When you set inter_cam_sync_mode for each camera using the instructions below, are you doing it before launch (in the launch file or in a roslaunch instruction) or after launch, please?
I tried both. I added /camera1/stereo_module/inter_cam_sync_mode: 2 in rs_multiple_devices.launch, then checked the frequency, and the frequencies were not the same. I then checked in rqt_reconfigure and found it wasn't set to the correct mode, so I set it manually. But neither method works well.
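As a rough way to confirm which value is actually in force, the dynamic_reconfigure command-line tool can be queried after launch; the /camera1 namespace below is assumed from a default rs_multiple_devices.launch:

```
# Dump camera1's current stereo_module settings; inter_cam_sync_mode should be listed
rosrun dynamic_reconfigure dynparam get /camera1/stereo_module

# Set the mode from the command line instead of rqt_reconfigure (2 = Slave)
rosrun dynamic_reconfigure dynparam set /camera1/stereo_module inter_cam_sync_mode 2
```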
Let's first address the Invalid md size: bytes used = 0, start offset=10 error that appears in your log before the inter_cam_sync_mode error occurs. This will help to determine whether or not inter_cam_sync_mode is failing because of the invalid md size error that precedes it. The invalid md size error has also occurred with librealsense 2.42.0 and wrapper 2.2.22 in the past case #1713 (comment). In that case, the log informed the user that they had built the wrapper with SDK 2.41.0 but were running it with 2.42.0. Do you have a similar message in your log?
I don't have that issue. I know the versions have to match.
And do the problems still occur if you add initial_reset:=true to the rs_multiple_devices.launch roslaunch instruction to reset the cameras at launch?
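A sketch of that roslaunch instruction, with placeholder serial numbers; initial_reset may need to be added as an arg if your copy of rs_multiple_devices.launch does not already forward it to rs_camera.launch:

```
# Launch both cameras by serial number and hard-reset them first
roslaunch realsense2_camera rs_multiple_devices.launch \
    serial_no_camera1:=<serial-of-camera-1> \
    serial_no_camera2:=<serial-of-camera-2> \
    initial_reset:=true
```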
I added the reset and the reset works, but the frequency is still wrong. Below are the log and the rostopic hz output of the 2 cameras.
This is my current launch file
Your launch log image shows what looks like a requires patch for fourcc code RW16! message. This message can indicate that, if the SDK was built from source code, the Linux kernel has not been patched or the patch was not applied correctly before the librealsense SDK was installed. Patching the kernel adds support for camera hardware metadata in a source-code build. Metadata support is automatically included when building librealsense from Debian packages, or when building from source code with CMake and including the -DFORCE_RSUSB_BACKEND=true build flag. Could you tell me whether you built librealsense from source code without using -DFORCE_RSUSB_BACKEND=true, please?
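For reference, a minimal sketch of an RSUSB-backend build; the CMake flags are standard librealsense options, while the source path and job count are assumptions:

```
# Build librealsense with the RSUSB backend so metadata support
# does not depend on a patched kernel
cd ~/librealsense
mkdir -p build && cd build
cmake ../ -DCMAKE_BUILD_TYPE=Release -DFORCE_RSUSB_BACKEND=true
make -j4
sudo make install
```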
OK, thanks. Regards
It looks as though you are performing a basic librealsense build with cmake ../ -DCMAKE_BUILD_TYPE=Release. This builds librealsense in Release mode (compiled with optimizations) instead of Debug mode. This instruction builds librealsense in V4L backend mode (not RSUSB), meaning that the patch script needs to be run before librealsense is built with CMake; it looks as though this is what you are doing. May I confirm that you are building on a PC and not on an Nvidia Jetson board? Jetson boards must use a different patch script.
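For a V4L-backend build on a regular Ubuntu PC, the usual sequence is to run the kernel patch script from the librealsense root before the CMake build; the script names below are as shipped with librealsense, and the path is an assumption:

```
# Install udev rules and patch the Ubuntu LTS kernel modules before building with CMake
cd ~/librealsense
./scripts/setup_udev_rules.sh
./scripts/patch-realsense-ubuntu-lts.sh
```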
It's an Intel 11th-gen NUC. I know different embedded PCs need different patches. I've done the patching; below is the output. Now configuring.
After adding the new flag, rs_multicamera.launch ends with a red error. Below is the cmake output, which looks suspicious.
Below is the launch error; here is the full log.
Is the control_transfer returned warning generated continuously after launch, or does it repeat a small number of times and stop?
For the first instance, the camera node crashes. I tried to launch a couple more times and the node lives, but the control_transfer returned error repeats forever. The good news is that the Hz rate of the 2 cameras seems to be the same, both at 7 Hz, although I commanded it to output at 0.5 Hz; not a big deal if it is just a scaling issue. The bad news is that the data is still out of sync.
The depth and color streams are meant to be publishing at 30 FPS / 30 Hz each. A rate of 7 Hz and the No data available messages suggest that the computer is struggling to process the frames. After launch, could you open the rqt_reconfigure interface with rosrun rqt_reconfigure rqt_reconfigure, go to the rgb_camera options and untick the option called auto_exposure_priority, please? Disabling auto_exposure_priority should force the streams to try to maintain a constant rate instead of being permitted to vary.
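The GUI route is shown below; as a command-line alternative, the same option can likely be unticked with dynparam (the /camera1 namespace and rgb_camera group are assumed from a default launch, so repeat for /camera2):

```
# Open the reconfigure GUI and untick camera1 > rgb_camera > auto_exposure_priority
rosrun rqt_reconfigure rqt_reconfigure

# Or disable it from the command line
rosrun dynamic_reconfigure dynparam set /camera1/rgb_camera auto_exposure_priority false
```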
I will try it tomorrow; it is nearly 1 am Singapore time. I preset the Arduino to output a pulse at 0.5 Hz. I lowered the frequency so that 4-6 cameras can be fed concurrently for street-view capture and an image-based localization study. So you mean the camera can only be set at 30 Hz? It cannot vary between 10-20 Hz based on the Arduino pulse frequency? In that case, the PC can't handle that kind of bandwidth, right?
Once you have defined a particular FPS for a stream and launched, you should not be able to manually alter the frequency after launch. However, the FPS is permitted to vary if auto_exposure = true and auto_exposure_priority = true (i.e. you do not disable auto_exposure_priority). You can set the FPS at whatever rate is supported by a particular camera model for a particular resolution; on D455 the supported FPS rates for depth and color will typically be 5, 10, 15, 30 and 60. Generally, a RealSense hardware sync setup will not try to control the FPS rate of the cameras. Instead, it aims to sync the stream timestamps with the master trigger signal. Using an external signal generator such as SparkFun instead of generating the trigger pulse from a RealSense camera set to an Inter Cam Sync Mode of '1' can be extremely difficult though, because the trigger signal has to be precisely matched to the Hz rate that the slave cameras are using if the slaves are set to '2'. Also, when '2' is used, you cannot sync RGB - only depth sync is supported. It is for this reason that Intel released an External Synchronization (Genlock) multiple-camera hardware sync system, to make it much easier to sync slave cameras to an external signal generator instead of a RealSense master camera. The Genlock system also supports an Inter Cam Sync Mode of '3' (Full Slave) in which RGB can be synced. https://dev.intelrealsense.com/docs/external-synchronization-of-intel-realsense-depth-cameras I should emphasize that the Genlock system (supporting modes 1 and 2 plus modes 3 and above) is considered unvalidated and experimental by Intel, compared to the validated and mature original hardware sync system (modes 1 and 2).
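As an illustration, the stream rates are normally pinned at launch time through the standard rs_camera.launch arguments; the 30 FPS values below are just the default rates mentioned above:

```
# Fix depth and color at one of the supported D455 rates (5, 10, 15, 30 or 60 FPS)
roslaunch realsense2_camera rs_camera.launch camera:=camera1 \
    depth_fps:=30 color_fps:=30
```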
Hi, after reading through your post I changed the circuit so that one camera works as master and one camera works as slave. This is my launch file now
Below is the log
Only 1 camera is outputting; the other has no output anymore. In another launch test, one of the nodes just crashed without reason.
Instead of using rs_multiple_devices.launch, are both cameras able to launch if you open two separate ROS terminals and perform an rs_camera.launch for camera 1 in terminal 1 and a launch for camera 2 in terminal 2, using instructions like those below?
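The per-terminal instructions referred to above were roughly of this form; the serial numbers are placeholders:

```
# Terminal 1
roslaunch realsense2_camera rs_camera.launch serial_no:=<serial-of-camera-1>

# Terminal 2
roslaunch realsense2_camera rs_camera.launch serial_no:=<serial-of-camera-2>
```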
This method did not work. Here are the 2 launch files.
You can distinguish the cameras by name by adding a camera:= instruction to each terminal, like below.
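Again a sketch of what those two instructions likely looked like; the camera names and serial numbers are placeholders:

```
# Terminal 1
roslaunch realsense2_camera rs_camera.launch camera:=cam_1 serial_no:=<serial-of-camera-1>

# Terminal 2
roslaunch realsense2_camera rs_camera.launch camera:=cam_2 serial_no:=<serial-of-camera-2>
```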
I power-cycled the PC and ran this command; here is the output.
And you get similar results if you only launch one camera instead of two?
Hi MartyG |
Let me reload the firmware and perform a hardware reset first. Likely the previous config is stuck inside.
After reflashing the firmware and power resetting, both of them get back to 30 Hz. So what should I do now to get them synced? Still the master-slave trigger with auto_exposure_priority set to false?
If you are now using camera 1 as Master and camera 2 as Slave, then set the Master camera to inter_cam_sync_mode '1' and the Slave camera to '2'. Setting auto_exposure_priority to false is not a requirement for multiple-camera hardware sync but is useful as an optional choice for keeping the FPS rate constant.
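One possible way to apply those two modes after launch from the command line, assuming the default camera1/camera2 namespaces; as noted earlier, the values can equally be defined in the launch files before launch:

```
# Camera 1 as Master (generates the trigger signal), camera 2 as Slave (listens for it)
rosrun dynamic_reconfigure dynparam set /camera1/stereo_module inter_cam_sync_mode 1
rosrun dynamic_reconfigure dynparam set /camera2/stereo_module inter_cam_sync_mode 2
```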
My student borrowed them for a presentation tomorrow. I'll continue the sync test after I get them back.
Okay, thanks very much for the updates. I look forward to continuing this case with you.
After turning off auto exposure and setting a fixed exposure on both cameras, I notice that the time delay between the 2 cameras becomes fixed, so I think the master-slave sync works. But because it is triggered by an external source, there could be a latency issue for the 2nd camera. So do I need to add another camera so that the master triggers camera 2 and camera 3, and 2 and 3 are synced that way? And what about the exposure method? A fixed value seems pretty bad.
If you are setting one camera to Master ('1') instead of using an external signal generator, setting the slave to '2' will only sync depth timestamps. If you are aiming to sync color timestamps, as suggested by your chart above, then the slave camera should be set to '3' (Full Slave mode). Also bear in mind that if the timestamps of slave and master are perfectly and permanently synced, that indicates that hardware sync is not working. With hardware sync, if the timestamps slowly drift apart over a long period of time, this indicates that sync is working correctly. This is discussed in the section of the original hardware sync paper linked below, in the paragraph titled Now to the somewhat counter intuitive aspect of time stamps.
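A rough way to watch that slow drift is to compare the image header stamps of the two cameras over time; the topic names below assume default launches:

```
# Print one color timestamp from each camera; repeat after a few minutes
# and compare how far the stamps have drifted apart
rostopic echo -n 1 /camera1/color/image_raw/header/stamp
rostopic echo -n 1 /camera2/color/image_raw/header/stamp
```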
It's excellent to hear that you made apparent progress! I look forward to hearing about your further test results. :)
Does master-slave sync really trigger the image? If so, can I read the master trigger's rising edge and force the next master and slave frames to have the same trigger timestamp? The timestamp drift is caused by something similar to this issue, right? #796
Mode 3 for achieving RGB sync as Full Slave has very little documentation compared to genlock modes 4+. My understanding from the External Synchronization paper is that Mode 3 behaves the same as slave mode '2' except for the addition of RGB sync support. This implies (it is not stated in the paper) that in mode '3' the Slave will listen for a trigger signal on each frame and then, after a certain time period has passed, give up trying to sync to the master on that particular frame and initiate unsynced capture independently. This 'wait and then capture the frame unsynced' behaviour is how slave mode '2' works. In mode 4+ (genlock), the slave does not trigger after a certain wait period but instead waits indefinitely for a trigger signal and only initiates capture once the trigger is received. It is normal for timestamps to drift apart when hardware sync is working correctly, though the rate at which yours are drifting seems to be very fast, as the original multiple camera hardware sync paper states that "you might expect this drift to be on the order of less than 1ms/minute". Ultimately, the very limited amount of documentation for mode '3' makes it difficult to advise about this problem, unfortunately.
OK, I got it. The hardware dev team forgot to put a memo. Bad boys. I'll try to read the sync pulse, send the pulse into the ROS topic system and see how the timing is correlated. If I guess right, both images are triggered at that pulse's rising edge and then span the exposure time to form the image. So, by rights, I could modify the rs driver to take that pulse time as the reference and set both camera timestamps to the same time.
Thanks very much, @snakehaihai - please do update here with the results of your next tests when you have them.
Good luck with finding your oscilloscope or a replacement!
Hi @snakehaihai Do you have an update about this case that you can provide, please? Thanks!
Hi, currently I am confused about what is happening with the sync. I don't even know how the cameras appear to be in sync. I'm doing more testing and will update here later.
The purpose of using an Arduino in a multi-camera setup is typically to provide a trigger pulse for the Slave camera ('3'). If one of the cameras is set to '1' to designate it as Master, then that camera is generating the trigger pulse instead of the Arduino, so I would not expect the Arduino being on or off to influence the outcome of the sync. There is very little information about using Full Slave mode ('3'), and virtually no information about using Full Slave in ROS other than your own investigation, unfortunately. The External Synchronization system that introduced mode 3 was also experimental and did not receive further development. So you are a pioneer in this regard.
It's great to hear that you made significant progress. Thanks so much for continuing to share your methods with the RealSense community!
Hi @snakehaihai Do you have an update about this case that you can provide, please? Thanks!
Hi, I
Hi @snakehaihai That is great news! Thanks very much for the update. :)
Hi
I'm using 1 Arduino as the master clock, and the Arduino is sending pulses to 2 D455s to sync them. I used the same setup for IDS and PointGrey cameras before, and there wasn't an issue. But when triggering the D455s I encounter some issues. I started slow, with a 1-sec pulse and 1-sec downtime, to test out the system's sync function. By rights, I should get a 0.5 Hz camera feed, but I couldn't get it.
I'm using realsense-ros 2.2.22 + librealsense 2.42.0. Before running the cameras, I updated the firmware to the latest version.
I called rs_multiple_devices.launch with the correct serial IDs,
then I set
/camera1/stereo_module/inter_cam_sync_mode: 2
/camera2/stereo_module/inter_cam_sync_mode: 2
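A minimal command-line sketch of this sequence, with placeholder serial numbers and the default camera1/camera2 namespaces assumed:

```
# Launch both cameras by serial number
roslaunch realsense2_camera rs_multiple_devices.launch \
    serial_no_camera1:=<serial-of-camera-1> \
    serial_no_camera2:=<serial-of-camera-2>

# Then check the publishing rate of each camera in separate terminals
rostopic hz /camera1/color/image_raw
rostopic hz /camera2/color/image_raw
```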
I connect pin 5 of each D455 to pin 10 of the Arduino, and all grounds are connected together.
Below are my setup and my results.
In one case, the cameras have different Hz.
In another case, they are not in sync.
Can someone suggest a working solution?