LIDAR data acquisition #835

Closed
michellevalente opened this issue Oct 3, 2018 · 15 comments
@michellevalente

Hi,

I'm creating a dataset using only LIDAR and position information. However, I'm having trouble obtaining a complete point cloud rather than a partial one captured mid-rotation. Is there any way of getting only the last complete rotation of the sensor?

Any help would be appreciated, thanks!

@nsubiron
Collaborator

nsubiron commented Oct 3, 2018

Hi @michellevalente,

If you can run the simulation at a fixed time-step, you can adjust the frame rate and the Lidar rotation frequency so that each frame covers exactly one complete rotation. That way the measurement will always contain a complete rotation, and the simulation may even run faster than real time if your hardware allows it.

Note, though, that if you do a complete rotation in a single frame, everything will correspond to a "still image" of that frame. If you want more realism you can adjust it to cover, say, 1/5 of a rotation per frame and then merge 5 scans into one. With a fixed frame rate you can even do crazy things like 1000 FPS to capture the simulation in slow motion.
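A minimal sketch of that setup with the CARLA 0.8.x Python client follows; the concrete values (sensor name, frame rate, lidar parameters) are illustrative assumptions, not taken from this thread:

from carla.settings import CarlaSettings
from carla.sensor import Lidar

settings = CarlaSettings()
settings.set(SynchronousMode=True)  # server waits for the client each frame

lidar = Lidar('Lidar64')
lidar.set_position(0, 0, 2.5)
lidar.set(
    Channels=64,
    Range=50,
    PointsPerSecond=100000,
    RotationFrequency=10)  # 10 full rotations per second
settings.add_sensor(lidar)

# Launch the server at a fixed time-step with '-benchmark -fps=10';
# with RotationFrequency equal to the fps, each frame covers one full rotation.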

@Romanenko-Serhii

Yes, I do as @nsubiron says to emulate the true lidar spin effect, and I use 2000 FPS (2000 FPS gives more LIDAR beams per rotation than 1000 FPS). Then in the _on_loop function I collect every LIDAR beam, and every 2000 frames I save them to a file.
Also, if you want ground truth at a single point in time, you can add a second LIDAR sensor and collect all of its points in one moment (it behaves like a flash LIDAR).
You can specify PointsPerSecond and RotationFrequency.

@michellevalente
Author

Thank you for the suggestions @nsubiron and @Romanenko-Serhii !

I'm trying to generate a dataset to pre-train a network that will afterwards use the KITTI dataset for the final results.
I did what @nsubiron recommended and I was able to get the full rotation, but I'm still having a problem that maybe you could help me understand. Between two of my scans I find almost no difference, while there is significant drift between two scans from the real dataset (which is basically what I need my learning algorithm to learn). I'm capturing the data at the same frequency as the real dataset, 10 Hz.

Do you have any idea why this is happening? Do you think it could be just due to the lack of the true lidar spin effect?

@Romanenko-Serhii

Romanenko-Serhii commented Oct 4, 2018

@michellevalente
You need to create a LIDAR with parameters like:

from carla import sensor  # CARLA 0.8.x Python client; 'settings' is a CarlaSettings instance

lidar = sensor.Lidar('Lidar_one_shot')
lidar.set_position(0, 0, 2.5)
lidar.set_rotation(0, 0, 0)
lidar.set(
    Channels=64,
    Range=50,
    PointsPerSecond=100000,  # important
    RotationFrequency=1,     # highly important
    UpperFovLimit=10,
    LowerFovLimit=-30)
settings.add_sensor(lidar)

Then you need to set FPS > 1500, e.g.: ./CarlaUE4.sh -windowed -ResX=800 -ResY=600 -carla-server -carla-world-port=2000 -fps=1600 -benchmark

Then modify the _on_loop function to collect all the bins* into one file.

This effect is most clearly seen on the ego vehicle when it drives past oncoming traffic.

*By a bin I mean one laser scan with 64 channels.
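A rough sketch of that accumulation step, assuming the CARLA 0.8.x client API (LidarMeasurement.point_cloud); the function name, output file name, and frame count are illustrative, and only the sensor name 'Lidar_one_shot' comes from the snippet above:

import numpy as np

FRAMES_PER_SWEEP = 1600  # with -fps=1600 and RotationFrequency=1, one rotation spans 1600 frames
_buffer = []

def on_new_frame(sensor_data, frame_number):
    # Call this once per simulation frame from the client loop (e.g. from _on_loop).
    measurement = sensor_data['Lidar_one_shot']
    _buffer.append(np.asarray(measurement.point_cloud.array))
    if len(_buffer) == FRAMES_PER_SWEEP:
        # Merge the partial scans of one full rotation into a single file.
        np.save('sweep_%06d.npy' % frame_number, np.concatenate(_buffer))
        del _buffer[:]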

@Romanenko-Serhii

By the way, by default your car model looks like two cubes. If you want a more realistic view of the car, you need to make some changes to the Unreal Engine CARLA project.

@Romanenko-Serhii

@michellevalente I saw an e-mail notification about your question on FPS, but I don't see it here. Do you still need an answer?

@michellevalente
Author

@Romanenko-Serhii
I got it! thank you :)

@WangWangPoint

Hi, do you know of a labeling tool for the KITTI dataset?

@maxjaritz

maxjaritz commented Nov 21, 2018

By the way, by default your car model looks like two cubes. If you want a more realistic view of the car, you need to make some changes to the Unreal Engine CARLA project.

@Romanenko-Serhii Could you explain what changes have to be made? Thank you!

@Romanenko-Serhii

Romanenko-Serhii commented Nov 21, 2018

@maxjaritz Content -> Blueprints -> Vehicles, choose some car (e.g. the Audi). There you will find its physical model. When you open it, in the right panel find the drop-down list whose current state is Box, change it to another state, and click the green button in the right corner.

P.S. Sorry, I don't have Carla at hand right now and can't give clearer instructions. If you run into problems, please post a screenshot.

@maxjaritz

Okay, thanks!

@ravishk1

Hi,

I want to use the Lidar as a different kind of distance sensor. Suppose I want to use the Lidar as a radar sensor; for that I want to restrict the 360-degree rotation of the Lidar so that it only covers, say, 0 to 180 degrees. Can we modify the Lidar configuration in such a way? Just like we restrict the upper FOV and lower FOV, can we restrict the horizontal FOV?

Any help will be highly appreciated.
Thanks !

@nsubiron
Collaborator

@ravishk1

can we restrict the Horizontal FOV?

That would be a nice feature, but it is not currently implemented. Maybe you can try implementing it yourself; take a look at RayCastLidar.cpp. This seems like a good first-time contribution 😉
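Until that exists server-side, a possible client-side workaround (a sketch only, not part of the CARLA API; the function name is made up) is to filter the received point cloud by azimuth:

import numpy as np

def restrict_horizontal_fov(points, fov_min_deg=0.0, fov_max_deg=180.0):
    # points: (N, 3) array of x, y, z coordinates in the sensor frame.
    azimuth = np.degrees(np.arctan2(points[:, 1], points[:, 0]))
    mask = (azimuth >= fov_min_deg) & (azimuth <= fov_max_deg)
    return points[mask]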

@v-prgmr

v-prgmr commented Jun 5, 2019

@nsubiron, is there no other way to capture a complete 360° sweep from the lidar other than adjusting the FPS of the server?

v-prgmr mentioned this issue Jun 7, 2019
@visakhchikku

Is it possible to save lidar data as .pcap in CARLA?
