Token indices sequence length is longer than the specified maximum sequence length for this model (2409 > 2048). Running this sequence through the model will result in indexing errors #13
Comments
This warning has no impact on training.

OK, thanks a lot.
My model's 3D perception is not good enough. This is my config file; may I ask if there is anything that needs to be modified? (Several lists and dicts are truncated in the original comment and are left as-is below.)

```python
_base_ = [
# If point cloud range is changed, the models should also change their point cloud range accordingly
point_cloud_range = [-51.2, -51.2, -5.0, 51.2, 51.2, 3.0]
# For nuScenes we usually do 10-class detection
class_names = [
num_gpus = 8
collect_keys = ['lidar2img', 'intrinsics', 'extrinsics', 'timestamp', 'img_timestamp',
                'ego_pose', 'ego_pose_inv', 'command', 'can_bus']
dataset_type = 'CustomNuScenesDataset'
file_client_args = dict(backend='disk')
ida_aug_conf = {
train_pipeline = [
data = dict(
optimizer = dict(constructor='LearningRateDecayOptimizerConstructor', type='AdamW',
optimizer_config = dict(type='Fp16OptimizerHook', loss_scale='dynamic',
                        grad_clip=dict(max_norm=35, norm_type=2))
# learning policy
lr_config = dict(
evaluation = dict(interval=num_iters_per_epoch * num_epochs, pipeline=test_pipeline)
find_unused_parameters = False  # when using checkpointing, find_unused_parameters must be False
load_from = 'ckpts/eva02_petr_proj.pth'
resume_from = None
```
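One easy thing to rule out when 3D detection quality is poor is a mismatch between the `point_cloud_range` given to the dataset pipeline and the range the detection head was built with. This is a minimal sanity-check sketch, not code from this repo; `model_pc_range` and `ranges_match` are illustrative names:

```python
# Minimal sketch: check that the dataset's point cloud range and the
# model head's configured range agree element-wise.
point_cloud_range = [-51.2, -51.2, -5.0, 51.2, 51.2, 3.0]

def ranges_match(dataset_range, model_range, tol=1e-6):
    """True if two [x_min, y_min, z_min, x_max, y_max, z_max] ranges agree within tol."""
    return len(dataset_range) == len(model_range) and all(
        abs(a - b) <= tol for a, b in zip(dataset_range, model_range)
    )

# Hypothetical range pulled from the model section of the config.
model_pc_range = [-51.2, -51.2, -5.0, 51.2, 51.2, 3.0]
assert ranges_match(point_cloud_range, model_pc_range)
```

If the two ranges disagree, predicted boxes near the boundary are silently clipped or dropped, which often looks like "the model is not good in 3D perception" rather than an outright error.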
Token indices sequence length is longer than the specified maximum sequence length for this model (2409 > 2048). Running this sequence through the model will result in indexing errors
I got this warning when I was training the model. Does this warning have an impact on training? How can I resolve it?
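The warning means a tokenizer produced 2409 token ids while the model's position embeddings only cover 2048, so feeding the full sequence through would index out of range. The usual remedy is to truncate before the ids reach the model. A minimal sketch of that idea, with illustrative names (this is not the repo's actual tokenization code):

```python
# Minimal sketch: clip a token id sequence to the model's maximum context
# length so it can never index past the position embedding table.
MAX_SEQ_LEN = 2048  # the model's specified maximum sequence length

def truncate_ids(token_ids, max_len=MAX_SEQ_LEN):
    """Drop token ids beyond max_len; returns a list of length <= max_len."""
    if len(token_ids) > max_len:
        # Mirrors the warning above (2409 > 2048): tokens past the limit are discarded.
        return token_ids[:max_len]
    return token_ids

ids = list(range(2409))   # stand-in for a 2409-token sequence
clipped = truncate_ids(ids)
assert len(clipped) == 2048
```

With a Hugging Face-style tokenizer, passing a truncation option at encode time has the same effect; the warning itself is emitted during tokenization and does not abort training.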