
Clarify the encoding of the sign bit in coordinates #16

Open
ts5746 opened this issue Jul 1, 2020 · 0 comments
Labels
documentation: Additional information or clarification required
interoperability: Impacts interoperability and backwards compatibility

Comments


ts5746 commented Jul 1, 2020

The standard does not specify how the sign of the coordinates in the ObjectLatitude and ObjectLongitude fields should be binary encoded. This should be explicitly specified.

Note that the Whiteflag API uses 0 for the `-` sign and 1 for the `+` sign.
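For illustration, the convention described above could be implemented as follows. This is a minimal sketch, not code from the Whiteflag API; the function names `encode_sign_bit` and `decode_sign_bit` are hypothetical, and the only assumption taken from this issue is the reported mapping (0 for `-`, 1 for `+`):

```python
def encode_sign_bit(coordinate: str) -> int:
    """Encode the leading sign character of a coordinate string as one bit.

    Assumes the convention reported for the Whiteflag API:
    0 encodes the '-' sign, 1 encodes the '+' sign.
    """
    sign = coordinate[0]
    if sign == '-':
        return 0
    if sign == '+':
        return 1
    raise ValueError("coordinate must start with an explicit '+' or '-' sign")


def decode_sign_bit(bit: int) -> str:
    """Decode a sign bit back to its sign character (inverse of the above)."""
    if bit == 1:
        return '+'
    if bit == 0:
        return '-'
    raise ValueError("sign bit must be 0 or 1")
```

For example, `encode_sign_bit('+23.34244')` would yield `1` and `encode_sign_bit('-163.34244')` would yield `0` under this convention. The point of the issue stands regardless of the sketch: the standard itself should state the mapping so that independent implementations stay interoperable.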

@ts5746 added the labels interoperability (Impacts interoperability and backwards compatibility) and documentation (Additional information or clarification required) on Jul 1, 2020
@ts5746 changed the title from "Calrify the encoding of the sign bit in coordinates" to "Clarify the encoding of the sign bit in coordinates" on Jul 1, 2020