
How to debug a "failed to parse all tokens" ParseError? #423

Closed
jimmycuadra opened this issue May 12, 2018 · 6 comments

Comments

@jimmycuadra

I'm working on updating ruma-api-macros from syn 0.11 to 0.13, and I finally got the version using 0.13 compiling. However, when I run the integration test, I get this error:

error: proc macro panicked
  --> tests/ruma_api_macros.rs:16:5
   |
16 | /     ruma_api! {
17 | |         metadata {
18 | |             description: "Does something.",
19 | |             method: Method::Get, // A `hyper::Method` value. No need to import the name.
...  |
51 | |         }
52 | |     }
   | |_____^
   |
   = help: message: ruma_api! failed to parse input: ParseError(Some("failed to parse all tokens"))

I see the error is generated here, but I'm not sure how to figure out why.

The macro invocation looks like this:

ruma_api! {
    metadata {
        description: "Does something.",
        method: Method::Get, // A `hyper::Method` value. No need to import the name.
        name: "some_endpoint",
        path: "/_matrix/some/endpoint/:baz",
        rate_limited: false,
        requires_authentication: false,
    }

    request {
        // With no attribute on the field, it will be put into the body of the request.
        pub foo: String,

        // This value will be put into the "Content-Type" HTTP header.
        #[ruma_api(header)]
        pub content_type: ContentType,

        // This value will be put into the query string of the request's URL.
        #[ruma_api(query)]
        pub bar: String,

        // This value will be inserted into the request's URL in place of the
        // ":baz" path component.
        #[ruma_api(path)]
        pub baz: String,
    }

    response {
        // This value will be extracted from the "Content-Type" HTTP header.
        #[ruma_api(header)]
        pub content_type: ContentType,

        // With no attribute on the field, it will be extracted from the body of the response.
        pub value: String,
    }
}

and I'm attempting to parse it like this:

pub struct Exprs {
    pub inner: Vec<Expr>,
}

impl Synom for Exprs {
    named!(parse -> Self, do_parse!(
        exprs: many0!(syn!(Expr)) >>
        (Exprs {
            inner: exprs,
        })
    ));
}

#[proc_macro]
pub fn ruma_api(input: TokenStream) -> TokenStream {
    let exprs: Exprs = syn::parse(input).expect("ruma_api! failed to parse input");
    // ...
}

Apparently the structure of the macro syntax cannot be represented as a series of Exprs? I tried using ExprStruct instead, but no dice. Is there another type I should be using to represent this syntax? I might be able to figure it out myself if I had a better idea where the parser was getting stuck.

@dtolnay
Owner

dtolnay commented May 12, 2018

request {
    pub foo: String,
}

This is not syntactically an expression in Rust.

@dtolnay
Owner

dtolnay commented May 12, 2018

For "failed to parse" errors, I typically bisect by removing lines until the error goes away, then minimize until it is obvious what went wrong.

@dtolnay
Owner

dtolnay commented May 12, 2018

From your sample input it looks like the relevant types are going to be syn::FieldValue and syn::Field. Something like:

use syn::{Field, FieldValue};
use syn::punctuated::Punctuated;
use syn::synom::Synom;

struct Exprs {
    metadata: Vec<FieldValue>,
    request: Vec<Field>,
    response: Vec<Field>,
}

type ParseMetadata = Punctuated<FieldValue, Token![,]>;
type ParseFields = Punctuated<Field, Token![,]>;
impl Synom for Exprs {
    named!(parse -> Self, do_parse!(
        custom_keyword!(metadata) >>
        metadata: braces!(ParseMetadata::parse_terminated) >>
        custom_keyword!(request) >>
        request: braces!(call!(ParseFields::parse_terminated_with, Field::parse_named)) >>
        custom_keyword!(response) >>
        response: braces!(call!(ParseFields::parse_terminated_with, Field::parse_named)) >>
        (Exprs {
            // Convert from Punctuated to Vec.
            metadata: metadata.1.into_iter().collect(),
            request: request.1.into_iter().collect(),
            response: response.1.into_iter().collect(),
        })
    ));
}

@jimmycuadra
Author

Thanks much, @dtolnay! That was enough info for me to progress. Do you think there's anything actionable here in terms of documentation additions? If not, feel free to close the issue.

@rushmorem

rushmorem commented Jun 1, 2018

I just ran into this while trying to figure out #436. The trick to debugging this is calling Synom::parse on the type that is not being parsed correctly, rather than syn::parse, and analysing the result. The result is very informative: it tells you what was parsed and gives you a cursor to the remaining TokenStream. By turning what was parsed back into a TokenStream and pretty-printing it, you can see what it was parsed into. You can then turn the cursor into a TokenStream by calling token_stream() on it to see what the remaining tokens were. Enough talk, here is the code:

use syn::buffer::TokenBuffer;
use syn::synom::Synom;

let buffer = TokenBuffer::new(input);
match Exprs::parse(buffer.begin()) {
    Ok((exprs, rest)) => {
        println!("Parsed Exprs:\n{:#?}", exprs.inner);
        println!("Remaining tokens:\n{:#?}", rest.token_stream());
    }
    Err(error) => println!("Error: {:?}", error),
}

@dtolnay
Owner

dtolnay commented Sep 1, 2018

This is being addressed by #47 and will be shipping in 0.15.

@dtolnay dtolnay closed this as completed Sep 1, 2018