Replies: 1 comment · 1 reply
-
Hey @hydra1983, I've converted this into a discussion. Have you tried profiling your code yet? Node.js features some pretty good profiling that should help you identify bottlenecks. Generally, Chevrotain has been the fastest option for parsing text in the JS space for me. However, it is easy to accidentally run into slow behavior. For example, do you get any parser errors? Or does the whole input parse without errors? Do you recreate the parser during your benchmark, or do you use the same object for the whole benchmark?
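To make the last two points concrete, here is a minimal sketch using an assumed toy grammar (a comma-separated list of integers, not the actual g4 grammar from the benchmark); the `ListParser` and `parse` names are illustrative only. The idea is to build the Lexer and Parser once, reuse the same instances for every input, and check both lexer and parser errors so that error recovery does not silently dominate the timings:

```typescript
import { createToken, Lexer, CstParser } from "chevrotain";

// Token definitions for the toy grammar: a comma-separated list of integers.
const Integer = createToken({ name: "Integer", pattern: /\d+/ });
const Comma = createToken({ name: "Comma", pattern: /,/ });
const WhiteSpace = createToken({
  name: "WhiteSpace",
  pattern: /\s+/,
  group: Lexer.SKIPPED,
});
const allTokens = [WhiteSpace, Integer, Comma];

// Built once: Lexer construction analyses all token patterns up front.
const lexer = new Lexer(allTokens);

class ListParser extends CstParser {
  // list := Integer (Comma Integer)*
  public list = this.RULE("list", () => {
    this.CONSUME(Integer);
    this.MANY(() => {
      this.CONSUME(Comma);
      this.CONSUME2(Integer);
    });
  });

  constructor() {
    super(allTokens);
    // Self-analysis is expensive and runs once per parser instance,
    // which is why the instance should be reused across parses.
    this.performSelfAnalysis();
  }
}

// Built once and reused for every input in the benchmark loop.
const parser = new ListParser();

function parse(text: string) {
  const lexResult = lexer.tokenize(text);
  if (lexResult.errors.length > 0) {
    throw new Error(`Lexing error: ${lexResult.errors[0].message}`);
  }

  // Assigning .input resets the parser state, so the same instance is safe to reuse.
  parser.input = lexResult.tokens;
  const cst = parser.list();

  // Error recovery is far slower than the happy path, so a benchmark whose
  // inputs trigger parser errors will look much worse than it should.
  if (parser.errors.length > 0) {
    throw new Error(`Parsing error: ${parser.errors[0].message}`);
  }
  return cst;
}

console.log(JSON.stringify(parse("1, 2, 3"), null, 2));
```

For the profiling question, running the benchmark script with `node --prof` and then post-processing the generated log with `node --prof-process` is usually enough to see whether the time is going into the lexer, the parser rules, or error recovery.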
1 reply
-
It took me three days to implement the g4 grammar using Chevrotain, but the performance turned out to be significantly worse than before.
[Screenshot: Antlr4 benchmark results]
[Screenshot: Chevrotain v10.5.0 benchmark results]