Define a base to accept a framework #53
I think the effort to implement and maintain such a list is too high. However, in the fastify README we should filter them by popularity/downloads. Do we need a router? Does it need to have middleware/plugins?
I don't think this would work, because every framework has its own features. Maybe we could add a feature list table, like we did in: https://github.com/delvedor/router-benchmark#router-features In any case, I think we should not accept frameworks with few downloads per week.
From my point of view, benchmarking the router (aka requests/sec) is not enough to choose an HTTP framework. We also need to know the maturity and maintenance of the project.
In my opinion we should keep it very simple. Only compare frameworks that provide at least the following featureset:
But we should be open to innovations that don't yet meet the download threshold.
I am with @allevo and @StarpTech on this one. On the other hand, I believe every framework has the right to be benchmarked (and listed); personally, I am not a fan of exclusion. That is, of course, provided we want to become something like a "benchmarking aggregator", the likes of this, for instance. To start, maybe stick to Node.js frameworks?
The question is, how do you define whether a project is innovative or not? In my opinion, having dozens of frameworks in our benchmark is useless and confusing for the reader; there are other projects for that.
Right, it is starting to get out of hand. For example is this: If
Yes. |
@delvedor, hence my comment:
@dougwilson, agreed. Now the question is: which ones to keep as part of the benchmark?
The number of frameworks in the benchmark has grown very large. I agree that it would be a good idea to have some criteria that can be used to determine which frameworks should and shouldn't be part of the benchmark. Here's my suggestion:
Could we say that JSON should be the "first-class" value? Then we can create a PR template for it.
Hmm, "first-class JSON" might be a little too specific. I think the main thing I'm trying to get at with point 3 is that the framework should do more than just pass Node's For example, Fastify has a bunch of great features like decorators, lifecycle hooks, and async handlers. Express extends
In my opinion, we cannot add every framework in the Node ecosystem; there are a lot!
We should define a base rule for deciding whether or not to accept a framework in our list.
I think the best solution could be the number of downloads, in which case we should define a threshold (at least 1k downloads per week?).
Thoughts?
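A minimal sketch of what such an acceptance rule could look like, combining the criteria raised in this thread (a router, middleware/plugin support, and the proposed 1k weekly-download threshold). The function and field names are invented for illustration; weekly download counts could come from the npm registry's downloads endpoint (e.g. https://api.npmjs.org/downloads/point/last-week/fastify).

```javascript
// Hypothetical acceptance check for the benchmark list.
// Criteria and the 1k threshold come from this discussion;
// the names here are invented, not part of the repo.
const MIN_WEEKLY_DOWNLOADS = 1000;

function isEligible(framework) {
  return (
    framework.hasRouter &&
    framework.hasMiddlewareOrPlugins &&
    framework.weeklyDownloads >= MIN_WEEKLY_DOWNLOADS
  );
}

console.log(isEligible({
  hasRouter: true,
  hasMiddlewareOrPlugins: true,
  weeklyDownloads: 50000,
})); // true
```

An explicit check like this could back a PR template: a submitter ticks each criterion, and maintainers verify the download count before merging.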