Per material GLSL ShaderChunk includes #10789
Comments
I assume you want to inject code into a built-in material's shader on a per-instance basis. One hack was posted here. I assume you are looking for a nicer API for doing so. Is that correct?
Yes, so as not to override the global chunks. See the pull request. The hack takes care of the defines, but then the chunks have to be further hacked by adding conditional branching in the preprocessor. You have to take care that the branching is coordinated with the defines correctly, and that it doesn't break other stuff. With this there would be no need for branching.
In the future, I'd consider an API where you can inject GLSL at more abstract entry points. As much as I hated working with SceneKit, I really liked this API.
Description of the problem
I was able to do magical things with a few undocumented features: Material.defines, Material.customDepthMaterial and Material.customDistanceMaterial. #10764
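Roughly how those pieces fit together, as a minimal sketch (heightMap, USE_MY_EFFECT and the plane geometry are just illustrative names, and the define only has an effect once a shader chunk actually references it; see the chunk-patch sketch further down):

```js
var heightMap = new THREE.TextureLoader().load( 'height.png' ); // hypothetical texture

var material = new THREE.MeshStandardMaterial( { displacementMap: heightMap } );
material.defines = { USE_MY_EFFECT: '' }; // emitted as "#define USE_MY_EFFECT" in the program

// The shadow/depth pass compiles its own program, so it needs the same
// displacement map and the same define, supplied through a custom depth material.
var depthMaterial = new THREE.MeshDepthMaterial( {
	depthPacking: THREE.RGBADepthPacking,
	displacementMap: heightMap
} );
depthMaterial.defines = { USE_MY_EFFECT: '' };

var mesh = new THREE.Mesh( new THREE.PlaneBufferGeometry( 10, 10, 128, 128 ), material );
mesh.customDepthMaterial = depthMaterial; // picked up when rendering shadow depth
// mesh.customDistanceMaterial would do the same for point-light shadows.
```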
I'm wondering now if this whole system could be extended by having the renderer use a chunk provided by the material instead of THREE.ShaderChunks. It's possible to modify the shader chunks with the preprocessor so that they don't break. This can then easily be monkey patched into three: before the first render call is made, THREE.ShaderChunk[ chunk ] is modified, and that branch can then be controlled by assigning a define to a material instance's defines. The shader chunk mechanism is already so neatly broken up that it could probably cover most cases.
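Something like this, done once at startup; a minimal sketch using begin_vertex as the example chunk and the made-up define USE_MY_EFFECT, so materials that don't set it keep the original code path:

```js
// Patch the chunk once, before the first render call.
THREE.ShaderChunk.begin_vertex = [
	'#ifdef USE_MY_EFFECT',
	'	vec3 transformed = vec3( position ) + vec3( 0.0, sin( position.x * 4.0 ), 0.0 );',
	'#else',
	THREE.ShaderChunk.begin_vertex, // original, unmodified code path
	'#endif'
].join( '\n' );

// Opt in per material instance through its defines:
var material = new THREE.MeshStandardMaterial();
material.defines = { USE_MY_EFFECT: '' };
```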
For example, something that modifies the geometry can use uv_pars_vertex to safely be included in all the needed shaders. The depth shader doesn't need to render maps, but it needs to fetch a texture if it uses displacement, so it includes the UVs. Something that modifies lighting can make a branch in one of the lighting includes.

I'm thinking monkey patching and branching could be avoided if the renderer would look up a chunk provided by something like Material.shader.entryPoint['uv_pars_vertex'].
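Roughly, the lookup could work like this; entirely hypothetical, since none of Material.shader, entryPoint or resolveChunk exist in three.js:

```js
// Hypothetical: prefer a material-provided chunk, fall back to the global one.
function resolveChunk( material, name ) {
	if ( material.shader && material.shader.entryPoint && material.shader.entryPoint[ name ] !== undefined ) {
		return material.shader.entryPoint[ name ]; // per-material override
	}
	return THREE.ShaderChunk[ name ]; // current global behaviour
}

// A material could then carry its own chunk without touching the globals:
var material = new THREE.MeshStandardMaterial();
material.shader = {
	entryPoint: {
		uv_pars_vertex: THREE.ShaderChunk.uv_pars_vertex + '\nuniform vec2 myUvOffset;'
	}
};
```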
In addition to this, I'd add more entry points, at least a global_vertex and global_fragment, or a define #IS_FRAGMENT, so that any global stuff could be appended to common.
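Something like the following; hypothetical, since #IS_FRAGMENT doesn't exist and this assumes the renderer would define it only when building the fragment program:

```js
// One chunk appended to 'common' could then carry declarations for both stages.
var sharedGlobals = [
	'varying vec3 vMyWorldPosition;',       // visible to both stages
	'#ifdef IS_FRAGMENT',
	'	uniform sampler2D myLookupTexture;', // fragment-only declaration
	'#endif'
].join( '\n' );

THREE.ShaderChunk.common += '\n' + sharedGlobals; // global append, as in the current workaround
```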
I'm confused, though, why parameters are being passed here:
https://github.com/mrdoob/three.js/blob/master/src/renderers/webgl/WebGLProgram.js#L496
tl;dr: would this be a good idea?
https://github.com/mrdoob/three.js/blob/master/src/renderers/webgl/WebGLProgram.js#L153
@WestLangley @mrdoob