Customising Shaders #249

Closed
vorg opened this issue Sep 27, 2019 · 34 comments
Labels
type/feat A new feature

Comments

vorg commented Sep 27, 2019

Currently we can set material.vert and material.frag, but that breaks because we also include them in the depth pre-pass shaders: vert: material.vert || DEPTH_PASS_VERT. The correct (needs testing) way is to define all three shaders in a single source and use the preprocessor to support all three cases:

#ifdef DEPTH_PRE_PASS_ONLY
//copy pre pass shader
#elseif DEPTH_PASS_ONLY
//copy depth shader
#else
//pbr shader
#endif
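A minimal sketch of how that single source could serve all three passes: compile it three times, prepending a per-pass define (the withDefine helper is hypothetical, and any #version line would need to stay first):

// compile the same fragment source three ways by prepending a define
const withDefine = (src, def) => (def ? `#define ${def}\n${src}` : src);
const depthPrePassFrag = withDefine(material.frag, 'DEPTH_PRE_PASS_ONLY');
const depthPassFrag = withDefine(material.frag, 'DEPTH_PASS_ONLY');
const pbrFrag = withDefine(material.frag, null); // falls through to the #else branch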

That's painful.

Related

vorg commented Sep 27, 2019

After looking at SceneKit's shader modifiers, our API could look e.g. like this:

material.shaderModifiers = {
   // model space, before skinning, world transform and projection
   geometry: `
      uniform float uDisplaceAmplitude;
      vertexData.position += vertexData.normal * uDisplaceAmplitude;
   `,
   // after reading all uniforms and texture values
   // before lighting, values in view space
   surface: `
      vec3 textureTriplanar(vec3 worldPos, vec3 normal) {
         ...
      }
      surfaceData.baseColor = textureTriplanar(worldPos, worldNormal);
   `,
   // after lighting (but before or after exposure and tonemapping?)
   fragment: `
      outputData.color.rgb = vec3(1.0) - outputData.color.rgb;
      // but also
      outputData.color.rgb = surfaceData.normal * 0.5 + 0.5;
   `
}

Notes

  • geometry runs in the vertex shader; surface and fragment run in the fragment shader
  • the surface stage assumes values in view space, but we already compute values in world space (e.g. for envmaps); those could be made available too
  • SceneKit includes another lighting stage that can execute in the vertex or fragment shader
  • SceneKit allows modification of texture coordinates, which means colors from textures are read after this stage
  • the fragment stage in SceneKit is the final pixel color

Open questions:
How do you define uniforms?

  1. we could detect and extract uniforms and functions, as assumed in the example above (see the sketch after the next snippet)
  2. we split the modifier into declaration and code parts:
material.shaderModifiers = {
   // model space, before skinning, world transform and projection
   geometry: {
      defs: `uniform float uDisplaceAmplitude;`,
      code: `vertexData.position += vertexData.normal * uDisplaceAmplitude;`
   }
}
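For comparison, a minimal sketch of what option 1's detection could look like: a naive regex over the modifier string (the splitModifier helper is hypothetical, and it already cannot handle function declarations without real parsing, which hints at the edge cases raised below).

// naive split of a modifier string into uniform declarations and body code;
// function declarations would need real parsing, which is exactly the fragile part
const UNIFORM_RE = /^[ \t]*uniform[^;]+;[ \t]*$/gm;
function splitModifier(src) {
  const defs = (src.match(UNIFORM_RE) || []).join('\n');
  const code = src.replace(UNIFORM_RE, '').trim();
  return { defs, code };
}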

@simonharrisco

  • why geometry and surface? Is this additional jargon that might just add confusion?
  • are you going to be able to change these on the fly?
  • "detect and extract uniforms and functions": this stinks of heartache and pain from edge cases, debugging, etc.

vorg commented Sep 27, 2019

is this additional jargon that might just add confusion?

Hmm, these are basic concepts in computer graphics. Geometry defines the shape of the thing being rendered, and surface parameterises how that shape responds to light. And they are not as detached from the renderer's names as it might first seem.

geometry - it makes sense if you notice that those are raw values coming from the geometry component attributes; you could also call it vertex

surface - this is an excellent name to clarify where the modification happens. In our code we have one PBRData blob that contains raw values from uniforms and textures (surface data), color values computed from the lights based on the BRDF (reflectance function output data), and the final fragment color before and after post-processing (if done inline, fragment data)

vorg commented Sep 27, 2019

are you going to be able to change these on the fly?

Same as currently: recompilation happens whenever any material parameter that modifies the shader changes.
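A minimal sketch of what that implies internally, assuming programs are cached by the generated source so a changed shader-modifying parameter simply produces a cache miss (names are hypothetical, not the actual pex-renderer cache):

// hypothetical program cache keyed by the final generated source
const programCache = new Map();
function getProgram(compile, vert, frag) {
  const key = vert + '\u0000' + frag;
  let program = programCache.get(key);
  if (!program) {
    program = compile(vert, frag); // whatever backend call creates the GL program
    programCache.set(key, program);
  }
  return program;
}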

"detect and extract uniforms and functions" stinks of heartache and pain from edge cases and debugging etc

I think so too

dmnsgn commented Sep 30, 2019

Open questions:
How do you define uniforms?

Could we make use of material._uniforms somehow?

With the shaderModifiers syntax above, it feels a bit vague as to where custom code is added (is it at the top, at the uniform declarations, at the beginning/end of main()...).

vorg commented Sep 30, 2019

@dmnsgn which part of it is vague?

If code, then the way SceneKit defines it is very clear: "after the above values are defined/read but before they are used". E.g. vertex before skinning, surface after getBaseColor() (reading the uniform or texture) but before applying lighting, and frag at the very end.

The problem is e.g. if you want to declare a function as well. Hence the proposal for a defs part, so I can add e.g. a vec3 triplanarTexturing() { ... } function just before main.

_uniforms is just a cache
uniforms - is the list of values for any new uniforms you have declared in your modified shader

vorg commented Sep 30, 2019

Code injection points
geometry: material.vert.js#L134 and depth-pass.vert.js#L124
surface: material.frag.js#L203
fragment: material.frag.js#L253

dmnsgn commented Sep 30, 2019

@dmnsgn which part of it is vague?

How do I handle modifying the vertex position before or after instancing if I only have one injection point? Same if I need to modify data.normalWorld before or after.
Not 100% sure these are valid use cases, but worth thinking about how the injection point could be limiting.

vorg commented Sep 30, 2019

Yes, it's limiting, but it also makes it clear how things work and provides immunity from underlying shader code changes. Otherwise things get complicated fast just to cover the remaining 5-10% of use cases, and we end up where we are now: "copy the whole shader, modify what you want and hope it works in the X.1 release" https://twitter.com/bonzajplc/status/1177715088079368194?s=21

vorg commented Sep 30, 2019

Another good thing I see coming out of this: we could clean up the code a bit, as values are currently defined and computed all over the place at different stages, e.g. reflectionWorld is added to the global PBRData struct only if we have reflection probes material.frag.js#L207

vorg commented Sep 30, 2019

A good idea is to write a list of use cases:

Easy

  • tri planar texture mapping -> surface
  • generative colors (base color, metallic) -> surface
  • toon shading -> fragment
  • showing debug information (normals, ao, texture coords) -> fragment
  • picking color id -> fragment

Not clear how to prototype

  • displacement mapping -> vertex + surface (frag based normal recalc?)
  • subsurface scattering -> ?
  • irradiance volumes -> more of a lighting modifier
  • projective texturing (projection mapping, light cookies) -> post lighting blend?
  • particle depth write -> fragment?

vorg commented Sep 30, 2019

@dmnsgn to answer your earlier question about uniforms and varyings: similar to what thi.ng/umbrella is doing, we could move to more modular shader definitions, like here: https://github.com/thi-ng/umbrella/blob/master/examples/webgl-ssao/src/shaders.ts#L20


This way a custom shader would push its uniforms to a material.uniforms defs array, add custom functions to e.g. a material.lib list of snippets, and then the custom part to material.modifiers.surface. Nice and clean... even possibly WebGL2 compilable.
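A sketch of the shape that could take on the material (all property names here are speculative, invented for illustration):

// speculative API shape for a modular custom shader definition
material.uniforms.push({ name: 'uDisplaceAmplitude', type: 'float' });
material.lib.push(/*glsl*/ `
  vec3 textureTriplanar(vec3 worldPos, vec3 normal) { /* ... */ }
`);
material.modifiers.surface = /*glsl*/ `
  surfaceData.baseColor = textureTriplanar(data.positionWorld, data.normalWorld);
`;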

vorg commented Sep 30, 2019

For now, this is what's happening. The renderer holds the following shader chunks:

this.shaders = {
    chunks: SHADERS_CHUNKS,
    pipeline: {
      material: {
        vert: PBR_VERT,
        frag: PBR_FRAG
      }
    }
  }

We customize the material shader:

material.frag = renderer.shaders.pipeline.material.frag.replace(...)

but that then gets used for the depth pre-pass:

 return {
      flags: flags,
      vert: material.vert || DEPTH_PASS_VERT,
      frag: material.frag || DEPTH_PRE_PASS_FRAG
    }

What I think should happen is:

depth pre-pass - renders geometry and outputs depth and normals, later used for SSAO
depth pass - simplified shader for shadow map depth rendering
material - normal PBR with full lighting

material.frag = `#ifdef DEPTH_PRE_PASS_ONLY
${renderer.shaders.pipeline.depthPrePass.frag.replace(...)}
#elseif DEPTH_PASS_ONLY
${renderer.shaders.pipeline.depthPass.frag.replace(...)}
#else
${renderer.shaders.pipeline.material.frag.replace(...)}
#endif`

vorg commented Oct 4, 2019

Proper syntax, with #elif defined:

var frag = `#ifdef DEPTH_PRE_PASS_ONLY
  ${renderer.shaders.pipeline.depthPrePass.frag}
  #elif defined DEPTH_PASS_ONLY
  ${renderer.shaders.pipeline.depthPass.frag}
  #else
  ${renderer.shaders.pipeline.material.frag.replace(
    `gl_FragData[0] = encode(vec4(color, 1.0), uOutputEncoding);`,
    `color = data.normalView * 0.5 + 0.5;
    gl_FragData[0] = encode(vec4(color, 1.0), uOutputEncoding);`
  )}
  #endif`

@pailhead

@vorg

Open questions:
How do you do you define uniforms?

There could be an additional "above the fold" point where you can insert GLSL outside of main(){}. Three's chunks are more or less aligned with what SceneKit does with stages; e.g. you can transform the vertex in model space in almost every shader by replacing/modifying the same chunk. However it's missing a lot: for example, the common chunk is included in both the vertex and fragment shaders, so putting something like attribute float foo; in there will break the fragment shader.

Also, there would not be a need to even define them in GLSL if you could pass the type in along with the uniform:

const uFoo = { value: 1, type: 'i' }
const uBar = { value: 1, type: 'f' }
const uBaz = { value: new Vector3() } // most actually wouldn't need a type, so maybe omit ints, allow only floats...

@pailhead

@dmnsgn

How do I handle modifying vertex position before or after instancing if I only have one injection point?

Currently you can do it by modifying various THREE.ShaderChunk values at runtime 😢. If you prepend to the instancing chunk, it will happen before instancing; if you append, it will happen after. You actually have way more granular stages today than you would in this proposal, I believe. The problem is that it's not clear how to access them, and the interface is IMHO extremely clunky and opinionated (and buggy on top of everything).

More info.

@pailhead

displacement mapping -> vertex + surface (frag based normal recalc?)

This example is interesting. I don't think you would want to write normal-calculation code every time for every shader. This could perhaps be automated, taking into account only the fact that the default normal is no longer correct. So, for example, the presence of some GLSL in some field could trigger a define to recalculate the normals. I think you'd want to limit this to just vertex, and have the surface stuff happen in a general way (vertex moved / normals invalid), automagically.

pailhead commented Jan 16, 2020

Also might be worth checking out (help welcome!) this . It's on the shelf atm, but I think a couple of these materials are completely wired up. Basically you can remove all mention of materials other than ShaderMaterial from WebGLRenderer and everything should still work fine.

The approach is exactly the same as how WebGLRenderer is handling this, except that it happens outside, and you have control over it way before the internals such as onBeforeCompile kick in. I was able to easily implement my approach to shader includes by using this.

State of the world:

[ShaderMaterial] \
[______Material]  _\_______ [ WebGLRenderer ] 

ChunkMaterial basically does this:

[______Material] ______ [ShaderMaterial] _______ [WebGLRenderer] 

You're free to process the template any way you want, but you get getters and setters so you don't have to do

material.map.value = myTexture
material.uMyUniform.value = 1

You get:

material.map = myTexture
material.uMyUniform = 1

vorg commented Jan 21, 2020

Note to self:

If the above if/else doesn't work, try:

#ifdef DEPTH_PRE_PASS_ONLY
//copy pre pass shader
#endif
#ifdef DEPTH_PASS_ONLY
//copy depth shader
#endif
#ifndef DEPTH_PRE_PASS_ONLY     
#ifndef DEPTH_PASS_ONLY     
//pbr shader      
#endif
#endif   

vorg commented Jan 21, 2020

@pailhead thanks for the comments. I'll definitely review the ThreeJS struggles. It seems though that everybody (three, babylonjs, unity) is moving towards shader graphs. I need to understand better "why now?", as there was a big push against that in the gamedev industry a few years back. One benefit of a shader graph would be that it fits naturally with http://nodes.io which is where pex-renderer is used the most.

pailhead commented Feb 14, 2020

I don't understand the idea of compiling shader graphs on the web. It seems that three's NodeMaterial is struggling with just that. I really like the SceneKit approach to this problem. I think that after you've made some graph, in some application, you should generate some GLSL; in three's case, something compatible with it (specific uniform/attribute names and such).

Heh, I also didn't realize I wasn't on three's page anymore :D

vorg commented Nov 4, 2022

My latest preferred approach would be to have named hook/extension points + optional source string replacement like BabylonJS Material Plugins

Curious how ThreeJS does it.
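For reference, a Babylon material plugin looks roughly like the black-and-white example in their docs (reproduced from memory, so treat the exact names as approximate):

// sketch of a BabylonJS material plugin: named injection points mapped to GLSL snippets
class BlackAndWhitePlugin extends BABYLON.MaterialPluginBase {
  constructor(material) {
    super(material, 'BlackAndWhite', 200, { BLACKANDWHITE: false });
  }
  getClassName() {
    return 'BlackAndWhitePlugin';
  }
  getCustomCode(shaderType) {
    if (shaderType === 'fragment') {
      // map of named injection points to GLSL snippets
      return {
        CUSTOM_FRAGMENT_MAIN_END: `
          float luma = gl_FragColor.r * 0.299 + gl_FragColor.g * 0.587 + gl_FragColor.b * 0.114;
          gl_FragColor = vec4(luma, luma, luma, 1.0);
        `,
      };
    }
    return null;
  }
}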

vorg commented Nov 4, 2022

How would it work in practice?

entity.material = components.material({
  type: 'standard',
  baseColor: [1, 0, 0, 1],
  vertHooks: {
    AFTER_DECLARATIONS: /*glsl*/ `
      uniform vec2 uTextureScale;
    `,
    BEFORE_POSITIONOUT: /*glsl*/ `
      vTexCoord0 *= uTextureScale;
    `
  },
  fragHooks: {
    AFTER_DECLARATIONS: /*glsl*/ `
      uniform sampler2D uVertexNormalMap;
    `,
    BEFORE_PBR: /*glsl*/ `
      data.normalView = texture2D(uVertexNormalMap, data.texCoord0).rgb;
    `
  }
})

vorg commented Nov 4, 2022

It's a simple solution that could be battle-tested in the current project for smooth normals from a vertex texture, and reviewed afterwards. There are lots of complications to be considered beyond GLSL modification, e.g. where does the data come from? If we were to follow the Babylon approach with getUniforms(), it could look like:

entity.material = components.material({
  type: 'standard',
  baseColor: [1, 0, 0, 1],
  vertHooks: { ... },
  fragHooks: { ... },
  uniformHook: (entity) => {
    return {
      uTextureScale: entity.material.textureScale,
      uVertexNormalMap: entity.material.vertexNormalMap
    }
  }
})

or even

entity.material = components.material({
  type: "standard",
  baseColor: [1, 0, 0, 1],
  hooks: {
    vert: {
      AFTER_DECLARATIONS: /*glsl*/ `
       uniform vec2 uTextureScale;
        `,
    },
    frag: {
      AFTER_DECLARATIONS: /*glsl*/ `
       uniform sampler2D uVertexNormalMap;
     `,
    },
    uniforms: (entity) => {
      return {
        uTextureScale: entity.material.textureScale,
        uVertexNormalMap: entity.material.vertexNormalMap,
      };
    },
  },
})

dmnsgn commented Nov 4, 2022

I'll attempt a summary (WIP):

Hooks

Not a huge fan of hooks for a few reasons.

Cons:

  • naming things is hard, no way I'll remember these names without checking docs
  • how do we choose where to insert them, do we add one hook after each function/stage?
  • how would they be injected? Shaders are strings so we'd need to
    • clutter shaders with comment/pragma/include/custom syntax #include AFTER_DECLARATIONS + regex replace
    • make shaders source function where we pass hooks as options?
  • doesn't allow modifying existing parts of a material
  • reliance on vert/frag to have these hooks

Pros:

  • easily replaceable
  • fastest to implement

Full shader + #include declaration

Both three and lygia moved to #include chunkOfLibrary.

Pros:

  • smaller shaders, visually parsable at a higher level

Cons:

  • hoping there are no breaking changes between releases
  • not sure it's the best for tree shaking
  • risk of not finding which included part is triggering an error

Shader graph

Pros:

  • most customisable
  • could handle dependencies
  • could be used as a better way to visualise a shader
  • nodes friendly

Cons:

  • technical overhead
  • would be kind of object oriented hooks?

Side notes:

Three/dgel can do automatic variables insertion

My ideal scenario:

  • avoid regex replace: prone to error, hard to write/debug, monkeypatching
  • no custom syntax
  • something esm friendly and portable between es120/es300/wgsl
  • handling of dependencies
  • easy to debug/overwrite for quick testing

vorg commented Nov 4, 2022

Adding some examples for context

#include declaration

ThreeJS

https://github.com/mrdoob/three.js/blob/dev/src/renderers/shaders/ShaderLib/meshphong.glsl.js

The fact that all those are globally defined with no path/source doesn't look like much fun to debug.

export const vertex = /* glsl */`
#define PHONG
varying vec3 vViewPosition;
#include <common>
#include <uv_pars_vertex>
#include <uv2_pars_vertex>
#include <displacementmap_pars_vertex>
#include <envmap_pars_vertex>
#include <color_pars_vertex>
#include <fog_pars_vertex>
#include <normal_pars_vertex>
#include <morphtarget_pars_vertex>
#include <skinning_pars_vertex>
#include <shadowmap_pars_vertex>
#include <logdepthbuf_pars_vertex>
#include <clipping_planes_pars_vertex>

or

void main() {
	#include <clipping_planes_fragment>
	vec4 diffuseColor = vec4( diffuse, opacity );
	ReflectedLight reflectedLight = ReflectedLight( vec3( 0.0 ), vec3( 0.0 ), vec3( 0.0 ), vec3( 0.0 ) );
	vec3 totalEmissiveRadiance = emissive;
	#include <logdepthbuf_fragment>
	#include <map_fragment>
	#include <color_fragment>
	#include <alphamap_fragment>
	#include <alphatest_fragment>
	#include <specularmap_fragment>

And then

import alphamap_fragment from './ShaderChunk/alphamap_fragment.glsl.js';
import alphamap_pars_fragment from './ShaderChunk/alphamap_pars_fragment.glsl.js';
import alphatest_fragment from './ShaderChunk/alphatest_fragment.glsl.js';
import alphatest_pars_fragment from './ShaderChunk/alphatest_pars_fragment.glsl.js';
import aomap_fragment from './ShaderChunk/aomap_fragment.glsl.js';
import aomap_pars_fragment from './ShaderChunk/aomap_pars_fragment.glsl.js';
import begin_vertex from './ShaderChunk/begin_vertex.glsl.js';
import beginnormal_vertex from './ShaderChunk/beginnormal_vertex.glsl.js';
import bsdfs from './ShaderChunk/bsdfs.glsl.js';
import iridescence_fragment from './ShaderChunk/iridescence_fragment.glsl.js';
import bumpmap_pars_fragment from './ShaderChunk/bumpmap_pars_fragment.glsl.js';

Lygia https://github.com/guidoschmidt/lygia_threejs_examples/blob/main/examples/forward-rendering.advanced/src/glsl/forward.frag.glsl

One issue I see there is that it implies access to the library. For nodes, does it mean glslify-like code on the client side?

#include "../../../lygia/lighting/envMap.glsl";
#include "../../../lygia/lighting/pbr.glsl";
#include "../../../lygia/lighting/shadow.glsl";
#include "../../../lygia/lighting/material/new.glsl"
#include "../../../lygia/lighting/atmosphere.glsl"
#include "../../../lygia/color/dither.glsl"
#include "../../../lygia/color/tonemap.glsl"

void main() {
  vec4 final = vec4(0.0, 0.0, 0.0, 1.0);

  vec3 lightPos = u_lightPosition;
  vec3 normal = normalize(v_normal);
  vec3 lightColor = vec3(1.0);
  vec3 lightDir = normalize(lightPos - v_position);
  vec3 viewDir = normalize(u_cameraPosition - v_position);

  Material material = materialNew();
  material.albedo.rgb = u_diffuseColor;
  material.emissive.rgb = u_diffuseColor * 0.0;
  material.normal = normalize(v_normal);
  material.metallic = u_metallic;
  material.roughness = u_roughness;
  material.reflectance = u_reflectance;
  material.ambientOcclusion = 0.5;

  final = pbr(material);

Shader graph

nodes friendly

Assumes that we have subgraphs and can e.g. abstract a Principled PBR node away from smaller internal chunks. Not that customizable without node-based hooks: in order to add some effect in a PBR-friendly way, you would need to break the master shader apart. Just see how many includes the ThreeJS example above has.

Maybe an alternative is possible. What I like about hooks is that they support the most common scenarios:

  • animate mesh position / normal / tex coord in vertex shader
  • replace uvs before all PBR textures are sampled e.g. for procedural meshes
  • substitute color/metallic/roughness before PBR computation
  • overwrite final output before writing frag color

I don't think the following use cases are desired, and they shouldn't be optimised for:

  • implementing fully customized translucency that would also need custom render pass
  • adding things like sheen or clearcoat as they often impact other parts of the light equation for conservation of energy

Side notes

Three/dgel can do automatic variables insertion

Based on some parameter declaration?

vorg commented Nov 4, 2022

Is the ShaderGraph goal https://nodetoy.co/ ?

ThreeJS Node Material
There is also GLSL-in-JS https://github.com/mrdoob/three.js/tree/dev/examples/jsm/nodes but it's madness... although it allows compilation to both GLSL and WGSL.

generateStandardMaterial( builder, { colorNode, diffuseColorNode } ) {
	const { material } = builder;
	// METALNESS
	let metalnessNode = this.metalnessNode ? float( this.metalnessNode ) : materialMetalness;
	metalnessNode = builder.addFlow( 'fragment', label( metalnessNode, 'Metalness' ) );
	builder.addFlow( 'fragment', assign( diffuseColorNode, vec4( mul( diffuseColorNode.rgb, invert( metalnessNode ) ), diffuseColorNode.a ) ) );
	// ROUGHNESS
	let roughnessNode = this.roughnessNode ? float( this.roughnessNode ) : materialRoughness;
	roughnessNode = getRoughness(....

vorg commented Nov 4, 2022

TLDR: not aiming for fully dynamic GPU compute graphs, just avoiding yet-another-pbr-version-with-this-one-small-change.

To answer the concerns about hooks:

naming things is hard, no way I'll remember these names without checking docs
how do we choose where to insert them, do we add one hook after each function/stage?

We could limit it to these touch points:
VERT_BEFORE_TRANSFORM //after reading from attributes like aNormal
VERT_AFTER_TRANSFORM //after skinning, moving to world space etc
FRAG_BEFORE_TEXTURES //after reading from varyings and uniforms but before reading from textures
FRAG_BEFORE_PBR //before doing lighting math
FRAG_AFTER_PBR //after pbr math, allows for individual direct/indirect/emissive overwrites
FRAG_END //before writing to screen

That would probably cover 90% of cases for me.
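To make that concrete, a sketch of where such markers could sit in a vertex shader template (simplified and hypothetical, not the actual pex-renderer source):

// each marker comment is a line the renderer would target with string replacement
const VERT_TEMPLATE = /*glsl*/ `
attribute vec3 aPosition;
attribute vec3 aNormal;
uniform mat4 uProjectionMatrix, uViewMatrix, uModelMatrix;

void main() {
  vec4 position = vec4(aPosition, 1.0);
  vec3 normal = aNormal;
  // VERT_BEFORE_TRANSFORM
  position = uModelMatrix * position; // skinning, instancing etc would go around here
  // VERT_AFTER_TRANSFORM
  gl_Position = uProjectionMatrix * uViewMatrix * position;
}`;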

how would they be injected? Shaders are strings so we'd need to
clutter shaders with comment/pragma/include/custom syntax #include AFTER_DECLARATIONS + regex replace

Just a few. See above.

make shaders source function where we pass hooks as options?

No need. Entry points should be limited.

doesn't allow to modify existing parts of a material

Not desired IMO. Any example use cases?

reliance on vert/frag to have these hooks

They'd be available only in the standard PBR material.

vorg commented Nov 5, 2022

As predicted, naming is hard...

In GLSL, mark the place for code injection:

#define HOOK_VERT_DECLARATIONS_END

and then in the material JS code:

hooks: {
  vert: {
      HOOK_VERT_DECLARATIONS_END: `?`,
      VERT_DECLARATIONS_END: `?`,
      DECLARATIONS_END: `defaulting to this for now`
  }
}
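A minimal sketch of the splice itself, assuming the #define HOOK_* markers above (patchShader is a hypothetical helper; plain string replacement over the template):

// replace each `#define HOOK_<STAGE>_<NAME>` marker with the GLSL
// the material provided under <NAME> for that stage
function patchShader(src, stage, hooks = {}) {
  return Object.entries(hooks).reduce(
    (result, [name, glsl]) => result.replace(`#define HOOK_${stage}_${name}`, glsl),
    src
  );
}

const vert = patchShader(VERT_TEMPLATE, 'VERT', material.hooks?.vert);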

vorg commented Nov 5, 2022

For smooth normals I would need tangents or noise derivatives, so I skipped that for simplicity.

const sphereEntity = createEntity({
  transform: transform({
    position: [0, 1, 0],
  }),
  geometry: geometry(icosphere({ radius: 1, subdivisions: 4 })),
  material: material({
    baseColor: [1, 0, 1, 1],
    metallic: 0,
    roughness: 0.2,
    castShadows: true,
    receiveShadows: true,
    hooks: {
      vert: {
        DECLARATIONS_END: /*glsl*/ `
          ${perlinNoiseGLSL}
          uniform float uTime;
          varying float vNoiseAmount;
        `,
        BEFORE_TRANSFORM: /*glsl*/ `
          vNoiseAmount = 0.5 + 0.5 * cnoise(position.xyz * 2.0 + vec3(uTime, 0.0, 0.0));
          position.xyz += normal.xyz * vNoiseAmount;
        `,
      },
      frag: {
        DECLARATIONS_END: /*glsl*/ `
          varying float vNoiseAmount;
        `,
        BEFORE_TEXTURES: /*glsl*/ `
          vec3 dX = dFdx(data.positionView);
          vec3 dY = dFdy(data.positionView);
          data.normalView = normalize(cross(dX, dY));
          data.normalWorld = vec3(data.inverseViewMatrix * vec4(data.normalView, 0.0));
        `,
        BEFORE_LIGHTING: /*glsl*/ `
          data.roughness = 0.1;
          data.metallic = step(0.5, vNoiseAmount);
          data.baseColor = mix(vec3(2.0, 0.4, 0.0), vec3(1.0), data.metallic);
        `,
      },
      uniforms: (entity) => {
        return {
          uTime: (Date.now() % 10000) / 1000,
        };
      },
    },
  }),
});

vorg commented Nov 6, 2022

Added basic implementation in 47904f2 and example in f2feab0

vorg commented Nov 7, 2022

Currently implemented

vertex shader:
DECLARATIONS_END - end of uniform and function block
BEFORE_TRANSFORM - after all data has been read from attributes but before applying transformation matrices
END - end of shader

fragment shader:
DECLARATIONS_END - end of uniform and function block
BEFORE_TEXTURES - after data structure has been initialised with all default values from vertex shader and before pbr properties are read from textures
BEFORE_LIGHTING - before pbr math
AFTER_LIGHTING - after all direct and indirect lighting has been calculated, but before it has all been added up into the final color; allows for last-minute overwrites
END - end of shader

vorg commented Jan 21, 2023

This works pretty well and I'm closing this for now, as I don't anticipate any changes in direction before the next stable release.

@vorg vorg closed this as completed Jan 21, 2023