This document is written by a random average developer, so it should not be treated as an official source of information. This guide is merely my interpretation of how to use Stride's shader system, and will hopefully help you understand the system better.
Please note that this is not a beginner's guide; a fair amount of prior shader knowledge is required before you can approach it. It is also not an advanced guide: in particular, it does not dive deeply into the 'Shader Effect' side of things, as that is complicated.
The main intent of this guide is to help bridge the gap between the official Effects and Shaders documentation, how the system works in the Stride engine, and how to apply it in the Stride editor.
- Read the official Effects and Shaders documentation
- Basic shader knowledge, such as the shader pipeline
- The ability to write code (knowing any programming language, eg. C#, means you can quickly pick up writing shader code)
An overview of how the engine uses the 'shader' can be seen in the diagram below:
- An Entity (game object) in a scene requires a `ModelComponent` (this is just called a 'Model' in the editor). The Entity's `ModelComponent` references a `Material` asset, which is effectively the 'shader' that is applied to the model.
- The `Material` asset is an editor-focused, general-purpose shader builder, which exposes common shader properties via `MaterialFeature`s that you can set (eg. Displacement, Normal Map, Diffuse, etc). Depending on the context of the `MaterialFeature`, you can set simple data (eg. `Color`, `float`, `Texture`), or set your own 'Shader Class' for passing data with more complex logic.
- The `Graphics Compositor` asset declares all the rendering capabilities the Stride engine uses to render things on the screen, through `RenderFeature`s. The main `RenderFeature` of interest is the `MeshRenderFeature`, which is where you declare the 'Shader Effect' you want to use (via the shader's name). When it comes time to fully build the 'shader', the `Material` asset calls each `MaterialFeature` to provide its relevant 'Shader Class', which will be merged into the 'Shader Effect'.
In order to write any shader code for Stride, you need some basic knowledge of writing HLSL (eg. declaring variables such as `float3`/`float4x4`, calling functions like `mul`/`lerp`, etc), which can then be transferred to writing Stride shader code, since Stride's shader language (SDSL) is a superset of HLSL.
As long as you've already got some basic programming skills (eg. C#), you should be able to quickly skim through the official documentation on HLSL and see that the coding style/syntax is mostly the same.
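For example, the following made-up Shader Class fragment is essentially plain HLSL wrapped in SDSL's `shader` syntax:

```
// A made-up fragment purely to show the syntax; standard HLSL types and
// intrinsics work as-is inside a Shader Class.
shader MySyntaxDemo
{
    float3 BlendColors(float3 colorA, float3 colorB, float amount)
    {
        return lerp(colorA, colorB, amount);
    }
};
```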
For those coming from other game engines where you generally write the whole "shader" in a single file, things are not as straightforward in Stride.
There are effectively two types of 'shaders' in Stride: 'Effects' (`.sdfx` files) and 'Shaders' (`.sdsl` files).
To help make the distinction clearer, this guide will refer to a Stride 'Effect' (`.sdfx` type) as a Shader Effect, and a 'Shader' (`.sdsl` type) as a Shader Class.
Any mention of 'shader code' has no special meaning; it is just actual code that you write.
While the official documentation does not call the `.sdsl` shaders 'Classes', internally the engine refers to them as `ShaderClassSource` objects.
From a high-level view, a Shader Effect is closest to what people from other game engines mean when they talk about 'shaders'. In Stride, the Shader Effect forms the final code that then gets compiled to HLSL/GLSL/SPIR-V for the graphics card to execute.
In Stride, the purpose of Shader Classes is to break the shader code in a Shader Effect down into smaller (and usually reusable) pieces of code. This is why the word 'shader' starts to blur between 'Effect' and 'Shader': the Effect can essentially be just a bunch of 'Shader' code combined to form the full 'shader'.
Shader Classes can be thought of as being similar to C# classes: each class on its own is separate from the others, you can inherit other classes and override the base class's methods/functions, and you can "reference" other classes and execute their methods (eg. `compose ComputeColor ...`).
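As a made-up sketch of both ideas (`ComputeColor` is a real Shader Class covered later in this guide; the other names are invented):

```
// Inheritance: override a method from the base shader class.
shader MyBaseColor : ComputeColor
{
    override float4 Compute()
    {
        return float4(1, 0, 0, 1); // plain red
    }
};

shader MyDarkerColor : MyBaseColor
{
    override float4 Compute()
    {
        // Call the base class implementation, like C#'s base.Method().
        return base.Compute() * 0.5;
    }
};

// Composition: "reference" another shader class and execute its method.
shader MyTint : ComputeColor
{
    compose ComputeColor baseColor;

    override float4 Compute()
    {
        return baseColor.Compute() * float4(0.5, 1, 0.5, 1);
    }
};
```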
It is important to note that Shader Classes have no impact on their own. In order to function, they must be plugged in somewhere: either directly into a Shader Effect, or into another Shader Class that is connected to the Shader Effect (or so on up the chain, until the final parent is a Shader Effect).
When authoring your own Shader Effect, you technically can ignore writing any Shader Classes and just write all your shader code/functions directly in the Shader Effect. However, if you look at Stride's source code, you'll notice it leans heavily on plugging Shader Classes into Shader Effects, since shader code is reused/shared across multiple Shader Effects.
If you wish to see all the Shader Classes used, you can view all the `.sdsl` files in the GitHub source repository, or you can download the very helpful Shader Explorer application. If you want to see any Shader Effects (`.sdfx` files), you'll need to view them in the GitHub source repository linked above, eg. StrideForwardShadingEffect.sdfx.
The bare minimum to do any advanced rendering with a model/mesh (ie. apply a 'shader' to it) requires a Shader Effect.
The entry points to run the 'vertex shader' and 'pixel shader' (also known as the fragment shader) in a Shader Effect are declared as `VSMain()` and `PSMain()`. Note the documentation specifies the other possible entry points, but for the sake of this guide only the vertex and pixel shaders will be discussed, since they are the most commonly used stages.
As per the Effect Language documentation, an example effect is given:
```
using Stride.Effects.Data;

namespace StrideEffects
{
    params MyParameters
    {
        bool EnableSpecular = true;
    };

    effect BasicEffect
    {
        using params MaterialParameters;
        using params MyParameters;

        mixin ShaderBase;
        mixin TransformationWAndVP;
        mixin NormalVSStream;
        mixin PositionVSStream;
        mixin BRDFDiffuseBase;
        mixin BRDFSpecularBase;
        mixin LightMultiDirectionalShadingPerPixel<2>;
        mixin TransparentShading;
        mixin DiscardTransparent;

        if (MaterialParameters.AlbedoDiffuse != null)
        {
            mixin compose DiffuseColor = ComputeBRDFDiffuseLambert;
            mixin compose albedoDiffuse = MaterialParameters.AlbedoDiffuse;
        }

        if (MaterialParameters.AlbedoSpecular != null)
        {
            mixin compose SpecularColor = ComputeBRDFColorSpecularBlinnPhong;
            mixin compose albedoSpecular = MaterialParameters.AlbedoSpecular;
        }
    };
}
```
Ignoring the `params`/`using params` declarations (beyond the scope of this guide), the `mixin [ShaderName]` declarations are the lines of interest.
There exist Shader Classes named `ShaderBase`, `TransformationWAndVP`, etc, and the shader code from these Shader Classes will be copied/merged into this Shader Effect.
You may notice there doesn't appear to be a `VSMain()` or `PSMain()` explicitly declared in this Effect. While most of the Shader Classes in this example don't exist in Stride, `ShaderBase` is a real Shader Class from Stride, and it does contain a `VSMain()` and `PSMain()`. Do note that the `VSMain()` and `PSMain()` in `ShaderBase` are technically empty, so by themselves they would crash when executing this shader, because neither assigns the output vertex position or output pixel color - it is expected that one of the later (fictional) Shader Classes has overridden `VSMain()` and `PSMain()` to fulfill this requirement; see the documentation on what variables must be assigned for each shader stage.
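To make that concrete, here is a minimal made-up Shader Class that satisfies both requirements itself, using the real `ShaderBase`, `Transformation` and `PositionStream4` classes from the engine:

```
shader MyMinimalShader : ShaderBase, Transformation, PositionStream4
{
    override stage void VSMain()
    {
        // The vertex stage must assign the output vertex position.
        streams.ShadingPosition = mul(streams.Position, Transformation.WorldViewProjection);
    }

    override stage void PSMain()
    {
        // The pixel stage must assign the output pixel color.
        streams.ColorTarget = float4(1, 0, 0, 1);
    }
};
```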
Applying a Shader Effect is a little complicated. In Stride, if you open the `Graphics Compositor` asset in a project, you can see how the Stride engine uses its Shader Effects by default:
In the above image, the `MeshRenderFeature` has four render stages (as seen in the Render Stage Selectors in the right section). When it is the `MeshRenderFeature`'s turn to render, each stage determines whether to include the model/mesh by matching the `Render Group` on the stage selector with the `Render Group` on the `ModelComponent` of the entity, then builds the final shader by combining the Material properties into the Shader Effect specified by the `Effect Name` property. It is also the job of the render feature to pass in any dynamic parameters declared on the shader (ie. passing application data to the shader, eg. game time, camera information, etc).
Note that `MeshRenderFeature` is special in that it is further divided into `SubRenderFeature`s, which pass any dynamic parameters instead of the `MeshRenderFeature` itself, as it is the most complex render feature. eg. `TransformRenderFeature` is a `SubRenderFeature` for `MeshRenderFeature`, and you can see it passing data such as game time and camera settings to the shader in its `Prepare` method.
Below is a quick examination of StrideForwardShadingEffect.sdfx, if you're curious.
You can see the source code for StrideForwardShadingEffect.sdfx (relevant code copied below):
```
effect StrideForwardShadingEffect
{
    using params MaterialKeys;

    // Derive from StrideEffectBase
    mixin StrideEffectBase;

    // -----------------------------------------------
    // Mix material and lighting shading for Pixel Shader
    // -----------------------------------------------
    ShaderSource extensionPixelStageSurfaceShaders = MaterialKeys.PixelStageSurfaceShaders;
    if (extensionPixelStageSurfaceShaders != null)
    {
        mixin MaterialSurfacePixelStageCompositor;
        mixin compose materialPixelStage = (extensionPixelStageSurfaceShaders);
        mixin compose streamInitializerPixelStage = MaterialKeys.PixelStageStreamInitializer;

        ShaderSource extensionPixelStageSurfaceFilter = MaterialKeys.PixelStageSurfaceFilter;
        if (extensionPixelStageSurfaceFilter != null)
        {
            mixin (extensionPixelStageSurfaceFilter);
        }

        mixin child GBuffer;
    }

    // -----------------------------------------------
    // Add direct and environment light groups
    // -----------------------------------------------
    mixin StrideLighting;

    mixin child ShadowMapCaster;
    mixin child ShadowMapCasterParaboloid;
    mixin child ShadowMapCasterCubeMap;
};
```
For the most part, it doesn't look much different from the earlier `BasicEffect` example; every mixin is just a reference to a Shader Class, which you can examine. I will leave it to the reader to discover where the `VSMain()` and `PSMain()` functions are declared.
Unfortunately, this guide will not go further into explaining how to plug in your own Shader Effect, as that falls into the 'advanced' category. However, you can look at Stride's Space Escape sample project as an example (and examine its Graphics Compositor setup).
While it is possible to write a complete 'shader' with your own Shader Effect, this requires a fair amount of setup (eg. changing the Shader Effect Name in the Graphics Compositor, passing additional data through the RenderFeature, etc).
The recommended way is not to write a Shader Effect, but instead to utilize Stride's Material system: write Shader Classes and add them to the relevant `MaterialFeature`. The Material system has basically already set up most of the "busy-work"/data of the shader and exposes the most commonly used properties (eg. Displacement, Normal Map, Diffuse, etc) where you can slot in your data/modifications, and it will then build the final shader code for you.
Stride's `Material` can be seen as a shader template that exposes common shader properties to the editor, where you can set these property 'values' (eg. texture, color, etc):
If a more advanced property value is required, you can choose to set a Shader Class by changing the value type to 'Shader' with the dropdown button on the right side next to the value, then start typing your Shader's name:
The Shader textbox does not display the list of available Shaders until you type at least one character into it.
In order for the 'main' shader to know how to call your shader, your shader must inherit from the `ComputeColor` Shader Class. This is similar to how in C# you inherit from an `interface`/`class` and implement (or override) a specific method so a third-party library knows how to call your object. As seen in ComputeColor.sdsl, it has a `Compute()` method that returns a `float4`, and this gets called by shader code that sits 'above' your Shader Class.
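For example, a minimal made-up Shader Class that can be slotted into such a property could look like this (`Texturing` is the engine's Shader Class that provides `streams.TexCoord`):

```
shader MyGradientColor : ComputeColor, Texturing
{
    override float4 Compute()
    {
        // Fade from black to red across the mesh's UVs.
        return float4(streams.TexCoord.x, 0, 0, 1);
    }
};
```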
The reason why you can select different value types on a Material property is that the underlying shader exposes these properties via the `compose` keyword. eg. The 'Diffuse Map' property under `Shading` -> `Diffuse` can be seen in the shader itself:
```
shader MaterialSurfaceDiffuse : IMaterialSurfacePixel
{
    compose ComputeColor diffuseMap;

    override void Compute()
    {
        var colorBase = diffuseMap.Compute();
        streams.matDiffuse = colorBase;

        // Because matDiffuse can be modified when using a metalness, we are storing the colorBase into matColorBase
        // so that we are able to query the original diffuse color without any modifications.
        streams.matColorBase = colorBase;
    }
};
```
Here you can see that `diffuseMap` is the exposed property, and its expected type is `ComputeColor`. Therefore, when you write your own shader that is expected to slot into the Diffuse Map property, you must make sure it inherits from `ComputeColor` to satisfy the class constraint.
An interesting side note: the other property value type options (Binary Operator, Color, Float4, Texture, Vertex Stream) are also `ComputeColor` shaders, but these are essentially hardcoded ones explicitly set to appear separately in the editor!
All properties exposed on the Material that can be changed to a Shader type are expected to be Shader Classes that inherit from `ComputeColor`.
Important Note: Be aware that different `MaterialFeature`s interpret the returned `float4` value differently, which unfortunately is not currently documented and may require digging through the source code.
Examples:
- Displacement only uses `.x`, ie. the first component of the `float4`, and it is just the displacement along the mesh's normal vector, ie. it is actually only a height displacement function.
- Whether the Diffuse Map's alpha value does anything depends on the other shader functions used. Most notably, if `Misc` -> `Transparency` is not enabled, the alpha value will not make the model transparent (although 'Premultiply alpha' will still affect the final color, the model will remain opaque).
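For example, a made-up Shader Class intended for the Displacement slot only needs to produce a meaningful `.x` value:

```
shader MyHeightDisplacement : ComputeColor, Texturing
{
    override float4 Compute()
    {
        // A simple ripple across the UVs; only .x is read by the
        // Displacement feature (a height along the mesh's normal).
        float height = sin(streams.TexCoord.x * 6.28318) * 0.1;
        return float4(height, 0, 0, 0); // y, z, w are ignored
    }
};
```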
- This is not a step-by-step guide on how to write your shader.
- The main aim of the demo project is to showcase various techniques that can be used when writing your own shader.
- This is not a feature complete water shader.
The shader was adapted and modified from the following sources:
- The displacement function: https://developer.nvidia.com/gpugems/gpugems/part-i-natural-effects/chapter-1-effective-water-simulation-physical-models
- Panning normal map(s): Water Shaders series from Ben Cloward
- Water color/edge detection: How To Create A Water Shader // Godot 4 Tutorial from StayAtHomeDev
Note: This project is not a one-to-one adaptation of the referenced materials.
The project provided in this guide shows a water shader that appears as the following:
The water shader has the following features:
- Multiple vertex displacement functions (Gerstner waves)
- Multiple panning UV samples of a normal map texture
- Setting the color of the water
- Setting the color of the Fresnel effect on the water
- Distortion of objects under the water surface
- Setting the color of the water's edge (the contact point of the water and an object)
Important Note: Ensure the Graphics Compositor has the following two settings enabled on the Forward renderer (as shown in the screenshot below):
- Bind Depth As Resource During Transparent Rendering
- Bind Opaque As Resource During Transparent Rendering
This allows the shader to 'see' the objects under the water surface, so we can correctly render anything beneath it.
The Material properties of the Water Material are seen in the images below:
Despite the ability to use generics and/or `compose ComputeColor` on a shader, the editor still has some limitations, eg.
- Generics can only expose 'Color' types as `float3`/`float4`, so the editor can't show a color picker control.
- `compose ComputeColor` can be a little clunky when used for something like sampling a texture.
- Having an array of sub-shader functions does not appear in the editor.
To overcome these limitations, it is easier to create our own `MaterialFeature` derivative to expose any desired properties and build the Shader Class from the defined properties. You define your properties exactly the same way as you would when exposing properties on your standard Stride `SyncScript`/`AsyncScript`s. When implementing your own `MaterialFeature`, you must override the `GenerateShader(MaterialGeneratorContext context)` method to pass in your Shader Class through a `ShaderMixinSource` object, and set any external custom data that the shader needs (you do not need to manually pass in data that Stride already provides, eg. `Global.Time`).
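A rough sketch of the shape of such a feature is shown below. The names (`MyWaterFeature`, `MyWaterShaderClass`, and its auto-generated `MyWaterShaderClassKeys`) are placeholders, and the calls follow the patterns seen in the engine's own features rather than any official template:

```
using Stride.Core;
using Stride.Rendering.Materials;
using Stride.Shaders;

[DataContract("MyWaterFeature")]
public class MyWaterFeature : MaterialFeature, IMaterialDisplacementFeature
{
    // Exposed in the Material editor, just like a property on a SyncScript.
    [DataMember]
    public float Amplitude { get; set; } = 1.0f;

    public override void GenerateShader(MaterialGeneratorContext context)
    {
        // Wrap our Shader Class in a mixin so the material system can merge it in.
        var mixin = new ShaderMixinSource();
        mixin.Mixins.Add(new ShaderClassSource("MyWaterShaderClass"));

        // Pass our custom data to the shader via its auto-generated key.
        context.Parameters.Set(MyWaterShaderClassKeys.Amplitude, Amplitude);

        // Attach the mixin to the vertex stage of the material.
        context.AddShaderSource(MaterialShaderStage.Vertex, mixin);
    }
}
```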
The easiest way to determine how each `MaterialFeature` should be implemented is to examine the source code of the existing `MaterialFeature`s, eg.
- Displacement: MaterialDisplacementMapFeature.cs
- Surface/Normal Map: MaterialNormalMapFeature.cs
- Diffuse: MaterialDiffuseMapFeature.cs
- Transparency: MaterialTransparencyBlendFeature.cs
IMPORTANT: Make sure to save all your changes in the editor before modifying your custom `MaterialFeature`, as the editor may crash if it can't handle your changes.
It is important to note that the shaders returned in `MaterialFeature.GenerateShader()` are shaders that implement `IMaterialSurface` or `IMaterialSurfacePixel` (depending on the feature), and they are expected to override the `void Compute()` method. Because these methods do not return any values, the way to 'share' data is to set specific `streams` variables, which the overall shader uses (again, it would be wise to examine the source code to determine which variables get used).
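Continuing the earlier made-up `MyWaterFeature` sketch, its `MyWaterShaderClass` could look something like this:

```
shader MyWaterShaderClass : IMaterialSurface, PositionStream4
{
    // Filled in by MyWaterFeature via the auto-generated MyWaterShaderClassKeys.Amplitude.
    stage float Amplitude;

    override void Compute()
    {
        // No return value: results are 'shared' by writing to streams variables.
        streams.Position.y += Amplitude; // trivial placeholder logic
    }
};
```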
The project has three new `MaterialFeature`s to make it easier to define the water properties:
- MaterialWaveDisplacementFeature.cs
  - Contains a hardcoded shader name reference `"MaterialWaveDisplacement"`, which is defined in MaterialWaveDisplacement.sdsl.
  - Exposes a list of displacement wave properties (you can add as many displacement waves as you like), which are slotted in via `compose ComputeWaveDisplacement DisplacementFunctions[];` in the `MaterialWaveDisplacement` shader. The `GenerateShader` method shows how to feed the sub-shaders into `MaterialWaveDisplacement`'s `compose` array property.
  - The wave properties are defined in WaveDisplacement.cs, and the `GerstnerWave` object supplies the hardcoded shader name reference `"ComputeGerstnerWave"`, which is defined in ComputeGerstnerWave.sdsl. The `WaveDisplacementBase` base class (which `GerstnerWave` derives from) exists so you can implement and choose a different wave displacement function.
- MaterialWaveSurfaceNormalFeature.cs
  - Contains a hardcoded shader name reference `"MaterialWaveSurfaceNormal"`, which is defined in MaterialWaveSurfaceNormal.sdsl.
  - Exposes a single `NormalMap` texture property to be passed into the shader and sampled by the `WavePanningNormalMap` properties.
  - The wave panning texture sampler properties are defined in WavePanningNormalMap.cs, and the `WavePanningNormalMap` object supplies the hardcoded shader name reference `"ComputeWaveNormalPanningUv"`, which is defined in ComputeWaveNormalPanningUv.sdsl.
  - Be careful with shader stream variables that cross shader stages. In our case, `MaterialWaveSurfaceNormal` is a pixel shader that reads `streams.WaveDisplacementPositionOffset`, however that value is set by the `MaterialWaveDisplacement` shader, which is a vertex shader. If a pixel shader tries to read a vertex shader variable that was never set, you may encounter the error: `[E_INVALIDARG/Invalid Arguments], Message: The parameter is incorrect.` To guard against this, we check that `MaterialWaveDisplacementFeature.IsFeatureEnabled` was set in the material parameters by `MaterialWaveDisplacementFeature`, and only then enable the shader code that reads the variable, via a macro.
- The water color/transparency feature
  - Derives from `MaterialTransparencyBlendFeature` so we do not need to set up the main shader details; it just passes in the water properties via the material keys (the material keys are auto-generated when you define the properties in the Shader Classes).
  - The color is set here, rather than in the Diffuse Map, because we need access to the depth and opaque textures, which allows the shader to 'see' what is underneath the water surface and override what is rendered on the water's surface (eg. less transparent water at greater depth, distorting the objects under the water). `MaterialTransparencyBlendFeature.GenerateShader()` sets the necessary flags for the shader to be executed after the opaque objects have been rendered. Note that the Diffuse Map will still need to be set to a white color, due to quirks in how the overall shader works.
  - As stated earlier, make sure the `Graphics Compositor` has the 'Bind Depth' and 'Bind Opaque' settings enabled.
Information about each shader implementation:
- MaterialWaveDisplacement.sdsl
  - This shader displaces the vertex position in all directions (ie. x, y, z). Because of this, we need to update the normal vector to keep the lighting correct, and also the tangent vector, because the default shader requires it. The bitangent does not need to be set. Therefore, we set the following `streams` variables:
    - `streams.Position`
    - `streams.meshNormal`
    - `streams.meshTangent`
- ComputeWaveDisplacement.sdsl
  - This is shared by both the `ComputeGerstnerWave` and `MaterialWaveDisplacement` shaders. The purpose of this 'class' is so `MaterialWaveDisplacement` knows which method should be called and what the output values are.
  - Ideally a custom struct would be used as the return value, however as of Stride version `4.2.0.2188` this fails to compile, so data is simply passed around via `streams` instead.
- ComputeGerstnerWave.sdsl
  - The actual wave position displacement implementation, based off the GPU Gems article.
  - This shader uses generics for setting the parameters. GerstnerWave shows how values are passed from the editor to the shader. Note that when passing generics values, they must be strings containing valid shader data types (eg. `Vector2` values should be passed as `float2(x, y)`); see the sketch at the end of this section.
- MaterialWaveSurfaceNormal.sdsl
  - This shader just sums all the normal vectors calculated from each `ComputeWaveNormal` shader.
  - The default engine's MaterialSurfaceNormalMap shader states that `streams.matNormal` does not need to be normalized at this step, so normalization has also been skipped in our shader.
  - `MaterialWaveSurfaceNormal` also inherits `ComputeWaveDisplacement` so it can read the final offset (set in `streams.WaveDisplacementPositionOffset`). This shows that a `stage stream` can be used to transfer data across different `MaterialFeature`s (though this should already be understood, given that things like `streams.Position` from the `PositionStream4` shader can be read across multiple `MaterialFeature`s).
- ComputeWaveNormal.sdsl
  - This is shared by both the `ComputeWaveNormalPanningUv` and `MaterialWaveSurfaceNormal` shaders. The purpose of this 'class' is so `MaterialWaveSurfaceNormal` knows which method should be called and what the output values are.
- ComputeWaveNormalPanningUv.sdsl
  - Samples the normal map texture based on the specified world size, direction, and speed.
- WaterColorTransparency.sdsl
  - While the shader could expose the water parameters as generics, having the parameters as properties also exposes the material keys (see WaterColorTransparencyKeys). While not done in this project, you could potentially add a `SyncScript` and manipulate the water's color at run-time, eg. for a day-night cycle.
  - The color output of the water mesh is determined by the depth distance from the water's surface to the underlying object.
  - The water does not have realistic refraction. It is merely distortion, done by sampling the opaque texture (ie. everything rendered before the transparent objects) and offsetting the UV based off the calculated surface normal.
  - The water edge detection is just a simple depth difference threshold.
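To close off the generics note from ComputeGerstnerWave.sdsl above, here is a rough, made-up sketch of what a generic shader looks like. The name `ComputeMyWave` is invented; on the C# side the generic values would be supplied as strings, eg. `new ShaderClassSource("ComputeMyWave", "float2(1, 0)", "0.25")`:

```
// Generic parameters are substituted at compile time, which is why the C#
// side passes them as strings containing valid shader literals.
shader ComputeMyWave<float2 Direction, float Speed> : ComputeColor, Texturing
{
    override float4 Compute()
    {
        // Pan the UVs along Direction over time.
        float2 uv = streams.TexCoord + Direction * (Global.Time * Speed);
        return float4(frac(uv), 0, 1);
    }
};
```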