Learn DirectX 11

Sergey Stepanov edited this page Dec 18, 2017 · 39 revisions

The sole purpose of this project is learning the DirectX 11 API. The book Practical Rendering and Computation with Direct3D 11 is used as the main source of knowledge. The second part of the book presents various rendering techniques and examples. Some of these examples are implemented in the project, from very basic ones up to fairly advanced techniques. The implementation process follows the structure of the book.

Chapter 8. Mesh Rendering

Static mesh rendering

static_mesh_example

The example is pretty simple, but it is a good starting point for implementing more sophisticated techniques. The scene contains a cube and a directional light source.

The main goals are:

  • Initialize DirectX device, context and swap chain.
  • Load geometry and texture data and upload the data into the GPU.
  • Load and compile vertex and pixel shaders.
  • Set up the pipeline and draw the scene.

Vertex skinning

vertex_skinning_example_01 vertex_skinning_example_02 vertex_skinning_example_03

The example uses the well-known Bob Lamp model, which has a single animation. The scene also contains a directional light source.

Each vertex of such an animated model has the following format: position, normal, texture coordinates, 4 bone indices and 4 bone weights. The vertex shader fetches the appropriate bone matrices from a StructuredBuffer<float4x4> by bone index and uses the bone weights to blend the output vertex position.

The vertex shader itself is pretty simple. The main challenge is to load a model with its animations, compose the vertex data (including bone indices and weights) and implement the animation algorithm, which computes the bone matrices for each frame while preserving the bone hierarchy.
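The per-vertex blend described above can be sketched on the CPU. This is a minimal pure-Python illustration of linear blend skinning; the function and variable names are illustrative, not the project's actual code:

```python
# Linear blend skinning: blend the bone-transformed positions by weight.

def mat_vec(m, v):
    """Multiply a 4x4 matrix (row-major nested lists) by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def skin_position(position, bone_indices, bone_weights, bone_matrices):
    """CPU equivalent of the vertex shader: sum(w_i * (M_i * p))."""
    out = [0.0, 0.0, 0.0, 0.0]
    for idx, w in zip(bone_indices, bone_weights):
        transformed = mat_vec(bone_matrices[idx], position)
        out = [o + w * t for o, t in zip(out, transformed)]
    return out

# Two bones: identity and a +2 translation along X.
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
translate_x2 = [row[:] for row in identity]
translate_x2[0][3] = 2.0

p = [1.0, 0.0, 0.0, 1.0]
# Half-way between the two bones, x moves by 1.
print(skin_position(p, [0, 1], [0.5, 0.5], [identity, translate_x2]))
# → [2.0, 0.0, 0.0, 1.0]
```

In the shader the same sum runs over the 4 bone indices stored per vertex, with the matrices read from the structured buffer.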

Displacement mapping

displacement_mapping_wireframe displacement_mapping_solid

The original example from the book processes an animated mesh: the vertex shader does vertex skinning, and then the tessellation shaders tessellate the skinned vertices (control points). This implementation skips the vertex skinning part (see the previous example). There are no vertex and index buffers. The vertex shader creates 2 triangles which form a quad and passes 2 patches (3 control points each) to the hull shader. The hull shader passes the control points' attributes through and selects the maximum inner/outer tessellation factors; no smart LOD-based decisions are made (a naive implementation). The domain shader gets the patches, samples the displacement map and emits new vertices. In addition, a normal map is used to light the displaced surface properly.
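The domain shader's job here can be sketched as a small CPU function: interpolate a point inside the patch with barycentric coordinates, sample a height, and push the point along the normal. This is a hedged sketch with illustrative names and nearest-texel sampling of a tiny height map, not the project's HLSL:

```python
# Domain-shader-style displacement: interpolate a patch point, then
# offset it along the normal by a height from a displacement map.

def lerp_patch(cp, bary):
    """Barycentric interpolation of three control-point positions."""
    return [sum(b * p[i] for b, p in zip(bary, cp)) for i in range(3)]

def displace(cp, bary, normal, height_map, uv, scale=1.0):
    u, v = uv
    # Nearest-texel sampling of a tiny height map (list of rows).
    h = height_map[int(v * (len(height_map) - 1))][int(u * (len(height_map[0]) - 1))]
    pos = lerp_patch(cp, bary)
    return [p + normal[i] * h * scale for i, p in enumerate(pos)]

# Flat triangle on the XZ plane, normal pointing up (+Y).
cp = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]
height_map = [[0.0, 0.5], [1.0, 0.25]]
print(displace(cp, [0.5, 0.5, 0.0], [0.0, 1.0, 0.0], height_map, (1.0, 0.0)))
# → [0.5, 0.5, 0.0]
```

In the real pipeline the UVs are interpolated from the control points as well, and the sampled normal map replaces the constant normal when lighting the displaced surface.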

Chapter 9. Dynamic Tessellation

Terrain tessellation

terrain_tessellation_00 terrain_tessellation_01

The terrain geometry is represented by a flat grid on the XZ plane in 3D space. The majority of the work is done in the hull shader's constant function, which determines the LOD for the current patch (a cell of the terrain grid) using the patch being rendered as well as its four immediate neighbors. The main hull shader acts as a simple pass-through of the four corner vertices of the current patch. The domain shader takes the tessellated locations and generates the actual terrain geometry that is finally rasterized.
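The constant function's distance-based LOD can be illustrated in miniature: map the camera distance to an edge midpoint into a tessellation factor. The thresholds and the linear mapping below are illustrative assumptions, not the book's exact values:

```python
# Hull-shader constant function in miniature: pick an edge tessellation
# factor from the camera distance to the edge midpoint (naive
# distance-only LOD; thresholds are illustrative).
import math

def edge_tess_factor(camera_pos, edge_midpoint, min_dist=10.0, max_dist=100.0,
                     min_tess=1.0, max_tess=64.0):
    d = math.dist(camera_pos, edge_midpoint)
    # Clamp the distance into [min_dist, max_dist] and map it linearly
    # to [max_tess, min_tess]: near patches tessellate more.
    t = (min(max(d, min_dist), max_dist) - min_dist) / (max_dist - min_dist)
    return max_tess + t * (min_tess - max_tess)

print(edge_tess_factor((0, 0, 0), (0, 0, 10)))   # nearest patch → 64.0
print(edge_tess_factor((0, 0, 0), (0, 0, 100)))  # farthest patch → 1.0
```

Computing the factor per edge from the edge midpoint (shared with the neighboring patch) is what keeps adjacent patches crack-free: both patches derive the same factor for their shared edge.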

Terrain Tessellation With Coplanarity Factor

compute_complanarity_camera_dist_only compute_complanarity_with_deviation compute_complanarity_displacement

displacement_map

The example is an enhanced version of the terrain tessellation example. The main idea is that the camera distance alone is not a completely reliable measure for tessellation: some patches have to be heavily tessellated due to their geometric complexity. To estimate the geometric complexity of each patch, we can compute a kind of smoothness factor for the patch. This computation is done once by a compute shader, which takes the height map as its input and outputs a 2D texture containing the smoothness (coplanarity) value for each patch. These values are then used by the hull shader's constant function to compute the tessellation factors.
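One possible per-patch smoothness estimate (illustrative, not necessarily the book's exact formula) is the maximum deviation of the patch heights from a bilinear fit through its four corner heights: a perfectly flat patch yields 0, and bumpier patches yield larger values that call for more tessellation.

```python
# Coplanarity estimate for one patch of height samples: the maximum
# deviation from the bilinear surface through the four corner heights.

def patch_coplanarity(heights):
    """heights: square 2D list of height samples for one patch."""
    n = len(heights)
    c00, c01 = heights[0][0], heights[0][-1]
    c10, c11 = heights[-1][0], heights[-1][-1]
    worst = 0.0
    for r in range(n):
        for c in range(n):
            u, v = c / (n - 1), r / (n - 1)
            fit = (c00 * (1 - u) * (1 - v) + c01 * u * (1 - v)
                   + c10 * (1 - u) * v + c11 * u * v)
            worst = max(worst, abs(heights[r][c] - fit))
    return worst

flat = [[0.0] * 3 for _ in range(3)]
bumpy = [[0.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 0.0]]
print(patch_coplanarity(flat), patch_coplanarity(bumpy))  # → 0.0 1.0
```

The compute shader evaluates such a factor for every patch in parallel and writes the results into the 2D texture that the hull shader later samples.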

Chapter 10. Image Processing

Gaussian filter

gaussian_filter_00

The example implements a compute shader which takes an input image and filters it with a Gaussian filter. The Gaussian filter is separable, which the example exploits: the input image is first filtered horizontally and then vertically. The sampling direction is controlled in the compute shader by the g_offset_dir parameter.
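The two-pass scheme can be sketched on the CPU: one normalized 1D kernel, applied first along rows and then along columns, mirroring the role of g_offset_dir. This is a hedged pure-Python sketch with clamp-to-edge addressing, not the shader itself:

```python
# Separable Gaussian blur: the same 1D kernel applied horizontally,
# then vertically, over a 2D image stored as a list of rows.
import math

def gaussian_kernel(radius, sigma):
    k = [math.exp(-(i * i) / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [w / s for w in k]  # normalize so weights sum to 1

def blur_1d(img, kernel, horizontal):
    radius = len(kernel) // 2
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for i, wgt in enumerate(kernel, -radius):
                if horizontal:  # sample along the row (clamp to edge)
                    acc += wgt * img[y][min(max(x + i, 0), w - 1)]
                else:           # sample along the column
                    acc += wgt * img[min(max(y + i, 0), h - 1)][x]
            out[y][x] = acc
    return out

def gaussian_blur(img, radius=2, sigma=1.0):
    k = gaussian_kernel(radius, sigma)
    return blur_1d(blur_1d(img, k, True), k, False)

flat = [[3.0] * 4 for _ in range(4)]
print(gaussian_blur(flat)[0][0])  # constant image is unchanged (up to rounding)
```

Separability is what makes this cheap: two passes cost O(2r) samples per pixel instead of O(r²) for the equivalent 2D kernel.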

Bilateral filter

bilateral_filter_00

The example implements a compute shader which takes an input image and filters it with a bilateral filter. The bilateral filter is not separable, which means it is not mathematically correct to reuse the two-pass approach from the Gaussian filter example. However, in many cases a separable approximation is still acceptable, and that is what the example does.
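A single 1D bilateral pass can be sketched as follows; the separable approximation then simply applies it horizontally and vertically. The weight of each neighbor combines a spatial Gaussian with a range Gaussian on the intensity difference, which is what preserves edges. Parameter names and values are illustrative:

```python
# One-dimensional bilateral pass over a row of intensities.
import math

def bilateral_1d(row, radius=2, sigma_s=1.0, sigma_r=0.1):
    out = []
    for x, center in enumerate(row):
        acc = norm = 0.0
        for i in range(-radius, radius + 1):
            sx = min(max(x + i, 0), len(row) - 1)  # clamp to edge
            diff = row[sx] - center
            # Spatial weight (distance) times range weight (intensity).
            w = (math.exp(-(i * i) / (2 * sigma_s * sigma_s))
                 * math.exp(-(diff * diff) / (2 * sigma_r * sigma_r)))
            acc += w * row[sx]
            norm += w
        out.append(acc / norm)
    return out

# A hard edge survives: range weights suppress cross-edge averaging,
# so the left half stays near 0.0 and the right half near 1.0.
print(bilateral_1d([0.0, 0.0, 0.0, 1.0, 1.0, 1.0]))
```

Unlike the Gaussian case, the weights here depend on the pixel values themselves, which is exactly why the filter is not truly separable; the two-pass version is an approximation that works well when edges are roughly axis-aligned or the range sigma is small.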