Why Render Volumetric Fog

Because it's there.

Of course, the more important reason is that volumetric fog can quickly give a scene a sense of realism and atmosphere. Who doesn't like a little halo around a light source? And if every bright object could affect the volumetric fog, would we even still need a bloom effect? I went and observed this in real life, and found that the halo the human eye sees is produced after the light enters the eye, which means bloom and volumetric fog really are two different effects.

Ways to Render Volumetric Fog

There are generally two ways to render volumetric fog. The first is to simply ray march through the scene from the camera, sampling and blending at every step. Its main drawback is that it takes a fairly high number of ray-marching steps to look good. In my tests, with TAA enabled, 20 ray-marching steps already gave a very good volumetric fog result; without TAA, it could take 60 steps or even more to reach similar quality. On top of that, ray-marched volumetric fog can only be applied as a post-process, so transparent objects that don't write depth show some artifacts.
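
For comparison, here is a minimal sketch of that post-process ray-marching approach (not the froxel implementation used later in this article; SampleFogDensity, SampleShadow and lightColor are hypothetical stand-ins for whatever the scene provides):

float4 RayMarchFog(float3 rayOrigin, float3 rayDir, float sceneDepth, int stepCount)
{
    // One uniform step per iteration; jittering the sample position per pixel hides
    // banding, and TAA then averages the noise out.
    float stepLen = sceneDepth / stepCount;
    float3 scattering = 0.0f;
    float transmittance = 1.0f;

    for (int i = 0; i < stepCount; i++)
    {
        float3 p = rayOrigin + rayDir * stepLen * (i + 0.5f);

        float density = SampleFogDensity(p);            // stand-in: a constant for uniform fog
        float stepTrans = exp(-density * stepLen);      // Beer-Lambert over this step
        float3 lightIn = lightColor * SampleShadow(p);  // stand-in: main light color * shadow visibility

        scattering += lightIn * (1.0f - stepTrans) * transmittance;
        transmittance *= stepTrans;
    }

    // rgb: in-scattered light, a: how much of the background remains visible
    return float4(scattering, transmittance);
}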

The other method is to use a 3D texture that stores the volumetric fog for the whole scene. When an object is drawn, its world-space position is used to sample this 3D texture, and the fogged color is computed directly in the fragment shader. The 3D texture costs more memory, but to a large extent this approach renders fog correctly for every object, and compared with 60 ray-marching steps it may even come out ahead on performance.

The volumetric fog implementation in this article references EA's Frostbite engine talk from SIGGRAPH 2015 and diharaw's OpenGL volumetric fog demo. Also worth a look are Bart Wronski's SIGGRAPH 2014 talk and the later Red Dead Redemption 2 course at SIGGRAPH 2019. The project uses Unity 2019.4.29 with URP.

The Implementation

  1. Store the fog information and shadow information the scene needs into a 3D texture aligned with the camera frustum. Following Frostbite, the texture size is (screen width / 8) x (screen height / 8) x 64, so it takes the same amount of memory as a screen-sized 2D texture; Unity's official volumetric fog project uses a depth of 128 for its 3D texture, though, so I set mine to 128 as well. The deeper the texture, the more fog detail it can resolve. Aligning the texture's width and height with the frustum is easy to picture, but how should the texture's depth axis map to actual depth? The simplest choice is a linear mapping to view-space depth, but that leaves too little resolution for nearby fog; another is a linear mapping to clip-space depth, which a little analysis shows is even worse; the best I have seen so far is an exponential mapping to view-space depth, so that slices are denser near the camera and sparser far away (the exact mapping is written out next to the GetDepth/GetRatio explanation below). This article only uses uniform fog, but the fog density at a point could be computed from its world-space position, noise, and whatever other operations you like.
  2. Use the fog and shadow information above to compute the scattered value Lscat. As the figure below shows, Lscat is the sum over all lights (only the main light in this article) of \(f(v, l)Vis(x, l)Li(x, l)\), written out right after this list. \(Vis(x, l)\) is the visibility of light l at point x, obtained by sampling the shadow map; \(Li(x, l)\) is the intensity of light l at point x, which is simple to compute; and \(f(v, l)\) describes how much of l is scattered toward the view direction v, usually called the phase function. We use the Henyey-Greenstein phase function, whose parameter g controls the fog's anisotropy: the closer g is to 1, the more light keeps its original direction as it passes through the fog; near 0 it scatters uniformly; near -1 it is scattered back the way it came (in the actual lighting we drop the \(\pi\) term to stay consistent with Unity's lighting model). Temporal (reprojection) blending can also be done in this step.
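
Written out for reference (this is just the sum described above, restricted to the single directional light used in this article):

$$ L_{scat}(x, v) = \sum_{l} f(v, l)\, Vis(x, l)\, L_i(x, l) $$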

$$ \tag{Henyey-Greenstein} p(\theta) = \frac 1 {4\pi} \frac {1 - g^2} {(1 + g^2 - 2g \cos \theta)^{\frac 3 2}} $$

(Figure: Volumetric Fog Scattering)

  3. Blend the 3D texture front to back, from the camera's near plane toward the far plane. This is itself a form of ray marching, except that it happens in the texture space of the 3D texture, advancing one voxel at a time. When blending the current voxel with the previous one, the physically based transmittance has to be taken into account: \(\varepsilon\) is a normalization constant, l is the distance between the two points, and c is the absorption of the medium (which can, to an extent, be represented by the fog density). See pages 28 and 29 of the Frostbite slides for the exact blending math and its explanation; the recurrence actually used here is written out after the formula below.

$$ \tag{Beer-Lambert} \text{transmittance} = e^{-\varepsilon l c} $$
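
As a recurrence over slices \(i\), marching front to back (this matches the AccumScatter function in the compute shader below: \(\sigma_i\) is the slice density, \(l_i\) its thickness, \(L_i\) its in-scattered color, and the 0.01 factor in the code plays the role of \(\varepsilon\)):

$$ T_i = T_{i-1}\, e^{-\varepsilon \sigma_i l_i}, \qquad S_i = S_{i-1} + T_{i-1}\, L_i \left(1 - e^{-\varepsilon \sigma_i l_i}\right) $$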

  4. Finally, when drawing objects, convert the object's world-space position into the 3D texture's coordinates, sample the 3D texture, multiply the object's own color by the transmittance, and add the fog's scattered color to get the final volumetric fog result.
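
In symbols, with \(S\) and \(T\) read from the scatter texture (exactly what the CompositeMain kernel below does):

$$ \text{finalColor} = \text{sceneColor} \cdot T + S $$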

Code and Notes

VolumetricFog.cs

Used in the Global Volume to conveniently add volumetric fog and control its parameters. The value worth thinking about is maxTransmittance: because the camera's far clip plane is usually quite far away, even thin fog always converges to a single color at the far end. This value prevents that by artificially limiting the fog's maximum opacity (even though it is still called maxTransmittance). fogNear actually controls the distance between the 3D texture and the camera; it's best set to 0, otherwise the temporal blending shows some artifacts.

using System;

namespace UnityEngine.Rendering.Universal
{
    [Serializable, VolumeComponentMenu("Post-processing/Volumetric Fog")]
    public class VolumetricFog : VolumeComponent, IPostProcessComponent
    {
        [Tooltip("是否启用体积雾")]
        public BoolParameter enabled = new BoolParameter(false);
        [Tooltip("整体控制体积雾强度")]
        public ClampedFloatParameter intensity = new ClampedFloatParameter(1.0f, 0f, 1.0f);
        [Tooltip("体积雾最大的透明程度(用于和天空混合)")]
        public ClampedFloatParameter maxTransmittance = new ClampedFloatParameter(1.0f, 0f, 1.0f);

        [Tooltip("体积雾的颜色倾向,目前强度为0.03")]
        public ColorParameter fogTint = new ColorParameter(Color.white);
        [Tooltip("体积雾距离相机最近的距离")]
        public ClampedFloatParameter fogNear = new ClampedFloatParameter(0.1f, 0.01f, 10f);
        [Tooltip("体积雾距离相机最远的距离")]
        public ClampedFloatParameter fogFar = new ClampedFloatParameter(100f, 1.0f, 1000.0f);

        [Tooltip("体积雾的密度,越密效果越明显")]
        public ClampedFloatParameter density = new ClampedFloatParameter(3.0f, 0f, 10.0f);
        [Tooltip("体积雾受光的各向异性程度")]
        public ClampedFloatParameter phase = new ClampedFloatParameter(0.0f, -0.9f, 0.9f);

        public bool IsActive() => (enabled.value && (density.value > 0.0f) && (intensity.value > 0.0f));

        public bool IsTileCompatible() => false;
    }
}

VolumetricFogRendererFeature.cs

A perfectly ordinary RendererFeature. Strictly speaking the RenderPassEvent should come after the depth prepass, but since I didn't modify the object shaders, I placed it before post-processing instead.

namespace UnityEngine.Rendering.Universal
{
    public class VolumetricFogRendererFeature : ScriptableRendererFeature
    {
        [System.Serializable]
        public class VolumetricFogSettings
        {
            public RenderPassEvent renderPassEvent = RenderPassEvent.BeforeRenderingPostProcessing;
            public ComputeShader volumetricFogComputeShader;
        }

        private VolumetricFogRenderPass volumetricFogRenderPass;
        public VolumetricFogSettings volumetricFogSettings = new VolumetricFogSettings();

        public override void Create()
        {
            volumetricFogRenderPass = new VolumetricFogRenderPass(volumetricFogSettings);
        }

        public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
        {
            if (renderingData.cameraData.cameraType == CameraType.Game)
            {
                VolumetricFog volumetricFog = VolumeManager.instance.stack.GetComponent<VolumetricFog>();
                if (volumetricFog && volumetricFog.IsActive())
                {
                    volumetricFogRenderPass.Setup(volumetricFog);
                    renderer.EnqueuePass(volumetricFogRenderPass);
                }
            }
        }
    }
}

VolumetricFogRenderPass.cs

A perfectly ordinary RenderPass. In real use only the Froxel Generate Pass and the Scatter Pass would be kept; the Composite Pass can be replaced entirely by the objects' own rendering.

namespace UnityEngine.Rendering.Universal
{
    public class VolumetricFogRenderPass : ScriptableRenderPass
    {
        private const string profilerTag = "Volumetric Fog Pass";
        private ProfilingSampler profilingSampler;
        private ProfilingSampler froxelSampler = new ProfilingSampler("Froxel Generate Pass");
        private ProfilingSampler scatterSampler = new ProfilingSampler("Scatter Pass");
        private ProfilingSampler compositeSampler = new ProfilingSampler("Composite Pass");

        private RenderTargetHandle cameraColor;
        private RenderTargetIdentifier cameraColorIden;
        private RenderTargetHandle cameraDepth;
        private RenderTargetIdentifier cameraDepthIden;
        private RenderTargetHandle cameraDepthAttachment;
        private RenderTargetIdentifier cameraDepthAttachmentIden;

        private VolumetricFog volumetricFog;
        private ComputeShader volumetricFogComputeShader;
        private VolumetricFogRendererFeature.VolumetricFogSettings settings;

        private RenderTexture[] froxelTextures;
        private RenderTextureDescriptor cubeDesc;

        private static readonly string froxelTextureOneName = "_FroxelBufferOne";
        private static readonly int froxelTextureOneID = Shader.PropertyToID(froxelTextureOneName);
        private RenderTargetHandle froxelTextureOneHandle;
        private RenderTargetIdentifier froxelTextureOneIden;

        private static readonly string froxelTextureTwoName = "_FroxelBufferTwo";
        private static readonly int froxelTextureTwoID = Shader.PropertyToID(froxelTextureTwoName);
        private RenderTargetHandle froxelTextureTwoHandle;
        private RenderTargetIdentifier froxelTextureTwoIden;

        private static readonly string scatterTextureName = "_ScatterBuffer";
        private static readonly int scatterTextureID = Shader.PropertyToID(scatterTextureName);
        private RenderTargetHandle scatterTextureHandle;
        private RenderTargetIdentifier scatterTextureIden;

        private static readonly string compositeTextureName = "_CompositeBuffer";
        private static readonly int compositeTextureID = Shader.PropertyToID(compositeTextureName);
        private RenderTargetHandle compositeTextureHandle;
        private RenderTargetIdentifier compositeTextureIden;

        private Vector2 colorTextureSize;
        private Vector2 invColorTextureSize;
        private Vector3 froxelTextureSize;
        private Vector3 invFroxelTextureSize;

        private Matrix4x4 lastViewProjMatrix;
        private int flipReadWrite = 0;

        public VolumetricFogRenderPass(VolumetricFogRendererFeature.VolumetricFogSettings settings)
        {
            this.settings = settings;
            profilingSampler = new ProfilingSampler(profilerTag);
            renderPassEvent = settings.renderPassEvent;
            volumetricFogComputeShader = settings.volumetricFogComputeShader;

            cameraColor.Init("_CameraColorTexture");
            cameraColorIden = cameraColor.Identifier();
            cameraDepth.Init("_CameraDepthTexture");
            cameraDepthIden = cameraDepth.Identifier();
            cameraDepthAttachment.Init("_CameraDepthAttachment");
            cameraDepthAttachmentIden = cameraDepthAttachment.Identifier();

            froxelTextureOneHandle.Init(froxelTextureOneName);
            froxelTextureOneIden = froxelTextureOneHandle.Identifier();
            froxelTextureTwoHandle.Init(froxelTextureTwoName);
            froxelTextureTwoIden = froxelTextureTwoHandle.Identifier();
            scatterTextureHandle.Init(scatterTextureName);
            scatterTextureIden = scatterTextureHandle.Identifier();
            compositeTextureHandle.Init(compositeTextureName);
            compositeTextureIden = compositeTextureHandle.Identifier();

            lastViewProjMatrix = Matrix4x4.identity;
        }

        public void Setup(VolumetricFog volumetricFog)
        {
            this.volumetricFog = volumetricFog;
        }

        private static void EnsureArray<T>(ref T[] array, int size, T initialValue = default(T))
        {
            if (array == null || array.Length != size)
            {
                array = new T[size];
                for (int i = 0; i != size; i++)
                    array[i] = initialValue;
            }
        }

        private static void EnsureRenderTexture(ref RenderTexture rt, RenderTextureDescriptor descriptor, string RTName)
        {
            if (rt != null && (rt.width != descriptor.width || rt.height != descriptor.height))
            {
                RenderTexture.ReleaseTemporary(rt);
                rt = null;
            }

            if (rt == null)
            {
                RenderTextureDescriptor desc = descriptor;
                desc.depthBufferBits = 0;
                desc.msaaSamples = 1;
                rt = RenderTexture.GetTemporary(desc);
                //rt = new RenderTexture(desc);
                rt.name = RTName;
                if (!rt.IsCreated()) rt.Create();
            }
        }

        public static void EnsureRT(ref RenderTexture[] froxelTexs, RenderTextureDescriptor descriptor)
        {
            EnsureArray(ref froxelTexs, 2);
            EnsureRenderTexture(ref froxelTexs[0], descriptor, "Froxel Tex One");
            EnsureRenderTexture(ref froxelTexs[1], descriptor, "Froxel Tex Two");
        }

        public override void Configure(CommandBuffer cmd, RenderTextureDescriptor cameraTextureDescriptor)
        {
            RenderTextureDescriptor desc = cameraTextureDescriptor;
            desc.enableRandomWrite = true;
            cmd.GetTemporaryRT(compositeTextureID, desc);
            colorTextureSize = new Vector2(desc.width, desc.height);
            invColorTextureSize = new Vector2(1.0f / desc.width, 1.0f / desc.height);

            int width = desc.width / 8;
            int height = desc.height / 8;
            int volumeDepth = 128;
            cubeDesc = new RenderTextureDescriptor(
                    width, height,
                    Experimental.Rendering.GraphicsFormat.R16G16B16A16_SFloat,
                    0);
            cubeDesc.volumeDepth = volumeDepth;
            cubeDesc.dimension = TextureDimension.Tex3D;
            cubeDesc.enableRandomWrite = true;

            froxelTextureSize = new Vector3(width, height, volumeDepth);
            invFroxelTextureSize = new Vector3(1.0f / width, 1.0f / height, 1.0f / volumeDepth);

            cmd.GetTemporaryRT(scatterTextureID, cubeDesc);
        }

        private void GenerateFroxel(CommandBuffer cmd, CameraData camData, RenderTargetIdentifier depthid,
            RenderTexture froxelReadid, RenderTexture froxelWriteid,
            ComputeShader computeShader)
        {
            int froxelKernel = computeShader.FindKernel("FroxelMain");
            computeShader.GetKernelThreadGroupSizes(froxelKernel, out uint x, out uint y, out uint z);
            cmd.SetComputeVectorParam(computeShader, "_FroxelTextureSize", froxelTextureSize);
            cmd.SetComputeVectorParam(computeShader, "_ColorTextureSize", colorTextureSize);
            cmd.SetComputeMatrixParam(computeShader, "_LastViewProj", lastViewProjMatrix);

            Matrix4x4 projMat = camData.GetGPUProjectionMatrix();
            Matrix4x4 viewMat = camData.GetViewMatrix();
            lastViewProjMatrix = projMat * viewMat;
            
            cmd.SetComputeTextureParam(computeShader, froxelKernel, "_DepthTexture", depthid);
            cmd.SetComputeTextureParam(computeShader, froxelKernel, "_FroxelTexture", froxelReadid);
            cmd.SetComputeTextureParam(computeShader, froxelKernel, "_RW_FroxelTexture", froxelWriteid);

            Color fogTint = volumetricFog.fogTint.value;
            fogTint.a = 0.03f;
            volumetricFog.fogTint.Override(fogTint);
            cmd.SetComputeVectorParam(computeShader, "_FogTint", volumetricFog.fogTint.value);
            cmd.SetComputeVectorParam(computeShader, "_NearFar",
                new Vector4(camData.camera.nearClipPlane, camData.camera.farClipPlane,
                            volumetricFog.fogNear.value, volumetricFog.fogFar.value));
            cmd.SetComputeVectorParam(computeShader, "_VolumetricFogParams",
                new Vector4(volumetricFog.phase.value,
                            volumetricFog.density.value,
                            volumetricFog.intensity.value,
                            volumetricFog.maxTransmittance.value));

            cmd.DispatchCompute(computeShader, froxelKernel,
                Mathf.CeilToInt(froxelTextureSize.x / x),
                Mathf.CeilToInt(froxelTextureSize.y / y),
                Mathf.CeilToInt(froxelTextureSize.z / z));
        }

        private void ScatterFog(CommandBuffer cmd, RenderTexture froxelid, RenderTargetIdentifier scatterid, ComputeShader computeShader)
        {
            int scatterKernel = computeShader.FindKernel("ScatterMain");
            computeShader.GetKernelThreadGroupSizes(scatterKernel, out uint x, out uint y, out uint z);

            cmd.SetComputeTextureParam(computeShader, scatterKernel, "_FroxelTexture", froxelid);
            cmd.SetComputeTextureParam(computeShader, scatterKernel, "_RW_ScatterTexture", scatterid);

            cmd.DispatchCompute(computeShader, scatterKernel,
                Mathf.CeilToInt(froxelTextureSize.x / x),
                Mathf.CeilToInt(froxelTextureSize.y / y),
                1);
        }

        private void CompositeVolumetricFog(CommandBuffer cmd, RenderTargetIdentifier colorid, RenderTargetIdentifier depthid, RenderTargetIdentifier scatterid, RenderTargetIdentifier compositeid, ComputeShader computeShader)
        {
            int compositeKernel = computeShader.FindKernel("CompositeMain");
            computeShader.GetKernelThreadGroupSizes(compositeKernel, out uint x, out uint y, out uint z);

            cmd.SetComputeTextureParam(computeShader, compositeKernel, "_ColorTexture", colorid);
            cmd.SetComputeTextureParam(computeShader, compositeKernel, "_DepthTexture", depthid);
            cmd.SetComputeTextureParam(computeShader, compositeKernel, "_ScatterTexture", scatterid);
            cmd.SetComputeTextureParam(computeShader, compositeKernel, "_RW_CompositeTexture", compositeid);

            cmd.DispatchCompute(computeShader, compositeKernel,
                Mathf.CeilToInt(colorTextureSize.x / x),
                Mathf.CeilToInt(colorTextureSize.y / y),
                1);

            cmd.Blit(compositeid, colorid);
        }

        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            CommandBuffer cmd = CommandBufferPool.Get(profilerTag);
            context.ExecuteCommandBuffer(cmd);
            cmd.Clear();

            EnsureRT(ref froxelTextures, cubeDesc);
            RenderTexture froxelReadTex = froxelTextures[flipReadWrite];
            RenderTexture froxelWriteTex = froxelTextures[1 - flipReadWrite];
            flipReadWrite = 1 - flipReadWrite;

            using (new ProfilingScope(cmd, froxelSampler))
            {
                GenerateFroxel(cmd, renderingData.cameraData, cameraDepthAttachmentIden, froxelReadTex, froxelWriteTex, volumetricFogComputeShader);
            }

            using (new ProfilingScope(cmd, scatterSampler))
            {
                ScatterFog(cmd, froxelWriteTex, scatterTextureIden, volumetricFogComputeShader);
            }

            using (new ProfilingScope(cmd, compositeSampler))
            {
                CompositeVolumetricFog(cmd, cameraColorIden, cameraDepthAttachmentIden,
                    scatterTextureIden, compositeTextureIden, volumetricFogComputeShader);
            }

            context.ExecuteCommandBuffer(cmd);
            cmd.Clear();
            CommandBufferPool.Release(cmd);
        }

        public override void FrameCleanup(CommandBuffer cmd)
        {        
            cmd.ReleaseTemporaryRT(scatterTextureID);
            cmd.ReleaseTemporaryRT(compositeTextureID);
        }
    }
}

VolumetricFogComputeShader.compute

Now for the main event. This compute shader has three kernels. The first computes the lighting from the fog and shadow information and stores it in _RW_FroxelTexture, also blending the current frame with the history and accounting for TAA. The second does the ray marching in texture space, computing the scattered color and transmittance. The third is really a screen-space post-process that draws the fog onto the screen; when the fog is evaluated in the object shaders instead, this kernel isn't needed.

GetDepth converts the texture's z coordinate to linear view-space depth (linear eye depth); GetRatio does the opposite, converting linear view-space depth back to the texture's z coordinate. The NOT_SIMPLIFIED macro keeps the un-simplified form, which makes the exponential distribution easier to follow.
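
For reference (derived directly from the two branches of GetDepth and GetRatio, so it adds nothing new), the mapping between the slice ratio \(r \in [0, 1]\) and linear view-space depth \(d\) is:

$$ d(r) = near_{fog} \left( \frac{far_{fog}}{near_{fog}} \right)^{r}, \qquad r(d) = \frac{\ln(d / near_{fog})}{\ln(far_{fog} / near_{fog})} $$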

To keep things simple, only the main light with cascaded shadows is considered here, and spherical harmonics are not integrated to account for global illumination's contribution to the fog.

#pragma kernel FroxelMain
#pragma kernel ScatterMain
#pragma kernel CompositeMain

#define _MAIN_LIGHT_SHADOWS
#define _MAIN_LIGHT_SHADOWS_CASCADE
#define _SHADOWS_SOFT

#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl"
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Lighting.hlsl"

Texture2D<float> _DepthTexture;
Texture2D<float4> _ColorTexture;
Texture3D<float4> _FroxelTexture;
Texture3D<float4> _ScatterTexture;

RWTexture3D<float4> _RW_FroxelTexture;
RWTexture3D<float4> _RW_ScatterTexture;
RWTexture2D<float4> _RW_CompositeTexture;

SamplerState sampler_LinearClamp;
SamplerState sampler_PointClamp;

float4 _ColorTextureSize;
//float4 _InvColorTextureSize;
float4 _FroxelTextureSize;
//float4 _InvFroxelTextureSize;

float _StepCount;
float4 _NearFar; // x: cam near, y: cam far, z: fog near, w: fog far
float4 _FogTint;
float4 _VolumetricFogParams;

#define _Phase _VolumetricFogParams.x
#define _Density _VolumetricFogParams.y
#define _Intensity _VolumetricFogParams.z
#define _MaxTransmittance _VolumetricFogParams.w

float4 _TAAOffsets;
float4x4 _LastViewProj;

float3 NDCToWorld(float3 ndc)
{
    ndc.xy = 2.0f * ndc.xy - 1.0f;
    ndc.y = -ndc.y;
    float4x4 invJitteredVP = UNITY_MATRIX_I_VP;//mul(UNITY_MATRIX_I_V, _InvJitteredProj);
    float4 positionWS = mul(invJitteredVP, float4(ndc, 1.0f));
    return positionWS.xyz / positionWS.w;
}

float Linear01DepthToRawDepth(float z, float4 zBufferParams)
{
    return (rcp(z) - zBufferParams.y) / zBufferParams.x;
}

float LinearEyeToRawDepth(float depth, float4 zBufferParams)
{
    return (1.0f / depth - zBufferParams.w) / zBufferParams.z;
}

float GetDepth(float2 camNearFar, float2 vfNearFar, float ratio)
{
#if NOT_SIMPLIFIED
    float valLeft = log(vfNearFar.x / camNearFar.x);
    float valRight = log(vfNearFar.y / camNearFar.x);
    float val = lerp(valLeft, valRight, ratio);
    float depthVal = camNearFar.x * exp(val);
    return depthVal;
#else
    float valLeft = log(vfNearFar.x);
    float valRight = log(vfNearFar.y);
    float val = lerp(valLeft, valRight, ratio);
    float depthVal = exp(val);
    return depthVal;
#endif
}

float GetRatio(float2 camNearFar, float2 vfNearFar, float linearDepth)
{
#if NOT_SIMPLIFIED
    float valLeft = log(vfNearFar.x / camNearFar.x);
    float valRight = log(vfNearFar.y / camNearFar.x);

    float val = log(linearDepth / camNearFar.x);
    float ratio = (val - valLeft) / (valRight - valLeft);
    return ratio;
#else
    float valLeft = log(vfNearFar.x);
    float valRight = log(vfNearFar.y);

    float val = log(linearDepth);
    float ratio = (val - valLeft) / (valRight - valLeft);
    return ratio;
#endif
}

float HGPhaseFunction(float g, float cosTheta)
{
    float g2 = g * g;
    float denominator = 1.0f + g2 - 2 * g * cosTheta;
    return 0.25 * (1.0f - g2) * rsqrt(denominator * denominator * denominator);
}

float3 GetFogColor(float3 color, float3 lightDir, float3 viewDir, float g)
{
    float cosVal = dot(-lightDir, viewDir);
    return color * HGPhaseFunction(g, cosVal);
}

float Hash13(float3 p)
{
    p = frac(p * 0.1031);
    p += dot(p, p.zyx + 31.32);
    return frac((p.x + p.y) * p.z);
}

[numthreads(8,8,8)]
void FroxelMain (uint3 id : SV_DispatchThreadID)
{
    float2 texcoord = (id.xy + 0.5f) / _FroxelTextureSize.xy;
    texcoord += 0.5f * _TAAOffsets.xy;
    float jitter = Hash13(float3(texcoord, _Time.y * id.z));
    float ratio = (id.z + jitter) / _FroxelTextureSize.z;
    float depthVal = GetDepth(_NearFar.xy, _NearFar.zw, ratio);
    float rawDepth = LinearEyeToRawDepth(depthVal, _ZBufferParams);
    float3 positionNDC = float3(texcoord, rawDepth);
    float3 positionWS = NDCToWorld(positionNDC);

    float3 viewDir = normalize(GetCameraPositionWS() - positionWS); 
    float4 shadowCoord = TransformWorldToShadowCoord(positionWS);   
    Light mainLight = GetMainLight(shadowCoord);
    float3 lightColor = mainLight.color * mainLight.shadowAttenuation;
    float3 lightDir = mainLight.direction;
    
    float3 fogColor = GetFogColor(lightColor, lightDir, viewDir, _Phase);
    fogColor += _FogTint.rgb * _FogTint.a;
    float density = _Density;

    float4 finalFroxel = float4(fogColor, density);

    // Reprojection Temporal Filter
    float ujRatio = (id.z + 0.5) / _FroxelTextureSize.z;
    float ujDepthVal = GetDepth(_NearFar.xy, _NearFar.zw, ujRatio);
    float ujRawDepth = LinearEyeToRawDepth(ujDepthVal, _ZBufferParams);
    float3 ujPositionNDC = float3(texcoord, ujRawDepth);
    float3 ujPositionWS = NDCToWorld(ujPositionNDC);
    float4 lastPositionCS = mul(_LastViewProj, float4(ujPositionWS, 1.0f));
    lastPositionCS /= lastPositionCS.w;
    lastPositionCS.y = -lastPositionCS.y;
    float3 lastNDC = float3(lastPositionCS.xy * 0.5 + 0.5, lastPositionCS.z);
    lastNDC.xy -= 0.5f * _TAAOffsets.zw;

    if(all(lastNDC > 0.0) && all(lastNDC < 1.0f))
    {
        float linearEyeDepth = LinearEyeDepth(lastNDC.z, _ZBufferParams);
        float reprojRatio = GetRatio(_NearFar.xy, _NearFar.zw, linearEyeDepth);
        float4 froxelTex = _FroxelTexture.SampleLevel(sampler_LinearClamp, float3(lastNDC.xy, reprojRatio), 0);
        finalFroxel = lerp(finalFroxel, froxelTex, 0.95);
    }

    _RW_FroxelTexture[id] = finalFroxel;
}

float SliceThickness(int z)
{
    float ratioThis = z / _FroxelTextureSize.z;
    float depthThis = GetDepth(_NearFar.xy, _NearFar.zw, ratioThis);

    float ratioNext = (z+1.0f) / _FroxelTextureSize.z;
    float depthNext = GetDepth(_NearFar.xy, _NearFar.zw, ratioNext);

    return depthNext - depthThis;
}

float4 AccumScatter(int z, float4 accum, float4 slice)
{
    slice.a = max(slice.a, 1e-5);
    float thickness = SliceThickness(z);

    float sliceTransmittance = exp(-slice.a * thickness * 0.01f);
    float3 sliceScattering = slice.rgb * (1.0f - sliceTransmittance);

    float3 accumScattering = accum.rgb + sliceScattering * accum.a;
    float accumTransmittance = accum.a * sliceTransmittance;
    return float4(accumScattering, accumTransmittance);
}

[numthreads(16,16,1)]
void ScatterMain (uint3 id : SV_DispatchThreadID)
{
    float4 accum = float4(0.0f, 0.0f, 0.0f, 1.0f);

    for (int z=0; z<_FroxelTextureSize.z; z++)
    {
        int3 coord = int3(id.xy, z);
        float4 slice = _FroxelTexture[coord];
        accum = AccumScatter(z, accum, slice);
        //_RW_ScatterTexture[coord] = slice;
        _RW_ScatterTexture[coord] = accum;
    }
}

[numthreads(16,16,1)]
void CompositeMain (uint3 id : SV_DispatchThreadID)
{  
    float2 texcoord = (id.xy + 0.5f) * rcp(_ColorTextureSize.xy);
    float3 colorTex = _ColorTexture.SampleLevel(sampler_PointClamp, texcoord, 0).rgb;
    float depthTex = _DepthTexture.SampleLevel(sampler_PointClamp, texcoord, 0);

    float linearEyeDepth = LinearEyeDepth(depthTex, _ZBufferParams);
    float ratio = GetRatio(_NearFar.xy, _NearFar.zw, linearEyeDepth);

    float4 froxelTex = _ScatterTexture.SampleLevel(sampler_LinearClamp, float3(texcoord, ratio), 0);
    
    float3 accumScatter = froxelTex.rgb;
    float accumTrans = max(1.0f - _MaxTransmittance, froxelTex.a);
    float3 finalColor = colorTex * accumTrans + accumScatter;
    finalColor = lerp(colorTex.rgb, finalColor, _Intensity);

    _RW_CompositeTexture[id.xy] = float4(finalColor, 1.0f);
}

Postscript

It's been a long time since I wrote a new post; I've been busy learning C++ and haven't had much time to make new things. Volumetric fog is still a pretty important effect. When I was doing ray marching before, I could never get a good handle on how far each step should be, and switching to exponential steps suddenly made everything click. I've also made a new version of TAA that I haven't had time to write up, and I've gone through GTAO as well, so those will have to wait for later.