Wednesday, July 26, 2017

G2 Online

After finishing my military service (which was basically working as a programmer at a company), I started an indie game project with my friend. We called it "Tactical Fighters".
We wrote a DX8-based in-house engine and built a prototype gameplay on top of it.
We showed it to my boss from the previous company. We couldn't get funding, but he proposed that we join his new company and do a new project there.

That is the start of "G2 online".



G2 was a sequel to "GoldWing2", which had minor success in the company.
Basically, G2 was a flying FPS game, not a flight simulator.
The team was very small most of the time: only 2~3 programmers plus small design and art teams. Yet the feature set was very complex. It had most features of a typical MMORPG, plus some cool new features like a replay system.

I referenced the Halo 3 replay system a lot. Players could record their play and share it with other players.

We made a story mode too. I learned that making a game with a story is a very hard and time-consuming job; it needs a lot of work from the design/art teams. If we hadn't had a story mode, we could have shipped this game a lot earlier.

Anyway, this game didn't succeed in the Korean game market, but it had some fans in Japan.

YouTube video uploaded by a Japanese fan.

It was many years ago, and this memory suddenly came back to me this morning. I could still find some images and movie clips on the internet.

This project taught me many things:
- Making a game takes a long time and the hard work of many people.
- Hard work does not guarantee success.
- Building both a game engine and gameplay with 2~4 programmers is a bad idea (if it hadn't been an online game, maybe it would not have been that hard).
- South Koreans don't like sci-fi and military themes, at least in the game market.



Some more screenshots.
Boss (Gigantula)


Lobby
Boss (Leviathan)





Monday, July 24, 2017

Reconstructing world normal from depth to resolve a decal artifact.

UE4 has deferred decals, and they are widely used in our levels.
One day our artists complained that vertically projected decal pixels show a stretching artifact like the photo below.


At first, I thought that multiplying dot(worldnormal, float3(0, 0, 1)) into the opacity could resolve this issue, so that vertical faces would not receive the decal. But it turned out I was too naive.

A deferred decal material cannot read the world normal GBuffer data while it is writing normals to the GBuffer. So I could not use the world normal to determine whether a face is vertical.

I was almost frustrated, but soon found the solution: we can actually reconstruct the world normal from the depth buffer using the ddx/ddy instructions.

worldnormal = normalize(cross(ddx(worldposition), ddy(worldposition)))
Multiplying this into the opacity gives this!

Vertical faces no longer receive the decal.
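To sanity-check the math on the CPU, here is a small C++ sketch (my own illustrative code, not engine source). The shader's ddx/ddy are emulated by passing in the world positions of neighboring pixels, and the reconstructed normal is used to mask the decal opacity on vertical faces:

```cpp
#include <cmath>

struct V3 { float x, y, z; };

static V3 cross3(V3 a, V3 b) {
    return { a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x };
}
static V3 sub3(V3 a, V3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static V3 normalize3(V3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// ddx/ddy in a pixel shader are just differences between neighboring
// pixels; here we pass the neighboring world positions in explicitly.
V3 NormalFromDepth(V3 wpCenter, V3 wpRight, V3 wpDown) {
    V3 dx = sub3(wpRight, wpCenter);  // ~ ddx(worldposition)
    V3 dy = sub3(wpDown,  wpCenter);  // ~ ddy(worldposition)
    return normalize3(cross3(dx, dy));
}

// Opacity mask: 1 on horizontal faces, 0 on vertical ones.
float DecalOpacityMask(V3 n) {
    float d = n.z;                    // dot(n, float3(0, 0, 1))
    return d > 0.f ? d : 0.f;         // saturate
}
```

Note the normalize: the derivative vectors are not unit length, so the raw cross product must be normalized before it is usable as a normal.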








Monday, June 19, 2017

My little journey to next-gen skin shading.

Recently, I spent some time reading papers, UE4 shader code, and various web pages about skin shading models. It was very interesting, and I made some progress that I want to share.

Sub Surface Scattering
Skin is a translucent surface and is usually rendered with shading models based on scattering theory.
Light enters at one point, travels inside the translucent tissue, and comes out at another point. This sounds simple, but it needs some tricks to work in a real-time renderer.

Diffusion Profile
All proper skin shading models are based on this theory and data. It describes how light is scattered as a function of distance from the incident point.
People measure how much light is scattered using real skin.

Perpendicular laser on real skin

Different diffusion curves among color channels
Gaussian functions are used to fit these curves


Actual fitted Gaussian parameters
The curves are fitted to a sum of Gaussian functions. This fitted function is used to blur adjacent diffuse color pixels or to pre-integrate a diffuse BRDF texture. I was curious whether there is any measured data from Asian or African American skin on the internet, but had no success. What about some alien whose blood is blue? Maybe I should worry about that later.
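As an illustration, the sum-of-Gaussians profile can be evaluated directly in C++. The variances and per-channel weights below are the commonly quoted d'Eon/Luebke fit; treat the exact numbers as an assumption of this sketch, not authoritative data:

```cpp
#include <cmath>

// A 2D Gaussian normalized over the plane: G(v, r) = exp(-r^2 / (2v)) / (2*pi*v)
static float Gauss2D(float v, float r) {
    return std::exp(-r * r / (2.f * v)) / (2.f * 3.14159265f * v);
}

struct RGB { float r, g, b; };

// Sum-of-Gaussians fit of the skin diffusion profile.
// Variances (mm^2) and per-channel weights follow the widely cited
// d'Eon/Luebke fit; the exact numbers are an assumption here.
RGB DiffusionProfile(float dist /* mm from the incident point */) {
    static const float v[6] = { 0.0064f, 0.0484f, 0.187f, 0.567f, 1.99f, 7.41f };
    static const RGB   w[6] = {
        { 0.233f, 0.455f, 0.649f },
        { 0.100f, 0.336f, 0.344f },
        { 0.118f, 0.198f, 0.000f },
        { 0.113f, 0.007f, 0.007f },
        { 0.358f, 0.004f, 0.000f },
        { 0.078f, 0.000f, 0.000f },
    };
    RGB out = { 0.f, 0.f, 0.f };
    for (int i = 0; i < 6; ++i) {
        float g = Gauss2D(v[i], dist);
        out.r += w[i].r * g; out.g += w[i].g * g; out.b += w[i].b * g;
    }
    return out;
}
```

Evaluating this shows the key property of the measured curves: red scatters much farther than green and blue, which is why scattered skin lighting looks reddish.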

Screen Space Sub Surface Scattering
Jorge Jimenez's method.
http://www.iryoku.com/sssss/

The core idea is blurring the diffuse irradiance in screen space using the fitted Gaussian kernel. Of course, the blur distance is adjusted according to depth and angle to be accurate.
This technique is well suited to a deferred renderer, and the quality is very good.
Some people would not agree this is the best method, due to the low quality of the default diffuse irradiance and the overly blurred look, but it is still a great skin rendering method.
I didn't do much research on this because the UE4 version (SubSurfaceProfile) is perfect in my opinion.

Pre-Integrated Skin (Penner)
Another popular method is Penner's Pre-Integrated Skin.
https://www.slideshare.net/leegoonz/penner-preintegrated-skin-rendering-siggraph-2011-advances-in-realtime-rendering-course

The basic idea is to model the skin surface as a circle and integrate the diffusion profile over two parameters (the incident angle and the curvature of the skin surface). He argues that these two factors are the most important parts of skin scattering.

This generates a texture like the one below.


The actual HLSL shader code would look like:
float3 ScatteredLight = Texture2DSampleLevel(PreIntegratedBRDF, PreIntegratedBRDFSampler, float2(saturate(dot(N, L) * .5 + .5), Curvature), 0).rgb;

 U : indexed by cosine
 V : indexed by curvature (1/r)

There are many other details in this technique, but the basic idea is like this.
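To make the idea concrete, here is a hedged C++ sketch of how one texel of that texture could be generated: integrate the clamped Lambert term around a ring of radius r, weighting each sample by the diffusion profile evaluated at the chord distance between the sample and the shaded point. The one-channel, two-Gaussian profile here is purely illustrative, not Penner's actual fit:

```cpp
#include <cmath>

// Simplified single-channel diffusion profile: one narrow + one wide
// Gaussian. Illustrative numbers only, not a measured fit.
static float Profile(float d) {
    return 0.7f * std::exp(-d * d / (2.f * 0.02f))
         + 0.3f * std::exp(-d * d / (2.f * 1.0f));
}

// One texel of the pre-integrated BRDF texture: light integrated around
// a ring of radius r (curvature = 1/r), with theta the angle between
// the surface normal and the light (cosTheta = dot(N, L)).
float IntegrateDiffuseScatteringOnRing(float cosTheta, float r) {
    float theta = std::acos(cosTheta);
    float num = 0.f, den = 0.f;
    const int N = 1024;
    for (int i = 0; i < N; ++i) {
        float x = -3.14159265f + 2.f * 3.14159265f * (i + 0.5f) / N;
        float diffuse = std::cos(theta + x);       // Lambert term on the ring
        if (diffuse < 0.f) diffuse = 0.f;          // saturate
        float dist = 2.f * r * std::sin(x * 0.5f); // chord length to the sample
        float w = Profile(std::fabs(dist));        // scattered contribution
        num += diffuse * w;
        den += w;
    }
    return num / den;  // normalized: a flat surface falls back to Lambert
}
```

The normalization makes flat surfaces (large r) converge to plain saturate(dot(N, L)), while strongly curved surfaces (small r) get wrap-around lighting past the terminator, which is exactly what the texture stores.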

UE4's version of the Pre-Integrated Skin shading model
The overall structure is the same:
- A pre-integrated BRDF texture (indexed by cosine and a curvature-like parameter)

But there are several differences between the UE4 version and the original:
- The pre-integrated skin BRDF is applied only in shadowed regions.
Unshadowed regions are shaded by the default lit model (this is incorrect and makes skin too bright).
- The BRDF texture doesn't have different diffusion profiles among color channels.
The texture tone is grayscale: it contains the same diffusion for all channels and is multiplied by a subsurface color provided by the artist.
Actually, an artist-provided subsurface color is not that bad an idea. But it is very difficult to make realistic human skin using UE4's pre-integrated skin shading model.

I think this is why Epic recommends the SubSurfaceProfile model for high-quality skin rendering.

With a proper implementation according to the original paper, we can get the result below.
My implementation
Due to the limitations of a deferred renderer, I could not implement normal blurring. It could look a little harsh with a rough normal map, but it was OK with our art direction (we could control our normal maps).

Translucency
With these two methods, rendered human skin already looks like skin. But we need more.
Because skin is translucent tissue, incident light actually transmits through thin parts,
and should produce effects like the one below:

Light transmitted through a thin ear.


And Jorge Jimenez wrote a paper about this.

The most important idea is calculating the thickness from the light's incident point to the actual shading point. With this thickness, we know how much light will be transmitted.
We know these two points in the shadow projection shader (the shadow depth value and the shaded world position).

Actually, UE4 already calculates this transmittance value and calls it the "Sub Surface Shadow Term". This term is usually multiplied into various subsurface lighting terms (e.g., two-sided foliage). But it seems that both skin shading models lack this effect.

So I added Jorge Jimenez's translucency effect to the pre-integrated skin shading model.

With environment lighting

Due to shadow depth range precision issues, spot lights seem to be the most accurate for calculating thickness (transmittance, or the sub-surface shadow term). Point lights lack this feature.
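A minimal C++ sketch of the transmittance computation, assuming the thickness comes straight from the shadow map. It reuses the sum-of-Gaussians profile shape as an exponential falloff over thickness; the exact constants are the commonly quoted fit and are an assumption here, not UE4's code:

```cpp
#include <cmath>

// Transmittance through skin as a function of traversed thickness,
// using a sum-of-Gaussians falloff (single channel; the weights sum
// to 1 so zero thickness gives full transmission). The constants are
// an assumption of this sketch.
static float Transmittance(float s /* thickness in mm */) {
    float s2 = -s * s;
    return 0.233f * std::exp(s2 / 0.0064f)
         + 0.100f * std::exp(s2 / 0.0484f)
         + 0.118f * std::exp(s2 / 0.187f)
         + 0.113f * std::exp(s2 / 0.567f)
         + 0.358f * std::exp(s2 / 1.99f)
         + 0.078f * std::exp(s2 / 7.41f);
}

// Sub-surface shadow term: the thickness is the gap between the depth
// the shadow map recorded at the light and the depth of the shaded
// point, scaled into world units.
float SubSurfaceShadowTerm(float shadowDepth, float shadingDepth, float worldScale) {
    float thickness = (shadingDepth - shadowDepth) * worldScale;
    if (thickness < 0.f) thickness = 0.f;
    return Transmittance(thickness);
}
```

Thin parts (an ear, a nostril) produce a small thickness and therefore a high transmittance; thick parts go to zero, so the term can simply be multiplied into the back-lighting contribution.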

Skin BRDF with Indirect Lighting
All of the above methods are for direct lighting. Sometimes game characters are in lighting conditions with only indirect lighting (shadowed regions), and then all of this fancy skin shading disappears.

UE4 multiplies the subsurface color into the diffuse irradiance (from the spherical harmonics probe). That could be acceptable, but there is something we can do about this sad situation: the graphics programmers at Ready At Dawn suggested an excellent technique for applying a skin BRDF with an indirect probe.

The idea is like this:
normally, the clamped cosine is used as the transfer function that is dotted with the radiance spherical harmonics probe; these are zonal harmonics coefficients projected to represent the clamped cosine function.
The idea of the RAD programmers is to use a special zonal harmonics transfer function instead, projected to represent the diffuse skin BRDF. They explain their idea kindly in their paper.
http://blog.selfshadow.com/publications/s2013-shading-course/rad/s2013_pbs_rad_notes.pdf

I implemented this, and the image below shows the generated zonal harmonics coefficients indexed by curvature.
The X axis is curvature, and the Y axis is the zonal harmonics order.

First row: order 0; second row: order 1; third row: order 2
X axis: surface curvature
(with 30,000 random samples during the Monte Carlo integration)

And this is the result. It was quite impressive.

You can compare it with the normal diffuse irradiance from the light probe.


Lighting condition used:
an emissive sphere and one light probe (no direct light).
The 3 zonal harmonics coefficients were projected using the generation code below.

Order 0
_func = [this](double theta, double rInv)
{
    float R = rInv * 255;
    float radius = 2.0f * 1.f / ((R + 1) / (float)SizeX);

    float cosTheta = FMath::Cos(theta);
    auto BRDF = IntegrateDiffuseScatteringOnRing(cosTheta, radius);

    // 0.28209479177 = sqrt(1 / (4 * PI)), the order-0 SH basis constant
    return 0.28209479177 * BRDF.X * sin(theta);
};
D0R = 2 * PI * MonteCarloIntegral1D(0, PI / 2, NumMonteCarloSample, rInv);

Order 1
_func = [this](double theta, double rInv)
{
    float R = rInv * 255;
    float radius = 2.0f * 1.f / ((R + 1) / (float)SizeX);

    float cosTheta = FMath::Cos(theta);
    auto BRDF = IntegrateDiffuseScatteringOnRing(cosTheta, radius);

    // 0.4886025119 = sqrt(3 / (4 * PI)), the order-1 SH basis constant
    return 0.4886025119 * cos(theta) * BRDF.X * sin(theta);
};
D1R = 2 * PI * MonteCarloIntegral1D(0, PI / 2, NumMonteCarloSample, rInv);

Order 2
_func = [this](double theta, double rInv)
{
    float R = rInv * 255;
    float radius = 2.0f * 1.f / ((R + 1) / (float)SizeX);

    float cosTheta = FMath::Cos(theta);
    auto BRDF = IntegrateDiffuseScatteringOnRing(cosTheta, radius);

    // 0.31539156525 = sqrt(5 / (16 * PI)), the order-2 SH basis constant
    return 0.31539156525 * (3 * pow(cos(theta), 2) - 1) * BRDF.X * sin(theta);
};
D2R = 2 * PI * MonteCarloIntegral1D(0, PI / 2, NumMonteCarloSample, rInv);

I used the skin BRDF integration from this link.

I think more advanced skin rendering techniques will come later, but these are the best for now.


Monday, May 1, 2017

Volumetric Fog

I implemented volumetric fog using Bart Wronski's method (screen-aligned voxels and a compute shader) a while ago.



And a few days ago, the UE 4.16 preview turned out to include a volumetric fog feature using exactly the same method I used. I was frustrated for a little while (maybe 5 minutes :) ), but I was also somehow proud, because at least they had chosen the same algorithm.

Anyway, this method is easy to implement, the quality is good, and the performance is ready for production.

https://bartwronski.com/publications/

The idea is simple.

1. Convert the cascaded shadow map into an exponential shadow map (ESM).
2. Inject sun lighting (including shadows) into the voxels.
3. Scatter along depth using a 2D compute shader.
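Step 3 can be sketched as a CPU loop over one froxel column (the real thing runs once per screen tile in a compute shader; this is my own illustrative code, not the paper's):

```cpp
#include <cmath>
#include <vector>

struct Slice { float scattering; float extinction; };

// March front to back through the froxel column, accumulating
// in-scattered light attenuated by the transmittance so far.
// accumulated[i] is the fog contribution seen at slice i's depth.
std::vector<float> IntegrateScattering(const std::vector<Slice>& slices,
                                       float sliceDepth) {
    std::vector<float> accumulated(slices.size());
    float transmittance = 1.f;
    float inscatter = 0.f;
    for (size_t i = 0; i < slices.size(); ++i) {
        // light scattered toward the camera inside this slice
        inscatter += transmittance * slices[i].scattering * sliceDepth;
        // fog in this slice absorbs/out-scatters light from behind it
        transmittance *= std::exp(-slices[i].extinction * sliceDepth);
        accumulated[i] = inscatter;
    }
    return accumulated;
}
```

Because each slice only depends on the running transmittance, the whole scatter pass is a single sequential sweep along depth, which is why it maps so cleanly to a small compute shader.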

There were several difficulties (e.g., the ESM blur should differ among CSM cascades),
but the implementation is intuitive overall. If you use an in-house game engine, it is worth implementing. If you use UE4, check out the 4.16 preview.

Thursday, August 4, 2016

My strange global illumination solution.

Global illumination is a spatially/directionally low-frequency signal.
A lightmap is spatially high-frequency but cannot store high-frequency directional data due to memory constraints.
Specular IBL is also an important part of global illumination. Most game engines use pre-filtered cubemaps for specular GI. These have high-frequency directional data, but their spatial frequency is very low (1~300 cubemaps per level in UE4).

So I thought: if specular IBL is spatially low-frequency data anyway, what if diffuse GI matches it, giving up high spatial frequency for diffuse GI too?

Below is the result of that thought. There is only one probe, which can be updated in real time (no lightmap), plus HBAO+.


Diffuse indirect lighting may be spatially incorrect, but it matches the specular IBL,
uses very little memory (no lightmap), and can easily be interpolated (time of day).

Probes can be interpolated using a volume texture (a 3D clipmap in a large world) or blended via tiled culling in a compute shader. I tried both and chose tiled culling with a compute shader.

Friday, April 15, 2016

Fourier Opacity Map volumetric shadows.

I implemented volumetric shadows for the particle system in UE3.
There were several options for this effect. The three most feasible were:

1. Per-pixel deep shadow map (needs compute shaders, which UE3 didn't have back then)
2. FOM
3. A simple thickness map

I tried #2 first. The result was quite good for a reasonable cost. But I finally chose the simple thickness map technique because of the severe ringing artifacts on some particles with high opacity! Ringing appears in spherical harmonics too, because both Fourier and SH use wave-like basis functions.
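For readers unfamiliar with FOM, here is a small C++ sketch of the core idea (my own illustrative code, not engine source): each particle's optical depth is accumulated into a truncated Fourier series along the light ray, and the transmittance at any depth is reconstructed from the coefficients. The truncation is exactly what causes the ringing mentioned above:

```cpp
#include <cmath>

const int K = 4;  // number of Fourier coefficient pairs (as in a 4-tap map)
const float PI2 = 2.f * 3.14159265f;

struct FOM { float a0 = 0.f; float a[K] = {}; float b[K] = {}; };

// "Render" a particle into the map: add its optical depth
// (-log of its transmittance) at normalized shadow depth z in [0,1].
void AddParticle(FOM& m, float z, float opticalDepth) {
    m.a0 += 2.f * opticalDepth;
    for (int k = 1; k <= K; ++k) {
        m.a[k - 1] += 2.f * opticalDepth * std::cos(PI2 * k * z);
        m.b[k - 1] += 2.f * opticalDepth * std::sin(PI2 * k * z);
    }
}

// Reconstruct accumulated optical depth up to depth z by integrating
// the truncated series analytically.
float OpticalDepth(const FOM& m, float z) {
    float t = m.a0 * 0.5f * z;
    for (int k = 1; k <= K; ++k) {
        float w = PI2 * k;
        t += (m.a[k - 1] / w) * std::sin(w * z)
           + (m.b[k - 1] / w) * (1.f - std::cos(w * z));
    }
    return t;
}

float TransmittanceAt(const FOM& m, float z) {
    return std::exp(-OpticalDepth(m, z));
}
```

With a single high-opacity particle, the reconstructed transmittance in front of it wiggles away from 1 instead of staying flat; with a low, smoothly distributed opacity the few coefficients are enough, which matches the behavior described above.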

Today, I found that UE4 uses FOM for volumetric self-shadowing, and the documentation warns about the ringing artifact.

Why did they choose FOM? Maybe they found a solution for the ringing.
I will look into the source code later.

Sunday, March 27, 2016

Incorrect naming in the UE4 skylight source code.

Recently, I have been reading the UE4 skylight code, and found that they call the projected SH coefficients "IrradianceEnvironmentMap".

Actually, this naming is wrong.
They turn it into an irradiance SH function by multiplying in the cosine lobe at the very end. But until then, it is a projected "Radiance SH", not an "Irradiance SH".

Let's recall the basic rendering process using spherical harmonics.

1. Project SH coefficients from the radiance cube map.
2. Convert the Radiance SH to Irradiance SH by a dot product with the cosine lobe
(Ravi Ramamoorthi's paper).
3. Evaluate the irradiance environment map with the normal vector.
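A tiny numeric sketch of steps 1-2 (using the well-known Ramamoorthi & Hanrahan per-band convolution factors): for a uniform environment of radiance 1, only the band-0 coefficient is nonzero, and after convolving with the clamped cosine the evaluated irradiance comes out to exactly pi, as expected. Before the Ahat multiply, the coefficient is still a radiance coefficient:

```cpp
#include <cmath>

const float PI = 3.14159265f;

// Per-band convolution with the clamped-cosine lobe (Ramamoorthi &
// Hanrahan). This multiply is the step that turns Radiance SH into
// Irradiance SH.
const float Ahat[3] = { PI, 2.f * PI / 3.f, PI / 4.f };

// The band-0 basis function Y00 is constant: 1 / (2 * sqrt(pi)).
const float Y00 = 0.28209479f;

// Step 1: project a uniform radiance of 1 over the sphere.
// Only the L00 coefficient is nonzero.
float ProjectUniformRadianceL00() {
    return Y00 * 4.f * PI;  // integral of Y00 * 1 over the sphere
}

// Steps 2-3: convolve with the cosine lobe, then evaluate.
// (Higher bands are zero for this environment, so band 0 suffices.)
float EvaluateIrradiance(float radianceL00) {
    float irradianceL00 = Ahat[0] * radianceL00;  // Radiance SH -> Irradiance SH
    return irradianceL00 * Y00;                   // evaluate at the normal
}
```

The point of the post in one line: `ProjectUniformRadianceL00()` is a radiance coefficient; only after the `Ahat[0]` multiply does it deserve the name "irradiance".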

Let's look at two different pieces of shader code for SH shading.

The code below is HLSL from MJP on gamedev.net.
It is easy to understand:
there is a radiance SH and a cosine lobe.



And below is code from Peter-Pike Sloan.
This code looks like just an SH evaluation.
The secret is that the cosine lobe coefficients are already multiplied into the radiance SH (so it finally becomes irradiance!).


The latter code may be a lot faster (because it uses fewer shader instructions).

UE4 calls the value after step 1 "IrradianceEnvironmentMap" (it becomes irradiance in the end, but it is not at first).

Although the final rendering is OK, the incorrect naming confuses stupid readers like me.