Vertexmotion 1.2.4 Unity 5

Unity 4.1.2

The Unity 4.1.2 release brings you Web Player features and fixes. Read the release notes below for details. Note that in Unity 4.1.1 an issue was causing Asset Store packages to fail.

Mainly about Unity, from my own experience. Environment: Unity 5.5.2p2, VertExmotion 1.4.2. VertExmotion Pro is an asset I bought during a sale and then left sitting unused.

In 4.1.2, UnityEngine.dll and UnityEditor.dll are no longer signed/strong-named, and compiled assemblies can now reference the UnityEditor and UnityEngine namespaces again. The Unity 4.1.1 release notes have been combined with the Unity 4.1.2 notes below.

For more information about the previous main release, see its release notes.

Release notes

Features

Web Player: Secret information can now be injected into a running web player via JavaScript, with access to that information controlled by cryptographic signatures. Security.GetChainOfTrustValue can be used to retrieve this secret information, provided that it is called from an assembly whose cryptographic signature is valid and matches the signature passed in when the secret was injected. For further information, see the Manual.

Improvements

Android: Stylus support.

UI: Windows can now be displayed modally using the GUI.ModalWindow call. Modal windows appear on top of all other Unity GUI elements and prevent interaction with any Unity GUI elements outside the modal window while it is open. Only one modal window can be open during a frame.

Fixes

Android: Fixed touch and stylus input on devices where a single surface emits both at the same time.

Android: Fixed a performance regression with static batching. Android: Added a workaround for Android issue 41755 (ANR when using an Apple Magic Mouse). Android: Fixed an issue where 'Stream from disc' audio was very slow when used with the 'Split Application Binary' option. Android: Fixed an issue where 'Stream from disc' audio was broken when used with the 'Split Application Binary' option. DX11: Fixed the GPU profiler not working (regression in 4.1). Fixed a crash in libmono.so due to GC abort.

Fixed 'ArgumentException: Invalid platform target' when trying to switch platform. Graphics: Fixed a crash in Graphics.activeColorBuffer, activeDepthBuffer, etc. when there is no active render target (regression in 4.1). GUI: Modal GUI windows are now always on top of all other Unity GUI elements, regardless of the GUI.depth value. GUI: When right-clicking on a window, context-menu events are once again fired properly.

Native Client: Fixed the 'Development Player' label showing on non-development players. Shuriken: Fixed an issue where particle rotations were broken. Social API: Fixed a problem loading leaderboard scores in Mac OS X 10.8 GameCenter. UnityEngine.dll and UnityEditor.dll are no longer signed/strong-named. Compiled assemblies can once again reference the UnityEditor and UnityEngine namespaces. Web Player: The Chain of Trust system now requires signed assemblies to be loaded with the new Security.LoadAndVerifyAssembly method in order to be eligible to call Security.GetChainOfTrustValue.

Values stored in the Chain of Trust are still only accessible to assemblies signed with the specified public key. Web Player: Fixed a bug where using the WWW class in the web player would cause 50 ms hiccups for each instance. Web Player: Application.ExternalEval and Application.ExternalCall now properly escape JavaScript strings with embedded escape sequences.

The depth-buffer approach only works with fully opaque objects. Semitransparent objects require a different approach. We'll deal with those in a future tutorial. This process is repeated for the secondary light, except now we're adding to what's already there. Once again, the fragment program is only run if nothing is in front of what we're rendering.

If so, we end up at the exact same depth as the previous pass, because it's for the same object. So we end up recording the exact same depth value. Because writing to the depth buffer twice is not necessary, let's disable it. This is done with the ZWrite Off shader statement.
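To see where these statements live, here is a minimal, self-contained sketch of a two-pass shader: a base pass for the main directional light, and an additive pass for a second light. The shader name, the program names, and the bare-bones diffuse lighting are illustrative assumptions rather than the tutorial's actual shader; handling point lights and spotlights in the additive pass is covered further on. The two render-state statements themselves are repeated on their own right after the sketch.

Shader "Sketch/SecondLight" {
    CGINCLUDE
    #include "UnityCG.cginc"
    #include "UnityLightingCommon.cginc"

    struct Interpolators {
        float4 position : SV_POSITION;
        float3 normal : TEXCOORD0;
    };

    Interpolators MyVertexProgram (appdata_base v) {
        Interpolators i;
        i.position = mul(UNITY_MATRIX_MVP, v.vertex);
        i.normal = UnityObjectToWorldNormal(v.normal);
        return i;
    }

    float4 MyFragmentProgram (Interpolators i) : SV_TARGET {
        // Bare-bones diffuse term; treats the current light as directional.
        float3 normal = normalize(i.normal);
        float3 color = saturate(dot(normal, _WorldSpaceLightPos0.xyz)) * _LightColor0.rgb;
        return float4(color, 1);
    }
    ENDCG

    SubShader {
        Pass {
            Tags { "LightMode" = "ForwardBase" }

            CGPROGRAM
            #pragma vertex MyVertexProgram
            #pragma fragment MyFragmentProgram
            ENDCG
        }

        Pass {
            Tags { "LightMode" = "ForwardAdd" }
            Blend One One // add this light's contribution to what is already there
            ZWrite Off    // the base pass already wrote the correct depth values

            CGPROGRAM
            #pragma vertex MyVertexProgram
            #pragma fragment MyFragmentProgram
            ENDCG
        }
    }
}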

Blend One One
ZWrite Off

Draw Call Batches

To get a better idea of what's going on, you can enable the Stats panel at the top right corner of the game view. Look at the number of batches, as well as those saved by batching. These represent draw calls. Do this with only the main light active.

Five batches, seven total. As we have six objects, you'd expect six batches. But with dynamic batching enabled, all three cubes are combined into a single batch. So you'd expect four batches, with two saved. But you might have five batches. The extra batch is caused by dynamic shadows.

Let's eliminate it by entirely disabling shadows in the quality settings, via Edit / Project Settings / Quality. Make sure that you adjust the quality settings that you're currently using in the editor. No more shadows, four batches. Why do I still have an additional batch? Those are only supported for static light mapping. We'll cover that topic in a future tutorial.

To support spotlights as well, we have to add SPOT to the keyword list of our multi-compile statement.

#pragma multi_compile DIRECTIONAL POINT SPOT

Our additive shader now has three variants.

// Snippet #1 platforms ffffffff:
DIRECTIONAL POINT SPOT
3 keyword variants used in scene:
DIRECTIONAL POINT SPOT

Spotlights have a position, just like point lights. So when either POINT or SPOT is defined, we have to compute the light direction.

#if defined(POINT) || defined(SPOT)
    light.dir = normalize(_WorldSpaceLightPos0.xyz - i.worldPos);
#else
    light.dir = _WorldSpaceLightPos0.xyz;
#endif

Spotlight with 60° angle. This was already enough to get spotlights to work.

They end up with a different UNITY_LIGHT_ATTENUATION macro, which takes care of the cone shape. The attenuation approach starts identical to that of a point light. Convert to light space, then compute the attenuation factor. Then, force the attenuation to zero for all points that lie behind the origin. That limits the light to everything in front of the spotlight. Then the X and Y coordinates in light space are used as UV coordinates to sample a texture. This texture is used to mask the light.
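Concretely, here is a lightly paraphrased sketch of the SPOT variant of UNITY_LIGHT_ATTENUATION as it appears in Unity 5's AutoLight include file; the exact code differs between Unity versions:

#ifdef SPOT
uniform sampler2D _LightTexture0;
uniform unityShadowCoord4x4 unity_WorldToLight;
uniform sampler2D _LightTextureB0;
inline fixed UnitySpotCookie (unityShadowCoord4 LightCoord) {
    // The mask texture is sampled with the perspective-divided XY light-space coordinates.
    return tex2D(_LightTexture0, LightCoord.xy / LightCoord.w + 0.5).w;
}
inline fixed UnitySpotAttenuate (unityShadowCoord3 LightCoord) {
    // Distance attenuation comes from a falloff lookup texture.
    return tex2D(_LightTextureB0, dot(LightCoord, LightCoord).xx).UNITY_ATTEN_CHANNEL;
}
#define UNITY_LIGHT_ATTENUATION(destName, input, worldPos) \
    unityShadowCoord4 lightCoord = mul(unity_WorldToLight, unityShadowCoord4(worldPos, 1)); \
    fixed destName = (lightCoord.z > 0) * UnitySpotCookie(lightCoord) * UnitySpotAttenuate(lightCoord.xyz) * SHADOW_ATTENUATION(input);
#endif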

The texture is simply a circle with a blurred edge. This produces a light cylinder. To turn it into a cone, the conversion to light space is actually a perspective transformation, and uses homogeneous coordinates.

The spotlight's mask is a specific use of a light cookie. Directional lights can use cookies as well, which rely on the DIRECTIONAL_COOKIE keyword. What does UNITY_LIGHT_ATTENUATION look like for a directional cookie? As there is no distance attenuation, only the cookie is sampled.

#ifdef DIRECTIONAL_COOKIE
uniform sampler2D _LightTexture0;
uniform unityShadowCoord4x4 unity_WorldToLight;
#define UNITY_LIGHT_ATTENUATION(destName, input, worldPos) \
    unityShadowCoord2 lightCoord = mul(unity_WorldToLight, unityShadowCoord4(worldPos, 1)).xy; \
    fixed destName = tex2D(_LightTexture0, lightCoord).w * SHADOW_ATTENUATION(input);
#endif

Cookies for Point Lights

Point lights can also have cookies. In this case, the light goes in all directions, so the cookie has to wrap around a sphere. This is done by using a cube map. You can use various texture formats to create a point light cookie, and Unity will convert it to a cube map. You'll have to specify the Mapping so Unity knows how to interpret your image. The best method is to provide a cube map yourself, in which case the automatic mapping mode suffices. Point light cookie cube map.
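For comparison, the POINT_COOKIE variant of UNITY_LIGHT_ATTENUATION samples that cube map with texCUBE on top of the usual point light distance attenuation. Again a lightly paraphrased sketch based on Unity 5's AutoLight include; details vary between versions:

#ifdef POINT_COOKIE
uniform samplerCUBE _LightTexture0;
uniform unityShadowCoord4x4 unity_WorldToLight;
uniform sampler2D _LightTextureB0;
#define UNITY_LIGHT_ATTENUATION(destName, input, worldPos) \
    unityShadowCoord3 lightCoord = mul(unity_WorldToLight, unityShadowCoord4(worldPos, 1)).xyz; \
    fixed destName = tex2D(_LightTextureB0, dot(lightCoord, lightCoord).rr).UNITY_ATTEN_CHANNEL * texCUBE(_LightTexture0, lightCoord).w * SHADOW_ATTENUATION(input);
#endif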

Point light cookies don't have any additional settings. Point light with cookie. At this point, we'd have to add both the DIRECTIONAL_COOKIE and POINT_COOKIE keywords to our multi-compile statement. It is becoming quite a long list. Because it is such a common list, Unity provides us with a shorthand pragma statement that we can use instead.

#pragma multi_compile_fwdadd
// #pragma multi_compile DIRECTIONAL DIRECTIONAL_COOKIE POINT POINT_COOKIE SPOT

You can verify that this indeed produces the five variants that we need.

// Snippet #1 platforms ffffffff:
DIRECTIONAL DIRECTIONAL_COOKIE POINT POINT_COOKIE SPOT
5 keyword variants used in scene:
POINT DIRECTIONAL SPOT POINT_COOKIE DIRECTIONAL_COOKIE

And don't forget to compute the light direction for point lights with a cookie as well.

#if defined(POINT) || defined(POINT_COOKIE) || defined(SPOT)
    light.dir = normalize(_WorldSpaceLightPos0.xyz - i.worldPos);
#else
    light.dir = _WorldSpaceLightPos0.xyz;
#endif

Point light with a cookie.

What does UNITY_LIGHT_ATTENUATION look like in this case? It is the POINT_COOKIE variant sketched earlier, which combines the point light's distance attenuation with a sample from the cookie cube map. When a light doesn't qualify for pixel light status, Unity can render it per vertex instead, using up to four vertex lights. Even when fewer vertex lights are needed, you still compute four vertex lights.

Some of them will simply be black. So you always pay the price of four lights. Switching between vertex and pixel lights. By default, Unity decides which lights become pixel lights. You can override this by changing a light's Render Mode. Important lights are always rendered as pixel lights, regardless of the limit.

Lights that are not important are never rendered as pixel lights. Light render mode.
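Vertex lights are evaluated in the vertex program rather than per fragment. Here is a rough sketch of how that is commonly done with the Shade4PointLights helper from UnityCG.cginc; the interpolator struct, its field names, and the VERTEXLIGHT_ON multi-compile setup are assumptions for illustration:

// The base pass needs: #pragma multi_compile _ VERTEXLIGHT_ON
struct Interpolators {
    float4 position : SV_POSITION;
    float3 normal : TEXCOORD0;
    float3 worldPos : TEXCOORD1;
    float3 vertexLightColor : TEXCOORD2; // combined color of up to four vertex lights
};

void ComputeVertexLightColor (inout Interpolators i) {
    #if defined(VERTEXLIGHT_ON)
        // Shade4PointLights evaluates four per-vertex point lights from the
        // light data Unity puts in these built-in arrays.
        i.vertexLightColor = Shade4PointLights(
            unity_4LightPosX0, unity_4LightPosY0, unity_4LightPosZ0,
            unity_LightColor[0].rgb, unity_LightColor[1].rgb,
            unity_LightColor[2].rgb, unity_LightColor[3].rgb,
            unity_4LightAtten0, i.worldPos, i.normal
        );
    #else
        i.vertexLightColor = 0;
    #endif
}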

Spherical Harmonics

When we've used up all pixel lights and all vertex lights, we can fall back to yet another method of rendering lights. We can use spherical harmonics. This is supported for all three light types. The idea behind spherical harmonics is that you can describe all incoming light at some point with a single function. This function is defined on the surface of a sphere.

Typically, this function is described with spherical coordinates. But you can use 3D coordinates as well. That allows us to use our object's normal vector to sample the function. To create such a function, you'd have to sample the light intensity in all directions, then figure out how to turn that into a single, continuous function. To be perfect, you'd have to do this for every point on every object's surface.

This is of course not possible. We'll have to suffice with an approximation. First, we'll only define the function from the point of view of the object's local origin.

This is fine for lighting conditions that don't change much along the surface of the object. This is true for small objects, and lights that are either weak or far away from the object. Fortunately, this is typically the case for lights that don't qualify for pixel or vertex light status. Second, we also have to approximate the function itself. You can decompose any continuous function into multiple functions of different frequencies. These are known as bands.

For an arbitrary function, you might need an infinite number of bands to do this. A simple example is composing sinusoids. Start with a basic sine wave. Sine wave, `sin(2pi x)`. This is the first band. For the second band, use a sine wave with double the frequency, and half the amplitude.

Double frequency, half amplitude, `sin(4pi x) / 2`. When added together, these two bands describe a more complex function. Two bands, `sin(2pi x) + sin(4pi x) / 2`. You can keep adding bands like this, doubling the frequency and halving the amplitude each step. Third and fourth bands.

Each band that you add makes the function more complex. Four sine wave bands, `sum_(i=1)^4 sin(2pi i x) / i^2`. This example used regular sine waves with a fixed pattern. To describe an arbitrary function with sine waves, you'd have to adjust the frequency, amplitude, and offset of each band until you get a perfect match. If you use fewer bands than needed for a perfect match, you end up with an approximation of the original function. The fewer bands you use, the less accurate the approximation gets.
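As a tiny standalone illustration (not part of any shader in this text), the four-band sum shown above can be evaluated like this:

static const float PI = 3.14159265;

// Sum of the four bands shown above: sin(2 pi i x) / i^2 for i = 1..4.
float FourSineBands (float x) {
    float sum = 0;
    for (int i = 1; i <= 4; i++) {
        sum += sin(2 * PI * i * x) / (i * i);
    }
    return sum;
}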

This technique is used to compress lots of things, like sound and image data. In our case, we'll use it to approximate 3D lighting. The bands with the lowest frequencies correspond with the large features of the function. We definitely want to keep those. So we'll discard the bands with higher frequencies. This means that we lose the details of our lighting function.

This is fine if the lighting doesn't vary quickly, so once again we'll have to limit ourselves to diffuse light only.

Spherical Harmonics Bands

The simplest approximation of lighting is a uniform color.

The lighting is the same in all directions. This is the first band, which we'll identify as `Y0^0`. It is defined by a single sub-function, which is simply a constant value. The second band introduces linear directional light. For each axis, it describes where most of the light is coming from. As such, it is split into three functions, identified with `Y1^-1`, `Y1^0`, and `Y1^1`.

Each function includes one of our normal's coordinates, multiplied by a constant. The third band gets more complex. It consists of five functions, `Y2^-2` through `Y2^2`. These functions are quadratic, meaning that they contain the product of two of our normal's coordinates.

We could keep going, but Unity uses only these first three bands. Here they are, in a table.

All terms should be multiplied by `1 / (2 sqrt pi)`.

l = 0: m = 0: `1`
l = 1: m = -1: `-y sqrt 3`, m = 0: `z sqrt 3`, m = 1: `-x sqrt 3`
l = 2: m = -2: `xy sqrt 15`, m = -1: `-yz sqrt 15`, m = 0: `(3z^2 - 1) sqrt 5 / 2`, m = 1: `-xz sqrt 15`, m = 2: `(x^2 - y^2) sqrt 15 / 2`

This is really a single function, split so you can identify its sub-functions. The final result is all nine terms added together. Different lighting conditions are created by modulating each of the nine terms with an additional factor.
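To make the table concrete, here is a small standalone helper (not part of any Unity include) that evaluates the nine terms for a normal n, leaving out the shared `1 / (2 sqrt pi)` factor:

void EvaluateSH9Basis (float3 n, out float basis[9]) {
    basis[0] = 1;                                      // Y0^0
    basis[1] = -n.y * sqrt(3);                         // Y1^-1
    basis[2] = n.z * sqrt(3);                          // Y1^0
    basis[3] = -n.x * sqrt(3);                         // Y1^1
    basis[4] = n.x * n.y * sqrt(15);                   // Y2^-2
    basis[5] = -n.y * n.z * sqrt(15);                  // Y2^-1
    basis[6] = (3 * n.z * n.z - 1) * sqrt(5) / 2;      // Y2^0
    basis[7] = -n.x * n.z * sqrt(15);                  // Y2^1
    basis[8] = (n.x * n.x - n.y * n.y) * sqrt(15) / 2; // Y2^2
}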

What determines the shape of this function? Spherical harmonics are a solution to Laplace's equation, in the context of a sphere. The math is rather involved. The definition of a function part is `Yl^m = Kl^m e^(im varphi) Pl^|m|(cos theta)`, with `l in NN` and `-l <= m <= l`, where `K` is a normalization constant and `P` is the associated Legendre polynomial.

Why does the `Y2^0` term look different from the other terms in its band? `(3z^2 - 1) sqrt 5 / 2 = z^2 (3 sqrt 5) / 2 - sqrt 5 / 2`, so that's `z^2` with a factor, plus an extra constant.

As the whole table represents a single formula, the `-sqrt 5 / 2` constant can be merged into the `Y0^0` term. You can visualize the normal coordinates to get a sense of which directions the terms represent. For example, here's a way to color positive coordinates white and negative coordinates red.

float t = i.normal.x;
return t > 0 ? t : float4(1, 0, 0, 1) * -t;

You can then visualize each term by using i.normal.x, i.normal.y, and so on. 1, y, z, x, xy, yz, zz, xz, xx - yy.

Using Spherical Harmonics

Every light that gets approximated by spherical harmonics has to be factored into 27 numbers: nine coefficients for each of the three color channels. Fortunately, Unity can do this very quickly. The base pass can access them via a set of seven float4 variables, defined in UnityShaderVariables. UnityCG contains the ShadeSH9 function, which computes lighting based on the spherical harmonics data and a normal parameter. It expects a float4 parameter, with its fourth component set to 1.
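As a minimal usage sketch, assuming the fragment program has access to an interpolated world-space normal in i.normal (the variable names are illustrative):

// UnityCG.cginc defines ShadeSH9; UnityShaderVariables holds the unity_SH* data.
float3 shColor = ShadeSH9(float4(normalize(i.normal), 1));
// shColor can then be added to the indirect / ambient contribution of the base pass.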

What does ShadeSH9 look like?