[DISCUSSION] Shader coordinate changes with transparency

Share your Construct 2 effect files

Post » Wed Feb 11, 2015 6:59 am

So I have been wondering - is it normal that the local shader coordinate space gets thrown out the window the moment transparency or another shader comes into play? Say I have a simple scaling shader:
[screenshot: the scaling shader working in local (object) coordinates]
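For reference, a minimal scaling effect in C2's .fx style might look like this (a sketch; the `scale` parameter name is hypothetical, and this naive version assumes vTex spans 0-1 across the object's texture, which is exactly the assumption that breaks below):

```
varying mediump vec2 vTex;
uniform lowp sampler2D samplerFront;
uniform mediump float scale; // hypothetical effect parameter

void main(void)
{
    // Scale the texture co-ordinates about the centre of the texture.
    mediump vec2 tex = (vTex - 0.5) / scale + 0.5;
    gl_FragColor = texture2D(samplerFront, tex);
}
```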

If one element is set to transparent, it suddenly uses screen-space coordinates:
[screenshot: the same shader switching to screen-space coordinates on a transparent element]

Which obviously breaks the effect, but that's nothing compared to several transparent elements:
[screenshot: several transparent elements all sharing screen-space coordinates]

Now they all live in screen-space and somehow manage to exist within each other's bounds.

I'd like to ask @Ashley if this is expected behaviour and, if so, whether there is a way around it.

Post » Wed Feb 11, 2015 8:55 pm

I do not know if you know this, but I tried the scroll shader and C2's own fade together and it produced strange results. But maybe you posted this because of my findings, or from your own findings with the border shader problem. It looks like transparency is the culprit for many of the shader troubles?

Post » Wed Feb 11, 2015 9:06 pm

helena wrote:I do not know if you know this, but I tried the scroll shader and C2's own fade together and it produced strange results. But maybe you posted this because of my findings, or from your own findings with the border shader problem. It looks like transparency is the culprit for many of the shader troubles?


I did see your post and had observed this previously as well. Hopefully in the end we can have proper shader behaviour for all.

Post » Mon Feb 23, 2015 12:31 pm

This happens in order to optimise shader rendering in some cases. If possible, the object is simply drawn directly to the screen with the shader applied, which is the fastest way. However, if you have other shaders or effects - including setting the opacity - then the shader doesn't know about them. In order to get those included in the visual appearance, it reverts to full-chain rendering with an intermediate surface (a transparent render texture the size of the screen). The rendering process then goes like this:

1. Render the object with its effect to the intermediate surface.
2. Render the object from the intermediate surface to the screen using the opacity.

This makes sure the shader effect has the opacity applied. If there was a second shader, it would render to the screen with the second shader instead of the opacity. If you have two shaders and opacity, it goes:

1. Render the object with effect1 to surface1.
2. Render the object from surface1 with effect2 to surface2.
3. Render the object from surface2 to the screen with the opacity.

This does mean that foreground texture co-ordinates work differently at different stages of the effect pipeline. The workaround is to work from the destStart and destEnd shader parameters, which give texture co-ordinates based on the background draw area (which is always a texture the size of the screen) rather than the foreground (which could be either an object texture or an intermediate screen-sized texture).
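Putting that workaround into code, a sketch of the idea (assuming vTex, destStart and destEnd share a texture space in the full-chain case, so the division renormalises vTex to 0-1 across the object's draw area; the `scale` parameter name is hypothetical):

```
varying mediump vec2 vTex;
uniform lowp sampler2D samplerFront;
uniform mediump vec2 destStart;
uniform mediump vec2 destEnd;
uniform mediump float scale; // hypothetical effect parameter

void main(void)
{
    // Renormalise to 0-1 across the object's draw area, so the effect
    // behaves the same whether vTex refers to the object's own texture
    // or a screen-sized intermediate surface.
    mediump vec2 n = (vTex - destStart) / (destEnd - destStart);

    // Apply the effect in the normalised space...
    n = (n - 0.5) / scale + 0.5;

    // ...then map back to actual texture co-ordinates before sampling.
    gl_FragColor = texture2D(samplerFront, mix(destStart, destEnd, n));
}
```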
Scirra Founder

Post » Mon Feb 23, 2015 12:36 pm

Thanks for the explanation, Ashley.

The workaround is lacking, however, as we can instantly see in the cases shown above - if we'd like the contents to move at an angle along the object, it becomes basically impossible. I'm guessing there's no real solution to this?

Post » Thu Jun 25, 2015 4:43 am

Yeah, I've tried to make a simple reflection shader effect (duplicated below, flipped, with a gradiated opacity fade-out, etc.) and then came across the mysterious vTex behaviour too.

I also noticed it in the distortion shaders: when the layout scrolls, the waves go crazy fast, so they're only useful as overlays.
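A naive version of such a reflection pass might look like this (a sketch; it flips and fades using vTex directly, which only holds up in the direct-draw case - with opacity or other effects in the chain, vTex stops being object-local and the result breaks as described above):

```
varying mediump vec2 vTex;
uniform lowp sampler2D samplerFront;

void main(void)
{
    // Flip vertically about the centre of the foreground texture.
    lowp vec4 col = texture2D(samplerFront, vec2(vTex.x, 1.0 - vTex.y));

    // Fade out along the vertical axis for a reflection-style gradient.
    gl_FragColor = col * (1.0 - vTex.y);
}
```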

