[DISCUSSION] Shader coordinate changes with transparency

  • So I have been wondering - is it normal that the local shader coordinate space gets thrown out the window the moment transparency or another shader comes into play? Say I have a simple scaling shader:
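
    Something along these lines (a minimal sketch, not my exact effect; `scale` is a made-up parameter, and I'm assuming the usual `vTex`/`samplerFront` uniforms from the effects SDK):

    ```glsl
    varying mediump vec2 vTex;           // foreground texture co-ordinates
    uniform lowp sampler2D samplerFront; // the object's texture
    uniform mediump float scale;         // made-up effect parameter

    void main(void)
    {
        // Scale the texture co-ordinates around the centre of the object.
        // This assumes vTex spans 0..1 over the object itself.
        mediump vec2 tex = (vTex - 0.5) / scale + 0.5;
        gl_FragColor = texture2D(samplerFront, tex);
    }
    ```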

    If one element is set to transparent it suddenly uses screen-space coordinates:

    Which, obviously, breaks the effect, but that's nothing compared to several transparent elements:

    Now they all live in screen-space and somehow manage to exist within each other's bounds.

    I'd like to ask Ashley if this is supposed to be expected behaviour and if so, is there a way around it?

  • I do not know if you know this, but I tried the scroll shader and C2's own fade together and it produced strange results. Maybe you posted this because of my findings, or from your own findings with the border shader problem. It looks like transparency is the culprit behind many of the shader troubles?

  • I do not know if you know this, but I tried the scroll shader and C2's own fade together and it produced strange results. Maybe you posted this because of my findings, or from your own findings with the border shader problem. It looks like transparency is the culprit behind many of the shader troubles?

    I did see your post and had observed this previously as well. Hopefully in the end we can have proper shader behaviour for all.

  • The reason this happens is to optimise rendering of shaders in some cases. If possible the object is simply drawn directly to the screen with the shader applied, which is the fastest way. However if you have other shaders or effects - including setting the opacity - then the shader doesn't know about them. In order to get that included in the visual appearance, it reverts to a full-chain rendering with an intermediate surface (a transparent render texture the size of the screen). The rendering process then goes like this:

    1. Render the object with its effect to the intermediate surface.

    2. Render the object from the intermediate surface to the screen using the opacity.

    This makes sure the shader effect has the opacity applied. If there were a second shader, step 2 would instead render from the intermediate surface to the screen with the second shader applied rather than the opacity. If you have two shaders and opacity, it goes:

    1. Render the object with effect1 to surface1.

    2. Render the object from surface1 with effect2 to surface2.

    3. Render the object from surface2 to the screen with the opacity.

    This does mean that foreground texture co-ordinates work differently at different stages of the effect pipeline. The workaround is to work from the destStart and destEnd shader parameters, which give texture co-ordinates based on the background draw area (which is always a texture the size of the screen) rather than the foreground (which could be either an object texture or an intermediate screen-sized texture).
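
    A rough sketch of that workaround in a fragment shader (assuming the standard `samplerFront`/`vTex`/`destStart`/`destEnd` uniforms, and that `vTex` and `destStart`/`destEnd` are in the same co-ordinate space at the stage where the effect runs; the scaling itself is just illustrative):

    ```glsl
    varying mediump vec2 vTex;
    uniform lowp sampler2D samplerFront;
    uniform mediump vec2 destStart;   // draw area in background co-ordinates
    uniform mediump vec2 destEnd;
    uniform mediump float scale;      // illustrative parameter

    void main(void)
    {
        // Normalise the co-ordinates relative to the object's draw area,
        // so the effect sees a consistent 0..1 space at every stage
        // of the pipeline instead of raw foreground co-ordinates.
        mediump vec2 n = (vTex - destStart) / (destEnd - destStart);

        // Apply the effect in the normalised space...
        n = (n - 0.5) / scale + 0.5;

        // ...then map back to foreground co-ordinates for sampling.
        mediump vec2 tex = n * (destEnd - destStart) + destStart;
        gl_FragColor = texture2D(samplerFront, tex);
    }
    ```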

  • Thanks for the explanation, Ashley.

    The workaround is lacking, however, as we can see in the cases shown above: if we'd like the contents to move at an angle along the object, it becomes basically impossible. I'm guessing there's no real solution to this?

  • Yeah, I've tried to make a simple reflection shader effect (duplicated below, flipped, gradiated opacity fade-out, etc.) and then came across the mysterious vTex behaviour too.

    I also noticed it in the distortion shaders: when the layout scrolls, the waves go crazy fast, so it's only useful as an overlay.
