
View Full Version : Opacity not working as in TS12



whitepass
November 24th, 2015, 10:59 AM
The opacity of windows in TANE is not the same as it was in TS12 and older. In TS12 you could change the opacity linearly; in TANE it is an either/or cutoff at around 70%. This is in Blender.

I have submitted a bug report.

VinnyBarb
November 24th, 2015, 03:29 PM
Welcome to the "wonderful" world of TANE. We were promised that what works in TS12 will work in TANE. Obviously not, just like several other features which work in TS12 but not in TANE. You know what these are, as I have posted about these non-working TANE features numerous times here.

I will wait until SP1 gets released, whenever that might be, to see what works and what still does not, and then make up my mind about continuing to create for TANE. Easy; no sleepless nights for me.

VinnyBarb

Tony_Hilliam
November 24th, 2015, 07:48 PM
I'm pretty sure the promise was "backwards compatible with almost everything able to work in TANE". I do know we're getting closer and closer.

VinnyBarb
November 25th, 2015, 04:06 PM
The repeated answer to us content creators from the powers that be (after numerous tags/features etc. were, and still are, not working in TANE) was: keep creating in TS12 and that will work in TANE.

VinnyBarb

WindWalkr
January 15th, 2016, 06:43 AM
The short version is that you probably need to turn your anti-aliasing setting up higher. For most materials in T:ANE (and for most materials in TS12 and TS2010 native mode, for that matter) we use alpha-to-coverage to produce order-independent transparency. The higher your AA settings, the more levels of opacity you are able to achieve.
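To put rough numbers on that, here is a minimal Python sketch of the idea (illustrative only; the sample counts are examples, not engine values):

# With N coverage samples per pixel, alpha-to-coverage can only
# represent N+1 distinct opacity levels.
def a2c_levels(samples):
    return [k / samples for k in range(samples + 1)]

for aa in (2, 4, 8):  # example MSAA sample counts
    print(f"{aa}x AA -> {[format(v, '.0%') for v in a2c_levels(aa)]}")
# 2x AA -> ['0%', '50%', '100%']
# 4x AA -> ['0%', '25%', '50%', '75%', '100%']
# 8x AA -> nine levels in 12.5% steps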

chris

TRam__
January 15th, 2016, 07:55 AM
Currently, as I understand it, only m.reflect uses alpha testing (not alpha-to-coverage). Are you going to add extra materials for choosing the alpha mode and the shadow rendering (excluded from the shadow render, included in it, or included but not casting a shadow itself)?

Alpha testing is worthwhile for cabin windows and various transparent illuminated meshes (ground lighting spots and corona glows, for example).

Excluding a mesh from shadows would be worthwhile in cases where we do not want to see self-cast shadows and can neglect shadows from other objects.

VinnyBarb
January 15th, 2016, 03:46 PM
The short version is that you probably need to turn your anti-aliasing setting up higher. For most materials in T:ANE (and for most materials in TS12 and TS2010 native mode, for that matter) we use alpha-to-coverage to produce order-independent transparency. The higher your AA settings, the more levels of opacity you are able to achieve.

chris

Would anyone please care to explain what this means to us mere mortals? I get this:


The higher your AA settings, the more levels of opacity you are able to achieve.


but the rest, to me and I guess to others, is "double dutch":


we use alpha-to-coverage to produce order-independent transparency.

I never found in TS10 or TS12 that I needed to crank up anti-aliasing to get more transparency out of whatever needed to be transparent on a created asset. Why would this be needed? More anti-aliasing is supposed to remove or smooth the "jaggies", near-horizontal lines appearing as "zig-zags", just like in my signature picture below, which was made with anti-aliasing set to maximum.

Thank you for an understandable answer.

VinnyBarb

Dinorius_Redundicus
January 15th, 2016, 05:57 PM
It will be interesting to see if Chris can provide a less "opaque" explanation of alpha-to-coverage. Google searching only turns up highly technical discussions couched in specialist terms that are equally incomprehensible to me. Like opening up a Russian doll.

However, I do know for a fact that Chris is right in saying anti-aliasing has affected transparency at least since TS2010. It is evident in anything that uses a blended alpha texture to give a smoothly graded change of opacity (e.g. in a shadow plane fading to transparent around the edges). Instead of smoothly fading out, the opacity decreases in steps if the anti-alias setting is low. It reminds me of bathtub rings.

What I didn't realise until recently (http://forums.auran.com/trainz/showthread.php?127087-Problems-with-opacity-in-m-tbumpenv&p=1477304#post1477304) was that AA settings also affect opacity set by the Blinn Basic parameters of a material. As you know, this type of opacity can be anywhere between 0-100% in increments of 1% within 3DS Max (and probably in Blender too). However in Trainz, if the AA is set too low, the number of opacity increments available to the game is reduced. You don't get bathtub rings (because this opacity applies to the whole texture) but in the worst case, a texture that is semi-opaque in Max will appear totally transparent or totally opaque in Trainz. Not good. Higher AA settings give a better approximation to the opacity specified in Max or Blender, but I'm guessing it still involves steps larger than the 1% increments available in Max/Blender.

As far as I can tell, the impetus for using alpha-to-coverage is that it makes it easier for the game to sort the spatial order of surfaces, i.e. which ones are in front of or behind others. However, to gain this advantage, it sacrifices fine control of opacity.
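To illustrate the stepping, here is a sketch (assuming a simple nearest-level quantisation; real drivers may dither differently):

# A smooth 0-100% alpha fade collapses into discrete bands when only
# (samples + 1) opacity levels exist - the "bathtub rings" effect.
def quantise(alpha, samples):
    return round(alpha * samples) / samples

fade = [i / 10 for i in range(11)]  # smooth gradient, 0% to 100%
print([format(quantise(a, 4), '.0%') for a in fade])
# ['0%', '0%', '25%', '25%', '50%', '50%', '50%', '75%', '75%', '100%', '100%']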



WindWalkr
January 15th, 2016, 07:43 PM
Are you going to add extra materials

We certainly will over time, but it's not the first thing on our priority list. As to exactly what materials, no idea at this stage. We don't want to go overboard with it, but there are obviously some enhancements that would be useful.



Alpha testing is worthwhile for cabin windows and various transparent illuminated meshes (ground lighting spots and corona glows, for example).

Do you mean alpha blending? A2C is already a superset of alpha testing.

chris

WindWalkr
January 15th, 2016, 08:16 PM
It will be interesting to see if Chris can provide a less "opaque" explanation of alpha-to-coverage. Google searching only turns up highly technical discussions couched in specialist terms that are equally incomprehensible to me. Like opening up a Russian doll.

It is by nature a pretty technical discussion. I'll have a stab at it. I'm going to assume you understand how a z-buffer works. If you don't, go and do your own reading on that.

Sorted Alpha
Alpha-blending is inherently incompatible with the use of a z-buffer. A z-buffer allows only one depth value to be tracked at each pixel, which means that once a surface has been rendered at 5m, a surface at 10m or 100m will be considered obscured. If the surface at 5m is a 10%-opaque window, it becomes trivial to see that many of the objects behind the window have stopped rendering. The only way to resolve this problem is to render in "painter's order" - that is, render the distant things first and progressively approach the camera. In this way, we can ensure that anything behind the window is not obscured by the window. This has a number of technical issues, however:

* Sorting is slow, and is performed on the CPU.
* Sorting typically works at the per-object level, so it doesn't work for two objects which fall into the same depth range.
* If you allow polygon-level sorting, then you have to dynamically split geometry at the CPU level and re-upload it to the GPU every frame, which is slow.
* Even with polygon-level sorting, you can still get a conflict - this requires slicing the polygons, which is even slower.

In practice, when this approach is used, we typically only perform the most basic steps and then live with the fact that the results are not perfect. This allows us to maintain a reasonable level of performance while achieving acceptable results as long as alpha isn't used too heavily. This doesn't work well when the blended surfaces span large z ranges, because then they can't sort effectively.
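As a sketch of what this CPU-side sorting looks like (the names and structures here are hypothetical, not engine code):

# "Painter's order": sort transparent objects back-to-front by distance
# from the camera before drawing, so nearer alpha surfaces blend over
# farther ones instead of z-rejecting them.
from dataclasses import dataclass

@dataclass
class Surface:
    name: str
    distance: float  # metres from the camera

scene = [
    Surface("window", 5.0),     # the 10%-opaque window
    Surface("platform", 10.0),
    Surface("hills", 100.0),
]

for s in sorted(scene, key=lambda s: s.distance, reverse=True):
    print(f"draw {s.name} at {s.distance} m")
# draws hills, then platform, then window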

Trainz has a lot of objects which use alpha and span large z ranges, including common items such as:

* Splines (e.g. high voltage power lines)
* Forests (e.g. a series of individually placed tree objects which the game has stitched into a single mesh.)
* In fact, anything where stitching is likely to combine distant objects.

In practice, we limit our reliance on this approach to a few objects in the scene. Interior windows, pfx, that kind of thing. The lower number of objects makes conflicts much less likely.

Alpha-to-Coverage
Alpha-to-coverage is conceptually similar to a screen-door dither. Each level of alpha is converted to a screen-door pattern, where every pixel in the screen-door is either "opaque" or "fully transparent". We're not using alpha blending at all here, so regular z-buffer techniques work perfectly. The dither is performed at the hardware level, so it's effectively free. No sorting or slicing is necessary. No cpu work is necessary. You can have as many overlapping layers of transparency as you want and it will still work fairly well.

Because of the way anti-aliasing is implemented in hardware, the dither doesn't actually occur at the pixel level. It occurs at the fragment level, where there may be quite a number of fragments per pixel. Once AA has performed its magic, the end result is true transparency, albeit quantised to certain levels. For example, say there were four fragments per pixel. This allows for the following combinations:

* All four fragments written - 100% opacity.
* Three fragments written, one left untouched - 75% opacity.
* Two fragments written, two left untouched - 50% opacity.
* One fragment written, three left untouched - 25% opacity.
* No fragments written - 0% opacity.

Obviously, the higher the level of AA selected, the more fragments we have available for this trick, and the better the result. Since increasing the AA level often has a minimal effect on performance, it's typically a fairly good tradeoff.
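A sketch of that conversion (the first-k fragment ordering is an assumption; real hardware uses vendor-specific dither patterns):

# Alpha-to-coverage with four fragments per pixel: alpha becomes a mask
# of written fragments, and the resolved pixel opacity is the fraction
# of fragments written.
SAMPLES = 4

def coverage_mask(alpha):
    k = round(alpha * SAMPLES)  # how many fragments to write
    return [i < k for i in range(SAMPLES)]

for alpha in (1.0, 0.75, 0.5, 0.25, 0.0):
    mask = coverage_mask(alpha)
    print(f"alpha {alpha:.2f} -> {mask} -> {sum(mask) / SAMPLES:.0%}")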

The quality of the dither also depends to some degree on your hardware, drivers, and driver settings. There isn't a single "correct" technique for implementing this, so different vendors give slightly different results.

The other negative of this technique is that blending is not additive. In real life, if you took a 50% opacity and placed it in front of another 50% opacity, the result should be something like 75% opacity. Even with a screen-door opacity, this is true - the screens won't align perfectly in your vision, although you may see some moire effects. With A2C however, the screens will align perfectly every time. The result is that the closer surface will exactly overlay the more distant surface, resulting in 50% + 50% = 50%. Anything visible through the distant object will also be visible through the near object. Anything obscured by the distant object will also be obscured by the near object.

In practice, this is not typically a problem, but there are some scenarios where it matters. A common example in Trainz is a forest made of trees that are 80% opaque. You would expect that this would quickly reach 100% opacity due to the overlapping trees, but with A2C this doesn't happen and you end up with a much less opaque forest than you expected. This is fairly easy to solve at the art level - just ensure that your trees have an opaque core - but it isn't expected unless you have a clear understanding of how A2C works.
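Here is the difference in numbers (a sketch only; blended() is the standard opacity-stacking formula, not engine code):

# True blending accumulates opacity across layers; aligned A2C masks
# do not, so stacked 80% trees stay at 80%.
def blended(alpha, layers):
    return 1 - (1 - alpha) ** layers  # each layer hides a fraction of what remains

def a2c_stacked(alpha, layers):
    return alpha  # aligned masks cover no new fragments

for layers in (1, 2, 5):
    print(f"{layers} layer(s) of 80%: blended {blended(0.8, layers):.0%}, "
          f"A2C {a2c_stacked(0.8, layers):.0%}")
# 1 layer: 80% vs 80%; 2 layers: 96% vs 80%; 5 layers: 100% vs 80%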

hth,

chris

ATSF854
January 15th, 2016, 08:54 PM
I reported something related to alpha textures back when SP1 beta testing started. The way it looks now seriously damages the visuals of the game: ARN numbers appear far too sharp at a distance, and alpha textures just do not look nearly as correct as they should.

This image should describe what I'm talking about for the most part

http://hostthenpost.org/uploads/e1fe4cf40e556204c6df3ca3931aa760_thumb.jpg (http://hostthenpost.org/uploads/e1fe4cf40e556204c6df3ca3931aa760.jpg)

And it isn't only on locomotives where this appears; it's on a very wide range of scenery objects and textures.

Adjusting anti-aliasing settings does not affect this at all.

WindWalkr
January 15th, 2016, 09:01 PM
ARN numbers appearing far too sharp at a distance

Difficult to judge from a screenshot, but it's quite possible that this is unrelated to the above. It looks suspiciously like there is no blending going on here at all.



Adjusting anti-aliasing settings does not affect this at all.

AA settings won't necessarily get rid of the banding entirely, but if it's literally having no effect then you're probably not seeing A2C problems but some other effect (perhaps an inappropriate texture compression setting?)

chris

pcas1986
January 16th, 2016, 03:12 AM
We certainly will over time, but it's not the first thing on our priority list. As to exactly what materials, no idea at this stage. We don't want to go overboard with it, but there are obviously some enhancements that would be useful.
...

Won't press you at this time but that sounds very interesting.

I was also curious about the "alpha to coverage" comment so I'll read your response and see if I can understand. I've just discovered Euclidean geometry as it applies to computer graphics and my brain hurts. :D

TRam__
January 16th, 2016, 04:14 AM
Do you mean alpha blending?

Yes, you are right.


In practice, we limit our reliance on this approach to a few objects in the scene. Interior windows, pfx, that kind of thing. The lower number of objects makes conflicts much less likely.

And the famous Trainz compass, and the red-and-green junction arrows... Why not add a separate material for blending?

As to "wrong order of object sorting"... In MSTS this order for definite interior/locomotive was setted with names of the materials. I don't propose to do the same way, but there are quite many cases when you can predict order of transparent meshes, and/or combinate blending with "alpha to coverage" (for traincar exterior front side of window is blended, backward is sorted).

As for "blending replacement"... For light beams something like pfx, but oriented along specified axis is required. For light glows (as http://hostthenpost.org/uploads/e1fe4cf40e556204c6df3ca3931aa760.jpg - the lower light glow) an additional "texture of self-illumination with mask" is required. But how to set static shadows and static light glows on the surface of other assets... Really, don't know how to replace them. With downward omni shadow/light source :D ?

Dinorius_Redundicus
January 16th, 2016, 07:45 AM
It is by nature a pretty technical discussion. I'll have a stab at it...

...hth,

chris

Wow, I understood most of it. Much appreciated and thanks for spending a good bit of your weekend on it!

WindWalkr
January 16th, 2016, 07:53 AM
Why not add a separate material for blending?

It's quite likely that we'll head in this direction. The main thing for us is to do it in a way that isn't overused, or we're back to square one. This could be as simple as a named material, or it could be something more situational.

One of the things that I've been mulling over is how to offer more material flexibility. We can't simply make materials a new asset type, for a few reasons:

* We pay a game startup cost per material type. A user who installed an extra 100 hypothetical material assets from the DLS might have to wait an extra five minutes for game startup. This isn't acceptable.

* We pay a runtime cost per number of materials in use (sort of.) The more materials we use, the slower everything runs. This doesn't prohibit the use of custom materials, but it certainly means that introducing large numbers of new materials without any mitigating controls is likely to lower performance noticeably.

* Shaders are heavily tied to the underlying render technology. As we make changes to the engine capabilities, the shaders may need to be rewritten to maintain support. It wouldn't be good to encourage third-party shaders, and then break them all (and all of the models that depend on them) with each major engine revision.

On the flipside, you and others have got some great ideas, and it's a shame to have to hold everybody back until N3V has time to look at each possible feature internally, when you're quite capable of doing this stuff externally given the right tools.

Exactly what to do about this? I'm not entirely sure yet. One option might be for this group to play around with the shaders a little bit (obviously you can already do this locally, but I'm talking about something a little more formal where we share and test the changes.) If there are changes that the entire group feel are positive and don't introduce new problems, we could consider making them official.

Not promising anything. Just thinking out loud to see what others have to say.

chris