
Extremely high poly assets not displaying in preview



clam1952
August 19th, 2015, 02:11 PM
Chris,

There seems to be an issue with some recently updated cars, not mine I should add!

When displayed in preview (build 76536), all that appears is blue sky and nothing else. It also freezes the preview, making it difficult to get out using Escape; clicking the close button on the window shuts T:ANE down completely. They also don't appear in game.

On investigation, these are extremely high poly assets: for example, 955,212 polys for the main mesh, 240,696 polys for LOD 1 and 222 polys for LOD 2. There are also 18 image files. These updates are all build 3.7.

My question is: is this failure to display intentional because of the extremely high poly count, or is it some kind of bug or some other problem?

Obviously I am now avoiding these like the plague! However, they are used on a lot of routes as they are nice-looking assets.
The original version (2.7) was only 26,044 polys with no LOD, but was working OK in T:ANE.

WindWalkr
August 19th, 2015, 07:25 PM
Not a problem that I've encountered. Do you have a repro?

chris

clam1952
August 19th, 2015, 07:36 PM
Not a problem that I've encountered. Do you have a repro?

chris

<kuid2:68787:1141:17> Smart_Blau2 is one of a batch of car updates, all afflicted with the same problem.

WindWalkr
September 17th, 2015, 11:32 PM
A horribly inefficient model, but it runs without any problems (other than performance) on my machine.

If you put the preview into performance analysis mode (not the default) then it may be running at minutes per frame (exact numbers dependent on your GPU), which may feel like it's hung, but it's still running correctly.

chris

JCitron
September 18th, 2015, 12:34 AM
A horribly inefficient model, but it runs without any problems (other than performance) on my machine.

If you put the preview into performance analysis mode (not the default) then it may be running at minutes per frame (exact numbers dependent on your GPU), which may feel like it's hung, but it's still running correctly.

chris

Minutes per frame... Is that like molasses left outside in the Arctic? :)

No wonder the program felt like it had hung...

clam1952
September 18th, 2015, 07:01 AM
Seems like the "Smart" thing to do is to avoid them. ;)
I'm wondering how many of the problems some are having with T:ANE are due to having assets like that installed and on a route, especially when they sneaked in as updates to an item that previously didn't have any noticeable problems in game.

The only reason I checked them was that this creator has many assets at the higher poly end of the scale.

Dinorius_Redundicus
September 18th, 2015, 08:32 AM
Is anyone brave enough to post him a note about the horrible inefficiency of his assets? Maybe he will start making them in an efficient way in future. Perhaps he's just unaware of what he's doing since 950,000 triangles doesn't even trigger a warning it seems. The system is too busy catching criminals who dare leave >500 triangles in the lowest LOD.

WindWalkr
September 18th, 2015, 08:44 AM
Is anyone brave enough to post him a note about the horrible inefficiency of his assets? Maybe he will start making them in an efficient way in future. Perhaps he's just unaware of what he's doing since 950,000 triangles doesn't even trigger a warning it seems. The system is too busy catching criminals who dare leave >500 triangles in the lowest LOD.

Well, I can always add a limit. You wouldn't like it though :)

chris

WindWalkr
September 18th, 2015, 09:06 AM
.. and done. I've added warnings at 100k polygons (for traincar assets) and 10k polygons for everything else. Personally I think that these are both a bit high, but we'll start there and take feedback once we see how they work out in practice.

chris
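
Purely as an illustration of the rule just described (this is not N3V's actual validation code; the kind names, limits table and helper function are invented for the example), the warning check amounts to something like this in Python:

# Hedged sketch: warn when an asset's summed high-LOD poly count exceeds a per-kind threshold.
WARN_LIMITS = {
    "traincar": 100_000,  # warn above 100k polygons for traincar assets
    "default": 10_000,    # warn above 10k polygons for everything else
}

def check_poly_warning(asset_kind, total_polys):
    # Fall back to the general limit for kinds without a specific threshold.
    limit = WARN_LIMITS.get(asset_kind, WARN_LIMITS["default"])
    if total_polys > limit:
        return "Warning: %d polygons exceeds the suggested %d for '%s' assets." % (total_polys, limit, asset_kind)
    return None

print(check_poly_warning("traincar", 955_212))  # the traincar from this thread would trip the warning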

zatovisualworks
September 18th, 2015, 09:06 AM
Limit? What limit? The limit is limitless... :hehe: Put no limit to 3D virtual life, so we can criticize (a favourite fashion all over the place of late) limitless dreamers who still believe in miracles. No, I wouldn't like limits either. :D

The limitless side of Alberte :wave:

andi06
September 18th, 2015, 09:16 AM
.. and done. I've added warnings at 100k polygons (for traincar assets) and 10k polygons for everything else. Personally I think that these are both a bit high, but we'll start there and take feedback once we see how they work out in practice.

chris

Here we go again. You can't possibly have thought this through in the time you've taken.

Dinorius_Redundicus
September 18th, 2015, 09:49 AM
.. and done. I've added warnings at 100k polygons (for traincar assets) and 10k polygons for everything else. Personally I think that these are both a bit high, but we'll start there and take feedback once we see how they work out in practice.

chris

Sounds like a plan. A warning informs both creators and users of a high-poly asset, but doesn't prevent its use if people really insist on it. I'm sure there will be a cool and rational discussion of the principles and the numbers involved.

martinvk
September 18th, 2015, 10:31 AM
Not sure if it is possible, but could the poly limit be tied to the size of the object? A breadbox at >10k has a problem, while a castle with those polys might not. Naturally, if all creators were conscientious and didn't use more polys than absolutely necessary, we wouldn't be having this conversation.
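
Purely to illustrate the idea (the density threshold, bounding-box helper and numbers below are invented, not anything in T:ANE), a size-aware check might look roughly like this:

# Hedged sketch: scale the poly budget by the object's bounding-box volume instead of one fixed cap.
def density_warning(poly_count, bounding_box_m):
    x, y, z = bounding_box_m
    volume = max(x * y * z, 0.001)        # avoid dividing by zero for degenerate boxes
    polys_per_cubic_metre = poly_count / volume
    return polys_per_cubic_metre > 500.0  # made-up density threshold

# A 0.5 m "breadbox" at 10k polys trips the warning; a 30 x 30 x 20 m castle at 10k does not.
print(density_warning(10_000, (0.5, 0.5, 0.5)))     # True
print(density_warning(10_000, (30.0, 30.0, 20.0)))  # False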

clam1952
September 18th, 2015, 11:03 AM
Sounds reasonable to me, as it's just a warning; it may just cause people to sit up and start thinking a bit more.

JCitron
September 18th, 2015, 11:42 AM
This is a good start in the right direction, as it may get people to think a bit more before exporting models from sources such as TurboSquid, which has nice still-object models but not ones optimized for animated environments.

Later on, would it be possible to publish something like a heat map to show us where these models are in our routes? The areas in the darkest red would be the worst offenders, while those in lighter colours are okay. The number ranges given in the warnings could set the colour table range. I'll leave that to the programmers to figure out. :)

John
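
As a rough sketch of that bucketing idea (the colour bands below are invented, simply anchored on the warning thresholds mentioned above):

# Hedged sketch: map an asset's poly count to a heat-map colour band.
BANDS = [
    (10_000, "green"),    # at or under the general warning level
    (100_000, "yellow"),  # at or under the traincar warning level
    (500_000, "orange"),
]

def heat_colour(poly_count):
    for upper, colour in BANDS:
        if poly_count <= upper:
            return colour
    return "dark red"  # the worst offenders

print(heat_colour(955_212))  # dark red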

whitepass
September 18th, 2015, 07:23 PM
The C&O Kanawha has over 100k so maybe 200k?

WindWalkr
September 18th, 2015, 08:42 PM
Not sure if it is possible but could the poly limit be tied to the size of the object? A breadbox at >10k has a problem while a castle with those polys might not.

We could do this, but it's not clear to me that it's worth it. The algorithm would be a lot more complicated and there would be more arguing and additional stupid ways to exploit it.

We don't intend truly "large" meshes to be made with the current system anyway. Once you start getting into that kind of scale, you really want to make things piecemeal and LOD each mesh separately.

chris

WindWalkr
September 18th, 2015, 08:44 PM
The C&O Kanawha has over 100k so maybe 200k?

I know it is. I'm not sure that this changes anything. If we want to raise the limits then we'd need to demonstrate that it's typically useful to go above that number. I don't think it is. I'm open to additional counter-examples.

chris

WindWalkr
September 18th, 2015, 08:45 PM
Here we go again. You can't possibly have thought this through in the time you've taken.

.. because it takes so much effort to determine that one million polygons in a single object is a bad idea.

Seriously?

chris

Dinorius_Redundicus
September 18th, 2015, 10:38 PM
Well, I can always add a limit. You wouldn't like it though :)


chris


Ah, but see I did like it!


Perhaps I'm tempting fate to throw another idea at you. From a user's perspective it would be great if assets with this particular warning were also automatically marked such that they stood out visually from the rest in CM and Surveyor lists. Even better if they could also be grouped by sorting and/or by filtering.


The benefits of giving instant visual feedback to both route-builders and content creators about high-poly assets are pretty obvious I think.

WindWalkr
September 18th, 2015, 10:41 PM
Ah, but see I did like it!

Maybe :)

I didn't actually add a limit, just a warning. And some people didn't like it anyway :)



Perhaps I'm tempting fate to throw another idea at you. From a user's perspective it would be great if assets with this particular warning were also automatically marked such that they stood out visually from the rest in CM and Surveyor lists. Even better if they could also be grouped by sorting and/or by filtering. The benefits of giving instant visual feedback to both route-builders and content creators about high-poly assets are pretty obvious I think.

This is a bit more tricky and not something I can quickly slip into the code. I would like to have a fairly comprehensive rating system for content which considers both user feedback/usage and actual performance outcomes, but we're still a while off getting something like this.

chris

andi06
September 19th, 2015, 05:00 AM
.. because it takes so much effort to determine that one million polygons in a single object is a bad idea.

Seriously?

chris
Of course I'm serious. You aren't talking about a million polys warning, you're talking about 100K/10K.

To reduce the risk of spurious warnings, did you consider:

Mesh Size
- Is 100K with decent LOD actually worse than 25K with no LOD at all?
- How do we calculate the poly load, lowest LOD ignoring culled meshes (same as the >500 warning), highest main mesh, etc?
- What about non placeable assets such as mesh libraries or procedural track components. These could be both enormous and highly efficient.
- What about small placeable assets with dozens of large attachments. These could be tiny and yet bring any system to its knees.

Who Will a Warning Help
- This will be useless for route builders, who need this information at the point of use: the Surveyor pick lists, perhaps where icon-0 used to be.
- This may also be useless for the authors, since the asset may be technically simple and error-free, in which case the warnings may never be seen.

Barely 15 minutes elapsed between your suggesting this and your adding it to the code. I would respectfully suggest that your coding finger may well have been moving without adequate intellectual control :-)

Despite the apparent negativity of my posts on these subjects, I think we are moving, albeit painfully slowly, towards a much more useful validation system and much better content creation tools. But we are not there yet. Error messages are better, but we can't get faulty content into the tools to help identify and fix the errors until we have already fixed the errors. I would suggest that it is currently totally impractical to create content for TANE in TANE, and I believe that resolving this is infinitely more important than additional validation, especially when the current validation around polycounts is both flaky and undocumented.

norfolksouthern37
September 19th, 2015, 12:08 PM
Agree 100% Andi.

Validations like this just waste time and do not help anyone. Please put some consideration into errors and warnings that goes beyond a quick thought about what YOU think is best.

It is poorly planned 'limits' that have been the source of many problems in the past. Remember the shadow polygon limit? I'll bet that sounded like a good idea to someone too - except they didn't think about how it would impact things, nor did they even know the difference between a triangle, a vertex, or an index. In the past for sure, but we all live and learn - please try to remember to learn.

WindWalkr
September 19th, 2015, 08:32 PM
Of course I'm serious. You aren't talking about a million polys warning, you're talking about 100K/10K.

The asset in question is a million polygons. I have said from the outset that 100k/10k is a starting place.



- Is 100K with decent LOD actually worse than 25K with no LOD at all?

Of course not.



- How do we calculate the poly load, lowest LOD ignoring culled meshes (same as the >500 warning), highest main mesh, etc?

Sum of the active high-LOD meshes.



- What about non placeable assets such as mesh libraries or procedural track components. These could be both enormous and highly efficient.

As with the LOD warnings, inactive meshes are exempted.



- What about small placeable assets with dozens of large attachments. These could be tiny and yet bring any system to its knees.

They could.



- This will be useless for route builders who need this information at the point of use

It will.



- This may also be useless for the authors, since the asset may be technically simple and error-free, in which case the warnings may never be seen.

Being error-free does not suppress warnings. If the creator chooses to simply ignore warnings unless there is an error, then there's not much I can do short of making it an error, which I'd prefer not to do until we're sure that it's nearly 100% robust (as opposed to a warning, where we can get away with some "false" positives.)



Barely 15 minutes elapsed between your suggesting this and your adding it to the code. I would respectfully suggest that your coding finger may well have been moving without adequate intellectual control :-)

Suggest what you like :) I'm not the kind of person to mull things over once I have the data. If we find some data which demonstrates that additional changes are required, then we can review the changes in that light. The easiest (perhaps ONLY) way to find that data is to test it.



chris

WindWalkr
September 19th, 2015, 08:36 PM
Remember the shadow polygon limit? I'll bet that sounded like a good idea to someone too

It solved the performance problem that we were having in that area. Just because you don't like some of the outcomes doesn't mean it wasn't a net win for the game.



nor did they even know the difference between a triangle, a vertex, or an index.

You like to keep repeating this, even though you're clearly wrong.


chris

norfolksouthern37
September 19th, 2015, 09:31 PM
I was not clearly wrong. You set a limit on polygons, that was fine, but you set the same number limit on face indices - the two are incompatible.

It was stated that the limit was 4000 vertices or 4000 polygons or 4000 indices in the code.



This was obviously just some number picked out of the air. The limit raised much discussion among creators when it was first implemented. Nobody seemed to notice, though, that there is no way to reach 4000 polygons with a 4000-index limit. 1333 becomes the upper number for polygons, because you need 3999 indices to index 1333 polygons (3 indices per triangle). Any more and you run over that first limit, so the limit we were first told of, 4000 polygons, was incorrect.
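
For anyone who wants that arithmetic spelled out, here it is as a small illustrative snippet (the names are made up; the numbers are the ones under discussion):

# Hedged sketch: how a shared 4000-index cap bounds the triangle count.
INDEX_LIMIT = 4000
INDICES_PER_TRIANGLE = 3  # one index per corner of each triangle

max_triangles = INDEX_LIMIT // INDICES_PER_TRIANGLE
print(max_triangles)                               # 1333
print(max_triangles * INDICES_PER_TRIANGLE)        # 3999 indices - still under the cap
print((max_triangles + 1) * INDICES_PER_TRIANGLE)  # 4002 indices - over the cap, so 4000 polys is unreachable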

The limit was supposedly fixed in SP1 or a hotfix or whatever, but by that time it didn't matter, because for creators trying to create compatible content (since most people refused to use the service pack or hotfixes) it had become a real thorn in the side. This is what I mean about you not doing any practical testing of such things.

The point is, it wasn't at all thought out; just like the above, it was added because 'it was a good idea' to someone. It may have solved some problem, but it was implemented and stated incorrectly. I just want to be sure we don't run into those kinds of problems again, that is all.

WindWalkr
September 19th, 2015, 09:40 PM
I was not clearly wrong. You set a limit on polygons, that was fine, but you set the same number limit on face indices - the two are incompatible.

Firstly, they're separate entities, and despite "common sense" are not necessarily tied together. I fully agree that they *should* be tied together.

Secondly, you jump from the fact that a "redundant" check was made (harmless even if you were correct) to an "oh my god, the person implementing it obviously knows nothing" which is clearly not a logical jump.

It would be like me spotting a minor inefficiency in one of your models, and jumping to the conclusion that you can't use Max, and then going on your forum every few months to loudly proclaim (with no context) that you don't know how to use Max.



it was stated that the limit was 4000 vertices or 4000 polygons or 4000 vertices in the code.

Now who's making redundant statements? :)



The point is, it wasn't at all thought out; just like the above, it was just added because 'it was a good idea' to someone. It may have solved some problem, but it was implemented and stated incorrectly.

It was implemented and stated correctly. You just didn't like it.

I agree that it would have been good to have fixed it 6-12 months before release so that everybody could have time to test the results and improve any necessary assets, but when we find a major performance bug in a series of release-critical assets shortly before a major release, we don't have that kind of luxury.

chris

norfolksouthern37
September 19th, 2015, 09:48 PM
It was implemented and stated correctly. You just didn't like it.

You are welcome to make that assumption, but I agreed that there should be no shadow models that were simply copies of their base asset with thousands of useless polygons, and I also agreed with there being a limit. I did not agree with you telling us over and over that it was 4000 polygons when that was clearly not possible, because the same number was imposed on indices. I was and still am correct on this. It simply is not possible, and I explained it fully above and long ago.

Now, the reason I bring this up from time to time is that it is a great example of how N3V fails to listen to those who actually use their software - not because I think whoever did it was incompetent, just that for some reason they did not see the correlation between polygons, indices and the supposed limit. According to yourself it was intended to help and was a very good idea, and except for that technical problem, stemming from not having been tested, it would have worked. We want TANE to eventually work, so we don't need any hastily applied and untested 'good ideas' getting in the way. That is all, carry on.

WindWalkr
September 19th, 2015, 10:37 PM
You are welcome to make that assumption, but I agreed that there should be no shadow models that were simply copies of their base asset with thousands of useless polygons, and I also agreed with there being a limit. I did not agree with you telling us over and over that it was 4000 polygons when that was clearly not possible, because the same number was imposed on indices. I was and still am correct on this. It simply is not possible, and I explained it fully above and long ago.

Mate, there are several independent limits. I gave you the full list. You are quite right that some of those limits are redundant under correct usage, nobody is disputing that. The problem is when you go on to claim that we're full of it because you don't like the redundancies.

chris

norfolksouthern37
September 19th, 2015, 11:02 PM
I don't get what you mean by redundant. Those limits are not independent; you can't have one without the other, and you even admitted this was a mistake and it was fixed after the fact. Mate, I'm not gonna go back and forth. You corrected it; thing is, I don't want us to have to go through it all again with TANE and not have things right the first time, having to wait for a few service packs before you decide that those using the software might actually be onto something... mate.

WindWalkr
September 19th, 2015, 11:15 PM
I don't get what you mean by redundant. Those limits are not independent; you can't have one without the other

Redundant meaning that these are several individual items that are stored independently of each other, are checked independently, but which under correct usage are mathematically linked in a way that means checking one limit is sufficient.

You and I can both hope that the assets are all correctly formed, and certainly the default limit checks should have assumed that as the common case from the start, but it would be a bit silly for us to overlook the fact that they CAN be different and fail to check that.



Mate, I'm not gonna go back and forth

And yet here we are several years later and you're still bringing it up and claiming that it means we know nothing.

chris

norfolksouthern37
September 20th, 2015, 12:03 AM
you're still bringing it up and claiming that it means we know nothing.

That is merely you taking it personally. I have explained why it is important that this is considered. I do feel silly that it is still brought up after so long, but it is even sillier that you don't recognize that a problem existed. Even if those items could be different, one was insufficient to give (or even make possible) your stated limit of 4000 polygons for all cases of correctly formed assets. There is no way around that fact even if you did intend to check them independently - you still can't make 4000 polygons with 4000 indices, so you can't very well tell us there is a 4000 poly limit, can you? No, you can't, in any version or system or format or correctly or incorrectly formed asset. Just avoid that kind of thing. :)

WindWalkr
September 20th, 2015, 01:31 AM
Even if those items could be different, one was insufficient to give (or even make possible) your stated limit of 4000 polygons for all cases of correctly formed assets.

No, I completely recognise and agree with that. The point is that these limits are applied separately (due to the possibility of incorrectly formed assets) and I'm simply stating what the limits are. I agree that this is confusing, but it's an accurate statement of the individual limits. As I stated above, these checks were added at the last minute and we cared only that they solved the problem.



There is no way around that fact even if you did intend to check them independently - you still can't make 4000 polygons with 4000 indices, so you can't very well tell us there is a 4000 poly limit, can you?

Counter-intuitively, the numbers of indices, vertices, and faces are not strongly tied. This is obvious for indices and vertices (since a vertex can be re-used), but what is less obvious is that the faces are an independent data structure. The face count SHOULD always equal 1/3rd of the number of indices for a triangle mesh; however, that assumption is not made in these checks because (1) we don't know that the asset is correctly formed, and (2) this code is in Jet, and Jet supports meshes other than triangle meshes.

Oh, I completely agree that this is semantics. My point is simply that the original claim was an accurate portrayal of the checks being made. You can argue that the checks are redundant- you'd be correct for a well-formed Trainz asset, and it's quite possible that a poorly-formed Trainz asset would be rejected elsewhere before reaching this code, although it's difficult to guarantee that.

I could have just given the practical maximum number of triangles and left it at that, and perhaps that would have avoided confusion, but that isn't what the checks are actually doing, and I tend to prefer giving out the full information. All we need is one person who somehow manages to get some unexpected extra vertices into their IM file, and we'd have complaints that the checks aren't doing what we claimed - and they'd have been right, if we'd made a claim predicated solely on estimated maximum triangle count.

chris
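
As a purely hypothetical illustration of what "checked independently" means here (the function, field names and structure are invented; this is not the actual Jet code), the checks can be pictured like this:

# Hedged sketch: each count is limited on its own; nothing assumes faces == indices / 3,
# because a malformed (or non-triangle) mesh can break that relationship.
LIMIT = 4000

def check_mesh(num_vertices, num_indices, num_faces):
    problems = []
    if num_vertices > LIMIT:
        problems.append("too many vertices")
    if num_indices > LIMIT:
        problems.append("too many indices")
    if num_faces > LIMIT:
        problems.append("too many faces")
    return problems

# For a well-formed triangle mesh the index check dominates:
# 1334 triangles need 4002 indices, so the face limit of 4000 can never be the binding one.
print(check_mesh(num_vertices=900, num_indices=4002, num_faces=1334))  # ['too many indices']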

norfolksouthern37
September 20th, 2015, 02:05 AM
I could have just given the practical maximum number of triangles

You did precisely that. Never mind that Jet could handle something other than triangle meshes; why would you count on the extreme and very unlikely case rather than what happens 99% of the time in practical use?

You stated the poly limit as 4000.

I stated that was impossible; any model submitted with over 1333 polygons was rejected.

You stated that you were sure it was 4000, but also that it was 4000 vertices or 4000 indices, whichever was greater.

I state: there is your problem. In the mesh file's simplest form it takes 3999 indices to form 1333 polygons in a properly formed .im file under any normal circumstance. The indices limit would be triggered in 99% of tests; therefore, your statement about there being a 4000 poly limit to the mesh, before divulging what the other two triggers were, was FALSE when put to practical use.

Now, that notwithstanding, the concern here is: don't throw arbitrary limits into TANE - and it is mentioned only as a concern. Also, do not discount concerns of practical use, and try to factor those into future 'good idea' limits.

I'm done - thanks.

WindWalkr
September 20th, 2015, 03:22 AM
You did precisely that.

I also quickly corrected myself and gave the full list.



Never mind that Jet could handle something other than triangle meshes; why would you count on the extreme and very unlikely case rather than what happens 99% of the time in practical use?

Because that's my job. If we forget the 1% cases, then things break horribly.




I state: there is your problem. In the mesh file's simplest form it takes 3999 indices to form 1333 polygons in a properly formed .im file under any normal circumstance. The indices limit would be triggered in 99% of tests; therefore, your statement about there being a 4000 poly limit to the mesh, before divulging what the other two triggers were, was FALSE when put to practical use.

Sure, in 99% (and more) of cases, I would agree with you. My point is that saying "it's impossible because it's not normal" doesn't make sense.



Now, that notwithstanding, the concern here is: don't throw arbitrary limits into TANE - and it is mentioned only as a concern.

Pretty much all performance limits are arbitrary. If you think that this warning is set too low, or too high, I'm open to feedback, as I've said from the start. So far I've heard very little in the way of constructive feedback, and that's pretty much expected until people have had a chance to try it for themselves. As I said at the beginning, we're not really going to know whether the limit is good until we see how many "false positives" and "false negatives" there are. It's entirely possible for there to be both, so we'll just have to test it and see.

I would like the set point to be low enough that we're just noticing it on the heaviest of existing "normal" assets.

chris

whitepass
September 20th, 2015, 11:17 AM
There are a lot of users who try to have content with no Errors or Warnings, and they will not be happy with a Warning that the user can't fix.

norfolksouthern37
September 20th, 2015, 11:35 AM
This is why we can't get anywhere.

martinvk
September 20th, 2015, 06:32 PM
Instead of some useless comments, why not, as the man said, try it in real situations and see what happens? The warning might be too high, too low or just right, but until you test, everything is just idle speculation.