Why use High Poly - SketchUp ... instead of learning GMax & Blender

One of the ways of identifying high-poly areas in a route is to walk the route. By walking I mean using Alt-Y to get down to track level and walk along quickly. If the camera stutters, there's a problem in that area, and that means removing objects. Using this method, I've been able to identify a lot of "bad" objects and even bad track, which I replaced with other, more usable track - though not necessarily what I wanted to use in the first place.

I know this is after the fact, but since we have no other way to know short of inspecting each model before using it, this method works.

John
 
I must admit that I have only read part of this thread, but I still have some things to say on this matter.

First of all, do not blame or banish the tool. I am considering moving the creation of simple buildings from Blender to SketchUp since the process is so much faster. SketchUp and the Trainz exporter can be used to create decent low-poly models if you know what you are doing. What we need is a good indication of efficient models on the DLS and in Surveyor, regardless of which tool they were created with.

There seem to be a few basic rules to follow when modelling in SketchUp to avoid inefficient models:

1. Use the tool for things it's good at, i.e., mostly buildings.
2. Create separate groups for intersecting objects unless you want them welded.
3. When mapping textures, do not use the free-pin mode, as it will create an extra texture.
4. Map everything onto a single texture unless you plan to share textures among assets in a mesh library. (Sharing textures will need some XML editing for now.)

I and others have fought over this, and it comes down to a trade-off of polys vs. textures. Traditionally we have been creating one building using one texture, say 1024x1024 pixels, then the same for the next building, and such buildings end up somewhere between 22 polys for a very basic building and say 1000 polys for a station with lots of 3D detail. In SketchUp it's quite easy to use flat components for doors and windows, i.e., splitting up the walls and mapping those windows on a separate texture from the tiling texture used on the wall. Using that method the building might end up with 500 - 2k polys, a 1024x1024 texture for the details, and a few 512x512 textures for tiling parts like walls, roof and so on. But those textures can then be shared among a large number of buildings using the same mesh library.
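As a rough back-of-the-envelope check on the texture side of this trade-off, here is a small sketch. The building count, texture sizes, and 4-bytes-per-pixel figure are illustrative assumptions, not measured Trainz numbers:

```python
# Illustrative comparison of per-building textures vs. a shared texture library.
# All figures are assumptions for the sake of the example.

def texture_bytes(size, bpp=4):
    """Uncompressed RGBA memory for a square texture of the given edge size."""
    return size * size * bpp

# Traditional approach: each of 20 buildings carries its own 1024x1024 sheet.
traditional = 20 * texture_bytes(1024)

# Shared-library approach: one 1024x1024 detail sheet plus four 512x512
# tiling textures, shared by all 20 buildings.
shared = texture_bytes(1024) + 4 * texture_bytes(512)

print(traditional // 2**20, "MiB vs", shared // 2**20, "MiB")  # 80 MiB vs 8 MiB
```

The exact numbers matter less than the shape of the result: sharing tiling textures amortises the texture cost across the whole library, at the price of a higher poly count per building.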

Positive and negative points with this, as I see it:

+ Fast creation of buildings; you do not have to stick with generic buildings for your route. It literally takes less than 30 minutes to create a textured building that would take a day to create the traditional way with Blender and GIMP. Of course, it will take a few days to set up the object and texture libraries.
+ Fewer textures if you have several buildings sharing them.
+ Higher texture resolution.
+- Buildings will look more similar since they share textures.
- Higher poly count.
- Generic textures, i.e., no shadows under roofs and so on.

If this is believed to be a good approach, it would be really nice if the exporter exported the textures with consistent texture names between models, i.e., based on the filename and texture name instead of a running texture ID. For now you have to hand-edit the XML and regenerate the IM files.

Further, I believe it would be good if someone wrote a simple tutorial on creating good buildings in SketchUp.
 
I wasn't really trying to add more to your plate Peter! Besides, I wasn't suggesting a program to "get rid of polys", only one that could analyze the Trainz database and identify which kuids were abnormally heavy in polycount.

It would be quite easy to gauge the size of the asset components if N3V would let us open things on their server. There's nothing mysterious about opening a CDP file and finding out how big the mesh file is. However, I can't see N3V ever letting this happen.

In any case I can add a file size gadget to my ArchiveIndex tool (supplied with AssetX). Mesh poly count is roughly proportional to mesh file size, so we can estimate the poly count.
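A minimal sketch of that kind of estimate, assuming a purely hypothetical bytes-per-polygon calibration constant; a real tool would fit the constant against assets with known poly counts:

```python
# Hedged sketch: treat mesh poly count as roughly proportional to mesh
# file size. BYTES_PER_POLY is an invented calibration constant, standing
# in for per-triangle vertex, index, and normal data.
BYTES_PER_POLY = 60

def estimate_polys(mesh_file_bytes):
    """Very rough poly-count estimate from an .im mesh file's size in bytes."""
    return mesh_file_bytes // BYTES_PER_POLY

print(estimate_polys(120_000))  # a 120 KB mesh file -> ~2000 polys
```

Even a crude estimate like this would be enough to flag outliers in an archive index without opening each mesh.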
 
It would be quite easy to gauge the size of the asset components if N3V would let us open things on their server. There's nothing mysterious about opening a CDP file and finding out how big the mesh file is. However, I can't see N3V ever letting this happen.

In any case I can add a file size gadget to my ArchiveIndex tool (supplied with AssetX). Mesh poly count is roughly proportional to mesh file size, so we can estimate the poly count.

Is there a practical way to count materials used in an asset?

A tool that would scan your downloaded assets, and perhaps add some keywords or ratings to them on the basis of poly count, material count, animation and/or LOD would be a great help in selecting the most efficient items for route-building.

Paul
 
We already have the "View technical details" option in CM, which provides data such as poly count, number of textures, etc. I think it should not be impossible to have CM calculate an "FPS-impact-index" tag value and add it to the config when a CDP is made.
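Purely as an illustration, such a tag might look like this in a config.txt. The tag name and value are invented here; no such tag exists in Trainz today:

```
username            "Small brick house"
kind                "scenery"
fps-impact-index    2600
```

CM already computes the underlying numbers for "View technical details", so writing one extra derived tag at CDP-creation time seems plausible.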

I recall that Konni, I, and some others asked N3V (Auran, at the time) to add such a feature well before any SketchUp model arrived on the DLS, but we got no answer.
 
If this is believed to be a good approach, it would be really nice if the exporter exported the textures with consistent texture names between models, i.e., based on the filename and texture name instead of a running texture ID. For now you have to hand-edit the XML and regenerate the IM files.

I hear what you are saying. The reason for the 'running ID', as you put it, is because the Ruby interpreter inside of SketchUp is incapable of handling file names that contain Unicode characters. This often occurs in models created by those outside the US, as their language naturally uses such characters. Therefore I came up with this compromise in order to get around the limitations of the programming environment. Also, tracking texture names between models would be (I think) prohibitive (at least from within SketchUp) as it would involve a great deal of searching of the users' hard drive, or coming up with some mechanism to track models.

If the learned folk here can supply me with some solid numbers about the poly-equivalency for a texture, then I can certainly update the summary dialog in my exporter to total up this extra load and add it to the grand total polygon count. I could perhaps also push the polygon count into the 'description' portion of the config.txt file if there is a general consensus that doing so would not be considered too heavy-handed on my part. Note that my exporter does include a warning as a model passes 25,000 polygons during export; the user must consciously decide to continue the export after the message is issued, having been warned that the model may cause Trainz to exhibit poor performance.

Regards,
-Mike
 
If the learned folk here can supply me with some solid numbers about the poly-equivalency for a texture, then I can certainly update the summary dialog in my exporter to total up this extra load and add it to the grand total polygon count.

According to WindWalkr, every texture after the first one is equal to 300 polys.
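Taking that figure at face value, the effective load could be sketched like this; the 300 figure is the rough equivalence quoted above, not an exact measurement:

```python
# Rule-of-thumb effective load: each texture after the first counts as
# roughly 300 polygons' worth of overhead (approximation, not a GPU fact).
def effective_polys(polys, textures):
    return polys + max(0, textures - 1) * 300

print(effective_polys(500, 3))  # 500 polys + 2 extra textures -> 1100
```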

I could perhaps also push the polygon count into the 'description' portion of the config.txt file if there is a general consensus that doing so would not be considered too heavy-handed on my part.

That would be nice indeed; at least this will provide some advice about the impact of the object.

I would like to add that I think that your RubyTMIX is a great and valuable piece of software; unfortunately it is often used to bring into Trainz objects that are absolutely unsuited for it (like the 30k poly lampposts).
 
One of the ways of identifying high-poly areas in a route is to walk the route. By walking I mean using Alt-Y to get down to track level and walk along quickly. If the camera stutters, there's a problem in that area, and that means removing objects. Using this method, I've been able to identify a lot of "bad" objects and even bad track, which I replaced with other, more usable track - though not necessarily what I wanted to use in the first place.

I know this is after the fact, but since we have no other way to know short of inspecting each model before using it, this method works.

John

Currently the Developer Stats in Surveyor are able to identify 'bad spline', usually track, in TS10 and 12. Could these stats be augmented to identify the highest poly item in the view? A bit quicker than having to walk the route, though I agree that's a very good way to see if a route really 'works' on your PC.

Paul
 
According to WindWalkr, every texture after the first one is equal to 300 polys.

Textures never equal polygons - maybe in terms of equivalent system load, but they should not count as polygons. Even then, I am certain that figure is very general; obviously a 128x128 texture is not the same as a 4096x4096 texture.
 
Textures never equal polygons - maybe in terms of equivalent system load, but they should not count as polygons. Even then, I am certain that figure is very general; obviously a 128x128 texture is not the same as a 4096x4096 texture.

As I recall, each mesh carries an overhead of 300 poly equivalents and each texture one of 200 poly equivalents; this is in addition to the size of the texture file or mesh file.

Cheerio John
 
I still think it is ridiculous to look at the overhead as a poly equivalent in any case.

It seems to me that it is something you can't nail down to any specific equivalent, especially when comparing it to a triangle. Every GPU will handle things differently, and apples do not equal oranges :) . But if you have a certain number of triangles, those are always the same number of triangles, so poly count is poly count regardless of anything else. Again, different GPUs will be able to draw more or fewer triangles than others at any one time, but materials, textures, or anything else should not add to the poly count of a mesh for the purposes of reporting the poly count in a utility.
 
OH MY GOSH, I cannot believe this thread is still alive and 13 pages long, do people ever shut their mouths?
 
I still think it is ridiculous to look at the overhead as a poly equivalent in any case.

It seems to me that it is something you can't nail down to any specific equivalent, especially when comparing it to a triangle. Every GPU will handle things differently, and apples do not equal oranges :) . But if you have a certain number of triangles, those are always the same number of triangles, so poly count is poly count regardless of anything else. Again, different GPUs will be able to draw more or fewer triangles than others at any one time, but materials, textures, or anything else should not add to the poly count of a mesh for the purposes of reporting the poly count in a utility.

Trainz has to load in the files and keep track of them, and that's where the overhead comes from, not the GPU; it's simply expressed as a poly-equivalent load.

Cheerio John
 
Trainz has to load in the files and keep track of them, and that's where the overhead comes from, not the GPU; it's simply expressed as a poly-equivalent load.

Cheerio John

Thanks, John. The next release of RubyTMIX will place loading information in the 'description' text inside the config.txt file, using the 300 polys-overhead-per-texture figure (after the first). It may not be perfect but it will be more information than has been available so far, and will hopefully help users avoid 'sticker shock' before an overly-heavy asset is placed in a route.

Regards,
-Mike
 
Currently the Developer Stats in Surveyor are able to identify 'bad spline', usually track, in TS10 and 12. Could these stats be augmented to identify the highest poly item in the view? A bit quicker than having to walk the route, though I agree that's a very good way to see if a route really 'works' on your PC.

Paul

There is an added advantage too, Paul. When walking the route, I also note oddities like floating roads, buildings, crooked things, bad textures, etc., which I jot down on scrap paper and then go back in and repair.

I have found the statistics aren't really that useful for me. On my route, I use very few spline objects other than roads, power lines, and telegraph poles. There are occasional fences, but there aren't enough of them to impact the running, and after removing them I saw no difference in performance. The big culprits that these stats picked up were the things we need for Trainz - i.e., track. This didn't matter whichever track I used, whether it was the JR track included with TS12 or some of the older chunky-mesh ones.

I did have a performance issue caused by a video card setting a few months ago. One particular section of my route developed wicked stutters whose cause I could not find. I replaced track, deleted buildings, etc., but nothing would stop that section from becoming a slide show. Just as I was getting ready to cut baseboards, it dawned on me that I had fiddled with my video card settings: I had changed my AA from standard to edge-detect, and that little pull-down box made a big difference in Trainz performance. That setting may work for other applications, but Trainz did not like it.

John
 
Thanks, John. The next release of RubyTMIX will place loading information in the 'description' text inside the config.txt file, using the 300 polys-overhead-per-texture figure (after the first). It may not be perfect but it will be more information than has been available so far, and will hopefully help users avoid 'sticker shock' before an overly-heavy asset is placed in a route.

Regards,
-Mike


It still doesn't add any polygons; that is what I was trying to get across. You could do a total poly count and a material count, I suppose, and if you are able to, maybe the size of the texture files used in megabytes or something - not poly equivalents...
 
It still doesn't add any polygons; that is what I was trying to get across. You could do a total poly count and a material count, I suppose, and if you are able to, maybe the size of the texture files used in megabytes or something - not poly equivalents...

True, but most users won't know whether an object with 300 polys and ten 8x8 textures is more efficient than an object with 2000 polys and one 1024x1024 texture. Using the 300-polys approximation, we at least have a rule of thumb. The other challenge in working out an indication of load is the use of LOD. Plus, the file size of the textures is no help, as a 1024x1024 .jpg and a 1024x1024 .bmp will have the same load in-game but may have wildly different raw file sizes.
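Applying that rule of thumb to the two example assets above makes the comparison concrete:

```python
# Effective load of the two example assets, using the rough
# 300-polys-per-extra-texture equivalence quoted earlier in the thread.
asset_a = 300 + (10 - 1) * 300   # 300 polys, ten 8x8 textures
asset_b = 2000 + (1 - 1) * 300   # 2000 polys, one 1024x1024 texture
print(asset_a, asset_b)          # 3000 2000: asset B is cheaper by this metric
```

By this measure the "low-poly" asset with ten textures is actually the heavier of the two, which is exactly the kind of counter-intuitive result a published index would make visible.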

Anyway - we need some kind of indication, and it needs to be available pre-download, AND in Surveyor.

Paul
 
We've been complaining about the high-poly models, but has anyone else noticed that many of the simplified models with double-sided textures actually cause more performance issues than the high-poly objects?

I saw this recently on my own route. I had used an old building which fit the spot, but the stuttering was awful when I approached it. I removed the building and the performance picked up. As I was poking around the area, I went "inside" the building and noticed the same texture on the inside of the asset as well as on the outside.

Something to think about...

John
 
That's an eye-opener, John! Perhaps we should just have a thread for troublesome assets where the KUID could be brought to the attention of others - whether high-poly, or just an asset that affects performance badly.
 
Sloppy object creation is sloppy no matter what program was used to make it. Good tools make the creation easier and let you do more, IF you know how to use them.
 