End of Support for Anything below TS12

It's all to do with the improved error checking. Content made for unsupported versions simply won't be accepted under the current error-checking requirements, and meeting some of those requirements means the older versions can no longer use the content. Older versions of assets can still be downloaded in most cases, but only via the black pages or the 'Download This Version' command in TS12 SP1's Content Manager.

Shane
Shane, let Zec reply with the company mantra. For Pete's sake, think! Increased error checking isn't what has gone on--it's a CON. SPIN! And much of it (or at least the error-message generation) wouldn't be necessary if they took the elementary, simple step of writing a translation pre-stage into their input process, before the validation stage. Part of pre-processing, in other words.

These callous and, in effect, incompetent programmers have been spinning that yarn, and this whole community has let them get away with it since 2009. STOP IT! You were played. If TC3, TC1&2, and the TRSs from 2.0-2.6 could find a texture in a subfolder, why couldn't TS09, TS10, TS12? Can you reference up and down in a texture.txt the way the DOS/Windows file system allows? No? Why not? Primary=..\texture.tga should be a legal value, just as Primary=subfolder\texture.tga is, as should Primary=Other_subfolder\texture.tga, etc. But the first bombs, because they didn't handle the obvious case. Good programming? Good handling of a predictable need? Nope. Lazy, short-sighted, uncaring--all adjectives that fit better than 'good'.

Was the nightmode sub-folder or _body sub-folder written by a secret Rail-Sim spy nefariously trying to get Trainz confused by hacking the DLS to change that legacy asset? Or was it--since it's part of the CDP uploaded by the same content creator--the very texture the creator meant, the texture those older flavors of Trainz had no trouble finding? (e.g. Trainz 1.0 through TRS2004!)

Was my hand-adding a bogeys container and relying on the defaults any different from a computer translating bogey to '0'..., bogey-1 to '1'..., bogey-2 to '2'...? Is there some reason Windwalkr and Moody can't take a thumbnail keyword, write an if-then-else to test for an '_art' subfolder, and, if present, create a thumbnails container with the appropriate sizes? GIVE ME A BREAK. Sure, the data model changed. They know exactly why and how (it is to be hoped), so why didn't they, don't they, handle the evolution? Declare that new assets with newer trainz-builds need to handle the newer data model--fine, no problem with that philosophically. But to arbitrarily also say legacy keywords are verboten? What harm is there in type and region and asset-filename in newer assets? Ignore the line and go on with your parsing!
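To make the point concrete, here is a minimal sketch (in C, since that's what they work in) of the kind of legacy-tag translation table I mean. The function name and table are mine, and the 'bogeys/0' shorthand for a container slot is invented notation--not N3V's actual internals:

```c
#include <string.h>

/* Hypothetical mapping of obsolete config tags to their modern forms.
 * A NULL replacement means "ignore the line, treat it as a comment". */
static const struct { const char *old_tag, *new_tag; } legacy_map[] = {
    { "bogey",          "bogeys/0" },   /* old single tag -> slot 0 of a bogeys container */
    { "bogey-1",        "bogeys/1" },
    { "bogey-2",        "bogeys/2" },
    { "thumbnail",      "thumbnails" }, /* old tag -> thumbnails container */
    { "type",           NULL },         /* legacy keyword: skip, don't fault */
    { "region",         NULL },
    { "asset-filename", NULL },
};

/* Returns the modern tag, "" if the line should be silently ignored,
 * or the input unchanged if no translation is needed. */
const char *translate_tag(const char *tag)
{
    size_t i;
    for (i = 0; i < sizeof legacy_map / sizeof legacy_map[0]; i++)
        if (strcmp(tag, legacy_map[i].old_tag) == 0)
            return legacy_map[i].new_tag ? legacy_map[i].new_tag : "";
    return tag;  /* already current: pass through untouched */
}
```

A dozen entries like that, and every "unknown legacy tag" fault disappears.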

How does a formerly legal keyword suddenly become an error? Bleak! This is criminally callous indifference to the body of co-operative work already in place on the DLS. Take something nearly as silly: why is punctuation such as '=' and ';' in a Trainz asset name worth excluding, if everything has to be in a quoted string anyway? Windows takes both in a folder or filename. You really want to stand there with a straight face and tell me Moody hasn't the capacity as a programmer to handle a quoted string? Well, perhaps. He hasn't figured out how to trim trailing whitespace in a quoted string, so maybe they just need to learn to parse strings a bit smarter. Or just care enough to exercise their brains in ways that won't adversely affect asset parameters that are known and occur on occasion.
Code:
if (!iswhite(*current_char))
    zlastblk = current_char;   /* remember the last non-blank character seen */

... OKAY, maybe that's too tough a line, but if they set that pointer, and the value wasn't a quote, nor legal in the keyword type, and the pointer didn't match the end of line, just maybe they could figure out how to eliminate the whitespace between that zlastblk pointer and the newline code, don't you think?
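For the record, here's roughly what I mean as a self-contained sketch. The function name is mine, and a real parser would work on the raw buffer rather than a copied string:

```c
#include <ctype.h>

/* Trim trailing whitespace from a quoted value, in place.
 * A sketch only: assumes the value has already been extracted
 * without its surrounding quotes. */
void trim_trailing(char *value)
{
    char *last = 0;   /* the "zlastblk" idea: last non-blank seen */
    char *p;
    for (p = value; *p; p++)
        if (!isspace((unsigned char)*p))
            last = p;
    if (last)
        *(last + 1) = '\0';   /* cut everything after the last non-blank */
    else
        *value = '\0';        /* the value was all whitespace */
}
```

One pass, one pointer. That's the entire "impossible" trailing-whitespace problem.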

We're talking a trivial amount of text parsing here, an amount dwarfed by a typical internet page display such as viewing this--all of which happens in less than the blink of an eye. Error checking that is needless isn't error checking; it's erroneous programming, and these guys owe this whole community an endless apology, and a debt to those who've stayed with the software anyway.

Tony Hilliam and Grahm Edelman need to knock some heads together. This had better be handled better in TANE! // Frank
 
wow, you've got it all figured out. So instead of fixing the config file, just add more layers of code to catch and paper over the errors.
 
wow, you've got it all figured out. So instead of fixing the config file, just add more layers of code to catch and paper over the errors.
Well, yes, to an extent, for I've been studying it all year--why else would a crazy Polack be installing and running each and every version of each code build available to me? AND no, Martin, not the config... that's the problem. THEY don't fix the config; WE DO.

The false faults I'm talking about can be handled by simple translation of old model data (pre-TRS2004, when they added bogeys {}, thumbnails {}, etc. to the config and, later, texture.txt files) with appropriate mappings and a short subroutine for 'handling' each old tag. If you doubt it, consider all the v1.3-v2.6 assets that work fine in TS12--they are translated AFTER saving, on input to Railyard, Surveyor, Driver. I'm pointing out they should be translated first and then saved in an optimized form, which at run-time loading would give large speed/performance benefits and be more robust. More robust because if an asset can't be translated on commit, it can't be loaded--which takes away the excuse that they are error checking tighter and tighter. THAT measure is as tight as you can possibly get!

Moreover, if they did so, then instead of a chump file with compressed data forms that still must be read in and then put away, they could save the committed file as a binary: an object file with a 'what do I do, and how to treat me' header, which they could suck into a memory block much faster (a struct or union in C/C#) whose positions are known--the structures their application needs, direct transplants, so to speak. Oh, there are variable-length objects, all the textures and text-string elements, but they already know how to handle those; the struct or union is the handle and contains the pointers to such in general memory. They've got to handle the data anyway; there is no good reason to do it inefficiently and slowly! BTW, Windwalkr agreed with that assessment last July--but TANE was already under development, wasn't it? Hopefully he gave it some attention and will surprise us with a more modern approach.
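To sketch what such a 'what do I do, how to treat me' header might look like--every field name and number here is invented for illustration; the real chump layout is N3V's business:

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical fixed-size header for a committed binary asset file.
 * All fields fixed-width, so the whole block reads straight into memory. */
struct asset_header {
    char     magic[4];        /* e.g. "TZB1": identifies the file format   */
    uint32_t trainz_build;    /* data-model version, e.g. 370 for v3.7     */
    uint32_t struct_count;    /* number of fixed-size records that follow  */
    uint32_t blob_offset;     /* file offset of the variable-length data   */
    uint32_t blob_size;       /* textures, strings, and other blobs        */
};

/* Fill in a header for n fixed-size records plus a variable-length tail.
 * Assumes (for the sketch) 64-byte records. */
struct asset_header make_header(uint32_t build, uint32_t n, uint32_t blob)
{
    struct asset_header h;
    memcpy(h.magic, "TZB1", 4);
    h.trainz_build = build;
    h.struct_count = n;
    h.blob_offset  = (uint32_t)sizeof h + n * 64;
    h.blob_size    = blob;
    return h;
}
```

Read the header, fread the records into the struct array, fix up the blob pointers, done: load-'n'-go.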

  • (Feel free to look up any of these terms in Wikipedia, if unfamiliar, I'm feeling too lazy to link tonight! Sorry.)
Consider misspellings, another class of faults they could easily have 'coded around' (handled) with a heuristic algorithm: simply add up the characters' values (a cheap hash), compare that against a lookup table of legal keywords and their hash values, and test whether such a keyword was already filled or remained empty.

They should already have the lookup table, and be tokenizing against it for all keywords. Cutting the count in that table is the one slim rationale that makes some logical sense, but instead of testing against it and returning a token saying 'this is a load1 product-kuid', they could simply return an 'ignore this line and treat it as a comment' code. We beat up Windwalkr pretty badly last July over eliminating comments (REM) in the config--where he agreed that pre-processing was a feasible approach.
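Here's a sketch of that additive-hash keyword lookup. The token names and table contents are illustrative guesses, not N3V's actual internals; note the hash is only a cheap pre-filter, with strcmp confirming the match:

```c
#include <string.h>

enum token { TOK_UNKNOWN = -1, TOK_COMMENT = 0,
             TOK_KUID, TOK_MESH_TABLE, TOK_TRAINZ_BUILD };

/* Trivial additive hash: cheap to compute, and close enough that a
 * "nearest hash" comparison could later flag one-character typos. */
static unsigned add_hash(const char *s)
{
    unsigned h = 0;
    while (*s) h += (unsigned char)*s++;
    return h;
}

/* Hypothetical keyword table; a legacy word maps to TOK_COMMENT,
 * i.e. "ignore this line and treat it as a comment". */
static const struct { const char *word; enum token tok; } keywords[] = {
    { "kuid",         TOK_KUID },
    { "mesh-table",   TOK_MESH_TABLE },
    { "trainz-build", TOK_TRAINZ_BUILD },
    { "type",         TOK_COMMENT },
};

enum token classify(const char *word)
{
    unsigned h = add_hash(word);
    size_t i;
    for (i = 0; i < sizeof keywords / sizeof keywords[0]; i++)
        if (h == add_hash(keywords[i].word)
            && strcmp(word, keywords[i].word) == 0)
            return keywords[i].tok;
    return TOK_UNKNOWN;  /* a real version would try nearest-hash repair here */
}
```

A transposed-letter typo like "kudi" hashes identically to "kuid", which is exactly the hook a misspelling-repair pass would hang off.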

Bottom line: in 2000, using the config as a loadable data source made sense given the state of computer hardware. Retaining that approach in 2005-2006 was far less justifiable, and by the HDDs and memory available in 2008 for TS2009, highly questionable. The decision to ride that horse to its death has gone downhill from there, and a lot of the busywork dumped on CCs like yourself--and on those who've hung up their creativity hats because of all that maintenance (did you sign a contract to be a perpetual slave, or figure that asset made in 2005 was going to be good for a decade or better?) and the upgrades to an unnecessarily high base trainz-build spec--simply isn't justifiable on the grounds of data-model evolution. The engine specs changed a lot in the TCs, but aside from that, how many structures (containers, tags) are truly different in function? Oh, some kinds now support different scripting specs--scenery items can no longer use a software class restricted to mocrossings--but that is the kind of change where tightening up a specification is entirely understandable and appropriate. But thumbnail versus thumbnails? bogey versus bogeys? ... Pfffle! What else moved? Ah, smoke effects, now in the mesh-table, iirc. I've only fixed a handful of bad tags in those, but nothing I recollect would prevent automated repositioning. They already know what they default to if undefined. They know what they've eliminated (epbrakes, name-XX, old category-XXXXX-nn, etc.). Converting each eliminated old form to a new one is, and will be, simple. Hell, I can do it!

THEY can probably, like PEV, even correct texture names with foreign alphabetic characters, because at the basic level all values in a computer are just numbers--codes given meaning by context--and the mesh IM has that sort of context. An Indexed Mesh has, well, an index: a lookup associative list. Apply a similar hash when a texture isn't found, but compare against textures the alphabet restriction does allow; a couple of odd characters in the sum, plus a check against the count of the inventory of textures, will find such mappings as well. If and when one is detected, the mesh can be patched with the nearest legal character and the filename adjusted if necessary, with the original mesh retained, like the original config.txt, as a meshname.org.im file--one of the legal types they bundle into the chump files. AT THE LEAST, such an asset could generate an EXPLICIT error message pointing out that the mesh has a French or Cyrillic character, and so forth.
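A toy version of that character repair, assuming (my assumption, for the sketch) single-byte Latin-1 texture names. A real tool in the PEV mold would carry full character tables and patch the mesh's index to match:

```c
#include <string.h>

/* Map a few Latin-1 accented bytes to the nearest plain-ASCII letter.
 * Illustrative only: the table here covers a handful of French cases. */
static char nearest_ascii(unsigned char c)
{
    switch (c) {
    case 0xE8: case 0xE9: case 0xEA: return 'e';  /* e-grave/acute/circumflex */
    case 0xE0: case 0xE2:            return 'a';  /* a-grave/circumflex */
    case 0xE7:                       return 'c';  /* c-cedilla */
    default:  return (c < 0x80) ? (char)c : '_';  /* unknown: placeholder */
    }
}

/* Rewrite a texture filename in place with legal characters only. */
void sanitize_name(char *name)
{
    for (; *name; name++)
        *name = nearest_ascii((unsigned char)*name);
}
```

Apply the same rewrite to the name inside the mesh's index and to the file on disk, and the "can't find texture" fault evaporates.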

Then there's the whole body of 'I can't find the texture' issues--I pointed out the absurdity of that above, and they have the code from the TRSs and TCs; they need only cut it out and use it as a handler core. Trimming the whitespace after a value inside a quote... I sat down last week and wrote out a parsing-function core which:
  1. retained and numbered a source-line pointer;
  2. detected the trial keyword and passed it to a function for validation as a keyword, handling a bad return value (misspelling) with a second function call, or
  3. continued parsing the line (the return code indicates what to expect) for a begin-quote or digit, up to the next non-blank;
  4. determined whether it was in-quote, in a number, or beginning a container (the '{' from C);
  5. assembled the value until not-in-quote, not in-line, or whitespace;
  6. handled those, if not inside a quoted block like license or description; if in one such, finished incrementing to the line's end;
  7. if still in-quote with a value, trimmed off the whitespace, closed the quote, and, if numeric, called a function to parse it and return a numeric-value union with a code defining the type, so returned both;
  8. put the result away in a line-definition struct, whether a reference (kuid), a string value (texture.txt, file name, etc.), or a float, boolean, or integer;
  9. then looped to the end of the line;
  10. then looped to the next line;
  11. then looped to the bottom of the file.
All that fits in just about the same number of lines of code as the numbered lines above, and since it was pre-processing it wasn't doing a lot of processing beyond value capture, pairing with keyword identifiers, and processing flags. But as a first pass it will handle comments, description, license, author, etc. fields and filenames (which could then be tested as a next step). It also sets flags in the struct holding that pre-processed (prepared) line--line is part of a container, part of a subcontainer, starts a container, starts a subcontainer, starts a quoted block, ends one, etc.--logs the line number, and packs the line's raw data while retaining the input line unmolested. Gee, not even a screenful of code, and the accumulated lines already know which lines to ignore in further processing, which have containers, which have external references (kuids), and which point to another file in the folder.
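The numbered steps above might be sketched like so--a skeleton with invented names and no bounds checking, not a working config parser:

```c
#include <ctype.h>
#include <string.h>

/* One pre-processed config line (a sketch of the steps above). */
struct cfg_line {
    int  number;       /* step 1: source line number                 */
    int  depth;        /* container nesting at this line             */
    int  flags;        /* starts/ends a container, etc.              */
    char keyword[64];  /* step 2: the trial keyword                  */
    char value[192];   /* steps 5-8: the captured, trimmed value     */
};

enum { F_OPEN = 1, F_CLOSE = 2 };  /* a full version adds comment/quote-block flags */

/* Steps 2-9 for a single line; steps 10-11 are the caller's loops. */
void prepare_line(const char *src, int number, int *depth, struct cfg_line *out)
{
    const char *p = src;
    char *k, *v;

    memset(out, 0, sizeof *out);
    out->number = number;
    k = out->keyword;
    v = out->value;

    while (isspace((unsigned char)*p)) p++;            /* skip indent       */

    if (*p == '}') {                                   /* container close   */
        out->flags |= F_CLOSE;
        out->depth = --*depth;
        return;
    }
    while (*p && !isspace((unsigned char)*p) && *p != '{')
        *k++ = *p++;                                   /* step 2: keyword   */
    *k = '\0';
    while (isspace((unsigned char)*p)) p++;

    if (*p == '{') {                                   /* step 4: opens one */
        out->flags |= F_OPEN;
        out->depth = ++*depth;
        return;
    }
    while (*p) *v++ = *p++;                            /* steps 5-8: value  */
    *v = '\0';
    while (v > out->value && isspace((unsigned char)v[-1]))
        *--v = '\0';                                   /* step 7: trim      */
    out->depth = *depth;
}
```

Wrap that in two loops (per line, per file) and you have the whole first pass: line numbers, nesting depth, keyword, trimmed value, and container flags, with the raw line retained by the caller.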

In a good pre-process, nesting would be checked as well: each line struct and/or raw line would have a pointer to it pushed onto a stack, so they can be popped off in reverse order to check parsing from bottom to top. AT THE LEAST, the back-read vs. front-read will pinpoint the lines with troublesome mismatched ends and abort the generation of false-fault lists before additional evaluation and validation. That eliminates the long lists of false errors from typos and unpaired container ends, speeding the by-hand fix by generating a proper error. Bottom line: they try to do too much in one pass, and cost us time by not taking the time, at this input and evaluation stage, to eliminate what they could fairly simply. A screenful of code, and you get an intermediate file, compacted and ready to process with the confidence of knowing it's stripped, balanced, and that what can be ignored has been.
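And that nesting check is maybe a dozen more lines. Again a sketch (real configs also need quoted-block awareness so braces inside strings don't count), with line numbers pushed so a mismatch reports WHERE it happened:

```c
/* Check that '{' and '}' balance across an array of config lines.
 * On failure, *bad_line gets the 1-based line number of the stray
 * close or the unclosed open. A sketch: fixed-depth stack, no
 * quoted-string awareness. */
int check_nesting(const char *lines[], int n, int *bad_line)
{
    int stack[64], top = 0, i;
    for (i = 0; i < n; i++) {
        const char *p;
        for (p = lines[i]; *p; p++) {
            if (*p == '{') {
                if (top < 64) stack[top++] = i + 1;  /* push opener's line no. */
            } else if (*p == '}') {
                if (top == 0) { *bad_line = i + 1; return 0; }  /* stray close */
                top--;                               /* pop the matching opener */
            }
        }
    }
    if (top > 0) { *bad_line = stack[top - 1]; return 0; }  /* unclosed opener */
    return 1;  /* balanced */
}
```

One proper error with a line number, instead of a page of cascading false faults from a single missing brace.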

Let's see: hand-edit someone else's typos and hang onto version-relevant tags from an earlier era--retaining a historic relation and preserving the ability to make a few minor regressive changes--or have the user community spend their hobby time chasing down trivial errors that could be handled far more efficiently with well-designed code...

Hmmmmm. Tough choice here. Incompetence, disrespect for the users' time, and unnecessary, needless, or incorrect fault messages--or competence, consideration, and efficient effectiveness... with far fewer curable assets reporting as faulty from the DLS. Which do you prefer? Personally, I'm for proper translation of older assets before requiring human intervention. // Frank
 
I guess that I am done with the DLS.

I don't understand how cutting off both 09 and 10 users is going to help new content or skins that GROW the game.

BTW, I have a FCT but there's no point to it if I can't access new content.
 
... Build to the highest version but use a TB of the lowest version you wish to support. That's how most savvy content creators have done it so far.

While not scientific, most of the TS12-version content I've seen on the DLS doesn't need to be, indicating those creators simply don't know what they're doing ...

Oh dear! I guess I fall into that category. A blunt way of alienating those who try.
 
We have known this was coming for quite a while now - why are people acting so shocked?

It's been well publicized, and it's the way Trainz has been since the very first versions of Trainz CE.
 
Less than 3 weeks notice? Swell!

I have been working on a TS2009 route for 3 years.
I see your latest version is 2009. It was already known in 2010 that your version would run out of support in 2014. In other words: before you even started creating that route.
I can't find a post on this forum about it (although I have seen them come by, at least a year ago already), but I did find an April 2010 post about it on a different forum.
Apparently this information was also shared in an email years ago.

Based on the revision history of this page:
http://online.ts2009.com/mediaWiki/index.php/Trainz_Life-Cycle_Policy
... it was known in March 2010 when TS2009 would run out of support.
... it was known in June 2011 when TS2010 would run out of support.

Even if it was not known to someone for a specific piece of software, it is pretty normal for a software vendor to stop supporting a piece of software a year or two after a new version was released.
You got lucky with almost 6 years for TS2009 (released in November 2008) and almost 5 years for TS2010 (released in November 2009). It is over 3 years since TS12 was released.

Now don't get me wrong, I do see the point some of you are making that technical support for software via a helpdesk seems different than supporting the up- and download access for that software, but at the end of the day they are a commercial company. TS12 has been on sale for a decent discount a few times already and will be supported till September 2016.

Just for the record: I do not work for N3V.

We have known this was coming for quite a while now - why are people acting so shocked?

It's been well publicized, and it's the way Trainz has been since the very first versions of Trainz CE.
Agreed.

Although I did fear some people might have forgotten about it, which is why I made a link to that page in my latest freeware announcement.

BTW, I have a FCT but there's no point to it if I can't access new content.
Not for new content. It does keep it possible for you to download current / older content. Via the "black pages" that is; Content Manager might stop working.
 
Even with SP1 and HF4 the default is still the 2009 build number, and I keep forgetting to change the build number in the config of new items. Even when I put it in the name that it is a TS12 item, I get so many abusive emails from 2009 users when the scripting or some feature does not work in 2009.

This will probably apply to TS12+SP1, since my uploads made in TS12 without the service pack say they were made with Trainz 2009.
 
I'm sure I read somewhere on this forum that payware folks, such as Jointed Rail, will be creating content purely for TS12 and upwards in the future, which probably stems from the fact that N3V will be unsupporting the TRS09 & 10 versions from the end of September. It's possible that other content creators will follow suit, but at the moment it's all conjecture until after September, when we can see exactly what N3V have done on the DLS, etc.

However, as has already been stated regarding the thread title, THIS UNSUPPORTING MALARKEY IS NOT NEW, WE'VE HAD YEARS OF IT ALREADY, GET WITH IT...:hehe:

Cheerz. ex-railwayman.
 
It seems that as soon as N3V invents a new version, they much too quickly cut off support for lower versions ... Other companies continue support for many, many years after a new operating system is put into place ... e.g. ancient Vista is still supported by Microsoft, but I think XP is unsupported.
 
It seems that as soon as N3V invents a new version, they much too quickly cut off support for lower versions ... Other companies continue support for many, many years after a new operating system is put into place ... e.g. ancient Vista is still supported by Microsoft, but I think XP is unsupported.

To take your Microsoft example - they usually support products for around 5-8 years, although it's only the first few years or so that include new features. N3V's current support policy of around 4-5 years is reasonable in my opinion as technology moves on quite quickly and creators and users have to keep up.

Shane
 
Oh dear! I guess I fall into that category. A blunt way of alienating those who try.

Sorry, but if your content falls into that category, all you have to do is learn from it and fix your mistakes in the future. You can even fix past ones; not all systems allow that. For one to believe their content was absolutely perfect from the very first release... well... let's just say that it's "probably not true." Everybody makes mistakes. I've made plenty of boo-boos. Fortunately, the system we have is very flexible and forgiving of them, although it will be much less so after September 14th.
 
The problem here is one of planned obsolescence. Of a perfectly viable format, no less. There's really no reason to disallow TS2009-TS2010 content other than for purely punitive reasons. Very few (if any) new features were added between TB 2.9 and TB 3.6. Again, the system supports versioning (Trainz-build) so that content that takes advantage of features specific to a newer version can be set to require that version; content for a too-high version won't even download into lower versions.

Personally, I don't care that much about the DLS issue; there are plenty of workarounds. The part that's irritating is the phony-baloney, bullstuff excuses that even a two-year-old who isn't a fanboy can see through. Just 'fess up, admit that the purpose of this planned obsolescence is to force users into buying a new version or eventually be left out, and move on. And, for those that don't like it, upload somewhere else. Problem solved, debate over.
 
The problem here is one of planned obsolescence ... no reason to disallow TS2009-TS2010 content other than for purely punitive reasons ... the purpose of this planned obsolescence is to force users into buying a new version or eventually be left out, and move on.

The purpose of "planned obsolescence" (as you put it) is not just to ensure continuing sales but to improve the product. You don't survive commercially by standing still. True, you could make your product fully backward compatible with all earlier versions, but that always comes with a price. I have numerous routes from the DLS that use older assets (TS2004/6) that have passed all the TS12 error checks but frankly look terrible compared to assets created for TS10+ - some trees are a classic and often-complained-about example.

How often do you buy a new car, a new TV, a new computer? Why buy a new one if the old one still works well? I bought my current flat screen digital TV a short time after it was announced that all analogue TV transmission in my city would eventually be switched off. That switch over occurred last year. Was this a "dastardly plan" to force all TV viewers to buy new TV sets? Perhaps it was but I now have more "free to air" TV channels that I ever had before and each with a far better quality picture - the product has improved. (Yes - there is still little that is actually worth watching as I hate "reality" and "talent" TV shows).

As other posts in this thread have pointed out, older content that was created correctly will still work (and does work) in TS12. TS2009-TS2010 content has not been "disallowed". If you want to keep creating assets with TS9 or 10 and upload them to the DLS, then the solution may be as simple as updating the build number in the config.txt file - all other factors being correct.

Auran/N3V have always made it clear that support for their products does have a timeline. It is even possible that, legally, N3V could have immediately dropped all support for older Auran versions of Trainz when they took over the company. Obsolescence and change are a permanent feature of modern technology. You either move along with it or you are left behind.
 
I guess I'm with PWare and MartinVK on most of their points. My view is that newer versions of Trainz should be better than earlier versions and, if one of my assets can be updated to take advantage of such improvements, then I will update it. But I fully accept that others feel differently and it's their choice.

I do have a problem making an asset to TS12 requirements, for example, and then deliberately setting the Trainz build in the config.txt to a lower version. This means I need to validate it not only in TS12 but also for the build specified and any other builds in between. I made a decision recently that I would no longer do that. If that makes me less "savvy" than smart content creators, then so be it.

To illustrate my point I recently did some script writing for another content creator who wanted his asset to run in TS10. I have TS10 installed but my preferred script debugging tool (Andi Smith's MessageBox) will not work in TS10 so fault finding became tedious.

Don't get me wrong. I've loved every version of Trainz since my first version (TS04) and have the whole collection since. But if I'm going to watch football on TV, I'd rather watch it on a wide-screen colour digital TV than the clunky analog B&W TV I might have out in the garage. (This is a moot point since our analog TV broadcast signals have also been switched off.)

Cheers from a wet Sunday morning.
 
Even with SP1 and HF4 the default is still the 2009 build number, and I keep forgetting to change the build number in the config of new items. Even when I put it in the name that it is a TS12 item, I get so many abusive emails from 2009 users when the scripting or some feature does not work in 2009.
Well, the obvious solution there is to vet it in TS2009 as well, like Paul says below, and if there is a difference, perhaps generate a TS2009-suffixed version alongside the TS12 v3.7-and-up version. This is one reason for CCs like you and Paul to run multiple installs. If the build needs only v3.4, then assign that; if it doesn't need a higher build, or a graphics feature of a higher build, then one or more versions is a good way to go, I'd think. Hell, with the KUIDs, you don't even have to change the identity--just put a disclaimer in the description. If the asset uses a texture feature not supported in the lower version, hack-hack ('//') line-beginning comments work fine in texture.txt files... unlike config.txt files. // F

I'm sure I read somewhere on this forum that payware folks, such as Jointed Rail, will be creating content purely for TS12 and upwards in the future, which probably stems from the fact that N3V will be unsupporting the TRS09 & 10 versions from the end of September. It's possible that other content creators will follow suit, but at the moment it's all conjecture until after September, when we can see exactly what N3V have done on the DLS, etc.
Cheerz. ex-railwayman.
Personally, I figure they aren't treating the CCs as adults. If the asset needs a script available only in a newer version, the CC would know it. As with pre-processing, they have the capability to pre-test and vet to the build standard of the version; they choose not to do so, and instead force the community to chase higher version numbers when some assets don't need it. It will also inhibit those like myself who want to begin adding content to the DLS, but must first master fundamental Gmax and Blender techniques... and are a long way from appreciating, without more basic experience, some of the advanced techniques I hear Paul Cas, Noel (Brasshat), DAP, Paul Weiser, Jananton, whitepass, and such CCs kicking around in Yz-Tz. I can appreciate what a normal map is, but how it affects an image is a whole 'nother level of understanding. Same with baking textures, UV mapping, and other esoteric methods whose basics I'm totally ignorant of. So, like PhilC and others who've hung up the creator's hat, this creeping version enforcement is going to thin the new CC crowd as well as the old hands. // F

To take your Microsoft example - they usually support products for around 5-8 years, although it's only the first few years or so that include new features. N3V's current support policy of around 4-5 years is reasonable in my opinion as technology moves on quite quickly and creators and users have to keep up. Shane
MS lifecycles are much closer to a decade than six years, Shane, and note both are longer than N3V's, who have a less complicated application whose real power is the ability to stay compatible with its predecessors. I'd admired that a great deal in Trainz, and apparently they threw that value out the window when they chose that arbitrary time interval for THEIR CONVENIENCE--not their users'. // F

Sorry, but if your content falls into that category, all you have to do is learn from it and fix your mistakes in the future. You can even fix past ones; not all systems allow that. For one to believe their content was absolutely perfect from the very first release... well... let's just say that it's "probably not true." Everybody makes mistakes. I've made plenty of boo-boos. Fortunately, the system we have is very flexible and forgiving of them, although it will be much less so after September 14th.
I don't follow that reasoning. Tightening up the submitted format for a new asset is fine by me, as class and applied software rendering benefit the Drivers and Surveyors on throughput processing--at least in theory. But a proper pre-processing stage for older content (translating it to the newer version's needs), a binary load file with load-'n'-go data, will help that throughput (frame rates!!!) even more. Windwalkr was reluctant to admit that point last summer, but he was thinking LOUD about it! // F

I guess I'm with PWare and MartinVK on most of their points. My view is that newer versions of Trainz should be better than earlier versions and, if one of my assets can be updated to take advantage of such improvements, then I will update it. But I fully accept that others feel differently and it's their choice.
No problem with that, but only for when and where a tech level need be applied. If you want to orphan TS09/10 users, it's your reputation and choice. But unless an asset is designed for near-track viewing, how much tech level does it really need above v2.4? Script dependencies are a whole 'nother animal, so if it needs S/W hooks of a higher build, that's the way you designed it, and 'nuff said. // F
I do have a problem making an asset to TS12 requirements, for example, and then deliberately setting the Trainz build in the config.txt to a lower version. This means I need to validate it not only in TS12 but also for the build specified and any other builds in between. I made a decision recently that I would no longer do that. If that makes me less "savvy" than smart content creators, then so be it.

To illustrate my point I recently did some script writing for another content creator who wanted his asset to run in TS10. I have TS10 installed but my preferred script debugging tool (Andi Smith's MessageBox) will not work in TS10 so fault finding became tedious.
Cheers from a wet Sunday morning.
Smart or savvy doesn't come into it; it's personal preference and a decision on how YOU are going to spend YOUR (LEISURE) time. THAT is VALUABLE, and THAT RELATION is what I feel N3V has never focused on, as Greg Lane and Auran of old DID. Again, a translation would ensure an older build would work fine in newer builds, and such would and should encompass script adjustments if they go about moving language elements around as they did some of the containers and tags. How hard is it? Read the config's trainz-build, then pick the function branch which vets to that level. They tighten things up iteration to iteration--fine, so long as the tightening doesn't cause needless extra work for the CCs! In a right-wrong sense, the build level for a designed asset should rest with the CC, not a soulless single standard vetting uploads--that's just N3V doing a half-assed job YET AGAIN! Laziness and false convenience to boot. They ought to be testing each upload against a per-build vetting module, so just add it to the list, and vet the new to the new, the older to the older. Not difficult; just a case statement or IF-THEN-ELSE in the loader! // F
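That loader branch is about this hard. The validator stubs and build cutoffs below are made up for illustration, not N3V's actual rule sets:

```c
/* Hypothetical dispatch: vet an uploaded asset against the rules of
 * its own declared trainz-build, not one global latest-only standard. */
typedef int (*vet_fn)(const char *config_path);

/* Stub validators; each would apply its own era's rules. They return
 * their build level here purely so the sketch is testable. */
static int vet_v24(const char *path) { (void)path; return 24; }
static int vet_v34(const char *path) { (void)path; return 34; }
static int vet_v37(const char *path) { (void)path; return 37; }

/* Pick the rule set for a declared build, e.g. 24 for v2.4, 37 for v3.7. */
vet_fn pick_validator(int trainz_build)
{
    if (trainz_build <= 24) return vet_v24;   /* TRS2004/2006-era rules */
    if (trainz_build <= 34) return vet_v34;   /* TS2009/TS2010-era      */
    return vet_v37;                           /* TS12 SP1 and later     */
}
```

When the next version ships, you add one validator and one line to the chain. The older assets keep vetting against the rules they were built to.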

The purpose of "planned obsolescence" (as you put it) is not just to ensure continuing sales but to improve the product. You don't survive commercially by standing still. True, you could make your product fully backward compatible with all earlier versions, but that always comes with a price.
So long as you figure in the lousy reputation the product gets because people try a download and then 'nothing works' (an overstatement, but not everyone wants to tinker, and the reviews on Amazon and other web boards present a dismal picture). I suppose they could limit back-version downloads for new users lacking the license, limiting the trials and tribulations of exploring good older content to those of us with the license. But unless they do, without a translation pre-processing stage they will continue to create disgruntled customers. // F
I have numerous routes from the DLS that use older assets (TS2004/6) that have passed all the TS12 error checks but frankly look terrible compared to assets created for TS10+ -- some trees are a classic and often-complained-about example.
Of course, but they load and work! The replace-asset feature was added in part to allow for exactly that; a better one would allow changes by zone and/or map square. One performance hit many a new route takes is using SpeedTrees at a distance where those old flip trees would give a better frame rate. The content was written for an older era, and Surveyor allows you to upgrade it. If YOU choose to upgrade it, as ChileanLama did with his route, more power to you! But if you don't, all I'll do is defend you--so long as it works! // F
How often do you buy a new car, a new TV, a new computer? Why buy a SNIP
The difference is when I drive an old clunker off a car lot, I expect it will get down the road. If it doesn't work, I don't have to work on it for umpteen hours to try to get it off the lot to test drive it. // Frank
 
How often do you buy a new car, a new TV, a new computer? Why buy a new one if the old one still works well? I bought my current flat-screen digital TV a short time after it was announced that all analogue TV transmission in my city would eventually be switched off. That switch-over occurred last year. Was this a "dastardly plan" to force all TV viewers to buy new TV sets? Perhaps it was, but I now have more "free to air" TV channels than I ever had before, each with a far better quality picture - the product has improved. (Yes - there is still little that is actually worth watching, as I hate "reality" and "talent" TV shows.)

I buy a car or a computer when it breaks too often i.e. when it costs me more to fix it than to buy a new one. I guess I'm funny in that I don't feel the need to buy a car or computer every 2-3 years in order to keep up with the Joneses. But as long as it does what it's supposed to and doesn't cost me money, I keep it. Perhaps the software industry ought to emulate the auto industry.

As for the TV comparison, well, that kind of makes a point I didn't intend to, that "upgrading" is often a step backwards. You live in a city, so you get lots of channels. Some, if not most, of us who don't live in or very close to a city get fewer channels than before. I live in an affluent suburb just 25 miles from Philadelphia and 50 or so from New York City: In other words, one of the biggest media corridors in the United States. I also recently "cut the cord." How many over-the-air channels do I get since the transition to digital? Two. In theory, I should be able to get a whopping 5, if I mounted an antenna high enough, which I'm not allowed to. By comparison, back in the analog days, I got 10+ with rabbit ears and 20+ with a rooftop.

As other posts in this thread have pointed out, older content that was created correctly will still work (and does work) in TS12. TS2009-TS2010 content has not been "disallowed". If you want to keep creating assets with TS9 or 10 and upload them to the DLS, then the solution may be as simple as updating the build number in the config.txt file - all other factors being correct.
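For what that suggestion amounts to in practice, the edit is a single line in the asset's config.txt. The kuid below is a made-up placeholder, and 3.5 is only an approximate TS12-era value; note that raising the number also subjects the asset to the newer error checks, so "all other factors being correct" is doing real work in that sentence.

```
kuid          <kuid:12345:1001>
kind          traincar
trainz-build  3.5
```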

Nobody is disputing that; indeed, we all agree that pre-TS12 content is perfectly good provided it's been properly made, and the DLS upload bot enforces proper construction. That only demonstrates that excluding pre-TS12 builds from upload to the DLS is totally unnecessary, and has nothing to do with error-checking and everything to do with forcing users onto newer versions.

Auran/N3V have always made it clear that support for their products does have a timeline. It is even possible that, legally, N3V could have immediately dropped all support for older Auran versions of Trainz when they took over the company.

True, all promises and contracts go out the window in bankruptcy.

Obsolescence and change are a permanent feature of modern technology.

That's only because people let themselves be used like walking ATMs.

Again, though, N3V could just come out and admit that's the purpose behind locking out pre-TS12 content. There's no need to turn this into a debate.
 