wheatpenny opened this issue on Oct 12, 2006 · 141 posts
kuroyume0161 posted Mon, 16 October 2006 at 2:09 AM
They will always happen - that's the thing. In reality, the programmer's job is to mitigate them as much as possible. Look at M$ and the buffer overflow bug. This is a well-known problem that is hard to avoid. It can be avoided, if you don't mind software that runs like mud in winter. The design of an application must compromise between making something that the user can actually use and checking for every conceivable problem. To go the latter route completely would create applications that run nearly perfectly - at about 1/100,000th the speed... ;)
SHONNER shouldn't take this the wrong way, but there is that saying about masturbation: "90% of the people admit to it, the other 10% are lying." I know or have been in contact with hundreds of developers in my time and not one has ever purported to write flawless software - even the very good professional ones. You can get pretty darn close when certain conditions are met and you can control the scope of possible combinations. But as the complexity increases, so do the chances of bugs creeping in. Methinks that Poser's problems really started when they came to depend upon third-party 'black boxes' (Material Room, Cloth Room, Dynamic Hair, etc.), which removed some of the more direct control.
People also seem to forget that it isn't always the application itself that causes the problem. I've heard of many things being broken by, for instance, MacOS 10.4.7. No change of application - change of OS. Drivers (particularly graphics drivers when it comes to 3D) have never been known to do this either - ever (big sarcasm there). Can anyone actually blame the application developer for that?
C makes it easy to shoot yourself in the foot. C++ makes it harder, but when you do, you blow your whole leg off.
-- Bjarne Stroustrup
Contact Me | Kuroyume's DevelopmentZone