"Unless Bryce is rewritten to take proper advantage of parallel processing, the 'new' processors will do little for us" Amen. Bring on the future Bryce that is ready to rock&roll with multi-core cpu's, multi cpu motherboards, hyper-threading, and 64-bit everything! AS
"The techies have run out of steam (ya cannee change the laws of physics, cap'n). They can't get new processors to run any faster, so they've resorted to nailing multiple old processors together" hate to rain on your parade bro, but those clever marketing guys at intel have been doing that from day 1, in the beginning was the 8086, then the 286 (or 80286 to give it's full name) and blow me if it wasn't just 2 8086's on the same subsrate.... you'll never guess what the 386 was?, 486 anyone??? p1 had a few new wrinkles (like it coudn't add up) but was called pentium rather than 586 for legal rather than tech reasons (i.e. intel were trying to go down the microsoft road, rather than up the IBM creek by copyrighting their product properly) And Moor's law is another bit of marketing hype. You can plot the numbers... And, so long as you use very selective dates. like 18 months +/- 12 months... And continually re-define what you mean by speed/power, numer of transistors, clock speeds, size of transistors, acceptable yields... not to mention constant changes in operating systems and bryce itself - and did you ever see what happens to windows when you throw a couple of gig of ram at it, and what speed is the ram running, system bus?? Plot speed against time... And if you screw up your eyes and tilt your head to one side, and hold the graph at arms length (assuming you have long arms... very long arms) Then some of the points kinda look like they lie close-ish to a line. hardly y=2x, but this is marketing, not science. :-) That said you initial point is spot on. Unless you really HAVE to have the latest kit (viagra is a cheaper solution to that problem), it's much more cost effective to wait a couple of years before upgrading. plus, that way you almost never buy a 'betamax'
As far as pure CPU speed goes, CPUs can only be made to go so fast before home users would HAVE to use some sort of liquid-cooled system, or enough fans to drown out any other noise... whatsoever, lol. AS
Did someone say 64-bit? ;-) Bryce cries out for that, but as far as my tech upgrades went, I'm very happy with Bryce on my rig. The 64-bit 3200+ (512 KB cache) crushed my old 32-bit Sempron 2800+ (256 KB cache) computer. And I was scared to even think of getting Bryce because of all the people who said they would render for days because it was slow. When I got it (the first version I got was 5.5, soooo...) I was amazed at how fast it was. Been a happy Brycer since. Wonder what the new stuff will take the MOST advantage of: a 64-bit compile or dual-CPU threading? As for dual CPUs, I never saw those take off back in the day, so why now, besides being affordable?
Snap! - As far as machine upgrades go, I try to go for at least twice as fast when I upgrade.
As for the progress of CPU technology, I'm actually quite pleased they've reached the limit. I just don't think it's wise to run something at the extremes of what the material can handle. I think multi-CPU is the future, and whilst it's been around for a long time, it hasn't exactly lost its status of being slightly exotic - just yet.
I think sometimes though, the drive to improve upon something doesn't always push in the right direction. I mean ok, more processors means more board space, which in turn means a bigger motherboard, which in turn needs a bigger chassis to contain them - but so what.
Let's face it, multi-CPU monsters are gonna replace your TV, hi-fi, DVD, etc. (they probably already do).
It's not so bad to expect slightly bigger systems when you consider the amount of hardware they make redundant. But try telling that to MrWong who's having nightmares about how to put a P4 into the next-gen mobile phone!
"Bring on the future Bryce that is ready to rock&roll with multi-core cpu's, multi cpu motherboards, hyper-threading, and 64-bit everything!"
Bloody hell, I hope so. I like Bryce, but not enough to sit around using old-hat technology whilst others play burn-out with their oh-so-wonderful competitive apps ;-)
I have a good feeling about Bryce6 though. DAZ know what they're up against, so I hope they'll pull it off.
Len.
The wait can be horrific, but the outcome can be worse - pumeco 2006
And, on the other hand, concerning a future Bryce's rendering speed:
* We could use options for variable-quality renders like Vue has (quality vs. speed). They have a lot of options in that department.
* Some sort of option to render with third-party renderers, or a second type of rendering engine within Bryce, because we don't always need ultra-precise raytracing when rendering a dirt mountain.
Obviously, these things have been mentioned a hundred times, but they pertain to speed, so... AS
I need a multiprocessor and multiple GB of RAM! Radiosity, area lighting, sub-poly displacement, blurred anisotropics - my latest had just reached about 80% after 24 hrs of crunch in C4D!
Abandon ye all hope, render times are constant, you always expand your techniques to compensate for increased power! (edit for splling (grin))
Pass no temptation lightly by, for one never knows when it may pass again!
Processor speed is running into problems now. At very high frequencies there is data loss. In the radio world, that means a switch from plain wire to waveguide. I think you just can't do that in a computer, so they have to do it another way. Multithreading and greater bit depth are the way to go.
The wit of a misplaced expatriate.
I cheated on my metaphysics exam by looking into the soul of the
person next to me.
If you ask most pros in the CG field, they will tell you that except for the occasional print image they use scanline renderers. Ray tracing is a waste of time. Most people cannot tell the difference. Why sit around for hours waiting for a ray-traced image when you can have a scanline render in minutes? I lived through a 29-hour render only to discover I didn't like the end result. I want options I can live with, not live through.
Right. Raytracing is great for accurate reflections, refractions, etc. When you want them... AS
Thank you PJF, finally some tech talk I could understand. I don't understand the hyperthreading thing, even though my 'puter has it. I don't understand dual core, other than that it is two processors sharing the load. I'm with PJF: results in the application you are using are what count. What I've always wondered about is why no one took off with the idea incorporated in the old Amiga system: multiple processors assigned to single tasks (i.e. one for sound, one for graphics, one for I/O, one as the traffic cop), instead of the Microsoft approach of increasing clock and single-processor speed and calling it multi-tasking when it was really doing one thing at a time, but so fast you didn't notice the breaks. As you can see, I live in the past. Way past my bedtime again.
Basically, because that already exists to a certain extent; I mean, my video card has a GPU, and my audio card has some sort of CPU chip thing. Yes? *That reminds me of the rumors that (in the past) DAZ wanted Bryce to use a user's video card GPU to help calculate renders. (For that they'd need Nvidia's help.) AS
64 seems to be a magic number in the computer world - anybody remember the C64? lol. That Amiga was way ahead of its time; I still keep one under my bed.
Probably before they pump out any more power-hungry CPUs, boards, and software bloat, they should optimize apps to take advantage of the instructions we have now. I mean, even a 1 GHz CPU can almost double program speed if the instructions are utilised. Anyone remember stuff like the AMD K6 and Quake 2? That was an eye-opener. Instead we are overclocking to get the same results code could easily achieve. x64 will run 32-bit apps fine, but being 64-bit, the optimisations get turned off to run in WOW64, and thus slower programs. Hope this isn't the case with Vista. But when/if Bryce might get a 100% increase from 64-bit/dual CPU, look what they did with 5.5: a 30% increase over earlier versions (in most cases). So if we can't get that $$ new CPU, at least we can sit back and see what DAZ does with version 6.
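As a crude illustration of that "use the instructions we already have" point, here is the same multiply done as a plain interpreted loop versus a single NumPy call that runs vectorized (SIMD) machine code underneath. This assumes NumPy is installed, and the array size is arbitrary - a sketch of the idea, not a benchmark of any real app.

```python
# Same arithmetic two ways: an interpreted element-by-element loop
# versus one NumPy call that runs vectorized (SIMD) machine code.
# Assumes NumPy is installed; the array size is arbitrary.
import time
import numpy as np

n = 2_000_000
a = np.random.rand(n)
b = np.random.rand(n)

t0 = time.perf_counter()
slow = [x * y for x, y in zip(a, b)]  # one element at a time, in Python
t1 = time.perf_counter()
fast = a * b                          # whole array at once, in machine code
t2 = time.perf_counter()

print(f"python loop : {t1 - t0:.3f} s")
print(f"numpy (SIMD): {t2 - t1:.3f} s")  # typically tens of times faster
```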
The better speed increase will come from being able to take advantage of dual-core CPUs rather than from being made ready for 64-bit. Having both would be nice, though! (And it's the smartest route to go.) I assume having Bryce use a dual-core CPU would be, in theory, equal to using a normal computer with a second PC running Bryce Lightning. Then yeah, I would assume you could just cut every render time you'd ever have right in half (approx.). AS
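To put a toy number on that "cut it in half" intuition, here is a minimal sketch: split a fake render into independent horizontal strips and farm them out to two worker processes. Everything here - render_strip, the stand-in shading math, the image size - is invented for the example; it isn't Bryce code, just the general shape of what a multi-core renderer does.

```python
# Toy model of "dual core ~ a second Lightning node": the image is cut
# into independent strips, rendered once on one core and once on two.
# render_strip and its shading math are invented stand-ins, not Bryce code.
import time
from multiprocessing import Pool

WIDTH, HEIGHT, STRIP = 2000, 2000, 250

def render_strip(bounds):
    """Stand-in for rendering one horizontal strip of the image."""
    y0, y1 = bounds
    total = 0.0
    for y in range(y0, y1):
        for x in range(WIDTH):
            total += ((x * y + 1) % 255) ** 0.5  # fake per-pixel shading work
    return total

if __name__ == "__main__":
    strips = [(y, min(y + STRIP, HEIGHT)) for y in range(0, HEIGHT, STRIP)]

    t0 = time.perf_counter()
    one_core = [render_strip(s) for s in strips]  # strips in sequence
    t1 = time.perf_counter()
    with Pool(processes=2) as pool:               # strips across two workers
        two_cores = pool.map(render_strip, strips)
    t2 = time.perf_counter()

    print(f"1 core : {t1 - t0:.2f} s")
    print(f"2 cores: {t2 - t1:.2f} s")  # close to half, minus process overhead
```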
If anything will get the tech guys to make processors run faster, it's 64-bit processing. I'm fairly sure most people will be disappointed in processor speed under a 64-bit environment given the processing speed of current processors; machines will have to at least double in speed to process as fast as the comparable product running under 32 bits.

We can only go so fast before we reach the atomic level, i.e. the transistors etc. reach atomic size, which seems to me to be the limit unless the technology improves. Once you get down to sub-atomic sizes, the physics changes. I don't pretend to understand this, but I think a working application of string theory would be involved - and from what I understand, that's still years off. So it indeed seems that Moore's theory was prematurely elevated to the status of a law.

As to Bryce, I think that improvement in the function of the code is more likely. I agree that allowing the program to take advantage of multiple processors is at least in part the key to faster Brycing. I dream of the day I could press the render button and see a render finished in less than a minute.

Realistically, I'm thinking that I should just get an older computer and use that with Bryce, and save my newer computer for other things. A friend offered me his old 900 MHz P3 or P4. I've got a choice of ME or W2000 to put on it, so if my friend comes through and it works acceptably fast, I'll post on that. Meanwhile I'm trying to nurse a sick Windows 2000 computer back to health. I changed my logon settings and can't log on. The logon.scr trick (Petri.com) seems to be working, but ntrights (MS toolkit) isn't, so it'll be a while yet.

Almost makes me wish I had bought a quart or two of beer today to take away the physical tension - I'll have to settle for some tea and Top (tobacco). Ah well - is it an appropriate time to quote Peter Weller in "The Adventures of Buckaroo Banzai Across the Eighth Dimension" by saying "No matter where you go, there you are"? ... Well, if not: "Bob's your uncle, Fanny's your aunt, and there you go" - Johnny Depp in "Pirates of the Caribbean". - TJ
Maybe we can steal and use Black Lectroid technology to build a PC? I guess that would be "alienware"... AS
"steal" is such a harsh word - perhaps we could just "borrow" it. There is a company called "Alien Technologies" though. They sell computer parts as well as those little microchips that brodcast a signal to a scanner so some computer somewhere knows who and what just walked past it's scanner. I still think that Captain Kirk's computer on the NCC 1701 would be the one to get. The problem is that all the Klingons would wanna come on over and party Saturday night and Sunday afternoon on the bridge because Kirk had the big screen T.V. and the most comfortable seats. I wonder how the Ship's replicator would handle making Budweiser? on the subject of cooling: liquid nitrogen? keep the chip close to absolute zero. I read somewhere that the cooler a CPU is the faster it runs. You'd be in trouble though if the cooling system sprung a leak. Finally saw "Terminator 3" on T.V. After watching that I don't know if too smart or too fast a linked computer is such a good idea . . .
At some point in the process we need to go organic. An ant has the equivalent of twenty thousand transistors in its body, yet is able to function in the real world. A standard digital watch has 1,000 times that many but can only tell you the time, and even then it needs input from the owner to establish a baseline to work against. A newborn baby is orders of magnitude smarter than every computer on this planet combined, but cannot speak, see, walk, or do any of the things it will learn during its lifetime. If you create an organic computer with the ability to KNOW what a question means, that gives it the ability to supply the answer in a way anyone will INSTINCTIVELY understand. You can build networks that act like communities and will want to involve you in their activities. In other words, a computer that aims to please. Which would make a nice change.
The notion of the quantum uncertainty render seems strangely familiar. Hit the throbbing button before going to bed and the render is in a flux state of being fantastically good and utter shite at the same time. Only the action of observing it the next day will decide which.
O.K., so who let the cat out of the bag? The only way to defeat quantum logic is not to look. One advantage that the ant's brain might have is that it is arranged in three dimensions rather than on a flat surface, thereby allowing more direct interconnections. One problem in arranging computers that way (if you could) is, as Agent Smith noted earlier, increased heat. I am under the impression that a closer proximity of transistors will process signals quicker.

A quantum computer? Not real sure what that means - I know that a quantum is a theoretical minimum packet of energy. I've heard the term "quantum singularity" used to describe a small black hole in "Star Trek" and the like. Last night on PBS I saw a program entitled "The Ghost Particle" on the development of theories on neutrinos. What stood out to me was that they described the neutrino as having three states, and that as you approach the speed of light, time stops - in short, the neutrino changed state over distance, therefore was travelling at less than the speed of light, and thus, contrary to accepted theory, had mass.

Some guests on the George Noory program have explained that the quantum computer would involve reaching into parallel universes for access to 3D space, or would somehow involve convoluting time in some obscure manner. I hope not - I'm hard-pressed enough cashwise; imagine the cost of such things. As to organic computers - I think the same people who object to stem cell research and cloning would object to such things...
"...the Quantum Computer would involve reaching into parallel universes for access to 3d space or would somehow involve convoluting time in some obscure manner."
So not only could you see your render before you bother going to all the effort of constructing the scene; your creation would actually become real on another plane of existence.
Trouble is, forum disputes could easily erupt into full pan-dimensional warfare. Unforgiving, bloodlust vampire demons battling relentless, long-winded tech marketeers.
Er, wait a minute...
"...the Quantum Computer would involve reaching into parallel universes for access to 3d space or would somehow involve convoluting time in some obscure manner." "I think Cyba Storm is on to something. Let's all collectively at the stroke of midnight (GMT) "think" a render into existance. AS you supply mats, Dann-O some organic models, bikermouse the beer, Drac the ... " MUST REVISE 'THINK' RENDER SCHEDULE. SEE BELOW REVISED SCHEDULE: EVERYONE THINK RENDER AT 12 MIDNIGHT GMT YESTERDAY.
I hate to say it, but liquid nitrogen cooling would be going too far; many materials become more conductive as they approach absolute zero, so the whole shebang would fail to work. You can do some nice stuff with freon cooling - minus 50 is far enough to get some good gains in speed. Intel techies are finally catching up with AMD on the simple maxim: don't work faster, work smarter. AMD hit clock-speed limits a while ago (i.e. started finding it difficult to go any faster), so they started getting smarter - more efficient pipeline prediction and so on. Now that Intel are getting the same problem - difficulties in adding to the clock speed - they also need to design smarter. The dual-core thing is not just a cheap trick either - sure, it doesn't do anything for Bryce's antiquated code architecture, but it's fantastic to be able to play Civ 4 while Bryce renders in the background :). Load up 3ds Max and it rocks.
If you don't mind paying about $4,000 or so: I saw something on TV Sunday that said a company called Falcon Technologies in Florida is selling a 4 GHz+ liquid-cooled computer set up mostly for games, but it might do the trick. Personally I'm 16 bits short of NULL, so it's a ways off for me - and I still want Captain Kirk's computer... Ka-Plaugh!!
Attached Link: http://www.dell.com/html/us/products/ces/index.htm
Something like this, liquid-cooled with four graphics cards... ROTFL. Won't do Bryce any good at all!
Very soon after I got into computers I lost my youthful figure. But that's not important now.
Very soon after I got into computers I learned a couple of useful hardware lessons. Don't pay out for the absolute leading edge of technology unless you need it for paying work (or are just helpfully rich); and don't bother changing a computer unless the new one will be twice as fast as the old (unless you are helpfully rich).
My computer buying (constructing) shadowed Moore's law reasonably well, getting me a machine twice as fast every eighteen months or so. With careful watching of marketing and avoiding the dog processors, this meant properly faster not just 'on paper' faster. Bryce was the benchmark, and every time I got a new machine Bryce rendered just over twice as fast.
My last change was to a 2.53 GHz P4. But that was nearly four years ago, and I'm still waiting for hardware that will render Bryce twice as fast. The top 'speed' is at something described as +4800, which means maybe it'll go that fast if you drop it out of a plane tied to a lump of non-depleted uranium that destroys air resistance.
The techies have run out of steam (ya cannee change the laws of physics, cap'n). They can't get new processors to run any faster, so they've resorted to nailing multiple old processors together and promoting the conglomeration as a shiny new thing with a go-faster stripe. Not that there's necessarily anything wrong with parallel processing; my first home build was a dual Pentium 200 running Windows NT and very nice it was too.
Except for Bryce. Bryce ran faster in Windows 95 on one Pentium 200. And this situation remains today. Apart from the unreliable hassle of Lightning, Bryce is a single-threaded program that runs on one processor (or core).
Since there's been lots of forum chatter about 'new' technology (mostly from Mac fans who are understandably excited about finally being allowed to work on a PC), I thought I'd be a humbug and cut through the crap. Unless Bryce is rewritten to take proper advantage of parallel processing, the 'new' processors will do little for us.
If you want to edit video whilst running a web server on your laptop during the long train trip into work, the new Intel Core thing is going to rock your world. But for Brycing, even the fastest one available now will be lucky to beat my near four-year-old machine.
Parallelism is the future, and unless Bryce signs up for multi-threaded duty it is dead. But for now, save your money.