Quote - Not to me it isn't, I like being able to follow your progress & how you thrash the problems as they arise, if you decide to scoot off to some dark cave to perform your python sourcery in private we'd have to have regular reports at the very least.
They can still have a place to collaborate quietly and still post progress reports here. I don't see a problem ;o). In this thread as it stands, they have to root thru all the other posts to get to a post by one of the collaborators, which, were it me, would be irritating...lmao.
I'll go ahead and set up a private forum for the programmers. If you want to use it, sitemail me here and give the username you'll be using and I'll give you access. Persons that aren't helping won't be given access. It's as simple as that ;o). It's waiting for you if you want it. If you don't I can always just delete it ;o).
Laurie
But I don't want to ignore kawecki - sometimes he says something useful.
I want his behavior to change - I want him to stop confusing people with contradictions, or with statements of absolute certainty that are not merely doubtful but demonstrably false.
Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)
Quote - But I don't want to ignore kawecki - sometimes he says something useful.
I want his behavior to change - I want him to stop confusing people with contradictions, or with statements of absolute certainty that are not merely doubtful but demonstrably false.
He says what he thinks is fact. You have two choices: Either tell him he's wrong and why (because after all these years he's still the same) or take all your toys and go home because of one guy. It's up to you. I can guarantee that if you tell him he's wrong and why, it will be a waste of your time and energy. Others have tried. So it's ignore him or quit playing ;o).
Laurie
This is rather silly. Kawecki has some good programming knowledge - at least it looks that way to a bystander who can vaguely follow logic, if not so much numbers and variables - and this really has little to do with getting Poser stuff into Lux (i.e., there's still plenty of other things to do, if gamma correction isn't one you wish to contribute to). If the majority want gamma correction as a part of that process, then arguing really won't help. If you still mightily object, you can find ways around it, I'm sure. This is only one detail of a much larger project... it's not worth getting bogged down in whether it's necessary or not. Maybe consider it a challenge to undo it later, privately, if you wish.
----------------------------------------------
currently using Poser Pro 2014, Win 10
Content Advisory! This message contains profanity
Quote - This is rather silly. Kawecki has some good programming knowledge - at least it looks that way to a bystander who can vaguely follow logic, if not so much numbers and variables - and this really has little to do with getting Poser stuff into Lux (i.e., there's still plenty of other things to do, if gamma correction isn't one you wish to contribute to). If the majority want gamma correction as a part of that process, then arguing really won't help. If you still mightily object, you can find ways around it, I'm sure. This is only one detail of a much larger project... it's not worth getting bogged down in whether it's necessary or not. Maybe consider it a challenge to undo it later, privately, if you wish.
Well, shit happens ;o). People get testy (I sure as hell have). And with all the eccentric personalities and egos around here, this kind of thing is bound to happen.
I personally will not beg anyone to keep going if they don't want to. Especially not BB. He'll do what he will do (which is his prerogative).
Laurie
Why do old 14" monitors have dark gray colors?
In a simple monitor, the RGB signal from your computer goes to some transistors and then to the cathode of the tube.
In more advanced and modern monitors, the RGB signal goes to an integrated circuit first, and only then to the transistors and the cathode of the tube.
Why was this integrated circuit added? To make you waste money?
Quote - I am overwhelmed and depressed with kawecki's near-constant disinformation campaigns.
I am not going further in this thread; gamma correction needs a thread of its own.
BB, it is you who are misinformed, and it doesn't matter that the same trolls always cheer you on. At least you do many things and have some knowledge, but the others are nothing more than ignorant trolls, as any troll is.
For those who want to get informed: there are many manufacturers and models of integrated circuits used in the RGB path. Just get the datasheets for these ICs from the manufacturer and study what they do; you will learn something new.
Stupidity also evolves!
Quote - Why do old 14" monitors have dark gray colors?
In a simple monitor, the RGB signal from your computer goes to some transistors and then to the cathode of the tube.
In more advanced and modern monitors, the RGB signal goes to an integrated circuit first, and only then to the transistors and the cathode of the tube.
Why was this integrated circuit added? To make you waste money?

Quote - I am overwhelmed and depressed with kawecki's near-constant disinformation campaigns.
I am not going further in this thread; gamma correction needs a thread of its own.

BB, it is you who are misinformed, and it doesn't matter that the same trolls always cheer you on. At least you do many things and have some knowledge, but the others are nothing more than ignorant trolls, as any troll is. For those who want to get informed: there are many manufacturers and models of integrated circuits used in the RGB path. Just get the datasheets for these ICs from the manufacturer and study what they do; you will learn something new.
I rest my case.
Laurie
@BB:
Quote - Now it's that sRGB colors are linear and do not need to be anti-gamma corrected.
Seems to me that Kawecki (and perhaps others) don't understand what anti-gamma correction means.
@all:
Most images (even those taken from a camera) are already gamma-corrected. To use those images in a render (as background or texture) the gamma effect has to be reversed.
At least if the render engine applies gamma correction to its output (more precisely: to the LUT = Look-Up Table in use).
With Lux this output gamma correction can be done at any time - even after the image is completely rendered (back to topic)!
In a perfect world it would be up to the user to anti-gamma correct the images they use, simply because software can't know whether an image is actually gamma-corrected or not. But in the Poser world many people don't care about what happens behind the scenes (if the resulting image looks good, everything is fine).
What BB is trying to do, as far as I understand, is to make a good guess: should he apply anti-gamma correction to loaded textures or not? Because most images are already gamma-corrected, anti-gamma correction by default is a good choice. Perhaps with a button in the user interface for exceptions.
But the real problem is finding an "embedded" gamma correction in a Poser material node set and correcting (removing) it. Or vice versa: is there a node set in front of an image that already does anti-gamma correction? Try to figure that out while looking at a complex Poser material and you'll get a clue what BB is trying to do in software - all just to help Poser users get a render result in Lux close to what they see in Poser.
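The anti-gamma correction described above boils down to a simple power-law conversion. A minimal sketch in Python, assuming the common approximation of a plain 2.2 gamma curve (the function names are mine for illustration, not from the exporter):

```python
# Minimal sketch of anti-gamma correction (linearization) of an
# 8-bit texture channel, assuming a plain 2.2 power curve.
# This mirrors what an exporter would do before handing texture
# colors to a linear-light renderer such as Lux.

def decode_gamma(value8, gamma=2.2):
    """Convert an encoded 8-bit channel (0-255) to linear light (0.0-1.0)."""
    return (value8 / 255.0) ** gamma

def encode_gamma(linear, gamma=2.2):
    """Convert linear light (0.0-1.0) back to an encoded 8-bit channel."""
    return int(round((linear ** (1.0 / gamma)) * 255))

# A mid-gray pixel value of 128 is much darker in linear light:
linear = decode_gamma(128)          # about 0.22, not 0.5
assert encode_gamma(linear) == 128  # round trip recovers the original
```

The asymmetry is the whole point: feed the encoded 0.5 into a linear-light renderer and it behaves like a surface that is more than twice too bright.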
Quote - With Lux this output gamma correction can be done at any time - even after the image is completely rendered (back to topic)!
You can also do that with Poser Pro if you export as HDR or OpenEXR, which are linear file formats by spec. The program that displays or converts the HDR/EXR then handles the gamma correction, and you can pick anything you want after the render.
Quote - Try to feed your monitor with an HDR image linearly downgraded to 8-bit RGB. Doesn't look good, right? Now correct it using a gamma of 2.2.
You must first understand what an HDR image is. To view an HDR image you need an HDR monitor; it is impossible with a normal monitor.
There are very few, and expensive, HDR cameras and monitors; if you don't have one, the only thing you can do is use a subset of the HDR image, do tricks, or make fake HDR.
HDR images can be used by the internal rendering process and can improve the scene, but only when the rendered scene has a final illumination with a normal dynamic range.
Stupidity also evolves!
Quote - > Quote - With Lux this output gamma correction can be done at any time - even after the image is completely rendered (back to topic)!
You can also do that with Poser Pro if you export as HDR or OpenEXR, which are linear file formats by spec. The program that displays or converts the HDR/EXR then handles the gamma correction, and you can pick anything you want after the render.
Thanks for this hint!
By the way: What about a post-gamma-dial for the next Poser Version? And color correction ... brightness .... :)
Quote - HDR images can be used by the internal rendering process and can improve the scene, but only when the rendered scene has a final illumination with a normal dynamic range.
Exactly this is the case with Lux and also with Poser's Firefly: internally the data is HDR. To get a nice JPG, you have to convert it (Poser does), and you'd better apply gamma correction.
See the last post from stewer.
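The HDR-to-JPG conversion described above - clamp the linear values, then gamma-encode for an 8-bit display - can be sketched roughly like this. It is a crude stand-in for illustration, not Lux's or Firefly's actual tonemapping operator:

```python
def tonemap_to_8bit(linear_value, gamma=2.2):
    """Clamp a linear HDR value to [0, 1] and gamma-encode it for an
    8-bit display. A deliberately crude stand-in for a real
    tonemapping operator, which would compress highlights instead
    of simply clipping them."""
    clamped = max(0.0, min(1.0, linear_value))
    return int(round((clamped ** (1.0 / gamma)) * 255))

# Over-bright HDR highlights (values above 1.0) simply clip to white:
print(tonemap_to_8bit(3.5))   # 255
# A linear mid-gray of 0.18 encodes to roughly the middle of the range:
print(tonemap_to_8bit(0.18))  # 117
```

Real tonemappers (Reinhard, linear with exposure, etc.) replace the `min(1.0, ...)` clip with a smooth compression curve; the final gamma encode stays the same.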
Quote - Most images (even those taken from a camera) are already gamma-corrected. To use those images in a render (as background or texture) the gamma effect has to be reversed.
Cameras do not apply gamma correction and are made to have a linear response. TV stations correct the gamma at another stage, not at the camera.
I don't know about digital TV; the standards are different.
Photographic images are also not linear, and much is corrected in the laboratory.
The non-linearity of a photograph is not the same as that of a cathode ray tube, and plasma displays have yet other non-linearities.
Photographic images are not made to be seen on monitors; they are made to be printed, so no cathode ray tube is involved and the gamma corrections are very different.
The famous x^2.2 curve only holds for a cathode ray tube at voltages near the grid cut-off voltage (black level). For other intensities the curve is more or less linear, and when the grid saturates it becomes very non-linear again (ultra white).
If you set the intensity of your monitor for strong daylight illumination and then change it to work at night with little ambient illumination, the gamma correction required in the two cases is different, so you would need to render one scene to watch by day and another to watch at night.
Of course you can add gamma correction to make your scene look nicer, and Photoshop has a lot more plugins to make your image nicer too.
Stupidity also evolves!
Quote - > Quote - Most images (even those taken from a camera) are already gamma-corrected. To use those images in a render (as background or texture) the gamma effect has to be reversed.
Cameras do not apply gamma correction and are made to have a linear response. TV stations correct the gamma at another stage, not at the camera.
I don't know about digital TV; the standards are different. Photographic images are also not linear, and much is corrected in the laboratory.
The non-linearity of a photograph is not the same as that of a cathode ray tube, and plasma displays have yet other non-linearities.
Photographic images are not made to be seen on monitors; they are made to be printed, so no cathode ray tube is involved and the gamma corrections are very different. (snip)
Ok, I'm a hardware guy not a coder and a lot of the math etc goes WAY over my head, but even I can figure out that in a thread about an external rendering engine and how to export to said engine, you don't threadjack it.
@Kawecki: I know you mean well, but if you want to have a diatribe about GC, please just create a new thread and let BB, ODF and ADP get on with what most of us think is a really cool thing.
Grabbin my popcorn and heading back to the shadows now.....
Dammit, now I'm seriously contemplating downloading Lux again on the new rig and having a play with it!!
As if I didn't have enough render engines to learn :biggrin:
[PS - how about a Poser to VRay exporter next ??? - (mostly kidding about that)]
My Freebies
Buy stuff on RedBubble
Kawecki spewed a bunch of MORE falsehoods, even though I specifically asked that there not be any more. He doesn't understand what he doesn't understand, he doesn't realize what trouble he causes for others trying to learn, and he doesn't understand that explaining the same things over and over to somebody who has a false belief system is infuriating.
I'm not abandoning the project, but I'm abandoning this thread.
Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)
Content Advisory! This message contains nudity
Finding it very hit-n-miss at the moment which textures are successfully exported.
Software: OS X 10.8 - Poser Pro 2012 SR2 - Luxrender 1.0RC3 - Pose2Lux
Hardware: iMac - 3.06 GHz Core2Duo - 12 GB RAM - ATI Radeon HD 4670 - 256 MB
Both of these were exported using ODF's alpha version python script.
Software: OS X 10.8 - Poser Pro 2012 SR2 - Luxrender 1.0RC3 - Pose2Lux
Hardware: iMac - 3.06 GHz Core2Duo - 12 GB RAM - ATI Radeon HD 4670 - 256 MB
Quote - Thanks kawecki. We all appreciate it very much.
BTW, that's called sarcasm.
So how do we all keep up with the progress then? Those that are coding and those that would like to test? I'm just curious ;o).
Laurie
I'm sure Bagginsbill will keep us posted, perhaps in another thread, once he's made more progress.
Not to revive a sore subject, but I did a Google on GC... digital camera sensors record the signal linearly (by all accounts), but all but a few specialised cameras apply GC to images to convert them to the sRGB specification.
I'd be curious where that other "info" came from. I guess misinformation also evolves, or so I've read, somewhere. :biggrin:
Monterey/Mint21.x/Win10 - Blender3.x - PP11.3(cm) - Musescore3.6.2
Wir sind gewohnt, daß die Menschen verhöhnen was sie nicht verstehen
[it is clear that humans have contempt for that which they do not understand]
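A side note on the sRGB specification mentioned above: it is not a pure 2.2 power curve but a piecewise function with a small linear segment near black, which is one reason "gamma is just x^2.2" arguments go astray. A sketch of the standard transfer functions, to the best of my reading of the spec:

```python
def linear_to_srgb(c):
    """Encode a linear-light value (0.0-1.0) per the sRGB spec:
    a linear segment near black, then a 2.4 power curve scaled so
    the overall response approximates gamma 2.2."""
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * (c ** (1.0 / 2.4)) - 0.055

def srgb_to_linear(s):
    """Decode an sRGB-encoded value back to linear light
    (the 'anti-gamma correction' of this thread)."""
    if s <= 0.04045:
        return s / 12.92
    return ((s + 0.055) / 1.055) ** 2.4

# Round-trip sanity check on a mid value:
assert abs(srgb_to_linear(linear_to_srgb(0.5)) - 0.5) < 1e-9
```

An exporter that assumes a plain 2.2 curve instead of this piecewise function is only slightly off, but the difference is visible in deep shadows.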
Another Lux render: www.renderosity.com/mod/gallery/index.php.
This one took 3 minutes to export from Poser and then I let it render for 2 hours.
For an alpha version the export script works quite well, but it's best to test the export after every new addition to your scene; there are quite a few objects and textures which fail to export.
Software: OS X 10.8 - Poser Pro 2012 SR2 - Luxrender 1.0RC3 - Pose2Lux
Hardware: iMac - 3.06 GHz Core2Duo - 12 GB RAM - ATI Radeon HD 4670 - 256 MB
I've been busy modeling and just got in late to this thread. Someone noted a need for beta testers. I can help with multi-thread and high-complexity scene tests; Cameron has the 3.33 GHz H/T dual hex processor pack (two Intel Xeon X5680 procs) and 96 GB RAM. Poser Pro 2010 defaults to 24 rendering threads. What do I need to get/do?
Poser 12, in feet.
OSes: Win7Prox64, Win7Ultx64
Silo Pro 2.5.6 64bit, Vue Infinite 2014.7, Genetica 4.0 Studio, UV Mapper Pro, UV Layout Pro, PhotoImpact X3, GIF Animator 5
Quote - I've been busy modeling and just got in late to this thread. Someone noted a need for beta testers. I can help with multi-thread and high-complexity scene tests; Cameron has the 3.33GHz H/T dual hex processor pack (two Intel Xeon x5680 procs) and 96Gb RAM. Poser Pro 2010 defaults to 24 rendering threads. What do I need to get/do?
I'm not sure we're in beta mode yet, but as far as I am concerned, knock yourself out. :D
Unless I've missed a recent PSA, this link should set you up with everything you need: http://www.poserprofis.de/PoserLuxExporter_Alpha.
-- I'm not mad at you, just Westphalian.
In this version however, the mesh is not actually welded at actor boundaries. That is because Poser does - as far as I know - not provide explicit information on whether and how to weld texture vertices. An actor boundary may or may not coincide with a texture seam, so it's important to test this explicitly; a test which I haven't implemented yet. That means that effectively I have a texture seam at each actor boundary, and the previously welded vertices are unwelded in the next stage. I'll fix that soon. It shouldn't have any visual effect on end users, anyway, unless one tells Lux to subdivide the mesh and generate new normals.
Now I need to go and catch up with the rest of the exporter.
-- I'm not mad at you, just Westphalian.
Quote - For my fellow coders, here's my latest contribution. The LuxportActor class (which I guess might be up for a name change) can now be instantiated with a figure rather than an actor object
Thanks! I'll add that now. Should be available within the next hour.
Name your files as you like :)
I have so much to do here that I can't work much on the project.
The Poser-Lux-Exporter is open source. Any Poser fan should have an interest in supporting this project, because it is a big step toward "reality rendering" your Poser scenes.
The geometry exporter part is written by ODF.
Material export and lighting is written by Bagginsbill.
If you can program in Python: welcome to the club! Grab the code and put your part into the project. The available code is easy to understand and follow; just wrap your code in a class.
If you are somebody with an understanding of how Lux handles materials: we are all waiting for your brilliant shaders.
If you are good with text: we need somebody able to document and describe things in a form a common Poser user can understand (HTML preferred).
Until yesterday, every detail of the project was discussed in this public forum, so even fans without programming skills could follow along, contribute test renders (an important part, because a render isn't done in minutes) and feel part of the project.
If you are really interested in this project, here are some important links:
http://www.luxrender.net/
Here you can find ready-to-install packages for Windows, Mac and Linux. Don't forget to read the documentation. You are on a professional stage with Lux - no click-and-go anymore (I'm kidding) ;)
http://www.poserprofis.de/PoserLuxExporter_Alpha
This package is a snapshot of the latest exporter sources. Just run the first script (PoserLuxExporter.py) to export from Poser to files Lux can use.
This project is at an early stage and far from being perfect (alpha stage).
Here is a rather drastic illustration of an effect BB mentioned a while ago in this thread. I came across this low-light situation by pure accident - a spotlight pointing astray - and rather liked it. But as you see in the image on the left, which was rendered from an unaltered export file, Lux can have a hard time shading a relatively coarse mesh like Antonia's correctly. (Please ignore that that side is also grainier. I just didn't have the patience to let it render 3 hours like the right side.)
The image on the right was rendered after adding the following two lines to every shape definition in the .lxo file:
"integer
nsubdivlevels" [2]<br></br>
"bool dmnormalsmooth"
["true"]<br></br>
This tells Lux to subdivide the mesh twice (each step turning every triangle into four smaller ones) and to interpolate the normals for the subdivided mesh. This gets rid of the nasty artifacts at the terminators, but now we have a new problem. To make the texture mapping work correctly, the exporter has to cut the mesh apart at texture seams. As a consequence, Lux takes these seams as mesh boundaries and does not interpolate normals across them. Also, where the cutting of the mesh creates a corner, the subdivision process results in a small gap in the mesh which is visible as a black dot.
Obviously, I am hoping that we'll soon be able to use per-polygon UV coordinates in Lux (or the slightly modified Lux-derivative bagginsbill talked about), which will eliminate the latter problem and enable us to have Lux subdivide the mesh for us. But until that happens, I'm considering adding an optional subdivision step to the exporter. A subdivided mesh will obviously result in considerably longer processing times, larger output files and also more work for Lux when importing the mesh. So it would not make much sense to subdivide by default. It really depends on the scene, and maybe even the object that's being exported.
-- I'm not mad at you, just Westphalian.
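ODF's two-line edit to every shape definition can be automated with a small post-processing script. This is only a sketch, under the assumption that each shape declaration in the exported .lxo starts on its own line with the word `Shape`; the exact file layout may differ between exporter and Lux versions:

```python
def add_subdivision(lxo_text, levels=2):
    """Append Lux subdivision directives after every Shape declaration
    in an exported .lxo scene. Assumes each shape definition starts on
    its own line with the word 'Shape'; adjust for your Lux version."""
    out = []
    for line in lxo_text.splitlines():
        out.append(line)
        if line.lstrip().startswith('Shape'):
            out.append('  "integer nsubdivlevels" [%d]' % levels)
            out.append('  "bool dmnormalsmooth" ["true"]')
    return '\n'.join(out)

# Hypothetical usage (file names are examples, not from the exporter):
# with open('toLux/scene.lxo') as f:
#     patched = add_subdivision(f.read())
# with open('toLux/scene_subdiv.lxo', 'w') as f:
#     f.write(patched)
```

Since subdivision is per-shape, the same approach could easily be restricted to specific meshes (e.g. only shapes whose name contains "Antonia") rather than patching the whole scene.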
Can someone give me a little jump start on how to use the exporter, please?
I'm pretty curious about it ^^
Ok, so I have a scene ready in PP2010 and run the PoserLuxExporter.py (from the previously mentioned alpha vers. of adp001's latest post) from File -> Run Python.
I then get this new pop-up window with lots of info lines in it.
But what's next? .. Do I get a new file or batch of files? I can't seem to find anything besides a string of 3 LuxExporterParameters.conf.bak, .dat and .dir files.
Quote - Downloaded it - this is what I get
File "/Applications/Poser 8/Runtime/Python/poserScripts/LuxExporter/luxmatic/matcore.py", line 503, in emitLux
print >>s, 'Texture "%s" "%s" "%s"' % (name, vtype, self.LuxType)
AttributeError: 'Turbulence' object has no attribute 'LuxType'
Sorry, Bagginsbill isn't finished converting all material nodes from Poser to Lux yet.
I'm not sure at the moment whether he will make another intermediate version.
As BB previously stated, Lux does not support some of the procedural functions that Poser uses for procedural textures (he's working on that). So, if you have any procedural textures in your scene (non-image map based), it won't be able to convert some of them. Looks like turbulence is one Lux doesn't have. Yet ;o).
Laurie
Quote - Can someone give me a little jump start on how to use the exporter, please?
I'm pretty curious about it ^^ Ok, so I have a scene ready in PP2010 and run the PoserLuxExporter.py (from the previously mentioned alpha vers. of adp001's latest post) from File -> Run Python.
I then get this new pop-up window with lots of info lines in it.
But what's next? .. Do I get a new file or batch of files? I can't seem to find anything beside a string of 3 LuxExporterParameters.conf.bak, .dat and .dir files.
Look into the directory you started from.
You should see a newly created folder named "toLux".
You should find at least 3 files in this folder. The file with the extension "lxo" is the one you load into Lux.
If the path/files are not created, please post a screenshot or the content of your "new pop-up window" (this is what we call the "Python Status Window").
Quote - For my fellow coders, here's my latest contribution. The LuxportActor class (which I guess might be up for a name change) can now be instantiated with a figure rather than an actor object, in which case it treats that whole figure as a single mesh, collecting and using whatever welding information it can gather from Poser. As a result, the normals generated are now consistent on actor boundaries, and fewer mesh objects (one for each material present in the figure instead of one for each material in each actor) are exported to Lux.
In this version however, the mesh is not actually welded at actor boundaries. That is because Poser does - as far as I know - not provide explicit information on whether and how to weld texture vertices. An actor boundary may or may not coincide with a texture seam, so it's important to test this explicitly; a test which I haven't implemented yet. That means that effectively I have a texture seam at each actor boundary, and the previously welded vertices are unwelded in the next stage. I'll fix that soon. It shouldn't have any visual effect on end users, anyway, unless one tells Lux to subdivide the mesh and generate new normals.
Now I need to go and catch up with the rest of the exporter.
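The explicit seam test ODF describes could look roughly like this: a boundary vertex shared by two actors only needs to stay unwelded if the UV coordinates assigned to it differ between the two sides. A hypothetical sketch, not the actual exporter code:

```python
def is_texture_seam(uv_a, uv_b, tolerance=1e-6):
    """Return True if two UV coordinates assigned to the same geometric
    vertex differ, i.e. the actor boundary is also a texture seam and
    the vertex must stay unwelded. uv_a/uv_b are (u, v) tuples."""
    return (abs(uv_a[0] - uv_b[0]) > tolerance or
            abs(uv_a[1] - uv_b[1]) > tolerance)

# Identical UVs on both actors: safe to weld, and Lux may interpolate
# normals across the boundary.
print(is_texture_seam((0.25, 0.5), (0.25, 0.5)))  # False
# Different UVs: a real texture seam, so the vertex stays duplicated.
print(is_texture_seam((0.25, 0.5), (0.75, 0.5)))  # True
```

The tolerance matters because Poser may store the "same" UV with slight floating-point noise on the two sides of a boundary; an exact equality test would report spurious seams.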
My god, ODF, what did you do with the geometry exporter?
It's soooooooooooooo amazingly fast now! Unbelievable ... really just Python? No hidden C++?
Avoiding grain in Lux renders
Tab "Noise Reduction".
Select "Regularization". Reduce "Amplitude" to 50 and check "Enable".
Select "Adv". For "Interpolation type" select "Runge-Kutta".
At the bottom click "Apply".
Watch the status line. It will show "Activity: Tonemapping" for a while. If you play with "Amplitude" or other parameters, be aware that you may not see the result immediately; the tonemapping process needs time to compute. Click "Apply" again once the activity field is empty.
You can click "Reset" at any time to get back to what you started with.
Try the "Color Space" tab. Here you can change the base color your render uses (night, daylight, etc.).
Note that nothing you do in the left tabs interrupts the render process.