Re: Hdri Images Download

Genciana Haggins

Jul 8, 2024, 11:37:20 AM
to feedbmacolnie

I think we can all agree that Enscape is such a useful and powerful tool in many ways for professionals creating architectural renderings in SketchUp. However, I have run into an issue that I can't seem to resolve, regarding the SkyBox functionality.

While it's great to be able to download custom HDRI images from various online resources and install them via the SkyBox settings window, there seems to be no way to scale or position the background image/scene. In my opinion this renders the SkyBox function useless to architectural designers. Most imported HDRI images render too large in scale relative to the foreground building, whether in my own Enscape projects or in demonstrations I've seen in tutorials.

I would say the scaling depends on the camera placement during the real-world shoot. Most HDRIs are shot too low above the ground, and you can't change that later, so the foreground is too big. The best option would be to get HDRIs shot a few meters above the ground.

From my time using LightWave 3D: it has long had the ability to add and scale the sphere that forms the skybox, which also affects the scale of the HDRI image, since the image is essentially mapped onto the sphere's surface. That seems like the obvious solution here, but I don't know how easy it would be to implement in Enscape.

I was attempting an object that would sit in an existing interior space. I have a skybox shot with a GoPro that was about 3 meters in the air. I can't figure out a way to "scale" the object within the photo so that it appears to touch the ground and be the proper scale. Would it be possible to add a way to connect a camera height or distance above ground to the origin point of SketchUp? I'm thinking along the lines of SketchUp's Match Photo feature. Otherwise I'm in the same situation as Joshmgrabo: I have an object with no way to control its placement in the world.

I've tried 3 or 4 different HDR images for my exterior scene and I keep getting a very dark background image. Everything I change brightens up the building, but the sky and lighting don't change. I bring the images into the material editor (VRayHDRI texture), set the mapping to spherical, and drag the texture to the dome light texture slot and the environment slot.

I tried increasing the overall multiplier and the render multiplier in the material and only get a very slight change; past 2 on the render multiplier nothing changes. Decreasing the inverse gamma blows out the building while the background is still dark.

I would turn off your V-Ray sun for a start and deal with one variable at a time. Adjust the HDRI or dome light multiplier to get the illumination you want; how does the background look then? Are you having the same problem with other HDRIs?

As mentioned above, stick to the HDRI only at first. The problem I think you are having is that the HDR illuminates the scene, and then your V-Ray sun adds additional light to the model; but the V-Ray sun does not affect what you see as the background (the HDRI), so the geometry in your scene is illuminated more while the background appears too dark. The gamma adjustment on the HDR is there to boost the sun's intensity relative to the rest of the HDRI, so if you don't get shadows as strong as you want at the default value (1), try lowering it to around 0.75. With this "trick" you shouldn't need the additional V-Ray sun to get strong shadows, and your image will look more coherent straight out of the box, so to speak, instead of having to adjust the sky in post. Hope this made some sense. Good luck!

It's also worth noting that not all HDRIs are meant to produce strong shadows; an overcast one is going to give much more diffuse light than a clear-sky one. That's what makes HDRI lighting look that much more convincing. Not everything requires razor-sharp shadows.

If that's so, then the very fact that it's an HDRI may answer your question. An HDRI contains much more light information than a simple (low dynamic range) image, so when you increase or decrease the multiplier the HDRI is less visually reactive to those changes; all of that extra light information means it works over a larger range.

Perhaps if you take the HDRI into Photoshop, flatten it to a JPG or similar, put that in your environment slot, and keep the HDRI in the dome, you'll see a more representative reaction to changes in light exposure.

The problem is that he's doubling up on his lighting by using a VRay sun, and the background thus isn't representative of that. Get rid of the VRay sun and set the exposure correctly and it'll look fine.

I also believe that the V-Ray sun is the problem. You are using the V-Ray dome light and the V-Ray sun as light sources and the VRayHDRI texture as the environment background. I suggest that you turn off the V-Ray sun, and if you still have the same problem, modify the exposure of the sun in the HDRI in Photoshop. You can do it in 32-bit mode by selecting the sun and adjusting the exposure of the selected area.

I understand V-Ray is confusing, which this discussion perfectly illustrates, but the original author should just follow a SINGLE tutorial from A to Z rather than all the random ones he has seen, because the mix obviously isn't working well.

Hi, I'm having an issue seeing my HDRI image in my aiSkyDomeLight. I've tried everything but can't seem to get it to show up in the render. It lights the scene correctly, but I need it to show up in the background as well. Please help!

Go to the skydome light's attributes, ensure Camera is set to 1.0 in the Visibility section, and then disable the Use Color Temperature switch. You can also use two domes, one for lighting and another for the background; the aforementioned settings should then be opposite for the two lights.
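If it helps, the Camera visibility mentioned above can also be toggled from the Script Editor. Below is a minimal Python (maya.cmds) sketch, assuming Maya with Arnold (MtoA) loaded; the shape name is a placeholder, and the .camera attribute is my assumption of what the Visibility > Camera slider maps to:

```python
import maya.cmds as cmds

dome = "aiSkyDomeLightShape1"  # replace with your own skydome light shape

# 1.0 = the dome shows up as the render background, 0.0 = hidden from camera rays.
cmds.setAttr(dome + ".camera", 1.0)

# For the two-dome setup described above, the lighting-only dome would instead
# get camera visibility 0, while the background-only dome keeps camera at 1.0
# and has its other contributions (Diffuse, Specular, etc.) turned down to 0
# in the Attribute Editor so it doesn't double the illumination.
```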

Hey, thanks so much for your response. I did see a tutorial on this; however, I'm unable to find any visibility attributes to adjust in my light shape panel (or perhaps I'm being extremely blind :) ) - images attached.

As I see from your screenshots, the only thing missing is Camera visibility, which may be on by default. Alternatively, you can create a large poly sphere and assign it a material equivalent to Maya's Surface Shader, with the HDRI connected to its outColor. Also, turn off shadow casting in its Render Stats.
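For reference, here is a rough Python (maya.cmds) sketch of that sphere workaround, assuming a lat-long HDRI; the file path and all node names are placeholders of my own, not anything from the thread:

```python
import maya.cmds as cmds

# Large sphere surrounding the scene; set the radius well beyond your geometry.
sphere_xform, _ = cmds.polySphere(name="bgSphere", radius=1000)

# A surface shader ignores lighting, so the HDRI renders at full brightness.
shader = cmds.shadingNode("surfaceShader", asShader=True, name="bgShader")
hdri_file = cmds.shadingNode("file", asTexture=True, name="bgHdri")
cmds.setAttr(hdri_file + ".fileTextureName", "C:/textures/my_hdri.exr", type="string")
cmds.connectAttr(hdri_file + ".outColor", shader + ".outColor", force=True)

# Assign the shader to the sphere via its own shading group.
sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True, name="bgShaderSG")
cmds.connectAttr(shader + ".outColor", sg + ".surfaceShader", force=True)
cmds.sets(sphere_xform, edit=True, forceElement=sg)

# Render Stats: stop the sphere from casting shadows onto the rest of the scene.
sphere_shape = cmds.listRelatives(sphere_xform, shapes=True)[0]
cmds.setAttr(sphere_shape + ".castsShadows", 0)
```

Depending on the image, you may also need to flip the sphere's normals or adjust its UVs so the HDRI reads correctly from the inside.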

All of the forums I've read say to turn the camera visibility to 0, which I did, but it still reflects. I tried turning off each slider in the visibility attributes, but none of them removes the reflection while leaving a decent result. I read this question in the Maya forum: -shading-lighting-and/hdri-environment-reflections/m-p/5438916#M1...

Is the glass the only thing in the scene? If so, why not just unlink the HDRI from the Color attribute of the skydome? That will illuminate everything evenly and not create any reflections. Or, if it is the color in the reflection that bothers you, use your image-editing software of choice to turn the HDRI grayscale.

If the glass is supposed to be the only thing in the scene without the reflection and the HDRI should reflect in everything else, create another identical skydome without an HDRI and use light linking so that it illuminates only the glass, while the original skydome no longer influences the glass. But that can create uncanny lighting situations.
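For illustration, a rough Python (maya.cmds) sketch of that light-linking idea; all node names are placeholders, and with Arnold you may also need to enable light linking in the Render Settings for the links to be honoured at render time:

```python
import maya.cmds as cmds

hdri_dome = "aiSkyDomeLightShape1"   # dome with the HDRI in its Color slot
plain_dome = "aiSkyDomeLightShape2"  # identical dome without the HDRI
glass = "glassJar"                   # the object that should not see the HDRI

# Unlink the HDRI dome from the glass ("b" is the short form of the -break
# flag, since "break" is a reserved word in Python).
cmds.lightlink(b=True, light=hdri_dome, object=glass)

# New lights are linked to everything by default, so unlink the plain dome
# from all meshes first, then link it back to just the glass.
for mesh in cmds.ls(type="mesh"):
    cmds.lightlink(b=True, light=plain_dome, object=mesh)
cmds.lightlink(make=True, light=plain_dome, object=glass)
```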

The glass is not the only thing in the scene. I just hid everything to show how the hdri was reflecting into the jar. I'm not sure I know how to unlink the hdri from the skydome... but won't that remove the lighting effect of it as well? I want the lighting, just not the reflection. If I just use a skydome without an hdri, everything just looks flat.. and really bad, lol.

I don't mind other objects reflecting in the glass, but because of the placement of the jar within the scene, it reflects the image of the HDRI along with the reflections of all the other objects as well.

Yes, unlinking the HDRI will remove the lighting effect. The problem you are having is kind of unsolvable, because the reflection literally is the lighting effect. Arnold is a physical renderer so if a light gets reflected it gets reflected realistically, so in order to lose the reflection you have to lose the light on the glass.

The lighting effect will still be there, just in grey tones. This normally makes the scene blend together a bit better if the HDRI doesn't closely represent the environment you are modelling.

Unlike the problem the OP in that thread has, my problem is only with HDR (as far as I know); images I load in Photo look just fine. I can also mention that I've tried out a lot of HDR programs over the last few days and this hasn't happened with any of the others, so I suspect it's a setting in Photo that I don't understand rather than a computer setting. I haven't changed anything from the defaults in Photo, and I used the defaults for the HDR merge too, except that I unticked denoising.

Attached are crops from the two images I tried to combine and the HDR result. As you can see, the red house has become green and the blue sky has become purple. The HDR part didn't go well either, but that's another matter.


On the assumption that what you wanted was to give detail to the clouds and sky, I selected the clouds and sky from the dark image using the flood select tool, copied this and pasted it onto the light image:

Hi @charr, I've tried out your two sample TIFF files and the HDR merge seems to work fine for me. You don't need to change anything about your shooting process, Photo will equalise all exposures you give it then align by feature matching. If you only need two images to capture the whole dynamic range of the scene, that's fine.
