In 3ds Max, for example, it is possible to split the actions into a Base and a Per Scene layer (add or override), as shown in this capture from the 3ds Max Graph Editor (note that the repeating scene names are a bug).
Showing captures from commercial software is highly discouraged here (as a legal precaution). You should instead describe how this would work in Blender, without resorting to explanations of how things work in other software.
This does work, but it is not very flexible. If you add a new object to the original collection, you have to link that object to every duplicated collection as well. This gets confusing really fast if you have several scenes with duplicated collections.
OK. What about having a single file with all of the scene's content in a collection? Then you could have a second file that links that collection (multiple times, even across multiple scenes). That setup would allow you to use library overrides, and indeed to have different overrides in different instances of the scene.
Yes, I do that sometimes and it works fine. I think overriding actions was added in 3.0, right?
Sometimes it is more convenient not to use linked files, though, and that is where scene overrides would be very welcome.
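The base-plus-per-scene split described at the top of the thread can be modeled as layered lookups: each scene resolves an action against its own override map first, then falls back to the shared base. A minimal Python sketch of that idea (the action and scene names here are invented for illustration):

```python
from collections import ChainMap

# Base set of actions shared by every scene (names are illustrative).
base_actions = {"walk": "walk_cycle_v1", "idle": "idle_v1"}

# Per-scene layers: each may override base actions or add new ones.
scene_a = ChainMap({"idle": "idle_nervous"}, base_actions)  # override
scene_b = ChainMap({"wave": "wave_v1"}, base_actions)       # add

print(scene_a["idle"])  # scene A's override wins
print(scene_a["walk"])  # falls back to the base action
print(scene_b["wave"])  # scene B's addition
print(scene_b["idle"])  # unchanged base action
```

The key property is that the base stays a single source of truth: adding a new action to `base_actions` makes it visible in every scene without touching the per-scene layers, which is exactly what duplicated collections fail to give you.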
I'm new to Blender (only my second day) and am following a tutorial by Blender Guru. I noticed something wasn't quite right when he had the live render running and his Blender ran as usual, while my computer seemed to struggle with Blender and would also cause YouTube to hang and stutter. When I go to render a simple scene containing a plate, doughnut, and a mug, it crashes. The crash makes both of my monitors go black, except for some noise (little squares of color) in the top third of my 2K monitor; then Blender closes. I tried following what other people suggested where it was applicable: making sure my Subdivision Surface level was only at 2 for everything when rendering, and setting my memory usage to zero to allow unlimited memory. Nothing seems to work, though. Any help on this would be amazing, as I'm hoping to continue using this application in the future.
EDIT: We just tested it on my s/o's machine, and we have the same monitor. It ran just fine on hers all the way up to 4K, and she has an older-generation CPU AND a step down in both GPU and memory (only 16 GB of RAM)... what is the deal?
Start Blender. If you can't, and Blender still crashes, the second thing to check is a faulty add-on. Do you remember adding an add-on recently? If so, disable it and delete it from Blender's add-ons folder: C:\Users\3DCompositor\AppData\Roaming\Blender Foundation\Blender\2.79\scripts\addons
Last but not least, after all of these steps, try running Blender with only one monitor (yes, only one after cleaning) for a regular 10-minute session. If you don't notice anything weird, proceed with the second monitor (connect your monitors with the PC off!) and work in Blender for 10 more minutes. If you didn't have any crashes, turn the PC off, connect the third monitor, and work in Blender for another 10 minutes.
jeanfreedom (posted October 6, 2021): Not even sure if I'm in the right place or if I'll get a response, but I am a creator who is wondering if it's possible to import highly detailed scenes from Blender into the game. The reason is that I would like a specific scene for my main store. If it cannot be done, then maybe it is possible to import some of the props from the scene and have a landscaper within Second Life recreate the other aspects as closely as possible to the Blender scene. I'm hoping that somehow it is possible, because that would be a huge relief. Since I'm looking for an honest and realistic response, though, just be straightforward with me, even if I'm actually typing this in the wrong place XD.
First of all, I am really new to all of this. This is my first time, so I'm confused about whether it's better to create a scene and then import it into Unity, or to create objects like characters and chairs in Blender and build the scenes in Unity. I just want to recreate a scene of my school and include it in my final project as a Unity game, where it will have no function other than walking around, some jumping, and fighting. My project is not based on the game; the scene will just support it.
There's nothing wrong with creating your scene in Blender and then importing it into Unity; in fact, you can import the .blend file directly and get the lighting much as you had it in Blender. However, Unity and Blender render scenes differently, so none of the post-processing effects will transfer to Unity, and you'll have to tweak a lot of settings there to get the look you want. I suggest making your assets in Blender if you feel it's faster to work with, then exporting them to Unity and setting up the rest to your preference. It can be a bit confusing at first, but it all comes down to personal preference.
Example: You are working on a simple scene that has a room with a table, a chair, and a character. In Blender, create and save a file with the complete room model (without the furnishings). Next, make and save a table file with the table model and all its details (perhaps a couple of knives, forks, and plates). Make and save a chair .blend, then a separate character .blend... this continues until you are done with the objects needed for your scene. All of these would be saved in a "room scene" folder (outside of your Unity project). Export all the models as .fbx files. Finally, import these into Unity with their (named) materials, import your textures, put the objects together in your scene, and set up lighting, cameras, particle FX...
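One small helper can keep the per-asset file layout above consistent: map each .blend in the scene folder to the .fbx path it should be exported to. A sketch in plain Python (the folder and file names here are hypothetical, not part of the original workflow):

```python
from pathlib import Path

def fbx_export_path(blend_file: Path, export_dir: Path) -> Path:
    """Return the .fbx path that corresponds to a .blend asset file."""
    return export_dir / blend_file.with_suffix(".fbx").name

# Hypothetical "room scene" folder kept outside the Unity project.
assets = [Path("room_scene/room.blend"),
          Path("room_scene/table.blend"),
          Path("room_scene/chair.blend"),
          Path("room_scene/character.blend")]

export_dir = Path("unity_project/Assets/Models")
exports = [fbx_export_path(blend, export_dir) for blend in assets]
for path in exports:
    print(path)
```

Keeping the .blend name and the .fbx name identical makes it obvious in Unity which source file a model came from when you need to go back and edit it.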
Example 2: You are working on a game that programmatically generates levels (e.g. dungeons, mazes). The best approach for this type of game is to create modular components that get assembled by scripting. These components would be things like a hallway piece, a room piece, a turning hall, etc. To make these, model a base for the wall and floor, maybe with structural details (windows, doors...), then save that as a file and, of course, export it to .fbx for Unity. Make some detail objects like those mentioned in Example 1, then import the models into Unity. Create prefabs for the hallway, room, etc. by parenting the details to the base, then instantiate the prefabs from a script.
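The generation step can be sketched engine-agnostically: a script walks a grid plan and decides which modular piece (room, hallway, corner) belongs at each cell. A simplified Python sketch of that placement rule (the piece names and layout are invented; in Unity, the final step for each cell would be an `Instantiate` call on the matching prefab):

```python
def classify_pieces(path):
    """Given a list of (x, y) grid cells forming a corridor, pick a
    modular piece for each cell: rooms at the ends, corners at turns,
    hallways in between."""
    pieces = []
    for i, cell in enumerate(path):
        if i == 0 or i == len(path) - 1:
            pieces.append((cell, "room"))
            continue
        (ax, ay), (bx, by) = path[i - 1], path[i + 1]
        # Straight section if the previous and next cells share a
        # row or a column; otherwise the corridor turns here.
        straight = (ax == bx) or (ay == by)
        pieces.append((cell, "hallway" if straight else "corner"))
    return pieces

# An L-shaped layout: east along y=0, then north at x=2.
layout = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
for cell, piece in classify_pieces(layout):
    print(cell, piece)
```

This is why the modular pieces need consistent dimensions and pivot points: the script only ever reasons about grid cells, and the prefabs must line up when dropped onto them.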
However, I immediately had to think of a YouTube video: in the Boundary Break episode about Telltale Games's The Walking Dead: Season 1, one of the original developers speaks a bit about their experience of creating whole scenes for the game in a 3D editor rather than in their game engine:
If this is a project that other people will eventually work on, there's a good reason to create individual assets in Blender but assemble the scene in Unity: this is likely to make the project more accessible for other developers who have Unity experience but not Blender experience. The Unity Editor is targeted at game designers, level designers, and developers. Blender is targeted at 3D artists. Many Unity developers who are comfortable assembling scenes, setting up materials, etc in the Unity Editor may not have any idea how to use Blender and may struggle with learning its more complex user interface and material settings. Team members who are learning Blender for the first time may get frustrated with online tutorials, as the Blender team has completely redesigned the entire UI several times and it's extremely difficult to follow older tutorials in the newest interface.
Thanks for checking for me. I am not really doing much today; I saw this post from the couch this morning. As soon as I saw that the OP did not make the .blend, and that it looks like crap in both formats, I knew it was material settings that have no equivalent. Answering from the couch has been blowing up in my face lately, so I did not reply.
The non-optimized issues are concepts like multi-materials. You can certainly use multi-materials in Babylon, but it is a heavier process at runtime because materials are assigned per submesh rather than one per mesh. You can see in this light fixture that three materials are assigned to the mesh:
Ideally, if you were to optimize the scene, you would UV-unwrap the light so it can use one material, with UV islands controlling your color breaks. In the case of materials where you simply want a different set of material factors rather than textures, combine all meshes that use each material. For example, if all of the diffusers on the can lights use the same material, rather than combining each diffuser into its individual light, combine all diffusers into one mesh with one material assigned. Do this for each material needed in the scene to avoid some of the overhead and export issues of multi-materials.
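The merge-by-material pass described above boils down to grouping: collect every mesh that uses a given material, then combine each group into one mesh. A sketch of the grouping step in plain Python (the mesh and material names are invented; the actual merge would happen in Blender or Babylon):

```python
from collections import defaultdict

# (mesh name, material name) pairs for a hypothetical set of can lights.
meshes = [("light1_diffuser", "diffuser_mat"),
          ("light2_diffuser", "diffuser_mat"),
          ("light1_housing", "metal_mat"),
          ("light2_housing", "metal_mat"),
          ("light1_trim", "trim_mat")]

# Group meshes by the material they use; each group becomes one
# merged mesh with a single material, so five meshes collapse into
# three single-material draw targets instead of two multi-material ones.
groups = defaultdict(list)
for mesh, material in meshes:
    groups[material].append(mesh)

for material, members in groups.items():
    print(material, "->", members)
```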
In this example, the material uses the same texture for base color and roughness, applying a curve operation to the texture before feeding it into the roughness input. It also manipulates the Y axis of the normal texture and scales the tiling of all the textures. The texture tiling can be handled with KHR_texture_transform, which will export correctly with this setup. However, if you look at the Blender manual section on setting up materials for glTF export, you will see that the exporter looks for PNG and JPG textures assigned to the shader. Any operation done in the shader between the texture and the input will likely be ignored, apart from the texture transform, which comes before the texture.
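Since the texture transform is the one shader operation that survives export, you can verify it on the glTF side: the offset/scale land in a `KHR_texture_transform` extension on the texture reference inside the material. A sketch that reads that extension from a minimal material (the JSON here is hand-written for illustration, not produced by the exporter):

```python
import json

# Minimal hand-written glTF material carrying a KHR_texture_transform
# on its base color texture (structure follows the glTF 2.0 spec).
material_json = json.loads("""
{
  "pbrMetallicRoughness": {
    "baseColorTexture": {
      "index": 0,
      "extensions": {
        "KHR_texture_transform": {"offset": [0.0, 0.0], "scale": [4.0, 4.0]}
      }
    }
  }
}
""")

def texture_transform(material):
    """Return the KHR_texture_transform dict of the base color texture, if present."""
    tex = material.get("pbrMetallicRoughness", {}).get("baseColorTexture", {})
    return tex.get("extensions", {}).get("KHR_texture_transform")

transform = texture_transform(material_json)
print(transform)
```

Checking the exported file this way is a quick sanity test that the tiling made it through while, say, a curve or color-ramp node silently did not.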