I’m experimenting with Matterport OBJ files to create virtual walkthroughs of historic heritage sites. I’ve been trying to bake a model of a large church, to no avail. The only change I’ve made to the model after import is placing a lot of point lights throughout the model near the windows to simulate the sun (I want to retain the stained-glass textures rather than making them transparent).
While the preview turns out great, every time I’ve baked, the model just looks black (see attached). Any idea what the issue could be?
I have the same problem. My model is a glasshouse with many trees in it. When I try to bake, the bake stops and the result shows everything turned black.
I tried to solve it by reducing the number of tree models, and that worked, but now my glasshouse has too much empty space.
Why do I have this problem, and is there any way to resolve it?
There is also a new WebVR browser called Supermedium: https://www.supermedium.com/ We haven’t tested it yet, but if you would like to try it and share your results on the forum, that would be great!
@Chirag_Jindal after discussing this in the team, we are not 100% sure that the GPU baking error is a memory issue, as the GTX TITAN has plenty of memory. Could you upload the scene so we can test it on our machines?
@jan
Is there any progress with this issue? I have the same problem when I render a large scene (35 million polys) with three GTX 1080 cards. It behaves a bit oddly: it says 6 lightmaps are required, and at first one lightmap takes about 48 minutes, but then all the maps finish in about 17 minutes, and they are black, as mentioned earlier in this thread.
It would be helpful if this could be solved soon, because I have to build more houses that are for stationary use and don’t need to be optimized, so I can and want to use complex objects in the scene. But render times are of course crazy (16+ hours).
@tim we did not resolve this issue in May. Could you send a problem report from Shapespark with logs attached and temporarily upload the failing scene to Shapespark hosting?
@tim the logs indicate that the GPU is running out of memory (‘CUDA error: Out of memory’ lines). The problem is that Shapespark doesn’t capture this status and display an error to the user; instead it continues the bake process. We will add a fix for this.
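For context, a minimal sketch of the kind of check involved (this is not Shapespark’s actual code; the function name and structure are made up for illustration): every CUDA runtime allocation returns a status, and an unchecked out-of-memory status is exactly what produces the ‘Out of memory’ lines in the logs while the bake keeps going.

```cpp
// Hedged sketch: detect a CUDA out-of-memory allocation and stop the bake,
// instead of silently continuing. allocateLightmapBuffer is a hypothetical name.
#include <cstdio>
#include <cuda_runtime.h>

bool allocateLightmapBuffer(void** devPtr, size_t bytes) {
    cudaError_t err = cudaMalloc(devPtr, bytes);
    if (err != cudaSuccess) {
        // cudaErrorMemoryAllocation corresponds to the "Out of memory" log lines.
        std::fprintf(stderr, "CUDA error: %s\n", cudaGetErrorString(err));
        return false;  // abort the bake and surface the error to the user
    }
    return true;
}
```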
That means I will get feedback, but the solution would be to optimize the model, e.g. smaller textures and fewer faces? Or will you try to fix the memory problem, for example by baking individual sections step by step, if that is possible at all?
I finished rendering my scene (35 million polys). It took my CPU 54 hours to bake it. The result is good; it’s just that the time required is not something for spontaneous projects.
We do see a need to better handle such large models, but unfortunately we don’t see any low-hanging fruit that would allow us to improve bake performance or memory requirements without significant development effort. For example, the adaptive lightmap resolution that @Michael_Campbell mentioned in another thread is something that would help here, but it is a task for which we need to find at least 1-2 months.
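To make the idea concrete, here is a rough sketch of what adaptive lightmap resolution could look like; this is only an illustration under assumed names (`Mesh`, `assignLightmapSizes`), not how Shapespark implements lightmaps. The point is that each mesh gets a share of a fixed texel budget proportional to its surface area, which bounds total lightmap memory regardless of scene size.

```cpp
// Hedged sketch: distribute a fixed lightmap texel budget across meshes
// in proportion to surface area, so total memory use stays bounded.
#include <algorithm>
#include <cmath>
#include <vector>

struct Mesh {
    double surfaceArea;  // world-space area of the mesh
    int lightmapSize;    // resulting lightmap width/height in texels
};

void assignLightmapSizes(std::vector<Mesh>& meshes, double totalTexelBudget) {
    double totalArea = 0.0;
    for (const Mesh& m : meshes) totalArea += m.surfaceArea;
    if (totalArea <= 0.0) return;

    for (Mesh& m : meshes) {
        // Share of the budget proportional to area, rounded to a power of two
        // and clamped to a sane range.
        double texels = totalTexelBudget * (m.surfaceArea / totalArea);
        int size = 1 << static_cast<int>(
            std::round(std::log2(std::sqrt(std::max(texels, 1.0)))));
        m.lightmapSize = std::clamp(size, 64, 4096);
    }
}
```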
Hi @Jan. Thanks for your evaluation. I don’t think anyone expects a solution to this problem tomorrow, but in the medium term it is important, also for the further development and functionality, and certainly for the competitiveness, of Shapespark. More polygons also mean more realism, especially since the display of normal maps is unfortunately not supported. At least I don’t tend to use fewer polygons; rather, I want to push the limit upwards. I bought a PC for rendering that uses three graphics cards to reduce the waiting time. Instead of the previous 6 hours for the scene, 54 hours is of course a big step backwards. I now know that for me there is a limit of maybe 30 million polygons, but it would be a pity if it stayed that way for a long time.