We ran into the error “Graphics context lost, reload to retry” after finishing rendering a big project (the project folder is about 6.5 GB after rendering) with the super quality option. We rendered the project on a server with 376 GB of memory and 2 × NVIDIA A10 GPUs (24 GB memory each).
The most common cause of this error is the GPU running out of memory. Do you have other GPU-intensive applications running on this computer when the error occurs? Does reloading the scene help, or does it always fail like this?
We don’t have other GPU-intensive applications running. We reserved the server with 2 × NVIDIA A10 GPUs (24 GB memory each) for rendering with Shapespark only. We also ran into the same error while trying to preview and update the light settings of the unrendered project, even though memory and GPU memory usage were very low at the time. We are wondering if there is any way to increase Shapespark’s memory limit setting.
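To back up the “usage was very low” observation, per-GPU memory can be logged while the scene is rendering. A minimal sketch using Python and the standard `nvidia-smi` CSV query interface (the query flags are standard nvidia-smi options; the helper function names are just for illustration):

```python
import subprocess

def parse_gpu_memory(csv_text):
    """Parse `nvidia-smi --query-gpu=memory.used,memory.total
    --format=csv,noheader,nounits` output.

    Returns a list of (used_mib, total_mib) tuples, one per GPU.
    """
    rows = []
    for line in csv_text.strip().splitlines():
        used, total = (int(field.strip()) for field in line.split(","))
        rows.append((used, total))
    return rows

def query_gpu_memory():
    """Run nvidia-smi and return per-GPU (used, total) memory in MiB."""
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_gpu_memory(out)

if __name__ == "__main__":
    # Print a one-shot snapshot; run in a loop (or `nvidia-smi -l 5`)
    # to watch usage during rendering.
    for i, (used, total) in enumerate(query_gpu_memory()):
        print(f"GPU {i}: {used} / {total} MiB")
```

If usage on both A10s stays far below 24 GB right up to the crash, that would point away from simple GPU memory exhaustion.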
Could you check whether the error also happens in a normal web browser when you open a scene via the http://localhost:5000 address while the Shapespark application is running?
@wojtek I am wondering if Shapespark has some best practices we can refer to, for example a maximum project size in GB, or limits on polygon and vertex counts. We’ve noticed that it’s very easy to run into this error once the total project folder size exceeds 2 GB.
If you open the crashing scene for editing through http://localhost:5000 in Chrome, with the Shapespark desktop application running, does it work OK or does it also crash?
@Kal, is it a scene that you can share with us, so we could try to reproduce the issue on our end? You can post a link to the uploaded scene on the forum, or send it privately to support@shapespark.com.