"WebGL: CONTEXT_LOST_WEBGL" when loading large scenes

28 March 2016 14:06
Hi,

I want to use Blend4Web to visualize concepts for my company's customers (I don't have a commercial license yet). These concepts include meshes generated from CAD data with great detail and high poly counts. I noticed that if my file has more than 1-2 million polygons, the exported HTML/JSON no longer loads in my web browser (the export itself works fine). Attached is an example with a duplicated mesh that reproduces the error. I tried it with Firefox (31.6), Chrome (49.0.2623.108) and Chrome Canary (51.0.2692.0 canary). My operating system is Windows 7, 64-bit. I'm using the Blender add-on & SDK version 16.2.

Here is a Blender screenshot:


Here is a screenshot of the error in Chrome:


print.js:50 B4W LOG: INIT ENGINE 16.03 DEBUG (28.03.2016 12:58:59)
print.js:50 B4W LOG: INIT WEBGL 1
print.js:50 B4W LOG: GET EXTENSION WEBGL_debug_renderer_info
print.js:50 B4W LOG: RENDERER INFO: Google Inc., ANGLE (Intel(R) HD Graphics 4600 Direct3D11 vs_5_0 ps_5_0)
print.js:50 B4W LOG: INIT WEBAUDIO: 48000Hz
print.js:50 B4W LOG: GET EXTENSION WEBGL_depth_texture
print.js:50 B4W LOG: GET EXTENSION WEBGL_compressed_texture_s3tc
print.js:50 B4W LOG: Chrome detected. Some of deprecated functions related to the Doppler effect won't be called.
print.js:50 B4W LOG: SET PRECISION: mediump
print.js:50 B4W LOG: LOAD METADATA ../../tmp/preview.json?t=28032016125901
print.js:133 B4W EXPORT WARNING: Missing active camera or wrong active camera object
print.js:50 B4W LOG: LOAD BINARY ../../tmp/preview.bin?t=28032016125901
print.js:50 B4W LOG: LOAD PHYSICS Using Separate Worker Thread, Max FPS: 60
print.js:50 B4W LOG: PHYSICS PATH http://localhost:6687/src/../deploy/apps/common/uranium.js?t=28032016125901
print.js:50 B4W LOG: GET EXTENSION OES_standard_derivatives
print.js:50 B4W LOG: GET EXTENSION OES_element_index_uint
print.js:74 B4W ERROR: shader compilation failed:
1 attribute vec2 a_position ;
2 varying vec2 v_texcoord ;
3 void main ( void ) {
4 v_texcoord = 2.0 * a_position ;
5 gl_Position = vec4 ( 4.0 * ( a_position . xy - 0.25 ) , 0.0 , 1.0 ) ;
6 }
({"vert":"postprocessing/postprocessing.glslv","frag":"postprocessing/postprocessing.glslf","directives":[["ALPHA",0],["ALPHA_CLIP",0],["ANAGLYPH",0],["BEND_CENTER_ONLY",1],["BILLBOARD_PRES_GLOB_ORIENTATION",0],["CAUSTICS",0],["CSM_BLEND_BETWEEEN_CASCADES",1],["CSM_FADE_LAST_CASCADE",1],["CSM_SECTION0",0],["CSM_SECTION1",0],["CSM_SECTION2",0],["CSM_SECTION3",0],["DEBUG_SPHERE",0],["DEBUG_SPHERE_DYNAMIC",0],["DEPTH_RGBA",0],["DISABLE_DISTORTION_CORRECTION",0],["DISABLE_FOG",0],["DOUBLE_SIDED_LIGHTING",0],["DYNAMIC",0],["DYNAMIC_GRASS",0],["DYNAMIC_GRASS_COLOR",0],["DYNAMIC_GRASS_SIZE",0],["FOAM",0],["FRAMES_BLENDING",0],["BILLBOARD_JITTERED",0],["BILLBOARD_SPHERICAL",1],["HAIR_BILLBOARD",0],["SHADOW_TEX_RES","2048.0"],["MAIN_BEND_COL",0],["MAX_BONES",0],["NUM_NORMALMAPS",0],["PARALLAX",0],["PARALLAX_STEPS",0],["PROCEDURAL_FOG",0],["PROCEDURAL_SKYDOME",0],["REFLECTION",0],["REFLECTION_PASS",0],["REFLECTION_TYPE",0],["REFRACTIVE",0],["USE_REFRACTION",0],["USE_REFRACTION_CORRECTION",0],["SHORE_SMOOTHING",0],["SKINNED",0],["SKY_COLOR",0],["SKY_TEXTURE",0],["SSAO_HEMISPHERE",0],["SSAO_BLUR_DEPTH",0],["SSAO_ONLY",0],["SSAO_WHITE",0],["STATIC_BATCH",0],["TEXTURE_COLOR",0],["TEXTURE_NORM",0],["TEXTURE_SPEC",0],["TEXTURE_STENCIL_ALPHA_MASK",0],["VERTEX_ANIM",0],["VERTEX_ANIM_MIX_NORMALS_FACTOR","u_va_frame_factor"],["VERTEX_COLOR",0],["WATER_EFFECTS",0],["WIND_BEND",0],["DETAIL_BEND",0],["SHORE_PARAMS",0],["ALPHA_AS_SPEC",1],["MTEX_NEGATIVE",0],["MTEX_RGBTOINT",0],["NUM_LIGHTS",0],["NUM_LFACTORS",0],["NUM_LAMP_LIGHTS",1],["MAX_STEPS",1],["BILLBOARD_ALIGN","BILLBOARD_ALIGN_VIEW"],["SHADOW_USAGE","NO_SHADOWS"],["POST_EFFECT","POST_EFFECT_X_EXTEND"],["SSAO_QUALITY","SSAO_QUALITY_32"],["TEXTURE_BLEND_TYPE","TEXTURE_BLEND_TYPE_MIX"],["TEXTURE_COORDS","TEXTURE_COORDS_UV_ORCO"],["AA_METHOD","AA_METHOD_FXAA_QUALITY"],["AU_QUALIFIER","NOT_ASSIGNED"],["BILLBOARD",0],["BILLBOARD_RANDOM",0],["PRECISION","mediump"],["EPSILON",0.0001],["USE_ENVIRONMENT_LIGHT",0],["USE_FOG",0],["WEBGL2",0],
["WO_SKYBLEND",0],["WO_SKYPAPER",0],["WO_SKYREAL",0],["WO_SKYTEX",0],["WOMAP_BLEND",0],["WOMAP_HORIZ",0],["WOMAP_ZENUP",0],["WOMAP_ZENDOWN",0],["WIREFRAME_QUALITY",0],["SIZE_RAMP_LENGTH",0],["COLOR_RAMP_LENGTH",0],["PARTICLES_SHADELESS",0],["NUM_CAST_LAMPS",0],["SUN_NUM",0],["MAC_OS_SHADOW_HACK",0],["USE_COLOR_RAMP",0],["HALO_PARTICLES",0],["PARTICLE_BATCH",0]],"node_elements":[]})error @ print.js:74
debug.js:192 Uncaught Engine failed: see above for error messages
viewer.html:1 WebGL: CONTEXT_LOST_WEBGL: loseContext: context lost
print.js:74 B4W ERROR: WebGL context losterror @ print.js:74
print.js:112 B4W WARN: Canvas size exceeds platform limits, downscaling
camera.js:1687 Uncaught No aspect ratio
print.js:112 B4W WARN: Canvas size exceeds platform limits, downscaling
camera.js:1687 Uncaught No aspect ratio
29 March 2016 11:17
Hi! It depends on system specs. I've tested this scene on a Mac with Intel HD Graphics 3000 (while you have Intel(R) HD Graphics 4600, as seen from the log) and it works. A heavier scene will cause the same error on a more powerful system. This is probably a memory issue. It's well explained here: WebGL scene doesn't render because of lost context. So, I suggest you check the memory usage to find out exactly what's going on.
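Whatever the root cause, context loss can at least be detected from the application side: the `webglcontextlost` and `webglcontextrestored` events are part of the standard WebGL specification. A minimal sketch (the warning message and the way you obtain the canvas are illustrative, not part of Blend4Web):

```javascript
// Register handlers for WebGL context loss/restore on a canvas-like object.
// Returns a state object so callers can check whether the context is alive.
function watchContext(canvas) {
  var state = { lost: false };
  canvas.addEventListener("webglcontextlost", function (event) {
    // Preventing the default tells the browser we want a restore attempt.
    if (event.preventDefault) event.preventDefault();
    state.lost = true;
    console.warn("WebGL context lost - scene may be too heavy for this GPU");
  }, false);
  canvas.addEventListener("webglcontextrestored", function () {
    state.lost = false;
    // GPU resources (buffers, textures, shaders) must be re-created here.
  }, false);
  return state;
}
```

In a page this would be called as `watchContext(document.getElementById("canvas3d"))`; as the log above shows, Blend4Web's own logger already prints a similar message when the context goes away.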
29 March 2016 23:31
Thanks for your reply, Ivan.
By the way: the technology you guys are developing here is very cool. If I were able to supply our customers with interactive 3D presentations of our concepts, that would be a huge benefit.

On the topic:

I also tried the same scene on my personal laptop (previous tests were made with my work laptop) and everything worked fine. I was able to reproduce the error when increasing the number of objects in the scene.

Thanks for the link. The explanations there were very informative. I'm still not quite sure, though, what the best way to solve my problem would be. Do you suggest limiting the size of the scene (when loaded by the web browser) to 1.5 GB of RAM? How can I make sure that my application can run on any mid-range hardware (it's no problem if loading times are long or animations are choppy)?
This is a screenshot of my work laptop's memory with no scene loaded:
This is my base scene: (strangely, today more monkeys are possible…)
This happens when I load this scene successfully in Firefox:
This happens when I add a couple more monkeys - WebGL crashes in Firefox:
This is what a crash looks like in Chrome:
30 March 2016 12:47
Hmm, that's strange. 16 GB of RAM should be enough for such a big scene, and you have a lot of memory available. But in my case, when I reproduced this behavior, memory was almost fully consumed.

We'll try to reduce the memory usage (there is room for it), but I can't guarantee that this will happen soon. In any case, the scene should be optimized as much as possible. Limiting the size of a scene or simplifying high-poly models is a good idea; I don't know if that suits you, since it depends on the particular model and the visualization requirements. Also, if you have many objects, they can be dynamically loaded/unloaded to control memory consumption, but that requires some coding.
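The load/unload bookkeeping behind that last suggestion can be sketched independently of the engine API. In this hypothetical sketch, `loadFn` and `unloadFn` stand in for whatever calls the engine actually provides for loading and unloading scene parts, and the byte sizes would come from your own export data:

```javascript
// Keep at most `budgetBytes` of scene data resident: loading a new part
// evicts the oldest loaded parts first. loadFn/unloadFn are placeholders
// for the engine's real load/unload calls.
function makeSceneManager(budgetBytes, loadFn, unloadFn) {
  var resident = []; // [{name, bytes}] in load order, oldest first
  var used = 0;
  return {
    show: function (name, bytes) {
      // Evict oldest parts until the new one fits in the budget.
      while (used + bytes > budgetBytes && resident.length > 0) {
        var victim = resident.shift();
        used -= victim.bytes;
        unloadFn(victim.name);
      }
      resident.push({ name: name, bytes: bytes });
      used += bytes;
      loadFn(name);
    },
    usedBytes: function () { return used; }
  };
}
```

The budget would be set well below the point where the context was lost on the weakest target machine.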
30 March 2016 21:44
Thanks again, Ivan, for listening and replying to me :).

So… I understand from your reply that you assume there is some kind of 1.5 GB memory restriction, imposed somehow by the web browser or the way WebGL works.

It would be helpful for my case if it were possible to reduce memory consumption for multiple instances of the same mesh.

Although I have some limited coding skills, I don't think dynamic loading and unloading would be feasible for my use case. I have to be able to show the whole model in one shot, and generating levels of detail would be inefficient, I think.

Reducing the polygon count can definitely be an option. Unfortunately, I don't think there is CAD conversion software that generates good-quality models with low polygon counts. Also, I cannot go into the CAD models and delete thousands of nuts and bolts and other small details - that would not be time-efficient.

I would be happy if you could find a solution for this problem. I will keep reading your blog and following your project. I can definitely already see a use for your solution in what we are doing at my company. I will try to convince someone to get me a commercial license :).
31 March 2016 12:19
So… I understand from your reply that you assume there is some kind of 1.5 GB memory restriction, imposed somehow by the web browser or the way WebGL works.

I haven't heard of a specific number; it rather varies across platforms/browsers. There is also a memory allocation issue. Roughly speaking, a browser has to allocate a contiguous block of memory for every datablock, and that can fail if memory is heavily fragmented. This gets worse for big scenes and big chunks of data.
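To illustrate the fragmentation point: one huge typed array needs one huge contiguous allocation, while several smaller chunks only need smaller holes in the heap. A sketch of chunking vertex data this way (the chunk size here is an arbitrary example, not an engine setting):

```javascript
// Split a large vertex buffer into fixed-size chunks so the browser never
// has to find one huge contiguous allocation. Each chunk is a separate
// Float32Array and can live anywhere in the heap.
function allocChunked(totalElems, chunkElems) {
  var chunks = [];
  for (var off = 0; off < totalElems; off += chunkElems) {
    var n = Math.min(chunkElems, totalElems - off);
    chunks.push(new Float32Array(n));
  }
  return chunks;
}
```

The trade-off is that each chunk then needs its own GPU buffer and draw call, so chunks should stay large enough not to hurt rendering performance.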

It would be helpful for my case if it were possible to reduce memory consumption for multiple instances of the same mesh.

We're planning to implement instancing in the future, but only for particle systems.


Reducing the polygon count can definitely be an option. Unfortunately, I don't think there is CAD conversion software that generates good-quality models with low polygon counts. Also, I cannot go into the CAD models and delete thousands of nuts and bolts and other small details - that would not be time-efficient.

You can try Blender's Decimate Modifier, but note the warning in its description: "This is not usually used on meshes which have been created by modeling carefully and economically (where all vertices and faces are necessary to correctly define the shape)."