Well, this is an interesting question. I tried a few things, so look through these material setups - maybe you'll find something that fits :)
The main problem is that we can't use ray tracing in real-time rendering, so some things have to be sacrificed. Well, yes, there actually are WebGL scenes with ray tracing, but right now they are very heavy for most computers.
Our refraction works like this: we render all the opaque objects behind the refractive object and then distort that render by the factor the Refraction node provides. The problem is that if we need several blended refractive objects, the whole scene has to be re-rendered every time refraction is used. So in this example we needed to render the scene… 18 times!
I don't think it could be fast enough x)
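To make the distortion step above concrete: it boils down to offsetting the screen-space UV by the surface normal before sampling the background render - no rays are traced at all. Here is a minimal TypeScript sketch of that math (the function name, the vector types, and the `strength` parameter standing in for the Refraction node's factor are all my own illustration, not the engine's actual internals):

```typescript
type Vec2 = { x: number; y: number };
type Vec3 = { x: number; y: number; z: number };

// Offset the screen UV along the normal's screen-space projection;
// this fakes refraction against the pre-rendered opaque background.
function refractedUV(screenUV: Vec2, viewNormal: Vec3, strength: number): Vec2 {
  const u = screenUV.x + viewNormal.x * strength;
  const v = screenUV.y + viewNormal.y * strength;
  // Clamp so we never sample outside the background render.
  return {
    x: Math.min(1, Math.max(0, u)),
    y: Math.min(1, Math.max(0, v)),
  };
}

// A camera-facing surface (view-space normal = +Z) produces no offset:
const flat = refractedUV({ x: 0.5, y: 0.5 }, { x: 0, y: 0, z: 1 }, 0.1);
// A tilted surface, like a bevel, shifts the sample sideways:
const bevel = refractedUV({ x: 0.5, y: 0.5 }, { x: 0.7, y: 0, z: 0.7 }, 0.1);
```

This also hints at why the distortion is mostly visible on bevels: flat camera-facing faces have a view-space normal pointing almost straight at the camera, so the UV offset is near zero, while tilted bevel faces produce a noticeable shift.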
So other transparent objects get "eaten" by the first refractive object, because they aren't part of the opaque render it samples.
But I found a sort of hack: the only parts where the refraction is actually visible are the bevels, so I made only those refractive. I'm talking about the third example (the first one is refraction only, the second is refraction + a matcap, and the third is refraction on the bevels only + a matcap).
Here's the trick: I made an ordinary transparent material for the whole object and duplicated the bevels. I separated those bevels into another object and made only them refractive - in the screenshot I moved the bevel object away to demonstrate it. And it actually works! Look:
Hope it will help: .blend.html