
Development Roadmap for 2015

2015-01-16

We would like to direct your attention to the Blend4Web development roadmap for this year.

We consider the formation of the Blend4Web developer community to be one of the most important results of our public release last March. We are grateful to our users for their questions, comments and suggestions about using our platform. Thanks to your feedback received by email and on our forum, we are now aware of the kinds of difficulties you face, which features are lacking and which complications arise because of technical differences between browsers, operating systems and devices.

Consequently, this year we intend to focus our efforts not only on developing new features but also on the following:

  • improving platform usability for developers,
  • ensuring compatibility with all possible systems and devices,
  • reducing the learning curve for newcomers to our technology,
  • enriching the end user's experience.

New Features

In 2015 our team will continue developing features for Blend4Web. Based on the principle "from Blender to Web" and taking into account the suggestions of our users, we have set the following goals:

  • Extend the functionality of the glow effect.
    Besides outlining objects, we are planning to implement Tron-like glowing effects on materials.
    [Image: Glowing effect on materials]

  • Support inverse kinematics and procedural skinning.
    In other words, this will make it possible to control the bones of animation skeletons. Examples include: turning a character's head and eyes so that it appears to follow an object, correcting the placement of a character's feet on uneven terrain, and bending plants upon interaction.
  • Support morphing.
    We plan to make exporting morphing data (Shape Keys) from Blender possible and to add support for controlling transitions between them via animation and API.
  • Provide complete support for particle systems, including physics and soft particles.
    We plan to support Blender functionality such as force fields, Brownian motion, using arbitrary objects and object groups as particles, and many other settings. Implementing soft particles will remove the distinct borders where billboards come in contact with surfaces.
  • Support text objects.
    We plan to implement automatic conversion of such objects to meshes upon export.
  • Extend the possibilities of NLA Script even further.
    We intend to make navigation to web pages possible upon clicking on an object, to implement hiding and revealing objects, and to add support for URL parameters in NLA Script registers.
    [Image: 3D web link without coding]

  • Support virtual reality devices.
    We are considering support for VR devices such as Oculus Rift, Gear VR and Google Cardboard, in addition to the anaglyph stereo rendering (for viewing with colored glasses) that is already available.
  • Add optimizations to take advantage of new browser features.
    As browsers increasingly support WebGL extensions that are useful for performance (instancing, Vertex Array Objects, texture compression, sRGB textures, dynamic textures), we will be able to implement the corresponding optimizations (a short feature-detection sketch follows this list).
  • Ensure full-fledged support for scenes with multiple light sources.
    We propose to optimize the rendering of scenes with multiple light sources by clipping distant or weak sources. We also plan to support casting shadows from multiple light sources (up to 4) and to implement correct rendering of shadows cast from Spot lights.
  • Further develop the physics system.
    We are going to add support for polygonal dynamic objects and to implement loading and unloading of physics objects at runtime.
  • Support Blender's procedural textures and gradients.
    Textures generated on the fly have an obvious advantage over image files because there is no need to load them. Since the Blender viewport does not support such textures, we also plan to submit the corresponding patches.
    [Image: Material with a cloud texture]

  • Provide better support for mobile devices.
    We intend to add or improve support for features such as camera panning, eye camera mode and video textures on touch-screen devices, and also to make gyroscope controls possible.
  • Implement an API to control parameters of node materials.
    We plan to make it possible to change node material parameters, such as values and colors, from an application.
  • Implement an API for instancing.
    For cases in which copies of 3D objects appear in the scene, for example a character throwing stones or reused elements for level generation, we plan to implement the corresponding programming interface.
  • Support parameter set-up for non-player characters in Blender.
    Thus, it will be possible to set up typical roaming NPCs without any coding.
    [Image: Non-player characters of the Capri underwater world]

  • Support scene metadata.
    The possibility to specify scene names and descriptions right in Blender can be helpful for creating online catalogs.
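
To illustrate the browser features mentioned in the optimization item above, here is a minimal feature-detection sketch using the standard WebGL API. The extension names are the ones browsers actually expose; the canvas creation and logging are only for demonstration.

    // Detect WebGL extensions that enable the optimizations mentioned above.
    var canvas = document.createElement("canvas");
    var gl = canvas.getContext("webgl") || canvas.getContext("experimental-webgl");

    var has_instancing = Boolean(gl.getExtension("ANGLE_instanced_arrays"));
    var has_vao        = Boolean(gl.getExtension("OES_vertex_array_object"));
    var has_s3tc       = Boolean(gl.getExtension("WEBGL_compressed_texture_s3tc"));
    var has_srgb       = Boolean(gl.getExtension("EXT_sRGB"));

    console.log("instancing:", has_instancing, "VAO:", has_vao,
                "S3TC:", has_s3tc, "sRGB:", has_srgb);

An engine would enable the corresponding optimized code paths only when getExtension() returns a non-null object.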

Blender

The capability of exporting directly from Blender, that is, without any intermediate editor, is one of the most interesting features of Blend4Web. We fully understand how important it is for 3D artists to see the same visual result in the Blender viewport as on a web page. That's why this year we plan to continue and expand our involvement in Blender development in order to reduce (ideally, fully eliminate) the differences between these two environments.

  • Adapt the Blender interface to fit Blend4Web needs.
    We plan to create a specific interface profile in which all supported parameters will be easily accessible.
  • Participate in the development of the new viewport and support its features.
    Since OpenGL ES 2.0, which is a target platform for the next-gen viewport, and WebGL, on which Blend4Web is based, are very similar technologies, we will cooperate with the developers of this project.
  • Make using normal maps in node materials more user-friendly.
    Native support for normal maps in Texture nodes will allow us to get rid of workarounds such as using the REPLACE node.
  • Transfer useful nodes from the Cycles renderer to the viewport.
    Nodes such as Vector Transform, Geometry (New), Gamma and others will significantly extend the capabilities of node materials.
    [Image: Gamma correction for a texture to be used as a mask. From top to bottom: no transformation, proposed new node, currently used node group]

  • Improve support for transparency in the viewport.
    Currently, transparency setup is performed via the Blender Game Engine interface. We may also implement z-sorting and improve the implementation of the Add blending mode.
  • Get rid of the binary part of the add-on.
    Transferring functionality from the add-on's binary part to Blender would increase the number of supported platforms and simplify installing and upgrading the add-on.
  • Support features in the viewport which are currently only available in other Blender renderers.
    In particular, we propose to support the following features in the viewport: procedural textures, the WardIso specular model, particle rendering, environment lighting and others.
  • Material samples.
    A library of pre-made and optimized materials will significantly increase the productivity of 3D artists.

Demos and Tutorials

Demo apps and tutorials are used for debugging and optimization, inform the public about engine capabilities and serve as examples for developers. This year, we plan to release several such apps, including, but not limited to:

  • "Pyatigor's Tale".
    We'll complete this gamedev tutorial series. There will be full-fledged gameplay, two levels, monsters, new effects, audio and a high-score table.
    [Image: Somebody is about to get hurt now!]

  • "Capri".
    This year's flagship app will feature vast landscapes for users to explore. The beta version is to be released this spring for registered Blend4Web developers.
    [Image: Harbor concept]

  • We also intend to create an example of using 3D graphics to visualize a real-life manufacturing enterprise.

Conclusion

This roadmap only outlines our goals for this year. We are happy to consider your suggestions and remarks and are prepared to make changes to our plans accordingly. Please feel free to post your comments in this forum thread.

Again, thank you for your support and involvement! With joint efforts we'll be able to achieve remarkable results in such a wonderful and promising area as Internet 3D graphics.

Comments
16 jan. 2015 19:50
I am happy to see so many good things being added and considered for the near-future development of Blend4Web. I would like to add a couple of things to consider, and reference some existing forum topics related to them.

The first would be the reference you gave to: "Extend possibilities of NLA Script even further."

As an advanced 3D user/creator, and not really a developer, I am not entirely sure what an NLA Script is or what function it provides. With that said, it seems to relate to my discussion here:

https://www.blend4web.com/en/forums/post/890/

I hope that this functionality is a reasonably high priority, as I could really build off such a feature. (Having an Object Outliner available to a user through the exported HTML file, which would not only allow them to show/hide objects but also provide more information/notes/attributes, etc. about each object. In my case this would be more for communicating manufacturing data about specific parts, populated preferably through an XML or spreadsheet file, so that the attribute data can be synced to and from other sources or 3D programs.)

Another thought I had recently was kind of inspired by this post:

https://www.blend4web.com/en/forums/post/594/

I believe what this person is asking for is a way to output an HTML file that allows an object to have multiple material choices which can be changed by the user in the viewer. (When I say "multiple choices", my intention would be to curate those choices for the user. For instance, if I were selling a product such as a mobile device case, and you could perhaps choose between cloth or leather, I would want the user to be able to click on a button or thumbnail that then applies that material option. I do not, however, intend for the user to endlessly change colors or materials that would not be available to them to buy.)

Going even a step further would be to offer different versions of a product that might include different configurations or upgraded parts or features. (Think of buying a car as an example, and choosing an upgrade to the bodywork such as ground effects or a sporty front fascia...) This might be a possibility through "Multiple Scenes" in Blender, and having Blend4Web support exporting multiple scenes. Maybe this is already possible; I have not tried it yet.

So to recap:

A) Multiple color or material choices for a single object

B) Multiple scenes that automatically hide and show specific objects; to show multiple, yet different, variations of a product; or perhaps even a Character.

Thanks Again for all your Hard Work!
16 jan. 2015 20:10
Glad to see instancing may be implemented. Lots of interesting plans, I look forward to the upcoming releases.
17 jan. 2015 09:10
As an advanced 3D user/creator, and not really a developer, I am not entirely sure what an NLA Script is or what function it provides.
NLA Script is a system of UI blocks that allows visual programming without the need to code. It is useful for simple cases such as click-and-play animation.

https://www.blend4web.com/en/forums/post/890/

I hope that this functionality is a reasonably high priority;
This is marked in the roadmap as "Support scene metadata." The outliner you suggest could be based on this metadata. There are no plans to implement such an outliner though, because in our opinion it is too application-specific to be part of the engine. So perhaps you need somebody with programming skills to code such an application.
A) Multiple color or material choices for a single object
This can be done using the current APIs. Again, you need somebody who can program (see the sketch at the end of this reply).
B) Multiple scenes that automatically hide and show specific objects; to show multiple, yet different, variations of a product; or perhaps even a Character.
Exporting multiple scenes contained in a single blend file is not supported; however, you can put them into separate blend files and load them at runtime. Again, you'll need to write code in order to do this.

I hope my answer is not too disappointing. I don't really like answering users with "if you need that, just code it". I think we could prepare typical examples of modifying 3D scenes interactively, in the form of tutorials or demo scenes. On the other hand, the Blender devs have plans to implement an "interaction" mode in the Blender viewport, which will eventually allow many of the cases you suggested to be implemented without any coding.
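
To give an idea, below is a minimal sketch along the lines of points A and B, assuming a typical Blend4Web application in which the scene is already loaded. The object, material and file names are made up, and the exact color format expected by set_diffuse_color should be checked against the material module documentation.

    var m_scenes = b4w.require("scenes");
    var m_data   = b4w.require("data");
    var m_mat    = b4w.require("material");

    // A) switch between curated looks by recoloring a material
    function apply_leather_look() {
        var obj = m_scenes.get_object_by_name("PhoneCase");
        // "CaseMat" is the material name assigned in Blender
        m_mat.set_diffuse_color(obj, "CaseMat", [0.35, 0.20, 0.10]);
    }

    // B) show one pre-made product variant and hide another
    function show_variant(name_to_show, name_to_hide) {
        m_scenes.show_object(m_scenes.get_object_by_name(name_to_show));
        m_scenes.hide_object(m_scenes.get_object_by_name(name_to_hide));
    }

    // ...or load an additional exported .json scene file at runtime
    function load_upgrade_pack() {
        m_data.load("assets/sport_package.json", function(data_id) {
            // objects from the loaded file are now part of the scene
        });
    }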

17 jan. 2015 22:37
Thanks for your reply,

Yes, I need to get more familiar with "Coding"; it's just been something I have avoided most of my life, being more drawn to interactively creating content.

Regarding your response about Blender's upcoming "Interaction" mode: I'm not entirely sure what that entails, but my intention with changing materials was to "curate" possible colors/materials myself and publish them to the output HTML file as choices the end user could explore in the web viewer. So I am not sure whether you are saying that the new game "Interaction" mode will allow you to support this more easily or not.

I am willing to take a stab at coding, but the only way would be if I could be steered to some step-by-step tutorials, as you mentioned.

Thanks again.



18 jan. 2015 09:27
Regarding your response about Blender's upcoming "Interaction" mode: I'm not entirely sure what that entails, but my intention with changing materials was to "curate" possible colors/materials myself and publish them to the output HTML file as choices the end user could explore in the web viewer. So I am not sure whether you are saying that the new game "Interaction" mode will allow you to support this more easily or not.

Well, we can only guess what the Blender interaction mode will look like. Ideally it could be node-based, allowing rather advanced possibilities including interchanging materials. If such a mode works right in the Blender viewport, supporting it in Blend4Web will be a fairly trivial task.
20 jan. 2015 16:12
Wow. I guess most of my gripes with Blend4Web are addressed one way or another in this roadmap; this is incredible!
My favorites have to be support for Blender's procedural textures and the commitment to help support them in Blender's viewport, and instancing support. I hope this fits in and coordinates well with the upcoming Blender Viewport Project too.

In reply to the post "Development Roadmap for 2015":
Adapt the Blender interface to fit Blend4Web needs.
We plan to create a specific interface profile in which all supported parameters will be easily accessible.

Have you considered adding Blend4Web as an additional render engine to the interface? It is after all one form of an external rendering engine for most purposes, with specific rendering, world and material settings. That would allow you to natively add a specific set of rendering and material parameters separate from BGE or Blender Internal, hide away any unused parameters and unify displaying of stuff like the transparency and physics, uncluttering the interface.

I would also really love to see object instancing support. Hopefully it will help reduce not only file size (because it reuses data) but also in-browser memory consumption (by recycling geometry), and along with it support Blender's per-object materials, so one can reuse geometry but change object colors/materials, eliminating in the process the annoying warning about objects without materials.

In reply to the post "Development Roadmap for 2015":
Support text objects.
We plan to implement automatic conversion of such objects to meshes upon export.
Maybe be a bit more broad here and fully support all curve object types instead, and do the same automatic conversion to mesh whenever possible?

Anyway, thanks for your awesome support, not only here on the forum, but also for the great add-on and for the Blender developers too. It's great that you take the time to publish this roadmap and that you trust your community enough to publish it.
20 jan. 2015 21:57
Hi Duarte,

Thanks for your feedback and support!

Have you considered adding Blend4Web as an additional render engine to the interface? It is after all one form of an external rendering engine for most purposes, with specific rendering, world and material settings. That would allow you to natively add a specific set of rendering and material parameters separate from BGE or Blender Internal, hide away any unused parameters and unify displaying of stuff like the transparency and physics, uncluttering the interface.

Yes, we thought about integrating Blend4Web as an external renderer, and investigated this topic using LuxRender as an example. It appears to work as follows: an artist sets up the scene and activates rendering using the Blender add-on, then the external renderer reads and processes the file, and finally the output image is imported back into Blender and shown as the render result. This way we couldn't achieve real-time rendering as in BGE or even Cycles. If we want more, we would have to go further and add WebGL support to Blender itself, which could be done, for example, using the Chromium Embedded Framework. This looks like a rather big project. Moreover, we would have to convince the Blender devs to allow such drastic changes.

For this reason we have chosen a simpler approach, at least for the near future. First, we will focus our efforts on reducing the gaps between the viewport and our WebGL renderer. Second, we are going to create a Blend4Web UI profile, bringing together all the supported settings from BI and BGE. And finally, we plan to support new viewport features and to participate in their development.

20 jan. 2015 21:59
Maybe be a bit more broad here and fully support all curve object types instead, and do the same automatic conversion to mesh whenever possible?
Absolutely! Thanks for suggesting this.
20 jan. 2015 23:29
In reply to Yuri Kovelenov's post:
Hi Duarte,
This way we couldn't achieve real-time rendering as in BGE or even Cycles. If we want more, we would have to go further and add WebGL support to Blender itself, which could be done, for example, using the Chromium Embedded Framework. This looks like a rather big project. Moreover, we would have to convince the Blender devs to allow such drastic changes.

Actually I was not being that ambitious, I can only imagine that would indeed be a huge and complex project hehe
I was merely stating that Blend4Web's group of settings and way of working has a lot more in common with the way other render engines integrate with Blender's UI than it has to do with a screen layout preset.

Having it as an option you could pick from the render engine menu would allow having an environment set up with all the material, world, object, physics and particle settings, and hiding whatever is just not supported or redundant, just like the ones you get for Cycles or LuxRender or even the BGE. All this without touching Blender screen layouts, since most users will probably have their own personal preferences and most likely have their own customized layouts already set up.

You would still export from the export menu, or, if you were feeling really original, instead of having a render button you would just have an "export" button which would actually create the HTML or corresponding JSON file instead. This could respect the "output" field like other renderers do, allowing a quick, explicit way to re-export to the same file. At most, this could open the created file in the default system browser if the user intended so.
21 jan. 2015 04:53

Ответ на сообщение пользователя Yuri Kovelenov
Absolutely! Thanks for suggesting this.
That's great, will be very helpful

Also, I would like to add that if you went down the road of setting up Blend4Web as if it were a render engine, it would allow decluttering the UI by showing only Blend4Web-related options for materials and objects, hiding any unsupported or redundant features, and rearranging them in a more flexible manner.
This could eliminate the need to constantly jump between render engines for changing transparency blending options (in the BGE) or everything else (in BI), and eventually set the viewport to the correct "mode" to display material and texture previews as closely to the web version as possible.
One could even have several scenes in the same blend file set up for different purposes, say one scene for baking textures with Cycles as the render engine (this one would be automatically discarded on export), and another with Blend4Web ready to export.