Category Archives: General EVO news

Information of general interest about EVOs

Drivers: Complex behavior without coding

Blender features a powerful programming interface for Python scripts that allows almost any level of control over visualizations. This is, of course, very important for developing EVOs that model processes and dynamic systems in environmental visualizations.
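
As a minimal illustration of this scripting interface (not taken from the post itself), the following sketch uses a frame-change handler to update an object every frame according to a simple growth model; the object name "Plant" and the parameter GROWTH_RATE are hypothetical.

```python
import math
import bpy

GROWTH_RATE = 0.2  # hypothetical model parameter (1/s)

def update_visualization(scene, *args):
    """Scale the object every frame according to a simple exponential model."""
    obj = bpy.data.objects["Plant"]             # hypothetical scene object
    t = scene.frame_current / scene.render.fps  # elapsed time in seconds
    s = math.exp(GROWTH_RATE * t)
    obj.scale = (s, s, s)

# Re-evaluate the model whenever the frame changes (e.g. during playback).
bpy.app.handlers.frame_change_pre.append(update_visualization)
```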

Complex behavior in Blender visualizations can also be achieved without coding via the built-in driver system. Drivers connect a parameter of one scene object to parameters of other objects so that whenever the driving parameter changes, the driven parameter(s) change along with it. The power of the driver system lies in the fact that almost any parameter can be connected to almost any other (including material parameters and the like), and in the great flexibility with which the parameters can be related functionally (via expressions).
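
Drivers are normally set up entirely in the UI, but the same link can also be created through the Python API, which makes the mechanism explicit. A rough sketch, with the object names "Gauge" and "WaterLevel" and the expression made up purely for illustration:

```python
import bpy

gauge = bpy.data.objects["Gauge"]        # driven object (hypothetical)
source = bpy.data.objects["WaterLevel"]  # driving object (hypothetical)

# Add a driver to the Z location of the gauge (index 2 of "location").
fcurve = gauge.driver_add("location", 2)
driver = fcurve.driver
driver.type = 'SCRIPTED'

# Driver variable that reads the Z location of the driving object.
var = driver.variables.new()
var.name = "wl"
var.targets[0].id = source
var.targets[0].data_path = "location.z"

# Functional relationship between the two parameters (any Python expression).
driver.expression = "0.5 * wl + sin(wl)"
```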

There is a new training resource available that is dedicated to revealing the power of drivers for dynamic visualizations.

Turning Blender into an environmental system builder?

So far, Modeling EVOs that allow «live» interaction with visualized environmental systems via a customized user interface have become the «Standard EVOs» at D-USYS. Still, they just scratch the surface of what is possible using Blender and its Python scripting language as a development platform.

A good example of how Blender could even be used to create a user-friendly «environment creation kit» was demonstrated at this year's Blender conference in Amsterdam. Using the full potential of the Blender development platform, a «Fluid Designer» has been developed that allows the user to quickly create interior architecture using libraries, drag and drop, and very intuitive and flexible interaction with the visualized setting. Check out the presentation by Andrew Peel!

Environmental scientists do not exactly focus on interior architecture. However, similar techniques could be used for developing an environmental system builder within Blender that incorporates «intelligent» rules for creating properly structured and functioning systems.

«Visual programming» with nodes

Example node group for compositing different animations side-by-side in the final video

In Blender, a myriad of settings can be used for developing an EVO, and one can also use the Python scripting language. For an increasing number of functional areas within Blender there is a third, very powerful control interface: the node system. Nodes that take data as input can be combined (linked) with all kinds of nodes that process these data in some way before the results are fed into output nodes. This way, complex control structures can be «visually programmed».

The classic application of node systems is compositing, i.e. combining still or animated visualization layers and post-processing them to end up with a presentation-ready animation including the full depth of the visualized system, labels, color correction, special effects etc.
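
To give an idea of how such a node setup looks when assembled from a script rather than in the node editor, here is a rough sketch of a side-by-side composite; the image file names, the frame width and the use of single images instead of image sequences are simplifying assumptions.

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True          # switch the scene to node-based compositing
tree = scene.node_tree
tree.nodes.clear()

# Two pre-rendered animations (placeholder file names, single frames for brevity).
left = tree.nodes.new("CompositorNodeImage")
right = tree.nodes.new("CompositorNodeImage")
left.image = bpy.data.images.load("//anim_left.png")
right.image = bpy.data.images.load("//anim_right.png")

# Shift the second stream so both appear next to each other (assumes a 1920 px frame).
shift = tree.nodes.new("CompositorNodeTranslate")
shift.inputs["X"].default_value = 960

# Layer the two streams and send the result to the compositing output.
over = tree.nodes.new("CompositorNodeAlphaOver")
comp = tree.nodes.new("CompositorNodeComposite")

links = tree.links
links.new(left.outputs["Image"], over.inputs[1])
links.new(right.outputs["Image"], shift.inputs["Image"])
links.new(shift.outputs["Image"], over.inputs[2])
links.new(over.outputs["Image"], comp.inputs["Image"])
```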

In addition, in Blender the node system of the Cycles render engine is used to define the materials of objects (their surface appearance). Using this node system, highly complex materials can be developed with relative ease.

Further exciting uses of the node system are under development for Blender: one for controlling particles and another for procedurally generating and processing objects.

Interfacing an EVO with external analysis

EVOs may be good at visualizing environmental systems and processes, but they are often less suitable for analyzing the emerging patterns. If there is a strong desire to couple visualization with analysis, a solution may be to interface the visualization outcome with analysis in external software.

In a recent project, the development of wheat in an agricultural field was visualized in relation to nutrient patchiness. The colour- and transparency-coded representation of wheat fitness and density cannot be easily analyzed within the EVO. A working solution is to take a bird's-eye snapshot of the field and analyze the image with specialized tools in Adobe® Photoshop™. In this case, colour range selection filters were adjusted to detect areas in the image that represent different classes of wheat fitness. The corresponding areas can then be calculated as percentages and compared between various simulated treatments.
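
The workflow described here uses Photoshop's colour range selection; purely as an illustration of the same idea in script form, a comparable classification could be sketched with NumPy and Pillow. The file name and the RGB ranges for the fitness classes below are invented.

```python
import numpy as np
from PIL import Image

# Bird's-eye snapshot of the visualized field (placeholder file name).
img = np.asarray(Image.open("field_topview.png").convert("RGB")).astype(float)

# Invented RGB ranges standing in for colour-coded wheat fitness classes.
classes = {
    "high fitness": ((0, 120, 0), (120, 255, 120)),    # greenish pixels
    "low fitness":  ((120, 100, 0), (255, 220, 120)),  # yellowish pixels
}

total_pixels = img.shape[0] * img.shape[1]
for name, (low, high) in classes.items():
    mask = np.all((img >= low) & (img <= high), axis=-1)
    print(f"{name}: {100 * mask.sum() / total_pixels:.1f}% of the field")
```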

Simplifying complex animation tasks

With top-notch 3D design and animation software (like Blender, of course) you have very sophisticated tools at your fingertips to accomplish even very challenging visualization tasks. While this is true, these tools may take quite some time to learn and even more time to control well enough that the visualization outcome is satisfactory. For example, you can use paths, Newtonian physics, collision objects and a LOT of parameters to guide particle streams in visualizations. But too many interacting effectors may render efficient control of the particles hopeless.

In many such cases, a great shortcut is to carefully prepare 2D animations (videos) that can then be mapped onto dynamic objects in the visualization. Using this technique, complex deformations of vortices and other flow domains, for instance, can be designed with relative ease.
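
As a sketch of how such a video map could be set up via the Python API (the file name, frame count and object name are placeholders, and a simple Cycles emission material is assumed for clarity):

```python
import bpy

# Load a pre-rendered 2D animation (placeholder file name).
clip = bpy.data.images.load("//vortex_anim.mp4")
clip.source = 'MOVIE'

# Build a simple Cycles material that maps the video onto a surface.
mat = bpy.data.materials.new("FlowSurface")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

tex = nodes.new("ShaderNodeTexImage")
tex.image = clip
tex.image_user.frame_duration = 250      # number of frames to play (assumed)
tex.image_user.use_auto_refresh = True   # advance the video with the scene frame

emit = nodes.new("ShaderNodeEmission")
out = nodes.new("ShaderNodeOutputMaterial")
links.new(tex.outputs["Color"], emit.inputs["Color"])
links.new(emit.outputs["Emission"], out.inputs["Surface"])

# Assign the material to the dynamic object (placeholder name).
bpy.data.objects["FlowDomain"].data.materials.append(mat)
```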

3D printing revisited: A soil model for experimenting

It is relatively straightforward to envision 3D printed EVOs (let’s call them Physical Environmental Models, or PEMs) being of use as concept and demonstration models in the environmental sciences. But how about stretching the scope further and using PEMs even for research and experimentation?

A team of scientists at Abertay University (UK) has already taken steps in this direction. They used CT scanning to obtain the intricate 3D structure of a soil sample and visualized it in the computer. Then they used powder-bed 3D printing to produce a nylon model of the scanned soil volume that realistically traces the detailed structure of the soil. Their aim is to use this ‘artificial soil block’ for investigating microbial interactions in soil.

It seems it’s about time to sit back and start thinking about all the possibilities that making specific EVOs physical may offer to environmental sciences.

Analysis of 3D structures – a brain case example

The 11th Blender conference in Amsterdam again offered various intriguing presentations on using Blender in scientific visualization (click on the image to see a full list of presentations and the corresponding live streams).

For this post I’ve picked one by Graham Knott, a researcher and head of the Biological Electron Microscopy Facility at EPFL in Lausanne. He investigates the fine structure of neurons in the human brain, using electron microscopy and advanced image analysis algorithms to arrive at detailed 3D models of brain tissue components at the micron scale. For further analysis and visualization of these components, his team turns to Blender.

Environmental scientists do not exactly focus on brain research. However, it is not difficult to conceive how similar techniques could be used for substrate analyses and visualization in the environmental sciences.

Camera tracking for enhanced realism in visualization

There may be circumstances when virtual objects or processes should be evaluated in a photo-realistic context, for example to assess the impression of a wind turbine at a given spot in the landscape.

In such situations, visualizing a photo-realistic context by means of modeling, texturing and lighting in 3D software may be a daunting task. But there is a methodological shortcut you may be able to use: camera tracking.

Origin of the wind turbine model

With camera tracking (which is available in Blender) you place virtual objects or animations in the context of live footage. The key to achieving a believable result is that camera tracking allows the movement of the virtual camera to be driven by the real camera with which the footage was recorded. As a result, the virtual object in the real scene stays at the same spot in the landscape, responding realistically to camera movement and shake.
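
Most of the tracking workflow happens interactively in the movie clip editor, but the final step, letting the solved camera motion drive the virtual camera, can be illustrated with a short bpy sketch; the footage file name is a placeholder and the clip is assumed to have been tracked and camera-solved already.

```python
import bpy

# Footage that has already been tracked and camera-solved (placeholder name).
clip = bpy.data.movieclips.load("//landscape_footage.mp4")

# Let the solved camera motion drive the virtual camera.
cam = bpy.data.objects["Camera"]
solver = cam.constraints.new('CAMERA_SOLVER')
solver.use_active_clip = False
solver.clip = clip

# Show the footage behind the virtual scene in the camera view.
cam.data.show_background_images = True
bg = cam.data.background_images.new()
bg.source = 'MOVIE_CLIP'
bg.clip = clip
```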

How does this all work? Check out the great tutorial by BlenderGuru (Andrew Price)!

Planning with open software

No separate Investigation EVOs any more in future?

Open software also means open communication, i.e. users normally know, or at least get good indications of, where further development of the software is heading: Ton Roosendaal, the chairman of the Blender Foundation, recently posted a roadmap for the development of the upcoming versions 2.7, 2.8 and beyond of our visualization software Blender.

For the game engine module he suggested integrating its real-time interaction functions into the main Blender environment. This would probably enable EVO developers to seamlessly blend investigation modules into Modeling EVOs, so that there would at least be no technical reason any more to distinguish between Modeling and Investigation EVOs. Ton’s suggestions sparked a lively discussion, with some raving about the new possibilities and others fearing that, as a likely consequence, development of this part of Blender into a real game engine would be shut down.

Anyway, good to know for strategic planning of further EVO development!

Procedural textures: A perfect match for environmental factors

Three cross sections through a terrain using procedural textures for factor visualization in 3D-space

Understanding how spatiotemporal patterns of environmental factors relate to phenomena we observe in nature is a fundamental quest in environmental systems sciences. The complexity of these patterns makes visualization of factor gradients and dynamics particularly useful.

Fortunately, in Blender and in other 3D content creation software we can make use of so-called procedural textures, i.e. material definitions for objects that are continuously calculated through 3D-space using mathematical algorithms. This way it is possible, for example, to visualize patchy nutrient availability throughout a block section of a terrain. For this a specific noise texture can be parameterized. A simpler linear gradient texture can be used for water tables and the like.

To visualize more complex factor interactions, overlays of different procedural textures with different blend modes for the colors can be used. Ultimately, for very specific or flexible pattern visualization, Blender provides material components that can use shader definitions written in the powerful Open Shading Language.
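
As a rough sketch of such an overlay using the Cycles material nodes (object and material names, texture scale and blend mode are all illustrative assumptions), a noise texture standing for patchy nutrient availability could be combined with a gradient texture standing for, say, depth to the water table:

```python
import bpy

# Cycles material that overlays a patchy noise pattern with a linear gradient.
mat = bpy.data.materials.new("FactorField")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

noise = nodes.new("ShaderNodeTexNoise")        # patchy nutrient availability
noise.inputs["Scale"].default_value = 4.0      # assumed patch size

gradient = nodes.new("ShaderNodeTexGradient")  # e.g. depth to the water table

mix = nodes.new("ShaderNodeMixRGB")            # overlay of the two factors
mix.blend_type = 'MULTIPLY'                    # assumed blend mode
mix.inputs["Fac"].default_value = 0.5

bsdf = nodes.new("ShaderNodeBsdfDiffuse")
out = nodes.new("ShaderNodeOutputMaterial")

links.new(noise.outputs["Fac"], mix.inputs["Color1"])
links.new(gradient.outputs["Fac"], mix.inputs["Color2"])
links.new(mix.outputs["Color"], bsdf.inputs["Color"])
links.new(bsdf.outputs["BSDF"], out.inputs["Surface"])

# Assign the material to the terrain block (placeholder object name).
bpy.data.objects["TerrainBlock"].data.materials.append(mat)
```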

Will EVOs materialize in future?

A 3-D printed model „landscape“ for illustrating the metastability concept. Depending on the location of release, kinetic energy and disturbances the spheres come to rest at troughs of different potential energy or perhaps even on a saddle (labile position).

2013 is the year 3D printing penetrates everyone’s mind. New 3D printers and fields of application pop up every week. Thus, the question arises whether EVOs will remain virtual teaching objects or whether some of them will eventually turn into physical objects.

An answer to this question should consider two main criteria: In which cases do physical rather than virtual EVOs have a didactic advantage and provide easier access to understanding a concept, and is the effort of 3D printing an EVO justified (if it is feasible at all)?

It’s probably too early to tell, but the options are on the table and the options will likely proliferate in future …

Tech preview: Investigation EVOs on Android devices

An Investigation EVO prototype running on a 7" Android tablet

As most students use smartphones or tablets these days, the question comes up whether “EVO apps” should be developed for such devices. As smartphones are almost always at hand, such apps would give students great freedom as to when to work with EVOs.

Well, at least for the more demanding Modeling EVOs, such devices are too limited in many respects. However, Investigation EVOs, with their game-like approach, may be particularly well suited for use on tablets or smartphones if they provide on-screen user interfaces and touch manipulation. A player app for Android has been developed that allows running Blender game-engine-based EVOs on Android devices. The app still lacks a few important features, like text rendering, but is otherwise already mostly functional.

We are considering including the Android platform in future development of Investigation EVOs.

The first link in a chain: computer graphics research

The EVO project is all about using the open 3D content creation and animation software Blender for developing Environmental Visualization Objects. But for many of the sophisticated visualization functions in Blender, it’s not the Blender developers who made the start. Rather, these functions originate in more generic computer graphics research that seeks useful algorithms for visualization and simulation problems in the three spatial dimensions plus time.

Smoke plume rendered in Blender

A good example is the smoke simulator, which was implemented by Blender developers Daniel Genrich and Miika Hamalainen based on a research paper by computer graphics scientists from Cornell University and ETH Zurich. And there you have it, a chain of links: basic computer graphics research, linked to implementation in open software, linked to specialized application development (e.g. a cloud development EVO), linked finally to students exploring weather dynamics at the end of the chain.

So, what does the future have up its sleeve for EVO creation? An intriguing place to look for some possible answers is the Interactive Geometry Lab led by Prof. Olga Sorkine. Check it out …

The first feedback is in.

Students have worked with the very first Modeling EVO! Using the R statistics package and the Experimental Design EVO they performed virtual growth experiments and related their results from statistical analysis to observations they made in the visualized experiment.

The feedback from the students was consistently positive: they liked the flexible approach of “running” an experiment with an instantaneous visual outcome and being able to switch back and forth between the statistical results and inspecting the experimental system. Click on the image to have a closer look at some of the comments …

Innovedum event 2012 at ETH Zurich

Innovative projects and approaches for teaching at ETH within the innovedum framework were presented in the hall of the Environmental Systems Sciences building on 7 Nov. The EVO project took part in the exhibition, presenting the scope of the project (poster and showreel) and demonstrating the new Modeling EVOs.

 

Lively discussions at the EVO booth

Quite a number of people stopped by and many of them were intrigued by the potential the EVO approach offers. We had fun discussing further options for future EVOs and making connections to colleagues from other departments during the exhibition and the dinner afterwards.