OpenGL ES 2.0 / WebGL Demos

Since more and more embedded platforms contain a powerful 3D graphics core, OpenGL ES 2.0 is the ideal graphics API for creating attractive user interfaces and complex graphics applications. The programmable vertex and fragment shaders enable the implementation of complex graphics effects while reducing the CPU load dramatically.
Because WebGL is based on OpenGL ES 2.0, graphics applications designed for embedded platforms can easily be ported to run within a web browser. Moreover, since Android and iOS (iPad, iPhone) support OpenGL ES 2.0, platform- and architecture-independent graphics applications can be realized.

This page contains an overview of several basic OpenGL ES 2.0 / WebGL sample applications, which can be executed directly within your web browser, provided that it supports WebGL. A short explanation for each demo describes which parts of the application are handled by the CPU and which by the GPU (programmable vertex and fragment shaders).

Please ensure that your web browser supports WebGL in order to run the demos successfully.

SolarSystem

The "SolarSystem" demo application shows a simplified model of the sun and the planets, rotating around their axis and orbitting around the sun. Once the system is initialized and all origin coordinates of the planets are calculated by the CPU, almost the complete calculation and drawing of the scene is done by the GPU.
Each planet is a wireframe sphere built from several hundred vertices. In each frame, the vertex shader calculates the position and the amount of lighting for each vertex, depending on the current position of the planet and the current viewing position.
Finally, the fragment shader draws the visible parts of the sphere using a texture bitmap and performs blending operations to achieve the desired lighting.
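The demo's source code is not reproduced here, but the shader stages described above can be sketched roughly as follows. The snippet shows a hypothetical GLSL ES 1.00 pair for per-vertex diffuse lighting combined with texture mapping, embedded in TypeScript strings together with a generic compile/link helper; all attribute, uniform, and varying names (aPosition, uLightDir, ...) are placeholders and not taken from the actual demo.

```typescript
// Hypothetical per-vertex lighting + texture mapping shaders (GLSL ES 1.00).
// Names and the exact lighting model are assumptions, not the demo's source.
const vertexSrc = `
  attribute vec3 aPosition;
  attribute vec3 aNormal;
  attribute vec2 aTexCoord;
  uniform mat4 uModelView;      // planet-specific model-view matrix
  uniform mat4 uProjection;
  uniform mat3 uNormalMatrix;
  uniform vec3 uLightDir;       // direction towards the sun, in eye space
  varying vec2 vTexCoord;
  varying float vLight;
  void main() {
    vec3 n = normalize(uNormalMatrix * aNormal);
    // simple diffuse term plus a small ambient floor, evaluated per vertex
    vLight = max(dot(n, normalize(uLightDir)), 0.0) + 0.15;
    vTexCoord = aTexCoord;
    gl_Position = uProjection * uModelView * vec4(aPosition, 1.0);
  }`;

const fragmentSrc = `
  precision mediump float;
  uniform sampler2D uTexture;   // planet texture bitmap
  varying vec2 vTexCoord;
  varying float vLight;
  void main() {
    vec4 texel = texture2D(uTexture, vTexCoord);
    // modulate the texture colour by the interpolated lighting value
    gl_FragColor = vec4(texel.rgb * vLight, texel.a);
  }`;

// Generic WebGL 1 compile/link helper.
function compileProgram(gl: WebGLRenderingContext, vs: string, fs: string): WebGLProgram {
  const load = (type: number, src: string): WebGLShader => {
    const shader = gl.createShader(type)!;
    gl.shaderSource(shader, src);
    gl.compileShader(shader);
    if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
      throw new Error(gl.getShaderInfoLog(shader) ?? "shader compile error");
    }
    return shader;
  };
  const program = gl.createProgram()!;
  gl.attachShader(program, load(gl.VERTEX_SHADER, vs));
  gl.attachShader(program, load(gl.FRAGMENT_SHADER, fs));
  gl.linkProgram(program);
  if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
    throw new Error(gl.getProgramInfoLog(program) ?? "program link error");
  }
  return program;
}
```

Once linked, such a program would be driven per frame with the usual uniformMatrix4fv and drawElements calls, one draw per planet.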
Click on the image to run the demo...

FireWorks

The "FireWorks" sample application demonstrates the usage of particle systems. The firework display consist of randomly fired pyrotechnic articles. Each of them is created by several hundres of particels with individual ballistic trajectories, colors and shapes.
The CPU is only creating a set of parameters for each particle system. Each particle has its own lifecycle. The position of each particle within the 3D space and the current size, the color and the shape of each particle is calculated by the vertex shader. The fragment shader program is responsible for drawing each particle according to the prepared values by using a star texture.
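To illustrate this division of work, a point-sprite based particle shader could look like the sketch below; the ballistic formula, the assumed three-second lifetime, and all names are illustrative assumptions, not the demo's actual code.

```typescript
// Hypothetical point-sprite particle shaders (GLSL ES 1.00); names are placeholders.
const particleVertexSrc = `
  attribute vec3 aLaunchPos;     // position where the particle was emitted
  attribute vec3 aVelocity;      // initial velocity, set once by the CPU
  attribute float aBirthTime;    // emission time in seconds
  attribute vec4 aColor;
  uniform mat4 uViewProjection;
  uniform float uTime;           // current time in seconds
  varying vec4 vColor;
  void main() {
    float age = uTime - aBirthTime;
    // simple ballistic trajectory: p = p0 + v*t + 0.5*g*t^2
    vec3 pos = aLaunchPos + aVelocity * age + 0.5 * vec3(0.0, -9.81, 0.0) * age * age;
    // fade out and shrink the particle over its lifetime (assumed 3 s)
    float life = clamp(1.0 - age / 3.0, 0.0, 1.0);
    vColor = vec4(aColor.rgb, aColor.a * life);
    gl_PointSize = 32.0 * life;
    gl_Position = uViewProjection * vec4(pos, 1.0);
  }`;

const particleFragmentSrc = `
  precision mediump float;
  uniform sampler2D uStarTexture;
  varying vec4 vColor;
  void main() {
    // gl_PointCoord addresses the star texture across the point sprite
    gl_FragColor = vColor * texture2D(uStarTexture, gl_PointCoord);
  }`;
```

Particles of this kind would typically be drawn with gl.drawArrays(gl.POINTS, ...) and additive blending (gl.blendFunc(gl.SRC_ALPHA, gl.ONE)) so that overlapping particles brighten each other.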
Click on the image to run the demo...

MoonShot

The "MoonShot" demo application shows an interactive flight over a synthetic moon surface. The user can change the speed and direction of the spacecraft to avoid colliding with one of the stone bricks.
The surface of the moon is created by procedural terrain generation. The currently visible area consists of an array of sectors containing terrain height values; whenever the spacecraft leaves a sector, a new row of terrain data is created. A similar technique is used for the flying stone bricks: they are spheres whose vertex radii are modulated by a height map. Creating the surface sectors and the stone wireframe models is done by the CPU.
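The demo's terrain generator is not shown on this page; the following is only a minimal sketch of how sector-based height generation could be done on the CPU, using simple hash-based value noise. The sector size, the noise function, and all names are assumptions for illustration.

```typescript
// Minimal sketch of sector-based procedural terrain heights (not the demo's generator).
const SECTOR_SIZE = 16;          // assumed number of vertices per sector edge

// Deterministic pseudo-random value in [0, 1) for an integer grid point.
function hash2(x: number, y: number): number {
  const s = Math.sin(x * 127.1 + y * 311.7) * 43758.5453;
  return s - Math.floor(s);
}

// Smoothly interpolated value noise sampled at world coordinates.
function valueNoise(x: number, y: number): number {
  const xi = Math.floor(x), yi = Math.floor(y);
  const fx = x - xi, fy = y - yi;
  const sx = fx * fx * (3 - 2 * fx);      // smoothstep weights
  const sy = fy * fy * (3 - 2 * fy);
  const lerp = (a: number, b: number, t: number) => a + (b - a) * t;
  const top = lerp(hash2(xi, yi), hash2(xi + 1, yi), sx);
  const bottom = lerp(hash2(xi, yi + 1), hash2(xi + 1, yi + 1), sx);
  return lerp(top, bottom, sy);
}

// Builds the height values of one sector; called again whenever the
// spacecraft leaves a sector and a new row of terrain data is needed.
function buildSector(sectorX: number, sectorZ: number): Float32Array {
  const heights = new Float32Array(SECTOR_SIZE * SECTOR_SIZE);
  for (let z = 0; z < SECTOR_SIZE; z++) {
    for (let x = 0; x < SECTOR_SIZE; x++) {
      const wx = sectorX * (SECTOR_SIZE - 1) + x;   // world-grid coordinates
      const wz = sectorZ * (SECTOR_SIZE - 1) + z;
      // two octaves of value noise give a gently rolling moon surface
      heights[z * SECTOR_SIZE + x] =
        valueNoise(wx * 0.1, wz * 0.1) * 8.0 +
        valueNoise(wx * 0.4, wz * 0.4) * 1.5;
    }
  }
  return heights;
}
```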
The vertex shader is responsible for calculating the current position, the lighting, and the fog. The fog is used to darken the terrain sectors and stones that are too far away from the current viewing position.
Finally, the fragment shader program performs the texture mapping and lighting of the fragments.
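One possible shape of the fog calculation is sketched below: the vertex shader derives a fog factor from the eye-space distance, and the fragment shader darkens the textured color towards black accordingly. The linear fog curve and all names are assumptions; the lighting term is omitted here and would be computed per vertex as in the SolarSystem sketch above.

```typescript
// Hypothetical fog shaders (GLSL ES 1.00); the fog curve and names are assumptions.
const moonVertexSrc = `
  attribute vec3 aPosition;
  attribute vec2 aTexCoord;
  uniform mat4 uModelView;
  uniform mat4 uProjection;
  uniform float uFogStart;       // eye-space distance where darkening begins
  uniform float uFogEnd;         // distance at which the terrain is fully dark
  varying vec2 vTexCoord;
  varying float vFog;
  void main() {
    vec4 eyePos = uModelView * vec4(aPosition, 1.0);
    float dist = length(eyePos.xyz);
    // 0.0 = no fog, 1.0 = fully darkened; computed once per vertex
    vFog = clamp((dist - uFogStart) / (uFogEnd - uFogStart), 0.0, 1.0);
    vTexCoord = aTexCoord;
    gl_Position = uProjection * eyePos;
  }`;

const moonFragmentSrc = `
  precision mediump float;
  uniform sampler2D uTexture;
  varying vec2 vTexCoord;
  varying float vFog;
  void main() {
    vec3 color = texture2D(uTexture, vTexCoord).rgb;
    // darken distant fragments towards black instead of blending to a fog colour
    gl_FragColor = vec4(mix(color, vec3(0.0), vFog), 1.0);
  }`;
```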
Click on the image to run the demo...