How to get the WebGL context in WADE?
Shri

Gio,

 

I was looking at some WebGL tutorials online and wanted to have a go with them in WADE.

 

However, I can't seem to get the WebGL context.

When I try:

console.log(wade.getLayerCanvas().getContext("2d"));

I get the 2d context,

but when I call

console.log(wade.getLayerCanvas().getContext("webgl"));

or

console.log(wade.getLayerCanvas().getContext("experimental-webgl"));

I get null.

 

When I call wade.isWebGlSupported() it returns true, and my browser runs the WebGL examples from the tutorial page fine.

 

So, how do I get the WebGL context?

Or do I even need to get the context at all - is there another way to approach it in WADE?

 

Any help you can provide is appreciated.

 

cheers,

Shri

 

When I try this in an HTML page, everything works fine:

----------------------------------------------------------------

<body onload="start()">
  <canvas id="glcanvas" width="640" height="480">
    Your browser doesn't appear to support the HTML5 <code>&lt;canvas&gt;</code> element.
  </canvas>
  <script>
    var gl; // A global variable for the WebGL context

    function start() {
        var canvas = document.getElementById("glcanvas");
        gl = initWebGL(canvas);      // Initialize the GL context

        // Only continue if WebGL is available and working
        if (gl) {
            console.log('have webgl context');
            gl.clearColor(0.0, 0.0, 0.0, 1.0);                  // Set clear color to black, fully opaque
            gl.enable(gl.DEPTH_TEST);                            // Enable depth testing
            gl.depthFunc(gl.LEQUAL);                             // Near things obscure far things
            gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT); // Clear the color as well as the depth buffer
        }
    }    // end start

    function initWebGL(canvas) {
        gl = null;

        try {
            // Try to grab the standard context. If it fails, fall back to experimental.
            gl = canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
        }
        catch(e) {}

        // If we don't have a GL context, give up now
        if (!gl) {
            alert("Unable to initialize WebGL. Your browser may not support it.");
            gl = null;
        }

        return gl;
    }    // end initWebGL
  </script>
</body>

When I try this in WADE, I get "gl is null, no gl context":

this.init = function() {
    console.log('wade global init');
    gl = self.initWebGL(wade.getLayerCanvas());
    if (gl) {
        console.log('web gl supported - initializing');
        gl.clearColor(0.0, 0.0, 0.0, 1.0);
        gl.enable(gl.DEPTH_TEST);
        gl.depthFunc(gl.LEQUAL);
        gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
    }
    else {
        console.log('gl is null, no gl context');
    }
};    // end init

this.initWebGL = function(canvas) {
    console.log('in init web gl');
    gl = null;
    if (wade.isWebGlSupported()) {
        console.log('in web gl supported');
        var sp = new Sprite(0, wade.defaultLayer);
        gl = canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
    }
    else {
        console.log('web gl not supported');
    }
    return gl;
};    // end initWebGL
Gio

Hi Shri

 

The layer needs to be set to webgl mode first, or its context is going to be a 2d context.

 

However, I wouldn't recommend accessing the context through the canvas object directly. It may be easier to just create sprites that have your own custom draw function.

 

So for example, let's use layer 5. Let's set it to webgl mode first:

wade.setLayerRenderMode(5, 'webgl');

Then we create an empty sprite on layer 5 and set a size for it. We also create an object using this sprite, and add it to the scene:

var sprite = new Sprite(null, 5);
sprite.setSize(256, 256);
wade.addSceneObject(new SceneObject(sprite));

Lastly, we set a custom draw function for the sprite - this function is passed the webgl context, so you can do whatever you like with it.

sprite.setDrawFunction(function(context) {
    // do what you like with the webgl context here
});

One thing to remember: if your draw function is supposed to show different things over time (such as an animation or a time-dependent shader), you have to call this.setDirtyArea() somewhere in your sprite's draw function. Otherwise WADE will not know that the sprite is supposed to change, and may optimize it by not calling your draw function when it thinks it doesn't need to.
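For example, a minimal sketch of a draw function for something animated (the drawing itself is left as a comment, since it depends on your shaders):

sprite.setDrawFunction(function(context) {
    // ... your webgl drawing here, e.g. updating a time uniform before issuing draw calls ...
    // tell WADE that this sprite changes every frame, so it keeps calling this function
    this.setDirtyArea();
});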

 

If you want to use the default shaders (as opposed to your own custom shaders) with a custom draw function, you may want to look at the code in the default webgl draw function (Sprite.prototype.drawgl) to see which uniforms need to be set for the default shaders to work.
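If you just want a quick look at that code, one easy way is to print the function's source from the browser console (standard JavaScript, nothing WADE-specific):

console.log(Sprite.prototype.drawgl.toString());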

Shri

Gio,

 

Thanks for the reply.

I am trying to integrate three.js with wade.

The idea is to use three.js for the 3d stuff (since there are lots of examples out there to look at)

and then use wade for other stuff like events, callback loops, scene management, etc.

 

Right now, I got the basic three.js demo to work (rotating red cube), but I had to create a canvas via wade.createCanvas().

This canvas seems to overlay any other canvas, so I don't seem to be able to get input events (which would defeat one of the reasons for using WADE with three.js in the first place).

Is there a way to do this in WADE so that the rendering canvas is one of the 'built-in' canvases?

 

I have included the relevant code below:

this.init = function() {
    console.log('wade global init');
    self.initThreeJS();
};    // end init

this.initThreeJS = function() {
    scene = new THREE.Scene();

    camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 1, 10000);
    camera.position.z = 1000;

    geometry = new THREE.BoxGeometry(200, 200, 200);
    material = new THREE.MeshBasicMaterial({ color: 0xff0000, wireframe: true });
    mesh = new THREE.Mesh(geometry, material);
    scene.add(mesh);

    var threeCanvas = wade.createCanvas();
    renderer = new THREE.WebGLRenderer(threeCanvas);

    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);

    wade.setMainLoopCallback(self.gameLoop, 'gameLoop');
};    // end initThreeJS

//----------------------------------------------------------------
//                  Game Management functions
//----------------------------------------------------------------
this.gameLoop = function() {
    mesh.rotation.x += 0.01;
    mesh.rotation.y += 0.02;

    renderer.render(scene, camera);
};

As always, "Grazie per l'aiuto" (thanks for the help).

Shri

Gio

I don't know exactly what Three.js does with its DOM elements, so I can't really comment on that. However if you are creating a canvas with wade.createCanvas() - which is sensible - this is most likely NOT stopping any input events. The events are caught by the main container (wade_main_div by default), and processed by WADE.

 

This line:

document.body.appendChild(renderer.domElement);

looks a bit suspicious, because wade.createCanvas() already adds the canvas to the document (as a child of the main container, to ensure that it's in the right place and events are handled correctly). Hopefully removing that line should fix your issues?
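I haven't tested this, but something along these lines is the shape I mean, keeping your variable names (note that THREE.WebGLRenderer expects its canvas inside an options object - a separate detail from the appendChild issue):

var threeCanvas = wade.createCanvas();   // WADE attaches this to the main container for you
renderer = new THREE.WebGLRenderer({ canvas: threeCanvas });
renderer.setSize(window.innerWidth, window.innerHeight);
// no document.body.appendChild(renderer.domElement) - the canvas is already in the document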

Shri

Gio,

 

Hey, I got it all to work!

Rotating red cube, transparent background, mouse events.

I was creating the renderer wrong.

 

However, I am still getting one warning which I am not sure about:

WebGL: INVALID_OPERATION: uniform2fv: location is not from current program   wade_1.6.js:98

Any ideas ?

 

Here is the relevant code:

wade.setLayerRenderMode(2, "webgl");

this.initThreeJS = function() {
    scene = new THREE.Scene();

    camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 1, 10000);
    camera.position.z = 1000;

    geometry = new THREE.BoxGeometry(200, 200, 200);
    material = new THREE.MeshBasicMaterial({ color: 0xff0000, wireframe: true });
    mesh = new THREE.Mesh(geometry, material);
    scene.add(mesh);

    var threeCanvas = wade.getLayerCanvas(2);
    renderer = new THREE.WebGLRenderer({ canvas: threeCanvas, alpha: true });

    renderer.setSize(600, 600);

    wade.setMainLoopCallback(self.gameLoop, 'gameLoop');
};    // end initThreeJS

Anyway, I am working my way through a set of tutorials I found on the web.

When they are done, I will post them on share and fork like the physics stuff.

 

cheers,

Shri

Gio

Great stuff Shri, sounds like it's going to be really useful.

 

You are getting that warning because, in webgl mode, WADE sets some "global" uniforms for the layer each frame - the viewport size, camera position and things like that. Then you are using the same context with three.js to set more uniforms before doing any actual draw calls.

 

Theoretically this is a potential security issue, since wade and three are different scripts (in reality, whoever wrote the webgl specs was just being overly paranoid; it isn't really a security issue at all).

 

You have two options:

 

  1. Stop WADE from updating the layer and its canvas, so that three.js takes care of it and WADE never touches it after creating it.
  2. Combine wade.js and three.js into a single JavaScript file.

I'd go with number 2, because then you'd be able to use WADE sprites and three.js meshes on the same layer, which would be very cool.
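If you go with option 2, the combination can be as simple as concatenating the two files - for example with a tiny Node script (the filenames here are placeholders for whichever versions you are actually using):

// combine.js - run with: node combine.js
var fs = require('fs');
var combined = fs.readFileSync('wade_1.6.js', 'utf8') + '\n;\n' +
               fs.readFileSync('three.min.js', 'utf8');
fs.writeFileSync('combined.js', combined);
// then load combined.js from the page instead of the two separate <script> tags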

 

You know, ideally if you found a way to turn three.js meshes into WADE sprites (which is possible via custom draw functions), this would be totally amazing. But I do realize that it would be a lot more work - I may personally look into it at some point in the future when we aren't so busy with other stuff :)
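For anyone who wants to experiment, here is a rough, untested sketch of the general shape that could take. It assumes 'scene', 'camera' and a three.js renderer that shares the layer's own canvas and webgl context (the context option is standard three.js; how happily it coexists with WADE's GL state is exactly the open question):

wade.setLayerRenderMode(5, 'webgl');

var sprite = new Sprite(null, 5);
sprite.setSize(512, 512);
sprite.setDrawFunction(function(context) {
    // the renderer would have to be created against this layer, e.g.
    // renderer = new THREE.WebGLRenderer({ canvas: wade.getLayerCanvas(5), context: context });
    renderer.render(scene, camera);
    this.setDirtyArea();   // the meshes animate, so keep WADE calling this draw function
});
wade.addSceneObject(new SceneObject(sprite));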
