Get started

Basic setup example


HTML

The HTML setup is pretty easy: just create a div that will hold your canvas and a div that will hold your images.

<body>
    <!-- div that will hold our WebGL canvas -->
    <div id="canvas"></div>

    <!-- div used to create our plane -->
    <div class="plane">
        <!-- image that will be used as a texture by our plane -->
        <img src="path/to/my-image.jpg" />
    </div>
</body>

CSS

The CSS is also very easy. Make sure the div that will wrap the canvas fits the document, and apply any size you want to your plane div element.

body {
    /* make the body fit our viewport */
    position: relative;
    width: 100%;
    height: 100vh;
    margin: 0;
    overflow: hidden;
}

#canvas {
    /* make the canvas wrapper fit the document */
    position: absolute;
    top: 0;
    right: 0;
    bottom: 0;
    left: 0;
}

.plane {
    /* define the size of your plane */
    width: 80%;
    height: 80vh;
    margin: 10vh auto;
}

.plane img {
    /* hide the img element */
    display: none;
}

JavaScript

There's a bit more work in the JavaScript part: we need to instantiate our WebGL context, create a plane with basic uniform parameters and use it.

// wait for the DOM to be ready
window.addEventListener("DOMContentLoaded", function() {
    // set up our WebGL context and append the canvas to our wrapper
    var curtains = new Curtains("canvas");

    // get our plane element
    var planeElement = document.getElementsByClassName("plane")[0];

    // set our initial parameters (basic uniforms)
    var params = {
        vertexShaderID: "plane-vs", // our vertex shader ID
        fragmentShaderID: "plane-fs", // our fragment shader ID
        uniforms: {
            time: {
                name: "uTime", // uniform name that will be passed to our shaders
                type: "1f", // this means our uniform is a float
                value: 0,
            },
        },
    };

    // create our plane
    var plane = curtains.addPlane(planeElement, params);

    // if our plane has been successfully created
    if(plane) {
        // use the onRender method of our plane, fired at each requestAnimationFrame call
        plane.onRender(function() {
            plane.uniforms.time.value++; // update our time uniform value
        });
    }
});

Shaders

Here are some basic vertex and fragment shaders. Just put them inside your body tag, right before you include the library.

<!-- vertex shader -->
<script id="plane-vs" type="x-shader/x-vertex">
    #ifdef GL_ES
    precision mediump float;
    #endif

    // those are the mandatory attributes that the lib sets
    attribute vec3 aVertexPosition;
    attribute vec2 aTextureCoord;

    // those are the mandatory uniforms that the lib sets, containing our model view and projection matrices
    uniform mat4 uMVMatrix;
    uniform mat4 uPMatrix;

    // our texture matrix that will handle image cover
    uniform mat4 uTextureMatrix0;

    // pass your vertex and texture coords to the fragment shader
    varying vec3 vVertexPosition;
    varying vec2 vTextureCoord;

    void main() {
        vec3 vertexPosition = aVertexPosition;
        gl_Position = uPMatrix * uMVMatrix * vec4(vertexPosition, 1.0);

        // set the varyings
        // here we use our texture matrix to calculate the accurate texture coords
        vTextureCoord = (uTextureMatrix0 * vec4(aTextureCoord, 0.0, 1.0)).xy;
        vVertexPosition = vertexPosition;
    }
</script>
<!-- fragment shader -->
<script id="plane-fs" type="x-shader/x-fragment">
    #ifdef GL_ES
    precision mediump float;
    #endif

    // get our varyings
    varying vec3 vVertexPosition;
    varying vec2 vTextureCoord;

    // the uniform we declared inside our javascript
    uniform float uTime;

    // our texture sampler (default name, to use a different name please refer to the documentation)
    uniform sampler2D uSampler0;

    void main() {
        // get our texture coords from our varying
        vec2 textureCoord = vTextureCoord;

        // displace our pixels along the X axis based on our time uniform
        // texture coords range from 0.0 to 1.0 on both axes
        textureCoord.x += sin(textureCoord.y * 25.0) * cos(textureCoord.x * 25.0) * (cos(uTime / 50.0)) / 25.0;

        // sample our texture at the displaced coords
        gl_FragColor = texture2D(uSampler0, textureCoord);
    }
</script>

Et voilà!
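To build some intuition for the displacement line in the fragment shader above, here is the same math transcribed to plain JavaScript (the displaceX helper is just for illustration; it is not part of the library):

```javascript
// the fragment shader's X displacement, transcribed to plain JavaScript
function displaceX(x, y, time) {
    // each trigonometric factor stays within [-1, 1], so the offset can
    // never move a pixel further than 1 / 25 = 0.04 texture units
    return Math.sin(y * 25.0) * Math.cos(x * 25.0) * Math.cos(time / 50.0) / 25.0;
}
```

Since uTime is incremented by 1 at every frame, the cos(uTime / 50.0) factor makes the wave strength oscillate with a period of 2 * π * 50 ≈ 314 frames, i.e. roughly 5 seconds at 60FPS.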

Texture matrices and sampler names

Let's say you want to build a slideshow with 3 images and a displacement image to create a nice transition effect.
By default, the texture matrices and sampler uniforms will be named after their index inside your plane element. If you have something like this:

<!-- div used to create our plane -->
<div class="plane">
    <!-- images that will be used as textures by our plane -->
    <img src="path/to/displacement.jpg" />
    <img src="path/to/my-image-1.jpg" />
    <img src="path/to/my-image-2.jpg" />
    <img src="path/to/my-image-3.jpg" />
</div>

Then, in your shaders, your texture matrices and samplers would have to be declared this way:

// use this in your vertex shader
uniform mat4 uTextureMatrix0; // texture matrix of displacement.jpg
uniform mat4 uTextureMatrix1; // texture matrix of my-image-1.jpg
uniform mat4 uTextureMatrix2; // texture matrix of my-image-2.jpg
uniform mat4 uTextureMatrix3; // texture matrix of my-image-3.jpg
...
// use this in your fragment shader
uniform sampler2D uSampler0; // bound to displacement.jpg
uniform sampler2D uSampler1; // bound to my-image-1.jpg
uniform sampler2D uSampler2; // bound to my-image-2.jpg
uniform sampler2D uSampler3; // bound to my-image-3.jpg

This is handy, but you could also easily get confused. By adding a data-sampler attribute to each <img /> tag, you can specify custom texture matrix and sampler names to use in your shaders. With the example above, this would become:

<!-- div used to create our plane -->
<div class="plane">
    <!-- images that will be used as textures by our plane -->
    <img src="path/to/displacement.jpg" data-sampler="uDisplacement" />
    <img src="path/to/my-image-1.jpg" data-sampler="uSlide1" />
    <img src="path/to/my-image-2.jpg" data-sampler="uSlide2" />
    <img src="path/to/my-image-3.jpg" data-sampler="uLastSlide" />
</div>

// use this in your vertex shader
uniform mat4 uDisplacementTextureMatrix; // texture matrix of displacement.jpg
uniform mat4 uSlide1TextureMatrix;       // texture matrix of my-image-1.jpg
uniform mat4 uSlide2TextureMatrix;       // texture matrix of my-image-2.jpg
uniform mat4 uLastSlideTextureMatrix;    // texture matrix of my-image-3.jpg
...
// use this in your fragment shader
uniform sampler2D uDisplacement; // bound to displacement.jpg
uniform sampler2D uSlide1;       // bound to my-image-1.jpg
uniform sampler2D uSlide2;       // bound to my-image-2.jpg
uniform sampler2D uLastSlide;    // bound to my-image-3.jpg
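The naming convention above can be summed up with two tiny helpers (these are ours, for illustration only; they are not part of the library):

```javascript
// default names are indexed; a data-sampler attribute overrides them
function samplerName(index, dataSampler) {
    return dataSampler || "uSampler" + index;
}

// custom sampler names get a "TextureMatrix" suffix for their matrix uniform
function textureMatrixName(index, dataSampler) {
    return dataSampler ? dataSampler + "TextureMatrix" : "uTextureMatrix" + index;
}
```

For instance, the fourth image with data-sampler="uLastSlide" yields a uLastSlide sampler and a uLastSlideTextureMatrix matrix uniform.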

Using videos as textures

Yes, videos as textures are supported! However, there are a few downsides you need to know about.
First, most mobile devices won't autoplay videos without a user gesture. Unless you don't care about mobile users, you will have to start video playback after a user interaction such as a click event.
Also, please note that videos tend to use a lot of memory and can have a significant impact on performance, so try to keep them small.
Besides that, videos are really easy to use (and can be mixed with images as well). Let's see how to handle them:

HTML

<!-- div used to create our plane -->
<div class="plane">
    <!-- video that will be used as a texture by our plane -->
    <video src="path/to/my-video.mp4"></video>
</div>

Like with images, you can use a data-sampler attribute to set a uniform sampler name. You can use one or more videos, or mix them with images if you want:

<!-- div used to create our plane -->
<div class="plane">
    <!-- elements that will be used as textures by our plane -->
    <img src="path/to/displacement.jpg" data-sampler="uDisplacement" />
    <video src="path/to/my-video-1.mp4" data-sampler="uFirstVideo"></video>
    <video src="path/to/my-video-2.mp4" data-sampler="uSecondVideo"></video>
</div>

JavaScript

There's only one change inside our JavaScript: we need to tell our plane when to start playing the videos. We've got a playVideos method that we will call inside an event listener set in our onReady method:

// wait for the DOM to be ready
window.addEventListener("DOMContentLoaded", function() {
    // set up our WebGL context and append the canvas to our wrapper
    var curtains = new Curtains("canvas");

    // get our plane element
    var planeElement = document.getElementsByClassName("plane")[0];

    // set our initial parameters (basic uniforms)
    var params = {
        vertexShaderID: "plane-vs", // our vertex shader ID
        fragmentShaderID: "plane-fs", // our fragment shader ID
        uniforms: {
            time: {
                name: "uTime", // uniform name that will be passed to our shaders
                type: "1f", // this means our uniform is a float
                value: 0,
            },
        },
    };

    // create our plane
    var plane = curtains.addPlane(planeElement, params);

    // if our plane has been successfully created
    if(plane) {
        plane.onReady(function() {
            // set an event listener to start our playback
            document.getElementById("start-playing").addEventListener("click", function() {
                plane.playVideos();
            });
        }).onRender(function() {
            // use the onRender method of our plane, fired at each requestAnimationFrame call
            plane.uniforms.time.value++; // update our time uniform value
        });
    }
});

And that's it. Check the video examples (and their source code) if you want to see what's possible.

Using a canvas as a texture

Last but not least, you can use a canvas as a texture. It is once again really easy to use: just insert a canvas tag inside your HTML, or create it in your JavaScript and load it with the loadCanvas method.

HTML

<!-- div used to create our plane -->
<div class="plane">
    <!-- canvas that will be used as a texture by our plane -->
    <canvas id="canvas-texture" data-sampler="uCanvas"></canvas>
</div>

You can use multiple canvases and data-sampler attributes as well, like you'd do with images or videos.
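If you'd rather build the canvas from JavaScript instead of writing it in your HTML, a minimal sketch could look like this (the createTextureCanvas helper is ours, and we assume the plane exposes the loadCanvas method mentioned above; check the documentation for the exact signature):

```javascript
// create a canvas element ready to be used as a plane texture
function createTextureCanvas(width, height, samplerName, doc) {
    doc = doc || document; // default to the global document in a browser
    var canvas = doc.createElement("canvas");
    canvas.width = width;
    canvas.height = height;
    // the data-sampler attribute names the uniform, just like in HTML
    canvas.setAttribute("data-sampler", samplerName);
    return canvas;
}

// in the browser, once your plane has been created:
// var textureCanvas = createTextureCanvas(512, 512, "uCanvas");
// plane.loadCanvas(textureCanvas);
```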

JavaScript

The JavaScript code remains almost the same. We just set the size of our canvas, get its context and draw a simple rotating red rectangle inside our animation loop.

// wait for the DOM to be ready
window.addEventListener("DOMContentLoaded", function() {
    // set up our WebGL context and append the canvas to our wrapper
    var curtains = new Curtains("canvas");

    // get our plane element
    var planeElement = document.getElementsByClassName("plane")[0];

    // set our initial parameters (basic uniforms)
    var params = {
        vertexShaderID: "plane-vs", // our vertex shader ID
        fragmentShaderID: "plane-fs", // our fragment shader ID
        uniforms: {
            time: {
                name: "uTime", // uniform name that will be passed to our shaders
                type: "1f", // this means our uniform is a float
                value: 0,
            },
        },
    };

    // create our plane
    var plane = curtains.addPlane(planeElement, params);

    // if our plane has been successfully created
    if(plane) {
        // our texture canvas
        var textureCanvas = document.getElementById("canvas-texture");
        var textureCanvasContext = textureCanvas.getContext("2d");

        // get our plane dimensions
        var planeBoundingRect = plane.getBoundingRect();

        // set the size of our canvas
        textureCanvas.width = planeBoundingRect.width;
        textureCanvas.height = planeBoundingRect.height;

        // use the onRender method of our plane, fired at each requestAnimationFrame call
        plane.onRender(function() {
            plane.uniforms.time.value++; // update our time uniform value

            // here we handle our canvas texture animation
            // clear the scene
            textureCanvasContext.clearRect(0, 0, textureCanvas.width, textureCanvas.height);

            // continuously rotate the canvas
            textureCanvasContext.translate(textureCanvas.width / 2, textureCanvas.height / 2);
            textureCanvasContext.rotate(Math.PI / 360);
            textureCanvasContext.translate(-textureCanvas.width / 2, -textureCanvas.height / 2);

            // draw a red rectangle
            textureCanvasContext.fillStyle = "#ff0000";
            textureCanvasContext.fillRect(textureCanvas.width / 2 - textureCanvas.width / 8, textureCanvas.height / 2 - textureCanvas.height / 8, textureCanvas.width / 4, textureCanvas.height / 4);
        });
    }
});
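As a side note on the rotation speed used above: rotate(Math.PI / 360) adds half a degree per frame, which you can sanity-check with a little arithmetic (these variables are ours, not part of the example):

```javascript
var anglePerFrame = Math.PI / 360; // half a degree, in radians
var framesPerTurn = (2 * Math.PI) / anglePerFrame; // 720 frames for a full turn
var secondsPerTurn = framesPerTurn / 60; // about 12 seconds at 60FPS
```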

Adding post-processing

You can add post-processing to your scene by using the addShaderPass method of your Curtains object. It uses FBOs (short for Frame Buffer Objects) under the hood and allows some really cool effects.

// wait for the DOM to be ready
window.addEventListener("DOMContentLoaded", function() {
    // "canvas" is the ID of our HTML container element
    var curtains = new Curtains("canvas");

    var params = {
        vertexShaderID: "my-shader-pass-vs", // ID of your vertex shader script tag
        fragmentShaderID: "my-shader-pass-fs", // ID of your fragment shader script tag
        uniforms: { // uniforms are what will allow you to interact with your shader pass
            time: {
                name: "uTime", // uniform name that will be passed to our shaders
                type: "1f", // this means our uniform is a float
                value: 0, // initial value of the uniform
            },
        },
    };

    var shaderPass = curtains.addShaderPass(params);

    // if our shader pass has been successfully created
    if(shaderPass) {
        shaderPass.onRender(function() {
            shaderPass.uniforms.time.value++; // update our time uniform value
        });
    }
});

Post-processing shaders are a bit different from plane shaders. They do not have any projection or model view matrix, and the library silently creates a render texture that holds our scene (called uRenderTexture in our fragment shader).
Here are some very basic vertex and fragment shader examples that use the same effect as our basic plane example seen above.

Post-processing vertex shader

// those are the mandatory attributes that the lib sets
attribute vec3 aVertexPosition;
attribute vec2 aTextureCoord;

// pass your vertex and texture coords to the fragment shader
varying vec3 vVertexPosition;
varying vec2 vTextureCoord;

void main() {
    gl_Position = vec4(aVertexPosition, 1.0);

    // set the varyings
    // use our aTextureCoord attribute as texture coords in our fragment shader
    vTextureCoord = aTextureCoord;
    vVertexPosition = aVertexPosition;
}

Post-processing fragment shader

// get our varyings
varying vec3 vVertexPosition;
varying vec2 vTextureCoord;

// the uniform we declared inside our javascript
uniform float uTime;

// our render texture (our WebGL scene)
uniform sampler2D uRenderTexture;

void main() {
    // get our texture coords from our varying
    vec2 textureCoord = vTextureCoord;

    // displace our pixels along the X axis based on our time uniform
    // texture coords range from 0.0 to 1.0 on both axes
    textureCoord.x += sin(textureCoord.y * 25.0) * cos(textureCoord.x * 25.0) * (cos(uTime / 50.0)) / 25.0;

    // sample our render texture at the displaced coords
    gl_FragColor = texture2D(uRenderTexture, textureCoord);
}

You can also load images, videos or canvases into your shader pass, as you'd do with a regular plane.

Performance tips

  • Planes' canvas textures are updated at each frame (videos are updated at 30FPS), which has a significant impact on performance. When those textures are not visible (if they are hidden by another texture, or if you have finished drawing on your canvas...), you should set their shouldUpdate property to false, and switch it back to true before displaying them again.
  • Large images have a bigger impact on performance. Try to scale your images so they fit your plane's maximum size. The same goes for videos, of course: try to keep them as light as possible.
  • Try to use as little JavaScript as possible inside the onRender() methods, as they get executed at each draw call.
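The first tip can be sketched as a small helper that toggles the shouldUpdate flag of a list of textures (the helper and the plane.textures usage below are our assumptions; refer to the documentation for the actual texture objects):

```javascript
// pause or resume the per-frame texture uploads
function setTexturesUpdating(textures, isVisible) {
    for (var i = 0; i < textures.length; i++) {
        textures[i].shouldUpdate = isVisible;
    }
}

// hypothetical usage once you know whether your plane is on screen:
// setTexturesUpdating(plane.textures, planeIsOnScreen);
```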