Raven API Documentation
Raven is a high-level JavaScript graphics library for creating GUIs on the web with the WebGL API. This document is the reference for the library's API and contains guides to help you learn its main concepts.
Raven uses some features that may not be fully supported on certain browsers:
| API | Compatibility table |
|---|---|
| WebGL2 | MDN ∙ caniuse |
| OffscreenCanvas | MDN ∙ caniuse |
| ResizeObserver | MDN ∙ caniuse |
| Promise.allSettled | MDN ∙ caniuse |
Raven is written in vanilla JavaScript and is designed to work as a standalone library, so its compatibility within web frameworks has not been tested.
However, one web page can host multiple Raven instances.
You can try out Raven by adding it as a submodule in your Git repository:
$ git submodule add git@github.com:matteokeole/raven
$ git submodule update --init --recursive

Because of the use of the Fetch API and ES6 modules, a local web server is required.
The Raven API is a combination of 3 smaller APIs:
- a math API with vector and matrix utilities;
- a GUI API dedicated to GUI rendering;
- the core API which brings it all together into a coherent library.
The instance is the entry point of every Raven application. It stores a list of composites and renders them in order to produce the final image.
An instance renderer which presents the output to the user.
This is the only renderer which must have a DOM canvas.
An array of composites on which the instance will loop at render time.
The draw order is the same as the array order. For example, a scene composite should be located before a GUI composite.
An integer that keeps track of the number of composites in the instance, mainly to serve as cache.
A ResizeObserver that listens to resize events and propagates them on the instance and its composites.
A Vector2 that keeps track of the current pointer position. This is used by mouse events such as MouseDownEvent and MouseMoveEvent.
A map of customizable parameter values. The following parameters are required by the instance when setting it up:
- current_scale: The GUI scale multiplier. Defaults to 1.
- root_path: The Raven submodule path, ending with "/". Defaults to "/".
- font_path: The font directory path, ending with "/". Defaults to "/".
- texture_path: The texture directory path, ending with "/". Defaults to "/".
- resize_delay: The debounce delay in milliseconds to trigger a resize event. Defaults to 0.
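The parameter map and its documented defaults can be mirrored with a plain Map. This is a minimal sketch: the parameter names and defaults come from the list above, but the helper function names are hypothetical.

```javascript
// Sketch of the instance parameter map with its documented defaults.
const parameters = new Map([
	["current_scale", 1],
	["root_path", "/"],
	["font_path", "/"],
	["texture_path", "/"],
	["resize_delay", 0],
]);

// Mirrors the documented getter: unknown keys yield undefined.
function getParameter(key) {
	return parameters.get(key);
}

// Mirrors the documented setter: only existing parameters can be changed.
function setParameter(key, value) {
	if (!parameters.has(key)) return undefined;

	parameters.set(key, value);
}
```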
The number of frames per second, defaulting to 60. This indicates to the instance how many times to trigger a render. Note that this doesn't affect composites, in which renders are manually triggered.
The index of the current frame since the start of the loop. Components can use it for frame-timed animations.
The theoretical interval between frames, in milliseconds. This value is derived from the frames-per-second value.
A number that keeps track of the elapsed time, in milliseconds, since the last frame.
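The relationship between the FPS value, the frame interval and the elapsed time can be sketched as follows (function names are hypothetical, not part of the Raven API):

```javascript
// The theoretical frame interval, in milliseconds, for a given FPS value.
function getFrameInterval(framesPerSecond) {
	return 1000 / framesPerSecond;
}

// A new frame is due once the elapsed time reaches the frame interval.
function isFrameDue(timeSinceLastFrame, framesPerSecond) {
	return timeSinceLastFrame >= getFrameInterval(framesPerSecond);
}
```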
The ID of the current animation frame.
The ID of the current resize timeout.
A boolean used to prevent calling loop when the loop is already active. Defaults to false.
A map storing the currently pressed keys in order to differentiate key press and repeat events.
Creates an Instance instance. The first parameter is the instance renderer which will output the final image. Note that its canvas doesn't get added to the page automatically.
Returns the instance's renderer.
Returns the instance's composite array.
Binds the provided composite array to the instance.
Returns the instance's ResizeObserver.
Binds the provided ResizeObserver to the instance.
Returns the parameter associated with the provided key.
If no corresponding parameter is found, undefined is returned.
Changes the value of the parameter associated with the provided key.
If no corresponding parameter is found, undefined is returned.
Parameters can only be added to the map through a sub-class constructor, but this behavior will likely change in favor of either more modular parameters or sub-class members.
Returns the instance's frames per second value.
This method asynchronously performs the following steps:
- Builds the instance's renderer.
- Creates a Vector4 matching the WebGL viewport method parameters. The viewport dimensions are taken from innerWidth and innerHeight.
- Sets the viewport vector to the instance's renderer.
- Builds each composite.
- Sets the viewport vector to each composite.
- Adds key and mouse event listeners to the instance renderer's canvas.
- Creates a generic ResizeObserver and binds it to the instance.
Starts the frame loop with the defined FPS value. The ResizeObserver's listening is started by this method.
If the loop is already running, an Error is thrown.
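The guard against starting an already-running loop can be sketched like this. The class and member names are hypothetical, and the actual requestAnimationFrame scheduling is omitted to keep the sketch runnable:

```javascript
// Sketch of the loop guard described above.
class LoopGuard {
	#running = false;

	// Throws if the loop is already active, as documented.
	loop() {
		if (this.#running) throw new Error("This instance is already looping.");

		this.#running = true;
	}

	pause() {
		this.#running = false;
	}
}
```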
An abstract method called from the ResizeObserver's callback. Its arguments include a Vector4 containing the new viewport dimensions, and the devicePixelRatio value.
Pauses the loop if it's running, then disposes the instance's renderer and composite renderers.
Initiates the observing of the instance's renderer's canvas through the bound ResizeObserver.
The requestAnimationFrame callback function. Every frame, Instance.update is called, followed by InstanceRenderer.render.
The loop is stopped if an error happens within these two calls.
Calls the update method of every animatable composite with the current frame index.
Calls either the onKeyPress or onKeyRepeat method of every composite.
This is used as the instance renderer's canvas keydown event listener.
Calls the onKeyRelease method of every composite.
This is used as the instance renderer's canvas keyup event listener.
Calls the onMouseDown method of every composite.
This is used as the instance renderer's canvas mousedown event listener.
Updates the pointer position, then calls the onMouseMove method of every composite.
This is used as the instance renderer's canvas mousemove event listener.
A class regrouping GLSL shaders, WebGL textures and a canvas to make general-purpose rendering.
A static Vector2 representing the maximum size of a texture array sub-rectangle. It is the size given to every texture array created with a WebGL renderer.
By default, MAX_TEXTURE_SIZE is set to:

WebGLRenderer.MAX_TEXTURE_SIZE = new Vector2(512, 512);

Stores the output canvas in the form of an HTMLCanvasElement, or an OffscreenCanvas if the renderer is offscreen.
Stores a WebGL2RenderingContext obtained via the canvas.
Stores a Vector4 representing the WebGL canvas viewport. The first two elements are the top left coordinates and the last two are the width and height.
Stores a WebGLProgram array for allowing multiple shader programs.
Stores WebGL vertex attributes in a map, the keys being the attribute names and the values being integers specifying the location of each attribute.
Stores WebGL uniform locations in a map, the keys being the uniform names and the values being WebGLUniformLocation instances.
Stores WebGL buffers in a map, the keys being the buffer names and the values being WebGLBuffer instances.
A map of texture wrappers uploaded by the client code, the keys being the texture paths. It is automatically filled after calling createTextureArray.
Note: This does NOT store WebGL textures. Only one WebGLTexture will be created in the form of a 2D texture array and will hold sub-rectangles corresponding to the loaded images.
Creates a WebGL renderer instance.
Returns the renderer's canvas.
Returns the renderer's viewport.
Replaces the viewport with the Vector4 parameter, and updates the canvas dimensions and the context viewport.
Returns the renderer's texture map.
Returns a texture from the renderer's texture map, identified by its name string.
If no corresponding texture is found, a ReferenceError is thrown.
Initializes the renderer. The shaderPath parameter is a string equal to the instance's root_path parameter.
This method is abstract and asynchronous.
Utility method to synchronously create a WebGLProgram from shader source strings. The resulting program is only returned and not added to the renderer's program array.
If the compilation fails, a ShaderCompilationError is thrown.
Creates a WebGL texture array and fills it with the texture list (loaded by a texture loader) provided through the textures parameter.
If the generateMipmaps flag is true, mipmaps will be generated for the texture array. Otherwise, a linear filtering is used.
If a texture's size exceeds MAX_TEXTURE_SIZE, a RangeError is thrown.
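The size validation can be sketched as a plain bounds check, with viewports modeled as [width, height] pairs. The function name is hypothetical; only the default size and the RangeError behavior come from the documentation above:

```javascript
// MAX_TEXTURE_SIZE defaults to 512x512, as documented above.
const MAX_TEXTURE_SIZE = [512, 512];

// Sketch of the size check performed when filling the texture array.
function validateTextureSize([width, height]) {
	if (width > MAX_TEXTURE_SIZE[0] || height > MAX_TEXTURE_SIZE[1]) {
		throw new RangeError("Texture size exceeds MAX_TEXTURE_SIZE.");
	}
}
```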
Renders a frame using the scene data provided through the untyped scene parameter. This method is abstract and must be implemented in sub-classes.
Replaces the viewport with the viewport Vector4 parameter. The additional projection Matrix3 parameter can be used when overriding this method.
Clears the context's color and depth buffers.
Since this method clears the whole canvas, it is preferred not to use it when possible and target specific parts of the canvas instead.
Frees the resources used by the renderer such as programs, attributes, uniforms, buffers and user textures and triggers a context loss through the WEBGL_lose_context extension.
Non-offscreen canvases won't be removed from the DOM by calling this method.
Utility method that creates, compiles and returns a WebGLShader.
The type parameter is a GLint representing the shader type (VERTEX_SHADER or FRAGMENT_SHADER) and the source parameter is the shader's source code.
This renderer is meant to be only used as the instance's DOM renderer.
Stores the output canvas in the form of an HTMLCanvasElement.
A WebGLTexture array. Each texture is written to by a composite.
An integer that keeps track of the number of composites, mainly to serve as cache.
Creates an InstanceRenderer instance.
Returns the renderer's canvas.
Replaces the composite count with the compositeCount number parameter.
Replaces the shader path with the shaderPath string parameter.
Initializes the renderer. The shaderPath parameter is a string equal to the instance's root_path parameter.
This method is asynchronous.
If WebGL2 is not supported, a NoWebGL2Error is thrown.
Renders each composite texture in order, using the TRIANGLE_FAN rendering primitive.
Updates the texture of a composite with the texture parameter, generally in the form of a canvas. The index parameter is the index of the composite to update.
This method is meant to be called from within the render method of a composite, after the image has been produced.
A "layer" of the final image which has its own WebGLRenderer. Not to be mixed up with layers, which are more like GUI pages.
A WebGL renderer instance. This renderer should be offscreen and send its canvas's contents to the instance's renderer through updateCompositeTexture.
A mutable scene containing various objects to render.
A reference to the instance in which this composite is bound.
The index of this composite in the instance's composite array.
A flag determining if this composite contains components that need to be updated on a per-frame basis. For example, the GUI renderer is animatable.
Creates a Composite instance with a fresh WebGL renderer and a reference to the app's instance.
Returns the composite's renderer.
Returns the composite's instance.
Returns the composite's index.
Replaces the index with the index number parameter.
Returns the composite's isAnimatable value.
Replaces the isAnimatable value with the isAnimatable boolean parameter.
Initializes the composite.
This method is abstract and asynchronous.
Updates the composite.
The frameIndex parameter is an integer representing the current frame index.
This method is abstract.
Renders the composite.
This method is abstract.
Resizes the composite with the provided viewport.
This method is abstract.
Note: The devicePixelRatio value is not passed to this method and should already be present in the viewport Vector4.
An entry point for dispatching key press events through a composite.
The event parameter is a KeyboardEvent.
This method is abstract.
An entry point for dispatching key release events through the composite.
The event parameter is a KeyboardEvent.
This method is abstract.
An entry point for dispatching key repeat events through the composite.
The event parameter is a KeyboardEvent.
This method is abstract.
An entry point for dispatching mouse down events through the composite.
The event parameter is a MouseEvent.
This method is abstract.
An entry point for dispatching mouse move events through the composite.
The event parameter is a MouseEvent.
This method is abstract.
An abstract camera class for 2D and 3D usage. For now it can only be used with an orthographic projection (see OrthographicCamera).
A Matrix4 containing the projection data.
A Vector3 containing the current camera's position in 3D space.
A Vector3 containing the current camera's rotation.
Creates a Camera instance.
Returns the camera's projection matrix.
Replaces the camera's projection matrix with the provided projection parameter.
Returns the camera's position vector.
Replaces the camera's position vector with the provided position parameter.
Returns the camera's rotation vector.
Replaces the camera's rotation vector with the provided rotation parameter.
Updates the camera's projection matrix. The exact steps will differ from each implementation. This method is abstract and must be implemented in a sub-class.
A Camera with support for orthographic projection.
A Vector2 representing the camera's viewport. The first parameter is the width and the second is the height.
Creates an OrthographicCamera instance. The first parameter is the initial viewport Vector2.
Returns the camera's viewport.
Replaces the camera's viewport with the provided viewport parameter.
Updates the camera's projection matrix by creating a new orthographic projection matrix with the current viewport.
A list of abstract objects to be drawn by a WebGLRenderer.
An array containing the objects of the scene.
Creates a Scene instance.
Returns the scene's queue.
Returns true if the scene's queue is empty, false otherwise.
Removes all objects from the scene's queue.
A collection of Array sub-classes working as stacks.
A stack that keeps track of the insertion points of "buckets", allowing large batches of elements to be popped quickly.
An array containing the insertion indices for each bucket.
An integer tracking the length of the last bucket, only for internal usage.
Creates a BucketStack instance.
Pushes the current stack length in the insertion index array. This marks the closing of the current bucket and the opening of a new one on top of it.
Removes the elements of the last sealed bucket by setting the stack length to the last bucket insertion index.
If no buckets have been sealed, the method returns silently.
This can happen when a BucketStack instance is more recent than others, thus having fewer sealed bucket indices than the other instances.
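The documented behavior can be sketched as an Array sub-class. The method names here are hypothetical; the seal/pop semantics (record the current length, then truncate back to the last recorded boundary) follow the description above:

```javascript
// Sketch of a BucketStack: an Array sub-class that records insertion
// indices so whole batches of elements can be popped at once.
class BucketStack extends Array {
	#insertionIndices = [];

	// Seals the current bucket by pushing the current stack length.
	sealBucket() {
		this.#insertionIndices.push(this.length);
	}

	// Removes all elements above the last sealed bucket boundary.
	// Returns silently if no bucket has been sealed.
	popBucket() {
		if (this.#insertionIndices.length === 0) return;

		this.length = this.#insertionIndices.pop();
	}
}
```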
A collection of classes for holding additional data on top of an opaque structure such as a WebGL object.
A wrapper around images, mainly for use within a WebGL renderer's texture map.
An HTMLImageElement or Uint8Array representing the image data.
A Vector2 containing the image's dimensions.
An integer associated with the image for indexing in a texture array.
Creates a TextureWrapper instance with parameters for the image, viewport and index read-only members.
Returns the wrapper's image.
Returns the wrapper's viewport.
Returns the wrapper's index.
A collection of classes for loading and parsing resources asynchronously.
A read-only string representing the directory containing the resources to load.
Creates a Loader instance.
The base path will be set to the basePath string parameter.
Asynchronously loads resources.
The data parameter can hold additional data for the parsing algorithms, such as a path.
This method is abstract. Implementations can use custom types for the data parameter and the return value.
A loader class for fetching WebGL shader code from a text file.
Asynchronously loads and returns the content of the file (decoded as text) as a string.
If the response is not successful, a standard error is thrown.
A loader class for fetching colors and textures from a JSON file.
Asynchronously loads a list of textures from a source file. The file must have the JSON format and list the paths to load in an array. Each path is an endpoint starting in the texture directory (see the instance's texturePath parameter).
The textures are loaded in parallel using the Promise.allSettled method (which may require a polyfill). Textures which are not found will be skipped.
For each texture, a custom Image object is created with the following data:
- name: the texture path as specified in the JSON file
- image: an HTMLImageElement representing the loaded image
- viewport: a Vector2 containing the texture's width and height
The return value is an array of Image objects.
If the response is not successful, a standard error is thrown. If a texture cannot be decoded (e.g. is missing), the script continues to the next one.
Synchronously loads a list of custom Color objects into Image objects (see load).
Each Color object must have these specified properties:
- name: the name used by the future texture
- value: a Uint8Array describing the color's RGBA value with 4 elements (not normalized)
The texture dimensions will be equal to the viewport Vector2 parameter. It is advised to provide this parameter with the WebGLRenderer's MAX_TEXTURE_SIZE static value.
The return value is an array of Image objects.
A collection of custom errors.
Indicates that a function or method is not implemented yet.
Creates a NotImplementedError instance.
Indicates that the browser doesn't support WebGL2.
This class stores a DOM node containing the error message in its node member.
Creates a NoWebGL2Error instance.
Indicates that the compilation of a WebGLShader failed.
This class stores a DOM node containing the error message in its node member.
Creates a ShaderCompilationError instance.
The shader info log should be passed in the message string parameter, and the shader type in the type string parameter.
Exports of cached mathematical constants.
Caches Math.PI.
Caches Math.SQRT1_2.
Caches Math.SQRT2.
Objects represented in a 2D, 3D or 4D space with a direction and a magnitude.
Vectors extend the Float32Array interface.
Most of the methods below return the current instance to allow chaining operations.
The Vector class is abstract and should only be used in the client side for typedefs. For creating and using vectors, please refer to the Vector2, Vector3 and Vector4 classes.
Creates a Vector instance.
The elements parameter is a spread number array. Only the first N elements will be used for an N-dimensional vector (e.g. the first 3 elements for a Vector3).
Creates a copy of the vector Vector parameter. The new instance and the parameter must be of the same type.
Adds the components of the vector Vector to this vector.
The two vectors must have the same type. This operation alters the current vector.
Adds the scalar number parameter to each of this vector's components.
This operation alters the current vector.
Divides the components of this vector by those of the vector Vector parameter.
The two vectors must have the same type. This operation alters the current vector.
If at least one component of the vector parameter is 0, a RangeError is thrown.
Divides each component of this vector by the scalar number parameter.
This operation alters the current vector.
If scalar is 0, a RangeError is thrown.
Returns the dot product between this vector and the vector Vector parameter.
The two vectors must have the same type.
Floors each component of this vector.
This operation alters the current vector.
Linearly interpolates between this vector and the vector Vector parameter, using the multiplier number parameter as factor.
This operation alters the current vector.
Returns the length of this vector.
This method cannot be called length because that name is already used by the Float32Array class, which all vectors extend.
Multiplies the components of this vector by those of the vector Vector parameter.
The two vectors must have the same type. This operation alters the current vector.
Multiplies each component of this vector by the scalar number parameter.
This operation alters the current vector.
Transforms this vector into a unit vector if its length is not 0.
This operation alters the current vector.
Subtracts the components of the vector Vector parameter from those of this vector.
The two vectors must have the same type. This operation alters the current vector.
Subtracts the scalar number parameter from each component of this vector.
This operation alters the current vector.
Returns a readable string containing the vector's components, joined with a space.
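The chaining, error and naming conventions above can be illustrated with a minimal Vector2 sketch. The method names follow the documentation, but the exact signatures in Raven may differ:

```javascript
// Minimal Vector2 sketch extending Float32Array, as documented above.
class Vector2 extends Float32Array {
	constructor(...elements) {
		super(2);
		this.set(elements.slice(0, 2));
	}

	addScalar(scalar) {
		this[0] += scalar;
		this[1] += scalar;

		return this; // Returned to allow chaining.
	}

	divideScalar(scalar) {
		if (scalar === 0) throw new RangeError("Division by zero.");

		this[0] /= scalar;
		this[1] /= scalar;

		return this;
	}

	// Named magnitude() here because length is taken by Float32Array.
	magnitude() {
		return Math.hypot(this[0], this[1]);
	}
}
```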
Object represented in a 2D space with a direction and a magnitude.
This class extends Vector and implements all of its abstract methods.
Object represented in a 3D space with a direction and a magnitude.
This class extends Vector and implements all of its abstract methods.
Returns the cross product of this vector and the vector Vector parameter.
Object represented in a 4D space with a direction and a magnitude.
This class extends Vector and implements all of its abstract methods.
A set of numbers arranged in rows and columns to form a rectangular array.
Matrices extend the Float32Array interface.
Most of the methods below return the current instance to allow chaining operations.
The Matrix class is abstract and should only be used in the client side for typedefs. For creating and using matrices, please refer to the Matrix3 and Matrix4 classes.
Returns an identity Matrix. The matrix type is that of the underlying class.
This method is a static constructor.
Returns an orthographic Matrix with the vector Vector parameter as viewport. The matrix type is that of the underlying class, and the vector type is Vector2 for Matrix3 and Vector3 for Matrix4.
This method is a static constructor.
Returns a Matrix translated by the vector Vector parameter. The matrix type is that of the underlying class, and the vector type is Vector2 for Matrix3 and Vector3 for Matrix4.
This method is a static constructor.
Returns a Matrix scaled by the vector Vector parameter. The matrix type is that of the underlying class, and the vector type is Vector2 for Matrix3 and Vector3 for Matrix4.
This method is a static constructor.
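A 2D orthographic projection like the one returned by the static constructor above can be sketched as a column-major 3×3 matrix mapping pixel coordinates (origin at the top-left) to WebGL clip space. This is one common convention; the exact convention used by Raven may differ:

```javascript
// A common 2D orthographic projection as a column-major 3x3 matrix.
function orthographic([width, height]) {
	return new Float32Array([
		2 / width, 0, 0,
		0, -2 / height, 0,
		-1, 1, 1,
	]);
}

// Applies a column-major 3x3 matrix to a 2D point (implicit w = 1).
function transform(m, [x, y]) {
	return [
		m[0] * x + m[3] * y + m[6],
		m[1] * x + m[4] * y + m[7],
	];
}
```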
Creates a Matrix instance.
The elements parameter is a spread number array. Only the first N² elements will be used for an N×N matrix (e.g. the first 9 elements for a Matrix3).
Creates a copy of the matrix Matrix parameter. The new instance and the parameter must be of the same type.
Inverts this matrix.
This operation alters the current matrix.
Multiplies this matrix by the matrix Matrix parameter.
The two matrices must have the same type.
This operation alters the current matrix.
Multiplies each component of this matrix by the scalar number parameter.
This operation alters the current matrix.
Transposes this matrix.
This operation alters the current matrix.
A set of 9 numbers arranged in rows and columns to form a 3×3 matrix.
This class extends Matrix and implements all of its abstract methods.
Returns a Matrix3 rotated by the scalar number parameter (expressed in radians).
This method is a static constructor.
A set of 16 numbers arranged in rows and columns to form a 4×4 matrix.
This class extends Matrix and implements all of its abstract methods.
Returns a "lookAt" Matrix4 from the provided parameters (all Vector3s).
This method is a static constructor.
Returns a perspective projection Matrix4 from the provided parameters (all numbers).
- fieldOfView must be expressed in radians.
- coordinateSystem is either -1 for left-handed or 1 for right-handed.
- bias can be omitted and defaults to π/2.
This method is a static constructor.
Returns a Matrix4 rotated by the vector Vector3 parameter.
This method is a static constructor.
Functions designed to help with number clamping and basic intersection checking.
Returns the largest value between a and b.
Returns the smallest value between a and b.
Returns the clamped value of n between a and b.
Returns a value between a and b, linearly interpolated with the t factor.
Creates a box from the top-left corner boxPosition Vector2 and size boxSize Vector2, and returns whether the point Vector2 is located within this box.
This function can be used for checking if an element is hovered by the mouse cursor.
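Sketches of the helpers described above. The behavior mirrors the documentation, but the actual signatures in Raven may differ, and the inclusive/exclusive handling of the box edges is a guess:

```javascript
// Clamps n to the [a, b] range.
function clamp(n, a, b) {
	return Math.min(Math.max(n, a), b);
}

// Linearly interpolates between a and b with factor t.
function lerp(a, b, t) {
	return a + (b - a) * t;
}

// True when point lies within the box defined by its top-left corner
// boxPosition and its size boxSize (all [x, y] pairs).
function intersects(point, boxPosition, boxSize) {
	return (
		point[0] >= boxPosition[0] &&
		point[0] < boxPosition[0] + boxSize[0] &&
		point[1] >= boxPosition[1] &&
		point[1] < boxPosition[1] + boxSize[1]
	);
}
```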
A composite made for displaying graphical user interfaces.
A camera to project the scene onto the composite.
An array of layers used as a stack for the multiple GUI pages.
An array of the root components currently rendered from the layer stack.
An array of the animatable and currently rendered visual components.
Each frame, the composite calls their update method, which determines if the component needs to be re-computed and/or re-rendered.
An array containing all currently rendered components. These include all components returned from Layer.build calls.
An array containing all layer insertion indices for fast batch popping, similar to a BucketStack.
An object containing event listener callbacks filtered by their events and stored into a bucket stack.
The keys are the event names.
An object containing the loaded bitmap fonts, with their names as keys.
Creates a GUIComposite instance.
Takes an option object with the associated GUI renderer, the parent instance and an object containing the desired bitmap fonts, with the font names as keys.
Returns the bitmap font named as the provided key string.
If no corresponding font is found, a ReferenceError is thrown.
Returns the texture wrapper named as the provided key string from the renderer's texture map.
If no corresponding texture is found, a ReferenceError is thrown.
This method asynchronously performs the following steps:
- Load the glyph map of every font.
- Create a projection Matrix4 based on the instance renderer's viewport and the instance's current_scale parameter.
- Set the projection matrix onto the renderer.
- Set the projection matrix onto the camera.
- Build the GUI renderer.
Re-calculates the absolute position of each component of the render queue, all layers included.
This method must be called after you've changed the alignment, margin, size or position of a component that is in the render queue.
See Component.compute for re-computing a single component.
Calls the update method of every animated component.
The frameIndex parameter is the current frame index since the start of the loop and not since the visual component build time.
If a visual component returned true from the update, it is added to the render queue.
If the composite's scene is not empty after the loop, a new render is triggered.
Note that visual components can trigger a new render themselves with the context GUI composite parameter.
Calls the render method of the composite's renderer with the composite's scene as argument.
The scene is then cleared and the instance renderer's composite texture is updated with the new canvas data.
Resizes the viewport of the composite's renderer, then triggers a re-computation and a new render.
Resize events render ALL the components built from the layer stack.
The viewport parameter is a Vector4 containing the X/Y offsets of the screen, the width and the height, in order.
Pushes the provided layer layer parameter on top of the layer stack.
Calling this method will result in all the children of the new layer being registered into the render queue. The new components will be rendered on top of the previous ones.
Registers the provided component component parameter in the render queue.
Dispatches the provided event event parameter to the currently rendered components.
Pops the top layer from the layer stack.
Calling this method will result in all the children of all the stacked layers being registered into the render queue and a new render being triggered.
If the layer stack is empty, an Error is thrown.
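The layer stack's push/pop behavior can be sketched as follows. The names are hypothetical, and the render-queue registration and re-render triggered by the real methods are omitted:

```javascript
// Sketch of the layer stack described above.
class LayerStack {
	#layers = [];

	// Pushes a layer on top of the stack.
	push(layer) {
		this.#layers.push(layer);
	}

	// Pops the top layer; throws if the stack is empty, as documented.
	pop() {
		if (this.#layers.length === 0) throw new Error("The layer stack is empty.");

		return this.#layers.pop();
	}

	get length() {
		return this.#layers.length;
	}
}
```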
Dispatches a key press event with a carry containing the key and code values of the event KeyboardEvent parameter.
This is an event listener used by the parent instance.
Dispatches a key release event with a carry containing the key and code values of the event KeyboardEvent parameter.
This is an event listener used by the parent instance.
Dispatches a key repeat event with a carry containing the key and code values of the event KeyboardEvent parameter.
This is an event listener used by the parent instance.
Dispatches a mouse down event with a carry containing the clientX/clientY values of the event MouseEvent parameter, arranged into a Vector2.
This is an event listener used by the parent instance.
Dispatches a mouse move event with a carry containing the clientX/clientY values of the event MouseEvent parameter, arranged into a Vector2.
This is an event listener used by the parent instance.
Recursively adds each of the provided children component array parameter to the render queue, and also to the tree if the addToTree flag is true.
If the registerEvents flag is true, the component event listeners are registered for later use.
Registers every event listener of the provided component component parameter inside the event listener object.
If an event type is not already stored, a new bucket stack is created for it.
Closes the last bucket of every event listener bucket stack.
Pops the last bucket of every event listener bucket stack.
Calls the build method of the provided layer layer parameter.
If the method didn't return a component, an Error is thrown.
A WebGL renderer extension used by the GUI composite to draw 2D interfaces.
Stores the output canvas in the form of an OffscreenCanvas.
Stores a Matrix3 used for projecting the 2D GUI.
Creates a GUIRenderer instance.
Returns the renderer's canvas.
Replaces the current projection matrix with the projection Matrix3 parameter.
Asynchronously builds the renderer, initializing the WebGL context, programs, attributes, uniforms and buffers.
Renders the components from the provided scene GUI scene parameter onto the renderer's canvas.
Replaces the renderer's viewport with the viewport Vector4 parameter and uploads the projection Matrix4 parameter to its WebGL uniform.
A scene extension for storing and rendering visual components on a GUI.
The cached number of subcomponents in the queue.
Creates a GUIScene instance.
Returns the visual component queue array.
Returns the subcomponent count.
Pushes the component visual component parameter onto the queue.
The subcomponent count is updated accordingly.
Clears the queue and resets the subcomponent count.
Resets the subcomponent count.
A "page" in a GUI composite.
Returns the layer's content in the form of a single component (which can be either structural or visual).
This method is abstract.
An element that composes a graphical user interface.
A reference to a GUI composite for dispatching events. This is automatically set up when pushing a new layer onto the composite's render queue.
A Vector2 representing the component's absolute screen position, with the origin being on the top-left corner.
A packed integer representing the component's horizontal and vertical alignment.
This value must be one of the Alignment module exports.
A Vector2 representing the component's offset relative to its alignment.
A Vector2 representing the component's width and height.
A string array representing the events the component is listening to.
Events are referenced by their name.
Creates a Component instance.
The descriptor object parameter must contain default values for the alignment and size, and can contain optional values for the margin and events.
Replaces the component's event dispatcher by the eventDispatcher GUI composite parameter.
Returns the component's position.
Replaces the component's position by the position Vector2 parameter.
Returns the component's alignment.
Returns the component's margin.
Replaces the component's margin by the margin Vector2 parameter.
Returns the component's size.
Replaces the component's size by the size Vector2 parameter.
Returns the component's events.
Computes the component's position with the alignment, margin and size values.
The computed position is restrained by the initial and parentSize Vector2 parameters, initial being a copy of the parent's position.
Returns a Matrix3 translated by the component's position and scaled by its size.
This is essentially the component's 2D world matrix.
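To illustrate, here is a minimal sketch of such a 2D world matrix using a plain array instead of the library's Matrix3 class. The row-major, column-vector layout is an assumption, not Raven's actual storage format:

```javascript
// Minimal 2D world matrix: scale by `size`, translate by `position`.
// Row-major 3x3 layout (an assumption; Raven's Matrix3 may store its
// data differently).
function getWorldMatrix(position, size) {
	return [
		size.x, 0, position.x,
		0, size.y, position.y,
		0, 0, 1,
	];
}

// A component at (16, 16) with a 64x64 size:
const worldMatrix = getWorldMatrix({x: 16, y: 16}, {x: 64, y: 64});
// Applying this matrix to the unit quad's corners yields the on-screen quad.
```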
Emits the event event parameter through the use of the component's event dispatcher.
A component acting as a container to restrain the positioning of its child components.
Structural components don't get rendered and are only used for layout purposes.
A component array representing the child components. This can contain other structural components.
Creates a StructuralComponent instance.
The descriptor object parameter must contain default values for the alignment, size and children, and can contain optional values for the margin and events.
Returns the component's children.
Computes the component's position with the alignment, margin and size values.
The computed position is restrained by the initial and parentSize Vector2 parameters, initial being a copy of the parent's position.
The most basic structural component.
The child components are also re-computed when calling compute.
A renderable (and potentially animatable) component that can be textured. Unlike a structural component, it cannot have children.
A nullable texture wrapper representing the component's texture.
A subcomponent array representing the component's parts.
Creates a VisualComponent instance.
The descriptor object parameter must contain default values for the alignment, size and texture, and can contain optional values for the margin and events.
Returns the component's texture.
Replaces the component's texture with the texture texture wrapper parameter.
Returns the component's subcomponents.
Replaces the component's subcomponents with the subcomponents subcomponent array parameter.
An abstract method triggered each frame with a context GUI composite parameter, allowing the component to do its own computing/rendering.
If it returns true, the component will be pushed onto the render queue and rendered on the next frame with the other queued components.
If it returns false, the component won't be pushed onto the render queue, but can push itself with the context before returning.
If the render queue is empty the next frame, no render will be triggered.
The frameIndex number represents the elapsed frame count since the loop start, not since the component was built. To get the frame index relative to the component creation, store the frameIndex value inside a local variable once, then subtract this variable from frameIndex each time the update method is called.
Since structural components can't be pushed to the render queue, they don't have this method.
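As a sketch of the technique described above, here is a hypothetical component (the update signature and class shape are assumed from this section, not taken from the library) that derives a frame index relative to its first update:

```javascript
// Hypothetical component: stores the global frame index once, then derives
// a frame index relative to when it first updated.
class BlinkingButton {
	#startFrame = null;

	update(context, frameIndex) {
		if (this.#startFrame === null) {
			this.#startFrame = frameIndex;
		}

		const relativeFrameIndex = frameIndex - this.#startFrame;

		// Visible for 30 frames, hidden for the next 30, and so on.
		// Returning true pushes the component onto the render queue.
		return Math.floor(relativeFrameIndex / 30) % 2 === 0;
	}
}
```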
A visual component extension which has a single subcomponent for basic image rendering.
A visual component adapted for multiline text rendering.
Creates a Text instance.
The descriptor object parameter must contain default values for the alignment, font and context, and can contain optional values for the margin, events, fontSize (defaults to 1) and colorMask (defaults to (255, 255, 255, 255)).
The glyph generator used is generateGlyphsFromMultilineString.
An export file containing 6 numbers representing each possible alignment combination:
- left: Horizontal left alignment
- middle: Horizontal center alignment
- right: Horizontal right alignment
- top: Vertical top alignment
- center: Vertical center alignment
- bottom: Vertical bottom alignment
You can combine horizontal and vertical values with a bitwise OR to get full alignments, e.g. left | top.
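The combination can be sketched with hypothetical flag values (the actual numbers exported by the Alignment module may differ):

```javascript
// Hypothetical flag values; the actual constants exported by the
// Alignment module may differ.
const left = 0b000001;
const middle = 0b000010;
const right = 0b000100;
const top = 0b001000;
const center = 0b010000;
const bottom = 0b100000;

// One horizontal flag ORed with one vertical flag gives a full alignment.
const alignment = left | top;

// A bitwise AND recovers each axis.
const isLeft = (alignment & left) !== 0; // true
const isBottom = (alignment & bottom) !== 0; // false
```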
An offsettable, resizeable and scalable quad portion of the texture of a visual component. A color mask can also be applied to the opaque texels.
Subcomponents are rendered using instanced drawing.
A Vector2 representing the subcomponent's offset, starting at the top-left corner of the parent visual component.
Defaults to
A Vector2 representing the subcomponent's width and height.
A Vector2 representing the subcomponent's X and Y scale multipliers.
Defaults to
A Vector2 representing the subcomponent's texture coordinates, starting at the top-left corner of the texture.
Defaults to
A Vector4 representing the subcomponent's color mask applied to the opaque pixels of the parent visual component.
The mask is expressed in non-normalized RGBA.
Defaults to
Creates a Subcomponent instance.
The descriptor object parameter must contain a default value for the size and can contain optional values for the offset, scale, uv and color mask.
Creates a copy of the subcomponent subcomponent parameter.
Returns the subcomponent's offset.
Replaces the subcomponent's offset by the offset Vector2 parameter.
Returns the subcomponent's size.
Replaces the subcomponent's size by the size Vector2 parameter.
Returns the subcomponent's scale.
Replaces the subcomponent's scale by the scale Vector2 parameter.
Returns the subcomponent's texture coordinates.
Replaces the subcomponent's texture coordinates by the uv Vector2 parameter.
Returns the subcomponent's color mask.
Replaces the subcomponent's color mask by the colorMask Vector4 parameter.
An abstract class representing a set of type in one particular face and size.
The spacing between each line, in pixels.
Creates a Font instance.
The descriptor object parameter must contain a default value for the line spacing.
Returns the font's line spacing.
A font extension which stores glyphs in an image and references them in what's called a glyph map.
A string representing the endpoint to the glyph map JSON file.
A string representing the path to the bitmap texture, starting at the instance's texture path parameter.
A nullable record of every glyph map entry extracted from the JSON file, associated to their string key.
A glyph map entry contains the following properties:
- width: a number specifying the glyph's width
- uv: an array of two numbers representing the glyph's top-left texture coordinates
The glyph map is automatically initialized after the JSON file has loaded.
A nullable record of every glyph as a subcomponent for use in text components.
The glyphs are automatically initialized after the JSON file has loaded.
The height of a glyph tile, in pixels.
The spacing between two glyph tiles, in pixels.
A record containing every tile width exception to allow further customizing the glyph map (not recommended).
The keys are the glyph keys from the glyph map and the values are numbers representing the new width.
A record containing every tile offset exception to allow further customizing the glyph map (not recommended).
The keys are the glyph keys from the glyph map and the values are numbers representing the new offset.
Creates a BitmapFont instance.
The descriptor object parameter must contain default values for the glyph map path, texture path and tile height, and can contain optional values for the tile spacing, line spacing, custom tile widths and custom tile offsets.
Returns the font's glyph map path.
Returns the font's texture path.
Returns the font's glyph map.
Returns the font's glyphs.
Returns the tile width corresponding to the glyph string key parameter.
If the glyph key exists within the custom tile widths, the custom value will be returned; otherwise, the width from the glyph map will be returned.
Returns the font's tile height.
Returns the tile offset corresponding to the glyph string key parameter.
If the glyph key exists within the custom tile offsets, the custom value will be returned; 0 otherwise.
Returns the font's tile spacing.
Returns the font's custom tile widths.
Returns the font's custom tile offsets.
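The lookup precedence used by the tile width and tile offset accessors can be sketched as standalone functions (an illustration, not the library's actual implementation, which lives in the BitmapFont class):

```javascript
// Sketch of the width/offset lookup precedence described above.
function getTileWidth(glyphMap, customTileWidths, glyph) {
	// A custom tile width wins over the width stored in the glyph map.
	if (glyph in customTileWidths) {
		return customTileWidths[glyph];
	}

	return glyphMap[glyph].width;
}

function getTileOffset(customTileOffsets, glyph) {
	// Glyphs without a custom offset fall back to 0.
	return glyph in customTileOffsets ? customTileOffsets[glyph] : 0;
}

// Hypothetical glyph map and width exception:
const glyphMap = {i: {width: 6, uv: [0, 0]}, m: {width: 10, uv: [6, 0]}};
const customTileWidths = {i: 3};

getTileWidth(glyphMap, customTileWidths, "i"); // 3 (custom width)
getTileWidth(glyphMap, customTileWidths, "m"); // 10 (glyph map width)
```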
Asynchronously loads the glyph map from the JSON file. The path is made from the basePath string parameter and the font's glyph map path endpoint.
This initializes the font's glyph map and glyph subcomponent array.
Generates an array of glyph subcomponents from a single-line string string parameter. Returns an object containing the glyph subcomponent array in a glyph key and a Vector2 representing the total string size in a size key.
The fontSize number parameter is a scale multiplier for every glyph in the string.
The colorMask Vector4 parameter will be applied to every glyph when generating the subcomponents.
If string contains newlines, an Error is thrown.
Generates an array of glyph subcomponents from a single-line or multiline string string parameter. Returns an object containing the glyph subcomponent array in a glyph key and a Vector2 representing the total string size in a size key.
The fontSize number parameter is a scale multiplier for every glyph in the string.
The colorMask Vector4 parameter will be applied to every glyph when generating the subcomponents.
A bitmap font variant where all tiles have the same width. As with bitmap fonts, exceptions can be made through the custom tile widths.
The width of a glyph tile, in pixels.
Creates a MonospacedBitmapFont instance.
The descriptor object parameter must contain default values for the glyph map path, texture path, tile width and tile height, and can contain optional values for the tile spacing, line spacing, custom tile widths and custom tile offsets.
An event containing a carry of generic type T, emitted by a component or GUI composite and caught by every component listening to its type.
Custom events can be created by extending one of the default event classes.
A static string representing the event name and used by listeners. It must be unique amongst other event types.
Defaults to default.
A generic type T variable for storing custom data.
The carry type is determined by the JSDoc @extends keyword.
Creates an Event instance.
The carry parameter must be of type T (the event type).
Returns the event carry.
A keyboard event emitted once when pressing a key (key_press event name).
The KeyEventCarry object contains the event key and code properties.
A keyboard event emitted repeatedly after pressing a key (key_repeat event name).
A keyboard event emitted when releasing a key (key_release event name).
A mouse event emitted when pressing the mouse button (mouse_down event name).
The event carry is a Vector2 representing the current pointer position.
A mouse event emitted when moving the mouse around (mouse_move event name).
The event carry is a Vector2 representing the current pointer position.
This section is a tutorial for developing a counter application with Raven. You can follow along with the demo branch of the repository.
Inside a directory of your choice, create an HTML file named index.html, an assets directory and a public directory.
The assets directory will contain the application assets, such as glyph map definitions, shaders and textures. I also like to put additional CSS here.
The public directory is the client-side code of your application. It will contain components, layers and the module entrypoint that we'll name main.js.
Paste the following code in index.html:
/index.html
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<script type="module" src="public/main.js"></script>
<title>Demo</title>
</head>
<body></body>
</html>
Open a local server and check that the /public/main.js module is correctly loaded.
Now that the client part is done, you can clone the Raven main branch. Since the library's code will be used extensively within the client-side code, it can be cloned into a new src directory next to public:
$ git submodule add git@github.com:matteokeole/raven src --recurse-submodules
The --recurse-submodules flag is used to recursively clone the math API submodule.
To get started, you need an instance. Instances are the foundation of every Raven application, as they compose the final render from every composite you give them.
The instance constructor requires an instance renderer, which is a standard WebGL renderer, except for the fact that it has a DOM canvas instead of an offscreen canvas, which are reserved for composite renderers.
Start by importing the Instance and InstanceRenderer classes:
/public/main.js
import {Instance, InstanceRenderer} from "./index.js";
const instanceRenderer = new InstanceRenderer();
const instance = new Instance(instanceRenderer);
To use the instance, you first need to build it. The build step is an asynchronous operation on instances, composites and renderers that initializes complex or external properties, such as a renderer's canvas context and shader programs, or a GUI composite's camera.
Building an instance will recursively build its renderer and composites.
However, the instance requires further information to build correctly. This information is specified through a parameter object and mostly consists of resource paths. The following parameters are required:
- root_path (the Raven submodule path; defaults to "/")
- font_path (defaults to "/")
- texture_path (defaults to "/")
Start by creating the fonts, shaders and textures directories within assets. Then, use the instance's setParameter method to initialize the required parameters above:
/public/main.js
// ...
instance.setParameter("root_path", "src/");
instance.setParameter("font_path", "assets/fonts/");
instance.setParameter("texture_path", "assets/textures/");
Don't forget the trailing slash at the end of the paths.
The instance has no bound composites yet. Create a GUIComposite and a GUIRenderer and bind them to your instance with setComposites.
Adding a GUI composite will allow you to develop GUIs with components.
/public/main.js
import {GUIComposite, GUIRenderer} from "../src/gui/index.js";
// ...
const guiRenderer = new GUIRenderer();
const guiComposite = new GUIComposite({
renderer: guiRenderer,
instance,
fonts: {},
});
instance.setComposites([
// A 3D composite could be added here to render a scene below the GUI
guiComposite,
]);
Composites will be rendered in the order you provide them to setComposites.
/public/main.js
// ...
await instance.build();
You can now see that the instance renderer's canvas has been initialized, and you can add it to the DOM:
/public/main.js
// ...
document.body.appendChild(instance.getRenderer().getCanvas());
Now that the instance is built, you can start the game loop!
/public/main.js
// ...
instance.loop();
Textures are images asynchronously loaded after building the instance and rendered by a WebGL renderer.
Loaders have been written to manage external resource fetching. You'll need the TextureLoader extension to load textures:
/public/main.js
import {TextureLoader} from "../src/Loader/index.js";
// ...
const textureLoader = new TextureLoader(instance.getParameter("texture_path"));
We provide the texture_path parameter to the texture loader constructor.
All loaders have a load method, but each extension may have different arguments. The TextureLoader's load method requires the relative path of a JSON file located inside the textures directory. This file lists the paths of the textures you want to use.
Let's add this 64 × 64 texture (credit: Via Placeholder) to the project and reference it in textures.json.
/assets/textures/textures.json
[
"64x64.png"
]
You can now provide the JSON file's relative path to the texture loader:
/public/main.js
// ...
const textures = await textureLoader.load("textures.json");
The texture loader returns an array of images representing the loaded textures. The final step is to provide this array to the GUI composite's renderer:
/public/main.js
// ...
guiRenderer.createTextureArray(textures, false);
To do a quick test, open the browser's console, go to the Network tab and filter by image. You should see a loaded 64x64.png file.
Colors are monochromatic WebGL textures that are loaded synchronously. They are easier to set up than regular textures because they don't need to live in external files.
Colors are loaded through the texture loader's loadColors method. The first parameter is the list of colors and the second is the size to use when creating a monochromatic texture from a color.
/public/main.js
import {WebGLRenderer} from "../src/index.js";
// ...
const colors = textureLoader.loadColors([
{
name: "white",
value: Uint8Array.of(255, 255, 255, 255),
},
], WebGLRenderer.MAX_TEXTURE_SIZE);
Since colors and textures are stored the same way in renderers, you can send them together to the createTextureArray method:
/public/main.js
// ...
- guiRenderer.createTextureArray(textures, false);
+ guiRenderer.createTextureArray(textures.concat(colors), false);
// ...
Layers are virtual pages rendered by a GUI composite. They consist of a build method, which must return a root component.
If the component is structural, it can contain children, thus creating a component tree.
Start by creating a Layer directory inside public and put a DemoLayer.js file in it with the following code:
/public/Layer/DemoLayer.js
import {GUIComposite, Layer} from "../../src/gui/index.js";
export class DemoLayer extends Layer {
/**
* @param {GUIComposite} context
*/
build(context) {
throw new Error("Not implemented (yet)");
}
}
Import the layer file in the entrypoint and use the GUI composite's push method to push the layer to the render queue and display it on the screen.
/public/main.js
import {DemoLayer} from "./Layer/DemoLayer.js";
// ...
guiComposite.push(new DemoLayer());You should see the "Not implemented (yet)" error in the console. Let's replace it by a custom component.
The layer's build method should return an instance of a class extending one of the component abstract classes. Since a layer usually has multiple components, we'll use a group, which is the standard structural component extension, as the "foundation" and pass child components to it.
Import the Group class in the layer.
When calling a component constructor, a descriptor must be provided with options like:
- The alignment is a packed integer representing the component's horizontal and vertical origin from the screen's border.
- The margin is a Vector2 representing the component's offset from its alignment.
- The size is a Vector2 representing the component's width and height.
Groups also need the children descriptor option, which is a component array.
Create a Group instance with a simple descriptor and remove the "Not implemented (yet)" error.
/public/Layer/DemoLayer.js
+ import * as Alignment from "../../src/gui/Alignment/index.js";
+ import {Group} from "../src/gui/Component/index.js";
+ import {Vector2} from "../../src/math/index.js";
// ...
- throw new Error("Not implemented (yet)");
+ return new Group({
+ alignment: Alignment.left | Alignment.top,
+ margin: new Vector2(16, 16),
+ size: new Vector2(144, 64),
+ children: [],
+ });
// ...
Since structural components are used for layout purposes only, there won't be any visual changes.
However, visual components will get rendered, so let's extend the VisualComponent class into a new file.
Create a new Component directory inside public and add the following file to it:
/public/Component/Button.js
import {VisualComponent} from "../../src/gui/Component/index.js";
export class Button extends VisualComponent {}
To see our visual component on the screen, we need to give it at least one subcomponent.
Subcomponents are portions of the component's texture that can be moved, resized and rescaled. They are extensively used in text components (each character is a portion of the glyph map moved by its position in the text, resized by its glyph size and rescaled by the font size).
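The parenthetical above can be made concrete with a sketch of single-line glyph layout. The glyph map, tile spacing and subcomponent shape here are illustrative assumptions, and the real generator also handles newlines, color masks and width/offset exceptions:

```javascript
// Simplified single-line glyph layout (an illustration, not Raven's actual
// generator): each character becomes one "subcomponent" whose horizontal
// offset is the accumulated pen position, advanced by
// (tileWidth + tileSpacing) * fontSize.
function layoutGlyphs(string, glyphMap, tileHeight, tileSpacing, fontSize) {
	const glyphs = [];
	let x = 0;

	for (const character of string) {
		const {width, uv} = glyphMap[character];

		glyphs.push({
			offset: {x, y: 0},
			size: {x: width, y: tileHeight},
			uv,
			scale: {x: fontSize, y: fontSize},
		});

		x += (width + tileSpacing) * fontSize;
	}

	return {glyphs, width: x};
}

// Hypothetical two-glyph map:
const glyphMap = {a: {width: 5, uv: [0, 0]}, b: {width: 6, uv: [5, 0]}};
const {glyphs, width} = layoutGlyphs("ab", glyphMap, 12, 1, 2);
// "a" advances the pen by (5 + 1) * 2 = 12, "b" by (6 + 1) * 2 = 14.
```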
The VisualComponent class's constructor initializes the subcomponents to an empty array. In our case we want the array to contain one subcomponent which will cover the whole component area, so we'll override the default constructor.
/public/Component/Button.js
import {Subcomponent} from "../../src/gui/index.js";
// ...
/**
* @param {import("../../src/gui/Component/VisualComponent.js").VisualComponentDescriptor} descriptor
*/
constructor(descriptor) {
super(descriptor);
this.setSubcomponents([
new Subcomponent({
size: this.getSize(),
}),
]);
}
// ...
Back in the layer, you can push an instance of Button in the Group's children array:
/public/Layer/DemoLayer.js
import {Button} from "../Component/Button.js";
// ...
- children: [],
+ children: [
+ new Button({
+ alignment: Alignment.left | Alignment.top,
+ size: new Vector2(64, 64),
+ texture: context.getTexture("64x64.png"),
+ }),
+ ],
// ...
After refreshing the page, you should see the 64x64.png texture in the top-left corner of the page, with a small offset.
Raven supports bitmap fonts. To load one, you must provide a bitmap and a glyph map.
The bitmap is a PNG image which will be loaded as a texture and used by text components.
The glyph map is an object storing key-value pairs of glyphs and glyph data. This glyph data object contains two properties:
- width: the tile width on the texture (to avoid overflowing the next character)
- uv: an array representing the tile X/Y offset from the top-left corner of the bitmap
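For illustration, a glyph map file could contain entries like these (the actual keys and values in quiver.json differ):

```json
{
	"A": {"width": 5, "uv": [0, 0]},
	"B": {"width": 5, "uv": [6, 0]},
	"!": {"width": 1, "uv": [12, 0]}
}
```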
The demo features a bitmap and glyph map I made for Quiver, a little rogue-like. We'll use them as the GUI font.
Put the glyph map into the fonts directory and the bitmap into the textures directory. Don't forget to reference the bitmap in textures.json:
/assets/textures/textures.json
[
"64x64.png",
"quiver.png"
]
The font is now ready to be initialized. Open the entrypoint file and create a new bitmap font in the GUI composite descriptor's fonts option:
/public/main.js
import {BitmapFont} from "../src/fonts/index.js";
// ...
- fonts: {},
+ fonts: {
+ quiver: new BitmapFont({
+ glyphMapPath: "quiver.json",
+ texturePath: "quiver.png",
+ tileHeight: 12,
+ tileSpacing: 1,
+ }),
+ },
// ...
The tileHeight defines the height of all characters and the tileSpacing defines the horizontal spacing between two characters.
Now that the font is set up, you can start using it. We'll create another visual component to display text, and we'll alter its constructor arguments a bit to carry font data.
/public/Component/Text.js
import {BitmapFont} from "../../src/fonts/index.js";
import {GUIComposite} from "../../src/gui/index.js";
import {VisualComponent} from "../../src/gui/Component/index.js";
import {Vector2} from "../../src/math/index.js";
export class Text extends VisualComponent {
/**
* @param {String} text
* @param {Object} descriptor
* @param {Number} descriptor.alignment
* @param {Vector2} [descriptor.margin]
* @param {BitmapFont} descriptor.font
* @param {GUIComposite} descriptor.context
*/
constructor(text, descriptor) {
super({
alignment: descriptor.alignment,
margin: descriptor.margin,
size: new Vector2(),
});
}
}
To generate glyphs from a given text with the given font, two methods are available in the BitmapFont class:
- generateGlyphsFromString for single-line text
- generateGlyphsFromMultilineString for multiline text
For our purposes we'll use the single-line generator. Let's add a font option to the component's descriptor to let the client-side code select a font.
Note: As of v0.3.0, Raven now includes a standard Text component for multiline text rendering.
/public/Component/Text.js
- import {Vector2} from "../../src/math/index.js";
+ import {Vector2, Vector4} from "../../src/math/index.js";
// ...
const {glyphs, size} = descriptor.font.generateGlyphsFromString(text, 2, new Vector4(255, 255, 255, 255));
The returned object can be destructured into a glyphs array containing a subcomponent for each character of the text, generated with the font parameters we set earlier in the entrypoint, and a size Vector2 representing the total text size (which we'll apply to the component).
Let's also set the font's bitmap as the component's texture. This operation requires a GUI composite. That's why we ask for one in the descriptor's options.
/public/Component/Text.js
// ...
this.setSize(size);
this.setTexture(descriptor.context.getTexture(descriptor.font.getTexturePath()));
this.setSubcomponents(glyphs);
Then, import the Text class into the layer and create an instance of it after the Button instance:
/public/Layer/DemoLayer.js
import {Text} from "../Component/Text.js";
// ...
new Text("0 clicks", {
alignment: Alignment.left | Alignment.center,
margin: new Vector2(80, 0),
font: context.getFont("quiver"),
context,
}),
Refresh the page and you'll see the "0 clicks" text right next to the 64x64.png texture.
For the time being, we've only used the rendering pipeline of the GUI composite. To define interactions between components, Raven offers an Event API.
An event is an instance of the Event class containing a carry, and it can be dispatched and listened to by components. The API provides multiple extensions for keyboard and mouse listening:
These events can be listened to by adding their NAME value into the component's event array.
Let's make the Button class listen to the MouseDownEvent event:
/public/Layer/DemoLayer.js
import {MouseDownEvent} from "../../src/gui/Event/index.js";
// ...
new Button({
alignment: Alignment.left | Alignment.top,
size: new Vector2(64, 64),
+ events: [
+ MouseDownEvent.NAME,
+ ],
texture: context.getTexture("64x64.png"),
}),
// ...
Then, create a listener function inside the component class:
/public/Component/Button.js
- import {Subcomponent} from "../../src/gui/index.js";
+ import {GUIComposite, Subcomponent} from "../../src/gui/index.js";
import {MouseDownEvent} from "../../src/gui/Event/index.js";
import {Vector2} from "../../src/math/index.js";
// ...
/**
* @param {Vector2} carry
* @param {GUIComposite} context
*/
[MouseDownEvent.NAME](carry, context) {}
We can now react to mouse clicks on this component. But what if we want to tell the neighboring text component to update? We need a custom event which is:
- dispatched by Button
- listened to by Text
A custom event is a client-side class which extends Event, similar to layers and components.
Create an Event directory inside public and write an IncrementCountEvent class which will carry a number: the count.
/public/Event/IncrementCountEvent.js
import {Event} from "../../src/gui/Event/index.js";
/**
* @extends {Event<Number>}
*/
export class IncrementCountEvent extends Event {
static NAME = "increment_count";
}
Event names are usually written in snake case.
Then, add a private #count number to the Button class and increment it while dispatching an IncrementCountEvent when a MouseDownEvent is triggered:
/public/Component/Button.js
import {IncrementCountEvent} from "../Event/IncrementCountEvent.js";
// ...
+ /**
+ * @type {Number}
+ */
+ #count;
/**
* @param {import("../../src/gui/Component/VisualComponent.js").VisualComponentDescriptor} descriptor
*/
constructor(descriptor) {
super(descriptor);
+ this.#count = 0;
this.setSubcomponents([
new Subcomponent({
size: this.getSize(),
}),
]);
}
/**
* @param {Vector2} carry
* @param {GUIComposite} context
*/
- [MouseDownEvent.NAME](carry, context) {}
+ [MouseDownEvent.NAME](carry, context) {
+ this.#count++;
+
+ this.dispatchEvent(new IncrementCountEvent(this.#count));
+ }
Since mouse events are triggered across the whole screen and not just within the component bounds, we need to check that the pointer is located on the component before incrementing #count. We can do this by importing a utility from the math API.
/public/Component/Button.js
- import {Vector2} from "../../src/math/index.js";
+ import {Vector2, intersects} from "../../src/math/index.js";
// ...
[MouseDownEvent.NAME](carry, context) {
+ if (!intersects(carry, this.getPosition(), this.getSize())) {
+ return;
+ }
// ...
}
Update the Text class to listen to the IncrementCountEvent event and create a listener function for it:
/public/Component/Text.js
import {IncrementCountEvent} from "../Event/IncrementCountEvent.js";
// ...
super({
alignment: descriptor.alignment,
margin: descriptor.margin,
size: new Vector2(),
+ events: [
+ IncrementCountEvent.NAME,
+ ],
});
// ...
/**
* @param {Number} carry
* @param {GUIComposite} context
*/
[IncrementCountEvent.NAME](carry, context) {
debugger;
}
Refresh the page, open the console and try clicking on the 64x64.png texture to trigger the debugger. You should see 1 when hovering over the carry parameter.
Here is the event listener pipeline:
- Clear the old text
- Generate a new text with the event carry
- Render the new text
We'll use the GUI composite (context) to do the clearing/rendering. To generate the text, we need a bitmap font; we can store the one provided in the constructor inside a private variable.
/public/Component/Text.js
// ...
+ /**
+ * @type {BitmapFont}
+ */
+ #font;
// ...
constructor(text, descriptor) {
// ...
this.#font = descriptor.font;
- const {glyphs, size} = descriptor.font.generateGlyphsFromString(text, 2, new Vector4(255, 255, 255, 255));
+ const {glyphs, size} = this.#font.generateGlyphsFromString(text, 2, new Vector4(255, 255, 255, 255));
// ...
- this.setTexture(descriptor.context.getTexture(descriptor.font.getTexturePath()));
+ this.setTexture(descriptor.context.getTexture(this.#font.getTexturePath()));
// ...
}
/**
* @param {Number} carry
* @param {GUIComposite} context
*/
[IncrementCountEvent.NAME](carry, context) {
- debugger;
+ // Clear
+ {
+ // Set the texture to the same as the background color
+ this.setTexture(context.getTexture("white"));
+
+ // Push the component to the render queue
+ context.pushToRenderQueue(this);
+
+ // Render the queue
+ context.render();
+ }
+
+ // Generate the text
+ {
+ const text = `${carry} clicks`;
+ const {glyphs, size} = this.#font.generateGlyphsFromString(text, 2, new Vector4(255, 255, 255, 255));
+
+ this.setSize(size);
+ this.setTexture(context.getTexture(this.#font.getTexturePath()));
+ this.setSubcomponents(glyphs);
+ }
+
+ // Render
+ {
+ // Re-compute the currently rendered layers,
+ // because the component's size has changed
+ context.compute();
+
+ // Push the component to the render queue
+ context.pushToRenderQueue(this);
+
+ // Render the queue
+ context.render();
+ }
}
Refresh the page and click on the 64x64.png texture: the text should update with the correct count value.
The underlying graphics API will likely be replaced by WebGPU. Here are the main reasons why.
I want the library to run on platforms other than the web, and WebGPU seems like a good solution to me. At the time of writing, there are 2 main implementations of the WebGPU spec (Dawn and wgpu). Using one of these could enable support for multiple environments, including the web thanks to WASM compilation with a tool like Emscripten. Here is a list of the targeted platforms:
- Windows
- Linux
- Web (needs WASM/WebGPU support)
By migrating to WebGPU, we'll get new features that WebGL doesn't support, such as compute shaders and indirect draw/dispatch calls, as well as a performance boost in at least some cases. The main downside is the added code complexity that comes with WebGPU.
The library's language is being reconsidered. Since there are C++ and Rust WebGPU implementations, these languages are very probable choices, with a personal preference for C++.
The code structure will also be refactored with OOP in mind.
- More complex 2D components. The GUI should not be limited to quads. The reason components can only be quads is that it makes instanced drawing possible. Using meshes with different geometry will likely require a full rewrite of the GUI renderer.
- Component clearing and overdraw reduction. To clear a component with the current Composite API, you need to:
- set the component texture to one that matches the background
- push the component to the render queue
- render
This effectively creates a draw call for a single component, thus forgoing the advantages of instancing. The clear request should instead be added to the next render queue and executed like a draw request, because that's what it is: a component with a texture that matches the texture below it.
But there are a few problems with this method. Firstly, if a component gets cleared, the component above it will be at least partially erased as well. Secondly, how do we write a new Composite API that handles this "clear-draw" technique while remaining easy to develop with?
- No more layers and offscreen renderers. Layers could disappear entirely, due to the limit of active WebGL contexts (16 at best). A single web page may not be able to run multiple instances of the library at the same time.
If layers do get removed, their concept must remain because of its use with event propagation, GUI separation from the scene and post-processing overlays. One solution is to sort the components within a compute shader, where we could also do culling, but this is under consideration.
If layers are removed, their offscreen renderers will go with them. Not only is the limit of active contexts low, but the OffscreenCanvas API is also not supported everywhere. Like layers, their concept could remain, maybe in the form of framebuffers bound to the instance renderer.
- Perspective-projected GUIs. I'd like to add support for twisted GUIs (e.g. Subnautica/Lethal Company).
- 3D rendering integration. A Mesh abstract class could be written to allow extensions for 2D shapes (the existing Component class) and 3D shapes.