srm-effects-core v2.0.0-alpha.1
Galacean Effects Core
Basic Concepts
In Galacean Effects, a Composition is the unit of animation playback. It is managed by the abstract class Composition, which is responsible for parsing data (JSON -> VFXItem / Texture -> mesh), creating and updating render frames (renderFrame) and render passes (renderPass).
Each composition uses animation data for different types of elements (VFXItems), including camera properties, multiple layers, particles, and interactive elements. When a composition is created, it completes the creation of elements (VFXItems), loading and creation of animation texture maps (Textures), and initialization of render frames (renderFrame) and render passes (renderPass).
At the beginning of the composition's lifecycle, the corresponding mesh is added to the default render pass (renderPass).
During the lifecycle, the Geometry and Material data contained in the mesh is updated.
When post-processing is required, the mesh is split into the appropriate renderPass. At the end of the lifecycle, the corresponding mesh is removed from the renderFrame.
To play the animation, the engine retrieves the meshes from the renderFrame, adds them to the scene, and continuously calls the Composition's update function during the rendering loop to keep the data up to date.
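The loop below is a minimal sketch of that flow. The CompositionLike and EngineScene shapes are simplified assumptions standing in for the real effects-core types and the host engine's scene API; addIfMissing is a hypothetical helper.

```ts
// Simplified shapes standing in for the real effects-core / engine types.
interface EngineScene {
  addIfMissing(mesh: unknown): void; // hypothetical engine helper
}
interface CompositionLike {
  play(): void;
  update(deltaMs: number): void; // time unit is an assumption here
  renderFrame: { _renderPasses: { meshes: unknown[] }[] };
}

function startPlayback(composition: CompositionLike, scene: EngineScene) {
  composition.play();
  let last = performance.now();

  const tick = (now: number) => {
    // Drives VFXItem updates and mesh add/modify/remove on the renderFrame.
    composition.update(now - last);
    last = now;

    // Hand the current frame's meshes to the engine's scene.
    for (const pass of composition.renderFrame._renderPasses) {
      for (const mesh of pass.meshes) {
        scene.addIfMissing(mesh);
      }
    }
    requestAnimationFrame(tick);
  };

  requestAnimationFrame(tick);
}
```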
Process
1. Resource Loading and Creation
- Asset Download AssetManager: Before playing the animation, the JSON data along with binary resources (processBins) and image resources (processImages) are downloaded. Upon completion of the image downloads, the parameters for creating textures (Texture) are returned. In addition to basic resource downloading, the following features are supported:
  - Selective downloading of resources based on rendering levels.
  - After loading an image, image/text replacement is performed according to the configuration, and the modified image is saved as an imageData object by drawing on a Canvas.
  - Enabling the gl extension KHR_parallel_shader_compile to compile shaders after resource loading is completed.
- Texture Creation Texture: Textures are created based on the parameters obtained from the resource download process. The current texture object may be based on one of the creation types defined in the TextureSourceType enumeration.
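A minimal sketch of this load phase is shown below. The AssetManager surface, the loadScene entry point, and the option shapes are simplified assumptions based on the description above, not the exact effects-core signatures.

```ts
// Simplified assumptions for the load phase described above.
interface TextureOptions { sourceType: number /* one of TextureSourceType */ }
declare class AssetManager {
  constructor(options?: { renderLevel?: string });
  // Hypothetical entry point: downloads the JSON plus processBins and processImages.
  loadScene(url: string): Promise<{ jsonScene: unknown; textureOptions: TextureOptions[] }>;
}
declare class EngineTexture {
  static create(options: TextureOptions): EngineTexture;
}

async function loadAnimation(url: string, renderLevel?: string) {
  // renderLevel allows selective downloading of resources per rendering level.
  const assetManager = new AssetManager({ renderLevel });
  const scene = await assetManager.loadScene(url);

  // The downloaded image results carry the parameters used to build Textures;
  // the concrete creation path depends on each entry's TextureSourceType.
  const textures = scene.textureOptions.map((options) => EngineTexture.create(options));

  return { scene, textures };
}
```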
2. Animation Playback
- Composition: The composition manages the data processing and rendering setup for animation playback. The initialize function is called to initialize the VFXItemManager for JSON -> VFXItem processing. Additionally, the engine needs to retrieve the meshes at the appropriate time through composition.renderFrame and add them to the scene.
  - Static initialize method:
    - The engine needs to implement the creation of VFXItemManager and Composition instances, and convert texture parameters into Textures usable by the engine.
    - In the constructor, the following functions need to be called:
      - Plugin system pluginSystem.initializeComposition().
      - composition.resetRenderFrame(): Create and initialize the renderFrame.
      - composition.reset(): Parse the animation data and initialize the state of the render instance.
      - composition.play(): Start playing the composition animation.
  - update method: Used to call the renderFrame's methods to add/modify/delete meshes and to drive the update of the VFXItems' vertex data, uniform variable values, etc. The following functions need to be implemented:
    - updateVideo: Update video frames for video playback.
    - getRendererOptions: Return a blank Texture created from the data.
    - reloadTexture/offloadTexture: Reload/unload textures.
    - The meshes or rendering objects added to the scene can be retrieved through the renderFrame, and the interface can be freely designed in the Composition according to the engine's needs.
  - dispose method: When the composition's lifecycle comes to an end, this method is called based on the termination behavior. It executes the composition's disposal callbacks for the VFXItems and also destroys associated objects such as meshes and textures.
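The following is a rough sketch of an engine-side Composition subclass wired up as described above. The base-class declaration is a simplified stand-in for the real abstract class, and names such as createBlankEngineTexture are illustrative.

```ts
// Simplified stand-in for the abstract Composition exposed by effects-core.
declare abstract class Composition {
  pluginSystem: { initializeComposition(composition: Composition): void };
  resetRenderFrame(): void;
  reset(): void;
  play(): void;
  dispose(): void;
  abstract updateVideo(): void;
  abstract getRendererOptions(): { emptyTexture: unknown };
  abstract reloadTexture(texture: unknown): void;
  abstract offloadTexture(texture: unknown): void;
}
declare function createBlankEngineTexture(): unknown; // hypothetical engine helper

class MyComposition extends Composition {
  constructor() {
    super();
    // Calls the core expects the constructor to trigger:
    this.pluginSystem.initializeComposition(this);
    this.resetRenderFrame(); // create and initialize the renderFrame
    this.reset();            // parse animation data, initialize render state
    this.play();             // start playing the composition animation
  }

  updateVideo() { /* advance video frames for video-backed textures */ }

  getRendererOptions() {
    // Return a blank Texture created from raw data.
    return { emptyTexture: createBlankEngineTexture() };
  }

  reloadTexture(texture: unknown) { /* re-upload the texture to the GPU */ }
  offloadTexture(texture: unknown) { /* release the GPU copy of the texture */ }

  override dispose() {
    super.dispose(); // runs VFXItem disposal callbacks, destroys meshes and textures
  }
}
```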
- RenderFrame: The RenderFrame can be understood as the rendering data object corresponding to each frame of the composition. In addition to managing the renderPass, it also stores the camera properties and the common uniform variable table (semantics) associated with the composition. The meshes corresponding to different types of elements are added and removed using the addMeshToDefaultRenderPass and removeMeshFromDefaultRenderPass methods of the renderFrame. The mesh is added to the appropriate position in the renderPass based on its priority property.
  - addMeshToDefaultRenderPass/removeMeshFromDefaultRenderPass:
    - For compositions without filter elements, the engine can manage all meshes through the defRenderPass, or it can directly place the passed-in mesh into its own scene. The engine can also organize and manage the meshes as required.
    - For compositions with filter elements involving post-processing, effects-core will call the splitDefaultRenderPassByMesh function to split the renderPass using the splitting parameters. In this case, the engine needs to iterate over renderFrame._renderPasses to retrieve meshes and add them to the scene.
    - When adding a mesh, the common uniforms used by the material can be obtained through mesh.material.uniformSemantics, including matrices related to MVP transformations and the attachments used.
  - setEditorTransformUniform: This method is used to set the translation/scaling transformation of an element after model transformation. The engine does not necessarily need to understand this concept; it can simply set the value to semantics[EDITOR_TRANSFORM].
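A possible shape for the engine-side RenderFrame, assuming a composition without filter elements, is sketched below. The base-class declaration, the Map-based semantics table, and the engineScene helper are simplified assumptions.

```ts
// Simplified stand-ins for the effects-core RenderFrame and mesh shapes.
interface EffectsMesh {
  priority: number; // lower values are drawn first
  material: { uniformSemantics: Record<string, string> }; // MVP matrices, attachments, ...
}
declare abstract class RenderFrame {
  semantics: Map<string, unknown>;
  _renderPasses: { meshes: EffectsMesh[] }[];
  abstract addMeshToDefaultRenderPass(mesh: EffectsMesh): void;
  abstract removeMeshFromDefaultRenderPass(mesh: EffectsMesh): void;
  abstract setEditorTransformUniform(value: Float32Array): void;
}
declare const engineScene: { add(mesh: EffectsMesh): void; remove(mesh: EffectsMesh): void };

class MyRenderFrame extends RenderFrame {
  // Without filter elements, the engine can simply take ownership of the mesh
  // and order its own draw list by `priority`.
  addMeshToDefaultRenderPass(mesh: EffectsMesh) {
    engineScene.add(mesh);
  }

  removeMeshFromDefaultRenderPass(mesh: EffectsMesh) {
    engineScene.remove(mesh);
  }

  // Post-model-transform translation/scale; if the engine has no such concept,
  // storing the value under the EDITOR_TRANSFORM semantic is sufficient.
  setEditorTransformUniform(value: Float32Array) {
    this.semantics.set('EDITOR_TRANSFORM', value);
  }
}
```

For compositions with filter elements, the engine would instead iterate over renderFrame._renderPasses after effects-core has split the default pass, as described above.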
- RenderPass: The meshes added to the scene can be obtained through renderPass.meshes. The render pass (renderPass) contains the meshes for the current pass, the operations for clearing the buffer before and after rendering, and attachments related to color, depth, and stencil. The delegate property is used to specify the callbacks before and after rendering for the renderPass, as defined in filters. The engine needs to execute these callbacks before actually rendering the meshes to ensure the correct operation of the filters.
- Mesh: Each VFXItem calls the Mesh.create() function during initialization, passing in parameters such as geometry and material, and sets/retrieves the rendering order for the current mesh using priority.
  - The static create method is used to create a new Mesh object that the engine can render. The engine needs to add geometry, material, and other objects to the mesh here.
    - The primitive type to be rendered can be obtained from geometry.mode.
  - The setter and getter functions for priority are used to set the rendering order of the current mesh. Meshes with lower priority values should be drawn before those with higher values.
  - setVisible/getVisible sets/gets the visibility of the mesh.
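A minimal engine-side Mesh might look like the sketch below; the class and its option shapes are illustrative assumptions, not the real effects-core implementation.

```ts
// Illustrative engine-side Mesh; option shapes are assumptions.
interface MeshOptions {
  geometry: { mode: number }; // geometry.mode carries the primitive type to render
  material: unknown;
  priority?: number;
}

class MyMesh {
  private _priority: number;
  private _visible = true;

  constructor(public readonly options: MeshOptions) {
    this._priority = options.priority ?? 0;
  }

  // Static factory called by each VFXItem during initialization.
  static create(options: MeshOptions) {
    return new MyMesh(options);
  }

  // Lower priority values must be drawn before higher ones.
  get priority() { return this._priority; }
  set priority(value: number) {
    this._priority = value;
    // Re-sort the engine's draw list here if it caches render order.
  }

  setVisible(visible: boolean) { this._visible = visible; }
  getVisible() { return this._visible; }
}
```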
Tips
- Each sprite VFXItem does not necessarily correspond to a single mesh. Layer elements are compared using a diff algorithm during frame updates to determine whether adjacent meshes have the same material properties, and the meshes are then split or merged accordingly.
- To obtain the mesh corresponding to the current VFXItem, you can use VFXItem.content.mesh to retrieve it.
3. Geometry
Each VFXItem calls the Geometry.create() function during initialization, passing in the element's drawing type, vertex data, and index data. During each frame update, new vertex data is written to the attributes.
1. The static create method: It processes the passed attribute data. If the data contains the dataSource property, it indicates that the attribute shares a buffer with the data source.
- size, offset, and stride are also passed in. If the data length is 0 and the engine does not allow dynamic resizing of the GPU buffer, an initialization array should be created using the maxVertex parameter.
2. setAttributeData/getAttributeData: Sets/retrieves attribute data for the specified attribute name.
3. setAttributeSubData: Sets partial attribute updates.
4. getIndexData/setIndexData: Retrieves/sets index data.
5. setDrawCount/getDrawCount: Sets/retrieves the draw count.
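A sketch of an engine-side Geometry covering the calls above is shown below; the option shapes (AttributeOptions, GeometryOptions) are simplified assumptions derived from this description.

```ts
// Simplified option shapes based on the description above.
interface AttributeOptions {
  data?: Float32Array;
  size: number;
  offset?: number;
  stride?: number;
  dataSource?: string; // shares a buffer with the named attribute
}
interface GeometryOptions {
  attributes: Record<string, AttributeOptions>;
  indices?: Uint16Array;
  mode: number;       // primitive type to render
  maxVertex?: number; // used to pre-size buffers when data is empty
  drawCount?: number;
}

class MyGeometry {
  private buffers = new Map<string, Float32Array>();
  private indices?: Uint16Array;
  private drawCount = 0;

  static create(options: GeometryOptions) {
    const geometry = new MyGeometry();
    for (const [name, attr] of Object.entries(options.attributes)) {
      if (attr.dataSource) continue; // shares the buffer owned by `dataSource`
      const data = attr.data && attr.data.length > 0
        ? attr.data
        // If the GPU buffer cannot be resized later, pre-allocate from maxVertex.
        : new Float32Array((options.maxVertex ?? 0) * attr.size);
      geometry.buffers.set(name, data);
    }
    geometry.indices = options.indices;
    geometry.drawCount = options.drawCount ?? 0;
    return geometry;
  }

  setAttributeData(name: string, data: Float32Array) { this.buffers.set(name, data); }
  getAttributeData(name: string) { return this.buffers.get(name); }

  setAttributeSubData(name: string, offset: number, data: Float32Array) {
    this.buffers.get(name)?.set(data, offset); // partial attribute update
  }

  setIndexData(data: Uint16Array) { this.indices = data; }
  getIndexData() { return this.indices; }

  setDrawCount(count: number) { this.drawCount = count; }
  getDrawCount() { return this.drawCount; }
}
```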
Attributes involved:
Sprite
1. aPoint: Float32Array - Vertex data
2. aIndex: Float32Array - Shared buffer with aPoint
3. Index data: Uint16Array
Particle
1. aPos: Float32Array
2. aVel: Float32Array - Shared buffer with aPos
3. aDirX: Float32Array - Shared buffer with aPos
4. aDirY: Float32Array - Shared buffer with aPos
5. aRot: Float32Array - Shared buffer with aPos
6. aSeed: Float32Array - Shared buffer with aRot
7. aColor: Float32Array - Shared buffer with aRot
8. aOffset: Float32Array
9. aSprite: Float32Array
10. Index data: Uint16Array
Particle-trail
1. aColor: Float32Array
2. aSeed: Float32Array - Shared buffer with aColor
3. aInfo: Float32Array - Shared buffer with aColor
4. aPos: Float32Array - Shared buffer with aColor
5. aTime: Float32Array
6. aDir: Float32Array
7. aTrailStart: Float32Array
8. aTrailStartIndex: Float32Array
4. Material
Each VFXItem calls the Material.create() function during initialization, passing the shader and uniform semantics. The states and uniform data of the material are not passed in the constructor parameters but are set through functions after material creation.
1. Static create method: It needs to handle the provided shader text and set the uniformSemantics.
2. Implementation of setter/getter methods for states: The constants passed in are glContext constants, which may need to be converted to the constants defined by the engine.
3. set[dataType]/get[dataType] methods for uniforms: effects-core will invoke the corresponding methods based on the type of the uniform to set data.
⚠️ Note: The related UBO calls are deprecated, and material-data-block does not need to be implemented.
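The sketch below shows one possible engine-side Material covering these calls. The uniform methods follow the set[dataType]/get[dataType] pattern described above; the specific data types, state names, and option shapes are assumptions.

```ts
// Simplified option shapes; the real effects-core signatures may differ.
interface MaterialOptions {
  shader: { vertex: string; fragment: string };
  uniformSemantics?: Record<string, string>;
}

class MyMaterial {
  uniformSemantics: Record<string, string>;
  private uniforms = new Map<string, unknown>();
  private states = new Map<string, unknown>();

  constructor(private options: MaterialOptions) {
    this.uniformSemantics = options.uniformSemantics ?? {};
    // Compile options.shader.vertex / options.shader.fragment with the engine here.
  }

  static create(options: MaterialOptions) {
    return new MyMaterial(options);
  }

  // States arrive as glContext constants (e.g. depth/blend modes); convert them
  // to the engine's own enums as needed.
  set depthTest(value: boolean) { this.states.set('depthTest', value); }
  get depthTest() { return this.states.get('depthTest') as boolean; }

  // set[dataType]/get[dataType] pattern: effects-core picks the method from the
  // declared type of each uniform.
  setVector4(name: string, value: Float32Array) { this.uniforms.set(name, value); }
  getVector4(name: string) { return this.uniforms.get(name) as Float32Array; }

  setMatrix(name: string, value: Float32Array) { this.uniforms.set(name, value); }
  setTexture(name: string, value: unknown) { this.uniforms.set(name, value); }
}
```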
Uniforms involved and their types:
Sprite
1. uMainData: mat4
2. uTexParams: vec4
3. uTexOffset: vec4
4. uSampler[i]: sampler2D
5. uSamplerPre: sampler2D
6. uFeatherSampler: sampler2D
Particle
1. uSprite: vec4
2. uParams: vec4
3. uAcceleration: vec4
4. uGravityModifierValue: vec4
5. uOpacityOverLifetimeValue: vec4
6. uRXByLifeTimeValue: vec4
7. uRYByLifeTimeValue: vec4
8. uRZByLifeTimeValue: vec4
9. uLinearXByLifetimeValue: vec4
10. uLinearYByLifetimeValue: vec4
11. uLinearZByLifetimeValue: vec4
12. uSpeedLifetimeValue: vec4
13. uOrbXByLifetimeValue: vec4
14. uOrbYByLifetimeValue: vec4
15. uOrbZByLifetimeValue: vec4
16. uSizeByLifetimeValue: vec4
17. uSizeYByLifetimeValue: vec4
18. uColorParams: vec4
19. uFSprite: vec4
20. uPreviewColor: vec4
21. uVCurveValues: vec4Array
22. uFCurveValues: vec4
23. uFinalTarget: vec3
24. uForceCurve: vec4
25. uOrbCenter: vec3
26. uTexOffset: vec2
27. uPeriodValue: vec4
28. uMovementValue: vec4
29. uStrengthValue: vec4
30. uWaveParams: vec4
API Documentation