Context
Context is an object provided to scripted nodes during initialization (via the init function).
It serves as a bridge between your script and the Rive runtime, giving you access to:
- Update scheduling — Request that your node be updated on the next frame.
- Data (ViewModels) — Access the ViewModel data context bound to the node or the root artboard.
- Assets — Retrieve named assets (images, blobs, and audio) that have been added to the Rive file.
Methods
markNeedsUpdate
markNeedsUpdate()
Marks the node as needing an update on the next frame.
Call this when something has changed (e.g., from a listener callback) and you need the runtime to re-invoke your node’s update function.
function init(self: MyNode, context: Context): boolean
    self.context = context
    -- Access a ViewModel property and listen for changes
    local vm = context:viewModel()
    if vm then
        local name = vm:getString("name")
        if name then
            name:addListener(function()
                -- Re-trigger update when the data changes
                if self.context then
                    self.context:markNeedsUpdate()
                end
            end)
        end
    end
    return true
end
viewModel
viewModel() -> ViewModel?
Returns the ViewModel bound to the node’s immediate data context. This is the most common way
to read data-bound properties from a script.
function init(self: MyNode, context: Context): boolean
    local vmi = context:viewModel()
    if vmi then
        local cannon = vmi:getTrigger('cannon')
    end
    return true
end
rootViewModel
rootViewModel() -> ViewModel?
Returns the ViewModel bound to the root artboard’s data context.
Useful when you need to access top-level data from a deeply nested node.
function init(self: MyNode, context: Context): boolean
    local vmi = context:rootViewModel()
    if vmi then
        local cannon = vmi:getTrigger('cannon')
    end
    return true
end
dataContext
dataContext() -> DataContext?
Returns the data context provided to this node.
function init(self: MyNode, context: Context): boolean
    local dc = context:dataContext()
    if dc then
        local parentDC = dc:parent()
        local vm = dc:viewModel()
    end
    return true
end
image
image(name: string) -> Image?
Returns an image asset by name, or nil if not found.
The returned Image can be drawn with Renderer:drawImage.
See also ImageSampler.
Check out Scripting demos to see a working example.
type DrawImage = {
    myImage: Image?,
    sampler: ImageSampler?,
}

function init(self: DrawImage, context: Context): boolean
    self.myImage = context:image('myImage')
    self.sampler = ImageSampler('clamp', 'clamp', 'bilinear')
    return true
end

function draw(self: DrawImage, renderer: Renderer)
    if self.myImage and self.sampler then
        renderer:drawImage(self.myImage, self.sampler, 'srcOver', 1)
    end
end

return function(): Node<DrawImage>
    return {
        myImage = nil,
        sampler = nil,
        init = init,
        draw = draw,
    }
end
blob
blob(name: string) -> Blob?
Returns a Blob (raw binary data) asset by name, or nil if not found.
Useful for loading custom data files embedded in the Rive file.
type DrawBlob = {
    myBlob: Blob?,
}

function init(self: DrawBlob, context: Context): boolean
    self.myBlob = context:blob('myBlob')
    if self.myBlob then
        print('Blob name:', self.myBlob.name)
        print('Blob size:', self.myBlob.size, 'bytes')
        print('Blob data buffer:', self.myBlob.data)
    end
    return true
end

return function(): Node<DrawBlob>
    return {
        myBlob = nil,
        init = init,
    }
end
audio
audio(name: string) -> AudioSource?
Returns an AudioSource asset by name, or nil if not found. The returned source can be played using the global Audio API.
type AudioExample = {
    play: Input<Trigger>,
    source: AudioSource?,
}

function init(self: AudioExample, context: Context): boolean
    self.source = context:audio("myAudio")
    return true
end

function playSound(self: AudioExample)
    if self.source then
        Audio.play(self.source)
    end
end

return function(): Node<AudioExample>
    return {
        play = playSound,
        source = nil,
        init = init,
    }
end
canvas
canvas(desc: {width: number, height: number, clearColor: Color?}) -> Canvas
Creates a 2D canvas for drawing with the Rive Renderer.
Call beginFrame/endFrame inside your node's drawCanvas function to render into it.
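A minimal sketch of creating a canvas and bracketing a frame. The drawCanvas hook is named in the description above; its exact parameters, and whether beginFrame takes arguments or returns a value, are assumptions here, so check the Canvas reference for the precise signatures.
type CanvasNode = {
    canvas: Canvas?,
}

function init(self: CanvasNode, context: Context): boolean
    -- Dimensions are illustrative; clearColor is optional and omitted here
    self.canvas = context:canvas({ width = 256, height = 256 })
    return true
end

function drawCanvas(self: CanvasNode)
    if self.canvas then
        -- beginFrame/endFrame bracket all Rive Renderer drawing for this frame
        self.canvas:beginFrame()
        -- ... issue Renderer draw calls here ...
        self.canvas:endFrame()
    end
end

return function(): Node<CanvasNode>
    return {
        canvas = nil,
        init = init,
        drawCanvas = drawCanvas,
    }
end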
gpuCanvas
gpuCanvas(desc: {width: number, height: number, sampleCount: number?}) -> GPUCanvas
Creates a GPU canvas for custom GPU rendering.
Call beginRenderPass inside your node's drawCanvas function to render into it.
Pass sampleCount (2, 4, or 8) to enable MSAA on the canvas backing texture. When set, beginRenderPass automatically routes rendering through the MSAA color texture and resolves to the 1× backing; no extra setup is needed in the script.
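A sketch of creating a GPU canvas with MSAA enabled. The shape of the drawCanvas hook and the return value of beginRenderPass are assumptions; consult the GPUCanvas reference for the exact API.
type GPUNode = {
    canvas: GPUCanvas?,
}

function init(self: GPUNode, context: Context): boolean
    -- sampleCount = 4 requests 4x MSAA on the backing texture
    self.canvas = context:gpuCanvas({ width = 512, height = 512, sampleCount = 4 })
    return true
end

function drawCanvas(self: GPUNode)
    if self.canvas then
        -- With sampleCount set, beginRenderPass renders through the MSAA
        -- color texture and resolves to the 1x backing automatically
        local pass = self.canvas:beginRenderPass()
        -- ... bind a GPUPipeline and issue draws on the pass ...
    end
end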
features
features() -> GPUFeatures
Query GPU capabilities. Returns a table of supported features
and limits for the current backend.
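As a sketch, one way to inspect the returned capabilities, assuming GPUFeatures behaves as an ordinary Luau table (the exact keys vary by backend, so the code iterates rather than naming specific fields):
function init(self: MyNode, context: Context): boolean
    local caps = context:features()
    -- Print every reported feature/limit; key names depend on the backend
    for name, value in pairs(caps) do
        print(name, value)
    end
    return true
end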
preferredCanvasFormat
preferredCanvasFormat() -> TextureFormat
Returns the native canvas texture format for the current platform.
This is the format that context:gpuCanvas(...) backing textures use,
and therefore what canvas.format will report. Equivalent to WebGPU’s
navigator.gpu.getPreferredCanvasFormat().
- Metal (macOS/iOS): 'rgba8unorm' (off-screen canvas, not a CAMetalLayer surface)
- D3D11/D3D12 (Windows): 'bgra8unorm'
- Vulkan, OpenGL, WebGPU: 'rgba8unorm' (safe default; the actual surface format may vary, so use canvas.format for the authoritative value once a canvas exists)
Use this at init time to create GPUTexture and GPUPipeline
with a matching format before any canvas is drawn:
local fmt = context:preferredCanvasFormat()
self.pipeline = GPUPipeline.new({ colorTargets = {{ format = fmt }}, ... })
loadShader
loadShader(name: string) -> Shader?
Load a compiled shader by asset name. Returns a Shader ready for use
in GPUPipeline.new(), or nil if the named shader is not found.
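A hedged sketch of loading a shader and pairing it with the platform canvas format. Only the colorTargets field of the GPUPipeline.new descriptor appears earlier on this page; the shader field used below is an assumption, so consult the GPUPipeline reference for the real descriptor shape.
function init(self: MyNode, context: Context): boolean
    local shader = context:loadShader('myShader')
    if shader == nil then
        print('shader asset not found')
        return false
    end
    -- Match the pipeline's color target to the platform canvas format
    -- (see preferredCanvasFormat above)
    local fmt = context:preferredCanvasFormat()
    self.pipeline = GPUPipeline.new({
        shader = shader, -- assumed field name
        colorTargets = {{ format = fmt }},
    })
    return true
end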
decodeImage
decodeImage(data: buffer) -> Promise<DecodedImage>
Decode compressed image data (PNG, JPEG, WebP) into premultiplied RGBA8 pixels.
Returns a Promise that resolves with a DecodedImage table.
Use with await() inside async(), or chain with :andThen().
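A sketch that decodes the bytes of a blob asset using await inside async. The asset name 'photo' and the DecodedImage field names printed at the end are illustrative assumptions; see the DecodedImage reference for the exact shape.
function init(self: MyNode, context: Context): boolean
    local blob = context:blob('photo')
    if blob then
        async(function()
            -- await suspends until the Promise resolves
            local decoded = await(context:decodeImage(blob.data))
            -- width/height field names are assumptions
            print('decoded', decoded.width, 'x', decoded.height)
        end)
    end
    return true
end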