This guide shows how to stream light-mint accounts and metadata via Laserstream gRPC. The pipeline surfaces the following events:

| Event | Description |
| --- | --- |
| New mints | Raw mint data |
| Mint updates | Supply changes, authority changes |
| TokenMetadata | Name, symbol, URI, `additional_metadata` |
| Cold/hot transitions | Mint compressed or decompressed |
This guide is for teams building custom data pipelines (aggregators, market makers). If you just need account lookups, use `get_account_interface` instead.
Agent skill

Use the data-streaming agent skill to add Laserstream support to your project; the full skill file is reproduced at the end of this guide.
Light mints are Solana accounts owned by the Light Token Program. The streaming
setup requires two gRPC subscriptions:
| Subscription | Detects | How |
| --- | --- | --- |
| Account sub (`owner: cToken...`, `account_type == 1`) | Hot state + cold-to-hot | Pubkey cache lookup |
| Transaction sub (`account_include: cToken...`) | Hot-to-cold | Balance heuristic (`pre > 0, post == 0`) |
The account subscription delivers all state changes while mints are hot. The transaction subscription is needed to detect mints going cold: CompressAndCloseMint changes the account's owner to the System Program, so the owner filter on the account subscription no longer matches it.
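A minimal sketch of a request carrying both subscriptions, assuming helius-laserstream re-exports the Yellowstone-style filter types under `helius_laserstream::grpc` (the same module the update types below come from); the exact filter fields and the `"light-mints"`/`"light-txs"` labels are assumptions:

```rust
use std::collections::HashMap;

use helius_laserstream::grpc::{
    SubscribeRequest, SubscribeRequestFilterAccounts, SubscribeRequestFilterTransactions,
};

/// Light Token Program id, as referenced in this guide.
const LIGHT_TOKEN_PROGRAM: &str = "cTokenmWW8bLPjZEBAUgYy3zKxQZW6VKi7bqNFEVv3m";

fn build_request() -> SubscribeRequest {
    // Account sub: every account owned by the Light Token Program while hot.
    // The account_type == 1 (mint) check is applied after deserialization in
    // this sketch; a data filter on the subscription would also work.
    let mut accounts = HashMap::new();
    accounts.insert(
        "light-mints".to_string(),
        SubscribeRequestFilterAccounts {
            owner: vec![LIGHT_TOKEN_PROGRAM.to_string()],
            ..Default::default()
        },
    );

    // Transaction sub: transactions touching the program, used only to detect
    // hot-to-cold transitions (the account sub stops matching once the owner
    // changes to the System Program).
    let mut transactions = HashMap::new();
    transactions.insert(
        "light-txs".to_string(),
        SubscribeRequestFilterTransactions {
            account_include: vec![LIGHT_TOKEN_PROGRAM.to_string()],
            vote: Some(false),
            failed: Some(false),
            ..Default::default()
        },
    );

    SubscribeRequest {
        accounts,
        transactions,
        ..Default::default()
    }
}
```

Pass the request to `subscribe()` along with your `LaserstreamConfig`, then poll the returned stream with `futures::StreamExt` and match each update as shown below.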
The pipeline keeps two caches:

- `cache: HashMap<[u8; 32], T>`: hot account state (for quoting/routing)
- `cold_cache: HashMap<[u8; 32], AccountInterface>`: cold accounts with `ColdContext` (for building load instructions)
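The match arms below live inside that stream loop. The account branch keeps `cache` current and catches cold-to-hot transitions; this is a sketch that assumes the account update exposes `pubkey` and `data` as byte vectors and that `Mint` is the borsh type from light-token-interface (import path assumed):

```rust
use borsh::BorshDeserialize;
use helius_laserstream::grpc::subscribe_update::UpdateOneof;
use light_token_interface::state::Mint; // module path assumed

Some(UpdateOneof::Account(acct_update)) => {
    if let Some(acct) = acct_update.account {
        if let Ok(pubkey) = <[u8; 32]>::try_from(acct.pubkey.as_slice()) {
            // The account_type == 1 (mint) check from the table above is
            // assumed to happen here; this sketch simply skips data that
            // fails to deserialize as a Mint.
            if let Ok(mint) = Mint::deserialize(&mut acct.data.as_slice()) {
                // Seeing an account we tracked as cold means it was loaded
                // back on-chain: cold-to-hot. Drop the stale ColdContext.
                cold_cache.remove(&pubkey);
                cache.insert(pubkey, mint);
            }
        }
    }
}
```

The transaction branch below evicts mints that go cold and backfills `cold_cache` asynchronously.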
```rust
use helius_laserstream::grpc::subscribe_update::UpdateOneof;

Some(UpdateOneof::Transaction(tx_update)) => {
    if let Some(ref tx_info) = tx_update.transaction {
        for pubkey in find_closed_accounts(tx_info) {
            if cache.remove(&pubkey).is_some() {
                // Async: fetch AccountInterface with ColdContext.
                // Cold accounts are inactive, so this completes well
                // before anyone tries to swap through them.
                let rpc = rpc.clone();
                let cold_cache = cold_cache.clone();
                tokio::spawn(async move {
                    if let Ok(Some(iface)) =
                        rpc.get_account_interface(&pubkey, None).await
                    {
                        cold_cache.insert(pubkey, iface);
                    }
                });
            }
        }
    }
}
```
```rust
fn find_closed_accounts(
    tx_info: &helius_laserstream::grpc::SubscribeUpdateTransactionInfo,
) -> Vec<[u8; 32]> {
    let meta = match &tx_info.meta {
        Some(m) => m,
        None => return vec![],
    };
    let msg = match tx_info.transaction.as_ref().and_then(|t| t.message.as_ref()) {
        Some(m) => m,
        None => return vec![],
    };

    let mut all_keys: Vec<&[u8]> = msg.account_keys.iter().map(|k| k.as_slice()).collect();
    all_keys.extend(meta.loaded_writable_addresses.iter().map(|k| k.as_slice()));
    all_keys.extend(meta.loaded_readonly_addresses.iter().map(|k| k.as_slice()));

    let mut closed = Vec::new();
    for (i, key) in all_keys.iter().enumerate() {
        if key.len() == 32
            && meta.pre_balances.get(i).copied().unwrap_or(0) > 0
            && meta.post_balances.get(i).copied().unwrap_or(1) == 0
        {
            closed.push(<[u8; 32]>::try_from(*key).unwrap());
        }
    }
    closed
}
```
`cache.remove` filters out unrelated closures in the same transaction. No discriminator check is needed, because `compress_and_close` always drains lamports to zero.

To build transactions that decompress cold accounts, see Router Integration.
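To index token names and symbols, read the `TokenMetadata` extension off a deserialized mint: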
```rust
fn extract_metadata(mint: &Mint) -> Option<(String, String, String)> {
    let extensions = mint.extensions.as_ref()?;
    for ext in extensions {
        if let ExtensionStruct::TokenMetadata(m) = ext {
            let name = String::from_utf8_lossy(&m.name).to_string();
            let symbol = String::from_utf8_lossy(&m.symbol).to_string();
            let uri = String::from_utf8_lossy(&m.uri).to_string();
            return Some((name, symbol, uri));
        }
    }
    None
}
```
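One way to wire this in is to call `extract_metadata` in the account branch right after `Mint::deserialize` succeeds, so name, symbol, and URI stay current as updates arrive.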
The data-streaming skill file:

```markdown
---
description: Stream light-mint accounts and metadata via Laserstream gRPC
allowed-tools: Bash, Read, Write, Edit, Glob, Grep, WebFetch, AskUserQuestion, Task, TaskCreate, TaskGet, TaskList, TaskUpdate, TaskOutput, mcp__deepwiki, mcp__zkcompression
---

## Stream light-mint accounts and metadata via Laserstream gRPC

Context:
- Guide: https://zkcompression.com/light-token/toolkits/for-streaming-mints
- Skills and resources index: https://zkcompression.com/skill.md
- Dedicated skill: https://github.com/Lightprotocol/skills/tree/main/skills/data-streaming
- Crates: helius-laserstream, light-token-interface, borsh, futures
- Token accounts streaming: https://zkcompression.com/light-token/toolkits/for-streaming-tokens

Key APIs: LaserstreamConfig, subscribe(), Mint::deserialize(), ExtensionStruct::TokenMetadata

### 1. Index project

- Grep `helius_laserstream|laserstream|subscribe|StreamExt|light_token_interface|Mint|cTokenmWW8bLPjZEBAUgYy3zKxQZW6VKi7bqNFEVv3m` across src/
- Glob `**/*.rs` and `**/Cargo.toml` for project structure
- Identify: existing gRPC streaming setup, account caching, deserialization logic
- Read Cargo.toml and note existing dependencies
- Task subagent (Grep/Read/WebFetch) if project has multiple crates to scan in parallel

### 2. Read references

- WebFetch the guide above and follow the Steps (Connect, Subscribe, Deserialize, Detect cold, Extract metadata)
- WebFetch skill.md to check for a dedicated skill and resources matching this task
- TaskCreate one todo per phase below to track progress

### 3. Clarify intention

- AskUserQuestion: what is the goal? (new streaming pipeline for mints, add mint streaming to existing token streaming, specific use case like metadata indexing)
- AskUserQuestion: mainnet or devnet?
- AskUserQuestion: do you need cold/hot transition detection, or just live mint state?
- Summarize findings and wait for user confirmation before implementing

### 4. Create plan

- Based on steps 1–3, draft an implementation plan
- Follow the guide's step order: Connect → Subscribe (account + transaction subs) → Deserialize Mint → Detect Cold → Extract Metadata
- If anything is unclear or ambiguous, loop back to step 3 (AskUserQuestion)
- Present the plan to the user for approval before proceeding

### 5. Implement

- Add deps if missing: Bash `cargo add helius-laserstream@0.1 light-token-interface@0.3 borsh@0.10 futures@0.3 bs58@0.5 tokio --features full`
- Follow the guide and the approved plan
- Write/Edit to create or modify files
- TaskUpdate to mark each step done

### 6. Verify

- Bash `cargo check`
- Bash `cargo test` if tests exist
- TaskUpdate to mark complete

### Tools

- mcp__zkcompression__SearchLightProtocol("<query>") for API details
- mcp__deepwiki__ask_question("Lightprotocol/light-protocol", "<q>") for architecture
- Task subagent with Grep/Read/WebFetch for parallel lookups
- TaskList to check remaining work
```