This guide is for teams building custom data pipelines (aggregators, market makers).
If you just need account lookups, use get_account_interface instead.
## Agent skill

Use the data-streaming agent skill to add Laserstream support to your project (the full skill definition is reproduced at the end of this page).
Light token accounts share the same base layout as SPL Token (165 bytes), so you can
use your existing parser. The streaming setup requires two gRPC subscriptions, both
targeting the Light Token Program:
| Subscription | Detects | How |
| --- | --- | --- |
| Account sub (owner: `cToken...`) | Hot state + cold-to-hot | Pubkey cache lookup |
| Transaction sub (account_include: `cToken...`) | Hot-to-cold | Balance heuristic (pre > 0, post == 0) |
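Concretely, the two subscriptions might be wired up as below. This is a sketch, not the library's documented setup: it assumes helius-laserstream re-exports the Yellowstone-style request types (`SubscribeRequest`, `SubscribeRequestFilterAccounts`, `SubscribeRequestFilterTransactions`) under its `grpc` module, as it does for the update types used later in this guide; the program id is taken from the skill's grep pattern.

```rust
use std::collections::HashMap;

use helius_laserstream::grpc::{
    SubscribeRequest, SubscribeRequestFilterAccounts, SubscribeRequestFilterTransactions,
};

const LIGHT_TOKEN_PROGRAM: &str = "cTokenmWW8bLPjZEBAUgYy3zKxQZW6VKi7bqNFEVv3m";

fn build_request() -> SubscribeRequest {
    // Account sub: every owned account while hot (and cold-to-hot via cache lookup).
    let mut accounts = HashMap::new();
    accounts.insert(
        "light-token-accounts".to_string(),
        SubscribeRequestFilterAccounts {
            owner: vec![LIGHT_TOKEN_PROGRAM.to_string()],
            ..Default::default()
        },
    );

    // Transaction sub: any transaction touching the program, for hot-to-cold detection.
    let mut transactions = HashMap::new();
    transactions.insert(
        "light-token-txs".to_string(),
        SubscribeRequestFilterTransactions {
            account_include: vec![LIGHT_TOKEN_PROGRAM.to_string()],
            ..Default::default()
        },
    );

    SubscribeRequest {
        accounts,
        transactions,
        ..Default::default()
    }
}
```

Field names follow Yellowstone gRPC conventions; verify them against the helius-laserstream version you pin.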
The account subscription delivers all state changes while accounts are hot.
The transaction subscription is needed to detect accounts going cold
(`compress_and_close` changes the owner to the System Program, so the account
subscription no longer matches).
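Cold-to-hot detection is then just a cache lookup on the account-subscription side: if an incoming account update is for a pubkey currently in the cold cache, that account was just decompressed. A stdlib-only sketch of the bookkeeping (the state and interface types are placeholders, not the real API):

```rust
use std::collections::HashMap;

// Placeholders for your parsed hot state and the cold AccountInterface.
type HotState = u64;
type ColdEntry = ();

/// Handle one update from the account subscription. Returns true when
/// this update promoted a previously cold account back to hot.
fn on_account_update(
    pubkey: [u8; 32],
    state: HotState,
    cache: &mut HashMap<[u8; 32], HotState>,
    cold_cache: &mut HashMap<[u8; 32], ColdEntry>,
) -> bool {
    let was_cold = cold_cache.remove(&pubkey).is_some();
    cache.insert(pubkey, state);
    was_cold
}

fn main() {
    let (mut cache, mut cold_cache) = (HashMap::new(), HashMap::new());
    cold_cache.insert([1u8; 32], ());

    // Update for a cold pubkey: cold-to-hot transition.
    assert!(on_account_update([1u8; 32], 42, &mut cache, &mut cold_cache));
    // Update for an already-hot pubkey: no transition.
    assert!(!on_account_update([1u8; 32], 43, &mut cache, &mut cold_cache));
    assert!(cold_cache.is_empty());
}
```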
```rust
use spl_pod::bytemuck::pod_from_bytes;
use spl_token_2022_interface::pod::PodAccount;

// Works for SPL Token, SPL Token-2022, and Light Token accounts.
let parsed: &PodAccount = pod_from_bytes(&data[..165])?;
```
For accounts with extensions, truncate to 165 bytes before parsing.
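For example, a guard like the following (a sketch with a dummy buffer standing in for real account data) keeps the base parse identical whether or not extension data follows the 165-byte layout:

```rust
/// Return the 165-byte SPL base layout, ignoring any extension tail.
fn base_account_bytes(data: &[u8]) -> Option<&[u8]> {
    if data.len() < 165 {
        return None; // too short to be a token account
    }
    Some(&data[..165])
}

fn main() {
    let with_extensions = vec![0u8; 200]; // dummy: base layout + extension tail
    let base = base_account_bytes(&with_extensions).unwrap();
    assert_eq!(base.len(), 165);
    assert!(base_account_bytes(&[0u8; 10]).is_none());
}
```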
For each transaction update, find accounts whose lamport balance dropped to zero. The `cache.remove` call ensures only accounts you're already tracking are processed.

Two data structures:

- `cache: HashMap<[u8; 32], T>` — hot account state (for quoting/routing)
- `cold_cache: HashMap<[u8; 32], AccountInterface>` — cold accounts with ColdContext (for building load instructions)
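Note that the transaction handler clones `cold_cache` into a spawned task, which implies a shared, thread-safe handle rather than a plain `HashMap`: for instance an `Arc<Mutex<HashMap<..>>>` or a concurrent map such as DashMap. A stdlib-only sketch of that shape, with a thread standing in for `tokio::spawn` and a string standing in for `AccountInterface`:

```rust
use std::collections::HashMap;
use std::sync::{Arc, Mutex};
use std::thread;

// Stand-in for Arc<DashMap<[u8; 32], AccountInterface>> or similar.
type ColdCache = Arc<Mutex<HashMap<[u8; 32], &'static str>>>;

fn main() {
    let cold_cache: ColdCache = Arc::new(Mutex::new(HashMap::new()));

    // Clone the Arc handle into a background task, as the transaction
    // handler does before tokio::spawn; all clones share one map.
    let handle = Arc::clone(&cold_cache);
    let worker = thread::spawn(move || {
        handle.lock().unwrap().insert([9u8; 32], "iface");
    });
    worker.join().unwrap();

    assert_eq!(cold_cache.lock().unwrap().get(&[9u8; 32]), Some(&"iface"));
}
```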
```rust
use helius_laserstream::grpc::subscribe_update::UpdateOneof;

Some(UpdateOneof::Transaction(tx_update)) => {
    if let Some(ref tx_info) = tx_update.transaction {
        for pubkey in find_closed_accounts(tx_info) {
            if cache.remove(&pubkey).is_some() {
                // Async: fetch AccountInterface with ColdContext.
                // Cold accounts are inactive, so this completes well
                // before anyone tries to swap through them.
                let rpc = rpc.clone();
                let cold_cache = cold_cache.clone();
                tokio::spawn(async move {
                    if let Ok(Some(iface)) = rpc.get_account_interface(&pubkey, None).await {
                        cold_cache.insert(pubkey, iface);
                    }
                });
            }
        }
    }
}
```
```rust
fn find_closed_accounts(
    tx_info: &helius_laserstream::grpc::SubscribeUpdateTransactionInfo,
) -> Vec<[u8; 32]> {
    let meta = match &tx_info.meta {
        Some(m) => m,
        None => return vec![],
    };
    let msg = match tx_info.transaction.as_ref().and_then(|t| t.message.as_ref()) {
        Some(m) => m,
        None => return vec![],
    };

    // Static keys first, then addresses loaded from lookup tables,
    // matching the ordering of pre_balances/post_balances.
    let mut all_keys: Vec<&[u8]> = msg.account_keys.iter().map(|k| k.as_slice()).collect();
    all_keys.extend(meta.loaded_writable_addresses.iter().map(|k| k.as_slice()));
    all_keys.extend(meta.loaded_readonly_addresses.iter().map(|k| k.as_slice()));

    let mut closed = Vec::new();
    for (i, key) in all_keys.iter().enumerate() {
        if key.len() == 32
            && meta.pre_balances.get(i).copied().unwrap_or(0) > 0
            && meta.post_balances.get(i).copied().unwrap_or(1) == 0
        {
            closed.push(<[u8; 32]>::try_from(*key).unwrap());
        }
    }
    closed
}
```
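The heuristic at the core of `find_closed_accounts` is independent of the gRPC types, so it can be checked in isolation against synthetic balance arrays. A sketch reimplementing just the filter:

```rust
/// Indices of accounts whose lamports went from nonzero to zero:
/// the same pre > 0 && post == 0 test used in find_closed_accounts.
fn drained_indices(pre: &[u64], post: &[u64]) -> Vec<usize> {
    let mut out = Vec::new();
    for i in 0..pre.len().min(post.len()) {
        if pre[i] > 0 && post[i] == 0 {
            out.push(i);
        }
    }
    out
}

fn main() {
    // Account 1 was closed (2_039_280 -> 0); account 0 only moved lamports;
    // account 2 was empty before and after.
    let pre = [5_000_000u64, 2_039_280, 0];
    let post = [4_995_000u64, 0, 0];
    assert_eq!(drained_indices(&pre, &post), vec![1]);
}
```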
`cache.remove` filters out unrelated closures in the same transaction. No discriminator check is needed — `compress_and_close` always drains lamports to zero.

To build transactions that decompress cold accounts, see Router Integration.
getAccountInfo returns null for cold accounts. get_account_interface() races
hot and cold lookups and returns raw account bytes that work with your standard SPL parser:
```rust
use light_client::rpc::{LightClient, LightClientConfig, Rpc};
use spl_pod::bytemuck::pod_from_bytes;
use spl_token_2022_interface::pod::PodAccount;

let config = LightClientConfig::new(
    "https://api.devnet.solana.com".to_string(),
    Some("https://photon.helius.com?api-key=YOUR_KEY".to_string()),
);
let client = LightClient::new(config).await?;

let result = client.get_account_interface(&pubkey, None).await?;
if let Some(account) = result.value {
    let parsed: &PodAccount = pod_from_bytes(&account.data()[..165])?;
    if account.is_cold() {
        // Compressed -- still valid for routing.
    }
}
```
To read extension data such as the compressible metadata, deserialize with the light-token-interface types:

```rust
use borsh::BorshDeserialize;
use light_token_interface::state::{ExtensionStruct, Token};

let token = Token::deserialize(&mut data.as_slice())?;
if let Some(exts) = &token.extensions {
    for ext in exts {
        if let ExtensionStruct::Compressible(info) = ext {
            // info.compression_authority, info.rent_sponsor, info.last_claimed_slot
        }
    }
}
```
| Variant | Description |
| --- | --- |
| `TokenMetadata(TokenMetadata)` | Name, symbol, URI, additional metadata |
| `PausableAccount(PausableAccountExtension)` | Marker: mint is pausable (no data; pause state lives on mint) |
---
description: Stream light-token accounts via Laserstream gRPC
allowed-tools: Bash, Read, Write, Edit, Glob, Grep, WebFetch, AskUserQuestion, Task, TaskCreate, TaskGet, TaskList, TaskUpdate, TaskOutput, mcp__deepwiki, mcp__zkcompression
---

## Stream light-token accounts via Laserstream gRPC

Context:
- Guide: https://zkcompression.com/light-token/toolkits/for-streaming-tokens
- Skills and resources index: https://zkcompression.com/skill.md
- Dedicated skill: https://github.com/Lightprotocol/skills/tree/main/skills/data-streaming
- Crates: helius-laserstream, light-token-interface, spl-pod, spl-token-2022-interface, borsh, futures
- Mint accounts streaming: https://zkcompression.com/light-token/toolkits/for-streaming-mints
- Point queries: light-client (LightClient, get_account_interface)

Key APIs: LaserstreamConfig, subscribe(), PodAccount (pod_from_bytes), LightClient::get_account_interface()

### 1. Index project
- Grep `helius_laserstream|laserstream|subscribe|PodAccount|pod_from_bytes|spl_token_2022_interface|cTokenmWW8bLPjZEBAUgYy3zKxQZW6VKi7bqNFEVv3m` across src/
- Glob `**/*.rs` and `**/Cargo.toml` for project structure
- Identify: existing gRPC streaming setup, token account caching, SPL parser usage
- Read Cargo.toml — note existing dependencies
- Task subagent (Grep/Read/WebFetch) if project has multiple crates to scan in parallel

### 2. Read references
- WebFetch the guide above — follow the Steps (Connect, Subscribe) and the transition detection sections
- WebFetch skill.md — check for a dedicated skill and resources matching this task
- TaskCreate one todo per phase below to track progress

### 3. Clarify intention
- AskUserQuestion: what is the goal? (new streaming pipeline for token accounts, add to existing pipeline, integrate cold/hot detection for routing)
- AskUserQuestion: mainnet or devnet?
- AskUserQuestion: do you need point queries (get_account_interface) in addition to streaming?
- Summarize findings and wait for user confirmation before implementing

### 4. Create plan
- Based on steps 1–3, draft an implementation plan
- Follow the guide's structure: Connect → Subscribe (account + transaction subs) → Detect Transitions (hot-to-cold, cold-to-hot) → Point Queries (optional)
- Token accounts use the same 165-byte SPL layout — existing SPL parsers work directly
- If anything is unclear or ambiguous, loop back to step 3 (AskUserQuestion)
- Present the plan to the user for approval before proceeding

### 5. Implement
- Add deps if missing: Bash `cargo add helius-laserstream@0.1 light-token-interface@0.3 spl-pod spl-token-2022-interface borsh@0.10 futures@0.3 bs58@0.5 tokio --features full`
- For point queries, also: Bash `cargo add light-client@0.19 --features v2`
- Follow the guide and the approved plan
- Write/Edit to create or modify files
- TaskUpdate to mark each step done

### 6. Verify
- Bash `cargo check`
- Bash `cargo test` if tests exist
- TaskUpdate to mark complete

### Tools
- mcp__zkcompression__SearchLightProtocol("<query>") for API details
- mcp__deepwiki__ask_question("Lightprotocol/light-protocol", "<q>") for architecture
- Task subagent with Grep/Read/WebFetch for parallel lookups
- TaskList to check remaining work