Building Shield: Leveraging QuickNode Streams & Functions for Real-Time DeFi Transaction Monitoring
Introduction
Imagine trying to monitor thousands of DeFi transactions every second — tracking user positions, identifying whale movements, or catching arbitrage opportunities.
It’s like trying to spot individual raindrops in a thunderstorm. That’s the challenge developers face when building DeFi applications on Solana. Every trade, every liquidation, and every position change needs to be caught, processed, and acted upon instantly.
Shield changes this game entirely. Instead of drowning in complex transaction logs or missing critical market moments, developers can now tap into a real-time stream of clean, structured data that’s ready for analysis.
Whether you’re building a trading bot, training an AI model, or monitoring risk positions, Shield transforms raw Solana transaction chaos into actionable intelligence.
This technical guide explores how we leveraged QuickNode’s Streams and Functions to build a robust transaction monitoring system for DeFi protocols.
We’ll dive deep into the implementation details, explore code samples, and discuss how this solution addresses common challenges in blockchain data processing.
The Challenge: Real-Time DeFi Monitoring
Traditional blockchain data monitoring approaches often rely on polling methods, which come with several limitations:
- High infrastructure load from frequent API calls
- Latency issues leading to delayed updates
- Missed events between polling intervals
- Complex coordination for concurrent requests
- Inconsistent state management
These limitations become particularly problematic when monitoring DeFi positions, where even a few seconds of delay could result in missed opportunities or increased risk exposure.
Solution Architecture: QuickNode Streams Integration
Our solution leverages QuickNode’s WebSocket Streams to establish persistent connections for real-time transaction data. The architecture consists of three main components:
- Stream Connection Layer: Handles WebSocket connections and raw data ingestion
- Transaction Processing Pipeline: Filters and transforms transaction data
- Protocol-Specific Analytics: Processes transactions based on specific protocol rules
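Conceptually, the three layers compose into a single pipeline. The sketch below is illustrative only; the function names are placeholders, not Shield's actual modules:

```javascript
// Illustrative composition of the three layers; names are placeholders.
const ingest = (rawBatch) => rawBatch.filter(Boolean);             // Stream Connection Layer: drop empty frames
const transform = (txs) => txs.map(tx => ({ ...tx, seen: true })); // Processing Pipeline: normalize each tx
const analyze = (txs) => ({ count: txs.length });                  // Protocol Analytics: aggregate results

// Each batch from the stream flows through all three stages in order
const runPipeline = (rawBatch) => analyze(transform(ingest(rawBatch)));
```

Keeping the stages as separate functions makes each one independently testable and easy to swap out.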
Core Components Implementation
Let’s break down each component and examine the implementation details.
1. Stream Processing Pipeline
The main entry point of our system handles multiple types of transaction queries:
function main(stream, options = {}) {
  const {
    type,
    protocol,
    limit,
    timeframe,
    threshold,
    callback,
    walletAddress,
    mintAddress
  } = options;

  switch (type) {
    case 'all':
      return getAllTransactions(stream, limit);
    case 'byProtocol':
      return getTransactionsByProtocol(stream, protocol, limit);
    case 'volume':
      return getProtocolVolume(stream, protocol, timeframe);
    case 'activeWallets':
      return getActiveWallets(stream, protocol);
    case 'transferStats':
      return getTokenTransferStats(stream, protocol);
    case 'alertLarge':
      return alertOnLargeTransactions(stream, threshold, callback);
    case 'multiProtocolStats':
      return getMultiProtocolStats(stream);
    case 'dailyStats':
      return getDailyStats(stream, protocol);
    case 'walletSearch':
      return searchTransactionsByWallet(stream, walletAddress);
    case 'protocolFees':
      return calculateProtocolFees(stream, protocol);
    case 'valueTransferred':
      return calculateTotalValueTransferred(stream, mintAddress);
    default:
      return stream;
  }
}
This flexible entry point allows us to:
- Process different types of analytics queries
- Filter by protocol, timeframe, or specific addresses
- Apply various transformation and aggregation rules
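A call into the entry point might look like the following. This is a minimal, self-contained sketch: only the 'all' branch is stubbed in, and the simplified `getAllTransactions` (assumed here to slice the stream payload) stands in for the real helper:

```javascript
// Simplified stand-in for the real helper (assumption: it slices stream.data)
function getAllTransactions(stream, limit) {
  return limit ? stream.data.slice(0, limit) : stream.data;
}

// Trimmed version of the entry point, keeping only the 'all' branch
function main(stream, options = {}) {
  const { type, limit } = options;
  switch (type) {
    case 'all':
      return getAllTransactions(stream, limit);
    default:
      return stream; // unknown query types pass the stream through unchanged
  }
}

// Example: pull the first two transactions from a mock stream payload
const mockStream = { data: [{ id: 1 }, { id: 2 }, { id: 3 }] };
const firstTwo = main(mockStream, { type: 'all', limit: 2 });
```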
2. Protocol Recognition System
We maintain a registry of protocol identifiers to accurately categorize transactions:
const PROTOCOL_IDS = {
  MARGINFI: {
    MAIN: 'MFv2hWf31Z9kbCa1snEPYctwafyhdvnV7FZnsebVacA'
  },
  JUPITER: {
    SWAPS: 'JUP6LkbZbjS1jKKwapdHNy74zcZ3tLUZoi5QNyVTaV4',
    LIMIT_ORDER: 'jupoNjAxXgZ4rjzxzPMP4oxduvQsQtZzyknqvzYNrNu',
    DCA: 'DCA265Vj8a9CEuX1eb1LWRnDT7uK6q1xMipnNyatn23M'
  },
  OPENBOOK: {
    V2: 'opnb2LAfJYbRMAHHvqjCwQxanZn7ReEHp1k81EohpZb'
  },
  RAYDIUM: {
    OPEN_BOOK: '675kPX9MHTjS2zt1qfr1NYHuzeLXfQM9H24wFSUt1Mp8',
    STABLE_SWAP_AMM: '5quBtoiQqxF9Jv6KYKctB59NT3gtJD2Y65kdnB1Uev3h',
    STAKING: 'EhhTKczWMGQt46ynNeRX1WfeagwwJd7ufHvCDjRxjo5Q',
    STANDARD_AMM: 'CPMMoo8L3F4NbTegBCKVNunggL7H1ZpdTHKxQB5qKP1C',
    CONCENTRATED_LIQUIDITY: 'CAMMCzo5YL8w4VFF8KVHrK22GGUsp5VTaW7grrKgrWqK'
  }
};
This allows us to:
- Identify transactions by protocol
- Support multiple sub-protocols within each platform
- Maintain clean, extensible protocol definitions
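One way to put the registry to work is a reverse lookup from a transaction's program ID to its protocol labels. The `identifyProtocol` helper below is a hypothetical addition (not part of the original registry), shown against a trimmed copy of the protocol map:

```javascript
// Trimmed copy of the registry for a self-contained example
const PROTOCOL_IDS = {
  MARGINFI: {
    MAIN: 'MFv2hWf31Z9kbCa1snEPYctwafyhdvnV7FZnsebVacA'
  },
  JUPITER: {
    SWAPS: 'JUP6LkbZbjS1jKKwapdHNy74zcZ3tLUZoi5QNyVTaV4',
    DCA: 'DCA265Vj8a9CEuX1eb1LWRnDT7uK6q1xMipnNyatn23M'
  }
};

// Hypothetical helper: resolve a program ID to its protocol and sub-protocol
function identifyProtocol(programId) {
  for (const [protocol, subPrograms] of Object.entries(PROTOCOL_IDS)) {
    for (const [subType, id] of Object.entries(subPrograms)) {
      if (id === programId) return { protocol, subType };
    }
  }
  return null; // unrecognized program
}
```

Because the registry is a plain nested object, adding a new protocol is a one-line change and the lookup picks it up automatically.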
3. Transaction Processing Logic
The core transaction processing function handles the complex task of extracting and structuring relevant data:
function processTransaction(tx) {
  // By convention, the third account in the list is treated as the user's wallet
  const userWallet = tx.accounts[2]?.pubkey || null;
  const userAccount = tx.accounts.find(acc => acc.pubkey === userWallet);
  const balanceChange = userAccount
    ? userAccount.postBalance - userAccount.preBalance
    : 0;

  // Normalize token balance changes into a flat list of transfers
  const tokenTransfers = tx.tokenBalanceChanges.map(token => ({
    mint: token.mint,
    owner: token.owner,
    amount: token.uiTokenAmount.uiAmount,
    decimals: token.uiTokenAmount.decimals,
    rawAmount: token.uiTokenAmount.amount
  }));

  // Capture the balance delta for every account touched by the transaction
  const accounts = tx.accounts.map(acc => ({
    address: acc.pubkey,
    balanceChange: acc.postBalance - acc.preBalance
  }));

  return {
    transactionId: tx.signature,
    blockSlot: tx.slot,
    timestamp: tx.blockTime,
    success: tx.success,
    type: tx.type,
    protocol: tx.protocol,
    subType: tx.subType,
    program: tx.programId,
    userWallet,
    userBalanceChange: balanceChange,
    tokenTransfers,
    accountChanges: accounts,
    instruction: {
      index: tx.rawInstruction.index,
      data: tx.rawInstruction.data
    },
    // Keep only logs that carry protocol-specific information
    logs: tx.logs.filter(log =>
      !log.includes('invoke') &&
      !log.includes('success') &&
      !log.includes('consumed')
    ),
    processedAt: new Date().toISOString(),
    lastUpdated: new Date().toISOString()
  };
}
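To illustrate the shape of data flowing through this function, here is a trimmed variant run against a mock transaction payload. Both the mock fields and the simplified `summarizeTransaction` helper are assumptions for illustration, not Shield's actual output:

```javascript
// Trimmed variant of the processing logic: wallet detection and balance delta only
function summarizeTransaction(tx) {
  const userWallet = tx.accounts[2]?.pubkey || null;
  const userAccount = tx.accounts.find(acc => acc.pubkey === userWallet);
  const balanceChange = userAccount
    ? userAccount.postBalance - userAccount.preBalance
    : 0;
  return {
    transactionId: tx.signature,
    userWallet,
    userBalanceChange: balanceChange,
    transferCount: tx.tokenBalanceChanges.length
  };
}

// Mock payload mirroring the fields the pipeline reads (illustrative values)
const mockTx = {
  signature: 'abc123',
  accounts: [
    { pubkey: 'feePayer', preBalance: 10, postBalance: 9 },
    { pubkey: 'program', preBalance: 0, postBalance: 0 },
    { pubkey: 'user', preBalance: 100, postBalance: 95 }
  ],
  tokenBalanceChanges: [{ mint: 'mintA', owner: 'user' }]
};
const summary = summarizeTransaction(mockTx);
```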
Advanced Features and Capabilities
1. Volume Analytics
The system can calculate detailed volume statistics for any protocol:
function getProtocolVolume(stream, protocol, timeframe) {
  const now = Math.floor(Date.now() / 1000);
  const startTime = now - timeframe;

  const transactions = stream.data.filter(
    (tx) => tx.protocol === protocol &&
            tx.timestamp >= startTime &&
            tx.success
  );

  const volumeByToken = transactions.reduce((acc, tx) => {
    tx.tokenTransfers.forEach((transfer) => {
      const key = transfer.mint;
      acc[key] = (acc[key] || 0) + Math.abs(transfer.amount);
    });
    return acc;
  }, {});

  return {
    protocol,
    timeframe,
    volumeByToken,
    transactionCount: transactions.length
  };
}
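As a usage sketch, here is the same function run against a mock stream payload; the mock data below is an assumption for illustration:

```javascript
function getProtocolVolume(stream, protocol, timeframe) {
  const now = Math.floor(Date.now() / 1000);
  const startTime = now - timeframe;
  const transactions = stream.data.filter(
    (tx) => tx.protocol === protocol && tx.timestamp >= startTime && tx.success
  );
  const volumeByToken = transactions.reduce((acc, tx) => {
    tx.tokenTransfers.forEach((transfer) => {
      acc[transfer.mint] = (acc[transfer.mint] || 0) + Math.abs(transfer.amount);
    });
    return acc;
  }, {});
  return { protocol, timeframe, volumeByToken, transactionCount: transactions.length };
}

// Mock stream: one swap inside the last hour, one outside it
const now = Math.floor(Date.now() / 1000);
const stream = {
  data: [
    { protocol: 'jupiter', timestamp: now - 60, success: true,
      tokenTransfers: [{ mint: 'USDC', amount: -250 }, { mint: 'SOL', amount: 1.5 }] },
    { protocol: 'jupiter', timestamp: now - 7200, success: true, // falls outside the window
      tokenTransfers: [{ mint: 'USDC', amount: 999 }] }
  ]
};
const hourly = getProtocolVolume(stream, 'jupiter', 3600);
```

Note that `Math.abs` folds inflows and outflows into a single gross-volume figure per mint.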
2. Real-Time Monitoring
For critical operations, we implement real-time monitoring with customizable alerts:
function alertOnLargeTransactions(stream, threshold, callback) {
  stream.data.forEach((tx) => {
    const totalValue = tx.tokenTransfers.reduce(
      (sum, transfer) => sum + Math.abs(transfer.amount),
      0
    );
    if (totalValue >= threshold) {
      callback({
        transactionId: tx.transactionId,
        value: totalValue,
        protocol: tx.protocol,
        type: tx.type
      });
    }
  });
}
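Wiring up the callback might look like this; the threshold and mock stream below are illustrative:

```javascript
function alertOnLargeTransactions(stream, threshold, callback) {
  stream.data.forEach((tx) => {
    const totalValue = tx.tokenTransfers.reduce(
      (sum, transfer) => sum + Math.abs(transfer.amount),
      0
    );
    if (totalValue >= threshold) {
      callback({
        transactionId: tx.transactionId,
        value: totalValue,
        protocol: tx.protocol,
        type: tx.type
      });
    }
  });
}

// Collect alerts in an array; in production the callback might post to a
// webhook or push a notification instead
const alerts = [];
const stream = {
  data: [
    { transactionId: 'tx1', protocol: 'raydium', type: 'swap',
      tokenTransfers: [{ amount: 50000 }] },
    { transactionId: 'tx2', protocol: 'jupiter', type: 'swap',
      tokenTransfers: [{ amount: 12 }] }
  ]
};
alertOnLargeTransactions(stream, 10000, (alert) => alerts.push(alert));
```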
AI/ML Integration Capabilities
One of the most powerful features of our implementation is its ability to generate clean, structured data that’s ideal for training AI models and agents. This opens up entirely new possibilities for DeFi analytics and automation.
Training AI Agents with Shield
Developers can now leverage Shield to train their AI agents on real-time DeFi transactions and on-chain events. The structured data output from our processing pipeline is perfectly suited for AI model training:
// Example of structured output ideal for AI training
const aiReadyTransaction = {
  transactionId: tx.signature,
  timestamp: tx.blockTime,
  protocol: tx.protocol,
  type: tx.type,
  tokenTransfers: [{
    mint: "Token Mint Address",
    amount: "Normalized Amount",
    direction: "in/out"
  }],
  success: true,
  metrics: {
    volumeUSD: 1000,
    impactMetrics: {...},
    riskMetrics: {...}
  }
};
This structured format enables:
- Pattern recognition in trading behavior
- Risk analysis model training
- Anomaly detection systems
- Market sentiment analysis
- Predictive analytics for DeFi trends
Simplifying Solana Data Access
One of the most significant achievements of this implementation is making Solana data more accessible and analyzable. Solana’s high-performance blockchain generates massive amounts of valuable data, but its complexity has historically made it difficult to work with.
From Complex to Simple
Our system transforms complex Solana transaction data:
// Before: Complex Solana transaction data
{
  "signature": "5xw27...",
  "slot": 193884161,
  "err": null,
  "memo": null,
  "blockTime": 1684161866,
  "confirmationStatus": "finalized",
  "accounts": [
    // Complex nested account structures
  ],
  "innerInstructions": [
    // Complicated instruction arrays
  ]
}

// After: Clean, analyzed data
{
  "type": "swap",
  "protocol": "jupiter",
  "inputToken": {
    "mint": "5xw27...",
    "amount": 1.5
  },
  "outputToken": {
    "mint": "5xw27...",
    "amount": 30.25
  },
  "timestamp": 1684161866,
  "success": true,
  "metrics": {
    "priceImpact": 0.1,
    "slippage": 0.05
  }
}
Key Benefits for Data Analysis
This transformation provides several advantages:
- Simplified Integration: Clean, consistent data structure
- Reduced Processing Overhead: Pre-processed and normalized data
- Enhanced Analyzability: Clear transaction categorization
- AI-Ready Format: Structured for machine learning pipelines
- Real-Time Processing: Immediate access to processed data
Data Applications
The processed data enables various applications:
1. Market Analysis
- Trading volume analysis
- Price impact studies
- Liquidity monitoring
2. Risk Management
- Position monitoring
- Exposure analysis
- Risk metric calculation
3. User Behavior Analysis
- Trading patterns
- Portfolio composition
- User segmentation
4. AI/ML Models
- Predictive analytics
- Anomaly detection
- Strategy optimization
Shield’s Performance Optimizations
Several optimizations ensure efficient processing of high-volume transaction streams:
- Early Filtering: Transactions are filtered at the earliest possible stage to reduce processing overhead
- Efficient Data Structures: Use of Sets for unique value tracking and Maps for quick lookups
- Minimal Data Copying: The processing pipeline maintains references where possible
- Selective Log Processing: Only relevant logs are preserved and processed
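The Set/Map pattern mentioned above might look like this in practice. This is a sketch only, assuming a `collectStats` helper that tracks unique wallets and per-protocol transaction counts:

```javascript
// Sketch of the Set/Map optimization: O(1) membership checks and lookups
function collectStats(transactions) {
  const uniqueWallets = new Set();     // deduplicates wallet addresses automatically
  const txCountByProtocol = new Map(); // constant-time per-protocol counters

  for (const tx of transactions) {
    if (tx.userWallet) uniqueWallets.add(tx.userWallet);
    txCountByProtocol.set(tx.protocol, (txCountByProtocol.get(tx.protocol) || 0) + 1);
  }

  return {
    activeWallets: uniqueWallets.size,
    byProtocol: Object.fromEntries(txCountByProtocol)
  };
}

const stats = collectStats([
  { userWallet: 'walletA', protocol: 'jupiter' },
  { userWallet: 'walletA', protocol: 'raydium' },
  { userWallet: 'walletB', protocol: 'jupiter' }
]);
```

Compared with scanning an array for duplicates on every transaction, the Set keeps wallet deduplication at constant cost as the stream grows.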
The combination of real-time data processing, simplified Solana data access, and AI-ready outputs positions Shield as a powerful tool for the next generation of DeFi applications, analytics platforms, and AI agents.