How we implemented RFC 6962 Merkle Trees, Ed25519 signatures, and hash chain validation for MT4/MT5 trading platforms, and why no one had done it before
TL;DR
We just launched the world's first cryptographic audit trail implementation for MetaTrader 4/5 with ABLENET, Japan's leading MT4/MT5 VPS provider. This article covers:
- Why MT4/MT5's plain-text logs are a security nightmare
- The cryptographic architecture: Ed25519 + SHA-256 + Merkle Trees
- How we achieved MQL4/MQL5 integration without modifying the platform
- The "sidecar" pattern for non-intrusive deployment
- Performance benchmarks and optimization strategies
- Lessons learned from production validation
GitHub: github.com/veritaschain
IETF Draft: draft-kamimura-scitt-vcp
Evidence Report: VCP World's First Claim Verification
The Problem: MetaTrader's Plain-Text Audit Trail
Let's start with a harsh truth that every MT4/MT5 developer knows but rarely discusses publicly.
What MetaQuotes Actually Stores
Open any MT4/MT5 terminal and check your logs directory. You'll find files named YYYYMMDD.LOG containing entries like:
0 12:34:56.789 Trade order #12345678 buy 1.00 EURUSD at 1.08765 done
0 12:34:57.123 Trade order #12345678 sl 1.08500 tp 1.09200 done
That's it. Plain text. No signatures. No hashes. No verification mechanism.
The Official Documentation Confirms It
From MetaQuotes' own MQL5 Reference:
"Logs are not physically removed from the computer, they are still available in log files... stored in YYYYMMDD.LOG format."
And perhaps more tellingly:
"Cryptography is rarely used in MQL programs. There are not so many opportunities in everyday trading to use cryptography."
Why This Matters
Anyone with server access can:
# Edit trade history
sed -i 's/buy 1.00/buy 0.10/g' 20251225.LOG
# Change execution prices
sed -i 's/1.08765/1.08500/g' 20251225.LOG
# Delete inconvenient records
grep -v "order #12345678" 20251225.LOG > temp && mv temp 20251225.LOG
No cryptographic trail. No tamper detection. No way for traders to prove their records are authentic.
This isn't a theoretical vulnerability; it's actively exploited by fraudulent brokers running fake MT4/MT5 servers.
Why Hasn't Anyone Solved This Before?
Before building VCP, we conducted extensive competitive analysis. The results were surprising.
The Research
- 85+ search queries across Google, Bing, academic databases
- 28+ RegTech companies analyzed
- 22+ patents reviewed
- Multi-language searches in English, Japanese, Russian
What We Found: Nothing
Zero cryptographic audit trail solutions for MT4/MT5.
Why the Gap Exists
1. MQL4/MQL5 is a Closed Ecosystem
// MQL5 is C++-like but proprietary
// No external library imports
// Limited cryptographic primitives
// MetaQuotes controls everything
#include <Trade\Trade.mqh> // Only MetaQuotes libraries
// #include <openssl/sha.h> // NOT POSSIBLE
2. Major RegTech Targets Institutional Markets
Companies like NICE Actimize, Nasdaq SMARTS, and Eventus Systems build for:
- Institutional trading desks (FIX Protocol)
- Exchanges (proprietary APIs)
- Prime brokers (SWIFT integration)
The 9.6 million retail traders on MT4/MT5? Not their target market. A $10,000-$20,000 monthly broker license is too small a deal for enterprise sales teams.
3. Blockchain Analytics ≠ Audit Trail Creation
Chainalysis, Elliptic, and TRM Labs analyze existing blockchain transactions. They:
- Cannot create audit trails where none exist
- Cannot work with off-chain forex trades
- Have zero integration with MQL environments
The Scope of "World's First"
To be clear about our claim: general cryptographic audit mechanisms have existed since blockchain (2009) and Certificate Transparency (2013).
What's unprecedented is an open standard combining:
- Cryptographic verification (Ed25519, SHA-256, Merkle Trees)
- MetaTrader ecosystem integration (MQL4/MQL5)
- Production deployment capability
- Regulatory compliance mapping (EU AI Act, MiFID II, SEC 17a-4)
Full evidence documentation: VCP World's First Evidence Report
The VCP Architecture
Design Principles
1. Non-intrusive: zero modifications to MT4/MT5
2. Cryptographically rigorous: RFC-compliant primitives
3. Third-party verifiable: anyone can validate
4. Regulatory-aligned: EU AI Act, MiFID II, SEC 17a-4
5. Performance-conscious: <1 ms overhead per trade
Core Components
Three event modules feed a single pipeline:
- VCP-TRADE module: order entry, execution, modification
- VCP-GOV module: parameter changes, approvals
- VCP-RISK module: risk limit breaches, adjustments

Every event then flows through:
1. Hash Chain Engine: prev_hash + payload + timestamp → SHA-256 → hash_n
2. Merkle Tree (RFC 6962): record hashes are batched into a binary tree ([H0] [H1] [H2] [H3] → [H01] [H23] → [ROOT])
3. Signature Engine: Ed25519 (RFC 8032) signs the Merkle root
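To make the data flow concrete, here is a minimal sketch (not the production pipeline) of how the three stages compose, using the VCPSigner, HashChainEngine, and MerkleTree classes implemented in the sections below; the process_batch helper and its event format are illustrative.

import json
from dataclasses import asdict

def process_batch(events: list[dict]) -> dict:
    """Illustrative pipeline: hash chain -> Merkle tree -> Ed25519-signed root."""
    chain = HashChainEngine()   # defined in section 2 below
    tree = MerkleTree()         # defined in section 3 below
    signer = VCPSigner()        # defined in section 1 below

    for event in events:
        record = chain.add_record(event["event_type"], event["payload"])
        tree.add_leaf(json.dumps(asdict(record), sort_keys=True).encode())

    root = tree.build_tree()
    root_signature = signer.sign_record({"merkle_root": root.hex()})
    valid, _ = chain.validate_chain()
    return {
        "records": len(chain.chain),
        "chain_valid": valid,
        "merkle_root": root.hex(),
        "root_signature": root_signature.hex(),
    }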
Cryptographic Primitives: The Technical Details
1. Digital Signatures: Ed25519 (RFC 8032)
Why Ed25519 over RSA or ECDSA?
# Performance comparison (signatures/second on modern hardware)
# RSA-2048: ~1,000 sig/s
# ECDSA P-256: ~10,000 sig/s
# Ed25519: ~30,000 sig/s ← winner
# Key size comparison
# RSA-2048: 256-byte public key
# ECDSA P-256: 64-byte public key
# Ed25519: 32-byte public key ← winner
# Security level
# All provide ~128-bit security
# Ed25519 is designed for constant-time implementations, resisting timing attacks
Implementation:
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives import serialization
import json
class VCPSigner:
def __init__(self):
self.private_key = Ed25519PrivateKey.generate()
self.public_key = self.private_key.public_key()
def sign_record(self, record: dict) -> bytes:
"""Sign a VCP record with Ed25519"""
# Canonical JSON serialization (RFC 8785)
canonical = json.dumps(record, sort_keys=True, separators=(',', ':'))
return self.private_key.sign(canonical.encode('utf-8'))
def verify_signature(self, record: dict, signature: bytes) -> bool:
"""Verify Ed25519 signature"""
canonical = json.dumps(record, sort_keys=True, separators=(',', ':'))
try:
self.public_key.verify(signature, canonical.encode('utf-8'))
return True
except Exception:
return False
def export_public_key(self) -> str:
"""Export public key for third-party verification"""
return self.public_key.public_bytes(
encoding=serialization.Encoding.PEM,
format=serialization.PublicFormat.SubjectPublicKeyInfo
).decode('utf-8')
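A quick round trip (illustrative values) shows the flow: the exact record is needed to verify, and any change breaks the signature.

signer = VCPSigner()
record = {"event_type": "TRADE", "payload": {"order": 12345678, "symbol": "EURUSD"}}
sig = signer.sign_record(record)
print(signer.verify_signature(record, sig))                     # True
print(signer.verify_signature({**record, "payload": {}}, sig))  # False - record changed
print(signer.export_public_key())                               # share with third-party verifiers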
2. Hash Function: SHA-256 (FIPS 180-4)
The backbone of our hash chain:
import hashlib
import json
import time
from dataclasses import dataclass
from typing import Optional
@dataclass
class VCPRecord:
record_id: str # UUID v7 (RFC 9562)
timestamp: float # Unix timestamp with microseconds
event_type: str # TRADE, GOV, RISK
payload: dict # Event-specific data
prev_hash: str # Previous record's hash
hash: Optional[str] = None
signature: Optional[bytes] = None
class HashChainEngine:
def __init__(self):
self.chain: list[VCPRecord] = []
self.genesis_hash = "0" * 64 # Genesis block
def compute_hash(self, record: VCPRecord) -> str:
"""Compute SHA-256 hash of record"""
# Canonical representation
data = f"{record.record_id}|{record.timestamp}|{record.event_type}|"
data += json.dumps(record.payload, sort_keys=True, separators=(',', ':'))
data += f"|{record.prev_hash}"
return hashlib.sha256(data.encode('utf-8')).hexdigest()
def add_record(self, event_type: str, payload: dict) -> VCPRecord:
"""Add new record to chain"""
prev_hash = self.chain[-1].hash if self.chain else self.genesis_hash
record = VCPRecord(
record_id=self._generate_uuid_v7(),
timestamp=time.time(),
event_type=event_type,
payload=payload,
prev_hash=prev_hash
)
record.hash = self.compute_hash(record)
self.chain.append(record)
return record
def validate_chain(self) -> tuple[bool, Optional[int]]:
"""
Validate entire hash chain
Returns (is_valid, first_invalid_index)
Time complexity: O(n) - any tampering is detectable
"""
prev_hash = self.genesis_hash
for i, record in enumerate(self.chain):
# Verify link to previous
if record.prev_hash != prev_hash:
return False, i
# Verify hash integrity
computed = self.compute_hash(record)
if computed != record.hash:
return False, i
prev_hash = record.hash
return True, None
def _generate_uuid_v7(self) -> str:
"""Generate time-ordered UUID v7 (RFC 9562)"""
import uuid
# UUID v7: timestamp-based, sortable
return str(uuid.uuid7()) if hasattr(uuid, 'uuid7') else str(uuid.uuid4())
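A short tamper-detection example (illustrative values):

engine = HashChainEngine()
engine.add_record("TRADE", {"order": 1, "action": "BUY", "price": 1.0876})
engine.add_record("TRADE", {"order": 2, "action": "SELL", "price": 1.2650})
print(engine.validate_chain())               # (True, None)

engine.chain[0].payload["price"] = 1.0850    # tamper with the first record
print(engine.validate_chain())               # (False, 0) - broken at index 0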
3. Merkle Tree: RFC 6962 Compliance
The Merkle tree enables efficient third-party verification without revealing all records:
from typing import List, Tuple
import hashlib
class MerkleTree:
"""
RFC 6962-compliant Merkle Tree implementation
Used in Certificate Transparency, adapted for VCP
"""
def __init__(self, hash_func=hashlib.sha256):
self.hash_func = hash_func
self.leaves: List[bytes] = []
self.tree: List[List[bytes]] = []
def _hash(self, data: bytes) -> bytes:
return self.hash_func(data).digest()
def _leaf_hash(self, data: bytes) -> bytes:
"""RFC 6962: leaf hash = H(0x00 || data)"""
return self._hash(b'\x00' + data)
def _node_hash(self, left: bytes, right: bytes) -> bytes:
"""RFC 6962: node hash = H(0x01 || left || right)"""
return self._hash(b'\x01' + left + right)
def add_leaf(self, data: bytes) -> int:
"""Add leaf and return index"""
leaf_hash = self._leaf_hash(data)
self.leaves.append(leaf_hash)
return len(self.leaves) - 1
def build_tree(self) -> bytes:
"""Build tree and return root hash"""
if not self.leaves:
return self._hash(b'')
# Start with leaf hashes
current_level = self.leaves.copy()
self.tree = [current_level]
        while len(current_level) > 1:
            next_level = []
            for i in range(0, len(current_level), 2):
                if i + 1 < len(current_level):
                    next_level.append(self._node_hash(current_level[i], current_level[i + 1]))
                else:
                    # RFC 6962: an unpaired node is promoted to the next level unchanged
                    # (duplicating it would yield a different, non-compliant root and break proofs)
                    next_level.append(current_level[i])
            self.tree.append(next_level)
            current_level = next_level
        return current_level[0]  # Root hash
def get_proof(self, leaf_index: int) -> List[Tuple[bytes, str]]:
"""
Generate inclusion proof for leaf at index
Returns list of (hash, direction) tuples
"""
if not self.tree or leaf_index >= len(self.leaves):
raise ValueError("Invalid leaf index or tree not built")
proof = []
index = leaf_index
for level in self.tree[:-1]: # Exclude root
is_right = index % 2 == 1
sibling_index = index - 1 if is_right else index + 1
if sibling_index < len(level):
direction = 'left' if is_right else 'right'
proof.append((level[sibling_index], direction))
index //= 2
return proof
def verify_proof(self, leaf_data: bytes, proof: List[Tuple[bytes, str]],
root: bytes) -> bool:
"""Verify inclusion proof against root"""
current = self._leaf_hash(leaf_data)
for sibling_hash, direction in proof:
if direction == 'left':
current = self._node_hash(sibling_hash, current)
else:
current = self._node_hash(current, sibling_hash)
return current == root
# Usage example
def demonstrate_merkle_verification():
"""
Show how third-party can verify single record
without accessing entire database
"""
tree = MerkleTree()
# Add trade records
trades = [
b'{"order_id": 1, "action": "BUY", "symbol": "EURUSD", "price": 1.0876}',
b'{"order_id": 2, "action": "SELL", "symbol": "GBPUSD", "price": 1.2650}',
b'{"order_id": 3, "action": "BUY", "symbol": "USDJPY", "price": 149.50}',
b'{"order_id": 4, "action": "SELL", "symbol": "EURUSD", "price": 1.0880}',
]
for trade in trades:
tree.add_leaf(trade)
root = tree.build_tree()
print(f"Merkle Root: {root.hex()}")
# Generate proof for trade #2
proof = tree.get_proof(1)
# Third party can verify trade #2 exists without seeing other trades
is_valid = tree.verify_proof(trades[1], proof, root)
print(f"Trade #2 verification: {is_valid}")
# Tampered trade fails verification
tampered = b'{"order_id": 2, "action": "SELL", "symbol": "GBPUSD", "price": 1.2600}'
is_valid_tampered = tree.verify_proof(tampered, proof, root)
print(f"Tampered trade verification: {is_valid_tampered}")
# Output:
# Merkle Root: a1b2c3d4...
# Trade #2 verification: True
# Tampered trade verification: False
4. Time-Ordered Identifiers: UUID v7 (RFC 9562)
Why UUID v7 matters for audit trails:
import time
import os
import struct
def generate_uuid_v7() -> str:
"""
RFC 9562 UUID v7: Time-ordered unique identifier
Structure:
    Layout (128 bits):
      48-bit Unix timestamp (ms) | 4-bit version | 12-bit rand_a |
      2-bit variant | 62-bit rand_b
Benefits:
- Sortable by creation time (crucial for audit trails)
- No coordination needed (unlike sequential IDs)
- Database index-friendly
- Globally unique
"""
# 48-bit Unix timestamp in milliseconds
timestamp_ms = int(time.time() * 1000)
# Random bits
rand_bytes = os.urandom(10)
# Construct UUID v7
uuid_bytes = bytearray(16)
# Timestamp (48 bits)
uuid_bytes[0:6] = struct.pack('>Q', timestamp_ms)[2:8]
# Version 7 (4 bits) + rand_a (12 bits)
uuid_bytes[6] = 0x70 | (rand_bytes[0] & 0x0F)
uuid_bytes[7] = rand_bytes[1]
# Variant (2 bits) + rand_b (62 bits)
uuid_bytes[8] = 0x80 | (rand_bytes[2] & 0x3F)
uuid_bytes[9:16] = rand_bytes[3:10]
# Format as string
hex_str = uuid_bytes.hex()
return f"{hex_str[:8]}-{hex_str[8:12]}-{hex_str[12:16]}-{hex_str[16:20]}-{hex_str[20:]}"
# Demonstration: UUIDs are sortable by time
uuids = [generate_uuid_v7() for _ in range(5)]
time.sleep(0.01)
uuids.extend([generate_uuid_v7() for _ in range(5)])
print("UUID v7 values (time-ordered):")
for uuid in sorted(uuids):
print(f" {uuid}")
5. JSON Canonicalization: RFC 8785
Critical for deterministic hashing:
import json
from decimal import Decimal
def canonicalize_json(obj) -> str:
"""
RFC 8785 JSON Canonicalization Scheme (JCS)
Rules:
1. No whitespace between tokens
2. Object keys sorted lexicographically
3. Numbers in shortest form (no trailing zeros)
4. Strings use minimal escaping
5. UTF-8 encoding
"""
def normalize_value(v):
if isinstance(v, dict):
# Sort keys lexicographically
return {k: normalize_value(v[k]) for k in sorted(v.keys())}
elif isinstance(v, list):
return [normalize_value(item) for item in v]
elif isinstance(v, float):
# Handle special float cases
if v != v: # NaN
raise ValueError("NaN not allowed in canonical JSON")
if v == float('inf') or v == float('-inf'):
raise ValueError("Infinity not allowed in canonical JSON")
# Use shortest representation
return float(Decimal(str(v)).normalize())
else:
return v
normalized = normalize_value(obj)
return json.dumps(normalized, separators=(',', ':'), ensure_ascii=False)
# Why this matters:
original = {"b": 1, "a": 2, "c": {"z": 3, "y": 4}}
# Plain json.dumps: the byte output depends on how each producer ordered its keys
print(json.dumps(original))
# This producer emits:   {"b": 1, "a": 2, "c": {"z": 3, "y": 4}}
# Another might emit:    {"a": 2, "b": 1, "c": {"y": 4, "z": 3}}
# Canonical JSON: always the same
print(canonicalize_json(original))
# Always: {"a":2,"b":1,"c":{"y":4,"z":3}}
# This ensures hash(record) is always identical for same data
The Sidecar Architecture: Non-Intrusive Integration
The biggest technical challenge wasn't the cryptography; it was integrating with MT4/MT5 without modifying the platform.
The Problem
Inside the MT4/MT5 platform, the core engine, trade server, and log system are all closed components. Custom code runs only inside the MQL4/MQL5 sandbox, which imposes:
- No external library imports
- Limited system calls
- No direct network socket access
- Restricted file system access
The Solution: Sidecar Pattern
On the MT4/MT5 terminal side, a VCP Bridge EA hooks OnTrade() / OnTradeTransaction() and exports trade events (ORDER_SEND, ORDER_MODIFY, ORDER_CLOSE, POSITION_OPEN, POSITION_CLOSE) to the sidecar over a WebSocket connection, with file export as a fallback.

The VCP Sidecar runs as a separate process and pipes each event through four stages:
1. Event Collector: parses events, validates the format, queues them for processing
2. Hash Chain Engine: computes SHA-256, links to the previous record, generates a UUID v7
3. Merkle Builder: batches records, builds the tree, publishes the root
4. Signature Engine: signs the root, timestamps it, and optionally anchors it
MQL5 Bridge Implementation
//+------------------------------------------------------------------+
//| VCP_Bridge.mq5 |
//| VeritasChain Standards Organization |
//| https://veritaschain.org |
//+------------------------------------------------------------------+
#property copyright "VSO"
#property link "https://veritaschain.org"
#property version "1.00"
#include <Trade\Trade.mqh>
// WebSocket connection to VCP Sidecar
int ws_handle = INVALID_HANDLE;
string vcp_endpoint = "ws://localhost:8765/vcp";
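// Note: WebSocketCreate()/WebSocketSend()/WebSocketClose() are not part of the standard
// MQL5 API; they are assumed to come from a custom include or DLL wrapper, since MQL5
// natively exposes only the lower-level Socket* functions.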
//+------------------------------------------------------------------+
//| Expert initialization function |
//+------------------------------------------------------------------+
int OnInit()
{
// Initialize WebSocket connection
ws_handle = WebSocketCreate(vcp_endpoint);
if(ws_handle == INVALID_HANDLE)
{
PrintFormat("VCP Bridge: Failed to connect to %s", vcp_endpoint);
// Fallback to file-based export
return(INIT_SUCCEEDED);
}
PrintFormat("VCP Bridge: Connected to %s", vcp_endpoint);
return(INIT_SUCCEEDED);
}
//+------------------------------------------------------------------+
//| Expert deinitialization function |
//+------------------------------------------------------------------+
void OnDeinit(const int reason)
{
if(ws_handle != INVALID_HANDLE)
{
WebSocketClose(ws_handle);
}
}
//+------------------------------------------------------------------+
//| TradeTransaction function - captures all trade events |
//+------------------------------------------------------------------+
void OnTradeTransaction(const MqlTradeTransaction& trans,
const MqlTradeRequest& request,
const MqlTradeResult& result)
{
// Build VCP event payload
string payload = BuildVCPPayload(trans, request, result);
// Send to sidecar
if(ws_handle != INVALID_HANDLE)
{
WebSocketSend(ws_handle, payload);
}
else
{
// Fallback: write to file for sidecar to pick up
ExportToFile(payload);
}
}
//+------------------------------------------------------------------+
//| Build VCP-compliant JSON payload |
//+------------------------------------------------------------------+
string BuildVCPPayload(const MqlTradeTransaction& trans,
const MqlTradeRequest& request,
const MqlTradeResult& result)
{
string json = "{";
// Event metadata
   // GetMicrosecondCount() measures time since EA start, not epoch time, so we send
   // server time here; the sidecar attaches its own high-resolution receive timestamp
   json += StringFormat("\"timestamp\":%I64d,", (long)TimeTradeServer());
json += StringFormat("\"event_type\":\"%s\",", TransactionTypeToString(trans.type));
// Transaction details
json += "\"payload\":{";
json += StringFormat("\"deal\":%I64u,", trans.deal);
json += StringFormat("\"order\":%I64u,", trans.order);
json += StringFormat("\"symbol\":\"%s\",", trans.symbol);
json += StringFormat("\"type\":%d,", trans.type);
json += StringFormat("\"deal_type\":%d,", trans.deal_type);
json += StringFormat("\"price\":%f,", trans.price);
json += StringFormat("\"volume\":%f,", trans.volume);
json += StringFormat("\"sl\":%f,", trans.price_sl);
json += StringFormat("\"tp\":%f", trans.price_tp);
json += "}";
json += "}";
return json;
}
//+------------------------------------------------------------------+
//| Convert transaction type to string |
//+------------------------------------------------------------------+
string TransactionTypeToString(ENUM_TRADE_TRANSACTION_TYPE type)
{
switch(type)
{
case TRADE_TRANSACTION_ORDER_ADD: return "ORDER_ADD";
case TRADE_TRANSACTION_ORDER_UPDATE: return "ORDER_UPDATE";
case TRADE_TRANSACTION_ORDER_DELETE: return "ORDER_DELETE";
case TRADE_TRANSACTION_DEAL_ADD: return "DEAL_ADD";
case TRADE_TRANSACTION_DEAL_UPDATE: return "DEAL_UPDATE";
case TRADE_TRANSACTION_DEAL_DELETE: return "DEAL_DELETE";
case TRADE_TRANSACTION_HISTORY_ADD: return "HISTORY_ADD";
case TRADE_TRANSACTION_HISTORY_UPDATE: return "HISTORY_UPDATE";
case TRADE_TRANSACTION_HISTORY_DELETE: return "HISTORY_DELETE";
case TRADE_TRANSACTION_POSITION: return "POSITION";
case TRADE_TRANSACTION_REQUEST: return "REQUEST";
default: return "UNKNOWN";
}
}
//+------------------------------------------------------------------+
//| File-based export fallback |
//+------------------------------------------------------------------+
void ExportToFile(string payload)
{
string filename = StringFormat("VCP_%s.json",
TimeToString(TimeCurrent(), TIME_DATE));
int handle = FileOpen(filename, FILE_WRITE|FILE_READ|FILE_TXT|FILE_ANSI);
if(handle != INVALID_HANDLE)
{
FileSeek(handle, 0, SEEK_END);
FileWriteString(handle, payload + "\n");
FileClose(handle);
}
}
Python Sidecar Service
#!/usr/bin/env python3
"""
VCP Sidecar Service
Receives events from MT4/MT5 bridge, processes through VCP pipeline
"""
import asyncio
import websockets
import json
from dataclasses import dataclass, asdict
from typing import Optional
import logging
from vcp_core import HashChainEngine, MerkleTree, VCPSigner
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
class VCPSidecar:
def __init__(self, host: str = "localhost", port: int = 8765):
self.host = host
self.port = port
self.hash_chain = HashChainEngine()
self.merkle_tree = MerkleTree()
self.signer = VCPSigner()
self.batch_size = 100
self.current_batch = []
    async def handle_connection(self, websocket, path=None):
        """Handle a WebSocket connection from the MT bridge.
        Note: newer releases of the websockets library call the handler with only
        the connection object, so `path` is optional here."""
client_id = id(websocket)
logger.info(f"New connection: {client_id}")
try:
async for message in websocket:
await self.process_event(message, websocket)
except websockets.exceptions.ConnectionClosed:
logger.info(f"Connection closed: {client_id}")
async def process_event(self, message: str, websocket):
"""Process incoming trade event"""
try:
event = json.loads(message)
# Add to hash chain
record = self.hash_chain.add_record(
event_type=event.get('event_type', 'UNKNOWN'),
payload=event.get('payload', {})
)
# Sign the record
record.signature = self.signer.sign_record(asdict(record))
# Add to current batch
self.current_batch.append(record)
# Build Merkle tree when batch is full
if len(self.current_batch) >= self.batch_size:
await self.finalize_batch()
# Send acknowledgment
ack = {
"status": "ok",
"record_id": record.record_id,
"hash": record.hash
}
await websocket.send(json.dumps(ack))
except json.JSONDecodeError as e:
logger.error(f"Invalid JSON: {e}")
await websocket.send(json.dumps({"status": "error", "message": str(e)}))
async def finalize_batch(self):
"""Finalize current batch with Merkle tree"""
if not self.current_batch:
return
# Add all records to Merkle tree
        for record in self.current_batch:
            # hex-encode bytes fields (e.g. the Ed25519 signature) so json.dumps doesn't fail
            record_bytes = json.dumps(asdict(record), sort_keys=True,
                                      default=lambda b: b.hex()).encode()
            self.merkle_tree.add_leaf(record_bytes)
# Build tree and get root
root = self.merkle_tree.build_tree()
# Sign root
root_signature = self.signer.sign_record({"merkle_root": root.hex()})
logger.info(f"Batch finalized: {len(self.current_batch)} records, root: {root.hex()[:16]}...")
# Store/publish root (implementation depends on anchoring strategy)
await self.publish_root(root, root_signature)
# Reset for next batch
self.current_batch = []
self.merkle_tree = MerkleTree()
async def publish_root(self, root: bytes, signature: bytes):
"""Publish Merkle root (stub - implement anchoring strategy)"""
# Options:
# 1. Store locally
# 2. Publish to blockchain
# 3. Send to transparency log
# 4. Distribute to third-party verifiers
pass
async def run(self):
"""Start WebSocket server"""
logger.info(f"Starting VCP Sidecar on {self.host}:{self.port}")
async with websockets.serve(self.handle_connection, self.host, self.port):
await asyncio.Future() # Run forever
if __name__ == "__main__":
sidecar = VCPSidecar()
asyncio.run(sidecar.run())
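The publish_root stub above lists four anchoring options. As a sketch of the simplest one (local storage), an implementation might look like the following; the directory layout and file naming are illustrative, not part of the spec.

import json
import time
from pathlib import Path

async def publish_root_locally(root: bytes, signature: bytes,
                               out_dir: str = "vcp_roots") -> Path:
    """Anchoring option 1 (sketch): persist the signed Merkle root locally."""
    Path(out_dir).mkdir(exist_ok=True)
    entry = {
        "merkle_root": root.hex(),
        "signature": signature.hex(),
        "published_at": time.time(),
    }
    path = Path(out_dir) / f"root_{int(entry['published_at'] * 1000)}.json"
    path.write_text(json.dumps(entry, sort_keys=True))
    return path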
Three-Tier Compliance Model
Different trading environments have different requirements. VCP addresses this with a three-tier model:
PLATINUM TIER
- Target: HFT firms, exchanges
- Time sync: PTP (IEEE 1588)
- Precision: nanosecond
- Latency budget: <100 µs
- MT4/MT5: not applicable (different platforms)

GOLD TIER
- Target: institutional brokers, prime brokers
- Time sync: NTP with multiple sources
- Precision: microsecond (±1 µs)
- Latency budget: <1 ms
- MT4/MT5: partial support (server-side)

SILVER TIER
- Target: retail traders, prop firms
- Time sync: best-effort (system clock)
- Precision: millisecond (±1 ms)
- Latency budget: <10 ms
- MT4/MT5: full support ✓
Why Silver Tier Matters
The Silver Tier innovation is what makes VCP practical for MT4/MT5:
# Silver Tier tolerances
SILVER_TIER_CONFIG = {
    "timestamp_tolerance_ms": 1000,  # ±1 second acceptable
"hash_algorithm": "SHA-256", # Same as higher tiers
"signature_algorithm": "Ed25519", # Same as higher tiers
"merkle_batch_size": 100, # Flexible batching
"sync_requirement": "best_effort", # No PTP/NTP mandate
}
# This means:
# - Works on standard VPS without specialized hardware
# - No network infrastructure changes required
# - Same cryptographic guarantees as higher tiers
# - Only timing precision is relaxed
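As a sketch of what the relaxed tolerance means on the verification side (the helper and field names are illustrative, not part of the spec):

SILVER_MAX_SKEW_MS = 1000  # ±1 second, per SILVER_TIER_CONFIG above

def within_silver_tolerance(record_ts: float, reference_ts: float) -> bool:
    """Accept a record whose timestamp is within the Silver Tier skew budget."""
    return abs(record_ts - reference_ts) * 1000.0 <= SILVER_MAX_SKEW_MS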
Performance Benchmarks
From our ABLENET production validation:
Test environment:
- VPS: ABLENET Win2 (2 vCPU, 2 GB RAM)
- OS: Windows Server 2019
- MT5: Build 3914
- Sidecar: Python 3.11 + uvloop

Results (1,000 trade events):

Operation                    | Avg Latency | P99 Latency
Event capture (MQL → WS)     | 0.3 ms      | 0.8 ms
Hash computation (SHA-256)   | 0.02 ms     | 0.05 ms
Chain linking                | 0.01 ms     | 0.03 ms
Signature (Ed25519)          | 0.05 ms     | 0.12 ms
Merkle tree (100 leaves)     | 0.4 ms      | 0.9 ms
TOTAL per event              | 0.78 ms     | 1.9 ms

Throughput: ~1,200 events/second sustained
Memory overhead: ~50 MB for the sidecar process
Disk write: ~200 bytes per event (compressed)
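The per-primitive numbers are easy to sanity-check on your own VPS; a minimal harness (illustrative, not the exact code used for the table above) looks like this:

import time
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def bench_ms(fn, n=10_000):
    """Average milliseconds per call over n iterations."""
    start = time.perf_counter()
    for _ in range(n):
        fn()
    return (time.perf_counter() - start) / n * 1000

key = Ed25519PrivateKey.generate()
msg = b'{"order": 12345678, "symbol": "EURUSD", "price": 1.08765}'
print(f"SHA-256 hash: {bench_ms(lambda: hashlib.sha256(msg).hexdigest()):.4f} ms/op")
print(f"Ed25519 sign: {bench_ms(lambda: key.sign(msg)):.4f} ms/op")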
Optimization Strategies
# 1. Batch Merkle tree construction
# Instead of per-event trees, batch for efficiency
import asyncio

class OptimizedMerkleBuilder:
    def __init__(self, batch_size: int = 100):
        self.batch_size = batch_size
        self.pending = []

    async def add_and_maybe_finalize(self, record):
        self.pending.append(record)
        if len(self.pending) >= self.batch_size:
            # build_tree_async (not shown) offloads MerkleTree.build_tree to an executor
            root = await self.build_tree_async()
            self.pending = []
            return root
        return None
# 2. Async signature generation
# Ed25519 is fast, but async prevents blocking
async def sign_batch_async(records: list, signer: VCPSigner):
loop = asyncio.get_event_loop()
signatures = await asyncio.gather(*[
loop.run_in_executor(None, signer.sign_record, r)
for r in records
])
return signatures
# 3. Connection pooling for WebSocket
# Reuse connections, don't create per-event
class ConnectionPool:
def __init__(self, max_connections: int = 10):
self.pool = asyncio.Queue(maxsize=max_connections)
async def get_connection(self):
return await self.pool.get()
async def return_connection(self, conn):
await self.pool.put(conn)
Regulatory Alignment
VCP was designed with regulatory compliance in mind from day one.
EU AI Act (Article 12 & 13)
# VCP directly addresses EU AI Act requirements
EU_AI_ACT_MAPPING = {
"Article 12.1": {
"requirement": "Automatic logging capabilities",
"vcp_implementation": "VCP-TRADE module captures all trade events automatically"
},
"Article 12.2": {
"requirement": "Traceability of AI system operation",
"vcp_implementation": "Hash chain provides complete event sequence"
},
"Article 12.3": {
"requirement": "Logging throughout system lifetime",
"vcp_implementation": "Immutable records with cryptographic timestamps"
},
"Article 13.1": {
"requirement": "Transparency for users",
"vcp_implementation": "Merkle proofs enable independent verification"
}
}
# Penalty for non-compliance: €35M or 7% of global revenue
# Full enforcement: August 2, 2026
MiFID II RTS 25
# "Operators shall be able to EVIDENCE that systems meet requirements"
# Key word: EVIDENCE, not just "claim" or "assert"
MIFID_II_MAPPING = {
    "Article 3": {
        "requirement": "Synchronization accuracy",
        "vcp_silver_tier": "±1 ms (exceeds retail requirements)"
},
"Article 4": {
"requirement": "Record keeping",
"vcp_implementation": "Immutable hash chain with Merkle verification"
},
"Article 5": {
"requirement": "Audit trail capability",
"vcp_implementation": "Third-party verifiable proofs"
}
}
SEC Rule 17a-4
# Audit trail alternative requirements
SEC_17A4_MAPPING = {
"Modification tracking": {
"vcp_implementation": "Hash chain - O(n) tamper detection",
"description": "Any modification invalidates all subsequent hashes"
    },
    "Timestamp accuracy": {
        "silver_tier": "±1 ms",
        "gold_tier": "±1 µs"
},
"User identification": {
"vcp_fields": ["OperatorID", "LastApprovalBy", "SignerPublicKey"]
}
}
ABLENET Partnership: Why It Matters
This isn't a theoretical project; it's production-validated infrastructure.
ABLENET Credentials
ABLENET (K&K Corporation)
- Established: February 1998 (25+ years of operation)
- Uptime: 99.99%+ SLA
- Data centers: Osaka, Japan (Tier 1/2 class)
- Bandwidth: 220+ Gbps backbone
- Carriers: NTT Communications, OPTAGE (redundant)
- Specialization: Japan's leading MT4/MT5 VPS provider
- 2022 investment: MSD Enterprise Investment Group
  - Mitsui & Co. (Japan's largest trading company)
  - Sumitomo Mitsui Banking Corporation (Tier 1 bank)
  - Development Bank of Japan (government-backed)

Note: MSD invested in ABLENET, not in VCP/VSO directly. This provides partner credibility, not protocol endorsement.
What This Validates
- Production Feasibility: VCP works in real trading environments
- Performance: Sub-millisecond overhead is achievable
- Integration: Sidecar architecture is non-intrusive
- Scale: Infrastructure can handle thousands of concurrent traders
What's Next
Roadmap
2025 Q4: PoC launch with ABLENET ← WE ARE HERE
2026 Q1: Production deployment (initial customers)
2026 Q2: VC-Certified program launch
- Silver Tier certification
- Gold Tier certification
- Platinum Tier certification
2026 H2: Global expansion
- Additional VPS providers
- Broker integrations
2027+: Post-quantum migration
- Dilithium signatures (NIST PQC)
- Hybrid classical/PQ mode
How to Get Involved
For Developers:
- Star & fork: github.com/veritaschain
- Review IETF draft: draft-kamimura-scitt-vcp
- Submit PRs for SDK improvements
For Brokers/VPS Providers:
- Contact: [enterprise@veritaschain.o