DTL Developer Documentation
Complete guide to building with Domain Transport Language. Learn parsing, validation, signing, and Web3 integration. v1.0
Introduction
DTL (Domain Transport Language) is an ultra-compact, cryptographically-secure data interchange format designed for modern applications requiring data integrity, confidentiality, and provenance. DTL enables domain-specific data transport with built-in security mechanisms and blockchain integration capabilities.
What You'll Learn
- Parse and validate DTL files
- Create DTL files programmatically
- Sign files with cryptographic keys
- Integrate with Web3 and blockchains
- Work with different domains (Healthcare, Finance, IoT, etc.)
- Implement field-level and row-level security
- Handle errors and edge cases
- Optimize performance for large datasets
Key Features
🔐 Security-First
Built-in cryptographic signing, field-level encryption, and row hashing. Support for ECDSA, Blake3, and SHA-256.
⛓️ Blockchain Ready
Native integration with multiple blockchains. File-level hashing and signature verification with wallet addresses.
📦 Ultra-Compact
Compact, text-based encoding that remains human-readable. Reduces file size by 60-80% compared to JSON.
🏥 Domain-Aware
Built-in support for healthcare, fintech, IoT, and enterprise domains with pre-defined schemas.
Installation
Python SDK (Recommended)
pip install dtl-parser

# For development extras (quotes avoid shell globbing of the brackets)
pip install "dtl-parser[dev]"

# Verify installation
python -c "import dtl; print(dtl.__version__)"
TypeScript/Node.js
npm install @dtlaz/sdk
# or
yarn add @dtlaz/sdk
# or
pnpm add @dtlaz/sdk

# TypeScript support included; Node type definitions are optional
npm install --save-dev @types/node
Go (High Performance)
# Enable Go modules first
go mod init your-module

go get github.com/dtlaz/go-sdk
go mod tidy
CLI Tool
curl -fsSL https://dtlaz.org/install.sh | sh

# Or download directly
wget https://releases.dtlaz.org/dtl-cli-latest-linux-x64
chmod +x dtl-cli-latest-linux-x64
sudo mv dtl-cli-latest-linux-x64 /usr/local/bin/dtl

# Verify installation
dtl --version
dtl --help
Quick Start
Python Example
from dtl import DTLParser, DTLBuilder

# Parse an existing DTL file
parser = DTLParser()
content = open('data.dtl').read()
result = parser.parse(content)

if result.valid:
    print("✓ Valid DTL file")
    print(f"  Domain: {result.header.domain}")
    print(f"  Profile: {result.header.profile}")
    print(f"  Tables: {len(result.tables)}")

    # Access data
    for table in result.tables:
        print(f"\n  Table: {table.name}")
        print(f"  Schema: {table.schema}")
        print(f"  Rows: {table.row_count}")

        # Iterate through rows
        for i, row in enumerate(table.rows[:3]):  # First 3 rows
            print(f"    [{i+1}] {row}")
else:
    print(f"✗ Parse error: {result.error}")
    print(f"  Line {result.error_line}")
    print(f"  Context: {result.error_context}")
TypeScript/Node.js Example
import { DTLParser, DTLBuilder } from '@dtlaz/sdk';
import * as fs from 'fs';
async function main() {
  // Parse DTL file
  const content = await fs.promises.readFile('data.dtl', 'utf-8');
  const parser = new DTLParser();
  const result = parser.parse(content);

  if (result.valid) {
    console.log('✓ Valid DTL file');
    console.log(`  Domain: ${result.header.domain}`);
    console.log(`  Profile: ${result.header.profile}`);
    console.log(`  Tables: ${result.tables.length}`);

    // Access header info
    console.log(`  Confidentiality: ${result.header.confidentiality}`);
    console.log(`  Security Level: ${result.header.security}`);
  } else {
    console.error(`✗ Parse error: ${result.error}`);
    console.error(`  Line: ${result.errorLine}`);
  }
}

main().catch(console.error);
Go Example
package main

import (
	"fmt"
	"log"
	"os"

	"github.com/dtlaz/go-sdk/dtl"
)

func main() {
	// Read DTL file
	content, err := os.ReadFile("data.dtl")
	if err != nil {
		log.Fatal(err)
	}

	// Parse DTL
	parser := dtl.NewParser()
	result, err := parser.Parse(string(content))
	if err != nil {
		fmt.Printf("✗ Error: %v\n", err)
		return
	}

	fmt.Printf("✓ Valid DTL file\n")
	fmt.Printf("  Domain: %s\n", result.Header.Domain)
	fmt.Printf("  Profile: %s\n", result.Header.Profile)
	fmt.Printf("  Tables: %d\n", len(result.Tables))

	// Iterate tables
	for _, table := range result.Tables {
		fmt.Printf("\n  Table: %s\n", table.Name)
		fmt.Printf("  Rows: %d\n", table.RowCount)
	}
}
Core Concepts
Header Structure
Every DTL file starts with a header block: a mandatory configuration line, optionally followed by a security/Web3 line:

@dtlv1.0^dtHC^pAsterERB1^c3^s1^w1^hb3
@sec^fh7d3a1c92f0aa^wa0xABCDEF1234^sg0x123456abc^chZC01
Header Line 1: Configuration (Mandatory)
| Component | Description | Example | Options |
|---|---|---|---|
| `@dtlv1.0` | DTL version identifier | 1.0 | 1.0, 1.1 (future) |
| `dtHC` | Domain code | Healthcare | dtHC, dtFN, dtIO, dtEN, dtED |
| `pAsterERB1` | Schema profile | Aster ER Bundle v1 | Custom profiles |
| `c3` | Default confidentiality level | C3 (Restricted) | C0 (Public), C1 (Internal), C2 (Confidential), C3 (Restricted) |
| `s1` | Default security mechanism | S1 (Row hashing) | S0 (None), S1 (Row hash), S2 (Field encrypt) |
| `w1` | Web3/Blockchain support | W1 (File signature) | W0 (None), W1 (File sig), W2 (Full chain) |
| `hb3` | Hash algorithm | Blake3-256 | hb3, hs2 (SHA-256), hs5 (SHA-512) |
Header Line 2: Security & Web3 (Optional)
| Component | Format | Description |
|---|---|---|
| `@sec` | Literal | Security header marker |
| `fh...` | 64 hex chars | File-level hash (Blake3-256 or SHA-256; both produce a 64-character hex digest) |
| `wa0x...` | 0x + 40 hex chars | Ethereum-style wallet address (signer) |
| `sg0x...` | 0x + 128+ hex chars | ECDSA digital signature |
| `chZC01` | 2 letters + number | Blockchain chain ID (ZC01 = ZettaChain; also ETH, POLY, BSC) |
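To make the header layout concrete, here is a minimal sketch of splitting configuration line 1 into its caret-separated components. This is an illustrative, hypothetical helper written for this document, not the official SDK parser; field order follows the table above.

```python
def parse_header_line1(line: str) -> dict:
    """Split '@dtlv1.0^dtHC^...' into named header components."""
    if not line.startswith('@dtlv'):
        raise ValueError("header line 1 must start with @dtlv")
    parts = line.split('^')
    return {
        'version': parts[0][len('@dtlv'):],   # '@dtlv1.0' -> '1.0'
        'domain': parts[1],                   # e.g. 'dtHC'
        'profile': parts[2],                  # e.g. 'pAsterERB1'
        'confidentiality': parts[3].upper(),  # 'c3' -> 'C3'
        'security': parts[4].upper(),         # 's1' -> 'S1'
        'web3': parts[5].upper(),             # 'w1' -> 'W1'
        'hash_algorithm': parts[6],           # 'hb3'
    }

header = parse_header_line1('@dtlv1.0^dtHC^pAsterERB1^c3^s1^w1^hb3')
```

The real parser also validates each component against the allowed options listed above; this sketch only shows the positional layout.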
Domain Codes
| Code | Domain | Use Case | Default Profile |
|---|---|---|---|
| `dtHC` | Healthcare | Medical records, EHR, diagnostics | pAsterERB1 (Aster ER Bundle) |
| `dtFN` | FinTech | Transactions, compliance, audit logs | pAsterFIN1 |
| `dtIO` | IoT/Sensors | Telemetry, time-series, device data | pAsterIOT1 |
| `dtEN` | Enterprise | General business data, HR, operations | pAsterENT1 |
| `dtED` | Education | Student records, grades, credentials | pAsterEDU1 |
Table Format
Each table has a header line that declares its schema, row count, and security settings inline:
PATIENT|id:s,name:s,gender:e(M,F,O),dob:D,vitals:a(f)|128|S1|W0|C3
Table Header Anatomy
TableName | field:type,field:type,... | RowCount | SecurityLevel | Web3Level | Confidentiality
| Position | Component | Type | Example |
|---|---|---|---|
| 1 | Table name (alphanumeric, _) | Identifier | PATIENT, USER_TRANSACTIONS |
| 2 | Schema definition | Field list | id:s,name:s,age:i |
| 3 | Row count | Integer | 128, 50000 |
| 4 | Security level | S0-S2 | S1 |
| 5 | Web3 level | W0-W2 | W1 |
| 6 | Confidentiality | C0-C3 | C3 |
Type System
| Type Code | Data Type | Size | Example | Validation |
|---|---|---|---|---|
| `s` | String (UTF-8) | Variable | Ahmed Ashraf | Any printable character |
| `i` | Integer | 4-8 bytes | 42, -1000 | Must be valid integer |
| `f` | Float/Decimal | 8 bytes | 36.9, 98.6 | IEEE 754 float |
| `b` | Boolean | 1 byte | 1 or 0 | 0 (false) or 1 (true) |
| `D` | Date | 10 bytes | 2025-11-18 | YYYY-MM-DD format |
| `T` | DateTime | 20+ bytes | 2025-11-18T14:30:00Z | ISO 8601 format |
| `a(type)` | Array of type | Variable | a(f): 1.2,3.4,5.6 | Comma-separated values |
| `e(v1,v2,...)` | Enumeration | Variable | e(M,F,O): M | Must match defined values |
| `j` | JSON | Variable | {"key": "value"} | Valid JSON object |
| `x` | Hex binary | Variable | a1b2c3d4ef | Valid hex string |
Security Levels
| Level | Description | Mechanism | Use Case |
|---|---|---|---|
| `S0` | No security | None | Public data, test environments |
| `S1` | Row-level hashing | Blake3 hash per row | Integrity verification |
| `S2` | Field-level encryption | AES-256-GCM | Sensitive PII, medical data |
Confidentiality Levels
| Level | Label | Access | Examples |
|---|---|---|---|
| `C0` | Public | Anyone | Marketing content, public APIs |
| `C1` | Internal | Organization members | Internal documentation, policies |
| `C2` | Confidential | Authorized users only | Financial reports, contracts |
| `C3` | Restricted | Specific individuals | Medical records, biometric data |
Parsing DTL Files
Basic Parsing
from dtl import DTLParser, ParserResult

parser = DTLParser()
result: ParserResult = parser.parse(dtl_content)

# Check validity
if not result.valid:
    print(f"Error at line {result.error_line}: {result.error}")
    print(f"Context: {result.error_context}")
    exit(1)

# Access header information
print(f"Domain: {result.header.domain}")
print(f"Profile: {result.header.profile}")
print(f"Confidentiality: {result.header.confidentiality}")
print(f"Security: {result.header.security}")

# Iterate through tables
for table in result.tables:
    print(f"\nTable: {table.name}")
    print(f"  Fields: {len(table.schema)} fields")
    print(f"  Rows: {table.row_count}")
    print(f"  Schema: {table.schema}")

    # Access individual rows
    for i, row in enumerate(table.rows):
        print(f"  Row {i}: {row}")

        # Access row data as dict
        row_dict = table.get_row_dict(i)
        print(f"  Parsed: {row_dict}")
Error Handling
from dtl import DTLParser, DTLParseError

parser = DTLParser()

try:
    result = parser.parse(dtl_content)

    if not result.valid:
        # Validation failed
        raise DTLParseError(
            f"Parse error at line {result.error_line}: {result.error}"
        )

    # Proceed with processing
    for table in result.tables:
        process_table(table)

except DTLParseError as e:
    print(f"Failed to parse DTL: {e}")
    # Handle parse error
except Exception as e:
    print(f"Unexpected error: {e}")
    # Handle other errors
Advanced Parsing Options
from dtl import DTLParser, ParsingOptions

# Create parser with options
options = ParsingOptions(
    strict_mode=True,          # Fail on any deviation
    validate_signatures=True,  # Verify blockchain sigs
    decrypt_fields=True,       # Decrypt S2 fields (needs key)
    include_raw_rows=False     # Don't keep raw text
)

parser = DTLParser(options=options)
result = parser.parse(dtl_content)

if result.valid:
    # All validations passed
    print("✓ Fully validated DTL file")
else:
    print(f"✗ Validation failed: {result.error}")
Streaming Large Files
from dtl import DTLStreamParser

# For files > 100MB
parser = DTLStreamParser()

with open('large-file.dtl', 'rb') as f:
    # Parse in chunks
    for table in parser.parse_stream(f):
        print(f"Processing table: {table.name}")

        # Process rows as they come in
        for row in table.rows:
            process_row(row)
Creating DTL Files
Basic File Creation
from dtl import DTLBuilder, SecurityLevel, ConfidentialityLevel

# Create builder
builder = DTLBuilder(
    domain='dtHC',
    profile='pAsterERB1',
    confidentiality=ConfidentialityLevel.C3,
    security=SecurityLevel.S1
)

# Add table
builder.add_table('PATIENT', [
    ('id', 'i'),
    ('name', 's'),
    ('gender', 'e(M,F,O)'),
    ('dob', 'D'),
    ('blood_type', 'e(A,B,AB,O)')
])

# Add rows
builder.add_row('PATIENT', [1, 'Ahmed Ashraf', 'M', '1990-05-15', 'O'])
builder.add_row('PATIENT', [2, 'Fatima Khan', 'F', '1992-08-22', 'AB'])
builder.add_row('PATIENT', [3, 'Hassan Ali', 'M', '1988-03-10', 'B'])

# Generate DTL content
dtl_content = builder.build()

# Save to file
with open('patients.dtl', 'w') as f:
    f.write(dtl_content)

print("✓ DTL file created: patients.dtl")
Adding Security and Signatures
from dtl import DTLBuilder, SecurityLevel
from dtl.security import DTLSigner

builder = DTLBuilder(
    domain='dtHC',
    profile='pAsterERB1',
    security=SecurityLevel.S1
)

# Add table with data
builder.add_table('PATIENT', [
    ('id', 'i'),
    ('name', 's')
])
builder.add_row('PATIENT', [1, 'John Doe'])

# Build and sign
dtl_content = builder.build()

# Sign the file
signer = DTLSigner(
    private_key='0x...',  # Your private key
    wallet_address='0xABCDEF1234...'
)
signed_content = signer.sign(dtl_content, chain_id='ETH')

# Save signed DTL
with open('patients-signed.dtl', 'w') as f:
    f.write(signed_content)

print("✓ Signed DTL file created")
Complex Schema with Arrays and Enums
from dtl import DTLBuilder

builder = DTLBuilder(domain='dtHC')

# Table with complex types
builder.add_table('PATIENT_VITALS', [
    ('patient_id', 'i'),
    ('measurements_date', 'D'),
    ('temperature', 'f'),
    ('blood_pressure', 's'),  # e.g., "120/80"
    ('symptoms', 'a(s)'),     # Array of symptoms
    ('readings', 'a(f)'),     # Array of floats
    ('status', 'e(OK,WARNING,CRITICAL)')
])

# Add row with array data
builder.add_row('PATIENT_VITALS', [
    1,
    '2025-11-18',
    36.9,
    '120/80',
    'cough,fever,fatigue',  # CSV for string arrays
    '98.6,99.1,98.8',       # CSV for float arrays
    'WARNING'
])

dtl_content = builder.build()
print(dtl_content)
Using Web3 Integration
from dtl import DTLBuilder, Web3Level
from dtl.web3 import DTLWeb3Manager

builder = DTLBuilder(
    domain='dtFN',
    web3=Web3Level.W2  # Full blockchain integration
)

builder.add_table('TRANSACTIONS', [
    ('tx_id', 's'),
    ('sender', 's'),
    ('amount', 'f'),
    ('status', 'e(PENDING,CONFIRMED,FAILED)')
])

builder.add_row('TRANSACTIONS', [
    'tx_001',
    '0xABCD...',
    150.50,
    'CONFIRMED'
])

dtl_content = builder.build()

# Register with blockchain
web3_manager = DTLWeb3Manager()
tx_hash = web3_manager.register_on_chain(
    dtl_content,
    chain='ETH',
    private_key='0x...'
)

print(f"✓ Registered on chain: {tx_hash}")
Validation
Schema Validation
from dtl import DTLParser, DTLValidator

validator = DTLValidator()

# Parse first
parser = DTLParser()
result = parser.parse(dtl_content)

# Validate against domain schema
validation = validator.validate(
    result,
    domain='dtHC',
    profile='pAsterERB1'
)

if validation.is_valid:
    print("✓ Schema is valid")
else:
    for error in validation.errors:
        print(f"✗ {error.field}: {error.message} (line {error.line})")
    for warning in validation.warnings:
        print(f"⚠ {warning.message}")
Field-Level Validation
from dtl import DTLValidator
import re

class CustomValidator(DTLValidator):
    def validate_email(self, value):
        pattern = r'^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$'
        return re.match(pattern, value) is not None

    def validate_phone(self, value):
        pattern = r'^\+?1?\d{9,15}$'
        return re.match(pattern, value) is not None

validator = CustomValidator()

# Add custom rules
validator.add_field_rule('email', 'EMAIL', validator.validate_email)
validator.add_field_rule('phone', 'PHONE', validator.validate_phone)

# Validate
validation = validator.validate(result)
print(f"Errors: {len(validation.errors)}")
print(f"Warnings: {len(validation.warnings)}")
Type Checking
from dtl import DTLValidator, DTLTypeError

validator = DTLValidator()

# Strict type checking
try:
    # Verify all data types
    for table in result.tables:
        for field in table.schema:
            field_type = field.type_code

            # Validate each row
            for row_idx, row in enumerate(table.rows):
                value = row[field.name]
                if not validator.validate_type(value, field_type):
                    raise DTLTypeError(
                        f"Type mismatch in {table.name}[{row_idx}]."
                        f"{field.name}: expected {field_type}, "
                        f"got {type(value)}"
                    )
    print("✓ All types are correct")
except DTLTypeError as e:
    print(f"✗ Type error: {e}")
Security & Signing
Cryptographic Signing
DTL supports ECDSA (Elliptic Curve Digital Signature Algorithm) for file-level signing, enabling verification of file authenticity and integrity.
Sign a DTL File
from dtl import DTLParser
from dtl.security import DTLSigner

# Parse DTL file
parser = DTLParser()
result = parser.parse(open('data.dtl').read())

# Create signer with private key
signer = DTLSigner(
    private_key='0x1234567890abcdef...',
    wallet_address='0xABCDEF1234567890...'
)

# Sign the file
signed_content = signer.sign(
    dtl_content=result.raw_content,
    chain_id='ETH',  # Ethereum
    hash_algorithm='blake3'
)

# Save signed DTL
with open('data-signed.dtl', 'w') as f:
    f.write(signed_content)

print(f"✓ File signed by {signer.wallet_address}")
Verify Signatures
from dtl import DTLParser
from dtl.security import DTLSignatureVerifier

# Parse signed DTL
parser = DTLParser()
result = parser.parse(open('data-signed.dtl').read())

# Create verifier
verifier = DTLSignatureVerifier()

# Verify signature
verification = verifier.verify(
    dtl_content=result.raw_content,
    signature=result.header.signature,
    public_key=result.header.wallet_address,
    chain_id=result.header.chain_id
)

if verification.valid:
    print("✓ Signature valid")
    print(f"  Signer: {verification.signer}")
    print(f"  Chain: {verification.chain_id}")
    print(f"  Timestamp: {verification.signed_at}")
else:
    print(f"✗ Signature invalid: {verification.error}")
Field-Level Encryption (S2)
For highly sensitive data, use S2 security level with AES-256-GCM field-level encryption.
Encrypt Sensitive Fields
from dtl import DTLBuilder, SecurityLevel
from dtl.security import FieldEncryptor

# Create builder with S2 security
builder = DTLBuilder(
    domain='dtHC',
    security=SecurityLevel.S2  # Field-level encryption
)

builder.add_table('PATIENT', [
    ('id', 'i'),
    ('name', 's'),
    ('ssn', 's'),    # Will be encrypted
    ('phone', 's'),  # Will be encrypted
    ('email', 's')
])

builder.add_row('PATIENT', [
    1,
    'Ahmed Ashraf',
    '123-45-6789',
    '+971-50-123-4567',
    'ahmed@example.com'
])

# Build with encryption
dtl_content = builder.build()

# Encrypt sensitive fields
encryptor = FieldEncryptor(
    encryption_key='your-32-byte-key-here...',
    algorithm='AES-256-GCM'
)

encrypted_content = encryptor.encrypt_fields(
    dtl_content=dtl_content,
    fields=['ssn', 'phone']  # Encrypt these fields
)

print("✓ Sensitive fields encrypted")
Decrypt Fields
from dtl import DTLParser
from dtl.security import FieldEncryptor

# Decrypt fields with key
decryptor = FieldEncryptor(
    encryption_key='your-32-byte-key-here...',
    algorithm='AES-256-GCM'
)

decrypted_content = decryptor.decrypt_fields(
    dtl_content=encrypted_content,
    fields=['ssn', 'phone']
)

# Parse decrypted content
parser = DTLParser()
result = parser.parse(decrypted_content)

# Access decrypted values
for table in result.tables:
    for row in table.rows:
        print(f"SSN: {row['ssn']}, Phone: {row['phone']}")
Web3 Integration
Blockchain Registration
DTL files can be registered on blockchain for immutable audit trails. Supported chains: Ethereum, Polygon, BSC, ZettaChain.
Register DTL on Blockchain
from dtl import DTLParser
from dtl.web3 import DTLWeb3Manager

# Parse DTL file
parser = DTLParser()
result = parser.parse(open('data.dtl').read())

# Create Web3 manager
web3_manager = DTLWeb3Manager(
    rpc_url='https://eth-rpc.example.com',
    contract_address='0x1234567890...',
    private_key='0xabcd1234...'
)

# Register on Ethereum
tx_result = web3_manager.register_on_chain(
    dtl_content=result.raw_content,
    chain='ETH',
    metadata={
        'domain': 'dtHC',
        'organization': 'Aster Hospitals',
        'timestamp': '2025-11-18T14:30:00Z'
    }
)

print("✓ Registered on chain")
print(f"  Transaction: {tx_result.tx_hash}")
print(f"  Block: {tx_result.block_number}")
print(f"  URL: https://etherscan.io/tx/{tx_result.tx_hash}")
Verify Blockchain Registration
from dtl.web3 import DTLWeb3Verifier

verifier = DTLWeb3Verifier(
    rpc_url='https://eth-rpc.example.com',
    contract_address='0x1234567890...'
)

# Verify registration
verification = verifier.verify_registration(
    dtl_hash='0x7d3a1c92f0aa...',
    chain='ETH'
)

if verification.registered:
    print("✓ DTL file registered on blockchain")
    print(f"  Block: {verification.block_number}")
    print(f"  Timestamp: {verification.timestamp}")
    print(f"  Registered by: {verification.registrar}")
else:
    print("✗ DTL file not found on blockchain")
SDK Reference
Python SDK v1.0.0
DTLParser
Main parser for DTL files. Handles syntax validation, structure parsing, and optional signature/encryption verification.
| Method | Description | Returns |
|---|---|---|
| `parse(content: str)` | Parse DTL file content | ParserResult |
| `parse_file(path: str)` | Parse DTL file from disk | ParserResult |
| `parse_stream(file_obj)` | Stream parse large files | Iterator[Table] |
| `validate_header(content: str)` | Validate DTL headers only | HeaderValidation |
result.valid: bool          # Parsing succeeded
result.error: str           # Error message (if any)
result.error_line: int      # Line number of error
result.error_context: str   # Context around error
result.header: Header       # Parsed header
result.tables: List[Table]  # Parsed tables
result.raw_content: str     # Original content
DTLBuilder
Build DTL files programmatically with a fluent API.
| Method | Parameters | Returns |
|---|---|---|
| `add_table()` | name: str, schema: List[Tuple] | DTLBuilder |
| `add_row()` | table_name: str, values: List | DTLBuilder |
| `add_rows()` | table_name: str, rows: List[List] | DTLBuilder |
| `build()` | (none) | str |
| `build_to_file()` | path: str | str (path) |
builder \
    .add_table('USERS', [('id', 'i'), ('name', 's')]) \
    .add_row('USERS', [1, 'John']) \
    .add_row('USERS', [2, 'Jane']) \
    .build()
TypeScript SDK v1.0.0
DTLParser
import { DTLParser, ParserResult } from '@dtlaz/sdk';

const parser = new DTLParser();
const result: ParserResult = parser.parse(content);

// Type-safe access
if (result.valid) {
  const domain = result.header.domain;  // string
  const tables = result.tables;         // Table[]

  for (const table of tables) {
    console.log(`${table.name}: ${table.rowCount} rows`);
    for (const row of table.rows) {
      console.log(row);  // Row object
    }
  }
}
DTLBuilder (TypeScript)
import { DTLBuilder } from '@dtlaz/sdk';

const builder = new DTLBuilder({
  domain: 'dtHC',
  profile: 'pAsterERB1',
  confidentiality: 'C3',
  security: 'S1'
});

// Fluent API
await builder
  .addTable('PATIENT', [
    { name: 'id', type: 'i' },
    { name: 'name', type: 's' },
    { name: 'dob', type: 'D' }
  ])
  .addRow('PATIENT', [1, 'Ahmed Ashraf', '1990-05-15'])
  .addRow('PATIENT', [2, 'Fatima Khan', '1992-08-22'])
  .buildToFile('patients.dtl');

console.log('✓ DTL file created');

Go SDK (High Performance)
Parser Usage
package main

import (
	"fmt"
	"log"
	"os"

	dtl "github.com/dtlaz/go-sdk"
)

func main() {
	// Read DTL file
	content, err := os.ReadFile("data.dtl")
	if err != nil {
		log.Fatal(err)
	}

	// Parse DTL
	parser := dtl.NewParser()
	result, err := parser.Parse(string(content))
	if err != nil {
		fmt.Printf("✗ Error: %v\n", err)
		return
	}

	fmt.Printf("✓ Valid DTL file\n")
	fmt.Printf("  Domain: %s\n", result.Header.Domain)
	fmt.Printf("  Tables: %d\n", len(result.Tables))

	// Iterate tables
	for _, table := range result.Tables {
		fmt.Printf("\n  Table: %s\n", table.Name)
		fmt.Printf("  Rows: %d\n", table.RowCount)

		// Process rows
		for i, row := range table.Rows {
			fmt.Printf("  [%d] %v\n", i+1, row)
		}
	}
}

API Reference
Parser API
Core parsing functionality for reading DTL files.
parse(content: string) → ParserResult
| Property | Type | Description |
|---|---|---|
| valid | boolean | True if parsing succeeded |
| error | string | Error message if parsing failed |
| error_line | number | Line number where error occurred |
| header | Header | Parsed DTL header object |
| tables | Table[] | Array of parsed tables |
Builder API
Programmatic construction of DTL files.
addTable(name: string, schema: Field[]) → Builder
Add a new table definition to the DTL file.
addRow(tableName: string, values: any[]) → Builder
Add a single row to a table. Values must match schema types.
build() → string
Generate complete DTL file content.
Validator API
validate(result: ParserResult) → ValidationResult
Validate parsed DTL against domain schema and type rules.
is_valid: bool           # All checks passed
errors: List[Error]      # Validation errors
warnings: List[Warning]  # Non-critical issues
Signer API
sign(content: string, options: SignOptions) → string
Sign DTL file with ECDSA private key.
| Parameter | Type | Description |
|---|---|---|
| content | string | DTL file content |
| private_key | string | ECDSA private key (hex or PEM) |
| wallet_address | string | Ethereum address for signing |
| chain_id | string | Blockchain ID (ETH, POLY, ZC01) |
Examples
Healthcare: Patient Records
from dtl import DTLBuilder, SecurityLevel, ConfidentialityLevel

# Create healthcare DTL
builder = DTLBuilder(
    domain='dtHC',
    profile='pAsterERB1',
    security=SecurityLevel.S2,  # Field encryption for sensitive data
    confidentiality=ConfidentialityLevel.C3
)

# Create PATIENT table
builder.add_table('PATIENT', [
    ('patient_id', 'i'),
    ('full_name', 's'),
    ('dob', 'D'),
    ('gender', 'e(M,F,O)'),
    ('blood_type', 'e(A,B,AB,O)'),
    ('allergies', 'a(s)'),
    ('admission_date', 'T')
])

# Add patient records
builder.add_row('PATIENT', [
    1001,
    'Ahmed Hassan Al-Maktoum',
    '1975-03-15',
    'M',
    'O',
    'Penicillin,Peanuts',
    '2025-11-15T09:30:00Z'
])

builder.add_row('PATIENT', [
    1002,
    'Fatima Mohammed Al-Dhaheri',
    '1982-07-22',
    'F',
    'AB',
    'None',
    '2025-11-16T14:45:00Z'
])

# Create VITAL_SIGNS table
builder.add_table('VITAL_SIGNS', [
    ('patient_id', 'i'),
    ('measurement_time', 'T'),
    ('temperature_c', 'f'),
    ('systolic_bp', 'i'),
    ('diastolic_bp', 'i'),
    ('heart_rate', 'i'),
    ('status', 'e(NORMAL,WARNING,CRITICAL)')
])

builder.add_row('VITAL_SIGNS', [
    1001,
    '2025-11-18T10:00:00Z',
    36.8,
    120,
    80,
    72,
    'NORMAL'
])

# Build and sign
dtl_content = builder.build()

# Save
with open('patient-records.dtl', 'w') as f:
    f.write(dtl_content)

print("✓ Healthcare DTL created with encryption")

FinTech: Transaction Records
from dtl import DTLBuilder, SecurityLevel, ConfidentialityLevel

# Create FinTech DTL for transactions
builder = DTLBuilder(
    domain='dtFN',
    profile='pAsterFIN1',
    security=SecurityLevel.S2,
    confidentiality=ConfidentialityLevel.C2
)

# Transaction table
builder.add_table('TRANSACTIONS', [
    ('transaction_id', 's'),
    ('sender_account', 's'),
    ('recipient_account', 's'),
    ('amount', 'f'),
    ('currency', 'e(AED,USD,EUR,GBP)'),
    ('transaction_date', 'T'),
    ('status', 'e(PENDING,COMPLETED,FAILED,REVERSED)'),
    ('notes', 's')
])

builder.add_row('TRANSACTIONS', [
    'TXN20251118001',
    'ACC_AE_12345',
    'ACC_AE_67890',
    50000.00,
    'AED',
    '2025-11-18T14:30:00Z',
    'COMPLETED',
    'Invoice #2025-001'
])

builder.add_row('TRANSACTIONS', [
    'TXN20251118002',
    'ACC_AE_98765',
    'ACC_US_11111',
    1500.50,
    'USD',
    '2025-11-18T15:45:00Z',
    'COMPLETED',
    'International transfer'
])

# Compliance audit log
builder.add_table('AUDIT_LOG', [
    ('log_id', 'i'),
    ('transaction_id', 's'),
    ('action', 's'),
    ('timestamp', 'T'),
    ('user_id', 's'),
    ('ip_address', 's')
])

builder.add_row('AUDIT_LOG', [
    1,
    'TXN20251118001',
    'INITIATED',
    '2025-11-18T14:25:00Z',
    'USR_A001',
    '192.168.1.100'
])

dtl_content = builder.build()
print("✓ FinTech DTL created with audit logging")

IoT: Sensor Telemetry
from dtl import DTLBuilder, SecurityLevel

# Create IoT DTL for sensor data
builder = DTLBuilder(
    domain='dtIO',
    profile='pAsterIOT1',
    security=SecurityLevel.S1
)

# Sensor metadata table
builder.add_table('SENSORS', [
    ('sensor_id', 's'),
    ('location', 's'),
    ('sensor_type', 'e(TEMPERATURE,HUMIDITY,PRESSURE,CO2)'),
    ('last_calibration', 'D'),
    ('status', 'e(ACTIVE,INACTIVE,MAINTENANCE)')
])

builder.add_row('SENSORS', [
    'SENSOR_DXB_001',
    'Deira Clinic - Room 101',
    'TEMPERATURE',
    '2025-11-01',
    'ACTIVE'
])

builder.add_row('SENSORS', [
    'SENSOR_DXB_002',
    'Deira Clinic - Waiting Area',
    'HUMIDITY',
    '2025-11-05',
    'ACTIVE'
])

# Time-series telemetry
builder.add_table('TELEMETRY', [
    ('reading_id', 'i'),
    ('sensor_id', 's'),
    ('timestamp', 'T'),
    ('value', 'f'),
    ('unit', 's'),
    ('signal_strength', 'i'),
    ('battery_level', 'i')
])

# Add multiple readings
readings = [
    (1, 'SENSOR_DXB_001', '2025-11-18T10:00:00Z', 22.3, 'C', -75, 85),
    (2, 'SENSOR_DXB_001', '2025-11-18T10:15:00Z', 22.5, 'C', -74, 85),
    (3, 'SENSOR_DXB_001', '2025-11-18T10:30:00Z', 22.4, 'C', -76, 84),
    (4, 'SENSOR_DXB_002', '2025-11-18T10:00:00Z', 45.2, '%', -72, 90),
    (5, 'SENSOR_DXB_002', '2025-11-18T10:15:00Z', 45.8, '%', -73, 89),
]

for reading in readings:
    builder.add_row('TELEMETRY', list(reading))

dtl_content = builder.build()
print("✓ IoT telemetry DTL created")

Type Reference
Primitive Types
| Code | Type | Size | Range/Format | Example |
|---|---|---|---|---|
| `s` | String | Variable | UTF-8, max 65535 chars | Ahmed Ashraf |
| `i` | Integer | 4 bytes | -2,147,483,648 to 2,147,483,647 | 42, -1000 |
| `I` | Long Integer | 8 bytes | -9,223,372,036,854,775,808 to 9,223,372,036,854,775,807 | 9876543210 |
| `f` | Float | 8 bytes | IEEE 754 double precision | 36.9, 98.6 |
| `b` | Boolean | 1 byte | 0 or 1 | 1, 0 |
| `D` | Date | 10 chars | YYYY-MM-DD (ISO 8601) | 2025-11-18 |
| `T` | DateTime | 20+ chars | ISO 8601 with timezone | 2025-11-18T14:30:00Z |
| `j` | JSON | Variable | Valid JSON object or array | {"key": "value"} |
| `x` | Hex Binary | Variable | Hexadecimal string | a1b2c3d4 |
Complex Types
| Syntax | Description | Example |
|---|---|---|
| `a(type)` | Array of a primitive type | a(s), a(f), a(i) |
| `e(val1,val2,...)` | Enumeration (fixed values) | e(ACTIVE,INACTIVE), e(M,F,O) |
| `o(field:type,...)` | Nested object | o(id:i,name:s) |
Type Validation Rules
Strings (s)
UTF-8 encoded. Commas and pipes require escaping with backslash. Max length 65535 characters.
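The escaping rule above can be illustrated with a pair of small helpers. These are hypothetical functions written for this document, not the SDK API, and they assume backslash itself is also escaped by doubling (the rule text only mentions commas and pipes).

```python
def escape_dtl_string(value: str) -> str:
    """Escape backslashes, commas, and pipes for a DTL string field."""
    return value.replace('\\', '\\\\').replace(',', '\\,').replace('|', '\\|')

def unescape_dtl_string(value: str) -> str:
    """Reverse escape_dtl_string: a backslash quotes the next character."""
    out, i = [], 0
    while i < len(value):
        if value[i] == '\\' and i + 1 < len(value):
            out.append(value[i + 1])
            i += 2
        else:
            out.append(value[i])
            i += 1
    return ''.join(out)
```

A round trip through both helpers returns the original value, so strings containing the row separator `|` or array separator `,` survive serialization.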
Numbers (i, I, f)
No quotes required. Scientific notation supported for floats (e.g., 1.23e-4). Leading zeros allowed.
Arrays (a(type))
Comma-separated values. No spaces. Values must match array's base type. Empty arrays: empty string.
Enumerations (e(val1,val2,...))
Must be exact match. Case-sensitive. No extra values allowed.
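The validation rules above can be sketched as a single dispatch function. This is an illustrative stand-in for the SDK's type checker, covering only a few type codes; the real validator also enforces string length limits and array element types.

```python
import re
from datetime import datetime

def validate_value(value: str, type_code: str) -> bool:
    """Check a raw field value against a DTL type code (subset)."""
    if type_code == 's':
        return True  # any printable string
    if type_code == 'i':
        return re.fullmatch(r'-?\d+', value) is not None
    if type_code == 'f':
        try:
            float(value)  # accepts scientific notation like 1.23e-4
            return True
        except ValueError:
            return False
    if type_code == 'b':
        return value in ('0', '1')
    if type_code == 'D':
        try:
            datetime.strptime(value, '%Y-%m-%d')  # YYYY-MM-DD only
            return True
        except ValueError:
            return False
    if type_code.startswith('e(') and type_code.endswith(')'):
        allowed = type_code[2:-1].split(',')
        return value in allowed  # exact, case-sensitive match
    return False
```

Note how the enum branch implements the case-sensitive exact-match rule: `'m'` fails against `e(M,F,O)` even though `'M'` passes.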
Error Codes
Parse Errors
| Code | Message | Cause | Solution |
|---|---|---|---|
| `DTL001` | Invalid header format | First line doesn't start with @dtlv | Check header line 1 format |
| `DTL002` | Unknown domain code | Domain code not recognized | Use a valid domain: dtHC, dtFN, dtIO, dtEN, dtED |
| `DTL003` | Invalid table header | Table definition malformed | Check table header syntax |
| `DTL004` | Type mismatch | Value doesn't match field type | Convert value to correct type |
| `DTL005` | Row count mismatch | Actual rows != declared count | Update row count or add/remove rows |
| `DTL006` | Invalid enum value | Value not in enum list | Use a value from the enum definition |
| `DTL007` | Invalid date format | Date not YYYY-MM-DD | Format date as YYYY-MM-DD |
| `DTL008` | Signature verification failed | Digital signature invalid | File may be corrupted; verify signer |
| `DTL009` | Decryption failed | Wrong key or encrypted data corrupted | Provide correct decryption key |
| `DTL010` | Hash mismatch | Content hash doesn't match file hash | File integrity check failed |
Validation Errors
| Code | Message | Severity |
|---|---|---|
| `VAL001` | Schema not found for domain | Warning |
| `VAL002` | Unknown field in table | Error |
| `VAL003` | Missing required field | Error |
| `VAL004` | Field value out of range | Warning |
Frequently Asked Questions
General Questions
What is DTL?
DTL (Domain Transport Language) is a compact, cryptographically-secure data interchange format designed for modern applications. It combines the human-readability of CSV with the structure of JSON and adds native cryptographic capabilities.
Why use DTL instead of JSON/CSV?
- Compact: 60-80% smaller file size than JSON
- Secure: Built-in cryptographic signing and encryption
- Domain-aware: Optimized schemas for healthcare, fintech, IoT
- Blockchain-ready: Native Web3 integration
- Type-safe: Strict type checking with early validation
Is DTL human-readable?
Yes! Unlike binary formats, DTL uses human-readable headers and structure. You can open a .dtl file in any text editor and understand its contents.
Technical Questions
What security algorithms does DTL use?
DTL supports ECDSA for signing, AES-256-GCM for field encryption, and Blake3/SHA-256 for hashing. All use industry-standard implementations.
Can I use DTL with my existing database?
Yes! DTL can import/export data from most databases. SDKs provide tools to convert between DTL and SQL, MongoDB, CSV, and other formats.
What's the maximum file size?
Theoretically unlimited. For files larger than 100MB, use the streaming parser to process rows incrementally without loading the entire file into memory.
Is DTL suitable for real-time data?
Yes, especially IoT data. Use W0 (no Web3) for fastest performance. Stream parsing supports continuous data ingestion.
How do I migrate from JSON to DTL?
from dtl import DTLBuilder
import json

# Read JSON
with open('data.json') as f:
    data = json.load(f)

# Convert to DTL
builder = DTLBuilder(domain='dtEN')
builder.add_table('DATA', [('id', 'i'), ('value', 's')])

for item in data['items']:
    builder.add_row('DATA', [item['id'], item['value']])

# Save DTL
dtl_content = builder.build()
with open('data.dtl', 'w') as f:
    f.write(dtl_content)
Can DTL files be versioned?
Yes! DTL supports multiple versions via profile parameter. Current version is v1.0. Future versions will maintain backward compatibility.
How do I ensure confidentiality of my DTL files?
Use S2 security level with field-level encryption for sensitive data, and assign appropriate confidentiality level (C2 or C3). Store encryption keys securely in key management systems.
Performance & Optimization
How fast is DTL parsing?
Very fast! Go SDK parses ~50MB/sec. Python SDK ~10MB/sec. TypeScript ~5MB/sec. All SDKs provide streaming for larger files.
Should I compress DTL files?
Compression can reduce file size by additional 30-50%, but adds CPU overhead. Recommended for archival, not for frequent access.
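For archival, standard stream compression works on DTL text as-is. The sketch below uses Python's stdlib gzip on hand-written sample content that mimics the DTL layout (it is not produced by the SDK); repetitive telemetry rows compress well.

```python
import gzip

# Hand-written sample mimicking a small IoT DTL file (illustrative only)
dtl_text = (
    "@dtlv1.0^dtIO^pAsterIOT1^c1^s1^w0^hb3\n"
    "TELEMETRY|sensor_id:s,value:f|100|S1|W0|C1\n"
    + "\n".join(f"SENSOR_{i % 5}|{20.0 + i * 0.1:.1f}" for i in range(100))
)

compressed = gzip.compress(dtl_text.encode('utf-8'))
restored = gzip.decompress(compressed).decode('utf-8')

print(f"raw: {len(dtl_text)} bytes, gzipped: {len(compressed)} bytes")
```

Decompression is lossless, so signature and hash verification still work after a round trip as long as you verify against the decompressed bytes.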
How do I improve parsing performance?
- Use streaming parser for large files (>100MB)
- Disable signature verification if not needed
- Use Go SDK for maximum performance
- Process tables in parallel with async/threading
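The last tip, processing tables in parallel, can be sketched with the stdlib `concurrent.futures`. Plain dicts stand in for real SDK table objects here, and `process_table` is a hypothetical placeholder for your per-table work.

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in tables (the real objects come from parser.parse(...).tables)
tables = [
    {'name': 'PATIENT', 'rows': [[1, 'A'], [2, 'B']]},
    {'name': 'VITALS', 'rows': [[1, 36.8], [2, 37.1]]},
]

def process_table(table: dict) -> str:
    # Placeholder for real per-table work (validation, hashing, DB writes)
    return f"{table['name']}: {len(table['rows'])} rows"

with ThreadPoolExecutor(max_workers=4) as pool:
    # map preserves input order even though work runs concurrently
    results = list(pool.map(process_table, tables))
```

Threads suit I/O-bound work (network, disk); for CPU-bound per-row processing in Python, a `ProcessPoolExecutor` is usually the better fit.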
Integration & Compliance
Is DTL HIPAA-compliant?
DTL provides building blocks for HIPAA compliance (encryption, audit trails, access controls). Your implementation must also follow HIPAA procedures and policies.
Can DTL be used with APIs?
Yes! DTL-over-HTTP is supported. Transfer signed/encrypted DTL files via REST APIs. Include X-DTL-Hash and X-DTL-Signature headers.
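A minimal sketch of preparing those headers before an upload. The `X-DTL-Hash` and `X-DTL-Signature` names come from this section; the SHA-256 digest and the `Content-Type` value are assumptions for illustration (Blake3 is not in the Python standard library, and the real transport may differ).

```python
import hashlib

def dtl_http_headers(dtl_content: str, signature_hex: str) -> dict:
    """Build the DTL-over-HTTP headers for a request (illustrative sketch)."""
    digest = hashlib.sha256(dtl_content.encode('utf-8')).hexdigest()
    return {
        'Content-Type': 'application/octet-stream',  # assumed media type
        'X-DTL-Hash': digest,
        'X-DTL-Signature': signature_hex,
    }

headers = dtl_http_headers('@dtlv1.0^dtEN^pAsterENT1^c0^s0^w0^hs2', '0xabc...')
```

The receiving service can recompute the hash over the body and compare it to `X-DTL-Hash` before parsing, rejecting tampered or truncated uploads early.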
How do I integrate DTL with microservices?
from dtl import DTLParser
from fastapi import FastAPI, UploadFile

app = FastAPI()

@app.post("/upload-dtl")
async def upload_dtl(file: UploadFile):
    content = await file.read()
    parser = DTLParser()
    result = parser.parse(content.decode('utf-8'))

    if result.valid:
        # Process tables
        for table in result.tables:
            print(f"Processing {table.name}")
        return {"status": "success"}
    else:
        return {"status": "error", "message": result.error}

Support & Resources
Getting Help
- Documentation: dtlaz.org/docs
- GitHub Issues: github.com/dtlaz/dtl-sdk
- Email Support: support@dtlaz.org
- Community Forum: community.dtlaz.org
Additional Resources
SDK Status
| SDK | Version | Status | Latest Release |
|---|---|---|---|
| Python | 1.0.0 | Stable | Nov 18, 2025 |
| TypeScript | 1.0.0 | Stable | Nov 18, 2025 |
| Go | 1.0.0 | Stable | Nov 18, 2025 |
| CLI | 1.0.0 | Stable | Nov 18, 2025 |
Changelog
Version 1.0.0 (Nov 18, 2025)
- ✓ Initial stable release
- ✓ Python, TypeScript, Go SDKs
- ✓ Support for 5 domains (HC, FN, IO, EN, ED)
- ✓ Field-level encryption (S2)
- ✓ Blockchain integration (ETH, POLY, BSC, ZC01)
- ✓ ECDSA signing and verification
- ✓ Streaming parser for large files
- ✓ Full documentation and examples
Roadmap
- 🔄 Version 1.1: Advanced compression algorithms
- 🔄 Version 1.2: Multi-signature support
- 🔄 Version 2.0: Sharding for massive datasets
- 🔄 ZettaChain Protocol integration
- 🔄 Enterprise Dashboard & Analytics