Step 1: Install the SDK
🐍 Python
pip install dtl-parser
📘 TypeScript / JavaScript
npm install dtl-parser
💜 .NET
dotnet add package Dtlaz.Parser
💻 VS Code Extension
code --install-extension dtlaz.dtl-language
Step 2: Create Your First DTL File
Create a file named users.dtl:
@dtlv1.0^dtWEB^pMyApp^c0^s0^w0^hash
@sec^none^0x0^none^0

# Users table with role and status enums
USERS|id:s,name:s,email:s,role:e(admin,user,guest),status:e(active,inactive,pending),created:D|3|S0|W0|C0
U001|Alice Smith|alice@example.com|admin|active|2025-01-01
U002|Bob Johnson|bob@example.com|user|active|2025-01-02
U003|Carol White|carol@example.com|guest|pending|2025-01-03
Step 3: Parse and Validate
🐍 Python Example
from dtl_parser import DTLParser

# Parse the file
parser = DTLParser()
doc = parser.parse_file("users.dtl")

# Access tables
users_table = doc.get_table("USERS")
print(f"Found {len(users_table.rows)} users")

# Iterate rows
for row in users_table.rows:
    print(f"{row['name']} is a {row['role']}")

# Validate (checks enum values!)
errors = doc.validate()
if errors:
    for e in errors:
        print(f"Error: {e}")
else:
    print("✅ All valid!")
📘 TypeScript Example
import { parse, validate } from 'dtl-parser';
import { readFileSync } from 'fs';

// Parse the file
const content = readFileSync('users.dtl', 'utf-8');
const doc = parse(content);

// Access tables
const usersTable = doc.tables.find(t => t.name === 'USERS');
console.log(`Found ${usersTable?.rows.length} users`);

// Iterate rows
usersTable?.rows.forEach(row => {
  console.log(`${row.name} is a ${row.role}`);
});

// Validate (checks enum values!)
const errors = validate(doc);
if (errors.length === 0) {
  console.log('✅ All valid!');
}
💜 C# Example
using Dtlaz.Parser;

// Parse the file
var parser = new DtlParser();
var doc = parser.ParseFile("users.dtl");

// Access tables
var usersTable = doc.GetTable("USERS");
Console.WriteLine($"Found {usersTable?.Rows.Count} users");

// Iterate rows
foreach (var row in usersTable?.Rows ?? [])
{
    Console.WriteLine($"{row["name"]} is a {row["role"]}");
}

// Validate
var errors = doc.Validate();
if (!errors.Any())
    Console.WriteLine("✅ All valid!");
Step 4: Create DTL Programmatically
from dtl_parser import create_table

# Create a table with enum fields
orders = create_table("ORDERS", {
    "id": "s",
    "customer": "s",
    "amount": "f",
    "status": "e(pending,processing,shipped,delivered)",  # Enum!
    "priority": "e(low,medium,high)",                     # Another enum!
})

# Add rows
orders.add_row({
    "id": "ORD001",
    "customer": "Alice",
    "amount": 99.99,
    "status": "pending",   # Must be a valid enum value
    "priority": "high"     # Must be a valid enum value
})

# Convert to DTL string
dtl_output = orders.to_dtl()
print(dtl_output)
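Since to_dtl() returns a plain string, persisting the generated table is ordinary file I/O. A minimal sketch (the orders.dtl filename is just an example):

# Save the generated DTL to disk (filename is illustrative)
with open("orders.dtl", "w", encoding="utf-8") as f:
    f.write(dtl_output)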
Common Enum Patterns
Here are commonly used enum patterns across different domains:
# Workflow Status
status:e(draft,pending,approved,rejected,archived)

# Order Status
order_status:e(pending,confirmed,processing,shipped,delivered,cancelled)

# Priority
priority:e(low,medium,high,critical,urgent)

# User Roles
role:e(admin,moderator,user,guest,viewer)

# Payment Status
payment:e(pending,processing,completed,failed,refunded)

# HTTP Methods
method:e(GET,POST,PUT,PATCH,DELETE)

# Environment
env:e(development,staging,production,testing)

# Healthcare
gender:e(M,F,O)
blood_type:e(A+,A-,B+,B-,AB+,AB-,O+,O-)

# Currency
currency:e(USD,EUR,GBP,AED,JPY,CNY,INR)

# Rating
rating:e(1,2,3,4,5)

# Size
size:e(XS,S,M,L,XL,XXL)
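Enum validation is what makes these patterns useful: any value outside the declared set is reported by validate(). The sketch below reuses the parse/validate API from Step 3; the exact wording of the error messages depends on the SDK version.

from dtl_parser import DTLParser

# A tiny document whose second row uses an undeclared enum value
dtl_snippet = """
@dtlv1.0^dtWEB^pDemo^c0^s0^w0^hash
@sec^none^0x0^none^0

TICKETS|id:s,priority:e(low,medium,high)|2|S0|W0|C0
T001|high
T002|urgent
"""

parser = DTLParser()
doc = parser.parse(dtl_snippet)

# 'urgent' is not in e(low,medium,high), so validation should flag it
for error in doc.validate():
    print(f"Error: {error}")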
Step 5: Autofix & Magic Corrections
The SDK can automatically fix common errors like typos in enum values, wrong date formats, and row count mismatches:
🐍 Python Autofix
from dtl_parser import DTLParser

# Content with errors
dtl_with_errors = """
@dtlv1.0^dtWEB^pDemo^c0^s0^w0^hash
@sec^none^0x0^none^0

users|id:s,status:e(active,inactive,pending),date:D|2|S0|W0|C0
U001|actve|01/15/2025
U002|pendng|2025-01-20
U003|inactive|20250125
"""

parser = DTLParser()
doc = parser.parse(dtl_with_errors)

# Apply autofix - magic corrections!
fixed_doc, changes = doc.autofix()

# See what was fixed
for change in changes:
    print(f"✨ {change}")

# Output:
# ✨ Fixed row count in USERS: 2 → 3
# ✨ Fixed table name case: users → USERS
# ✨ Fixed USERS[0].status: 'actve' → 'active'
# ✨ Fixed USERS[0].date: '01/15/2025' → '2025-01-15'
# ✨ Fixed USERS[1].status: 'pendng' → 'pending'
# ✨ Fixed USERS[2].date: '20250125' → '2025-01-25'
📘 TypeScript Autofix
import { parse, autofix, validate } from 'dtl-parser';

// Parse content with errors
const doc = parse(dtlWithErrors);

// Check errors before
const errorsBefore = validate(doc);
console.log(`Errors before: ${errorsBefore.length}`);

// Apply autofix
const [fixedDoc, changes] = autofix(doc);
changes.forEach(c => console.log(`✨ ${c}`));

// Check errors after
const errorsAfter = validate(fixedDoc);
console.log(`Errors after: ${errorsAfter.length}`); // 0!
💜 C# Autofix
using Dtlaz.Parser;

var parser = new DtlParser();
var doc = parser.Parse(dtlWithErrors);

// Apply autofix
var (fixedDoc, changes) = doc.Autofix();
foreach (var change in changes)
    Console.WriteLine($"✨ {change}");

// Save the fixed file
File.WriteAllText("users_fixed.dtl", fixedDoc.ToDtl());
Step 6: Convert to JSON/CSV
🐍 Python Export
from dtl_parser import DTLParser
import json

parser = DTLParser()
doc = parser.parse_file("users.dtl")
users = doc.get_table("USERS")

# Export to JSON
json_data = users.to_json()
print(json.dumps(json_data, indent=2))

# Export to CSV
csv_data = users.to_csv()
with open("users.csv", "w") as f:
    f.write(csv_data)
📘 TypeScript Export
import { parse, tableToJson, tableToCsv } from 'dtl-parser';
import { writeFileSync } from 'fs';

const doc = parse(content);
const users = doc.tables.find(t => t.name === 'USERS');

// To JSON
const json = tableToJson(users!);
console.log(JSON.stringify(json, null, 2));

// To CSV
const csv = tableToCsv(users!);
writeFileSync('users.csv', csv);