Basic Patterns
Process JSON Data
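A minimal sketch of the basic pattern, assuming the `claude` CLI reads stdin and takes a prompt via `-p` (the file name, sample data, and prompt are illustrative):

```shell
# Stub the CLI with a pass-through when it isn't installed, so the
# pipeline shape can be tried anywhere (assumption: `claude` reads
# stdin and takes a prompt via -p).
if ! command -v claude >/dev/null 2>&1; then claude() { cat; }; fi

# Illustrative input
echo '{"users": [{"name": "Ada", "active": true}]}' > data.json

# Pipe JSON straight in and capture the answer
cat data.json | claude -p "List the active users, one per line" > active.txt
```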
Transform Data Format
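Format conversion follows the same shape: pipe in one format, prompt for another. A sketch under the same CLI assumptions, with illustrative data:

```shell
if ! command -v claude >/dev/null 2>&1; then claude() { cat; }; fi  # pass-through stub when the CLI is absent

printf 'name,role\nAda,engineer\n' > people.csv

# Re-emit the same data in another format
cat people.csv | claude -p "Convert this CSV to a JSON array of objects" > people.json
```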
Filter and Aggregate
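Filtering and aggregation can be expressed directly in the prompt; a hedged sketch with made-up order data:

```shell
if ! command -v claude >/dev/null 2>&1; then claude() { cat; }; fi  # pass-through stub when the CLI is absent

printf '{"amount": 12}\n{"amount": 90}\n{"amount": 65}\n' > orders.jsonl

cat orders.jsonl | claude -p "Keep orders over 50 and report their count and total" > report.txt
```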
Pipeline Patterns
Multi-Stage Processing
Chain multiple AI scripts together, piping each stage's output into the next.

Parallel Processing
Process multiple files concurrently to cut wall-clock time.

Real-World Use Cases
API Response Analysis
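One way to sketch this: analyze a captured response file rather than hitting a live endpoint. The response content and prompt are invented for illustration; the CLI assumptions are as elsewhere in this guide:

```shell
if ! command -v claude >/dev/null 2>&1; then claude() { cat; }; fi  # pass-through stub when the CLI is absent

# A captured response; in practice this might come from
# `curl -s https://api.example.com/health > response.json`
echo '{"status": 500, "error": "upstream timeout"}' > response.json

cat response.json | claude -p "Explain this error and suggest a likely cause" > diagnosis.txt
```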
Log Analysis
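Pre-filtering with standard tools keeps token usage down; only the interesting lines reach the model. A sketch with a fabricated log:

```shell
if ! command -v claude >/dev/null 2>&1; then claude() { cat; }; fi  # pass-through stub when the CLI is absent

printf '%s\n' '10:01 ERROR db timeout' '10:02 INFO ok' '10:03 ERROR db timeout' > app.log

# Pre-filter with grep so only the relevant lines spend tokens
grep ERROR app.log | claude -p "Group these errors by likely root cause" > errors.md
```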
Database Query Results
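Query output exported to CSV can be piped in the same way; the result set below is invented, and the `psql` command in the comment is one possible source:

```shell
if ! command -v claude >/dev/null 2>&1; then claude() { cat; }; fi  # pass-through stub when the CLI is absent

# A captured result set; in practice: psql --csv -c "SELECT ..." > result.csv
printf 'id,total\n1,120\n2,87\n' > result.csv

cat result.csv | claude -p "Which rows stand out, and why?" > insights.txt
```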
Git History Analysis
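`git log` is already pipe-friendly, so history analysis is a one-liner; a sketch under the same CLI assumptions:

```shell
if ! command -v claude >/dev/null 2>&1; then claude() { cat; }; fi  # pass-through stub when the CLI is absent

# Run inside any repository; outside one, git simply emits nothing
git log --oneline -n 20 2>/dev/null | claude -p "Summarize the themes of these commits" > history.md
```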
Data Transformation Recipes
JSON to Markdown Table
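A sketch of the recipe, with an invented team roster as input:

```shell
if ! command -v claude >/dev/null 2>&1; then claude() { cat; }; fi  # pass-through stub when the CLI is absent

echo '[{"name": "Ada", "role": "engineer"}, {"name": "Lin", "role": "designer"}]' > team.json

cat team.json | claude -p "Render this JSON as a Markdown table" > team.md
```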
CSV Cleanup
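Messy headers, stray whitespace, and inconsistent casing can all be fixed in one prompt; the dirty sample below is fabricated:

```shell
if ! command -v claude >/dev/null 2>&1; then claude() { cat; }; fi  # pass-through stub when the CLI is absent

printf 'Name , EMAIL\n Ada , ADA@EXAMPLE.COM \n' > raw.csv

cat raw.csv | claude -p "Normalize headers, trim whitespace, lowercase emails; output clean CSV only" > clean.csv
```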
Data Enrichment
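Enrichment asks the model to add fields the raw data lacks; a sketch with an invented domain list:

```shell
if ! command -v claude >/dev/null 2>&1; then claude() { cat; }; fi  # pass-through stub when the CLI is absent

printf 'acme.com\nexample.org\n' > domains.txt

# Add fields the raw data lacks
cat domains.txt | claude -p "For each domain output CSV: domain,likely_industry" > enriched.csv
```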
Anomaly Detection
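A sketch with a fabricated latency series containing one obvious outlier:

```shell
if ! command -v claude >/dev/null 2>&1; then claude() { cat; }; fi  # pass-through stub when the CLI is absent

printf '%s\n' '09:00 201ms' '09:01 198ms' '09:02 4870ms' > latency.txt

cat latency.txt | claude -p "Flag any anomalous data points and explain why" > anomalies.txt
```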
Advanced Patterns
Streaming Large Files
Process large files in chunks to stay within context limits.

Live Streaming with --live
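The `--live` flag comes from the heading above; that it emits output incrementally as input arrives is my assumption. A bounded stream stands in for an open-ended `tail -f` so the sketch terminates:

```shell
if ! command -v claude >/dev/null 2>&1; then claude() { cat; }; fi  # pass-through stub when the CLI is absent

# In practice the input would be open-ended, e.g. `tail -f app.log | ...`;
# a bounded stream is used here so the example terminates.
printf 'tick 1\ntick 2\n' | claude -p "Call out anything unusual as it arrives" --live > live.out
```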
Multi-Source Aggregation
Error Recovery
Data Processing in CI/CD
Daily Metrics Analysis
Process S3 Data
Cost Optimization
Choose the Right Model
| Task | Model | Why |
|---|---|---|
| Simple CSV transformations | --haiku | Fast, cheap |
| Log analysis, anomaly detection | --sonnet | Balanced reasoning |
| Complex data modeling | --opus | Deep analysis |
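For a mechanical transformation like the one below (sample data invented), the cheapest model in the table is enough; the flags are those listed above:

```shell
if ! command -v claude >/dev/null 2>&1; then claude() { cat; }; fi  # pass-through stub when the CLI is absent

printf 'a,b\n1,2\n' > tiny.csv

# A mechanical transformation: the cheapest model in the table is enough
cat tiny.csv | claude --haiku -p "Swap the two columns; output CSV only" > swapped.csv
```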
Batch Processing
Process multiple files in one prompt to save API calls.

Monitoring and Logging
Pipeline with Status Tracking
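One way to track status is a small wrapper that logs each stage's outcome; the stage names, prompts, and files below are invented, and the CLI assumptions are as elsewhere in this guide:

```shell
if ! command -v claude >/dev/null 2>&1; then claude() { cat; }; fi  # pass-through stub when the CLI is absent

echo '{"sales": [3, 9, 27]}' > input.json

# Run one stage and record its exit status in pipeline.log
run_stage() {
  name=$1; prompt=$2; in=$3; out=$4
  if claude -p "$prompt" < "$in" > "$out"; then
    echo "$(date -u +%H:%M:%S) $name ok" >> pipeline.log
  else
    echo "$(date -u +%H:%M:%S) $name FAILED" >> pipeline.log
  fi
}

run_stage extract   "Pull out the numeric series"         input.json series.txt
run_stage summarize "Summarize the trend in one sentence" series.txt summary.txt
```

Because each stage writes to a file, a failed run can be resumed from the last good stage instead of starting over.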
Health Checks
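A cheap round-trip before a long pipeline confirms the CLI is installed and responding; the ping prompt is illustrative:

```shell
status=unknown
if command -v claude >/dev/null 2>&1; then
  # Cheap round-trip to confirm the CLI responds before a long pipeline
  if echo ping | claude -p "Reply with the single word: pong" >/dev/null 2>&1; then
    status=healthy
  else
    status=unhealthy
  fi
else
  status=cli-missing
fi
echo "health: $status" > health.txt
```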
Troubleshooting
Empty Output
Problem: Pipeline produces empty files. Debug by running each stage on its own and inspecting its output before adding the next.

Data Loss in Pipeline
Problem: Data disappears between stages. Solution: Save intermediate results to files so every stage can be inspected and re-run.

Encoding Issues
Problem: Special characters are corrupted. Solution: Force UTF-8 throughout the pipeline, for example by exporting a UTF-8 locale (such as LC_ALL=C.UTF-8) before running it.

Next Steps
- Stdin Processing: Unix pipe fundamentals
- CI/CD Integration: automate data pipelines
- Live Streaming: process data in real-time
- Scripting Guide: advanced pipeline patterns