CipherStream Enterprise Data Extraction Platform
Data Security and Decryption
You may opt to receive your export encrypted at rest. Along with your API key, we’ll issue a base64-encoded 32-byte AES key (“Customer Encryption Key”).
Important: Store your key in a secrets manager. Never share it with us.
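Before wiring the key into your pipeline, it is worth confirming it decodes to exactly 32 bytes (see the validation checklist below). A minimal Python sketch; the function name is ours, not part of the API:

```python
import base64

def validate_customer_key(key_b64: str) -> bytes:
    """Decode the base64 Customer Encryption Key and confirm it is 32 bytes."""
    key = base64.b64decode(key_b64)
    if len(key) != 32:
        raise ValueError(f"expected a 32-byte AES key, got {len(key)} bytes")
    return key
```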
Encrypted delivery format
To request encryption:
{
  "compression": "zip",
  "encrypted": true
}
You’ll receive:
- A ZIP from a presigned S3 URL
- Inside, a single `.ndjson` file
- Each line contains:
{"encrypted_data":"key_id:iv_b64:ciphertext_plus_tag_b64"}
| Part | Meaning |
| --- | --- |
| key_id | Identifier of the key used (informational) |
| iv_b64 | 12-byte IV (nonce), base64 |
| ciphertext_plus_tag_b64 | Base64 of `ciphertext` with the 16-byte auth tag appended |

Example (shortened):

{"encrypted_data":"3e9c..e2f1:0R0y8kS3g8m2s6v8:AAABBBCCC...zzz"}
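The three-part format can be unpacked in a few lines. A sketch in Python (the helper name is ours); base64 text never contains `:`, so a plain split is safe:

```python
import base64
import json

TAG_LEN = 16  # AES-GCM auth tag, appended to the ciphertext

def parse_encrypted_line(line: str):
    """Split one NDJSON line into (key_id, iv, ciphertext, tag)."""
    record = json.loads(line)
    key_id, iv_b64, ctext_tag_b64 = record["encrypted_data"].split(":")
    iv = base64.b64decode(iv_b64)
    raw = base64.b64decode(ctext_tag_b64)
    return key_id, iv, raw[:-TAG_LEN], raw[-TAG_LEN:]
```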
Cryptography profile
- Cipher: AES-256-GCM
- Key: your 32-byte key (base64-decode first)
- IV: 12 bytes (from `iv_b64`)
- Auth tag: 16 bytes (trailing bytes of the decoded blob)
- AAD: `stream:{customer_id}:{row_index}`, where `row_index` is zero-based for each NDJSON line
Decryption steps
- Unzip to get `job_id.ndjson`.
- Read the file line by line.
- Parse the JSON and take `encrypted_data`.
- `split(":")` → `[key_id, iv_b64, ctext_tag_b64]`.
- `iv = base64-decode(iv_b64)`.
- `raw = base64-decode(ctext_tag_b64)`; then:
  - `ciphertext = raw[0 : len(raw) - 16]`
  - `tag = raw[len(raw) - 16 :]`
- `aad = "stream:{customer_id}:{row_index}"`.
- Decrypt with AES-256-GCM (`key`, `iv`, `aad`, `ciphertext`, `tag`).
- Parse the UTF-8 JSON result.
Reference implementations
Replace {{customer_key_b64}} and {{customer_id}} with your values (or Theneo variables).
Python
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import base64
import json
# Your pre-shared values (replace with your own, or Theneo variables)
customer_key_b64 = "{{customer_key_b64}}"
customer_id = "{{customer_id}}"

# Decode your key (must be exactly 32 bytes)
encryption_key = base64.b64decode(customer_key_b64)

# For each encrypted line (row_index starts at 0)
def decrypt_line(encrypted_data, row_index):
    # Split the encrypted data: key_id : iv_b64 : ctext_tag_b64
    parts = encrypted_data.split(":")
    iv = base64.b64decode(parts[1])
    # parts[2] decodes to ciphertext||tag; AESGCM.decrypt expects them together
    ciphertext_and_tag = base64.b64decode(parts[2])

    # Create AAD
    aad = f"stream:{customer_id}:{row_index}".encode("utf-8")

    # Decrypt (raises InvalidTag if the key, IV, AAD, or data don't match)
    aesgcm = AESGCM(encryption_key)
    decrypted_bytes = aesgcm.decrypt(iv, ciphertext_and_tag, aad)

    # Parse JSON
    return json.loads(decrypted_bytes.decode("utf-8"))

Node.js
const crypto = require("node:crypto");
const CUSTOMER_KEY_B64 = "{{customer_key_b64}}";
const CUSTOMER_ID = "{{customer_id}}";
const KEY = Buffer.from(CUSTOMER_KEY_B64, "base64");
function decryptLine(encryptedData, rowIndex) {
const [keyId, iv_b64, ctext_tag_b64] = encryptedData.split(":");
const iv = Buffer.from(iv_b64, "base64");
const raw = Buffer.from(ctext_tag_b64, "base64");
const ciphertext = raw.slice(0, -16);
const tag = raw.slice(-16);
const aad = Buffer.from(`stream:${CUSTOMER_ID}:${rowIndex}`, "utf8");
const decipher = crypto.createDecipheriv("aes-256-gcm", KEY, iv);
decipher.setAAD(aad);
decipher.setAuthTag(tag);
const decrypted = Buffer.concat([decipher.update(ciphertext), decipher.final()]);
return JSON.parse(decrypted.toString("utf8"));
}

Java
C# (.NET)
using System;
using System.Text;
using System.Text.Json;
using System.Security.Cryptography;
public static class Decrypter
{
private static readonly string CustomerKeyB64 = "{{customer_key_b64}}";
private static readonly string CustomerId = "{{customer_id}}";
private static readonly byte[] Key = Convert.FromBase64String(CustomerKeyB64);
public static string DecryptLine(string encryptedData, int rowIndex)
{
var parts = encryptedData.Split(':');
var iv = Convert.FromBase64String(parts[1]);
var raw = Convert.FromBase64String(parts[2]);
var ciphertext = raw[..^16];
var tag = raw[^16..];
var aad = Encoding.UTF8.GetBytes($"stream:{CustomerId}:{rowIndex}");
var plaintext = new byte[ciphertext.Length];
using var aes = new AesGcm(Key);
aes.Decrypt(iv, ciphertext, tag, plaintext, aad);
return Encoding.UTF8.GetString(plaintext);
}
}

Validation & troubleshooting
- Key decodes to exactly 32 bytes
- AAD is exactly `stream:{customer_id}:{row_index}` (row index is zero-based)
- IV is 12 bytes; tag is the last 16 bytes of the decoded blob
- Don’t reuse IV/AAD on your side when testing
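One way to exercise the whole checklist without touching production data is a local round trip: seal a sample row with a throwaway key and fresh IV, then open it with the same split-and-decrypt logic you use on real lines. A sketch using the same `cryptography` library as the Python reference implementation (function and variable names are ours):

```python
import json
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def roundtrip_self_test(customer_id: str = "test-customer") -> bool:
    key = os.urandom(32)   # throwaway stand-in for your decoded key
    iv = os.urandom(12)    # fresh IV; never reuse an IV under the same key
    row = {"hello": "world"}
    aad = f"stream:{customer_id}:0".encode("utf-8")
    # AESGCM.encrypt returns ciphertext||tag, matching the export format
    sealed = AESGCM(key).encrypt(iv, json.dumps(row).encode("utf-8"), aad)
    opened = AESGCM(key).decrypt(iv, sealed, aad)
    return json.loads(opened) == row
```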
1. Patient Management
Core patient data and analytics for comprehensive patient relationship management.
Characteristics:
- Patient demographics - Core patient information and contact details
- Patient analytics - Lifetime value and business intelligence
- Follow-up management - Recall scheduling and tracking
Use Cases:
- Patient management systems
- Contact information updates
- Demographics analysis
- Marketing and communication
- Patient retention analysis
- Follow-up scheduling
2. Scheduling & Appointments
Complete appointment management including scheduling, status tracking, and optimisation.
Characteristics:
- Core appointments - Main appointment records with intelligent processing
- Status management - Appointment statuses and types for workflow control
- Schedule optimisation - Cancellation tracking and waitlist management
- Communication - Reminder and notification systems
Use Cases:
- Appointment booking systems
- Schedule management
- Cancellation analysis
- Waitlist optimisation
- Patient communication
- Scheduling templates
3. Financial Management
Comprehensive financial data including transactions, payments, invoicing, and accounts receivable.
Characteristics:
- Core financials - Main financial transaction records
- Payment processing - Receipts, deposits, and payment allocations
- Billing management - Invoices and discount tracking
- Accounts receivable - Debtor management and outstanding balances
- Pricing structure - Fee schedules and payment types
Use Cases:
- Financial reporting and analysis
- Revenue tracking
- Accounting system integration
- Business intelligence dashboards
- Payment reconciliation
- Debt collection workflows
4. Clinical & Treatment
Clinical data management including completed treatments and treatment planning.
Characteristics:
- Treatment records - Completed procedures and clinical work
- Treatment planning - Proposed treatments and care plans
- Service catalogue - Available procedures and service items
Use Cases:
- Clinical reporting and analysis
- Treatment outcome tracking
- Care plan management
- Service utilisation analysis
- Clinical decision support
- Quality assurance
5. Business Operations
Business expense management and operational cost tracking.
Characteristics:
- Expense tracking - Business expense records with date filtering
- Expense categorisation - Expense categories for financial reporting
Use Cases:
- Expense tracking and reporting
- Budget analysis
- Tax preparation
- Cost centre reporting
- Financial planning
- Operational efficiency analysis
6. Practice Setup & Configuration
Practice management configuration including staff, locations, and system setup.
Characteristics:
- Staff management - Practitioners and users with role definitions
- Location management - Practice locations and facilities
- System configuration - User roles and access control
- External relationships - Health funds, third parties, and referral sources
- Recall management - Recall types and follow-up configurations
Use Cases:
- Practice setup and configuration
- Staff management systems
- Location and facility management
- User access control
- External partner integration
- Follow-up system configuration
7. Advanced Tools
Advanced data extraction capabilities for power users and custom requirements.
Characteristics:
- Direct table access - Extract from any accessible database table
- CALL syntax - Advanced stored procedure execution
- Extended timeout - 180-second timeout for large extractions
- Flexible parameters - Custom table names and date filtering
Use Cases:
- Custom data extractions
- Ad-hoc reporting requirements
- Data migration projects
- Specialised analytics queries
- Direct database access
- Custom integration needs
8. Job Management
Background job management for large data extractions and processing with comprehensive monitoring and control.
When Jobs Are Created:
- Large datasets: >100,000 rows automatically trigger job processing
- File size threshold: >50MB estimated output size
- Explicit job mode: User specifically requests job processing
- System load balancing: High system load triggers job queuing
- Complex queries: Resource-intensive extractions
- Scheduled extractions: Automated recurring data pulls
Job Lifecycle States:
- Queued - Job created and waiting for available processing resources
- Running - Data extraction in progress with real-time progress updates
- Completed - Data successfully extracted and available for download via S3
- Failed - Error occurred during processing with detailed error information
- Cancelled - Job manually cancelled by user or system timeout
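The states above lend themselves to a simple polling loop. This is a hedged sketch: the status call and its response fields are assumptions (the `status` key follows the webhook payload example later on this page), so `fetch_status` stands in for whatever request your client makes:

```python
import time

# Terminal lifecycle states from the list above
TERMINAL_STATES = {"completed", "failed", "cancelled"}

def wait_for_job(fetch_status, poll_seconds: float = 5.0, max_polls: int = 720):
    """Poll fetch_status() until the job reaches a terminal state."""
    for _ in range(max_polls):
        status = fetch_status()
        if status.get("status") in TERMINAL_STATES:
            return status
        time.sleep(poll_seconds)
    raise TimeoutError("job did not finish within the polling window")
```

Webhooks (see Integrations below) avoid polling entirely and are preferable for long-running jobs.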
Job Processing Features:
- Real-time progress tracking - Live percentage completion (0-100%)
- Row count monitoring - Current number of processed records
- Time estimation - Estimated completion time based on current progress
- Resource allocation - Dedicated processing resources for optimal performance
- Error handling - Detailed error messages and recovery suggestions
- Automatic retries - Built-in retry logic for transient failures
Download Management:
- S3 secure storage - Enterprise-grade cloud storage with encryption
- Presigned URLs - Time-limited, secure download links
- 12-hour expiry - URLs automatically expire for security
- Resume support - Partial download recovery for large files
- CDN acceleration - Global content delivery for faster downloads
- Bandwidth optimisation - Compressed files for efficient transfer
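Presigned S3 URLs generally honour HTTP `Range` requests, which is one way to use the resume support noted above; whether CipherStream guarantees this is not stated here, so treat it as an assumption. This sketch shows only the header arithmetic for continuing a partial download:

```python
import os

def resume_headers(partial_path: str) -> dict:
    """Build a Range header continuing from the bytes already on disk."""
    size = os.path.getsize(partial_path) if os.path.exists(partial_path) else 0
    return {"Range": f"bytes={size}-"} if size else {}
```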
Monitoring & Notifications:
- Webhook integration - Real-time job completion alerts
- Email notifications - Optional email alerts for job status changes
- Progress callbacks - Periodic progress updates via webhooks
- Performance metrics - Execution time, throughput, and resource usage
- Audit logging - Complete job history and user actions
Security & Compliance:
- Data encryption - AES-256-GCM encryption for stored files
- Access control - Customer-specific job isolation
- Audit trail - Complete job lifecycle logging
- Automatic cleanup - Files removed after expiry for data protection
- IP restrictions - Optional IP-based access control
9. Integrations
Advanced webhook system for real-time notifications, system integration, and automated workflow triggers.
Supported Events:
- job.completed - Job finished successfully with download URL and metadata
- job.failed - Job encountered an error with detailed failure information
- job.cancelled - Job was manually cancelled or timed out
- job.progress - Periodic progress updates during job execution (optional)
- system.maintenance - Scheduled maintenance notifications
- api.rate_limit - Rate limit threshold warnings
Webhook Security Features:
- HMAC-SHA256 signature - Cryptographic payload verification using shared secret
- Timestamp validation - Prevents replay attacks with time-based verification
- IP whitelisting - Optional source IP restrictions for enhanced security
- TLS encryption - All webhook deliveries use HTTPS/TLS 1.3
- Signature verification - Complete payload integrity checking
Delivery & Reliability:
- Automatic retries - Up to 3 retry attempts with exponential backoff
- Delivery tracking - Complete success/failure monitoring and logging
- Timeout handling - 30-second response timeout with configurable settings
- Dead letter queue - Failed deliveries stored for manual retry
- Circuit breaker - Automatic endpoint disabling for persistent failures
- Rate limiting - Configurable delivery rate limits to avoid overwhelming your endpoint
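Because retries can deliver the same event more than once, receivers should be idempotent. One simple approach is deduplicating on `delivery_id` (this in-memory set is a sketch; a production handler would use a persistent store):

```python
# Delivery IDs already processed (sketch: in-memory only)
_seen_deliveries = set()

def handle_once(payload: dict, handler) -> bool:
    """Invoke handler(payload) only for deliveries not seen before."""
    delivery_id = payload["delivery_id"]
    if delivery_id in _seen_deliveries:
        return False
    _seen_deliveries.add(delivery_id)
    handler(payload)
    return True
```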
Webhook Payload Structure:
{
  "event": "job.completed",
  "timestamp": "2024-09-26T10:02:15Z",
  "signature": "sha256=abc123def456...",
  "delivery_id": "del_1758630144658",
  "attempt": 1,
  "data": {
    "job_id": "appointments_1758630144658_c7511999",
    "customer_id": "your-customer-id",
    "procedure_name": "appointments",
    "status": "completed",
    "rows_processed": 45678,
    "execution_time_seconds": 64.75,
    "file_size_bytes": 2048576,
    "s3_url": "https://secure-download-url",
    "expires_at": "2024-09-26T22:02:15Z",
    "output_format": "ndjson",
    "compression": "gzip",
    "metadata": {
      "from_date": "2024-01-01",
      "to_date": "2024-12-31",
      "date_modifier": "Created"
    }
  }
}
Signature Verification:
import hmac
import hashlib

def verify_webhook_signature(payload, signature, secret):
    # Compute HMAC-SHA256 over the raw request body using your shared secret
    expected_signature = hmac.new(
        secret.encode('utf-8'),
        payload.encode('utf-8'),
        hashlib.sha256
    ).hexdigest()
    # Constant-time comparison against the "sha256=..." header value
    return hmac.compare_digest(f"sha256={expected_signature}", signature)
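The timestamp validation mentioned under security features can sit alongside the HMAC check: reject payloads whose `timestamp` falls outside a small tolerance window. The ISO-8601 format follows the payload example above; the five-minute tolerance is our assumption, not a documented value:

```python
from datetime import datetime, timezone

def timestamp_is_fresh(timestamp_iso: str, tolerance_seconds: int = 300, now=None) -> bool:
    """Return True if the webhook timestamp is within the tolerance window."""
    sent = datetime.fromisoformat(timestamp_iso.replace("Z", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return abs((now - sent).total_seconds()) <= tolerance_seconds
```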
Configuration Options:
- Event filtering - Subscribe to specific events only
- Custom headers - Add custom HTTP headers to webhook requests
- Retry configuration - Customize retry attempts and backoff strategy
- Timeout settings - Configure response timeout values
- Batch delivery - Group multiple events into single webhook call
Monitoring & Debugging:
- Delivery logs - Complete webhook delivery history
- Response tracking - HTTP status codes and response times
- Error analysis - Detailed failure reasons and troubleshooting
- Performance metrics - Delivery success rates and latency statistics
- Test endpoints - Webhook testing and validation tools
Use Cases:
- Automated workflows - Trigger business processes when jobs complete
- Real-time notifications - Instant alerts when data extraction finishes
- System integration - Connect CipherStream to other business systems
- Data pipeline automation - Chain multiple data processing steps
- Monitoring and alerting - Track job completion and system health
- Business intelligence - Trigger report generation and dashboard updates
- Customer notifications - Inform end users when their data is ready
10. Documentation
API documentation and OpenAPI schemas for integration and development.
Available Documentation:
- Interactive Docs - Swagger UI for testing endpoints
- OpenAPI Schema - Machine-readable API specification
- Integration Guides - Code examples and best practices
Use Cases:
- Interactive API testing and exploration
- Code generation for client libraries
- API documentation for development teams
- Integration planning and validation
- Request/response format reference