# DAP CoreML — Panoramic Depth Estimation for Apple Silicon
CoreML export of DAP (Depth Any Panoramas), a foundation model for monocular depth estimation on equirectangular 360° panoramas. Optimized for on-device inference on iOS 18+ and macOS with Apple Silicon.
| Attribute | Value |
|---|---|
| Original model | DAP (Insta360 Research) |
| Architecture | Depth-Anything-V2 + DINOv3 (ViT-L) |
| Input | Equirectangular panorama, 2:1 aspect ratio (default 1024×512) |
| Output | Monocular depth map, float32, same resolution as input |
| CoreML size | ~1.2 GB |
| Deployment | iOS 18+, macOS 15+ (Apple Silicon) |
## Validation
| Metric | Value |
|---|---|
| Max absolute difference | 5.54×10⁻⁶ |
| Mean absolute difference | 4.50×10⁻⁷ |
| Correlation | 1.000000 |
| CoreML inference (M-series) | ~650 ms |
## Quick Start — CLI (macOS)

Compile and run `DepthPredictor.swift` as a standalone tool — no Xcode project needed:
```bash
# Compile
swiftc -O -o depth_predictor DepthPredictor.swift \
    -framework CoreML -framework Vision -framework CoreImage \
    -framework CoreGraphics -framework AppKit

# Generate a 16-bit grayscale depth map
./depth_predictor -m DAPModel.mlpackage -i panorama.jpg -o depth.png

# Colorized with jet colormap
./depth_predictor -m DAPModel.mlpackage -i panorama.jpg -o depth.png -c jet

# Turbo colormap
./depth_predictor -m DAPModel.mlpackage -i panorama.jpg -o depth.png -c turbo
```
Options:
| Flag | Description |
|---|---|
| `-m, --model PATH` | Path to `DAPModel.mlpackage` or `.mlmodelc` |
| `-i, --input PATH` | Input equirectangular panorama (2:1 aspect ratio) |
| `-o, --output PATH` | Output PNG file |
| `-c, --colormap STYLE` | `grayscale` (16-bit, default), `jet`, or `turbo` |
The model is automatically compiled on first use and cached for subsequent runs.
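The compile-once-then-cache behavior can be mirrored in your own tooling. A minimal sketch, assuming a helper named `resolvedModelURL` (the CLI's actual caching logic may differ):

```swift
import Foundation
import CoreML

// Sketch: resolve a model URL to a loadable, compiled form.
// Pre-compiled .mlmodelc bundles load directly; .mlpackage
// bundles must be compiled once with CoreML.
func resolvedModelURL(_ url: URL) throws -> URL {
    if url.pathExtension == "mlmodelc" {
        return url  // already compiled, nothing to do
    }
    // Compiles into a temporary directory; a persistent cache
    // would copy the result somewhere stable before reuse.
    return try MLModel.compileModel(at: url)
}
```

The `.mlmodelc` fast path is what makes subsequent runs cheap: only the first invocation pays the compilation cost.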
## Quick Start — 360° Gaussian Splats (macOS)

Convert an equirectangular panorama directly into a 3D Gaussian splat `.ply` file — one Gaussian per pixel, compatible with standard 3DGS viewers:
```bash
# Compile
swiftc -O -o panorama_splat PanoramaSplat.swift \
    -framework CoreML -framework Vision -framework CoreImage \
    -framework CoreGraphics -framework AppKit

# Generate a Gaussian splat PLY
./panorama_splat -m DAPModel.mlpackage -i test/test.png -o scene.ply -r 5.0
```
Options:
| Flag | Description |
|---|---|
| `-m, --model PATH` | Path to `DAPModel.mlpackage` |
| `-i, --input PATH` | Input equirectangular panorama (2:1 aspect ratio) |
| `-o, --output PATH` | Output PLY file |
| `-r, --radius FLOAT` | Sphere radius in world units (default: 5.0) |
The PLY uses the same binary format as SHARP, with per-pixel positions projected onto a sphere using estimated depth, image-derived colors (SH0), uniform scale/opacity, and identity quaternions.
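The per-pixel spherical projection can be sketched as below. The exact longitude/latitude convention and axis orientation used by `PanoramaSplat.swift` are assumptions here, not taken from the source:

```swift
import Foundation

// Sketch: map an equirectangular pixel (x, y) plus its estimated
// depth to a 3D point along the viewing ray. Longitude spans
// [-π, π] across the width, latitude [π/2, -π/2] down the height.
func splatPosition(x: Int, y: Int, width: Int, height: Int,
                   depth: Float) -> (Float, Float, Float) {
    let lon = (Float(x) / Float(width) - 0.5) * 2 * .pi
    let lat = (0.5 - Float(y) / Float(height)) * .pi
    // Unit direction on the sphere, scaled by depth.
    let dx = cos(lat) * sin(lon)
    let dy = sin(lat)
    let dz = cos(lat) * cos(lon)
    return (depth * dx, depth * dy, depth * dz)
}
```

For example, the image-center pixel (lon = 0, lat = 0) lands on the forward axis at distance `depth`.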
## Quick Start — Xcode (iOS / macOS)

Add `DAPModel.mlpackage` to your Xcode project (Xcode auto-generates the `DAPModel` Swift class), then use the included `DepthPredictor.swift`:
```swift
import Foundation
import CoreML
import Vision
import CoreImage

// Load the model from a .mlpackage URL
let modelURL = Bundle.main.url(forResource: "DAPModel", withExtension: "mlpackage")!
let predictor = DepthPredictor(modelURL: modelURL)

// Run inference on a CGImage (equirectangular panorama)
predictor.predictDepth(from: cgImage) { depth in
    guard let depth = depth else { return }

    // `depth` is a DepthResult with raw Float32 values and a CIImage

    // Colorize with jet colormap
    let colorized = predictor.applyJetColormap(to: depth)

    // Or access raw depth values directly
    let values = depth.getDepthValues() // [Float32], row-major
}
```
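If you want to visualize the raw values yourself rather than using the bundled colormaps, a simple min/max normalization is enough; `normalizeDepth` below is a hypothetical helper, not part of `DepthPredictor.swift`:

```swift
import Foundation

// Sketch: rescale raw Float32 depth values into [0, 1] for display
// or for quantizing into a 16-bit grayscale image.
func normalizeDepth(_ values: [Float]) -> [Float] {
    guard let lo = values.min(), let hi = values.max(), hi > lo else {
        return values.map { _ in 0 }  // empty or flat input
    }
    let range = hi - lo
    return values.map { ($0 - lo) / range }
}
```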
## Files
| File | Description |
|---|---|
| `DAPModel.mlpackage/` | CoreML model (depth-only, ImageType input) |
| `model.pth` | Original DAP PyTorch weights |
| `export_and_validate_coreml.py` | Export + validation script |
| `DepthPredictor.swift` | Swift inference wrapper |
| `depth_anything_utils.py` | Image preprocessing utilities |
| `networks/` | DAP model definition |
| `depth_anything_v2_metric/` | Depth-Anything-V2 + DINOv3 backbone |
| `test/test.png` | Test panorama for validation |
| `test_output/` | PyTorch vs CoreML comparison |
## Export from Scratch
Reproduce the CoreML model from the PyTorch weights:
```bash
# Install dependencies
pip install -r requirements.txt

# Export and validate (produces DAPModel.mlpackage + test_output/)
python export_and_validate_coreml.py

# Custom resolution (must be multiples of 16)
python export_and_validate_coreml.py --height 768 --width 1536

# Skip export, only validate existing model
python export_and_validate_coreml.py --skip_export
```
## Citation
```bibtex
@article{lin2025dap,
  title={Depth Any Panoramas: A Foundation Model for Panoramic Depth Estimation},
  author={Lin, Xin and Song, Meixi and Zhang, Dizhe and Lu, Wenxuan and Li, Haodong and Du, Bo and Yang, Ming-Hsuan and Nguyen, Truong and Qi, Lu},
  journal={arXiv},
  year={2025}
}
```
## License

- Original DAP weights and model architecture: MIT (Insta360 Research Team)
- CoreML export and Swift wrapper: MIT