# DAP CoreML — Panoramic Depth Estimation for Apple Silicon
CoreML export of DAP (Depth Any Panoramas), a foundation model for monocular depth estimation on equirectangular 360° panoramas. Optimized for on-device inference on iOS 18+ and macOS with Apple Silicon.
## 📦 Download the Core ML Model Only

```shell
pip install huggingface-hub
huggingface-cli download --include DAPModel.mlpackage/ --local-dir . pearsonkyle/DepthAnyPanorama-coreml
```
## 🧰 Clone the Full Repository

Cloning the full repository also gives you the inference and model conversion/validation scripts.

```shell
brew install git-xet
git xet install
```

Clone the model repository:

```shell
git clone git@hf.co:pearsonkyle/DepthAnyPanorama-coreml
```
| Property | Details |
|---|---|
| Original model | DAP (Insta360 Research) |
| Architecture | Depth-Anything-V2 + DINOv3 (ViT-L) |
| Input | Equirectangular panorama, 2:1 aspect ratio (default 1536×768) |
| Output | Monocular depth map, float32, same resolution as input |
| CoreML size | ~1.2 GB |
| Deployment | iOS 18+, macOS 15+ (Apple Silicon) |
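For quick experiments in Python, the exported model can be exercised with `coremltools` along the lines of the sketch below. The feature names (`"image"`, `"depth"`) are assumptions — inspect the real names with `model.get_spec()` — and the 16-bit normalization helper is purely illustrative.

```python
from typing import Sequence


def normalize_depth_to_u16(depth: Sequence[float]) -> list:
    """Min-max normalize a flat depth buffer into the 16-bit PNG range."""
    lo, hi = min(depth), max(depth)
    scale = 65535.0 / (hi - lo) if hi > lo else 0.0
    return [int(round((d - lo) * scale)) for d in depth]


def predict_depth(model_path: str, image_path: str):
    """Run the CoreML model on one panorama (requires coremltools + Pillow)."""
    import coremltools as ct          # heavy dependencies imported lazily
    from PIL import Image

    model = ct.models.MLModel(model_path)
    img = Image.open(image_path).convert("RGB").resize((1536, 768))
    # "image" / "depth" are hypothetical feature names; check model.get_spec()
    out = model.predict({"image": img})
    return out["depth"]
```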
## Validation
The CoreML export achieves near-perfect numerical fidelity to the original PyTorch implementation, with negligible differences in depth predictions and a correlation of 1.0. Note that changing the input resolution can sometimes introduce artifacts around the edges of the panorama.
| Metric | Value |
|---|---|
| Max absolute difference | 5.54×10⁻⁶ |
| Mean absolute difference | 4.50×10⁻⁷ |
| Correlation | 1.000000 |
| CoreML inference (M-series) | ~650 ms |
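The fidelity metrics above boil down to a straightforward comparison of two flattened depth buffers. A minimal pure-Python sketch of that comparison (the input lists stand in for the real PyTorch and CoreML outputs):

```python
import math


def compare_depths(a, b):
    """Max/mean absolute difference and Pearson correlation of two flat buffers."""
    assert len(a) == len(b) and len(a) > 1
    n = len(a)
    diffs = [abs(x - y) for x, y in zip(a, b)]
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    corr = cov / math.sqrt(var_a * var_b)
    return max(diffs), sum(diffs) / n, corr
```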
## Command line on macOS

Compile and run `DepthPredictor.swift` as a standalone tool — no Xcode project needed:

```shell
# Compile
swiftc -O -o depth_predictor DepthPredictor.swift \
  -framework CoreML -framework Vision -framework CoreImage \
  -framework CoreGraphics -framework AppKit

# Generate a 16-bit grayscale depth map
./depth_predictor -m DAPModel.mlpackage -i panorama.jpg -o depth.png

# Colorize with the jet colormap
./depth_predictor -m DAPModel.mlpackage -i panorama.jpg -o depth.png -c jet

# Colorize with the turbo colormap
./depth_predictor -m DAPModel.mlpackage -i panorama.jpg -o depth.png -c turbo
```
Options:

| Flag | Description |
|---|---|
| `-m, --model PATH` | Path to `DAPModel.mlpackage` or `.mlmodelc` |
| `-i, --input PATH` | Input equirectangular panorama (2:1 aspect ratio) |
| `-o, --output PATH` | Output PNG file |
| `-c, --colormap STYLE` | `grayscale` (16-bit, default), `jet`, or `turbo` |
The model is automatically compiled on first use and cached for subsequent runs.
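The `jet` and `turbo` styles map normalized depth to RGB. As an illustration of the general idea (a common piecewise-linear jet approximation, not the exact palette used by `DepthPredictor.swift`):

```python
def jet_rgb(t: float) -> tuple:
    """Map t in [0, 1] to an approximate jet-colormap RGB triple in [0, 255]."""
    t = min(max(t, 0.0), 1.0)

    def channel(x):
        # Tent function clamped to [0, 1]: each channel peaks at a different t
        return min(max(1.5 - abs(x), 0.0), 1.0)

    r = channel(4.0 * t - 3.0)   # red peaks near the far (hot) end
    g = channel(4.0 * t - 2.0)   # green peaks in the middle
    b = channel(4.0 * t - 1.0)   # blue peaks near the near (cold) end
    return tuple(int(round(255 * c)) for c in (r, g, b))
```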
## 360° Gaussian Splat from Panorama

Convert an equirectangular panorama directly into a 3D Gaussian splat `.ply` file — one Gaussian per pixel, compatible with standard 3DGS viewers:

```shell
# Compile
swiftc -O -o panorama_splat PanoramaSplat.swift \
  -framework CoreML -framework Vision -framework CoreImage \
  -framework CoreGraphics -framework AppKit

# Generate a Gaussian splat PLY
./panorama_splat -m DAPModel.mlpackage -i test/test.png -o scene.ply -r 5.0
```
Options:

| Flag | Description |
|---|---|
| `-m, --model PATH` | Path to `DAPModel.mlpackage` |
| `-i, --input PATH` | Input equirectangular panorama (2:1 aspect ratio) |
| `-o, --output PATH` | Output PLY file |
| `-r, --radius FLOAT` | Sphere radius in world units (default: 5.0) |
The PLY uses the same binary format as SHARP, with per-pixel positions projected onto a sphere using estimated depth, image-derived colors (SH0), uniform scale/opacity, and identity quaternions.
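The per-pixel sphere projection can be sketched as follows; the exact angular convention and depth scaling used by `PanoramaSplat.swift` are assumptions here, chosen as the standard equirectangular mapping:

```python
import math


def pixel_to_splat_position(u, v, width, height, depth, radius=5.0):
    """Project equirectangular pixel (u, v) onto a depth-scaled sphere.

    Longitude spans [-pi, pi] across the width, latitude [pi/2, -pi/2]
    down the height; the unit view direction is scaled by depth * radius.
    """
    lon = (u / width - 0.5) * 2.0 * math.pi
    lat = (0.5 - v / height) * math.pi
    dx = math.cos(lat) * math.sin(lon)
    dy = math.sin(lat)
    dz = math.cos(lat) * math.cos(lon)
    r = depth * radius
    return (r * dx, r * dy, r * dz)
```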
## Files

| File | Description |
|---|---|
| `DAPModel.mlpackage/` | CoreML model (depth-only, ImageType input) |
| `model.pth` | Original DAP PyTorch weights |
| `export_and_validate_coreml.py` | Export + validation script |
| `DepthPredictor.swift` | Swift inference wrapper |
| `depth_anything_utils.py` | Image preprocessing utilities |
| `networks/` | DAP model definition |
| `depth_anything_v2_metric/` | Depth-Anything-V2 + DINOv3 backbone |
| `test/test.png` | Test panorama for validation |
| `test_output/` | PyTorch vs CoreML comparison outputs |
## Export from Scratch

Reproduce the CoreML model from the PyTorch weights:

```shell
# Install dependencies
pip install -r requirements.txt

# Export and validate (produces DAPModel.mlpackage + test_output/)
python export_and_validate_coreml.py

# Custom resolution (height and width must be multiples of 16)
python export_and_validate_coreml.py --height 768 --width 1536

# Skip export; only validate an existing model
python export_and_validate_coreml.py --skip_export
```
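Since the export requires height and width to be multiples of 16, and the model expects a 2:1 panorama, a small helper can snap a requested height to a valid size. This is an illustrative utility, not part of the repository:

```python
def valid_pano_size(height: int) -> tuple:
    """Snap height to the nearest multiple of 16 and return (height, width)
    for a 2:1 equirectangular panorama. Because width = 2 * height, the
    width is then automatically a multiple of 16 as well."""
    h = max(16, int(round(height / 16.0)) * 16)
    return h, 2 * h
```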
## Citation

```bibtex
@article{lin2025dap,
  title={Depth Any Panoramas: A Foundation Model for Panoramic Depth Estimation},
  author={Lin, Xin and Song, Meixi and Zhang, Dizhe and Lu, Wenxuan and Li, Haodong and Du, Bo and Yang, Ming-Hsuan and Nguyen, Truong and Qi, Lu},
  journal={arXiv},
  year={2025}
}
```
## License

- Original DAP weights and model architecture: MIT (Insta360 Research Team)
- CoreML export and Swift wrapper: MIT