Plug a real camera, a video file, or your screen into the iOS Simulator. Finally.
The iOS Simulator has never supported a real camera. `AVCaptureDevice` discovery comes back empty. Every app that touches the camera (QR scanners, barcode readers, document capture, ML pipelines, AR prototypes) either stubs out the camera path, runs only on device, or ships a brittle "use a photo instead" fallback.
SimulatorCamera is a tiny two-piece developer tool that fixes it:

- a macOS companion app that streams video frames over `localhost:9876` using a compact binary protocol (SCMF, the Simulator Camera Message Format), and
- an iOS Swift Package with an `AVCaptureSession`-shaped API. On device it compiles to a no-op.
Frames show up in your app. Vision, VisionKit, Core ML, barcode detection, custom pipelines: the SDK is designed to drive them in the Simulator at 25–30 FPS over localhost, with no device, no cables, and no private APIs.
Status: v0.2.0 is a preview cut. A recorded demo and independent benchmarks will land with the first tagged release; for now, the protocol and shim are best-effort and we're actively looking for early testers.
Every camera-using app today has one of these:

```swift
#if targetEnvironment(simulator)
// TODO: fake it somehow
#else
let session = AVCaptureSession()
// ...real code
#endif
```

This project deletes that TODO. Same API shape in the Simulator and on device.
- 🎥 Live video into the Simulator at 30 FPS via `localhost` TCP
- 🧩 Drop-in SDK: `FrameSource` mirrors `AVCaptureSession` semantics (`start()`, `stop()`, delegate, `CVPixelBuffer` callbacks)
- 🖥 Sources on the Mac: test pattern (built-in), webcam, video file, screen region (roadmap)
- 📦 One-line install via Swift Package Manager
- 🛡 No private APIs: `Network.framework` + `CoreVideo` + `ImageIO`
- ⚡ Zero overhead on device: `#if targetEnvironment(simulator)`-guarded
- 🔒 Localhost-only by default
- 🧪 Vision / Core ML ready: frames land as `CVPixelBuffer`
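Because frames arrive as `CVPixelBuffer`, they can go straight into Vision. Here's a minimal barcode-detection sketch using standard Vision APIs; the `detectBarcodes(in:)` helper and however you obtain `pixelBuffer` are illustrative, not part of the SDK:

```swift
import Vision
import CoreVideo

// Sketch: run barcode detection on one simulator-delivered frame.
// VNDetectBarcodesRequest / VNImageRequestHandler are standard Vision APIs;
// call this from whatever delegate hands you the CVPixelBuffer.
func detectBarcodes(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectBarcodesRequest { request, _ in
        guard let results = request.results as? [VNBarcodeObservation] else { return }
        for barcode in results {
            print("Found \(barcode.symbology.rawValue): \(barcode.payloadStringValue ?? "?")")
        }
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```

The same pattern works for any `VNRequest` (text recognition, rectangle detection, a Core ML classifier) since all of them accept a `CVPixelBuffer` through `VNImageRequestHandler`.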
```swift
dependencies: [
    .package(url: "https://github.com/dautovri/SimulatorCamera.git", from: "0.2.0"),
],
targets: [
    .target(
        name: "MyApp",
        dependencies: [
            .product(name: "SimulatorCameraClient", package: "SimulatorCamera"),
        ]
    ),
]
```

Or in Xcode: File → Add Package Dependencies… → paste the repo URL.
Homebrew (recommended):

```shell
brew install --cask dautovri/tap/simulatorcamera
open -a SimulatorCameraServer
```

Or grab the signed & notarized .dmg from Releases. Or build from source:

```shell
git clone https://github.com/dautovri/SimulatorCamera.git
cd SimulatorCamera/apps/MacServer
open SimulatorCameraServer.xcodeproj
```

- Launch SimulatorCameraServer.app on your Mac. Pick a source and click Start.
- In your iOS code:
```swift
import SimulatorCameraClient
import CoreMedia

final class CameraController: NSObject, FrameSourceDelegate {
    private let source: FrameSource

    override init() {
        #if targetEnvironment(simulator)
        source = SimulatorCameraSession(host: "127.0.0.1", port: 9876)
        #else
        source = AVCaptureFrameSource() // your existing AVCapture wrapper
        #endif
        super.init()
        source.delegate = self
        source.start()
    }

    func frameSource(_ source: FrameSource, didOutput pixelBuffer: CVPixelBuffer, at time: CMTime) {
        // Feed to Vision, Core ML, preview layer, whatever.
    }
}
```

The shim now mirrors the whole `AVCaptureSession` → `addInput` → `addOutput` → `startRunning` dance. Your existing camera-setup code ports over by prefixing each type with `Simulator`:
```swift
import SimulatorCameraClient

SimulatorCamera.configure(host: "127.0.0.1", port: 9876)

let session = SimulatorCaptureSession()
session.sessionPreset = .hd1280x720

guard let device = SimulatorCaptureDevice.default(for: .video) else { return }
let input = try SimulatorCaptureDeviceInput(device: device)
session.addInput(input)

let output = SimulatorCameraOutput() // AVCaptureVideoDataOutput-shaped
output.setSampleBufferDelegate(self, queue: frameQueue)
session.addOutput(output)

session.startRunning() // kicks off the network session
```

Your existing `captureOutput(_:didOutput:from:)` delegate fires with a valid `CMSampleBuffer` wrapping a `CVPixelBuffer`, the same code path as on a real device.
If you already have an `AVCaptureVideoDataOutputSampleBufferDelegate`, swap the output for `SimulatorCameraOutput` inside a simulator guard and keep your delegate code unchanged. The standard `captureOutput(_:didOutput:from:)` method fires with a real `CMSampleBuffer`; because `SimulatorCameraOutput` is an `AVCaptureVideoDataOutput` subclass, the first argument is a genuine AV output, not a stand-in:
```swift
#if targetEnvironment(simulator)
let output = SimulatorCameraOutput()
output.setSampleBufferDelegate(self, queue: myQueue)
SimulatorCamera.start()
#else
let output = AVCaptureVideoDataOutput()
output.setSampleBufferDelegate(self, queue: myQueue)
session.addOutput(output)
#endif
```

Or use the drop-in SwiftUI view:
```swift
import SwiftUI
import SimulatorCameraClient

struct ContentView: View {
    var body: some View {
        SimulatorCameraPreviewView()
    }
}
```

Each SCMF frame on the wire looks like this:

```
+--------+---------------+----------------+--------+--------+----------+
| magic  | payloadLength | timestamp      | width  | height | jpegData |
| 4 B    | 4 B uint32 LE | 8 B Float64 LE | 4 B LE | 4 B LE | N bytes  |
| "SCMF" |               |                |        |        |          |
+--------+---------------+----------------+--------+--------+----------+
```
Full spec: docs/PROTOCOL.md · architecture: docs/ARCHITECTURE.md · roadmap: docs/ROADMAP.md.
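To sanity-check the wire format, a frame can be serialized by hand. This sketch follows the diagram above; the `SCMFFrame` type is hypothetical, and the assumption that `payloadLength` counts everything after the length field (timestamp + width + height + JPEG bytes) should be checked against docs/PROTOCOL.md:

```swift
import Foundation

// Hypothetical encoder for the SCMF layout shown above.
// Assumption: payloadLength covers timestamp + width + height + jpegData.
struct SCMFFrame {
    static let magic = Data("SCMF".utf8)

    var timestamp: Double   // seconds, Float64 little-endian on the wire
    var width: UInt32
    var height: UInt32
    var jpegData: Data

    func encoded() -> Data {
        var out = Data()
        out.append(Self.magic)
        let payloadLength = UInt32(8 + 4 + 4 + jpegData.count)
        withUnsafeBytes(of: payloadLength.littleEndian) { out.append(contentsOf: $0) }
        withUnsafeBytes(of: timestamp.bitPattern.littleEndian) { out.append(contentsOf: $0) }
        withUnsafeBytes(of: width.littleEndian) { out.append(contentsOf: $0) }
        withUnsafeBytes(of: height.littleEndian) { out.append(contentsOf: $0) }
        out.append(jpegData)
        return out
    }
}
```

Under that assumption, a reader consumes the 8-byte magic + length prefix, waits for `payloadLength` further bytes, then decodes the JPEG payload (e.g. with ImageIO).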
```
SimulatorCamera/
├── Package.swift                     # SwiftPM manifest (exposes SimulatorCameraClient)
├── Sources/SimulatorCameraClient/    # the iOS SDK
├── Tests/SimulatorCameraClientTests/ # unit tests for the SCMF codec
├── apps/
│   ├── MacServer/                    # SwiftUI macOS companion app
│   └── iOSDemo/                      # sample iOS app using the SDK
├── docs/
│   ├── PROTOCOL.md                   # wire format
│   ├── ARCHITECTURE.md               # threading, transport, failure modes
│   └── ROADMAP.md
├── Casks/simulatorcamera.rb          # Homebrew cask formula
├── scripts/
│   ├── bootstrap.sh                  # swift build + test
│   └── build-release.sh              # archive + codesign + notarize + .dmg/.zip
├── .github/
│   ├── FUNDING.yml                   # GitHub Sponsors / BMC
│   └── workflows/
│       ├── ci.yml                    # SwiftPM CI on macos-14
│       └── release.yml               # tag-driven signed release
└── RELEASING.md                      # release runbook
```
```shell
./scripts/bootstrap.sh   # swift build && swift test
```

v0.2.0, "Use my real camera": first stable release with a drop-in `AVCaptureSession` shim and live Mac webcam source. See CHANGELOG.md and docs/RELEASE_NOTES_v0.2.0.md.
See CONTRIBUTING.md. Good first issues are labelled on the tracker. For release mechanics, see RELEASING.md.
SimulatorCamera is fully MIT-licensed and maintained on donations. If it saves you a device-build loop, consider sponsoring or buying a coffee. No paid tier, no license keys, no telemetry; just a tip jar.
MIT; see LICENSE.