Integrating Device Camera in SwiftUI Apps

Learn how to bridge UIKit and AVFoundation with SwiftUI to integrate camera functionality.

When building modern iOS apps with SwiftUI, you'll eventually encounter scenarios where you need real-time camera functionality, whether for photo capture, document scanning, or augmented reality features. You might expect SwiftUI to provide native camera components, but here's the thing: SwiftUI doesn't include built-in camera views or direct camera access.

While SwiftUI excels at declarative UI design and state management, camera functionality requires the lower-level capabilities of AVFoundation, Apple's multimedia framework. This creates a gap that developers must bridge by integrating UIKit components into their SwiftUI projects.

The solution lies in understanding how to properly bridge UIKit's camera capabilities with SwiftUI's reactive architecture, creating a seamless integration that maintains SwiftUI's benefits while accessing the full power of iOS camera APIs.

Camera integration uses three key patterns to bridge UIKit and SwiftUI: 

  1. UIViewRepresentable: Protocol that wraps UIKit views for use in SwiftUI
  2. ObservableObject: SwiftUI's state management system for sharing data across views
  3. Delegate Pattern: Handles asynchronous camera operations and callbacks
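
Before building the real components, here's a minimal sketch of how these patterns fit together; SketchModel and SketchUIKitView are placeholders, not part of the final app:

import SwiftUI
import UIKit

// 1. ObservableObject: publishes state that SwiftUI views react to
class SketchModel: ObservableObject {
    @Published var message = "Waiting for UIKit…"
}

// 2. UIViewRepresentable: wraps an imperatively managed UIKit view for SwiftUI
struct SketchUIKitView: UIViewRepresentable {
    @ObservedObject var model: SketchModel

    func makeUIView(context: Context) -> UILabel {
        UILabel()   // created the imperative, UIKit way
    }

    func updateUIView(_ uiView: UILabel, context: Context) {
        uiView.text = model.message   // SwiftUI state drives the UIKit view
    }
}

// 3. Delegate pattern: asynchronous UIKit/AVFoundation events arrive through
//    protocol callbacks; the ObservableObject adopts the protocol and
//    republishes the result, which is exactly what CameraManager does below
//    with AVCapturePhotoCaptureDelegate.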

The complete solution consists of four main components: 

  • CameraManager: Handles all AVFoundation logic and camera operations 
  • CameraPreview: Bridges AVFoundation's preview layer to SwiftUI via UIViewRepresentable 
  • CameraView: Provides the SwiftUI interface for camera interaction
  • PhotoView: Manages photo gallery and camera presentation

AVFoundation

AVFoundation is Apple's comprehensive multimedia framework for working with audiovisual media. You need it to access camera hardware directly, capture and process images and video, display real-time camera previews, and manage session configuration and lifecycle.

Here are the key AVFoundation components that power your camera integration:

  • AVCaptureSession: Coordinates the flow of data from capture inputs to outputs
  • AVCaptureDevice and AVCaptureDeviceInput: Represent the physical camera and feed its data into the session
  • AVCapturePhotoOutput: Captures still photos from the session
  • AVCaptureVideoPreviewLayer: A CALayer subclass that renders the live camera feed

Let's define a reusable class that wraps all camera logic using AVFoundation. This will act as the bridge layer between the low-level camera APIs and the SwiftUI view layer:

import AVFoundation
import UIKit

class CameraManager: NSObject, ObservableObject {
    // AVFoundation Components
    private let session = AVCaptureSession()
    private let output = AVCapturePhotoOutput()
    private var previewLayer: AVCaptureVideoPreviewLayer?

    // Published property for SwiftUI binding
    @Published var capturedImage: UIImage?
    
    func startSession() {
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input),
              session.canAddOutput(output) else { return }

        // Session Configuration Pattern
        session.beginConfiguration()
        session.addInput(input)
        session.addOutput(output)
        session.commitConfiguration()

        // startRunning() blocks the calling thread, so keep it off the main queue
        DispatchQueue.global(qos: .userInitiated).async { [weak self] in
            self?.session.startRunning()
        }
    }
    
    // Lifecycle Management
    func stopSession() {
        session.stopRunning()
        
        session.beginConfiguration()
        session.inputs.forEach { session.removeInput($0) }
        session.outputs.forEach { session.removeOutput($0) }
        session.commitConfiguration()
        
        previewLayer = nil
        capturedImage = nil
    }

    // Preview Layer Management
    func getPreviewLayer() -> AVCaptureVideoPreviewLayer {
        if let layer = previewLayer {
            return layer
        } else {
            let layer = AVCaptureVideoPreviewLayer(session: session)
            layer.videoGravity = .resizeAspectFill
            previewLayer = layer
            return layer
        }
    }
    
    func takePhoto() {
        let settings = AVCapturePhotoSettings()
        output.capturePhoto(with: settings, delegate: self)
    }
}

// Delegate Pattern for Async Operations
extension CameraManager: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // Bail out if the capture failed rather than silently ignoring the error
        guard error == nil,
              let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }

        // Publish on the main queue so SwiftUI updates safely
        DispatchQueue.main.async {
            self.capturedImage = image
        }
    }
}

The CameraManager serves as the central coordinator for all camera operations. It's implemented as an ObservableObject to integrate seamlessly with SwiftUI's state management.

And now, let's break this code down:

  • Session Configuration Pattern: The beginConfiguration()/commitConfiguration() pattern is crucial for AVFoundation. It ensures all session modifications are applied atomically, preventing intermediate invalid states that could crash your app or cause unpredictable behavior
  • Lifecycle Management: Proper session cleanup prevents memory leaks and ensures the camera resource is available for other apps
  • Preview Layer Management: The preview layer uses lazy initialization to avoid creating unnecessary resources
  • Delegate Pattern for Async Operations: Photo capture is asynchronous, requiring delegate pattern implementation
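
One thing the manager above doesn't cover is authorization. Camera access requires an NSCameraUsageDescription entry in your Info.plist (the app crashes on first access without it), and the session should only be started once the user has granted permission. Here's a minimal sketch of a permission check you could call before startSession():

import AVFoundation

// Sketch: request camera permission before starting the session.
// The requestAccess completion runs on an arbitrary queue, so hop to
// the main queue before touching any UI state in your own handler.
func requestCameraAccess(completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        completion(true)
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video) { granted in
            completion(granted)
        }
    default:
        // .denied or .restricted: the user must change this in Settings
        completion(false)
    }
}

You'd call requestCameraAccess before cameraManager.startSession(), for example from onAppear, and only start the session when the completion reports true.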

Why UIKit Is Still Required

Now that we have our camera logic in place, we need to show the preview in a SwiftUI view. This is where UIKit comes in.

Understanding the differences between SwiftUI’s declarative model and UIKit’s imperative model helps clarify why integration is necessary and how to make it seamless.

  • SwiftUI has a declarative approach: You describe what the UI should look like based on state, and SwiftUI handles the how
  • UIKit has an imperative approach: You explicitly tell the system how to create and manage UI components

SwiftUI currently lacks native camera components because camera functionality requires imperative management of hardware resources, session lifecycles, and real-time rendering. These concepts don't naturally fit SwiftUI's declarative model. And that's why we need to bridge UIKit components into SwiftUI views.

Let's build the CameraPreview.
First, to embed the preview layer into a SwiftUI view, we wrap a UIKit UIView using UIViewRepresentable:

import SwiftUI
import AVFoundation

struct CameraPreview: UIViewRepresentable {
    let sessionLayer: AVCaptureVideoPreviewLayer

    func makeUIView(context: Context) -> UIView {
        // Create the host view imperatively and attach the preview layer to it
        let view = UIView()
        sessionLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(sessionLayer)
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        // Defer the frame update so the view's bounds are final after layout
        DispatchQueue.main.async {
            sessionLayer.frame = uiView.bounds
        }
    }
}
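
Setting the layer's frame from updateUIView inside DispatchQueue.main.async works, but it relies on the view's bounds being final by the time the closure runs. A more robust variant, sketched below along the lines of Apple's AVCam sample, is a UIView subclass whose backing layer is the preview layer itself, so UIKit keeps the frame in sync automatically. Note that this variant takes the AVCaptureSession directly, so CameraManager would need to expose its session:

import SwiftUI
import AVFoundation

// A UIView whose backing layer *is* an AVCaptureVideoPreviewLayer,
// so the layer always matches the view's bounds without manual frame updates
final class PreviewUIView: UIView {
    override class var layerClass: AnyClass {
        AVCaptureVideoPreviewLayer.self
    }

    var previewLayer: AVCaptureVideoPreviewLayer {
        layer as! AVCaptureVideoPreviewLayer
    }
}

struct CameraPreviewAlternative: UIViewRepresentable {
    let session: AVCaptureSession

    func makeUIView(context: Context) -> PreviewUIView {
        let view = PreviewUIView()
        view.previewLayer.session = session
        view.previewLayer.videoGravity = .resizeAspectFill
        return view
    }

    func updateUIView(_ uiView: PreviewUIView, context: Context) {
        // Nothing to do: the layer resizes with the view automatically
    }
}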

SwiftUI Camera Interface

Now we bring everything together into a SwiftUI interface that displays the live camera feed, captures a photo on tap, and shows a preview with “Retake” and “Use Photo” options.

struct CameraView: View {
    @ObservedObject var cameraManager: CameraManager
    var onPhotoCaptured: (UIImage) -> Void
    @Environment(\.presentationMode) var presentationMode
    
    @State private var showPreview = false
    @State private var capturedPhoto: UIImage?
    
    var body: some View {
        ZStack {
            if let capturedPhoto, showPreview {
                Image(uiImage: capturedPhoto)
                    .resizable()
                    .scaledToFill()
                    .ignoresSafeArea()
                
                VStack {
                    Spacer()
                    HStack {
                        Button("Retake") {
                            self.capturedPhoto = nil
                            self.showPreview = false
                        }
                        Button("Use Photo") {
                            onPhotoCaptured(capturedPhoto)
                            presentationMode.wrappedValue.dismiss()
                        }
                    }
                    .padding()
                    .background(Color.white.opacity(0.8))
                    .cornerRadius(10)
                }
            } else {
                GeometryReader { _ in
                    CameraPreview(sessionLayer: cameraManager.getPreviewLayer())
                        .ignoresSafeArea()
                }
                VStack {
                    Spacer()
                    Button(action: {
                        cameraManager.takePhoto()
                    }) {
                        Circle()
                            .fill(Color.white)
                            .frame(width: 70, height: 70)
                            .overlay(Circle().stroke(Color.black, lineWidth: 2))
                            .padding(.bottom, 20)
                    }
                }
            }
        }
        .onAppear { cameraManager.startSession() }
        .onDisappear { cameraManager.stopSession() }
        .onChange(of: cameraManager.capturedImage) { image in
            if let img = image {
                self.capturedPhoto = img
                self.showPreview = true
            }
        }
    }
}

State Management

This implementation demonstrates the proper balance between @State and @ObservedObject:

  • @ObservedObject for cameraManager: Shares camera state across views and survives view updates; the view that creates the instance should own it with @StateObject, as PhotoView does below
  • @State for local UI state: Manages preview mode and temporary photo storage that's specific to this view instance

Callback Pattern Implementation

The onPhotoCaptured callback allows parent views to handle captured photos without tight coupling, making the camera view reusable across different contexts.
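
As a quick illustration of that reuse, the same CameraView can serve a completely different flow just by passing a different closure. AvatarPickerView here is hypothetical, not part of this article's app:

import SwiftUI
import UIKit

// Hypothetical second consumer of CameraView: an avatar picker
// that keeps only the most recent photo instead of a gallery
struct AvatarPickerView: View {
    @StateObject private var cameraManager = CameraManager()
    @State private var avatar: UIImage?
    @State private var showCamera = false

    var body: some View {
        VStack {
            if let avatar {
                Image(uiImage: avatar)
                    .resizable()
                    .scaledToFit()
            }
            Button("Change avatar") { showCamera = true }
        }
        .fullScreenCover(isPresented: $showCamera) {
            CameraView(cameraManager: cameraManager) { image in
                avatar = image   // handle the photo however this context needs
            }
        }
    }
}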

Let’s see how to use this camera component in a SwiftUI interface, like a basic gallery of memories:

struct PhotoView: View {
    let columns = [GridItem(.flexible()), GridItem(.flexible())]
    
    @State private var showCamera = false
    @State private var images: [UIImage] = []
    // StateObject, not State: SwiftUI keeps the reference-type manager alive across view updates
    @StateObject private var tempCameraManager = CameraManager()
    
    var body: some View {
        NavigationView {
            VStack {
                if images.isEmpty {
                    VStack(spacing: 16) {
                        Text("You don't have photos!")
                            .font(.title3)
                        Text("Tap on **+** to start adding.")
                            .foregroundColor(.secondary)
                    }
                } else {
                    ScrollView {
                        LazyVGrid(columns: columns, spacing: 8) {
                            ForEach(images.indices, id: \.self) { index in
                                Image(uiImage: images[index])
                                    .resizable()
                                    .aspectRatio(contentMode: .fill)
                                    .frame(width: 195, height: 195)
                                    .cornerRadius(10)
                            }
                        }
                        .padding()
                    }
                }
            }
            .toolbar {
                ToolbarItem(placement: .navigationBarTrailing) {
                    Button("", systemImage: "plus") {
                        showCamera.toggle()
                    }
                }
            }
            .navigationTitle("Memories")
            .fullScreenCover(isPresented: $showCamera) {
                CameraView(cameraManager: tempCameraManager) { image in
                    images.append(image)
                }
            }
        }
    }
}

Using the .fullScreenCover modifier ensures the camera takes full advantage of the screen real estate and provides an immersive capture experience. Because stopSession() removes the session's inputs and outputs on dismissal, the same CameraManager instance can be reused safely each time the camera is presented.

To wrap everything up, SwiftUI doesn't yet offer a native camera API, but using the AVFoundation and UIKit frameworks, you can build fully functional, real-time photo capture features with live previews and reactive UI updates.

Using AVFoundation for hardware control and media capture, UIKit (via UIViewRepresentable) for preview rendering, and SwiftUI for state management and app structure lets you keep the modern SwiftUI development experience while still accessing advanced device features.