Building Playground Composer with Apple's Image Playground Framework

Apple’s Image Playground framework gives iOS apps a system-native way to create images from user concepts. For a portfolio project, that is much more interesting than building another generic image generation wrapper: the engineering challenge becomes how to make image creation feel like part of the operating system, while still keeping the app architecture testable and product-shaped.

This post walks through Playground Composer, a small SwiftUI app that demonstrates three integration paths:

  • A SwiftUI system sheet using imagePlaygroundSheet(...)
  • Programmatic generation using ImageCreator
  • A UIKit bridge using ImagePlaygroundViewController

The app also includes device capability checks, local result persistence, unsupported-device fallbacks, and focused unit tests.

Product Goal

Playground Composer is a compact sample app for exploring system-native image generation workflows on iOS. The goal is not to train a model or build a full creative suite. Instead, the goal is to show how Apple’s official framework can be surfaced in realistic iOS workflows.

Playground Composer has three tabs:

  • Compose: a primary SwiftUI workspace that opens Apple’s Image Playground sheet.
  • Create: a programmatic generation flow powered by ImageCreator.
  • Demos: reusable generation entry points embedded in avatar, sticker, and note illustration scenarios.

That structure intentionally separates system UI integration, async API usage, and reusable product embedding.

Core Data Model

The app uses a small request model to keep prompt validation and style selection explicit:

struct GenerationRequest: Equatable {
    var prompt: String
    var style: GenerationStyleOption
    var limit: Int

    var trimmedPrompt: String {
        prompt.trimmingCharacters(in: .whitespacesAndNewlines)
    }

    var isValid: Bool {
        !trimmedPrompt.isEmpty && (1...4).contains(limit)
    }
}

The style enum wraps Apple’s generation styles in app-facing labels:

enum GenerationStyleOption: String, CaseIterable, Identifiable {
    case animation
    case illustration
    case sketch

    // Identifiable isn't synthesized for raw-value enums, so supply id explicitly.
    var id: String { rawValue }

    var playgroundStyle: ImagePlaygroundStyle {
        switch self {
        case .animation:
            .animation
        case .illustration:
            .illustration
        case .sketch:
            .sketch
        }
    }
}

This keeps UI copy, symbols, subtitles, and Apple API values in one place. It also makes tests simple because the app has stable identifiers like "animation", "illustration", and "sketch".
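The UI-facing half of that mapping is not shown above. A hedged sketch of what it could look like, written standalone so it compiles on its own (the labels and SF Symbol names are illustrative assumptions, not the app's actual copy):

```swift
import Foundation

// Illustrative sketch: app-facing metadata living alongside the style cases.
// displayName and systemImageName are assumptions about the real app's copy.
enum GenerationStyleOption: String, CaseIterable {
    case animation, illustration, sketch

    // Human-readable label derived from the stable raw value.
    var displayName: String {
        rawValue.capitalized
    }

    // SF Symbol shown next to each style in pickers and buttons.
    var systemImageName: String {
        switch self {
        case .animation: "sparkles.tv"
        case .illustration: "paintpalette"
        case .sketch: "pencil.and.outline"
        }
    }
}
```

Keeping these properties on the same enum as `playgroundStyle` means a new style is a one-case change rather than edits scattered across views.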

Building Concepts

Image Playground expects concepts as [ImagePlaygroundConcept]. The current app supports text concepts:

enum ImagePlaygroundConceptBuilder {
    static func concepts(from prompt: String) -> [ImagePlaygroundConcept] {
        let trimmedPrompt = prompt.trimmingCharacters(in: .whitespacesAndNewlines)
        guard !trimmedPrompt.isEmpty else {
            return []
        }

        return [.text(trimmedPrompt)]
    }
}

This is deliberately small, but the boundary is useful. Later, the same builder could add extracted text, image concepts, or PencilKit drawings without changing the UI layer.
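For example, a longer-document variant could sit behind the same boundary. This is a sketch assuming the framework's `ImagePlaygroundConcept.extracted(from:title:)` case; the method name `conceptsFromDocument` is hypothetical:

```swift
import ImagePlayground

// Hypothetical extension: build concepts from a longer document instead of a short prompt.
extension ImagePlaygroundConceptBuilder {
    static func conceptsFromDocument(_ text: String, title: String? = nil) -> [ImagePlaygroundConcept] {
        let trimmed = text.trimmingCharacters(in: .whitespacesAndNewlines)
        guard !trimmed.isEmpty else { return [] }
        // .extracted asks the framework to pull key ideas out of longer text.
        return [.extracted(from: trimmed, title: title)]
    }
}
```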

SwiftUI System Sheet Integration

The reusable GenerationLauncher component owns the system sheet integration. It takes a prompt binding, selected style binding, and shared image library:

struct GenerationLauncher: View {
    @Environment(\.supportsImagePlayground) private var supportsImagePlayground

    let title: String
    @Binding var prompt: String
    @Binding var selectedStyle: GenerationStyleOption
    let library: GeneratedImageLibrary

    @State private var isPresented = false

    private var request: GenerationRequest {
        GenerationRequest(prompt: prompt, style: selectedStyle, limit: 1)
    }
}

request is a computed property because prompt and selectedStyle are bindings. Every time the UI reads request, it reflects the latest input.

The button opens the official Image Playground sheet:

Button {
    isPresented = true
} label: {
    Label(title, systemImage: "sparkles")
        .frame(maxWidth: .infinity)
}
.disabled(!supportsImagePlayground || !request.isValid)
.imagePlaygroundGenerationStyle(
    selectedStyle.playgroundStyle,
    in: GenerationStyleOption.allCases.map(\.playgroundStyle)
)
.imagePlaygroundSheet(
    isPresented: $isPresented,
    concepts: ImagePlaygroundConceptBuilder.concepts(from: request.trimmedPrompt),
    sourceImage: nil,
    onCompletion: { url in
        library.saveTemporaryImage(
            at: url,
            prompt: request.trimmedPrompt,
            style: selectedStyle
        )
    },
    onCancellation: {}
)

Two details matter here:

  1. The launcher checks supportsImagePlayground before enabling the button.
  2. The completion handler receives a temporary URL; the app copies the file into its own sandbox, because the system may clean up the temporary location before the image is displayed again.

Programmatic Generation with ImageCreator

The Create tab demonstrates the lower-level API:

protocol ProgrammaticImageGenerating: Sendable {
    func generate(
        concepts: [ImagePlaygroundConcept],
        style: ImagePlaygroundStyle,
        limit: Int
    ) async throws -> [CGImage]
}
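Because the protocol is the only seam the Create flow depends on, a stub conformance can back previews and simulator demos. A sketch under that assumption (`StubImageGenerationService` is hypothetical; it renders flat-color placeholders with CoreGraphics instead of invoking the model):

```swift
import CoreGraphics
import ImagePlayground

// Hypothetical stub: satisfies ProgrammaticImageGenerating without touching ImageCreator.
struct StubImageGenerationService: ProgrammaticImageGenerating {
    func generate(
        concepts: [ImagePlaygroundConcept],
        style: ImagePlaygroundStyle,
        limit: Int
    ) async throws -> [CGImage] {
        (0..<limit).compactMap { _ in
            // Draw a flat 64x64 placeholder image.
            guard let context = CGContext(
                data: nil, width: 64, height: 64,
                bitsPerComponent: 8, bytesPerRow: 0,
                space: CGColorSpaceCreateDeviceRGB(),
                bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
            ) else { return nil }
            context.setFillColor(CGColor(red: 0.85, green: 0.85, blue: 0.95, alpha: 1))
            context.fill(CGRect(x: 0, y: 0, width: 64, height: 64))
            return context.makeImage()
        }
    }
}
```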

The production implementation wraps ImageCreator:

struct ImageCreatorGenerationService: ProgrammaticImageGenerating {
    func generate(
        concepts: [ImagePlaygroundConcept],
        style: ImagePlaygroundStyle,
        limit: Int
    ) async throws -> [CGImage] {
        let creator = try await ImageCreator()
        let images = creator.images(for: concepts, style: style, limit: limit)
        var createdImages: [CGImage] = []

        for try await image in images {
            createdImages.append(image.cgImage)
        }

        return createdImages
    }
}

The important part is that images(for:style:limit:) returns an async sequence. The app consumes that sequence with for try await, which gives a natural place to support cancellation, errors, and partial result handling in a larger version.

The UI model tracks a small state machine:

enum GenerationPhase: Equatable {
    case idle
    case generating
    case completed(Int)
    case failed(String)
    case cancelled
}

That state powers the progress banner, failure message, completion count, and cancel button.
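One way to drive those phases from a cancellable task (a sketch; `CreateModel` and its property names are illustrative, and it assumes the `ProgrammaticImageGenerating` protocol above):

```swift
import ImagePlayground

// Hypothetical view model: owns the generation task so the UI can cancel it.
@MainActor
final class CreateModel: ObservableObject {
    @Published var phase: GenerationPhase = .idle
    private var generationTask: Task<Void, Never>?
    private let service: ProgrammaticImageGenerating

    init(service: ProgrammaticImageGenerating) {
        self.service = service
    }

    func generate(request: GenerationRequest) {
        phase = .generating
        generationTask = Task {
            do {
                let images = try await service.generate(
                    concepts: ImagePlaygroundConceptBuilder.concepts(from: request.trimmedPrompt),
                    style: request.style.playgroundStyle,
                    limit: request.limit
                )
                phase = .completed(images.count)
            } catch is CancellationError {
                phase = .cancelled
            } catch {
                phase = .failed(error.localizedDescription)
            }
        }
    }

    func cancel() {
        generationTask?.cancel()
        generationTask = nil
    }
}
```

Swapping `ImageCreatorGenerationService` for a stub here is what keeps the Create tab demoable without Apple Intelligence hardware.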

UIKit Bridge

Even in a SwiftUI-first app, UIKit interop is still valuable. The Demos tab includes a UIViewControllerRepresentable wrapper around ImagePlaygroundViewController:

struct UIKitPlaygroundPresenter: UIViewControllerRepresentable {
    let prompt: String
    let style: GenerationStyleOption
    let onCompletion: (URL) -> Void
    let onCancellation: () -> Void

    func makeCoordinator() -> Coordinator {
        Coordinator(onCompletion: onCompletion, onCancellation: onCancellation)
    }

    func makeUIViewController(context: Context) -> ImagePlaygroundViewController {
        let viewController = ImagePlaygroundViewController()
        viewController.delegate = context.coordinator
        viewController.concepts = ImagePlaygroundConceptBuilder.concepts(from: prompt)
        viewController.allowedGenerationStyles = GenerationStyleOption.allCases.map(\.playgroundStyle)
        viewController.selectedGenerationStyle = style.playgroundStyle
        return viewController
    }

    func updateUIViewController(_ uiViewController: ImagePlaygroundViewController, context: Context) {}
}

The coordinator receives the delegate callbacks:

final class Coordinator: NSObject, ImagePlaygroundViewController.Delegate {
    let onCompletion: (URL) -> Void
    let onCancellation: () -> Void

    init(onCompletion: @escaping (URL) -> Void, onCancellation: @escaping () -> Void) {
        self.onCompletion = onCompletion
        self.onCancellation = onCancellation
    }

    func imagePlaygroundViewController(
        _ imagePlaygroundViewController: ImagePlaygroundViewController,
        didCreateImageAt imageURL: URL
    ) {
        onCompletion(imageURL)
    }

    func imagePlaygroundViewControllerDidCancel(
        _ imagePlaygroundViewController: ImagePlaygroundViewController
    ) {
        onCancellation()
    }
}

This shows that the app can integrate the same framework from both SwiftUI and UIKit surfaces.
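Presenting the bridge from SwiftUI is then a single sheet modifier (a sketch; the surrounding state names like `showsUIKitPlayground` are illustrative):

```swift
// Hypothetical call site for the UIKit bridge inside a SwiftUI view body.
.sheet(isPresented: $showsUIKitPlayground) {
    UIKitPlaygroundPresenter(
        prompt: prompt,
        style: selectedStyle,
        onCompletion: { url in
            // Persist the result, then dismiss the sheet.
            library.saveTemporaryImage(at: url, prompt: prompt, style: selectedStyle)
            showsUIKitPlayground = false
        },
        onCancellation: { showsUIKitPlayground = false }
    )
}
```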

Persisting Results

Generated images from the system sheet arrive as temporary URLs. The app persists them through ImageStore:

func persistTemporaryImage(
    at temporaryURL: URL,
    prompt: String,
    styleID: String,
    title: String
) throws -> GeneratedImageAsset {
    try ensureDirectory()
    let destinationURL = baseURL.appendingPathComponent("\(UUID().uuidString).png")
    try fileManager.copyItem(at: temporaryURL, to: destinationURL)
    return GeneratedImageAsset(
        title: title,
        prompt: prompt,
        styleID: styleID,
        fileURL: destinationURL
    )
}

The app stores only generated image files and local metadata. It does not upload contacts, photo library contents, or personal data. The demo is intentionally privacy-friendly by default.

Unsupported Device Fallback

Image Playground availability depends on OS, hardware, and Apple Intelligence support. The app checks:

@Environment(\.supportsImagePlayground) private var supportsImagePlayground

When unsupported, the app disables real generation and shows a local sample result path. This makes the demo usable on simulators while still keeping real generation behind Apple’s capability check.

That distinction matters for portfolio work: a demo should not break just because the reviewer opens it on a simulator.
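The gating itself can be factored into a small wrapper view (a sketch; `CapabilityGatedContent` is hypothetical, but the environment key is the framework's own):

```swift
import SwiftUI
import ImagePlayground

// Hypothetical wrapper: shows real generation UI only when the device supports it.
struct CapabilityGatedContent<Real: View, Fallback: View>: View {
    @Environment(\.supportsImagePlayground) private var supportsImagePlayground
    @ViewBuilder var real: Real
    @ViewBuilder var fallback: Fallback

    var body: some View {
        if supportsImagePlayground {
            real
        } else {
            fallback
        }
    }
}
```

Each tab can then declare its fallback next to its real content instead of sprinkling capability checks through the view tree.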

Keyboard Interaction Polish

The app also handles a common usability issue: keyboard dismissal. Each scrollable tab opts into interactive keyboard dismissal:

.scrollDismissesKeyboard(.interactively)

It also adds a keyboard toolbar with a Done button:

extension View {
    func keyboardDismissToolbar() -> some View {
        toolbar {
            ToolbarItemGroup(placement: .keyboard) {
                Spacer()
                Button("Done") {
                    KeyboardDismissal.dismiss()
                }
            }
        }
    }
}

This is not the headline feature, but it makes the app feel much more complete during a recorded demo.
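`KeyboardDismissal` itself is not shown above; one common UIKit-based implementation looks like this (an assumption about how the helper works, not the app's actual code):

```swift
import UIKit

// Hypothetical helper: asks the responder chain to resign first responder,
// which dismisses the keyboard without needing a reference to the focused field.
enum KeyboardDismissal {
    static func dismiss() {
        UIApplication.shared.sendAction(
            #selector(UIResponder.resignFirstResponder),
            to: nil, from: nil, for: nil
        )
    }
}
```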

Testing Strategy

The test suite focuses on app-owned behavior:

  • Prompt trimming and request validation
  • Stable style identifiers
  • Empty prompt handling in the concept builder
  • Copying temporary result URLs into persistent storage

Example:

func testGenerationRequestTrimsPromptAndValidatesLimit() {
    let valid = GenerationRequest(prompt: " test prompt ", style: .illustration, limit: 2)
    XCTAssertEqual(valid.trimmedPrompt, "test prompt")
    XCTAssertTrue(valid.isValid)
}
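A similar test could pin down the concept builder's empty-prompt path (a sketch in the same XCTest style; the test name is illustrative):

```swift
func testConceptBuilderReturnsNoConceptsForWhitespacePrompt() {
    // Whitespace-only prompts should produce an empty concept list,
    // which in turn keeps the launch button disabled.
    XCTAssertTrue(ImagePlaygroundConceptBuilder.concepts(from: "   \n ").isEmpty)
}
```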

The official Image Playground UI is not unit-tested directly. Instead, the app keeps its own boundaries testable and leaves system framework behavior to Apple.

What This Project Demonstrates

Playground Composer is intentionally compact, but it demonstrates several production-relevant iOS skills:

  • SwiftUI composition
  • Official Apple framework integration
  • UIKit interop
  • Async sequence consumption
  • Capability-gated UI
  • Local file persistence
  • Keyboard and simulator demo polish
  • Testable app-owned logic

The project focuses on native platform integration rather than treating image generation as a standalone API call. It shows how generation can become a native product surface, not just a backend feature with a text box.


Originally published at http://runningcoconut.com/2026/04/30/Image-Playground/
Author: Huajing Lu
Posted on April 30, 2026