How MicMeter was built through AI-assisted development
The goal was simple: build a macOS menu bar app that shows real-time microphone levels and provides quick access to input device selection and volume control. No more keeping System Settings open on a side monitor.
The AI assistant created a comprehensive development plan and estimated ~320 lines of code across four main Swift files.
First roadblock: unclear instructions about configuring Info.plist entries for NSMicrophoneUsageDescription and LSUIElement. The Xcode UI has changed across versions, making generic instructions difficult to follow.
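Editing the Info.plist source directly sidesteps the version-dependent Xcode UI; the key names below are the real Info.plist keys, while the usage string is just an illustrative example:

```xml
<!-- Shown in the system permission prompt when the app first accesses the mic -->
<key>NSMicrophoneUsageDescription</key>
<string>MicMeter needs microphone access to display input levels.</string>
<!-- Run as a menu bar (agent) app with no Dock icon -->
<key>LSUIElement</key>
<true/>
```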
Swift compiler error in AudioManager.swift: Cannot convert value of type 'OSStatus' (aka 'Int32') to closure result type 'Void'. CoreAudio APIs return status codes that need proper handling.
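This error typically appears when a CoreAudio call's OSStatus return value is the last expression in a closure that Swift expects to return Void. A minimal sketch of checking the status explicitly, assuming a small throwing wrapper (the `checked` helper and error type are illustrative; the CoreAudio calls and constants are the real API):

```swift
import CoreAudio

// Illustrative helper: surface an OSStatus as a thrown error instead of
// letting it leak as a closure's implicit return value.
struct CoreAudioError: Error { let status: OSStatus }

func checked(_ status: OSStatus) throws {
    guard status == noErr else { throw CoreAudioError(status: status) }
}

// Example: read the current default input device, checking the status.
func defaultInputDevice() throws -> AudioObjectID {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDefaultInputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    var deviceID = AudioObjectID(kAudioObjectUnknown)
    var size = UInt32(MemoryLayout<AudioObjectID>.size)
    try checked(AudioObjectGetPropertyData(
        AudioObjectID(kAudioObjectSystemObject),
        &address, 0, nil, &size, &deviceID))
    return deviceID
}
```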
Two distinct issues emerged, among them UnsafeMutableRawPointer warnings when working with CFString references.

With the app functional, focus shifted to usability.
The app crashed when switching between audio devices (Bluetooth to USB mic). CoreAudio threw errors like "no object with given ID" and "IOWorkLoop: skipping cycle due to overload". This was a showstopper: the whole point of the app is quick device switching.
The same crash reproduced when switching from Bluetooth to USB. The error logs showed "AddInstanceForFactory: No factory registered for id" and HAL proxy context overload, which required deeper investigation into AVAudioEngine lifecycle management.
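Crashes like these often trace back to a tap installed on an input node whose underlying device has disappeared. A hedged sketch of the kind of teardown-and-rebuild that resolves this (the `AVAudioEngine` methods are the real API; the `LevelMeter` class structure is illustrative, not MicMeter's actual code):

```swift
import AVFoundation

// Illustrative sketch: rebuild the engine when the input device changes,
// instead of keeping a tap installed on a stale input node.
final class LevelMeter {
    private var engine = AVAudioEngine()

    func start() {
        let input = engine.inputNode
        // Query the node's current hardware format; a format cached from a
        // previous device is a common source of crashes.
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            // Compute an RMS level from `buffer` and publish it to the UI.
        }
        engine.prepare()
        try? engine.start()
    }

    // Call this from a device-change notification before touching the new device.
    func handleDeviceChange() {
        engine.inputNode.removeTap(onBus: 0)
        engine.stop()
        engine = AVAudioEngine()   // discard stale device references
        start()
    }
}
```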
After resolving the device switching issues, the app was built for release and packaged as a DMG for distribution.
Working with low-level audio APIs requires careful handling of device IDs, property listeners, and audio engine lifecycle. Race conditions and stale references are common pitfalls when devices change.
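For example, reacting to default-input-device changes means registering a property listener; the sketch below uses the real CoreAudio API (`AudioObjectAddPropertyListenerBlock`), while the callback body is illustrative:

```swift
import CoreAudio
import Foundation

var address = AudioObjectPropertyAddress(
    mSelector: kAudioHardwarePropertyDefaultInputDevice,
    mScope: kAudioObjectPropertyScopeGlobal,
    mElement: kAudioObjectPropertyElementMain)

// Fires whenever the system default input device changes. Dispatching to the
// main queue avoids racing the audio engine teardown with UI updates.
let status = AudioObjectAddPropertyListenerBlock(
    AudioObjectID(kAudioObjectSystemObject),
    &address,
    DispatchQueue.main) { _, _ in
        // Re-query the device ID here; the previously cached one may be stale.
    }
assert(status == noErr)
```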
The "vibe coding" approach of describing problems in natural language and iterating with the AI was effective. Each error message led to a targeted fix, building understanding incrementally.
Functional code isn't enough: the 1px bars and low-contrast icons were technically "working" but practically unusable. User feedback on visual details was crucial.
Keeping markdown files for each conversation phase created a useful record of the development journey and made it easy to reference past decisions.
The critical device switching bug wasn't discovered until late in development. Testing with multiple audio devices should have been part of the initial verification.
The UI issues (thin bars, gradient problems) could have been caught earlier with mockups or prototypes before diving into Swift code.
Generic Xcode instructions were confusing. Future plans should specify the Xcode version and include screenshots or exact menu paths.
CoreAudio errors were handled reactively. A proactive error handling strategy for audio device changes would have prevented the crash bugs.
The conversation phase files: 01_goal.md, 02_plan.md, 03_feedback.md, 04_error.md, 05_warnings.md, 06_ui_improvements.md, 07_complex_error.md, 08_complex_again.md.