
I built two apps with just my voice and a mouse – are IDEs already obsolete?



ZDNET’s key takeaways

  • AI coding replaces edit and debug with instruct and guide.
  • Terminal plus AI replaces traditional development environments.
  • IDEs are reduced to build and deployment tools only.

My little dog Pixel does not like my couch keyboard. It often occupies a space on my lap that my 12-year-old Yorkipoo considers his sovereign territory. In his view, quality cuddling must not be compromised just because I want to get some writing or coding done.

That was the case last night, when my very good boy climbed up onto my shoulder, snuggled in, and fell asleep. It was also the moment when I realized powerful development environments like VS Code and Xcode are effectively obsolete.

My vibe coding projects

I am working on two Apple programming projects, each of which will run on iPhones, iPads, Macs, and Apple Watches. I’m building a total of eight binaries that will eventually be distributed via the Apple App Store.


One project is a filament management app that helps me keep track of rolls of 3D printer filament. I have 120 spools that live on four storage racks, each with five shelves. The spools are constantly moved between the racks and my eight 3D printers. Five of the printers can hold four spools at once, one can hold eight, and two are limited to one spool each.
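For concreteness, the storage topology described above can be modeled with a few simple types. This is a hypothetical sketch of my own (in Python, for illustration; the real apps are built for Apple platforms), not the app's actual code, and names like `Spool` locations and `SLOT_CAPACITY` are assumptions:

```python
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Shelf:
    rack: int   # 1..4
    level: int  # 1..5

@dataclass(frozen=True)
class PrinterSlot:
    printer: int  # 1..8
    slot: int

Location = Union[Shelf, PrinterSlot]

# Per-printer slot capacity: five printers take 4 spools, one takes 8, two take 1.
SLOT_CAPACITY = {1: 4, 2: 4, 3: 4, 4: 4, 5: 4, 6: 8, 7: 1, 8: 1}

class Inventory:
    def __init__(self) -> None:
        self.locations: dict[int, Location] = {}  # spool id -> current location

    def move(self, spool_id: int, dest: Location) -> None:
        """Record a spool's new location, enforcing printer slot limits."""
        if isinstance(dest, PrinterSlot):
            if not 1 <= dest.slot <= SLOT_CAPACITY[dest.printer]:
                raise ValueError("printer has no such slot")
            if any(loc == dest for sid, loc in self.locations.items() if sid != spool_id):
                raise ValueError("slot already occupied")
        self.locations[spool_id] = dest
```

In a model like this, scanning an NFC tag would simply resolve to a spool ID and trigger a `move` call with the scanned destination.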

The iPhone app uses NFC tags to make it super easy to track the movement of these spools, and relies on the built-in camera to capture a reference image of each spool. The Watch app lets me check and update a spool's location, while the Mac app provides a desktop view of the filament inventory.

The second project was initially based on the filament management system, but has become so much more. This project manages both physical and digital sewing patterns. Many sewists, like my wife, collect patterns in the hundreds and thousands, and keeping track of them is often quite a challenge.

My code uses NFC tags and photos to manage the physical patterns. The code also adds a ton of device-side AI to parse patterns and discover the name, category, vendor, and other relevant field data. This approach prevents the user from having to type all that information into the program. The sewing app adds a deep set of cataloging tools and features to the more basic functionality of what started as the filament management app.
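The field extraction step can be illustrated without any AI at all. Here is a deliberately simplified stand-in (in Python, for illustration; the real app uses on-device AI, and the vendor list and regex below are my assumptions, not the app's logic):

```python
import re

# Illustrative vendor list; a real catalog would be far larger.
KNOWN_VENDORS = {"simplicity", "mccall's", "butterick", "vogue", "burda"}

def parse_pattern_title(title: str) -> dict:
    """Extract vendor, pattern number, and name from a title like
    "Simplicity 8601 - Misses' Dress". Missing fields come back empty."""
    vendor = ""
    for v in KNOWN_VENDORS:
        if title.lower().startswith(v):
            vendor = title[:len(v)]
            break
    number_match = re.search(r"\b(\d{3,5})\b", title)
    number = number_match.group(1) if number_match else ""
    # Whatever follows a dash is treated as the human-readable name.
    name = title.split("-", 1)[1].strip() if "-" in title else ""
    return {"vendor": vendor, "number": number, "name": name}
```

The point of the real on-device AI is to handle the cases a rigid parser like this can't: every pattern company formats its metadata differently.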


The filament app is pretty far along. I’ve been using it actively for about three months, and I’m about ready to start adding the various in-app purchase features. The sewing patterns app is still in fairly early development. It’s taken a lot of time to get the on-device AI to work reliably, given the many variations in pattern formats and how differently each company does things. There are still many user interface elements that need to be designed and wired into all four app platforms.

The new vibe coding loop

Don’t ever let anyone tell you that you can vibe code just by saying a few words or snapping your fingers. As you can see above, these products are complex, even though the AI does the actual coding.


Back in the old-school coding days, there existed a development loop that could be described as edit→build→test→debug, and then back to edit.

All integrated development environments (IDEs) are built around this loop. The bulk of the IDE interface is structured around a file tree and editing support tools, from syntax coloring and command completion to the handy vertical markers that show where loops begin and end. The IDE also includes a debugger: you set breakpoints in the editor and step through the code line by line, watching it run.

Vibe coding also has a loop, and it’s very similar.

Instead of edit, it’s instruct: you prompt the AI about what you want. Build stays the same. The code still has to be turned into a working program, whether by interpreting it or by compiling and linking it.

The test stage also remains the same. But instead of doing the debugging yourself, you have to guide the AI. The AI can find and fix coding errors, but often needs guidance to find where the problem occurred. You can’t just say “fix it” and assume the AI can do it. For a fair number of bugs, it needs some experienced guidance.

So the loop becomes instruct→build→test→guide, and then cycle back to instruct.


Notice that the vibe coding loop doesn’t really have the edit and debug elements. Most people choose and customize their development environments to optimize editing and debugging, since that’s where most of the time has traditionally been spent.

With vibe coding, most of the time is spent in a chat interface, often just a terminal window. The only time you need to touch the development environment is to initiate a build. Then you run the program you’ve been working on, see what works, and go back to the chat or terminal interface to guide the AI.

Almost no time is spent using an IDE for what we have historically needed an IDE to do.

One-handed coding

This brings us back to Pixel. Last night, he snuggled onto my left shoulder, which meant my left arm and hand were occupied. That left me unable to use the physical keyboard. But I could control my mouse with my right hand, and talk to the AI with my voice.

I have a mouse button programmed to hit the return key, and another to launch Wispr Flow, the dictation software I’m using on my Mac.


All my actual coding work is done inside iTerm2, a free macOS terminal program that I set up with multiple tabs, one for each project.

So here’s what my cycle was for two hours last night:

  • Scratch the dog’s back and say something soothing.
  • Bring iTerm2 to the front.
  • Hit the Wispr Flow button on my mouse, dictate instructions to the AI, then press the Return key on my mouse.
  • Wait for the AI to run, or switch to another tab and do the same for a different project.
  • When the AI is done running, switch to Xcode (an IDE) and do a build.
  • Wait for my program to build, switch to it when it’s done, and test.
  • Switch back to the terminal program, report the results to the AI, and give it a new set of instructions.

I did that process for two hours straight. I moved both projects along considerably. The only thing I ever did in the IDE was select the menu item that sends the test code through Apple to my device.

If I had been making something other than an Apple app, I could even have initiated the build using the AI. I wouldn’t have had to use an IDE at all.

I’m starting to think IDEs are obsolete

Tonight, I did some more work on my programs. Right now, Pixel is snuggled against my wife, so I have both hands free to work. I put in another couple of hours “coding” and still didn’t use the Xcode IDE for anything other than sending code to TestFlight, Apple’s beta-testing service.


Last year, before I started to vibe code big projects in earnest, I thought I’d need an AI-enabled IDE. So I moved all my coding from PhpStorm, a much-beloved IDE I used for my WordPress plugins, over to VS Code. I wrote about that move in an article, making serious noises about how choosing the right IDE is important for fully using the AI features.

I had no idea how wrong that take would turn out to be.

For the past few days, I haven’t used the editor or the debugger once.

With one hand and voice dictation, I worked on two completely separate Mac applications. I worked in a simple terminal program with two color-coded windows, plus a third that consolidates work on the two applications.

I’m no longer avoiding using the IDE because I don’t have a free hand. Tonight I’m not using the IDE because it’s completely unnecessary.


The terminal-and-voice-dictation process is surprisingly chill, apart from the slightly uneasy feeling that reminds me how odd this approach is, especially for someone with decades of deep emotional attachment to the entire concept of an IDE.

Have you also found yourself spending more time in chat interfaces than inside your IDE? Let us know in the comments below.


You can follow my day-to-day project updates on social media. Be sure to subscribe to my weekly update newsletter, and follow me on Twitter/X at @DavidGewirtz, on Facebook at Facebook.com/DavidGewirtz, on Instagram at Instagram.com/DavidGewirtz, on Bluesky at @DavidGewirtz.com, and on YouTube at YouTube.com/DavidGewirtzTV.
