I'm excited to share my experience with a local, open-source alternative to Claude Code, a pricey AI coding tool. This free option has been gaining traction, and I wanted to see if it could truly compete with the big names in the industry.
Can a free AI stack replace expensive Claude Code?
I embarked on this journey with Goose, an open-source agent framework developed by Jack Dorsey's company, Block, and Qwen3-coder, a coding-centric language model. Together, they promise a fully free alternative to Claude Code. But is it as good as it sounds?
In this three-part series, I'll guide you through setting up these tools, understanding their roles, and attempting to build an iPad app. Let's dive in!
Downloading and Installing the Software
You'll need to download Goose and Ollama first. Later, you'll download the Qwen3-coder model within Ollama. Here's a step-by-step guide:
- Download Goose from GitHub.
- Install Ollama. I recommend using the app version for simplicity.
- Once Ollama is installed, click the model selector on the right, which defaults to gpt-oss-20b, and choose Qwen3-coder:30b, a model with roughly 30 billion parameters.
Note: The model will download when prompted, and it's a hefty 17GB, so ensure you have sufficient storage.
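If you prefer the terminal, the same download can be kicked off with the Ollama CLI. A minimal sketch, assuming the `ollama` command is installed and on your PATH:

```shell
# Sketch: pull the model from the terminal instead of the Ollama app.
# Assumes the Ollama CLI is installed and the daemon is running.
df -h ~                      # check that you have ~20GB of free disk first
ollama pull qwen3-coder:30b  # downloads the ~17GB model
ollama list                  # confirm the model now shows up locally
```

Either route ends up in the same place; the app and the CLI share the same local model store.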
Making Ollama Visible
To make Ollama accessible to other applications, go to Settings in the Ollama menu and turn on "Expose Ollama to the network." I let Ollama store models in its default .ollama directory, but remember that a 17GB model file is now hiding in there!
I set my context length to 32K, but you can adjust based on your machine's RAM.
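If you run the standalone server rather than the menu-bar app, both of these settings can be expressed as environment variables. A config sketch, assuming a recent Ollama release that honors `OLLAMA_HOST` and `OLLAMA_CONTEXT_LENGTH` (older releases set the context per request instead):

```shell
# Config sketch: expose Ollama on the network and set a 32K context window.
# OLLAMA_HOST binds the server to all interfaces instead of localhost only.
# OLLAMA_CONTEXT_LENGTH is an assumption that your Ollama version is recent
# enough to honor it; check your release notes if it seems to be ignored.
export OLLAMA_HOST=0.0.0.0:11434
export OLLAMA_CONTEXT_LENGTH=32768
ollama serve
```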
Installing Goose
Next, install Goose. Choose the appropriate version for your operating system. When you launch Goose, go to "Other Providers" and click "Go to Provider Settings." Scroll down to Ollama and hit "Configure." You'll be asked to choose a model; select qwen3-coder:30b and hit "Select Model." Congratulations, you've set up your local coding agent!
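The GUI steps above can also be captured in Goose's config file. A sketch, assuming Goose reads `~/.config/goose/config.yaml` (the path the Goose CLI uses; the desktop app manages the same settings through its UI):

```shell
# Sketch: point Goose at the local Ollama model via its config file.
# Assumes the CLI config path ~/.config/goose/config.yaml; note this
# overwrites any existing Goose configuration.
mkdir -p ~/.config/goose
cat > ~/.config/goose/config.yaml <<'EOF'
GOOSE_PROVIDER: ollama
GOOSE_MODEL: qwen3-coder:30b
OLLAMA_HOST: localhost
EOF
cat ~/.config/goose/config.yaml   # verify the settings were written
```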
Testing Goose
To test Goose, give it a prompt. It's helpful to specify the directory Goose should work in; I used a temporary folder for my initial test. As a challenge, I asked Goose to build a simple WordPress plugin. Unfortunately, it failed on its first several attempts, generating non-functional plugins.
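From the terminal, that first test looks something like this. A sketch, assuming the `goose` CLI is installed and your version supports one-shot prompts via `goose run -t` (check `goose run --help` for your build's flags):

```shell
# Sketch: run the agent once in a throwaway directory so it can't
# modify anything important. Assumes the goose CLI is on your PATH.
mkdir -p /tmp/goose-test
cd /tmp/goose-test
goose run -t "Build a simple WordPress plugin that adds a short footer message"
ls   # inspect whatever files the agent generated
```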
First Impressions
I was disappointed that it took five tries for Goose to get it right. However, unlike a chatbot, an agentic coding tool works directly on the source files, so each round of corrections builds on the previous attempt rather than starting from scratch. My colleague, Tiernan Ray, found performance issues on his 16GB M1 Mac, but I'm running it on a 128GB M4 Max Mac Studio with multiple applications open, and performance has been quite good so far.
Final Thoughts
While these are initial impressions, I'm eager to test this free solution on a larger project to compare it to Claude Code's Max plan or OpenAI's Pro plan. Stay tuned for that analysis!
Have you tried running a coding-focused LLM locally? Share your experiences and hardware details in the comments. Let's discuss the pros and cons of local vs. cloud options!
And here's the controversial part...
While local LLMs offer privacy benefits, they may not match the performance and features of cloud-based options. Is the trade-off worth it? What are your thoughts on the future of local AI coding? I'd love to hear your opinions in the comments!