Log Colorizer: I Vibe-Coded This & You'll Love It!
I can’t code. I know, I know—these days, that sounds like an excuse. Anyone can code, right?! Grab some tutorials, maybe an O’Reilly book, download an example project, and jump in. It’s just a matter of learning how to break your project into small steps that you can make the computer do, then memorizing a bit of syntax. Nothing about that is hard! Perhaps you can sense my sarcasm (and sympathize with my lack of time to learn one more technical skill). The reality is, while I can follow pseudocode and grasp basic concepts like conditionals and loops, building a working application beyond “hello world” feels daunting. And frankly, I’ve lost the motivation to become proficient.
The Rise of AI-Assisted Coding
Thankfully, AI is changing the game. Like my colleague Benj Edwards, I can now leverage large language models (LLMs) to tackle projects that would previously have been out of reach, without facing the scrutiny of seasoned developers on platforms like Stack Overflow. So, I decided to give it a shot.
The Project: A Python-Based Log Colorizer
My project is a small Python-based log colorizer, built with the help of Claude Code. You can find a version of the project, without my specific customizations, on GitHub. The need arose from a desire to analyze large web server logs, and existing solutions lacked the customization I required. Essentially, I wanted a tool tailored to my exact needs: a true “vibe-coding” experience.
My Nginx log colorizer in action, showing Space City Weather traffic on a typical Wednesday afternoon. Here, I’m running two instances, one for IPv4 visitors and one for IPv6. (By default, all traffic is displayed, but splitting it this way makes things easier for my aging eyes to scan.)
Credit: Lee Hutchinson
Why a Log Colorizer?
There were two primary reasons for this project. First, I needed a way to efficiently sift through extensive web server logs, and off-the-shelf colorizers weren’t flexible enough. Second, the project’s size was manageable: a roughly 400-line, single-file Python script. This made it easy to audit, even for someone with limited coding experience, and fit comfortably within Claude Code’s context window.
The Context: Space City Weather Hosting
I manage the web hosting for Eric Berger’s Houston-area forecasting site, Space City Weather. It’s a self-hosted WordPress site running on an AWS EC2 t3a.large instance, fronted by Cloudflare using CF’s WordPress Automatic Platform Optimization. We also use self-hosted Discourse for commenting, integrated with WordPress via the WP-Discourse plugin.
The Intermittent Caching Issue
Since integrating Discourse in August 2025, we’ve experienced an intermittent issue where new posts sometimes get cached by Cloudflare with the old WordPress comment area instead of the new Discourse comments. This impacts hundreds of visitors until the cache expires. Edge cache invalidation *should* be automatic, but it wasn’t always reliable.
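When chasing a problem like this, it helps that Cloudflare reports whether a response came from its edge cache in the cf-cache-status response header (values like HIT, MISS, EXPIRED, BYPASS, and DYNAMIC). A tiny helper along these lines makes spot-checking easier; the function name and the UNCACHED fallback are my own inventions, not anything from the article's setup:

```python
def cache_state(headers: dict) -> str:
    """Classify a response by Cloudflare's cf-cache-status header.

    HIT means the edge served a cached copy; MISS or EXPIRED mean the
    request went to origin; BYPASS and DYNAMIC mean Cloudflare chose
    not to cache the response at all.
    """
    # HTTP header names are case-insensitive, so normalize them first.
    normalized = {k.lower(): v for k, v in headers.items()}
    return normalized.get("cf-cache-status", "UNCACHED").upper()

print(cache_state({"CF-Cache-Status": "hit"}))
print(cache_state({"Content-Type": "text/html"}))
```

Feed it the headers from any fetch of a post page and you can see at a glance whether a visitor got the edge-cached copy or a fresh one from origin.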
A Quick PHP Fix (Vibe-Coded!)
I used an LLM to create a small PHP mu-plugin that forces WordPress to add “DO NOT CACHE ME!” headers to post pages until Discourse comments are verified. This preemptively solved the problem, but didn’t address the root cause. After temporarily disabling the plugin, the issue resurfaced, confirming the underlying problem persisted.
The Pain of Intermittent Problems
Troubleshooting intermittent issues is incredibly frustrating. You question your skills and sanity, spiraling into a cycle of testing and uncertainty. In this case, I couldn’t reliably reproduce the problem, making diagnosis extremely difficult. My best hope lay in analyzing the server logs.
The Vibe Use Case: Nginx Log Analysis
Space City Weather uses Nginx as its web server. Nginx generates two log files: one for requests and another for errors. I wanted to monitor the access log in real time while Eric published new posts. However, staring at a wall of text is inefficient. I needed syntax highlighting and colorization to identify important information. While tools like ccze exist, customizing their output is complex and time-consuming.
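The core of any such tool is pulling the fields out of each log line. Here's a minimal sketch of parsing a line in Nginx's default "combined" log format; the group names are my own labels, not anything from the actual script:

```python
import re

# Regex for the leading fields of Nginx's default "combined" log
# format: remote address, timestamp, request line, status, bytes sent.
LOG_PATTERN = re.compile(
    r'(?P<remote>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<uri>\S+) (?P<proto>[^"]+)" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-)'
)

line = ('203.0.113.7 - - [08/Oct/2025:14:02:11 -0500] '
        '"GET /feed/ HTTP/2.0" 200 5123')
m = LOG_PATTERN.match(line)
print(m.group("remote"), m.group("status"), m.group("uri"))
```

Once the line is split into named fields like this, colorizing is just a matter of wrapping each field in the right escape codes before reassembling the line.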
Building the Log Colorizer with Claude Code
That’s where Claude Code came in. I prompted it to create an Nginx log colorizer, prioritizing efficiency and performance. Claude suggested Python due to its regex support and readability. The “vibe-coding” process involved two sessions, limited by Claude Code credit constraints.
Visual Studio Code, with agentic LLM integration, making with the vibe-coding.
Credit: Lee Hutchinson
The Allure of Effortless Coding
The ease of making small changes and improvements with the LLM was intoxicating. It felt like collaborating with a team of experts. The ability to simply *ask* for a feature and have it implemented instantly is incredibly empowering. However, as Benj Edwards points out, this ease can be deceptive.
Features of the Log Colorizer
The final log colorizer boasts several features:
- Handles multiple Nginx and Apache log file formats
- Colorizes using 256-color ANSI codes
- Organizes hostname & IP addresses in fixed-length columns
- Colorizes HTTP status codes and cache status
- Applies different colors to the request URI based on resource
- Highlights non-HTTPS requests
- Filters output by IPv4 or IPv6 hosts
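Two of those features are easy to sketch in a few lines of Python: status-code coloring with 256-color ANSI escapes, and telling IPv4 hosts from IPv6 ones for filtering. The color choices and function names here are mine, not the script's:

```python
import ipaddress

def color_status(status: str) -> str:
    """Wrap an HTTP status code in a 256-color ANSI escape.

    2xx green, 3xx yellow, 4xx orange, 5xx red, anything else gray.
    """
    code = {"2": 40, "3": 220, "4": 208, "5": 196}.get(status[0], 250)
    return f"\033[38;5;{code}m{status}\033[0m"

def is_ipv6(host: str) -> bool:
    """Return True if host parses as an IPv6 address."""
    try:
        return isinstance(ipaddress.ip_address(host), ipaddress.IPv6Address)
    except ValueError:
        return False

print(color_status("404"))
print(is_ipv6("2001:db8::1"), is_ipv6("203.0.113.7"))
```

With a predicate like is_ipv6 in hand, running two instances split by address family, as in the screenshot above, is just a one-line filter on each parsed line.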
The final product. She may not look like much, but she’s got it where it counts, kid.
Credit: Lee Hutchinson
Root Cause Analysis: Apple News Bots
With the colorizer in hand, I quickly identified the root cause of the caching issue: Apple News bots requesting the page *before* Discourse comments were fully loaded. This resulted in Cloudflare caching a version without the comments. The solution I had implemented was a workaround, not a fix.
The Limits of LLM-Assisted Coding
While LLMs are powerful, they aren’t intelligent. They are tools that require careful guidance and validation. They excel at tasks you already understand, but can lead you astray if you’re unfamiliar with the problem space. I learned this the hard way when attempting to improve the colorizer’s performance by adding horizontal scrolling.
If you listen carefully, you can hear the sound of my expectations crashing hard into reality.
Credit: Paramount Television
The Scrolling Disaster
I attempted to add horizontal scrolling to the log output, but every implementation drove CPU usage way up. Despite numerous iterations and prompts, the LLM couldn’t deliver a performant solution. Eventually I realized I was solving the problem in the wrong place: scrolling is the terminal’s job, not the log colorizer’s. A pager like less -RS, which preserves ANSI colors and chops long lines instead of wrapping them, already handles it for free.
Lessons Learned
This experience reinforced my view of LLMs: they are valuable tools for augmenting existing skills, but they cannot replace them. They can help you solve problems you mostly understand, but they can’t give you that understanding. Using them effectively requires critical thinking, validation, and a healthy dose of skepticism.
Despite the setbacks, I continue to experiment with vibe coding. It’s a fun and productive way to tackle projects I wouldn’t have attempted otherwise. The key is to be aware of the limitations and to use these tools responsibly. As AI continues to evolve, it will undoubtedly become an increasingly integral part of the software development process. And when screwing around with computers stops being fun, that’s when I’ll know I’ve truly become old.
GearTech is a leading source for technology news and reviews.