Code as Art and the Messy Middle

My favorite memories of programming are all the same. Consumed by code. Staring at text, trying to understand it, shape it, make it do what I want.

The friction was the point. You fought the machine. You won. The dopamine hit when broken things started working was real.

AI removes that friction. This is mostly good. But something is lost.

Coding Becomes Management

The shift is subtle. You stop writing code and start supervising it. The AI proposes. You review. You guide. You accept or reject.

This is management, not craft.

I don’t romanticize the old way. Debugging segfaults for hours wasn’t fun. But there was a deep satisfaction in sculpting messy solutions into clean code. Writing readable code was an art people devoted time to.

Now the AI writes readable code. It follows patterns. It’s competent. But it’s also generic.

The aesthetic decisions that made code personal are increasingly irrelevant.

Beautiful Code I Wrote By Hand

I want to share some projects I’m proud of. Not because they’re useful (they’re not). But because they represent hours of immersive coding that I loved.

Gale-Shapley Algorithm in Prolog

This is the algorithm that matches US medical residents to hospitals. Every year, thousands of graduating medical students rank hospitals. Hospitals rank students.

The algorithm finds stable pairings where no student and hospital would both prefer each other over their assigned match. The theory behind it earned Lloyd Shapley and Alvin Roth the Nobel Memorial Prize in Economics in 2012.
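
The deferred-acceptance idea is simple enough to sketch outside Prolog. Here's a rough Rust version for the simplified one-to-one case, with both sides as indices and full preference lists assumed (the real residents match is many-to-one):

```rust
// Gale-Shapley deferred acceptance, one-to-one, students proposing.
// Assumes equal-sized sides and complete preference lists.
fn gale_shapley(student_prefs: &[Vec<usize>], hospital_prefs: &[Vec<usize>]) -> Vec<usize> {
    let n = student_prefs.len();
    // hospital_rank[h][s] = how hospital h ranks student s (lower is better)
    let mut hospital_rank = vec![vec![0usize; n]; n];
    for (h, prefs) in hospital_prefs.iter().enumerate() {
        for (rank, &s) in prefs.iter().enumerate() {
            hospital_rank[h][s] = rank;
        }
    }
    let mut next_proposal = vec![0usize; n]; // next choice each student will try
    let mut matched_to: Vec<Option<usize>> = vec![None; n]; // hospital -> student
    let mut free: Vec<usize> = (0..n).collect();
    while let Some(s) = free.pop() {
        let h = student_prefs[s][next_proposal[s]];
        next_proposal[s] += 1;
        match matched_to[h] {
            None => matched_to[h] = Some(s),
            Some(current) => {
                if hospital_rank[h][s] < hospital_rank[h][current] {
                    matched_to[h] = Some(s); // hospital trades up
                    free.push(current);
                } else {
                    free.push(s); // rejected; will try the next choice
                }
            }
        }
    }
    // Invert to student -> hospital.
    let mut result = vec![0usize; n];
    for (h, s) in matched_to.iter().enumerate() {
        result[s.unwrap()] = h;
    }
    result
}

fn main() {
    // Both students want hospital 0; hospital 0 prefers student 0,
    // so student 1 ends up at hospital 1.
    let m = gale_shapley(&[vec![0, 1], vec![0, 1]], &[vec![0, 1], vec![0, 1]]);
    assert_eq!(m, vec![0, 1]);
}
```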

This Prolog implementation expresses that entire algorithm in about 50 lines.

prefer(Who,Yes,No) :-
    prefs(Who,Rank),member(Yes,Rank),not(member(No,Rank)).
prefer(Who,Yes,No) :-
    prefs(Who,Rank),member(Yes,Rank),member(No,Rank),
    append(_,[Yes|Rest],Rank),
    member(No,Rest).

Two clauses handle the preference logic. First clause: Yes is in the list, No isn’t. Second clause: both are in the list, Yes appears before No.

The append(_,[Yes|Rest],Rank) pattern elegantly finds Yes’s position and checks No is in the remainder.
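
For contrast, the same two clauses written imperatively — a rough Rust translation, using a hypothetical map from each party to its ranked list:

```rust
use std::collections::HashMap;

/// Mirrors the two Prolog clauses: `yes` is ranked and `no` isn't,
/// or both are ranked and `yes` appears first.
fn prefer(prefs: &HashMap<&str, Vec<&str>>, who: &str, yes: &str, no: &str) -> bool {
    let rank = match prefs.get(who) {
        Some(r) => r,
        None => return false,
    };
    let pos_yes = rank.iter().position(|&x| x == yes);
    let pos_no = rank.iter().position(|&x| x == no);
    match (pos_yes, pos_no) {
        (Some(_), None) => true,     // clause 1: Yes ranked, No unranked
        (Some(y), Some(n)) => y < n, // clause 2: Yes appears before No
        _ => false,
    }
}

fn main() {
    let mut prefs = HashMap::new();
    prefs.insert("alice", vec!["mgh", "mayo", "ucsf"]);
    assert!(prefer(&prefs, "alice", "mgh", "ucsf"));
    assert!(!prefer(&prefs, "alice", "ucsf", "mgh"));
    assert!(prefer(&prefs, "alice", "mayo", "stanford")); // ranked beats unranked
}
```

The index comparison does in one expression what the Prolog version does with backtracking, but it loses the relational reading: the Prolog clauses can also enumerate preferred pairs.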

Source: github.com/4meta5/gsa

Sequence Recognition in Prolog

You give it numbers. It tells you what generates them.

?- rec([1,4,9,16,25]).
Polynomial sequence: N^2

?- rec([1,1,2,3,5,8,13]).
Fibonacci Numbers

?- rec([2,6,18,54]).
Geometric series with ratio 3

It recognizes arithmetic and geometric series, Fibonacci-like recurrences, and Catalan numbers. For any sequence generated by a polynomial, it can derive the formula using difference tables.

dif(L,0,L).
dif([X1,X2],1,[Y]) :- Y #= X2-X1.
dif([H1,H2|T1],1,[H3|T2]) :-
    dif([H1,H2],1,[H3]),
    dif([H2|T1],1,T2).

Three clauses build difference sequences. The magic is #= from library(clpfd). It’s a constraint, not an assignment.

Give it [1,4,9,16,25] and it computes differences [3,5,7,9]. Give it differences and it can reconstruct the original. The same code runs forwards and backwards.
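
The forward direction is easy to mimic in a conventional language. A one-directional Rust sketch (unlike the Prolog version, this one can't run backwards):

```rust
/// One round of finite differences: [1,4,9,16,25] -> [3,5,7,9].
fn diff(xs: &[i64]) -> Vec<i64> {
    xs.windows(2).map(|w| w[1] - w[0]).collect()
}

/// Degree of the generating polynomial: take differences until constant.
/// N^2 goes constant after two rounds, so its degree is 2.
fn poly_degree(xs: &[i64]) -> Option<usize> {
    let mut cur = xs.to_vec();
    for degree in 0..xs.len() {
        if cur.is_empty() {
            return None;
        }
        if cur.iter().all(|&x| x == cur[0]) {
            return Some(degree);
        }
        cur = diff(&cur);
    }
    None
}

fn main() {
    assert_eq!(diff(&[1, 4, 9, 16, 25]), vec![3, 5, 7, 9]);
    assert_eq!(poly_degree(&[1, 4, 9, 16, 25]), Some(2)); // N^2
}
```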

Source: github.com/4meta5/reconocer

Combinatorics Calculator in Prolog

An interactive quiz for learning combinatorics.

?- qst(nk_perms, 5, 2, Answer, Result).
"5 people sit in a row. How many arrangements have exactly 2 in their original seats?"
Answer = 20.
Result = correct.

?- second_stir(4, 2, X).
X = 7.
% 7 ways to partition {a,b,c,d} into 2 non-empty groups

It generates random problems about permutations, derangements, partitions. You answer. It checks your work.

second_stir(X,X,1).
second_stir(X,0,0) :- X > 0.
second_stir(N,K,Result) :-
    N > K, K > 0,
    N1 is N - 1,
    K1 is K - 1,
    second_stir(N1,K,R1),
    second_stir(N1,K1,R2),
    Result is (K*R1) + R2.

Stirling numbers count ways to partition n items into k groups.

The recurrence: either item n joins an existing group (K choices, times S(n-1,k)) or starts its own (S(n-1,k-1)).

The code is the mathematical definition. You can read the formula off the page.
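
The same recurrence reads almost as directly in Rust (a direct, uncached translation, so it recomputes subproblems):

```rust
/// Stirling numbers of the second kind:
/// S(n,k) = k*S(n-1,k) + S(n-1,k-1), with S(n,n) = 1 and S(n,0) = 0 for n > 0.
fn stirling2(n: u64, k: u64) -> u64 {
    if n == k {
        return 1; // one way to put n items into n groups (covers S(0,0) too)
    }
    if k == 0 || k > n {
        return 0;
    }
    k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)
}

fn main() {
    assert_eq!(stirling2(4, 2), 7); // matches second_stir(4, 2, X)
}
```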

I built this to learn combinatorics. Writing the math as Prolog forced me to understand it.

Source: github.com/4meta5/combinatorics-calc

Huffman Codec in Rust

Lossless compression means you compress data and get the exact original back when you decompress.

Huffman coding assigns short bit sequences to common characters and long ones to rare characters. “e” becomes “10”. “z” becomes “110101”. The file shrinks because frequent characters take fewer bits.
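
The first step is just counting. A toy frequency count to show why this works (a sketch, not the codec's actual `frequency` function, whose signature I'm only inferring from the constructor below):

```rust
use std::collections::HashMap;

// Count how often each character occurs; Huffman coding then gives
// the most frequent characters the shortest bit sequences.
fn frequency(s: &str) -> HashMap<char, usize> {
    let mut map = HashMap::new();
    for c in s.chars() {
        *map.entry(c).or_insert(0) += 1;
    }
    map
}

fn main() {
    let f = frequency("the quick brown fox jumps over the lazy dog");
    // 'o' appears four times, 'z' once, so 'o' earns the shorter code.
    assert_eq!(f[&'o'], 4);
    assert_eq!(f[&'z'], 1);
}
```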

This implementation has zero dependencies and runs without the standard library.

pub fn new(s: &str) -> Self {
    fn map_to_heap(map: BTreeMap<char, i32>) -> BinaryHeap<Rc<Tree>> {
        let mut heap = BinaryHeap::new();
        map.into_iter().for_each(|(l, c)| {
            heap.push(Tree::new(l, c));
        });
        heap
    }

    fn heap_to_tree(mut heap: BinaryHeap<Rc<Tree>>) -> Rc<Tree> {
        while heap.len() > 1 {
            let (t1, t2) = (heap.pop().unwrap(), heap.pop().unwrap());
            heap.push(Tree::merge(t1, t2));
        }
        heap.pop().unwrap()
    }

    fn tree_to_codes(root: &Option<Rc<Tree>>, prefix: Vec<u8>, mut map: Dictionary) -> Dictionary {
        if let Some(ref tree) = *root {
            match tree.value {
                Some(t) => { map.insert(t, prefix); }
                None => {
                    let (mut prefix_l, mut prefix_r) = (prefix.clone(), prefix);
                    prefix_l.push(1u8);
                    let map = tree_to_codes(&tree.left, prefix_l, map);
                    prefix_r.push(0u8);
                    return tree_to_codes(&tree.right, prefix_r, map);
                }
            }
        }
        map
    }

    let f_map = frequency(s);
    let heap = map_to_heap(f_map);
    let tree = heap_to_tree(heap);
    Self(tree_to_codes(&Some(tree), Default::default(), Default::default()))
}

Three functions live inside new. Each transforms data into the next stage.

Rust lets you declare functions inside functions. I learned this from Prolog. It reads well.

And the decoder, where every optimization choice matters:

pub fn decode_iterator<'a, I>(&self, it: I) -> String
where
    I: Iterator<Item = &'a u8>,
{
    let mut rmap: Vec<(&[u8], char)> = self.0.iter().map(|(k, v)| (v.as_slice(), *k)).collect();
    rmap.sort_unstable_by_key(|(k, _)| *k);

    #[inline(always)]
    fn binfind(map: &[(&[u8], char)], key: &[u8]) -> Option<char> {
        match map.binary_search_by_key(&key, |(k, _)| k) {
            Ok(index) => Some(unsafe { map.get_unchecked(index).1 }),
            Err(_) => None,
        }
    }

    let mut temp = Vec::<u8>::with_capacity(16);
    it.filter_map(|b| {
        temp.push(*b);
        if let Some(c) = binfind(rmap.as_slice(), &temp) {
            temp.clear();
            Some(c)
        } else {
            None
        }
    }).collect()
}

The decoder looks like chicken scratch. The mess is deliberate.

sort_unstable avoids the allocation a stable sort would make. Binary search replaces a hash lookup. unsafe { get_unchecked } skips the bounds check, since the search has already validated the index.
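
The lookup trick in isolation — sort once, then binary-search each accumulated bit prefix (a standalone sketch of the pattern, not the codec's code):

```rust
fn main() {
    // Reverse map from code bits to character, sorted once up front.
    let mut rmap: Vec<(&[u8], char)> = vec![
        (&[1u8, 1][..], 'z'),
        (&[0u8][..], 'e'),
        (&[1u8, 0][..], 't'),
    ];
    rmap.sort_unstable_by_key(|&(k, _)| k);

    // Ok(i) means the bits read so far form a complete code.
    let key: &[u8] = &[1u8, 0];
    let found = rmap
        .binary_search_by_key(&key, |&(k, _)| k)
        .map(|i| rmap[i].1);
    assert_eq!(found, Ok('t'));
}
```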

The iterator is generic over any byte source. I spent hours shaving allocations. That was the fun.

Source: github.com/4meta5/huffman-codec

The Messy Middle

The next few years will produce a lot of AI slop software.

AI lowers the barrier to writing code. It doesn’t lower the barrier to good taste.

People who couldn’t code before can now produce working software. But they don’t know what good software looks like.

We’ll see apps that seem to work but fail in subtle ways. Code that needs refactoring but that nobody wants to maintain.

The messy middle. Accessible to create, painful to use.

The Metrics Problem

I made my GitHub activity private. Open source metrics are meaningless now.

Commits, lines changed, contribution graphs. These have been gameable for years. AI code assistants have rendered them absolutely meaningless.

There was a time when employers actually looked at contribution graphs as a signal for activity. That time is over. A thousand commits might just all be authored by AI.

The Hardest Time to Learn

Here’s the tragedy: this is the hardest time ever to learn to code.

The friction that made programming hard is exactly what made it rewarding. You struggled, you figured it out, you got the dopamine. The struggle was the learning.

Remove the struggle and you remove the reward.

New programmers can produce working code on day one. But they don’t understand why it works. They can’t debug it when it breaks. They’ve skipped the part where you actually learn.

Low friction means low dopamine from overcoming obstacles. The motivation to persist evaporates when there’s no difficulty.

A Moment of Silence For the Lost Flow

I’m not against AI coding assistants. I use them constantly. They make me faster.

But I want to acknowledge what’s passing.

The creative flow of writing beautiful code for the sake of writing beautiful code. The late nights staring at a function until it clicked. The satisfaction of a clean diff.

There was a lot of fun in the process. The struggle, the breakthrough, the pride in code that was yours because you typed every character.

We’ve gained significant productivity. We’ve also lost something along the way.

I’m glad I got to experience both.