Second Brains vs AI Assistance

October 08, 2022

What makes a tool for thought? Where do we want friction? What work should AI do? How do we augment human intellect?

I recently had a conversation about Formable and its competitors that sparked some thoughts on what this kind of software should do for us. I was presented with the idea that these tools should create relations between nodes and ideas for us, since linking them ourselves creates a lot of friction. My immediate response was that this friction is a good thing.

I understand that argument. After all, our first brain works the same way. We don’t always link ideas together consciously. A lot of creativity happens subconsciously. Neurons just connect. It’s even helpful if we’re completely unconscious, i.e. asleep. So should our second brains, the systems and software we use to expand the functionality of our actual brains, just do the same? AI is making rapid progress, so can’t we have a second brain completely separate from ours that does the same job? No. We want a tool for our thoughts, not software that does things completely on its own.

Why is the friction of linking our thoughts ourselves a good idea? We know that just the act of writing things down, not necessarily reviewing what you’ve written, helps with remembering things. I’d argue that it’s the same with linking ideas together. That active process gets our brain working and creates a stronger mapping between what we see on the screen or on paper, and our complex neural circuits.

Practise more note-making, less note-taking.

AI can do some jobs for us, but we should leave some thinking to ourselves. I view AI like an assistant in that regard. It can help make suggestions, and get tasks done that I don’t want to do. But sometimes I want to be completely alone with my thoughts.

One area where I’ve personally experienced working alongside an AI is using GitHub Copilot for programming. It suggests code as I write, and I get to choose whether to accept it. Sometimes the suggestions are insanely good. What I don’t like is that it transforms the work of writing code into an act of reviewing: you’re constantly checking whether the AI’s suggestions make sense, instead of architecting solutions. So even if it makes me faster, I’m not sure whether I create better outcomes, or whether this type of work is more fun.

Within personal knowledge management, quality of thoughts should always come before quantity. Sure, with the help of AI one could link more ideas, but saving every single idea and everything you ever read might not be wise. Copy-pasting some text you’ve read once and then letting AI link it to other things you’ve read does have potential, but it won’t have the same effect as writing it down in your own words and linking it to the mind palace you built yourself.

There’s a clear difference between your second brain and an assistant that knows your thoughts very well.

Augmenting human intellect with the help of AI or other systems that connect thoughts for you is exciting. To me, building better pen and paper and fully utilising the digital realm is even more so.

There are many ways in which AI could help.
For example, it could add another layer of structure on top of your unstructured data: if you write “Deep Work”, that node could automatically be categorised as a book, and additional data about it could be pulled in.
But the more information or connections are added automatically, the more your PKM might be watered down. We should design clear dividers between our second brains and collective ones.
And even if the AI only makes suggestions, it’s easy to accept them mindlessly, even when they don’t quite fit the way you’d do things yourself.
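To make the “layer of structure” idea concrete, here is a minimal sketch of what such auto-categorisation could look like. Everything in it is invented for illustration: the node format, the function name, and the lookup table standing in for whatever catalogue an AI or external API would actually query.

```python
# Hypothetical stand-in for a book catalogue an AI or API might consult.
KNOWN_BOOKS = {
    "Deep Work": {"author": "Cal Newport", "year": 2016},
}

def enrich_node(title):
    """Return a note node plus any structure we can infer automatically."""
    node = {"title": title, "type": "note"}
    metadata = KNOWN_BOOKS.get(title)
    if metadata is not None:
        node["type"] = "book"   # reclassify the node
        node.update(metadata)   # pull in additional data about it
    return node

print(enrich_node("Deep Work"))
# {'title': 'Deep Work', 'type': 'book', 'author': 'Cal Newport', 'year': 2016}
```

The point of the sketch is the trade-off discussed above: the enrichment happens without you lifting a finger, but also without you engaging with the idea.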

AI assistance and second brains are, in my opinion, different things with little overlap. Quoting a few lines from “How to Take Smart Notes” by Sönke Ahrens:

  • “[We need an] external system to think in and organise your thoughts, ideas and collected facts”
  • “We need a reliable and simple external structure to think in that compensates for the limitations of our brains”

So, touching on the title of this text, “Second Brains vs AI”: these are two different problems. AI can certainly help you in your work, but it doesn’t help you think.

If you’re interested in another read on “designing friction back into our products and our lives”, I highly recommend this article on airbnb.design, which talks about “the kinds of friction that lead to self-reflection, self-discovery, and personal growth”.

With humanity constantly automating more and more, and AI tackling completely new fields, it’s a good exercise to ask yourself where you don’t want a smoothly paved, straight road to nowhere.

I am hyped about AI. We just have to be deliberate about where we still want to put in work ourselves.

Thanks to CCC for reading an earlier draft and providing feedback.

Edit 2023-01-03: On trustworthy AI:


According to Allen, it’s only “when you can clear your mind and organise your thoughts that you can truly achieve effective results and unleash your creative potential.”

If you rely on AI to organise your thoughts for you, are you clearing your mind?
Making lists of your own certainly allows you to do that.

You have to know where your stuff is, and be confident you can retrieve it. You have to have worked alongside a given AI and know which outputs to expect from it. You have to build trust.

Take this example: You write about the workouts you do every day in your journal. Completely unstructured data. At some point you want a summary of all your workouts.
If you created a list yourself, for example in Formable by adding a tag to all your workout entries, you’d know where to look. You’d know that you can find all your workouts in that list. You’d know that you can trust that list to be complete and accurate.
If on the other hand you relied on AI to generate a list of your workouts for you, you’d have to trust that the AI did its job correctly. You’d have to trust that it didn’t miss any workouts. You’d have to trust that it didn’t mess up any data along the way.
You’d have to trust that the AI is reliable.
If you don’t completely trust the AI, you won’t manage to clear your mind.
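The difference in trust can be made concrete. A tag-based list is a deterministic query over data you structured yourself, so its completeness is guaranteed by your own habit of tagging. A minimal sketch (the entry format and function name are made up for illustration, not Formable’s actual API):

```python
# A journal as a list of entries, each carrying tags you added yourself.
journal = [
    {"text": "5k run before breakfast", "tags": ["workout"]},
    {"text": "Notes on Deep Work, chapter 2", "tags": ["reading"]},
    {"text": "Upper-body session at the gym", "tags": ["workout"]},
]

def entries_tagged(entries, tag):
    """Deterministic retrieval: return every entry carrying the tag."""
    return [e for e in entries if tag in e["tags"]]

workouts = entries_tagged(journal, "workout")
print(len(workouts))  # 2 -- exactly the entries you tagged, nothing inferred
```

An AI-generated summary replaces this simple, verifiable query with a model’s judgment about which entries count as workouts, and that is exactly where the trust question enters.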

As AI tools get better and people get used to working with them, they might become more reliable. For now, I wouldn’t suggest relying on them for the example above. The situation might change in the future, though.