Am I Cheating Art? The Unexpected Role of ChatGPT in My Graphic Novel Journey

A laptop open on a generic chat page, sitting on a messy, cluttered artist’s desk surrounded by notes and sketches, with a chalkboard behind it bearing a large drawing of a man’s face and the words “Robert Smalls” next to it.
ChatGPT generated this image, too…

Hello, Unsung Superheroes Fans!

Yesterday, I had a rare day off work, and I had ALSO managed to complete all my other tasks, like my correspondence with couples whose weddings I’m DJing and a bunch of housework, before the day even began.

So, I spent all day fully immersing myself in what I’ve been chipping away at piecemeal for the past few weeks: restructuring and rewriting the Robert Smalls graphic novel.

Strap in: it’s a long read!

TL;DR

I spent yesterday continuing to restructure the Robert Smalls graphic novel, realising I need to significantly condense the content. To manage my vast, chaotically organised notes, I’ve been using ChatGPT as a research assistant. After hitting the limitations of ChatGPT’s memory within a single session, I explored Google’s NotebookLM but ultimately created a custom GPT tailored to my project’s needs. This custom GPT acts as a comprehensive resource, helping me organise timelines, translate dialogue into Gullah, and find historical context. It enhances my creative process without replacing it, allowing me to focus on storytelling and drawing, but I realise some may see this way of working as controversial. What do you think?

The Full Story…

As I mentioned in a previous post, I only scripted the first few chapters of the book. Because I was so eager to get started drawing, I left all the other pages as vague notes, mostly lists of the events I’d researched, in chronological order. I’ve come to the (rather late-in-the-game) revelation that all the research I’ve compiled simply cannot go into this project. It would extend to several hundred pages and, besides quite probably being boring, I probably won’t live long enough to complete it. So the task before me has been to whittle down the novel by a lot; a daunting task, let me tell you! This has all been a big learning curve for me, and one lesson is that I definitely should have had the whole thing scripted (or at least organised into page breakdowns) before I drew anything. That way, I would have known how long it was going to be and could have course-corrected much earlier, while the research was fresher in my head.

I have been using ChatGPT to help with this. It’s a fact I was initially reluctant to share, but am now actually quite excited to talk about.

Like a lot of people, I have been dabbling with ChatGPT and other Large Language Models, and it recently occurred to me that maybe I could utilise this technology in a meaningfully productive way for this project.

I have a huge quantity of notes, many of which I made during Lockdown a few years ago. I’d be lying if I said my memory of every single one was crystal clear! It doesn’t help that I was not formulaic or consistent about how I made these notes, either: some are bullet points, others just a few words, some are fully scripted pages, and some are links to websites or simply a page number and a book title. It certainly doesn’t help that they live in Scrivener, software I was told would be useful for helping script a book but which, ultimately, has a learning curve longer than is practical for me, so I struggle to use it effectively. I’m sure it’s amazingly powerful if you take the time to learn it properly, but when I started using it, I wanted to get to the creative part as soon as possible, not spend weeks learning how this database worked.

All of this means that when I started trying to whittle down the story, I was hamstrung by having half-remembered notes scattered across software I couldn’t use properly, so my first hurdle was simply remembering the stuff I wanted to cut out!

What About AI?

This is when using ChatGPT occurred to me. I thought that if I could give it access to all my notes, I could use it as a research assistant with instant recall of everything I had ever told it. It could summarise all that I had researched in a much more user-friendly format than my often-scrappy notes, and I could work from there.

So, after taking the time to learn how to bulk-extract all of my notes from the various individual documents within Scrivener and paste them into a word processor document, off I went to ChatGPT to copy them all into a chat and tell it to remember them. While I was aware that ChatGPT can’t pass information between individual chats, I figured that if I simply kept using this one chat, I’d be OK.

It seemed to work for a while. I was happily asking it to pull out my notes on a certain event and summarise them, and it was complying pretty well… but the longer the chat went on, the more inaccurate it became. Luckily, I am familiar with a lot of the details, so I spotted the inaccuracies very quickly. I learnt that ChatGPT doesn’t remember information from earlier in the chat by storing everything you’ve told it in a database; it can only “see” the most recent stretch of the conversation (its context window). That window is quite long, but after a while I reached the limit of how far back it could refer. It had forgotten the mass of info I had fed it at the start and was now pulling information from its own responses further down the chat. The longer this went on, the more inaccurate the responses got, so this wasn’t going to work long-term.
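If you’re curious about the mechanics, here’s a minimal sketch of why a long chat forgets its beginning (plain Python, no actual API calls, and character count standing in for tokens): the model only ever sees the most recent messages that fit in its window, so anything pasted at the very start eventually falls off the front.

```python
# Minimal sketch of context-window "forgetting": keep only the newest
# messages that fit a fixed budget, so the oldest ones drop off first.
def build_context(messages, max_chars=8000):
    kept, used = [], 0
    for msg in reversed(messages):        # walk backwards from the newest
        if used + len(msg) > max_chars:
            break                          # older messages no longer fit
        kept.append(msg)
        used += len(msg)
    return list(reversed(kept))            # restore chronological order

chat = ["<all of my Robert Smalls notes, pasted in at the start>"]
chat += [f"question or answer number {i}" for i in range(2000)]

window = build_context(chat)
print(chat[0] in window)  # False: the notes have fallen out of the window
```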

I looked into alternatives and learnt about Google’s NotebookLM, which is in beta at the moment. It sounded perfect: software you can upload up to 20 documents of notes to and then ask questions about, which it answers using an LLM, even citing where in your notes it pulled the information from. It’s only available in the US, so I used a VPN to give it a quick go. It seems like, with some more development, this software could really be something, but its responses just weren’t as good as ChatGPT’s. I relegated it to my backup solution, but resolved to find a way to get ChatGPT to work for me the way I wanted it to.

That’s when I learnt about custom GPTs. With a paid subscription of about £20/month, you can create custom versions of ChatGPT for specific purposes. They take the existing ChatGPT as their basis (and it’s the most advanced model, rather than the older one a free account gives you, and the improvement is hugely noticeable!), but you can give them detailed instructions on what their purpose is and how they should respond to inputs… AND you can upload documents that they will remember. You can then create as many chats with this specialised GPT as you want.
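For anyone wondering what that actually amounts to: conceptually, a custom GPT is roughly the bundle below. This is just an illustrative sketch, not an actual API call, and the file names are placeholders.

```python
# Rough conceptual picture of a custom GPT: a name, standing instructions,
# and knowledge files that are available to every new chat automatically.
custom_gpt = {
    "name": "Robert Smalls Knowledge Base",
    "instructions": (
        "You are a research assistant for a graphic novel about Robert Smalls. "
        "Treat the uploaded notes as your primary source of information, and "
        "flag anything that comes from general knowledge rather than the notes."
    ),
    "knowledge_files": ["smalls_notes.docx", "character_timelines.docx"],  # placeholders
}
# Unlike one long chat, each new conversation starts with these instructions
# and files already attached, so nothing has to be re-pasted.
```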

Photo of a laptop screen showing my custom Robert Smalls GPT in ChatGPT
It doesn’t look like much, but this is my supercharged Robert Smalls ChatGPT!!!

How I’m Using ChatGPT

This, then, is what I am currently doing. At this point, after all my tinkering with it, this GPT is probably just about the most complete one-stop resource you could hope to find about Robert Smalls.

My first step was to feed in the vast document I already had, with all my slightly chaotic notes about Robert Smalls’ life, and tell the GPT that this was to be its primary source of information about him. My notes include a wealth of information that isn’t easily accessed on the web. During Lockdown, I spent quite some time and money subscribing to historical document archives and regional historical newspaper archives, seeking out often hand-written documents and articles and transcribing them. None of this is material ChatGPT has access to as standard, and nor are my notes from the various books I have bought and read about Smalls.

I tested out my fledgling GPT for quite a while, asking some quite obscure questions, and it performed marvellously. Given that ChatGPT has been trained on millions of web pages, it was able to support information from my notes with contextualising information that isn’t in them. For example:

Robert’s stepdaughter, Charlotte, who was also enslaved, got pregnant at 15 while courting a man. This information is in my notes, but not easily found by searching the internet. I asked the GPT how common something like Charlotte’s pregnancy would have been, and how scandalous it would have been to both the white and the enslaved black communities, and it synthesised information from my uploaded notes and its wider knowledge to give me an in-depth answer. I’m not trusting enough to simply accept its answer as unequivocal fact, but it sounded very reasonable and makes it easier for me to start researching the topic myself.

Another very useful thing I’ve had it do is keep on top of people’s ages and locations at various points in the story, especially those of their children. My notes sometimes had a birthdate for someone, sometimes not. Sometimes I had a single note of someone’s age at one specific event, because that’s exactly how I came across it, and their age is never mentioned again across all my notes. Similarly, if someone moved into or out of a home, I might have one note among thousands that mentions it, and then it is never mentioned again.

I asked the GPT to go through my notes and use the information in them to create family trees of some of the major characters, extrapolating birthdates as closely as possible from whatever information it found where they weren’t specified. I also asked it to create timelines from the notes of where certain characters were and when. It took a few prompt tweaks and nudges to get it right, but within a few minutes I had a document of all the major characters’ major life events: births, deaths, marriages, births of children, and, where I need to know it, where they were living at certain times. I keep updating this document whenever I think of more information I need the GPT to extract, and I have now fed it back into the GPT to use as its primary source of information about the people in the story.

Why is this useful? With a lot of children being born, dying, moving in and out of people’s homes, having children of their own, and so on, whenever I draw a scene in someone’s home, rather than scouring my notes in a time-consuming search for the information, I can simply ask the GPT who was living there at the time and how old they were… and it will tell me. That way I know who to include in the background of a scene, and I can ensure their portrayal is consistent with the historical timeline.
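For the curious, this is roughly the kind of structured record I’m asking the GPT to distil my notes into, and the kind of lookup it saves me. It’s only an illustrative sketch: the character and the dates are placeholders, not facts from my research.

```python
# Hypothetical example of an extracted character record and an age lookup.
# The person and dates below are placeholders, not real data from my notes.
from datetime import date

people = {
    "Example Child": {
        "born": date(1858, 6, 1),
        "household_moves": [(date(1862, 1, 1), "moved into the Smalls home")],
    },
}

def age_on(name, when):
    """Age of a character on a given date, based on the extracted birthdate."""
    born = people[name]["born"]
    return when.year - born.year - ((when.month, when.day) < (born.month, born.day))

# e.g. "how old was this child in the scene I'm drawing, set in spring 1862?"
print(age_on("Example Child", date(1862, 5, 1)))  # -> 3
```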

ChatGPT has some knowledge of Gullah, the English-based creole spoken by enslaved people in the Lowcountry region. Previously, I had been using the few phrases of Gullah I had found, plus a couple of websites explaining some of the syntax, to try to write Gullah dialogue myself, but I wasn’t at all sure I had it anywhere near correct. Now, I can simply ask my GPT to translate dialogue into Gullah for me. Again, I do intend to have this dialogue checked by someone who genuinely knows Gullah at a later date, but it gives me a good start.

With all of that done, I have uploaded a document to the GPT containing my planned outline for the graphic novel, with page and panel breakdowns and dialogue where I have them, and, for the rest, simply the events I want to include. I can now ask the GPT to represent that outline in various ways, which is really helping me to see the shape and flow of the graphic novel: where I’m spending too much time on a particular event, or not allowing a more significant event the space it needs.

So far, the only real downside of this method is that ChatGPT imposes usage limits even on its paid plans, and those limits vary depending on server load. So I’ll be working away in a good flow state when I suddenly hit the usage cap and have to wait anywhere from half an hour to a couple of hours before I can use my GPT again.

But Does It Cheapen the Final Product?

Now, when I started using ChatGPT in this way, I was concerned that it might oversimplify the creative process or compromise the narrative’s authenticity. However, this tool has proven invaluable, not by replacing the creative process but by enhancing the accuracy and richness of the narrative’s background. It allows me to focus more on storytelling and artistic expression while it handles the intricate historical details. It is still me synthesising my notes into an actual story, and me writing and drawing the scenes. Because I basically do know the information it is giving back to me, it isn’t replacing genuine research; it is helping me organise, collate, and continue my research more effectively. It will hopefully help me get this project underway again and finished sooner, which I’m very happy about. As mentioned, I do want to reach out to someone with genuine knowledge of the Gullah dialect to check that the translations I’m using are authentic, so I’m not concerned that I’ll be missing out on a touchpoint of genuine connection there.

Finally, it is important for me to point out that generative AI is most definitely not creating any of the images used within the actual graphic novel itself. They are all being digitally hand-drawn by me.

So there you go: this is my rather lengthy explanation of what I’m up to at the moment. Using my custom GPT, I am working out the structure of the rest of the graphic novel so that I can get started drawing it again. It does mean there’s little visible progress right now, as I’m not putting out new pages or even work-in-progress pictures, but please be assured: I’m doing plenty of work behind the scenes!

If you’ve read this far, what are your thoughts on all this, positive or negative? I know “AI” can be quite a controversial and emotive subject, so I’d love to hear what you make of my approach to this phase of the project. Let me know in the comments.

Would you be interested in me making the Robert Smalls Knowledge Base GPT public so you could use it yourself? Again, let me know in the comments.

(Yes, the image at the top of this post is AI-generated, and I am slightly less happy with that use of AI, but I barely have time to draw the graphic novel itself, let alone start drawing images for posts! I’ll probably make a post about this in the future, but I’ve written enough for one post now. Feel free to add your comments on this too, though.)

Thanks!

Curt

Unsung Superheroes is an ongoing project creating graphic novels based on real-life historical figures whose stories may not be as well-known as they could be.

New pages are added to the current story on the Homepage.

If you would like to help support this project financially, please consider signing up to my Patreon. For less than the price of a cup of coffee per new page created, you can become a Patron of the Arts and help keep this project going.
There’s no minimum sign-up period. You can stop at any time.


Also available is the opportunity for you to appear as a character in the novel.

I’m donating 10% of all pledges to Anti-Slavery International to help them in their efforts to stop modern-day slavery and human trafficking.

Become a Patron!
