Do you mind if the entire internet has access to your AI-generated images? Because with some providers, that’s exactly how it works. If you are concerned about the privacy of your AI creations, this guide is for you. Today we will look at the public visibility of your AI art generations and the steps you can take to retain your privacy.
Are Images Generated with AI Private?
The answer to this question depends on how you are generating images. AI images generated through a web service are not private in the least, but images generated on a local machine are. If you create AI art on your own computer, the only people who can see it are those with access to your computer’s files.
But let’s dig into this question a bit deeper. Who exactly has access to the images that you generate when using an online service? We’ll have to compare each service provider because they all have slightly different methods of operation and guidelines.
Midjourney has, in my opinion, the oddest hosting situation of any provider on this list. To access Midjourney, you have to join its Discord server. Once access is granted, you send your prompts to the public Discord bot, which generates your images and publishes them on the server…for every user to see. So if you want to create something to keep for yourself, Midjourney’s basic service is not an option. You can pay extra ($10/month) to have your images generated privately via direct messages with the bot. But even then, all the images you generate are kept and stored (perhaps indefinitely?) by the host company.
DALL-E 2 by OpenAI offers a bit more privacy on the surface; there are no public Discord servers you must join. However, you are still interacting with the company’s servers, and OpenAI monitors for violations of its community guidelines (content that is hateful, violent, sexual, etc.). From this we can assume that OpenAI personnel likely have access to the images you generate and can review your prompt history.

That said, DALL-E 2 has no public browse feature, so you and OpenAI appear to be the only ones with access to the images you generate.
DreamStudio is Stability AI’s online AI art service. According to their terms of service, they do monitor the prompts that are entered into their online services. In addition, they also have community guidelines against certain types of content.
So images created with their web service are not shared with other users, but they are most likely accessible to Stability AI personnel.
Google Colab is not an AI art service provider per se, but it’s definitely worth discussing where privacy is concerned. It’s entirely possible to install and run Stable Diffusion in Google Colab. Essentially, the service gives you access to a GPU in one of Google’s datacenters; the content generated with that GPU can then be saved to your Google Drive account. So if you want to use a more powerful graphics card remotely, Google Colab is a very common option. However, don’t assume you have privacy just because you’re running the AI model yourself. You are still using Google’s hardware and saving the results to Google’s cloud service. And that means anyone who can access your Drive contents can access your AI art…including Google employees. This is not to say that Google will necessarily go snooping through your cloud storage, but it absolutely can if it wants to.
How to Generate AI Art Privately That No One Else Can See
As you may realize by now, the only way to generate AI images securely and privately is to do it on your own hardware. Luckily, Stable Diffusion lets you do exactly that. The easiest way is to download Automatic1111’s web UI for Stable Diffusion (and follow these installation instructions).
Why Should I Care About AI Art Privacy?
For some reason, when it comes to AI art, people automatically associate the desire for privacy with negative connotations: usually, a person will assume that you want to make inappropriate images with the technology. But there are several legitimate reasons (besides NSFW content) why you should consider privacy in your AI image generation process:
- Commercial use. Anything that you generate via Stable Diffusion can be used for commercial purposes. If you intend to sell the images as products (an art book, prints, etc) or employ them as personalized stock art on your website, then you may not want to generate that content “in the wild” on a public Discord server. If you do, anybody could grab those images and start selling them before you do…and that might lead to duplication or infringement disputes later on.
- Personalized imagery. You may want to generate images of yourself or loved ones using a custom-trained model. If that’s the case…do you want random strangers on the internet snooping on those images? Or what about a model trained specifically on your own art style? Do you mind if other people can access your content and emulate your personal style?
- Corporate data hoarding. A growing number of internet users are suspicious of corporate data harvesting. Many big internet companies will collect as much information about you as humanly possible and store that information, using it for a variety of purposes (advertising, selling to third parties, targeting products, or even just monitoring your interests/activities). If you’re a data privacy advocate and you dislike the idea of a corporate entity keeping tabs on you, then it’s only natural that you would prefer your art ideations to be free from such prying eyes.
There are many different services you can employ to create AI art, but very few of them provide total privacy. The only surefire way to keep your image generation private is to run an AI model on your own computer. Wanting privacy should not be considered a desire worth condemning. There are many reasons why a person would not want their AI art in the hands of other people or companies.
Thanks for stopping by. If you found this article helpful, here are a few others you may like: