Crafting fake images¶
How fake images are made—and how to spot them
Fake images can be used to manipulate, mislead, or intimidate. In intimate partner abuse (IPA) situations, an abuser might use them to plant false memories, fabricate “proof” of events that never happened, or ruin someone’s reputation. They might show you in a place you never were, with people you don’t know, or doing something you never did.
This guide helps you understand how these images are made, what to look for, and how to respond. The more you know, the better you can protect yourself and help others spot the signs.
Types of fake images¶
AI-generated portraits¶
AI can create completely new faces that look real but aren’t.
Example: A threatening message comes from a social media profile using a photo of a “person” who doesn’t exist.
Signs to watch for:
Asymmetry in the eyes or ears
Strange patterns in the background
Blurry or mismatched teeth or earrings
Photoshopped images¶
Someone takes a real photo and edits it—adds you in, removes someone else, or changes the setting.
Example: A photo shows you at a party you never attended.
Signs to watch for:
Lighting on faces doesn’t match
Shadows fall the wrong way
Hands, eyes, or reflections look unnatural
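If you want to go beyond eyeballing, one common check is error level analysis (ELA): re-save the photo as a JPEG and amplify the difference, because pasted-in regions often recompress differently from the rest of the image. The sketch below uses the free Pillow library for Python; the file name suspect.jpg is a placeholder.

```python
# Error level analysis (ELA) sketch using Pillow.
# Edited or pasted-in regions often recompress differently from the rest
# of a JPEG, so they can stand out in the amplified difference image.
from PIL import Image, ImageChops, ImageEnhance

original = Image.open("suspect.jpg").convert("RGB")  # placeholder file name
original.save("resaved.jpg", quality=90)             # force a fresh JPEG compression pass
resaved = Image.open("resaved.jpg")

diff = ImageChops.difference(original, resaved)      # per-pixel difference
ela = ImageEnhance.Brightness(diff).enhance(20)      # amplify small differences
ela.save("ela_result.png")

# Bright or noisy patches are not proof of editing on their own,
# but they show where to look more closely.
```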
AI-altered scenes¶
AI tools can change facial expressions, remove clothes, or generate entirely new versions of real images.
Example: A harmless selfie becomes a sexual image.
Signs to watch for:
Slight warping around the edges
Misaligned hair or background textures
Repeating patterns where the image was altered
How are fake images made? (for workshop learning only!)¶
Please do not use real people. These steps are for training and awareness only.
1. Generate a fake face¶
Save an image of a person who doesn’t exist, for example from a face-generator site such as thispersondoesnotexist.com.
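If you prefer to script this step, the sketch below saves one generated face; it assumes the demo site thispersondoesnotexist.com still serves a freshly generated face image at its root URL, which may change.

```python
# Sketch: save an AI-generated face for workshop material.
# Assumes thispersondoesnotexist.com still returns a freshly generated
# face image at its root URL; adjust if the site has changed.
import requests

response = requests.get(
    "https://thispersondoesnotexist.com",
    headers={"User-Agent": "workshop-example"},  # some hosts reject requests without a user agent
    timeout=30,
)
response.raise_for_status()

with open("fake_face.jpg", "wb") as f:
    f.write(response.content)  # this person does not exist
```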
2. Edit an image¶
Use basic tools to change context or appearance.
Online editors: browser-based tools such as Photopea or Pixlr (free, no installation needed)
Desktop tools:
GIMP (Windows/macOS/Linux) – powerful and free
Paint.NET (Windows) – simpler photo editing
Krita (Windows/macOS/Linux) – also useful for drawing edits
Use these to:
Copy/paste elements (e.g. your face onto someone else’s body)
Blur or blend to hide seams
Add false text messages, timestamps, or visual clues
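To show how little effort these edits take, here is a rough Python sketch using Pillow; every file name, coordinate, and caption in it is a placeholder.

```python
# Sketch of the manual edits above using Pillow.
# All file names, coordinates, and text are placeholders.
from PIL import Image, ImageDraw, ImageFilter

scene = Image.open("party_photo.jpg").convert("RGB")
face = Image.open("cropped_face.png").convert("RGBA")

# Paste the cropped face into the scene (the alpha channel acts as a mask).
scene.paste(face, (420, 180), face)

# Blur a small region around the paste to hide the seam.
seam = scene.crop((400, 160, 620, 400)).filter(ImageFilter.GaussianBlur(radius=2))
scene.paste(seam, (400, 160))

# Add a false timestamp in the corner.
draw = ImageDraw.Draw(scene)
draw.text((10, scene.height - 30), "21:47  14 Feb", fill="white")

scene.save("fake_party_photo.jpg")
```

The point for participants is not the code itself, but that a convincing composite can be a few lines of free tooling.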
3. AI tools for image editing or creation¶
Online services:
Canva – includes AI image generator and background remover
Fotor AI – turns text into images
Artbreeder – blends existing images into new ones
Remove.bg – removes backgrounds quickly
More advanced:
Runway ML – AI-powered video and image manipulation (browser-based)
Stable Diffusion – text-to-image model; often used via DreamStudio or Hugging Face Spaces
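If a facilitator wants to run text-to-image locally rather than through a website, a minimal sketch with Hugging Face’s diffusers library looks roughly like this. The model id is an assumption and may need updating, the script needs a machine with a suitable GPU, and prompts should only ever describe fictional people.

```python
# Minimal local text-to-image sketch using Hugging Face diffusers.
# The model id is an assumption and may change; requires a CUDA-capable GPU.
# Use fictional descriptions only, never a real person.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

image = pipe("portrait photo of a fictional person at a birthday party").images[0]
image.save("generated_scene.png")
```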
4. Make it look odd on purpose¶
Stretch or compress elements to create obvious errors
Blur just one side of the image
Add mismatched lighting or shadow effects
Insert repeating backgrounds (common AI glitch)
Deliberately exaggerating flaws like these helps participants learn which visual cues give a fake away.
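These flaws can also be scripted rather than added by hand. Below is a short Pillow sketch; the file name and coordinates are placeholders.

```python
# Sketch: add deliberately obvious flaws to a training image with Pillow.
# The file name and all coordinates are placeholders.
from PIL import Image, ImageFilter

img = Image.open("training_fake.jpg").convert("RGB")
w, h = img.size

# Stretch the right half horizontally to create an obvious warp.
right = img.crop((w // 2, 0, w, h)).resize((w // 2 + 60, h))
img.paste(right.crop((0, 0, w // 2, h)), (w // 2, 0))

# Blur only the left side of the image.
left_blurred = img.crop((0, 0, w // 2, h)).filter(ImageFilter.GaussianBlur(radius=6))
img.paste(left_blurred, (0, 0))

# Tile a small patch along the top edge to mimic a repeating AI glitch.
patch = img.crop((0, 0, 80, 80))
for x in range(0, w, 80):
    img.paste(patch, (x, 0))

img.save("training_fake_flawed.png")
```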
5. Use in a workshop¶
Show your fake image and ask:
What parts look fake? What looks real?
What made this convincing or not?
How could someone fall for it?
What protections or checks would help?
Explain how it was made:
Tool(s) used
Time it took
Skill needed (usually very little!)
Encourage participants to practice spotting fakes with fictional examples.
Responding to fake images¶
Don’t panic: Most people can’t tell what’s real at first glance either.
Keep a record: Save the image and where it appeared (see the sketch after this list).
Report if safe: Many platforms allow you to flag AI or altered content.
Ask for help: A support worker or advocate can help document or respond.
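If you are comfortable with a little scripting, a record can be made more robust by logging a timestamp and a fingerprint (hash) of the saved file. This is a minimal sketch; the file name and the note are placeholders.

```python
# Sketch: log a saved copy of the image with a timestamp and a SHA-256
# fingerprint, so you can later show the file has not been changed.
# The file name and the note are placeholders.
import hashlib
import json
from datetime import datetime, timezone

path = "saved_fake_image.jpg"
with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

record = {
    "file": path,
    "sha256": digest,
    "saved_at": datetime.now(timezone.utc).isoformat(),
    "where_it_appeared": "link or description of where the image was posted",
}

with open("evidence_log.jsonl", "a") as log:
    log.write(json.dumps(record) + "\n")
```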
Facilitator notes (for running sessions)¶
Let participants know: Being tricked doesn’t mean you’re gullible. These tools are designed to fool.
Use only fictional people or stock images.
Allow space for emotional responses—these exercises can feel invasive.
Highlight practical defences: reverse image search, metadata checking, and digital literacy.
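Metadata checking in particular can be demonstrated in a few lines. The sketch below uses Pillow’s EXIF reader; AI-generated images and screenshots usually carry little or no EXIF data, which is itself a useful talking point.

```python
# Sketch: print whatever EXIF metadata a photo carries, using Pillow.
# Freshly generated or heavily edited images often have none at all.
# The file name is a placeholder.
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("suspect.jpg")
exif = img.getexif()

if not exif:
    print("No EXIF metadata found (common for AI-generated or edited images).")
else:
    for tag_id, value in exif.items():
        name = TAGS.get(tag_id, tag_id)  # translate numeric tag ids to readable names
        print(f"{name}: {value}")
```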
Fake images are easy to make but easier to beat when you know the signs. This guide helps you build those skills, raise awareness, and help others feel more confident spotting visual manipulation.
Trust your instincts. Ask questions. Look twice.
Being a sceptic is a survival skill.