Colorize Black and White Photos: 5 AI and Manual Methods
Your grandparents lived in color. The camera just did not record it. That faded black-and-white portrait sitting in a shoebox captures the moment but strips the context — the blue of a favorite dress, the green of a backyard lawn, the warm tone of a wooden floor.
AI colorization can bring that context back in seconds. Neural networks trained on millions of color images predict what each region of a monochrome photo should look like, painting over the gray with plausible hues. The results range from eerily accurate to amusingly wrong, depending on the tool, the image, and the subject matter.
This guide covers five methods — three AI-powered tools, Photoshop's neural filter, and a fully manual approach in GIMP — so you can pick the right one for your photo and your budget.
Looking for the opposite direction? See our guide on how to make an image black and white.
How AI Colorization Actually Works
Before choosing a tool, it helps to understand what these models are doing — and why they sometimes get it wrong.
The Neural Network Approach
Modern colorization models use convolutional neural networks (CNNs) trained on pairs of images: the original color photo and its desaturated grayscale version. During training, the model learns statistical associations — grass is green, skies are blue, skin tones fall within a certain range, military uniforms in the 1940s were khaki or olive drab.
Most tools work in the Lab color space rather than RGB. Lab separates lightness (the L channel) from color information (the a and b channels). The grayscale input provides the L channel. The neural network's job is to predict the a and b channels — the color — from the luminance patterns alone.
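To see why Lab is convenient here, note that any neutral gray pixel has a = b = 0: the grayscale input supplies the L channel untouched, and the model's entire job is the two missing color channels. A minimal sRGB-to-CIELAB conversion (the standard D65 formulas, not code from any particular tool) makes this concrete:

```python
def srgb_to_lab(r, g, b):
    """Convert one sRGB pixel (0-255 per channel) to CIELAB (D65 white)."""
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    # Linear RGB -> CIE XYZ using the sRGB/D65 matrix
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    xn, yn, zn = 0.9505, 1.0, 1.089  # D65 reference white

    def f(t):
        delta = 6 / 29
        return t ** (1 / 3) if t > delta ** 3 else t / (3 * delta ** 2) + 4 / 29

    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

L, a, b = srgb_to_lab(128, 128, 128)  # mid gray
# a and b come out numerically ~0 for any gray pixel; only L carries information
```

Because gray pixels land exactly on the a = b = 0 axis, training pairs are trivial to manufacture: desaturate any color photo and the model must recover the a/b channels it just lost.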
DeOldify and MyHeritage InColor use a GAN (Generative Adversarial Network) architecture where two networks compete: the generator predicts colors, and the discriminator evaluates whether the result looks like a real color photo. This adversarial training pushes the generator toward more realistic output over thousands of iterations.
Palette.fm takes a different approach, using a diffusion-based model that starts with noise in the color channels and iteratively refines it toward plausible colors, guided by the luminance structure of the input image.
Why AI Gets Colors Wrong
The model is making educated guesses, not recovering lost information. The actual color data is gone — destroyed when the photo was taken with monochrome film or converted to grayscale. The AI cannot know that:
- Grandma's dress was red, not blue. Both colors map to similar gray values in many lighting conditions.
- That car was a specific shade of teal. The model defaults to statistically common car colors for the era.
- The wallpaper had a yellow floral pattern. Without texture cues, flat surfaces get a single best-guess color.
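The ambiguity is easy to demonstrate. Using the common Rec. 601 luma weights for grayscale conversion, a saturated red and a medium blue (values picked here purely for illustration) land on almost exactly the same gray:

```python
# Rec. 601 luma weights, as used in many RGB-to-grayscale conversions
def luma(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

red = (200, 30, 30)    # a saturated red
blue = (0, 100, 194)   # a medium blue chosen to match the red's luma

print(round(luma(*red), 1), round(luma(*blue), 1))  # → 80.8 80.8
```

A model that sees only the gray value 80.8 has no way to tell which of the two colors produced it; it can only bet on whichever is statistically more common in its training data.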
Colorization accuracy is highest for subjects with strong contextual signals — green leaves, blue sky, Caucasian skin tones (models are trained on datasets biased toward lighter skin, a real limitation). Accuracy drops for ambiguous subjects: clothing, painted surfaces, flowers, and anything the model has rarely seen during training.
For historically important photos, treat AI colorization as a starting point and manually correct colors you know to be wrong.
Comparison: AI Colorization Tools
| Feature | DeOldify | MyHeritage InColor | Palette.fm | Photoshop Neural Filters | GIMP (Manual) |
|---|---|---|---|---|---|
| Method | GAN (deep learning) | GAN (proprietary) | Diffusion model | Adobe Sensei CNN | Manual brush/layer |
| Cost | Free (open source) | Free (10 photos), then $179/yr Complete plan | Free (unlimited, watermarked), $15/mo Pro | $22.99/mo (Creative Cloud) | Free |
| Max resolution | Unlimited (local) | 12 MP | 4096x4096 | Unlimited (local) | Unlimited (local) |
| Batch processing | Yes (script) | Yes (web UI) | No | Yes (Actions) | No |
| Runs locally | Yes (GPU recommended) | No (cloud) | No (cloud) | Yes | Yes |
| Privacy | Full (local) | Images uploaded to servers | Images uploaded to servers | Full (local) | Full (local) |
| Quality (typical) | High | High (faces) | Very high (scenes) | Medium-high | Depends on skill |
| License | MIT | Proprietary | Proprietary | Proprietary | GPL v3 |
| Version | v0.5.1 (2024) | Web (2025) | Web (2025) | Photoshop v25.12 (2025) | GIMP v2.10.38 |
| Best for | Privacy-conscious users, developers | Family photos, portraits | Landscapes, scenes, artistic results | Photographers already in Adobe ecosystem | Full control, historically accurate work |
Method 1: DeOldify (Free, Open Source)
DeOldify is an open-source GAN-based colorization model released under the MIT license. It runs locally on your machine, meaning your photos never leave your computer.
Setup (Python Required)
DeOldify v0.5.1 requires Python 3.8+ and benefits enormously from a CUDA-capable GPU. CPU-only inference works but takes 30-60 seconds per image instead of 2-3 seconds.
```shell
git clone https://github.com/jantic/DeOldify.git
cd DeOldify
pip install -r requirements.txt
```
Download the pretrained model weights (artistic and stable models are available — artistic produces more vivid colors, stable is more conservative):
```shell
mkdir models
# Artistic model (recommended for photos)
wget https://data.deepai.org/deoldify/ColorizeArtistic_gen.pth -O models/ColorizeArtistic_gen.pth
```
Colorize a Photo
```python
from deoldify import device
from deoldify.device_id import DeviceId
device.set(device=DeviceId.GPU0)  # Use DeviceId.CPU if no GPU

from deoldify.visualize import get_image_colorizer

colorizer = get_image_colorizer(artistic=True)
result = colorizer.plot_transformed_image(
    path="old_photo.jpg",
    render_factor=35,  # Higher = more detail, more VRAM. Range: 7-45
    display_render_factor=True,
)
```
The render_factor parameter controls the resolution at which the neural network processes the image. Higher values capture finer color detail but require more GPU memory. Start at 35 and reduce if you hit out-of-memory errors.
When to Use DeOldify
Pick DeOldify when privacy matters (photos stay local), when you need batch processing (script it with a loop), or when you want to experiment with render factors to find the best result. The MIT license means you can use it commercially with no restrictions.
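A batch loop might look like the sketch below. It assumes the colorizer object exposes a get_transformed_image(path, render_factor) method returning a PIL-style image, which recent DeOldify versions do; treat the exact method name as an assumption and check it against your installed version.

```python
from pathlib import Path

def batch_colorize(colorizer, in_dir, out_dir, render_factor=35):
    """Colorize every .jpg/.png in in_dir and save results to out_dir.

    `colorizer` is assumed to expose get_transformed_image(path, render_factor)
    returning an object with a .save() method (PIL-style), as DeOldify's
    image colorizer does; verify the method name against your version.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    done = []
    for src in sorted(Path(in_dir).glob("*.[jp][pn]g")):
        result = colorizer.get_transformed_image(str(src), render_factor=render_factor)
        result.save(str(out / src.name))  # keep the original filename
        done.append(src.name)
    return done
```

Pass in the get_image_colorizer(artistic=True) instance from the snippet above and point it at a folder of scans.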
Method 2: MyHeritage InColor (Best for Portraits)
MyHeritage InColor is a cloud-based colorization tool built into the MyHeritage genealogy platform. It is specifically optimized for faces and family photos — the face detection and skin-tone rendering are noticeably better than general-purpose tools.
How to Use It
- Go to myheritage.com/incolor.
- Upload your black-and-white photo (drag and drop or browse).
- Wait 10-15 seconds for processing.
- Download the colorized result.
You get 10 free colorizations without an account. After that, a MyHeritage Complete plan ($179/year) includes unlimited colorizations along with genealogy features.
Strengths and Limitations
MyHeritage excels at portraits. Skin tones are consistently natural, eyes get realistic color, and hair looks convincing. The model handles formal portrait lighting (the kind used in old studio photos) particularly well.
It struggles more with landscapes, group photos where faces are small, and images with complex backgrounds. For non-portrait photos, Palette.fm generally produces better results.
Method 3: Palette.fm (Best Overall Quality)
Palette.fm uses a diffusion-based model that consistently produces the most natural-looking colorizations across a wide range of subjects. Landscapes, street scenes, architecture, and portraits all look convincing.
How to Use It
- Go to palette.fm.
- Upload your image.
- Browse the generated color variations — Palette.fm produces multiple palette suggestions.
- Download your preferred version.
The free tier adds a small watermark. The Pro plan ($15/month) removes watermarks and provides higher-resolution output up to 4096x4096 pixels.
The Palette Approach
What makes Palette.fm different is its suggestion system. Instead of a single colorized output, it generates several interpretations using different color palettes. One version might give a photo warm, golden tones (like late afternoon light), while another goes cooler and more muted. This is useful when you do not know the original colors — you can pick the interpretation that feels most historically plausible or most aesthetically pleasing.
The diffusion architecture also handles gradients and transitions between colored regions more smoothly than GAN-based tools, reducing the visible seams and color bleeding that sometimes appear in DeOldify or MyHeritage results.
Method 4: Adobe Photoshop Neural Filters (v25.12)
Photoshop's Colorize neural filter (part of Adobe Sensei) integrates directly into the Photoshop editing workflow. This matters because you can colorize, then immediately make manual corrections with Photoshop's full toolset — the best of both worlds.
Step-by-Step
- Open your grayscale image in Photoshop v25.12 (requires Creative Cloud, $22.99/month).
- Convert to RGB: Image > Mode > RGB Color (the filter requires RGB mode).
- Open Filter > Neural Filters.
- Enable Colorize (download the model on first use, ~300 MB).
- Adjust the Auto Color result using the focal point color pickers — click areas of the image and assign a color hint.
- Check Output as New Layer to keep the colorization non-destructive.
- Click OK to apply.
Manual Correction After AI
The real power of Photoshop's approach is what comes after the neural filter runs. Add a Hue/Saturation adjustment layer clipped to the colorized layer. Use the eyedropper to select specific color ranges and shift them. If the AI made grandma's dress blue but you know it was red, select the blue range and rotate the hue.
For more targeted corrections, paint on a layer mask to isolate specific regions and adjust their color independently.
This hybrid workflow — AI for the base colorization, manual for known corrections — produces the most historically accurate results when you have reference information about the original colors.
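The select-a-hue-range-and-rotate correction can also be expressed in code. The toy per-pixel sketch below uses Python's standard colorsys module, not Photoshop scripting; it just illustrates the blue-dress-to-red-dress operation:

```python
import colorsys

def shift_hue_range(pixels, target_hue, tolerance, shift):
    """Rotate the hue of any pixel whose hue lies within `tolerance`
    of `target_hue`. Hues are fractions of a turn in [0, 1);
    pixels are (r, g, b) floats in [0, 1]."""
    out = []
    for r, g, b in pixels:
        h, l, s = colorsys.rgb_to_hls(r, g, b)
        dist = min(abs(h - target_hue), 1 - abs(h - target_hue))  # circular distance
        if s > 0 and dist <= tolerance:
            h = (h + shift) % 1.0
        out.append(colorsys.hls_to_rgb(h, l, s))
    return out

# The wrongly-blue dress: push blue pixels (hue ~2/3) a third of a turn to red
dress = [(0.2, 0.2, 0.8)]
fixed = shift_hue_range(dress, target_hue=2/3, tolerance=0.1, shift=1/3)
```

Because lightness and saturation are untouched, the corrected region keeps the photo's original shading, which is exactly what the Hue/Saturation adjustment layer does interactively.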
Method 5: Manual Colorization in GIMP (v2.10.38)
Manual colorization gives you complete control. Every color is a deliberate choice. This is the only method that can produce truly historically accurate results — if you know what the original colors were.
GIMP v2.10.38 is free and open source (GPL v3). It runs on Windows, macOS, and Linux.
Step-by-Step
- Open the grayscale image in GIMP v2.10.38.
- Convert to RGB: Image > Mode > RGB.
- Create a new transparent layer above the photo: Layer > New Layer > Transparency.
- Set the new layer's blend mode to Color (in the Layers panel dropdown).
- Select the Paintbrush tool, pick a color, and paint over a region.
- The Color blend mode applies your chosen hue and saturation while preserving the original luminance from the grayscale layer below.
- Create separate layers for skin, clothing, background, and other distinct regions. This lets you adjust each color independently later.
- Lower opacity on individual layers if colors look too intense.
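Conceptually, the Color blend mode in steps 4-5 combines the paint layer's hue and saturation with the photo's lightness. A rough per-pixel approximation in an HSL model (GIMP's actual blend math may differ in detail):

```python
import colorsys

def color_blend(base_gray, paint_rgb):
    """Approximate a 'Color' layer mode in HSL: lightness comes from the
    grayscale base pixel, hue and saturation from the paint color.
    (Illustrative; GIMP's real implementation may differ in detail.)"""
    h, _, s = colorsys.rgb_to_hls(*paint_rgb)
    return colorsys.hls_to_rgb(h, base_gray, s)

# Pure red painted over mid-gray and dark-gray pixels: same hue either way,
# brightness follows the underlying photo
mid = color_blend(0.5, (1.0, 0.0, 0.0))    # a medium red
dark = color_blend(0.25, (1.0, 0.0, 0.0))  # a darker red
```

This is why a single flat brushstroke still shows the folds and shadows of the original: the luminance variation underneath survives the blend.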
Tips for Realistic Manual Colorization
- Use reference photos. Search for color photos from the same era, location, or subject to guide your color choices.
- Skin tones are subtle. Pure orange or pink looks fake. Sample skin tones from real photos: in HSL terms, lighter skin typically lands around hue 20°, saturation 40-60%, lightness 60-80%.
- Shadows are not gray. In real photos, shadows take on the complement of the ambient light color. Daylight shadows lean slightly blue. Tungsten-lit shadows lean slightly cool.
- Desaturate backgrounds. Foreground subjects should have more color saturation than backgrounds. This creates natural depth.
- Vary hue within regions. A red dress is not one flat red — folds catch more or less light, creating warmer and cooler variations.
Manual colorization of a single photo typically takes 2-6 hours depending on complexity. For anything more than a handful of photos, AI tools are the practical choice.
Preparing Your Photo Before Colorization
AI colorization works best on clean, high-contrast input. Spending a few minutes on preparation dramatically improves results.
Scan quality matters. If you are digitizing a physical print, scan at a minimum of 600 DPI. Higher resolution gives the neural network more luminance detail to work with, which translates to better color predictions. Save as TIFF or PNG — JPEG compression destroys the subtle tonal variations that colorization models rely on.
Fix damage first. Scratches, tears, and stains confuse colorization models. The model might interpret a scratch as an edge between two objects and assign different colors on each side. Restore the image before colorizing. Our old photo restoration guide covers this in detail.
Adjust contrast. If the photo is heavily faded, boost the contrast so the model can distinguish between light and dark regions. Too flat and the model has nothing to work with. Too harsh and you lose the mid-tone transitions where color variation lives.
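A percentile-based contrast stretch is a simple way to do this programmatically. The sketch below is illustrative, not taken from any specific tool; it maps the 2nd-98th percentile range of a faded scan onto the full tonal range:

```python
import numpy as np

def stretch_contrast(gray, low_pct=2, high_pct=98):
    """Map the low_pct..high_pct percentile range of a faded grayscale
    scan onto the full 0-255 range, clipping the extremes."""
    lo, hi = np.percentile(gray, [low_pct, high_pct])
    out = (gray - lo) / max(hi - lo, 1e-6)
    return np.clip(out, 0.0, 1.0) * 255.0

# Simulated faded scan: all tones squeezed into the 100-160 range
faded = np.linspace(100, 160, 100).reshape(10, 10)
stretched = stretch_contrast(faded)
print(stretched.min(), stretched.max())  # → 0.0 255.0
```

Using percentiles rather than the raw min/max keeps a few specks of dust or a bright scratch from throwing off the whole stretch.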
Once your photo is colorized, you may want to resize it for sharing or convert the format for web use. For quick adjustments to color balance after colorization, Pixotter's color tools can help fine-tune the output without a full editor.
Limitations You Should Know
AI colorization is impressive but not magic. Understanding the limitations helps you set realistic expectations and know when to intervene manually.
Color accuracy is probabilistic, not factual. The model assigns the most statistically likely color for each region. A military uniform gets olive drab because most military uniforms in the training data were olive drab — even if the specific uniform was navy blue. There is no way for the model to recover colors that were never recorded.
Skin tone bias exists. Most training datasets skew toward lighter skin tones, which means colorization accuracy is lower for darker skin. Results may appear washed out or unnatural. This is an active area of research, but no current tool has fully solved it.
Unusual lighting defeats the model. Photos taken under colored stage lighting, neon signs, or heavy shadow produce unreliable results because the luminance values do not correspond to the object's actual color.
Fine patterns and textures get muddy. Plaid shirts, patterned wallpaper, and multicolored details below a certain size threshold all tend to get painted a single averaged color. The model cannot resolve patterns smaller than its effective receptive field.
Consistency across a series is not guaranteed. Colorizing five photos of the same person may produce five different skin tones and hair colors. If consistency matters (a family album, for example), pick one result you like and use it as a color reference for manual corrections on the others.
For related color manipulation techniques, see our guides on grayscale image conversion and making images black and white.
FAQ
Can AI colorize any black-and-white photo?
AI can process any grayscale image, but quality varies. Clear subjects with good contrast produce the best results. Heavily damaged, low-resolution, or extremely dark photos yield poor colorization because the model has insufficient luminance data to predict colors from.
Are the colors historically accurate?
No. AI colorization is a statistical prediction, not a recovery of original colors. The model assigns the most probable colors based on its training data. For historically accurate results, use manual colorization with reference photos from the same era and location.
Which free tool produces the best results?
DeOldify (MIT license, free, runs locally) produces consistently good results across photo types. Palette.fm's free tier is excellent for quick one-off colorizations but adds a watermark. For portraits specifically, MyHeritage InColor's 10 free colorizations are hard to beat.
Do I need a GPU for AI colorization?
Only for DeOldify running locally. Cloud-based tools (MyHeritage, Palette.fm) handle processing on their servers. Photoshop's neural filter uses your GPU if available but falls back to CPU. GIMP's manual method has no GPU requirement.
Can I colorize a photo and then convert it back to black and white?
Yes. Colorize first, then convert to black and white using any grayscale method. The round-trip is not lossless — you will get a different grayscale than the original because the colorized version has different tonal relationships — but it works for creative experimentation.
How do I colorize a batch of photos?
DeOldify supports scripted batch processing through Python. Photoshop can batch-process via Actions (record the neural filter step, then run the Action on a folder). MyHeritage supports batch uploads through its web interface. Palette.fm and GIMP are single-image workflows.
Why does my colorized photo look washed out?
Three common causes: the input image had low contrast (fix with levels/curves before colorizing), the tool's render factor or quality setting was too low (increase it), or the image was JPEG-compressed before colorization (use TIFF or PNG input for best results).
Is it legal to colorize and publish someone else's old photo?
Copyright depends on the original photo, not the colorization process. In the United States, photos published more than 95 years ago are in the public domain (the cutoff advances every January 1). More recent photos may still be under copyright. Adding color does not create a new copyright in the underlying image. If you are unsure, research the specific photo's copyright status before publishing.
What to Do After Colorizing
A freshly colorized photo often needs a few finishing touches:
- Adjust color balance. Most AI tools produce slightly cool results. A gentle warm shift (+5-10 on the temperature slider) makes photos from the pre-digital era feel more period-appropriate.
- Reduce saturation slightly. AI colorization tends to oversaturate. Pulling saturation back 10-15% yields more natural results, especially for skin tones.
- Sharpen edges. The colorization process can soften boundaries between color regions. Light unsharp masking (Amount: 50-80%, Radius: 1-2px) restores definition.
- Resize for your target platform. A 600 DPI scan colorized at full resolution produces a massive file. Resize to the dimensions you actually need.
- Save in the right format. For print, save as TIFF. For web sharing, convert to WebP or optimized JPEG for smaller file sizes without visible quality loss.
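The saturation pull-back from the list above can be scripted per pixel with Python's standard colorsys module. This is an illustrative sketch, not any editor's actual implementation:

```python
import colorsys

def reduce_saturation(rgb, amount=0.12):
    """Scale a pixel's HSV saturation down by `amount` (0.12 = 12%),
    leaving hue and brightness alone. rgb: (r, g, b) floats in [0, 1]."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb(h, s * (1.0 - amount), v)

# An oversaturated AI skin tone, pulled back 15%
toned = reduce_saturation((0.9, 0.5, 0.4), amount=0.15)
```

Because only the S channel is scaled, brightness and hue stay put, which is what makes this safer for skin tones than a blanket vibrance cut.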
If you are working with a collection of old family photos, pair colorization with the full restoration workflow in our old photo restoration guide — fix damage first, then add color. The order matters: colorization models produce cleaner results on undamaged input.