Midjourney has recently made “character consistency” much more practical: use --cref (character reference) to pin the character down, then use --cw to control “how similar it should be.” If you’re drawing serialized storyboards or posters of the same protagonist in different scenes, this feature can significantly reduce the time spent rerolling results.
1) What is --cref: Keep the same character across multiple images
In Midjourney, it’s hard to guarantee you’ll get the exact same face every time using text alone. The idea behind --cref URL is: provide it with a “character image,” and subsequent generations will try to reuse that character’s facial features, body shape, and overall look.
Note that the official guidance also emphasizes it works best with characters generated by Midjourney; feeding it real photographs of people tends to produce distorted rather than faithful likenesses. Treat it as a "character design reference," and your success rate will be higher.
2) Prepare a character reference image: Get a usable image URL
The most reliable approach is to first generate a character image in Midjourney that you’re satisfied with as the “base image.” In Discord, upload that image or directly reference the image you generated; open the original and copy the image link (URL).
If you’re using Midjourney on the web, you can usually achieve the same workflow by dragging the image into the prompt box and setting it as a “character reference”; the key is still to make the system clearly recognize that this image is the character reference.
3) Core usage: Put --cref at the end of the prompt
Midjourney expects parameters at the very end of the prompt; placed anywhere else, they may not be parsed at all. A format you can copy directly:
/imagine cinematic medium shot, rainy city night, character holding an umbrella and looking back, neon reflections, 35mm, shallow depth of field --cref your character image URL --cw 80
If you want to place the same character into different shots, change only the scene description at the front and keep the same --cref. This works well for poster series, character turnarounds, and consecutive storyboard frames, where Midjourney stays noticeably more consistent.
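The "swap the scene, keep the --cref" workflow is easy to batch. Below is a minimal Python sketch that assembles a series of prompts sharing one character reference; the URL and scene descriptions are placeholders, not real assets.

```python
# Sketch: batch-building Midjourney prompts that share one character reference.
# CHAR_URL and the scene list are illustrative placeholders.
CHAR_URL = "https://cdn.example.com/my-character.png"

SCENES = [
    "cinematic medium shot, rainy city night, neon reflections, 35mm",
    "bright morning cafe, window light, candid close-up",
    "windy rooftop at dusk, wide shot, city skyline behind",
]

def build_prompt(scene: str, cref_url: str, cw: int = 80) -> str:
    """Append character parameters at the end, where Midjourney expects them."""
    return f"/imagine {scene} --cref {cref_url} --cw {cw}"

prompts = [build_prompt(scene, CHAR_URL) for scene in SCENES]
for p in prompts:
    print(p)
```

Each generated line can be pasted into Discord as-is; only the scene text varies, so the character parameters stay identical across the whole series.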
4) How to tune --cw: The key dial for similarity strength
--cw ranges from 0 to 100, with a default of --cw 100, which reuses the most information from the reference image: face, hairstyle, and clothing. If you want "the same face but a different outfit," lower --cw, for example to --cw 20 or even --cw 0, so it focuses on facial traits.
A practical rule of thumb: use --cw 70–100 for everyday serialized character images; for changing outfits, hairstyles, or time-period settings, start with --cw 20–50. In Midjourney, it’s more efficient to get the “does it look like them?” part right first, then talk about style and details.
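The rule of thumb above can be written down as a tiny lookup helper. The goal names and exact starting values here are this article's suggestions encoded for convenience, not official Midjourney settings.

```python
# Illustrative helper: map a common goal to a starting --cw value (0-100).
# Buckets and numbers follow the article's rule of thumb, not official docs.
def suggest_cw(goal: str) -> int:
    table = {
        "serialized": 90,   # same character, same look, across many shots (70-100 range)
        "new_outfit": 35,   # keep the face, change clothes/hair/era (20-50 range)
        "face_only": 0,     # reuse facial traits only
    }
    return table[goal]

print(f"--cw {suggest_cw('new_outfit')}")  # a starting point; tune from there
```

Treat the returned number as a first guess, then nudge it up or down based on how much of the reference's look survives in your results.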
5) Advanced tips: Mixing multiple URLs and common pitfalls
Midjourney supports providing multiple URLs in --cref to blend features from different references (e.g., face shape from A, hair color and vibe from B). But the more you mix, the more “averaged out” it can become—start with two, confirm stability, then add more.
Also, character consistency isn’t “100% locked.” Lighting, exaggerated expressions, and occlusions can all affect results. For more stability, prioritize using a clear front-facing or half-body character image as --cref, and avoid stuffing in too many abstract style terms at the start—let Midjourney get the character firmly established first.
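Multiple references are passed to a single --cref as space-separated URLs. A small sketch of assembling such a parameter string, with placeholder URLs:

```python
# Sketch: combining multiple reference URLs in one --cref (space-separated).
# All URLs below are placeholders for your own image links.
def build_cref(urls: list[str], cw: int = 80) -> str:
    """Join reference URLs after a single --cref flag, then append --cw."""
    return f"--cref {' '.join(urls)} --cw {cw}"

refs = [
    "https://cdn.example.com/face-ref.png",
    "https://cdn.example.com/hair-ref.png",
]
print(f"/imagine heroic portrait, soft studio light {build_cref(refs)}")
```

As the text advises, start with two references and only add a third once the blend is stable; every extra URL averages the character further.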