
Midjourney Character Reference (--cref) Is Live: Practical Workflow for Keeping the Same Person Consistent Across Multiple Images

3/5/2026

Midjourney recently added a “character reference” feature that makes it easier to keep the same person’s face consistent across a series of generated images. The core method is to use --cref to lock in a source character image, then use --cw to control similarity, so you can maintain consistency in facial features, body shape, and clothing across different scenes.

What pain points does this update solve?

In Midjourney, even with the same prompt, multiple generations can still “look different,” which is especially troublesome when creating serialized posters, storyboards, or brand characters. The goal of --cref is to pull the “character” out of that randomness, letting Midjourney use the reference image you provide as an anchor while it varies the scene.

Note that this capability is aimed at character consistency in generated images, not a true photo-level replica. If you’re aiming for strict, real-person-level consistency, Midjourney may still drift in fine details.

How to use --cref: lock the character first

The workflow is straightforward: prepare a character image you approve of, upload it to Discord, and copy the image link. Then append “--cref image_link” to the end of your /imagine prompt, and Midjourney will treat that image as the character reference.

Example: /imagine a female detective running through neon streets on a rainy night, cinematic lighting --cref https://... If you want the same character to change outfits or hairstyles, state exactly what changes in the prompt, but don’t change too many things at once; stability will be better.
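Midjourney is driven through Discord prompts rather than a programming API, but the workflow above can still be sketched as a small prompt-builder. The function name and placeholder URL below are illustrative, not part of any official Midjourney tooling:

```python
def build_prompt(scene, cref_url, cw=None):
    """Assemble a /imagine prompt text with a character reference.

    --cref locks the reference image; --cw (0-100, optional) controls
    how strictly Midjourney follows it. Flags go at the end of the prompt.
    """
    parts = [scene, f"--cref {cref_url}"]
    if cw is not None:
        parts.append(f"--cw {cw}")
    return " ".join(parts)

# The article's example, with a placeholder URL:
print(build_prompt(
    "a female detective running through neon streets on a rainy night, "
    "cinematic lighting",
    "https://example.com/detective.png",
))
```

The resulting string is what you would paste after /imagine in Discord.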

How to tune the --cw parameter: should it “look alike” or “perform”?

--cw (character weight) adjusts how similar the result is to the reference image, on a scale of 0 to 100 (100 is the default). You can think of it as “character constraint strength.” If you want Midjourney to stay closer to the original character, including hair and clothing, raise --cw; if you want Midjourney to keep the general traits (mainly the face) but allow more stylized variation, lower --cw.

Example: /imagine the same female detective flipping through files in an office, film grain --cref https://... --cw 70. When making a series, it’s recommended to first use the same prompt structure and the same --cw to generate 3–5 images, pick the most stable one, and then use it as the new --cref for a second round of iteration.
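The series workflow can be sketched the same way: one prompt per scene, reusing the same character description, reference URL, and --cw. This helper (names are illustrative) also clamps --cw to Midjourney’s documented 0–100 range:

```python
def series_prompts(character_desc, scenes, cref_url, cw=70):
    """Build one /imagine prompt per scene, keeping the character
    description, --cref URL, and --cw identical so the series stays
    stable. --cw is clamped to the valid 0-100 range."""
    cw = max(0, min(100, cw))
    return [
        f"{character_desc} {scene} --cref {cref_url} --cw {cw}"
        for scene in scenes
    ]

scenes = [
    "flipping through files in an office, film grain",
    "interrogating a suspect under a single lamp",
    "walking out of the precinct at dawn",
]
for prompt in series_prompts("the same female detective", scenes,
                             "https://example.com/detective.png"):
    print(prompt)
```

After the first round, pick the most stable result and substitute its URL as the new cref_url for the next iteration.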

Practical tips and common pitfalls

First, choose a reference image with a clear front-facing view, minimal occlusion, and even lighting, so Midjourney can capture stable features more easily. Second, avoid extreme jumps in scene or style (e.g., from realistic to strongly abstract), as Midjourney may mistake style changes for character changes.

If you find Midjourney still changes the face, check three things first: whether the reference image’s resolution is too low, whether your prompt includes descriptions that alter facial features (such as “different age” or “different ethnicity”), and whether --cw is set too low. Tighten these up first, then gradually reopen creative freedom; this is usually more stable.
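The three-point checklist above can be turned into a quick pre-flight check before you submit a prompt. The thresholds here (a 512 px minimum side, a --cw floor of 50) are illustrative assumptions, not official Midjourney limits:

```python
def diagnose(ref_width, ref_height, prompt, cw,
             min_side=512,
             risky_terms=("different age", "different ethnicity")):
    """Flag the three common causes of face drift: a low-resolution
    reference image, feature-altering prompt terms, and a low --cw.
    Thresholds are illustrative, not documented Midjourney limits."""
    issues = []
    if min(ref_width, ref_height) < min_side:
        issues.append("reference image resolution may be too low")
    lowered = prompt.lower()
    for term in risky_terms:
        if term in lowered:
            issues.append(f"prompt alters facial features: '{term}'")
    if cw < 50:
        issues.append("--cw may be too low to hold the face")
    return issues

for issue in diagnose(400, 400, "a different age woman, oil painting", 30):
    print("check:", issue)
```

An empty result doesn’t guarantee consistency, but clearing all three flags before regenerating usually saves iterations.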
