
Midjourney New Feature Breakdown: Use --cref to Lock in the Same Character and Reference Anything

3/2/2026

This round of Midjourney updates brings its most practical improvement yet: noticeably stronger character consistency and reference capability. In the past, when making serialized posters, storyboards, or multiple images of the same character, the biggest headache was the face and outfit drifting between images. Below is a breakdown of the new features, explained so you can use them right away.

Use --cref to “pin down” the same character across a series of images

Midjourney’s newly added character reference parameter is --cref (character reference). Its core purpose is to let you reuse the same person’s facial features, body shape, and overall recognizability across different scenes. The method is simple: prepare a character image you approve of, then append --cref followed by the image’s URL to the end of your prompt. When generating multiple images with different compositions, including the same --cref each time makes the character far more likely to remain consistent.

Example: cinematic portrait, rainy street, neon light --cref https://.../role.png. It’s recommended to choose a character image with a “front-facing view, clean lighting, and minimal occlusion”—the new Midjourney feature is more stable with this kind of reference image.
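The pattern above is easy to automate when you are producing many prompts. Here is a minimal sketch of a helper that appends a character reference to a base prompt; the function name and the reference URL are hypothetical, not part of Midjourney itself.

```python
def with_cref(prompt: str, ref_url: str) -> str:
    """Append a character reference to a Midjourney prompt.

    Reusing the same ref_url across prompts is what keeps the
    character consistent between images.
    """
    return f"{prompt} --cref {ref_url}"


# Hypothetical reference image; any approved character image works.
REF = "https://example.com/role.png"

print(with_cref("cinematic portrait, rainy street, neon light", REF))
# cinematic portrait, rainy street, neon light --cref https://example.com/role.png
```

Because the reference is just a trailing parameter, the same helper works for any scene description you want to pair with the locked character.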

Use --cw to control “how similar,” so the character doesn’t stiffen into a cutout

Paired with --cref is the --cw parameter, which adjusts the weight of the character reference (think of it as "how similar it should be"); documented values run from 0 to 100, with 100 as the default. If the character is clearly the same person but you can't vary expressions, hairstyle, or clothing, lower --cw a bit to let the feature balance consistency with flexibility. Conversely, if the character starts to drift, turn --cw up a little.

Example: full body, in a medieval market, warm sunlight --cref https://.../role.png --cw 70. When making a series, first run three test images with the same --cw; once the results are stable, scale up the batch for higher efficiency.
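The "three test images at the same --cw" step can be sketched as follows. This is a hypothetical helper, assuming the documented 0–100 range for --cw; the scene list and URL are placeholders.

```python
def with_character(prompt: str, ref_url: str, cw: int = 100) -> str:
    """Build a prompt with a character reference and similarity weight.

    Assumption: out-of-range weights are clamped here to Midjourney's
    documented 0-100 range rather than passed through.
    """
    cw = max(0, min(100, cw))
    return f"{prompt} --cref {ref_url} --cw {cw}"


# Three test compositions at the same weight, as suggested above.
scenes = [
    "full body, in a medieval market, warm sunlight",
    "close-up portrait, candlelit tavern",
    "walking away, foggy castle gate",
]
REF = "https://example.com/role.png"  # hypothetical reference image

for scene in scenes:
    print(with_character(scene, REF, cw=70))
```

Keeping cw fixed across the test batch isolates the one variable you are evaluating: whether that weight holds the character together across compositions.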

Reference anything: put “this object/character” into your scene

Another noteworthy new Midjourney feature is “Omni-Reference (reference anything),” which extends referencing from “people” to objects, vehicles, or non-human creatures. In practice, you provide a reference image and then use the prompt to describe the scene and style you want; the system will be more inclined to preserve the key appearance traits of the referenced subject. This is especially friendly for e-commerce props, reusing product shapes, and serialized IP monster content.

Recommended approach: keep the reference image as clear as possible, with a single main subject and a clean background; in the prompt, be specific about the “environment, camera, and lighting,” and leave the “subject” to be constrained by the reference image—this makes it easier for the new Midjourney feature’s advantages to shine.

Common beginner pitfalls: why drift can still happen

Even with the new Midjourney features, drift can still occur. Common causes are an unstable reference image (heavy filters, occlusion, a side profile, exaggerated makeup) or a prompt with conflicting descriptions (for example, asking for "short hair" while also requiring "long curly hair"). If you want both "same face + major outfit changes," lock the face first and then change gradually: fix --cref to produce a stable version, then iterate with clearer clothing keywords. Once you get this workflow down, you'll find the new features give series creation a very noticeable boost.
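The "lock the face first, then vary the outfit" workflow amounts to holding --cref and --cw fixed while swapping only the clothing keywords. A minimal sketch, with hypothetical names and a placeholder URL:

```python
def outfit_series(base: str, ref_url: str, outfits: list[str], cw: int = 85) -> list[str]:
    """One prompt per outfit, with the face locked via a fixed --cref.

    Only the clothing keywords change between prompts; the reference
    and weight stay constant so the character does not drift.
    """
    return [f"{base}, wearing {o} --cref {ref_url} --cw {cw}" for o in outfits]


prompts = outfit_series(
    "studio portrait, soft light",
    "https://example.com/role.png",  # hypothetical reference image
    ["a red trench coat", "a denim jacket", "a black hoodie"],
)
for p in prompts:
    print(p)
```

Note that each generated prompt states the outfit explicitly and avoids contradicting it elsewhere, which sidesteps the conflicting-description pitfall described above.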
