
Midjourney New Features Explained: Master Character Reference & Omni-Reference with Pro Tips

4/26/2026

Have you ever tried generating multiple images with AI only to find the same character’s face shape and outfit completely mismatched? Midjourney’s recently launched Character Reference (--cref) and Omni-Reference features are designed to solve this exact pain point. Now, with just one reference image, you can keep a character highly consistent across different scenes, and even “transport” any object or living creature into your artwork. This article dives deep into both new features, along with practical parameter tips.

What is the Character Reference (--cref) Feature?

Character Reference is one of the most important updates introduced in Midjourney V6. In the past, even when using the same prompt, the facial features, body proportions, and clothing of generated characters could change dramatically from one output to the next. Now, by adding --cref <image URL> at the end of your prompt, Midjourney extracts the character’s traits from the reference image, ensuring that subsequent scene generations retain the original face, body shape, and clothing style. For example, if you have a selfie and want to turn it into illustrations across different settings, --cref makes it easy.
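To make the syntax concrete, here is a minimal sketch in Python of how such a prompt string can be assembled. The helper function and URL are illustrative assumptions, not part of any Midjourney API; the only thing taken from the feature itself is the `--cref <image URL>` syntax:

```python
def with_character_ref(prompt: str, image_url: str) -> str:
    """Return a Midjourney prompt with a Character Reference appended.

    Midjourney reads parameters from the end of the prompt text, so
    --cref simply follows the descriptive part.
    """
    return f"{prompt} --cref {image_url}"

# Reuse one selfie across different scenes:
print(with_character_ref(
    "a knight on a castle wall at dawn, watercolor style",
    "https://example.com/selfie.png",  # placeholder URL
))
```

The same reference URL can then be carried through every scene prompt in a series, which is what keeps the character consistent.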

Control Character Consistency with the --cw Parameter

Sharp-eyed users may notice that Character Reference isn’t a 100% copy: it allows you to adjust the reference strength with the --cw parameter. --cw ranges from 0 to 100, with a default of 100. At 0, Midjourney references only the character’s facial features; at 100, it also preserves hair, clothing, and other details. If you want to keep the face consistent while freely changing the character’s outfit or scene, set --cw between 20 and 40. This keeps the character recognizable while leaving room for AI creativity.
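Since an out-of-range weight is simply invalid, it can be validated before the prompt is sent. A small sketch, where the function name is hypothetical and only the 0–100 range for --cw comes from the feature itself:

```python
def with_character_weight(prompt: str, image_url: str, cw: int = 100) -> str:
    """Append --cref plus a --cw weight, rejecting values outside 0-100."""
    if not 0 <= cw <= 100:
        raise ValueError("--cw must be between 0 and 100")
    return f"{prompt} --cref {image_url} --cw {cw}"

# Keep the face consistent but let outfit and scene vary:
print(with_character_weight(
    "an astronaut drifting through a neon city",
    "https://example.com/selfie.png",  # placeholder URL
    cw=30,
))
```

A mid-low weight like 30 is the sweet spot described above: the face stays recognizable while clothing and setting are free to change.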

Omni-Reference Feature Explained

Beyond human characters, Midjourney has also introduced the more powerful Omni-Reference system. This feature is no longer limited to people: you can upload images of any object, animal, or even vehicle. By adding --oref <image URL> (Omni-Reference’s own parameter, not to be confused with the --sref style reference) or dragging the image into the interface, the AI can “transplant” the reference object’s shape, texture, or style into a new image. For instance, use a photo of a plush toy as a reference to generate a panda with the same texture, or reference a vintage car’s design to create a retro-style spaceship. Omni-Reference greatly expands the creative boundaries of AI art, making it especially useful for character design, product prototyping, and concept art.
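Omni-Reference follows the same end-of-prompt pattern, using --oref for the image and --ow for its weight. A sketch under those assumptions (the helper name and URL are illustrative; the default weight of 100 here is this sketch’s choice):

```python
def with_omni_ref(prompt: str, image_url: str, ow: int = 100) -> str:
    """Append Omni-Reference parameters: --oref for the image, --ow for weight."""
    return f"{prompt} --oref {image_url} --ow {ow}"

# "Transplant" a plush toy's texture onto a new subject:
print(with_omni_ref(
    "a giant panda sitting in a bamboo forest",
    "https://example.com/plush-toy.png",  # placeholder URL
))
```

Raising --ow pushes the output closer to the reference object; lowering it treats the reference as loose inspiration.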

Practical Applications & Tips

To get the most out of these two new features, here are a few key tips. First, the quality of your reference image directly affects the output: choose front-facing, evenly lit, clean-background shots. Second, if using both --cref and --sref at the same time, place the parameters carefully: --cref for the character, --sref for style or objects; they work together. Finally, don’t forget to leverage Midjourney’s official prompt templates, such as clicking “Give Me Inspiration” and selecting “Draw Me a Portrait,” which automatically embed the Character Reference parameter to lower the learning curve. Give it a try now and let your characters come to life across multiple story scenes.
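Putting the pieces together, a combined prompt with both a character and a style reference might be assembled like this (a sketch; the helper and example URLs are hypothetical, and what matters to Midjourney is only that all parameters sit at the end of the prompt text):

```python
from typing import Optional

def build_prompt(text: str,
                 cref: Optional[str] = None, cw: Optional[int] = None,
                 sref: Optional[str] = None) -> str:
    """Assemble a prompt combining --cref/--cw (character) and --sref (style)."""
    parts = [text]
    if cref:
        parts.append(f"--cref {cref}")
        if cw is not None:
            parts.append(f"--cw {cw}")
    if sref:
        parts.append(f"--sref {sref}")
    return " ".join(parts)

print(build_prompt(
    "a detective in a rain-soaked alley",
    cref="https://example.com/selfie.png",      # placeholder URLs
    cw=30,
    sref="https://example.com/noir-style.png",
))
```

Keeping --cw next to its --cref makes the character settings easy to read and tweak as a unit when iterating on a series of scenes.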
