
Interpretation of Midjourney’s New Features: Style Reference and Advanced Editing Made Easier to Use

2/27/2026

Midjourney has recently refined two areas of its web version, reference and editing, in a more granular way: Style Reference (Sref) makes styles reusable, while Advanced Editing turns local adjustments into a controllable workflow. This article takes a hands-on look at both so you can put Midjourney's new features to work in your actual image generation.

Style Reference (Sref) lets the style follow the image

Midjourney’s Style Reference (Sref) has become the default capability for V7 tasks: you can drag a reference image into the “Style Reference” area in the prompt bar, allowing Midjourney to inherit that image’s color palette, texture, and overall aesthetic direction. Another method is to append --sref URL at the end of the prompt (requires an accessible image link), which is better suited for templated, batch image generation.

In real work, Sref is best used for needs like “same-series posters / same-brand visuals”: first set a stable style master image, then let Midjourney expand with different copy and subjects—consistency will become noticeably stronger.
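The templated, batch use described above can be sketched in Python. This is only an illustration, not an official Midjourney API: the snippet assembles prompt strings that you would paste into Midjourney yourself, and the style URL and subjects are hypothetical placeholders.

```python
# Sketch: assemble a batch of same-series prompts that share one Sref URL.
# STYLE_URL and the subjects below are placeholders, not real assets.
STYLE_URL = "https://example.com/style-master.png"  # hypothetical style master image

def sref_prompt(subject: str, style_url: str = STYLE_URL) -> str:
    """Append a shared --sref so every prompt inherits the same style."""
    return f"{subject} --sref {style_url}"

subjects = [
    "spring sale poster, product centered",
    "summer sale poster, product centered",
]
prompts = [sref_prompt(s) for s in subjects]
for p in prompts:
    print(p)
```

Keeping the style URL in one place means the whole series can be re-pointed at a new style master by changing a single line.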

Erase and Restore in Advanced Editing: more reliable local edits

In Midjourney web’s Advanced Editing, the most commonly used tools are “Erase” and “Restore.” The logic of Erase is straightforward: paint over what you don’t want, and Midjourney will regenerate the blank area—essentially turning inpainting into a point-and-click operation.

If you erase too much or head in the wrong direction, just use Restore to pull the area back to its original state, then erase again. This combo is especially useful for high-frequency rework points like fixing hands, removing background clutter, and swapping props.

Smart Selection: lock the area you want to change with a single click

In the past, the most time-consuming part of local edits was lassoing the selection by hand, but Midjourney's Smart Selection shortens this step to a click on the target area. You can click multiple times to add anchor points so the selection hugs the outline more closely, then choose Select or Keep and enter keywords describing what should replace that region.

For e-commerce images or portrait retouching, this feature is a big time-saver: for example, changing only the “shoe material” or only the “background setting” while keeping the subject unchanged—Midjourney’s controllability becomes closer to the workflow of design software.

Practical advice: three steps to revise an image until it’s deliverable

Step one: generate a base composition in Midjourney as usual.
Step two: put the style master image into Sref to lock in the series feel.
Step three: go into Advanced Editing and use Smart Selection plus Erase to fine-tune flaws and key information points.

If you want the image to feel more "alive," you can also try the experimental aesthetics parameter --exp (0–100, default 0), but it's recommended to increase it only slightly to avoid over-stylization.
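The --exp advice can be captured in a small helper that appends the parameter only when it is raised above the default. A minimal sketch: the range check reflects the 0–100 range and default of 0 stated above, while the function name and clamping-by-error behavior are my own assumptions, not part of Midjourney's tooling.

```python
def with_exp(prompt: str, exp: int = 0) -> str:
    """Append --exp (0-100, default 0); keep values small to avoid over-stylization."""
    if not 0 <= exp <= 100:  # assumed guard: reject out-of-range values
        raise ValueError("--exp must be between 0 and 100")
    # Default 0 adds nothing, matching Midjourney's default behavior.
    return prompt if exp == 0 else f"{prompt} --exp {exp}"

# A gentle nudge, per the advice to raise it only slightly:
print(with_exp("series poster, base composition", 10))
```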

Once you’ve smoothed out this workflow, Midjourney is no longer just “gacha-style image pulling,” but a production tool that supports continuous iteration, rollback, and reuse.
