Midjourney has recently been testing a brand-new image editor on its web version. It no longer just generates images: you can also upload your existing pictures for outpainting, cropping, localized inpainting, and even one-click retexturing. For people who regularly make posters, character concepts, and e-commerce images, Midjourney has finally connected generation and retouching into a single, smoother workflow.
What exactly has been updated in this Midjourney editor?
After entering Midjourney’s new editing interface, you can upload local images directly, then select areas on the canvas to edit. The core tools are “Erase” and “Restore,” which are suited to removing unwanted elements, filling in backgrounds, and refining edge details. While editing, you can also expand the canvas by adjusting the scale and aspect ratio, giving an image that was originally cropped too tight the extra room it needs to be usable.
The right way to do localized inpainting with Midjourney on the web
It’s best to start from the finished image you’re most satisfied with. On the work page, click “Edit” to enter the editor, then either upload an image or continue editing that one directly. Use the Erase tool to brush out the area you want to change, then describe clearly in the prompt what to add, what material it is made of, and where the light comes from; Midjourney will repaint based on your text plus the selected region. If you erase too much, use the Restore tool to bring back the parts that shouldn’t change, which is far more convenient than rerolling repeatedly.
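As a sketch, a prompt for a freshly erased region might read like this (the scene, object, and wording here are illustrative examples, not from Midjourney’s documentation; note how it covers the three points above: what to add, the material, and the light direction):

```text
a tall ceramic vase with dried pampas grass, matte glaze,
soft window light from the left, matching the room's existing shadows
```

Keeping the lighting description consistent with the rest of the image is what makes the repainted patch blend in rather than look pasted on.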
Retexturing: keep the structure, swap out the “materials and lighting”
Midjourney has also added an “image retexturing mode.” It first estimates the scene’s underlying shapes, then reapplies textures and materials, refreshing the overall lighting and surface qualities. Put simply, the structure stays largely the same while everything from the “skin” to the light and shadow can be restyled. This is especially useful for turning realistic images into illustrations, or for swapping ordinary materials for metal, ceramic, or fabric. Retexturing also responds to prompt control, for example “matte metal, cool cinematic lighting, grainy film.”
Can it be used together with Midjourney’s style/character references?
Yes. Midjourney’s editor is compatible with capabilities like style reference, character reference, image prompts, and model personalization. For example, you can lock in a style with --sref URL in the prompt, and combine it with a personalized model --p, making the edited result more likely to stay consistent with your usual aesthetic. For character consistency, you can also continue using cref URL and adjust reference strength via cw, so Midjourney won’t “conveniently change the face” when you’re modifying clothes or hairstyles.
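Putting those parameters together, an editing prompt could look something like the sketch below (the URLs are placeholders, and --cw 60 is just an example value; --cw accepts 0–100, where lower values weight the face more heavily than clothing and hair):

```text
same character wearing a red trench coat, studio lighting
--sref https://example.com/style.png
--cref https://example.com/character.png --cw 60
--p
```

The point is that the editor honors the same reference flags as normal generation, so a locked style and character can carry through your edits unchanged.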
V2 moderation and rollout conditions: why you might not see the entry point
Because these features are still in testing, Midjourney is simultaneously trialing a more fine-grained V2 AI moderation system that checks the prompt, the input image, the masked region, and the output image end to end. The rollout is also limited in the first phase: users who have generated a large number of images (for example, annual subscribers past a certain cumulative total) or monthly subscribers with longer continuous subscriptions appear more likely to receive access. If you don’t see the editor entry point yet, first confirm your account status and subscription cycle, then watch for Midjourney’s subsequent gradual expansion.