Recently, Midjourney has made “editing images” feel more like a complete workflow: you can upload an existing image, expand the canvas, crop, and inpaint within the same picture, and even switch the overall materials and lighting to a different style with one click. At the same time, Midjourney is also testing a more granular V2 moderation system, where prompts, masks, and outputs will all be checked together.
External image editor: expand, crop, and inpaint right after upload
This time, Midjourney’s external image editor is built around “start from an image, then refine it”: upload an image from your computer, extend the canvas beyond its original frame (outpainting), crop the composition, or paint a region selection (mask) over the part you want to change, then steer the inpainting result with a text prompt.
In practice, state your goal explicitly: what to change (subject/background/props), what to keep (composition/viewpoint/mood), and which elements should not appear. Midjourney’s inpainting leans heavily on how you describe the masked boundary: the more specific the prompt, the less likely the result drifts.
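As a sketch of that change/keep/exclude structure (the scene details here are invented for illustration; `--no` is Midjourney’s parameter for listing unwanted elements), an inpainting prompt for a masked region might read:

```
a clear glass teapot with steam rising, keep the warm wooden table,
soft window light, same camera angle --no text, logos, extra hands
```

The first clause states what to change, the middle clauses state what to keep, and the `--no` list names what must not appear.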
Retexture mode: keep shapes and structure, switch materials and lighting overall
Midjourney’s new “image retexture mode” is more like reskinning an existing scene: it estimates the scene’s geometry and then reapplies surfaces, so the materials, textures, and lighting change as a whole while the shapes stay put. It’s suited to quickly turning the same product shot into multiple material options, or shifting a scene from “daylight realism” to a “cinematic night look.”
When writing prompts, put “materials and light” up front: for example, brushed metal, matte plastic, wet stone, hard or soft light, rim or back lighting. If you also want to preserve the original structural proportions, avoid words that would heavily reshape the composition.
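Following that ordering, a retexture prompt might look like the sketch below (the specific product and lighting choices are invented for illustration):

```
brushed titanium body, matte black plastic accents, soft diffused
studio lighting with a subtle rim light, product photography
```

Material words lead, lighting words follow, and no compositional terms (camera angle, subject placement, scene layout) appear, so the original structure is left for the mode to preserve.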
The reference system still works: style reference, personalization, and character reference can stack
The good news is that this editing workflow isn’t isolated: Midjourney’s style reference (--sref), personalization model (--p), and character reference (--cref, with --cw controlling its strength) can all be used together with the editor. Set the overall tone with a style reference first, then lock character consistency with a character reference, and finally make local fixes in the editor.
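A combined prompt stacking all three might look like the following sketch (the URLs are placeholders, and the --cw value of 60 is just an illustrative mid-range choice between 0 and 100):

```
a knight crossing a rain-soaked plaza
  --sref https://example.com/style.png
  --cref https://example.com/hero.png --cw 60
  --p
```

Here --sref sets the overall visual tone, --cref keeps the character’s identity consistent, --cw tunes how strictly that identity is enforced, and --p applies your personalization profile on top.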