Midjourney recently launched a more "editable" workflow: instead of only generating images, you can now upload existing pictures and directly outpaint, crop, and repaint local areas, or "re-texture" an image in one click to swap materials and lighting. This article focuses on how to use these new Midjourney features, which scenarios they fit, and common pitfalls.
What’s included in the update: a closed loop from generation to editing
The core change in this Midjourney update is expanding “generate images” into “edit images.” In the external image editor, you can make a selection and write prompts to add elements, replace backgrounds, fix details, and more—the interaction logic is closer to design software.
It also adds an “image re-texturing” mode, used to globally replace materials, surfaces, and lighting mood without significantly changing the composition. Another change is a more fine-grained V2 AI moderation system that performs more comprehensive checks from prompts and masks through to the final output.
Midjourney external image editor: outpaint, crop, and inpaint after uploading
The steps are straightforward: upload an image at the Midjourney editor entry point, first use crop/expand to set the canvas range, then use the brush or lasso to select an area. Once an area is selected, enter a prompt, for example "turn the empty space on the left into a glass display window with neon reflections at night", and Midjourney will repaint only the part you circled.
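The workflow above can be modeled as a simple data structure for organizing your own edit passes. Everything here, the class name, the fields, and the helper method, is a hypothetical sketch for note-keeping; Midjourney does not expose a programmatic API like this:

```python
from dataclasses import dataclass, field

# Hypothetical model of one editor pass. Names and fields are illustrative,
# not a real Midjourney API.
@dataclass
class EditRequest:
    source_image: str                 # path or URL of the uploaded image
    canvas: tuple                     # (width, height) after crop/expand
    selection: list = field(default_factory=list)  # mask outline from brush/lasso
    prompt: str = ""                  # what to generate inside the selection

    def is_local_edit(self) -> bool:
        # A non-empty selection means only the circled area is repainted;
        # an empty one would correspond to a whole-canvas operation.
        return len(self.selection) > 0

req = EditRequest(
    source_image="storefront.png",
    canvas=(1536, 1024),
    selection=[(120, 300), (480, 300), (480, 760), (120, 760)],
    prompt="glass display window with neon reflections at night",
)
print(req.is_local_edit())  # True: only the selected region changes
```

Keeping a record like this per edit makes it easier to reproduce a result later, since the selection and canvas settings matter as much as the prompt.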
To improve the hit rate, spell out three things in the prompt: what to change (the object), what to change it into (material/era/lens), and what must be preserved (composition, facial expressions, text areas). If the edit drifts off target, shrink the selection first rather than lengthening the prompt.
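The three-part rule above can be captured in a small helper. The function name and prompt template are my own sketch, not anything Midjourney-specific; the point is simply to make sure no part is forgotten:

```python
def build_edit_prompt(change: str, style: str, preserve: str) -> str:
    """Compose an inpainting prompt from the three recommended parts:
    what to change, what to change it into, and what must stay untouched."""
    return f"{change}, rendered as {style}, keep {preserve} unchanged"

prompt = build_edit_prompt(
    change="the empty space on the left becomes a glass display window",
    style="a night scene with neon reflections",
    preserve="the overall composition and the text areas",
)
print(prompt)
```

The exact wording of the template matters less than the habit: every selection edit states an object, a target style, and an explicit "keep" clause.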
Image re-texturing mode: change materials and lighting without changing composition
Re-texturing is suitable when “the composition is good but the look/feel is off”—for example, changing the same interior image from “Scandinavian wood” to “industrial metal,” or turning a daytime street scene into a “rainy night with wet reflective surfaces.” It first estimates the scene geometry and then reapplies textures and lighting, so the overall mood shift is more noticeable.