
Midjourney External Editor and Retexturing Launch: A Practical Guide to the New Features

2/14/2026

Midjourney has recently rounded out its direct image-editing capabilities: an external image editor, image retexturing, and a more granular V2 moderation system are now being tested in parallel. For anyone who regularly needs to extend images, change backgrounds, or swap props, this update is far more practical. Below, I'll explain how to get started, based on real-world usage.

What are the core changes in this Midjourney update?

Previously, making local edits in Midjourney felt more like "regenerating until one looks closer." The external image editor is now closer to a conventional photo-editing workflow: upload an existing image, then use selections plus text prompts to control which areas get repainted.

Another key addition is image retexturing mode. It doesn't just apply a filter; it first estimates the scene's forms, then replaces the materials, lighting, and surface texture as a whole. This makes it much easier to create a series of images with the same composition but different materials.

How to use the external image editor: upload, select, repaint

The workflow is straightforward: upload an image from your computer into Midjourney's editor, then extend or crop it, or use repainting to replace local elements. What really determines the result is how tightly you scope the selected area, and whether your prompt describes the added or replaced content with enough specificity.

Operationally, a two-step approach works best: first, draw the selection so it covers only the part you want to change; then state clearly in the prompt what you do and don't want, for example, "replace the prop in the right hand with a metal key; keep the original hand pose and lighting direction." This makes it easier for Midjourney to produce stable outputs.
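As a sketch, a repaint prompt following this pattern might be structured like the example below. The wording is illustrative, not fixed Midjourney syntax; the comments describe the selection step done in the editor UI:

```
# Step 1 (in the editor): draw the selection over the right hand's prop only
# Step 2 (prompt for the repainted region):
a metal key held in the right hand,
keep the original hand pose and lighting direction,
do not change the background or clothing
```

Keeping the "keep" and "do not change" clauses explicit tends to reduce drift in areas adjacent to the selection.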

Image retexturing: quickly swap materials and mood within the same scene

Image retexturing is suitable for scenarios like product material options, interior style variations, and testing character clothing fabrics. You can treat the “shape” as fixed and let Midjourney recompute the “skin,” for example switching from “matte ceramic” to “brushed metal,” with gloss and reflections changing along with it.
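For example, a pair of retexture runs on the same source image might read as follows (illustrative wording only; the product and lighting terms are placeholder assumptions):

```
Source image: white mug product shot, fixed composition
Retexture prompt A: matte ceramic surface, soft diffuse lighting
Retexture prompt B: brushed metal surface, cool studio reflections
```

Because the mode treats the shape as fixed, only the "skin" terms need to change between variants.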

In addition, Midjourney's editor can be combined with tools such as style references (--sref) and personalization (--p). When producing serialized content, it's recommended to lock in the visual language first with --sref, then use retexturing to run material variations; this is more efficient than repeatedly rerolling.
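A hedged sketch of that workflow might look like this. The URL and prompt text are placeholders, not real assets; --sref and --p are the parameters named above:

```
/imagine product hero shot of a ceramic mug --sref https://example.com/brand-style.png --p

# Then, on the chosen result in the editor's retexture mode:
brushed stainless steel surface, same composition and lighting
```

Locking the style reference first means each retexture variant inherits the same visual language, so the series stays consistent.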

The V2 AI moderation system is more granular: what to avoid in advance

Midjourney is testing a smarter V2 AI moderation system that checks prompts, input images, masks, and final outputs holistically. This means that even if the prompt itself is fine, a non-compliant masked area can still be blocked; it's advisable to avoid overly detailed edits in sensitive regions.

Because the feature is still new, Midjourney has gated the first rollout: accounts with more cumulative generations and long-running continuous subscriptions are more likely to get access early. If you don't see the entry point yet, first confirm your account permissions and feature toggles, then watch for the pace of the official gradual rollout.
