Titikey

Midjourney External Image Editor Launches: Upload for Inpainting and Re-texturing

2/28/2026

Midjourney has recently expanded from "generating images" to "editing images": on the web version, you can upload your own images and directly perform localized inpainting, canvas expansion, and style reworks. For people making e-commerce images, posters, or character designs, Midjourney is no longer just an image-generation tool; it's closer to an iterative workbench you can refine again and again.

Upload Any Image into the Editor: Not Just Tweaking Details, but Changing the Composition Too

The core change in this Midjourney image editor update is that it lets you upload external images and edit them, rather than only being able to modify images generated by Midjourney itself. After entering an image’s details page and clicking “Edit,” you can crop the frame, expand the aspect ratio, and even fill in content along the edges, turning composition from something “locked in at generation time” into something “adjustable in post.”

The most practical part: you don't have to perfect your prompt upfront. Start with a rough image as a foundation, then use Midjourney to refine it step by step until you're satisfied, so the whole workflow feels much more like real design work.

Erase/Restore for Localized Inpainting: Fix What’s Wrong Exactly Where It’s Wrong

Midjourney’s edit mode provides local tools like “Erase” and “Restore.” After erasing elements you don’t want, Midjourney will inpaint the blank area based on your text prompt; if you overpaint, you can use Restore to roll that area back.

This “selection + text” approach is especially suitable for scenarios like fixing hands, swapping props, changing a logo’s position, or cleaning up background clutter—avoiding repeated full-image rerolls that cause the subject to drift.

Re-texturing Mode: Keep the Form, Replace Materials and Lighting Overall

Midjourney has also introduced an "image re-texturing" mode: it first estimates the scene's shape and structural layout, then reapplies textures and materials so that lighting, surface feel, and stylistic expression change holistically. Simply put: the composition barely moves, but all the rendering gets redone.

It becomes smoother to produce multiple style variants of the same image (for example, switching from photoreal product shots to illustrated poster art, or from a white studio setup to cinematic lighting), and it aligns well with Midjourney’s longstanding style strengths.

Compatible with Style/Character/Personalization: A Single Reference System Can Plug into Editing

Midjourney's editor isn't an isolated feature: it can still be used together with style references and personalization capabilities, such as using --sref in prompts for style reference, or combining personalization models (some sources mention it can work with personalization parameters). If you rely on character consistency, you can also continue using --cref and --cw to control reference strength.
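As a rough sketch of how these parameters combine in practice (the image URLs below are placeholders, and the exact subject prompt is just an example), a prompt might look like this, with --cw taking a value from 0 to 100 to control how strongly the character reference is applied:

```
/imagine prompt: a character standing in a neon-lit street
  --sref https://example.com/style-reference.png
  --cref https://example.com/character-reference.png
  --cw 50
```

A lower --cw focuses the reference on the character's face, while a higher value also carries over clothing and other details; --sref independently steers the overall visual style.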

In addition, Midjourney is advancing upgrades to the “personalization settings” experience, including a faster setup flow and multiple preference profiles, enabling Midjourney to more consistently produce the aesthetic direction you commonly use.

A More Fine-Grained V2 Moderation System: Prompts, Masks, and Results Will All Be Checked

As editing capabilities grow stronger, Midjourney is simultaneously testing a smarter V2 moderation system that will perform holistic checks across the prompt, input image, mask regions, and final output. For creators, this means some borderline content will be blocked earlier, reducing the time wasted on “only realizing it’s unusable after generation.”

Note that the new editing capabilities will be rolled out in batches during the early phase. Some sources mention that the first stage prioritizes annual subscribers and high-usage users (for example, those with a relatively high cumulative generation count). If you don’t see the entry point yet, it’s most likely that your access hasn’t come up in the rollout.
