
Getting Started with New Midjourney Web Features: Reframe, Repaint, and Reference Image Workflows

3/13/2026

Midjourney has recently filled a key gap on the web side and in model capability: outputs are more reliable, and post-editing now feels much closer to editing directly on the image. This article walks through Reframe, Repaint, and the new Image, Style, and Character Reference workflows in the order you would actually use them, so you can avoid detours.

Midjourney’s New Image Quality Upgrade: More Controllable Detail, Speed, and Text

If you often use Midjourney to generate portraits, you will notice that limb continuity is better and that skin and textures come out cleaner. The new version also strengthens detail handling: eyes, small faces, and hands in the distance are far more likely to come out as usable images.

Another practical change is generation efficiency: standard renders are faster, which suits workflows that need lots of batch experimentation. When creating images that contain text, wrapping the desired wording in quotation marks noticeably improves text accuracy.
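As a sketch of the quotation-mark technique, a prompt might look like this (the poster wording and parameters are illustrative, not from the article):

```
minimalist coffee shop poster, the headline text "Slow Mornings" in bold serif type --ar 2:3 --style raw
```

Midjourney attempts to render the quoted words verbatim; short phrases generally survive more accurately than full sentences.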

The Core of the Web Editor: How to Combine Upscale, Reframe, and Repaint

After you get a satisfactory draft in the Midjourney web app, Upscale it first to obtain a clearer base image; composition and touch-up work are more reliable from there. Reframe is for recomposing: it expands or adjusts the image boundaries, which is ideal for leaving blank space on posters, switching between landscape and portrait versions, or filling in backgrounds.

Repaint is localized repainting: without redoing the whole image, you can fix hands, clean up clothing edges, or add missing props. A reliable order of operations is: Reframe first to lock the layout, then Repaint to handle flaws, and finally a light Vary pass to produce several alternative options.
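A compact recap of this sequence, with hypothetical prompts filled in purely for illustration:

```
1. Generate   cozy reading nook by a rain-streaked window --ar 3:4
2. Upscale    pick the best grid image, then Upscale (Subtle or Creative)
3. Reframe    expand the canvas to 16:9 and let the new background fill in
4. Repaint    select the flawed hand, prompt: hand resting on an open book
5. Vary       run Vary (Subtle) once or twice for alternate takes
```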

Three References Made More Useful: Image, Style, and Character Explained in One Go

Midjourney’s Image Reference is best for borrowing composition and subject relationships: a reference image locks in the visual focus. Style Reference is more like borrowing brushwork and overall tone; combined with Style Codes, it lets you quickly reuse textures and color tendencies you like.

The most popular is Character Reference: when you need the same character to keep a consistent face and vibe across different scenes, it is far more stable than prompts alone. A good practice is to first finalize the character as a “standard portrait,” then use that portrait as the character reference when generating storyboards or a series of images.
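On the web these references are attached through the image icons beside the prompt box, but they correspond to the classic prompt parameters: --sref for Style Reference and --cref for Character Reference, with --sw and --cw as their weights (exact parameter support varies by model version; the URLs below are placeholders):

```
the same heroine crossing a night market --cref https://example.com/portrait.png --cw 90 --sref https://example.com/palette.png --sw 60 --ar 16:9
```

A high --cw preserves face, hair, and outfit together; lowering it keeps the face consistent while freeing the model to restyle the clothing per scene.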

Don’t Stack Parameters: Trade-offs Among Fast/Turbo/Relax and --q 2

Midjourney’s Fast, Turbo, and Relax modes directly affect queue speed and how your usage is consumed: Turbo for tight deadlines, Fast for everyday iteration, and Relax for pure style exploration or slow, careful selection. If you want richer textures and material feel, try --q 2, but it is slower and may trade away a bit of coherence.
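As a sketch, speed mode and quality can both be set per prompt rather than account-wide (the subjects are illustrative):

```
vast mossy canyon at dusk, volumetric fog --q 2 --relax
quick thumbnail study of the same canyon --turbo
```

Since --q 2 renders more slowly, pairing it with Relax for non-urgent picks keeps your Fast hours in reserve for deadline work.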

In addition, Aspect Ratio controls width to height; Style Raw keeps the style more restrained and closer to the description; and Stylization, Weirdness, and Variety control stylization strength, strangeness, and output diversity, respectively. The most worry-free approach is to change only one or two parameters at a time, so you can pinpoint exactly which change brought the improvement.
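The web sliders map to the long-standing prompt flags: Stylization is --stylize (--s), Weirdness is --weird (--w), and Variety is --chaos (--c). A one-variable-at-a-time comparison might look like this (subject and values illustrative):

```
paper-cut diorama of a lighthouse --ar 4:5 --style raw --s 100
paper-cut diorama of a lighthouse --ar 4:5 --style raw --s 400
```

Because only --s changes between the two runs, any difference in texture density is attributable to Stylization alone.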
