Prompting Pixels

Quickly Change Styles, Clothes, and More with Precision in ComfyUI

AI Video Summary: The video demonstrates a workflow that combines an IP adapter with a segmentation model to apply a Hawaiian-shirt style to a selected part of an image. It covers fine-tuned adjustments, batch processing, and a step-by-step guide to setting up and running the required custom nodes and models, while also noting the workflow's VRAM requirements and the limits of style-transfer consistency.


ComfyUI Workflow Download Link: Changing Styles with IPAdapter and Grounding Dino

When we combine an IP adapter with a segmentation model, we can make fine-tuned adjustments to specific areas of an image. These adjustments can look significantly better than those achieved through traditional inpainting coupled with a ControlNet.

The workflow featured in the video is broken into three components:

  • Basic workflow (loading checkpoint, prompts, KSampler, etc.)
  • IP Adapter nodes
  • Segmentation nodes

IP Adapter Nodes

We're using four different nodes in this area: Load Image, IPAdapter Unified Loader, Load Clip Vision model, and IPAdapter Advanced.

The overall goal here is to capture the style of the reference image and pass that information along with the model, so the output accurately reflects what the reference image is about and applies it accordingly.

While the example in the video uses a Hawaiian shirt, the input image could be any image you want to extract a style from: other shirt or fabric styles, paintings, monograms, etc.

Power-up: You can learn more about IP adapters in this in-depth guide provided by Hugging Face.
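To make the wiring concrete, here is a hypothetical sketch of how these four nodes might connect in ComfyUI's API (JSON) workflow format, expressed as a Python dict. The node class names come from core ComfyUI and the ComfyUI IPAdapter extension, but the exact input names, preset strings, and file names are illustrative assumptions and may differ between versions.

```python
# Each entry is a node; list values like ["1", 0] link to output 0 of node "1".
# File names ("hawaiian_shirt.png", the checkpoint, the CLIP Vision model)
# are placeholders, not files shipped with the workflow.
workflow = {
    "1": {"class_type": "CheckpointLoaderSimple",
          "inputs": {"ckpt_name": "sd_xl_base_1.0.safetensors"}},
    "2": {"class_type": "LoadImage",                 # style reference image
          "inputs": {"image": "hawaiian_shirt.png"}},
    "3": {"class_type": "IPAdapterUnifiedLoader",
          "inputs": {"model": ["1", 0], "preset": "PLUS (high strength)"}},
    "4": {"class_type": "CLIPVisionLoader",
          "inputs": {"clip_name": "CLIP-ViT-H-14.safetensors"}},
    "5": {"class_type": "IPAdapterAdvanced",
          "inputs": {"model": ["3", 0], "ipadapter": ["3", 1],
                     "image": ["2", 0], "clip_vision": ["4", 0],
                     "weight": 0.8, "start_at": 0.0, "end_at": 1.0}},
}

# Sanity check: every link points at a node that actually exists.
for node_id, node in workflow.items():
    for value in node["inputs"].values():
        if isinstance(value, list):
            assert value[0] in workflow, f"dangling link in node {node_id}"
```

The key idea is the fan-out from the unified loader: it returns both the patched model and the IP adapter itself, and IPAdapter Advanced consumes both alongside the reference image.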

Segment Anything Nodes

Borrowing from the popular SD WebUI Segment Anything extension, the Segment Anything custom nodes for ComfyUI provided by storyicon let you supply a text prompt and, if the described object is found, segment it from the image. Both extensions are based on Grounding DINO.

The video demonstrates how to connect a Load Image node along with the SAMModelLoader, GroundingDinoModelLoader, and GroundingDinoSAMSegment nodes.

One important variable that wasn't covered in the video is the threshold value in the GroundingDinoSAMSegment node. Essentially, a lower value may result in more of the image being selected, whereas a higher value is more selective. The catch is that too high a value may result in nothing being selected at all.

Generally, the default value of 0.30 works well for most cases. Use the Convert Mask to Image node if you want to review the segment.
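The threshold behavior described above can be illustrated with a toy sketch (this is not the actual Grounding DINO API): the detector proposes candidate boxes with confidence scores, and only boxes at or above the threshold are passed on for segmentation. The scores below are made-up values.

```python
import numpy as np

# Hypothetical confidence scores for candidate boxes proposed by the detector.
scores = np.array([0.82, 0.45, 0.31, 0.12])

def kept(threshold):
    """Return how many candidate boxes survive a given threshold."""
    return int((scores >= threshold).sum())

print(kept(0.30))  # default threshold keeps the three stronger boxes -> 3
print(kept(0.10))  # a lower value selects more of the image          -> 4
print(kept(0.90))  # too high: nothing is selected at all             -> 0
```

This is why reviewing the mask with Convert Mask to Image is worthwhile before sampling: at 0.30 you can confirm the segment covers what you intended, and nudge the threshold down if parts of the object were dropped.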

Important Notes
  • Inpainting Checkpoint: Using an inpainting model, preferably SDXL, will improve results.
  • VRAM Requirements: The segmentation models used in this workflow can be VRAM intensive, so ensure sufficient VRAM is available.
  • Style Transfer Limitations: Transfers are not perfect representations, nor will they place details in specific areas (e.g., a flower on the sleeve of the reference Hawaiian shirt may not appear on the sleeve in the output).
  • Limited by Segment Bounds: Only the area that has been segmented will receive changes. Therefore, if this method is used for changing outfits, attributes such as length and other details will not be transferred.
  • Textual Prompt: In the text prompt, simply describe the item that receives the style.
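The "Limited by Segment Bounds" note follows from how masked generation composites its result: the restyled image is blended back over the original through the mask, so pixels outside the mask come back untouched. A minimal single-channel sketch with stand-in values:

```python
import numpy as np

original = np.full((4, 4), 10.0)   # stand-in for the source photo
styled   = np.full((4, 4), 99.0)   # stand-in for the restyled output
mask     = np.zeros((4, 4))
mask[1:3, 1:3] = 1.0               # only the "shirt" region is segmented

# Standard mask compositing: styled inside the mask, original outside.
composite = mask * styled + (1.0 - mask) * original

print(composite[2, 2])  # inside the mask: styled value  -> 99.0
print(composite[0, 0])  # outside the mask: original value -> 10.0
```

This is why outfit length can't change with this method: pixels the segment never covered are, by construction, copied straight from the original.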
2 responses to "Quickly Change Styles, Clothes, and More with Precision in ComfyUI"

Shawn
June 18, 2024

Have any questions on this workflow? Drop it below.

ACrazy
July 18, 2024

Thanks for the video and tutorial. I love the concept of this.

Could you help with my issue please? The output is completely ignoring the clothes.

SETUP:

– I drew a mask around the eyes of my model and bypassed the DINO nodes (since the mask didn't get all of the required area).
– For the clothes, I selected sunglasses (specifically these: https://encrypted-tbn0.gstatic.com/shopping?q=tbn:ANd9GcQjL_O72XNdc8Ir2jR1qIRpeS_C2rugy7998U5fzRl7peSXxpL1SgoMvr7h2ORbDp1ZKof3EHdyzH8oRWSoyk9jCxXBzkjr7ovhdcmGvuvd8KE-Qb9349t7).
– For the checkpoint model, I selected EpicRealism V5 Inpainting.
– All other settings are the same as yours.

Nodes:

  • ComfyUI IPAdapter
  • ComfyUI Segment Anything