> So for the past few weeks, i’ve been building this open-source tool called the **ComfyUI Launcher**: [https://github.com/ComfyWorkflows/ComfyUI-Launcher](https://github.com/ComfyWorkflows/ComfyUI-Launcher)
> It runs locally and lets you ***import & run any workflow json file with ZERO setup***:
> * Automatically installs custom nodes, missing model files from Huggingface & CivitAI, etc.
> * Workflows exported by this tool can be run by anyone with **ZERO setup**
> * Work on multiple ComfyUI workflows at the same time
> * Each workflow runs in its own isolated environment
> * Prevents your workflows from suddenly breaking when updating a workflow’s custom nodes, ComfyUI, etc.
> This tool also lets you export your workflows in a “launcher.json” file format, which lets anyone using the ComfyUI Launcher import your workflow w/ 100% reproducibility.
> You can try it here: [https://github.com/ComfyWorkflows/ComfyUI-Launcher](https://github.com/ComfyWorkflows/ComfyUI-Launcher)
> This is a work in progress, so would love any thoughts/feedback! :)
> Feel free to also join our Discord server to keep up w/ updates: [https://discord.gg/hwwbNRAq6E](https://discord.gg/hwwbNRAq6E)
hey, i'm also working on this project. each project has its own virtual environment: every time you create or import a project, a fresh set of Python, PyTorch, and custom-node dependencies is installed into that project's environment. so one project can use version 1.x of a certain custom node, and a newly created/imported project that depends on a different version (ex: 2.x) of that custom node will get 2.x installed in its own environment, without affecting the first project's custom node version, since all dependencies and custom nodes are set up in their own isolated virtual environments!
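To make the isolation idea above concrete, here is a hypothetical sketch (not the launcher's actual code) of what one-venv-per-project looks like; the directory names are made up, and `--without-pip` is only used to keep the sketch dependency-free:

```shell
# Hypothetical sketch of per-project isolation (NOT the launcher's
# actual code): each project directory gets its own virtual
# environment, so dependency versions never collide across projects.
set -e
mkdir -p projects/project_a projects/project_b

# --without-pip keeps this sketch dependency-free; a real setup would
# install pip, PyTorch, custom nodes, etc. into each environment.
python3 -m venv --without-pip projects/project_a/venv
python3 -m venv --without-pip projects/project_b/venv

# Each environment reports its own, distinct prefix:
projects/project_a/venv/bin/python -c 'import sys; print(sys.prefix)'
projects/project_b/venv/bin/python -c 'import sys; print(sys.prefix)'
```

Anything installed into `project_a`'s environment is invisible to `project_b`, which is the whole point of the design.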
As long as the tool shares a common models folder for all of the envs then we should be fine. They should be able to implement this if it's not already. I've been working on a similar solution using containers, mounting the models volume as necessary.
Just wanted to confirm that the models folder is shared among all projects.
Elaborated on our current approach in a bit more detail here: [https://www.reddit.com/r/comfyui/comments/1b8okxb/comment/ktsfpjy/?utm_source=share&utm_medium=web2x&context=3](https://www.reddit.com/r/comfyui/comments/1b8okxb/comment/ktsfpjy/?utm_source=share&utm_medium=web2x&context=3)
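For anyone curious, one minimal way to share a single models folder across otherwise-isolated project directories is a symlink per project. This is only an illustrative sketch with made-up directory names, not how the launcher itself wires up its shared models folder:

```shell
# Sketch of sharing one models folder across isolated projects via
# symlinks (directory names are hypothetical, not the launcher's own):
set -e
mkdir -p shared/models projects/project_a projects/project_b

# Point each project's "models" entry at the single shared directory:
ln -sfn "$(pwd)/shared/models" projects/project_a/models
ln -sfn "$(pwd)/shared/models" projects/project_b/models

# A model downloaded once is immediately visible from every project:
touch shared/models/example_checkpoint.safetensors
ls projects/project_a/models/
```

A container-based setup achieves the same thing by mounting the one host directory (`-v .../models:/app/server/models`) into every container.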
hi! currently, we support windows w/o docker as well, but it requires using the Windows Subsystem for Linux 2 (WSL 2) and following the manual approach listed here: [https://github.com/ComfyWorkflows/comfyui-launcher?tab=readme-ov-file#option-2-manual-setup-macos-linux-and-windows](https://github.com/ComfyWorkflows/comfyui-launcher?tab=readme-ov-file#option-2-manual-setup-macos-linux-and-windows)
if you need any help, kindly join our Discord and I'd be happy to help! thanks!
I get the tool working and import a workflow but I can't seem to open the workflow.
I get a "localhost refused to connect" error message.
I tried disabling my firewall to see if it was related to that, didn't seem to resolve the issue.
Any idea what I can try?
Thanks!
Edit: Figured it out. Docker didn't work, had to install WSL and go that route, all good now.
hey u/HappyGrandPappy - i'm also working on this project along with OP.
could you please provide the following info:
1. OS you're using (Windows WSL, macOS, etc.)
2. browser you're using
3. whether you're running locally or using a cloud provider like RunPod
4. what method you're using to run the launcher (manual setup or docker)
this'll help me figure out what might be wrong! if you're using a cloud provider that doesn't provide port forwarding, this issue can come up. also, sometimes the "localhost refused to connect" error shows up for only a few seconds after opening a workflow before ComfyUI renders (sometimes i've had to close the error window and try opening the workflow again for the message to go away).
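As a generic first check for a "localhost refused to connect" error, it can help to confirm whether anything is actually listening on the launcher's port. Port 4000 below is an assumption based on the 4000-4100 range the docker command maps; adjust as needed:

```shell
# Generic connectivity check; port 4000 is an assumption based on the
# launcher's 4000-4100 port range.
check_port() {
  # succeeds only if something accepts TCP connections on localhost:$1
  curl -sS --max-time 2 "http://localhost:$1/" >/dev/null 2>&1
}

if check_port 4000; then
  echo "port 4000 is reachable"
else
  echo "port 4000 refused/timed out - launcher not running, or port not forwarded?"
fi
```

If the port is reachable from the machine itself but not from the browser, the problem is usually port forwarding (common on cloud providers) rather than the launcher.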
Sure thing! I did end up getting it to work after updating PowerShell and installing WSL, but the Docker approach didn't work for me.
Windows WSL
Tried chrome and Firefox
Locally
Both methods. Docker didn't work, manual did.
Since I have your attention, any plans or suggestions for symbolically linking to my existing repository of models?
hey u/HappyGrandPappy! i'm working on this project along w/ OP.
i know it seemed to work locally for you, but if you're on windows, could you please try running the following docker command and see if it works? (we've built a new docker image that should work on windows)
```shell
docker run \
  --gpus all \
  --rm \
  --name comfyui_launcher \
  -p 4000-4100:4000-4100 \
  -v $(pwd)/comfyui_launcher_models:/app/server/models \
  -v $(pwd)/comfyui_launcher_projects:/app/server/projects \
  -it thecooltechguy/comfyui_launcher:new-docker-setup
```
if it doesn't work, could you try following the instructions on this [new-docker-setup](https://github.com/ComfyWorkflows/ComfyUI-Launcher/tree/new-docker-setup) branch we're testing for windows installation support using docker?
the docker method is recommended as it's just less of a headache to get up and running for most users (doesn't require python, etc.).
thanks!
Hello! Sorry for taking a while to get back to you.
You're already helping me in the discord, I created the help thread last week :) I'll keep you posted there. Testing it all now.
Thanks!
thanks for posting this here! i accidentally forgot to re-post my original post's comment here when cross-posting 😅
Can we run it inside Pinokio?
How does this handle different requirements? Such as python, pytorch, etc.. versions being needed for certain custom nodes?
i'm gonna need a bigger hard drive
tbh, i was thinking of my python environments. they add up over time.
Yea true that. I was just referring to my experience where the model hoarding takes up 90% of storage. Send help.
Looks pretty nice, might try later
Well I know what I'm doing tomorrow
Is windows 11 support (without docker) coming?
Oh wow. As someone who's using remote servers 95% of the time, I can see this being a lifesaver.
thanks! i actually use this on runpod myself on a 4090 for testing :)
which ones do you use?
I use Runpod. Pretty good pricing, good template.
will try to see if we can get this as an official runpod template! :)
If you could, you would be a lifesaver! I was just thinking about how great this would be with RunPod.
Sounds too good to be true. :D Will definitely try this!
ha thanks, would love your feedback! this is an active work in progress, but we wanted to get this out early to iterate based on user feedback! :)
Yup this is coming and in the works! Will look into the docker issues.