In this tutorial, we build a complete Open WebUI setup in Colab, in a practical, hands-on manner, using Python. We start by installing the required dependencies, then securely provide our OpenAI API key via terminal-based secret input so that sensitive credentials are not exposed directly in the notebook. From there, we configure the environment variables Open WebUI needs to communicate with the OpenAI API, define a default model, prepare a data directory for runtime storage, and launch the Open WebUI server inside the Colab environment. To make the interface accessible outside the notebook, we also create a public tunnel and capture a shareable URL that lets us open and use the application directly in the browser. Through this process, we get Open WebUI running end-to-end and see how the key pieces of deployment, configuration, access, and runtime management fit together in a Colab-based workflow.
import os
import re
import time
import json
import shutil
import signal
import secrets
import subprocess
import urllib.request
from getpass import getpass
from pathlib import Path

print("Installing Open WebUI and helper packages...")
subprocess.check_call([
    "python", "-m", "pip", "install", "-q",
    "open-webui",
    "requests",
    "nest_asyncio",
])
print("\nEnter your OpenAI API key securely.")
openai_api_key = getpass("OpenAI API Key: ").strip()
if not openai_api_key:
    raise ValueError("OpenAI API key cannot be empty.")

default_model = input("Default model to use inside Open WebUI [gpt-4o-mini]: ").strip()
if not default_model:
    default_model = "gpt-4o-mini"
We begin by importing all of the required Python modules for managing system operations, securing input, handling file paths, running subprocesses, and accessing the network. We then install Open WebUI and the supporting packages needed to run the application smoothly inside Google Colab. After that, we securely enter our OpenAI API key through terminal input and define the default model that we want Open WebUI to use.
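The prompt-with-fallback pattern used for the model name can be factored into a small, testable helper. The names `prompt_with_default` and `reader` are our own illustrative choices, not part of the original script; injecting the reader makes the logic testable without interactive input:

```python
def prompt_with_default(prompt: str, default: str, reader=input) -> str:
    """Return the user's stripped answer, falling back to `default` when empty."""
    answer = reader(f"{prompt} [{default}]: ").strip()
    return answer or default

# Example with an injected reader instead of interactive input:
model = prompt_with_default("Default model", "gpt-4o-mini", reader=lambda _: "")
print(model)  # falls back to "gpt-4o-mini"
```

The same helper works unchanged in a notebook cell, where `reader` defaults to the built-in `input`.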
os.environ["ENABLE_OPENAI_API"] = "True"
os.environ["OPENAI_API_KEY"] = openai_api_key
os.environ["OPENAI_API_BASE_URL"] = "https://api.openai.com/v1"
os.environ["WEBUI_SECRET_KEY"] = secrets.token_hex(32)
os.environ["WEBUI_NAME"] = "Open WebUI on Colab"
os.environ["DEFAULT_MODELS"] = default_model

data_dir = Path("/content/open-webui-data")
data_dir.mkdir(parents=True, exist_ok=True)
os.environ["DATA_DIR"] = str(data_dir)
We configure the environment variables that allow Open WebUI to connect properly with the OpenAI API. We store the API key, define the OpenAI base endpoint, generate a secret key for the web interface, and assign a default model and interface title for the session. We also create a dedicated data directory in the Colab environment so that Open WebUI has a structured location to store its runtime data.
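If we expect to rerun or adapt this configuration, the environment setup can be gathered into one function. This is a sketch under our own conventions (`configure_openwebui_env` is not part of Open WebUI's API); the variable names themselves match the ones used above:

```python
import os
import secrets

def configure_openwebui_env(api_key: str, model: str, data_dir: str) -> dict:
    """Build the Open WebUI environment in one place, then apply it."""
    env = {
        "ENABLE_OPENAI_API": "True",
        "OPENAI_API_KEY": api_key,
        "OPENAI_API_BASE_URL": "https://api.openai.com/v1",
        # Fresh random secret per session; 32 bytes -> 64 hex characters.
        "WEBUI_SECRET_KEY": secrets.token_hex(32),
        "WEBUI_NAME": "Open WebUI on Colab",
        "DEFAULT_MODELS": model,
        "DATA_DIR": data_dir,
    }
    os.environ.update(env)
    return env

env = configure_openwebui_env("sk-test-placeholder", "gpt-4o-mini", "/tmp/owui-data")
print(sorted(env))
```

Returning the dict alongside mutating `os.environ` makes the configuration easy to inspect or log (with the key redacted) before launching the server.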
cloudflared_path = Path("/content/cloudflared")
if not cloudflared_path.exists():
    print("\nDownloading cloudflared...")
    url = "https://github.com/cloudflare/cloudflared/releases/latest/download/cloudflared-linux-amd64"
    urllib.request.urlretrieve(url, cloudflared_path)
    cloudflared_path.chmod(0o755)

print("\nStarting Open WebUI server...")
server_log = open("/content/open-webui-server.log", "w")
server_proc = subprocess.Popen(
    ["open-webui", "serve"],
    stdout=server_log,
    stderr=subprocess.STDOUT,
    env=os.environ.copy(),
)
We prepare the tunnel component by downloading the cloudflared binary if it is not already available in the Colab environment. Once that is ready, we start the Open WebUI server and direct its output into a log file so that we can inspect its behavior if needed. This part of the tutorial sets up the core application process that powers the browser-based interface.
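The `Popen` + `poll()` pattern used here can be wrapped so we fail fast if the server process dies right after launch. `launch_and_check` is our own helper, not part of the tutorial's code, and the demo below substitutes a short-lived Python process for `open-webui serve`:

```python
import subprocess
import sys

def launch_and_check(cmd: list[str]) -> subprocess.Popen:
    """Start a background process and fail fast if it exited immediately."""
    proc = subprocess.Popen(cmd, stdout=subprocess.DEVNULL, stderr=subprocess.STDOUT)
    # poll() returns None while the process is still running.
    if proc.poll() is not None:
        raise RuntimeError(f"{cmd[0]} exited immediately with code {proc.returncode}")
    return proc

# Demonstrate with a short-lived stand-in instead of `open-webui serve`:
proc = launch_and_check([sys.executable, "-c", "import time; time.sleep(1)"])
print("running:", proc.poll() is None)
proc.wait()
```

A more thorough check would sleep briefly before calling `poll()`, since a crashing server may take a moment to exit; the readiness loop later in the tutorial covers that case.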
import requests

local_url = "http://127.0.0.1:8080"
ready = False
for _ in range(120):
    try:
        r = requests.get(local_url, timeout=2)
        if r.status_code < 500:
            ready = True
            break
    except Exception:
        pass
    time.sleep(2)

if not ready:
    server_log.close()
    with open("/content/open-webui-server.log", "r") as f:
        logs = f.read()[-4000:]
    raise RuntimeError(
        "Open WebUI did not start successfully.\n\n"
        "Recent logs:\n"
        f"{logs}"
    )
print("Open WebUI is running locally at:", local_url)
print("\nCreating public tunnel...")
tunnel_proc = subprocess.Popen(
    [str(cloudflared_path), "tunnel", "--url", local_url, "--no-autoupdate"],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    text=True,
)
We repeatedly check whether the Open WebUI server has started successfully on the local Colab port. If the server does not start properly, we read the recent logs and raise a clear error so that we can understand what went wrong. Once the server is confirmed to be running, we create a public tunnel to make the local interface accessible from outside Colab.
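The poll-until-ready loop can be extracted into a reusable function. This sketch uses only the standard library (`urllib` instead of `requests`, so it has no third-party dependency) but keeps the same retry structure, treating any non-5xx response as "ready"; the throwaway local server below stands in for Open WebUI:

```python
import http.server
import threading
import time
import urllib.error
import urllib.request

def wait_until_ready(url: str, attempts: int = 30, delay: float = 0.1) -> bool:
    """Poll `url` until it answers with a non-5xx status, or give up."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=2) as r:
                if r.status < 500:
                    return True
        except urllib.error.HTTPError as e:
            if e.code < 500:  # 4xx still means the server is up
                return True
        except Exception:
            pass  # connection refused etc.: server not up yet
        time.sleep(delay)
    return False

# Demonstrate against a throwaway local server on an ephemeral port:
srv = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=srv.serve_forever, daemon=True).start()
print(wait_until_ready(f"http://127.0.0.1:{srv.server_port}/"))
srv.shutdown()
```

Passing port 0 lets the OS pick a free port, which keeps the demo independent of whatever is already listening on 8080.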
public_url = None
start_time = time.time()
while time.time() - start_time < 90:
    line = tunnel_proc.stdout.readline()
    if not line:
        time.sleep(1)
        continue
    match = re.search(r"https://[-a-zA-Z0-9]+\.trycloudflare\.com", line)
    if match:
        public_url = match.group(0)
        break

if not public_url:
    with open("/content/open-webui-server.log", "r") as f:
        server_logs = f.read()[-3000:]
    raise RuntimeError(
        "Tunnel started but no public URL was captured.\n\n"
        "Open WebUI server logs:\n"
        f"{server_logs}"
    )
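The URL-matching regex (with the dots escaped so `.` is literal) can be checked in isolation. The log line below is fabricated for illustration, not captured from a real cloudflared run:

```python
import re

# Illustrative sample of a quick-tunnel banner line, not real cloudflared output.
sample = "2024-01-01T00:00:00Z INF | https://sample-words-here-demo.trycloudflare.com |"
match = re.search(r"https://[-a-zA-Z0-9]+\.trycloudflare\.com", sample)
print(match.group(0))  # https://sample-words-here-demo.trycloudflare.com
```

Escaping the dots matters: an unescaped `.` would also match hostnames like `trycloudflareXcom`, which is harmless here but sloppy in a pattern meant to extract a URL.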
print("\n" + "=" * 80)
print("Open WebUI is ready.")
print("Public URL:", public_url)
print("Local URL :", local_url)
print("=" * 80)
print("\nWhat to do next:")
print("1. Open the Public URL.")
print("2. Create your admin account the first time you open it.")
print("3. Go to the model selector and choose:", default_model)
print("4. Start chatting with OpenAI through Open WebUI.")
print("\nUseful notes:")
print("- Your OpenAI API key was passed through environment variables.")
print("- Data persists only for the current Colab runtime unless you mount Drive.")
print("- If the tunnel stops, rerun the cell.")
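Since the data directory is lost when the Colab runtime recycles, one option is to copy it to a persistent location. In Colab, `backup_root` would typically be a mounted Drive path such as `/content/drive/MyDrive` (after `from google.colab import drive; drive.mount("/content/drive")`); this sketch uses plain temporary paths and a placeholder file name so it runs anywhere:

```python
import shutil
import tempfile
from pathlib import Path

def backup_data_dir(data_dir: str, backup_root: str) -> Path:
    """Copy the Open WebUI data directory under `backup_root` and return the copy."""
    src = Path(data_dir)
    dst = Path(backup_root) / src.name
    # dirs_exist_ok lets repeated backups overwrite the previous copy in place.
    shutil.copytree(src, dst, dirs_exist_ok=True)
    return dst

# Demonstrate with temporary stand-in directories and a placeholder file:
src = Path(tempfile.mkdtemp()) / "open-webui-data"
src.mkdir()
(src / "example.db").write_text("placeholder")
dst = backup_data_dir(str(src), tempfile.mkdtemp())
print((dst / "example.db").read_text())  # placeholder
```

Restoring is the mirror operation: copy the backup back over `DATA_DIR` before starting the server.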
def tail_open_webui_logs(lines=80):
    log_path = "/content/open-webui-server.log"
    if not os.path.exists(log_path):
        print("No server log found.")
        return
    with open(log_path, "r") as f:
        content = f.readlines()
    print("".join(content[-lines:]))

def stop_open_webui():
    global server_proc, tunnel_proc, server_log
    for proc in [tunnel_proc, server_proc]:
        try:
            if proc and proc.poll() is None:
                proc.terminate()
        except Exception:
            pass
    try:
        server_log.close()
    except Exception:
        pass
    print("Stopped Open WebUI and tunnel.")

print("\nHelpers available:")
print("- tail_open_webui_logs()")
print("- stop_open_webui()")
We capture the public tunnel URL and print the final access details so that we can open Open WebUI directly in the browser. We also display the next steps for using the interface, including creating an admin account and selecting the configured model. In addition, we define helper functions for checking logs and stopping the running processes, which makes the overall setup easier for us to manage and reuse.
In conclusion, we created a fully functional Open WebUI deployment on Colab and connected it to OpenAI in a secure, structured manner. We installed the application and its supporting packages, provided authentication details via protected input, configured the backend connection to the OpenAI API, and started the local web server powering the interface. We then exposed that server through a public tunnel, making the application usable from a browser without requiring local installation on our machine. In addition, we included helper functions for viewing logs and stopping the running services, which makes the setup easier to manage and troubleshoot during experimentation. Overall, we established a reusable, practical workflow that helps us quickly spin up Open WebUI in Colab, test OpenAI-powered chat interfaces, and reuse the same foundation for future prototyping, demos, and interface-driven AI projects.
Michal Sutter is a data science professional with a Master of Science in Data Science from the University of Padova. With a solid foundation in statistical analysis, machine learning, and data engineering, Michal excels at transforming complex datasets into actionable insights.
