Flet is an open-source Python framework for building real, reactive user interfaces—web, desktop, and mobile—using simple Python code. It wraps Google’s Flutter engine under the hood, so you get native-feeling UI, smooth animations, and a rich set of ready-made controls without touching Dart. You write components, manage state, and handle events in Python, then run the same app in a browser, as a desktop app, or packaged for phones. It’s great for quickly turning scripts and data tools into polished, shareable apps.
```shell
sudo apt update && sudo apt install -y python3 python3-pip python3-venv
```
```shell
# macOS/Linux
python3 -m venv .venv
source .venv/bin/activate

# Windows (PowerShell)
python -m venv .venv
.\.venv\Scripts\Activate.ps1
```
```shell
pip install --upgrade pip
pip install flet==0.28.3
```

Note: Flet is in very active development; we will be using the 0.28.3 release specifically.
After installation inside the activated `.venv`, `flet` becomes a globally recognized command.
Create a new Flet project by running the following command in your workspace directory:
```shell
flet create --project-name MLE-flet-demo --description "becm33mle app demo"
```
Directory structure created by `flet create` (with a local Python virtual environment `.venv` at the project root):

```
├── README.md
├── pyproject.toml
├── .venv
├── src
│   ├── assets
│   │   └── icon.png
│   └── main.py
└── storage
    ├── data
    └── temp
```
Run the template `main.py` app:

```shell
flet run
```
`main.py` is the default entry point of any Flet app. Other files can be launched using `flet run <path_to_file>`. Flet apps can also be launched using `python3 <path_to_file>`, though this approach is not recommended!
You should see the following interactive window:
Flet can be run as a desktop app:
```shell
flet run
```
Or as a web app:

```shell
flet run --web
```
Or on a mobile phone using the official companion app (iOS, Android):
```shell
# Android
flet run --android

# iOS
flet run --ios
```

Note: devices must be on the same local network.
To enable hot reload, run `flet run -d -r`. This will watch for any changes in all subdirectories.
Depending on the target platform, Flet can be a multi-user app (web). For this purpose, Flet spawns a new `Page` for each connected user (on desktop and mobile there is usually just a single page). Each `Page` can have multiple `View`s, which are stacked in a list, acting as a sort of navigation history. `View`s can be appended (opening a new page) or popped (going back). Each `View` has an assortment of `Controls`, building the GUI of that particular `View`.
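The stacking behavior can be pictured with a plain Python list. The following is a toy model of the navigation history, not actual Flet code:

```python
# Toy model of Flet's View stack: a plain list acting as navigation history.
# (Illustration only; real Flet keeps views in page.views.)
view_stack = []

def open_view(route):       # appending = navigating forward
    view_stack.append(route)

def pop_view():             # popping = going back
    view_stack.pop()
    return view_stack[-1]   # the route we land back on

open_view("/")
open_view("/settings")
open_view("/settings/profile")
print(pop_view())  # -> /settings
```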
The `Page` exposes the current `route` and an `on_route_change` event handler. This can be used to handle `View` popping/appending and unknown route handling (404 page not found). The route is everything starting from the first `/` symbol (inclusive), i.e. mydomain.cz`/route/to/somewhere`.
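For illustration, the route string can be dissected like any URL path (plain Python, independent of Flet):

```python
url = "mydomain.cz/route/to/somewhere"

# The route is everything from the first "/" onward (inclusive).
route = url[url.index("/"):]
print(route)     # -> /route/to/somewhere

# Individual segments, useful for matching inside a route-change handler:
segments = [s for s in route.split("/") if s]
print(segments)  # -> ['route', 'to', 'somewhere']
```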
Each page below is a class inheriting from `View`. The app builds the view stack in `on_route_change` and handles back navigation in `on_view_pop`.
```python
import flet as ft


class MainView(ft.View):
    def __init__(self, page: ft.Page):
        super().__init__(
            route="/",
            appbar=ft.AppBar(title=ft.Text("Main")),
            controls=[
                ft.Text("This is the main page"),
                ft.Row([
                    ft.ElevatedButton("Go to 1", on_click=lambda _: page.go("/1")),
                    ft.ElevatedButton("Go to 2", on_click=lambda _: page.go("/2")),
                ]),
            ],
        )


class PageOne(ft.View):
    def __init__(self, page: ft.Page):
        super().__init__(
            route="/1",
            appbar=ft.AppBar(title=ft.Text("Page 1")),
            controls=[
                ft.Text("Hello from page 1"),
                ft.ElevatedButton("Home", on_click=lambda _: page.go("/")),
            ],
        )


class PageTwo(ft.View):
    def __init__(self, page: ft.Page):
        super().__init__(
            route="/2",
            appbar=ft.AppBar(title=ft.Text("Page 2")),
            controls=[
                ft.Text("Hello from page 2"),
                ft.ElevatedButton("Home", on_click=lambda _: page.go("/")),
            ],
        )


def main(page: ft.Page):
    page.title = "Routing demo"

    def route_change(e: ft.RouteChangeEvent):
        page.views.clear()
        page.views.append(MainView(page))
        if page.route == "/1":
            page.views.append(PageOne(page))
        if page.route == "/2":
            page.views.append(PageTwo(page))
        page.update()

    def view_pop(e: ft.ViewPopEvent):
        page.views.pop()
        page.go(page.views[-1].route)

    page.on_route_change = route_change
    page.on_view_pop = view_pop
    page.go(page.route)


ft.app(target=main)
```
- `ft.app(target=main)` starts the app and creates a `Page` (one per user).
- `Page`, `View`, and anything placed in the `View`s are so-called `controls` (i.e. `ft.Text`, `ft.ElevatedButton`, …).
- For a change to a `control` to take effect, we must call its `.update()` method or any of its parents' `.update()` methods.
- Notice that we pass the `Page` to our `View`s.
- All `controls` have a `.page` property, which is only accessible if the control is currently “visible” on the page (usually not accessible in `__init__`).
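The `.update()` contract can be mimicked with a toy class. This is only a sketch of the idea (pending state becomes visible once the control, or an ancestor, updates), not Flet's actual implementation:

```python
# Toy sketch of the .update() contract; NOT Flet internals.
# A control's changes become "visible" only after update() is called
# on it or on one of its ancestors.
class ToyControl:
    def __init__(self, value=""):
        self.value = value      # pending state (what your code sets)
        self.rendered = value   # what the "screen" currently shows
        self.children = []

    def update(self):
        self.rendered = self.value       # push pending state to the screen
        for child in self.children:      # and recurse, like a parent's update()
            child.update()

label = ToyControl("old")
root = ToyControl()
root.children.append(label)

label.value = "new"             # the change alone is not visible yet...
assert label.rendered == "old"
root.update()                   # ...until the control or a parent updates
assert label.rendered == "new"
```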
Flet ships with many built-ins, but you can create reusable components by styling a built-in control or composing several controls into one. Pick the closest existing control (e.g., `Container`, `Row`, `Text`, `ElevatedButton`, or even `View`) and inherit from it.
This mirrors the `ft.View` subclassing from the routing example: you inherit from the most appropriate control and specialize it.
A primary button with consistent look & behavior used app-wide.
```python
import flet as ft


class PrimaryButton(ft.ElevatedButton):
    def __init__(self, **kwargs):
        super().__init__(
            bgcolor=ft.Colors.BLUE,
            color=ft.Colors.WHITE,
            style=ft.ButtonStyle(shape=ft.RoundedRectangleBorder(radius=8)),
            **kwargs,
        )
```
A small component composed of a label + value + two buttons. We inherit from a `Row` and manage internal state.
```python
import flet as ft


class Counter(ft.Row):
    def __init__(self, value: int = 0):
        super().__init__(alignment=ft.MainAxisAlignment.START, spacing=20)
        self._value = value
        self.value_text = ft.Text(str(self._value), weight=ft.FontWeight.BOLD)
        dec_btn = ft.IconButton(ft.Icons.REMOVE, on_click=lambda _: self._value_change(delta=-1))
        inc_btn = ft.IconButton(ft.Icons.ADD, on_click=lambda _: self._value_change(delta=1))
        self.controls = [ft.Text("Count:"), self.value_text, dec_btn, inc_btn]

    def _value_change(self, delta=0):
        self._value += delta
        self.value_text.value = str(self._value)
        self.update()
```
A tiny app demo using both custom controls.
```python
import flet as ft


def main(page: ft.Page):
    page.title = "Custom controls (modern API)"
    out = ft.Text("Click the button or use the counter.")
    page.add(
        PrimaryButton(
            text="Primary action",
            on_click=lambda e: (setattr(out, "value", "Clicked!"), page.update()),
        ),
        Counter(3),
        out,
    )


ft.app(target=main)
```
Typical base controls to inherit from:

- `Text` (e.g., timer, counter, clock, …)
- `Row` / `Column`
- `Container` (most common)
- `View` (as in the routing example)
Two ways to run the demo:
Connect to `gpu.fel.cvut.cz` and run Ollama remotely, then forward the API port to your machine.
You need a dedicated GPU with ~8 GB of VRAM to run the LLM locally!
```shell
ollama serve
ollama pull llama3:8b
```
From your laptop, open the tunnel first:
```shell
ssh -L 11434:localhost:11434 <your_ctu_username>@gpu.fel.cvut.cz
```

Explanation:
`-L local_port:remote_host:remote_port` maps your local `11434` to the remote `localhost:11434`. After connecting, any request you send to `http://localhost:11434` on your laptop goes through the SSH tunnel to the server’s Ollama.
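If you want to sanity-check the tunnel without installing any client library, the request can be built with the standard library alone. A minimal sketch: the payload follows Ollama's `/api/generate` endpoint, and the actual network call is left commented out so nothing here requires a running server:

```python
import json
import urllib.request


def build_generate_request(model: str, prompt: str,
                           host: str = "http://localhost:11434") -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_generate_request("llama3:8b", "Say hello from CTU GPU.")
print(req.full_url)  # -> http://localhost:11434/api/generate
# With the tunnel open, you could now send it:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```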
You must be connected to the CTU network or connected via the CTU VPN. More info about the CTU GPU cluster can be found here.
Download the latest release (as of writing v0.12.4):
```shell
wget https://github.com/ollama/ollama/releases/download/v0.12.4-rc6/ollama-linux-amd64.tgz
```
Unpack to a folder in your home:
```shell
mkdir ollama
tar -xvzf ollama-linux-amd64.tgz -C ollama
```
Run the server (keeps the process in the foreground):
```shell
./ollama/bin/ollama serve
```
In a new SSH tab, pull the model:
```shell
./ollama/bin/ollama pull llama3:8b
```

(Adjust the tag if you prefer a different small variant.)
With the SSH tunnel still open, you can hit the remote API at your localhost:
```shell
# list models
curl http://localhost:11434/api/tags

# quick generate call
curl http://localhost:11434/api/generate -d '{"model":"llama3:8b","prompt":"Say hello from CTU GPU."}'
```
In both cases the app talks to `localhost`, either directly or through the SSH tunnel.
A Flet UI using the `ollama` Python package (`pip install ollama`) against the API on `http://localhost:11434`.
```python
import threading
from dataclasses import dataclass
from typing import List, Dict, Optional

import flet as ft

try:
    import ollama
except ImportError:
    ollama = None


# -----------------------------
# Data structures
# -----------------------------
@dataclass
class ChatMsg:
    is_user: bool
    content: str


# -----------------------------
# UI Message Bubble
# -----------------------------
class MessageBubble(ft.Container):
    def __init__(self, is_user: bool, text: str):
        super().__init__()
        self.padding = 12
        self.margin = ft.margin.only(
            left=40 if is_user else 0,
            right=0 if is_user else 40,
            top=6,
            bottom=6,
        )
        self.bgcolor = ft.Colors.BLUE_800 if is_user else ft.Colors.GREY_800
        self.border_radius = ft.border_radius.all(14)
        self.content = ft.Text(text, selectable=True, color=ft.Colors.WHITE, size=14)

    def set_text(self, text: str):
        if isinstance(self.content, ft.Text):
            self.content.value = text


# -----------------------------
# Main app
# -----------------------------
class FletOllamaDemo:
    def __init__(self, page: ft.Page):
        self.page = page
        self.page.title = "Ollama Demo"
        self.page.theme_mode = ft.ThemeMode.DARK
        self.page.padding = 0
        self.page.window.width = 480
        self.page.window.height = 640

        # chat state
        self.messages: List[ChatMsg] = []
        self.streaming_thread: Optional[threading.Thread] = None
        self.stop_event = threading.Event()
        self.model = "llama3:8b"

        # controls
        self.chat_list = ft.ListView(expand=True, spacing=0, padding=16)
        self.input_field = ft.TextField(
            hint_text="Type and press Enter…",
            autofocus=True,
            shift_enter=True,
            min_lines=1,
            max_lines=5,
            expand=True,
            on_submit=self._on_send_clicked,
        )
        self.send_btn = ft.FilledButton("Send", icon=ft.Icons.SEND_ROUNDED, on_click=self._on_send_clicked)
        self.stop_btn = ft.OutlinedButton("Stop", icon=ft.Icons.STOP, on_click=self._on_stop_clicked, disabled=True)

        bottombar = ft.Container(
            padding=12,
            content=ft.Row(
                [self.input_field, self.send_btn, self.stop_btn],
                vertical_alignment=ft.CrossAxisAlignment.CENTER,
            ),
        )
        body = ft.Column(
            [
                ft.Container(content=self.chat_list, expand=True),
                bottombar,
            ],
            expand=True,
        )
        self.page.add(body)

    def _ensure_model(self) -> bool:
        if ollama is None:
            self.page.open(ft.SnackBar(ft.Text("Ollama not available: pip install ollama and run the server."), open=True))
            self.page.update()
            return False
        try:
            have = {m.get("model") for m in (ollama.list() or {}).get("models", [])}
            if self.model not in have:
                ollama.pull(self.model)
            return True
        except Exception as ex:
            try:
                ollama.pull(self.model)
                return True
            except Exception:
                self.page.open(ft.SnackBar(ft.Text(f"Model setup error: {ex}"), open=True))
                self.page.update()
                return False

    def _on_send_clicked(self, e: Optional[ft.ControlEvent] = None):
        text = (self.input_field.value or "").strip()
        if not text:
            return
        self.input_field.value = ""
        self._append_user_message(text)
        self._start_streaming()

    def _on_stop_clicked(self, e: Optional[ft.ControlEvent] = None):
        self.stop_event.set()

    def _append_user_message(self, text: str):
        self.messages.append(ChatMsg(is_user=True, content=text))
        self.chat_list.controls.append(MessageBubble(is_user=True, text=text))
        self.chat_list.controls.append(MessageBubble(is_user=False, text=""))  # AI reply
        self.page.update()

    def _start_streaming(self):
        if self.streaming_thread and self.streaming_thread.is_alive():
            return
        self.stop_event.clear()
        self.send_btn.disabled = True
        self.stop_btn.disabled = False
        self.page.update()

        def run():
            err = None
            try:
                self._stream_from_ollama()
            except Exception as ex:
                err = ex
            finally:
                self.send_btn.disabled = False
                self.stop_btn.disabled = True
                if err:
                    self.page.open(ft.SnackBar(ft.Text(f"Error: {err}"), open=True))
                self.page.update()

        self.streaming_thread = threading.Thread(target=run, daemon=True)
        self.streaming_thread.start()

    def _collect_messages(self) -> List[Dict[str, str]]:
        msgs: List[Dict[str, str]] = []
        for m in self.messages:
            msgs.append({"role": "user" if m.is_user else "assistant", "content": m.content})
        return msgs

    def _stream_from_ollama(self):
        # find the last bubble to stream to
        if not self.chat_list.controls or not isinstance(self.chat_list.controls[-1], MessageBubble):
            return
        assistant_bubble: MessageBubble = self.chat_list.controls[-1]

        if not self._ensure_model():
            assistant_bubble.set_text(
                "Model unavailable. Ensure Ollama client/server are installed and running.\n"
                f"- pip install ollama\n- ollama serve\n- ollama pull {self.model}"
            )
            self.page.update()
            return

        msgs = self._collect_messages()
        full_text = ""
        try:
            stream = ollama.chat(model=self.model, messages=msgs, stream=True)
            for part in stream:
                if self.stop_event.is_set():
                    break
                delta = part.get("message", {}).get("content", "")
                if not delta:
                    continue
                full_text += delta
                assistant_bubble.set_text(full_text)
                self.chat_list.scroll_to(offset=-1, duration=100)
                self.page.update()
        except Exception as ex:
            assistant_bubble.set_text(f"Error contacting model: {ex}")
            self.page.update()
            return

        if full_text.strip():
            self.messages.append(ChatMsg(is_user=False, content=full_text))
        else:
            self.chat_list.controls.pop()
            self.page.update()


def main(page: ft.Page):
    FletOllamaDemo(page)


if __name__ == "__main__":
    ft.app(target=main)
```
Run with:
```shell
flet run
```

Try to run the code on your mobile device and/or in the browser.
Use `export OLLAMA_HOST=http://127.0.0.1:12345` to change the expected API port.
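Client code usually resolves the host the same way. A small sketch of reading the override with a sensible default; the helper name `resolve_ollama_host` is ours, not part of the `ollama` package:

```python
import os


def resolve_ollama_host() -> str:
    # Fall back to Ollama's default port when no override is set.
    return os.environ.get("OLLAMA_HOST", "http://localhost:11434")


os.environ["OLLAMA_HOST"] = "http://127.0.0.1:12345"
print(resolve_ollama_host())  # -> http://127.0.0.1:12345

del os.environ["OLLAMA_HOST"]
print(resolve_ollama_host())  # -> http://localhost:11434
```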
Create the project repo structure on FEE GitLab + a README.md with the required fields (up to 5p). The fields do not have to be finalized, as many features of your project are yet to be added. Focus on drafting out the general structure of your project.
Requirements for README.md:
Note about milestones: