To install the Gradio Python Client from main, run the following command:
pip install 'gradio-client @ git+https://github.com/gradio-app/gradio@3b56b13268ddcaf74d4e85e5dbf5a616d7843995#subdirectory=client/python'

Clients 1.0 Launch!
We’re excited to unveil the first major release of the Gradio clients. We’ve made it even easier to turn any Gradio application into a production endpoint thanks to the clients’ ergonomic, transparent, and portable design.
Ergonomic API 💆
Stream From a Gradio app in 5 lines
Use the `submit` method to get a job you can iterate over.
In Python:
from gradio_client import Client
client = Client("gradio/llm_stream")
for result in client.submit("What's the best UI framework in Python?"):
    print(result)

In TypeScript:
import { Client } from "@gradio/client";
const client = await Client.connect("gradio/llm_stream")
const job = client.submit("/predict", {"text": "What's the best UI framework in Python?"})
for await (const msg of job) console.log(msg.data)

Use the same keyword arguments as the app
In the examples below, the upstream app has a function with parameters called `message`, `system_prompt`, and `tokens`. We can see that the client `predict` call uses the same arguments.
In Python:
from gradio_client import Client
client = Client("http://127.0.0.1:7860/")
result = client.predict(
    message="Hello!!",
    system_prompt="You are helpful AI.",
    tokens=10,
    api_name="/chat"
)
print(result)

In TypeScript:
import { Client } from "@gradio/client";
const client = await Client.connect("http://127.0.0.1:7860/");
const result = await client.predict("/chat", {
    message: "Hello!!",
    system_prompt: "You are helpful AI.",
    tokens: 10,
});
console.log(result.data);

Better Error Messages
If something goes wrong in the upstream app, the client will raise the same exception as the app, provided that `show_error=True` is set in the original app's `launch()` function or the exception is a `gr.Error`.
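For instance, here's a minimal sketch of catching an upstream error on the client side; the local URL and the `/predict` endpoint are placeholders for an app that raises `gr.Error` on bad input:

from gradio_client import Client

# Hypothetical app running locally that raises gr.Error for an empty prompt
client = Client("http://127.0.0.1:7860/")

try:
    client.predict("", api_name="/predict")
except Exception as e:
    # The upstream error message is surfaced in the client-side exception
    print(f"The app raised: {e}")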
Transparent Design 🪟
Anything you can do in the UI, you can do with the client:
- 🔐 Authentication
- 🛑 Job Cancelling
- ℹ️ Access Queue Position
- 📕 View the API information
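For example, here's a minimal sketch of authentication, job cancelling, and viewing the API information; the Space name, credentials, and `/predict` endpoint below are placeholders:

from gradio_client import Client

# Hypothetical password-protected app; swap in your own Space and credentials
client = Client("gradio/diffusion_model", auth=("username", "password"))

# Print the API information (endpoints, parameters, return types)
client.view_api()

# Start a job, then cancel it before it finishes
job = client.submit("A cute cat", api_name="/predict")
job.cancel()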
Here's an example showing how to display the queue position of a pending job:
from gradio_client import Client
client = Client("gradio/diffusion_model")
job = client.submit("A cute cat")
while not job.done():
    status = job.status()
    print(f"Currently in position {status.rank} out of {status.queue_size}")

Portable Design ⛺️
The client can run from pretty much any Python or JavaScript environment (Node, Deno, the browser, Service Workers).
Here's an example using the client from a Flask server running on gevent:
from gevent import monkey
monkey.patch_all()
from gradio_client import Client
from flask import Flask, send_file
import time
app = Flask(__name__)
imageclient = Client("gradio/diffusion_model")
@app.route("/gen")
def gen():
    result = imageclient.predict(
        "A cute cat",
        api_name="/predict"
    )
    return send_file(result)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)

v1.0 Migration Guide and Breaking Changes
Python
- The `serialize` argument of the `Client` class was removed; it no longer has any effect.
- The `upload_files` argument of the `Client` was removed.
- All filepaths must be wrapped in the `handle_file` function. For example, `caption = client.predict(handle_file('./dog.jpg'))`.
- The `output_dir` argument was removed. It is now specified in the `download_files` argument, as shown in the sketch below.
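Here's a minimal sketch putting these changes together; the local URL, the `/predict` endpoint, and the download directory are placeholders:

from gradio_client import Client, handle_file

# download_files replaces the old output_dir argument (path is a placeholder)
client = Client("http://127.0.0.1:7860/", download_files="/tmp/gradio_outputs")

# Local files must now be wrapped with handle_file before being sent
caption = client.predict(handle_file("./dog.jpg"), api_name="/predict")
print(caption)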
JavaScript
The client has been redesigned entirely. It was refactored from a function into a class. An instance can now be constructed by awaiting the `connect` method.
const app = await Client.connect("gradio/whisper")

The app variable has the same methods as the Python class (submit, predict, view_api, duplicate).