LangFlow - An Effortless Way to Experiment and Prototype LangChain Pipelines

Author: ocean

📦 Installation

To install LangFlow locally, you can use pip:

pip install langflow

Once installed, you can run LangFlow using the following command:

python -m langflow
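
By default, this starts the LangFlow UI on a local port (typically http://127.0.0.1:7860). If you need to bind to a different host or port, the CLI exposes options for that; the flags below are an assumption and may differ between versions, so check the help output first:

# Assumed flags; verify with `python -m langflow --help` for your installed version.
python -m langflow --host 0.0.0.0 --port 7860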

Alternatively, LangFlow can be deployed on Google Cloud Platform (GCP) or Jina AI Cloud. The deployment process for both platforms is well-documented and can be found in the LangFlow repository.

🚀 Deployment Options

LangFlow can be deployed on Google Cloud Platform (GCP) using Google Cloud Shell; a step-by-step guide with detailed instructions is available in the LangFlow repository.

Another deployment option is Jina AI Cloud, which integrates with LangFlow to provide a one-command deployment. To deploy LangFlow on Jina AI Cloud, you need to install langchain-serve using pip:

pip install -U langchain-serve

Once installed, you can deploy LangFlow on Jina AI Cloud using the following command:

langflow --jcloud

The deployment process generates a link for accessing the LangFlow server, which usually takes a couple of minutes to start up. More information about managing the server can be found in the langchain-serve repository.

🌐 API Usage

LangFlow provides an API for interacting with the server. You can use LangFlow directly in your browser or call the API endpoints of a Jina AI Cloud deployment. Here is an example of how to use the API with Python:

import requests

# Endpoint and flow ID generated by the Jina AI Cloud deployment
BASE_API_URL = "https://langflow-e3dd8820ec.wolf.jina.ai/api/v1/predict"
FLOW_ID = "864c4f98-2e59-468b-8e13-79cd8da07468"

# Optional per-component overrides, keyed by the component IDs used in the flow
TWEAKS = {
  "ChatOpenAI-g4jEr": {},
  "ConversationChain-UidfJ": {}
}

def run_flow(message: str, flow_id: str, tweaks: dict = None) -> dict:
    """Send a message to the deployed flow and return the JSON response."""
    api_url = f"{BASE_API_URL}/{flow_id}"

    payload = {"message": message}

    # Only include tweaks when overrides are provided
    if tweaks:
        payload["tweaks"] = tweaks

    response = requests.post(api_url, json=payload)
    return response.json()

print(run_flow("Your message", flow_id=FLOW_ID, tweaks=TWEAKS))

The API allows you to send a message to the LangFlow server and receive a JSON response. You can customize the flow by adding tweaks to the payload.
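
The exact keys in the tweaks dictionary depend on the component IDs of your own flow. As a minimal sketch, overriding settings of the ChatOpenAI node from the example above might look like the following; the parameter names are assumptions based on the fields shown in the LangFlow UI:

# Hypothetical overrides: the node IDs come from the exported flow, and the
# parameter names ("model_name", "temperature") are assumed to match the
# corresponding fields in the LangFlow UI.
CUSTOM_TWEAKS = {
    "ChatOpenAI-g4jEr": {
        "model_name": "gpt-3.5-turbo",
        "temperature": 0.2,
    },
    "ConversationChain-UidfJ": {},  # leave this node unchanged
}

print(run_flow("Your message", flow_id=FLOW_ID, tweaks=CUSTOM_TWEAKS))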

🎨 Creating Flows

Creating flows with LangFlow is easy and intuitive. You can drag and drop components onto the canvas and connect them to build your pipeline. LangFlow provides a wide range of components, including LLMs, prompt serializers, agents, and chains.

You can explore different components by editing prompt parameters, linking chains and agents, and tracking an agent's thought process. Once you have created your flow, you can export it as a JSON file and use it with LangChain. To export a flow, click the "Export" button in the top right corner of the canvas. In Python, you can load the flow using the load_flow_from_json function:

from langflow import load_flow_from_json

# Build the flow (a LangChain object) from the exported JSON file
flow = load_flow_from_json("path/to/flow.json")

# The loaded flow can be called like any other LangChain chain or agent
flow("Hey, have you heard of LangFlow?")

👋 Contributing

LangFlow is an open-source project, and contributions from developers of all levels are welcome. If you would like to contribute, please check the contributing guidelines in the LangFlow repository.

Join the LangFlow Discord server to ask questions, make suggestions, and showcase your projects.

📄 License

LangFlow is released under the MIT License. See the LICENSE file in the LangFlow repository for more details.

LangFlow is a powerful tool that simplifies experimenting with and prototyping LangChain pipelines. With its easy installation, flexible deployment options, and intuitive interface for creating flows, LangFlow is a valuable asset for AI enthusiasts and developers. Whether you are a beginner or an experienced user, LangFlow provides the tools and resources to explore the capabilities of LangChain and create innovative AI solutions.