Prompt Studio exists so you can develop the prompts needed for document data extraction quickly and efficiently. It is a purpose-built environment that makes this work not just easy, but genuinely fun. The document sample and its variants, the prompts you're developing, outputs from different LLMs, the schema you're building, the cost of each extraction, and various tools that let you measure the effectiveness of your prompts are all just a click away. Prompt Studio is designed for fast, effective development and iteration of prompts for document data extraction. Welcome to IDP 2.0!
Automate critical business processes that involve complex documents with a human in the loop. Go beyond RPA with the power of Large Language Models.
🌟 Step 1: Add documents to the no-code Prompt Studio and do the prompt engineering needed to extract the required fields
🌟 Step 2: Configure the Prompt Studio project as an API deployment, or configure an input source and output destination for an ETL Pipeline
🌟 Step 3: Deploy Workflows as unstructured data APIs or unstructured data ETL Pipelines! (See the example API call below.)
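Once a Workflow is deployed as an API, documents can be sent to it over plain HTTP. The request below is only a sketch: the host, organization ID, deployment name, API key and endpoint path are placeholders, and the exact request format for your deployment is shown in the Unstract UI when you create the API deployment.

```bash
# Hypothetical request: all values in angle brackets are placeholders, and the
# endpoint path may differ; copy the exact URL from your API deployment's settings.
curl -X POST \
  -H "Authorization: Bearer <your-api-key>" \
  -F "files=@/path/to/credit_card_statement.pdf" \
  "https://<unstract-host>/deployment/api/<org-id>/<api-deployment-name>/"
```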
- 8GB RAM (recommended)
- Linux or macOS (Intel or M-series)
- Docker
- Docker Compose (if you need to install it separately)
- Git
Next, either download a release or clone this repo and do the following:
✅ Run `./run-platform.sh`
✅ Now visit http://frontend.unstract.localhost in your browser
✅ Use username and password `unstract` to log in
That's all there is to it!
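Put together, a local setup looks roughly like this (a minimal sketch, assuming you clone the GitHub repository rather than downloading a release):

```bash
# Clone the repository (or download and extract a release instead)
git clone https://github.com/Zipstack/unstract.git
cd unstract

# Build and start the platform containers
./run-platform.sh

# Then visit http://frontend.unstract.localhost in your browser
# and log in with username "unstract" and password "unstract".
```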
Follow these steps to change the default username and password.
See the user guide for more details on managing the platform.
Another really quick way to experience Unstract is by signing up for our hosted version. It comes with a 14-day free trial!
Unstract comes well documented. You can get introduced to the basics of Unstract and learn how to connect various systems like LLMs, Vector Databases, Embedding Models and Text Extractors to it. The easiest way to get your feet wet is to go through our Quick Start Guide, where you actually get to do some prompt engineering in Prompt Studio and launch an API to structure varied credit card statements!
Vector Database Provider | Status
---|---
Qdrant | ✅ Working
Weaviate | ✅ Working
Pinecone | ✅ Working
PostgreSQL | ✅ Working
Milvus | ✅ Working
LLM Provider | Status
---|---
OpenAI | ✅ Working
Azure OpenAI | ✅ Working
Google PaLM | ✅ Working
Ollama | ✅ Working
Text Extractor | Status
---|---
Unstract LLMWhisperer | ✅ Working
Unstructured.io Community | ✅ Working
Unstructured.io Enterprise | ✅ Working
LlamaIndex Parse | ✅ Working
Database / Warehouse Provider | Status
---|---
Snowflake | ✅ Working
Amazon Redshift | ✅ Working
Google BigQuery | ✅ Working
PostgreSQL | ✅ Working
MySQL | ✅ Working
MariaDB | ✅ Working
Microsoft SQL Server | ✅ Working
Contributions are welcome! Please see CONTRIBUTING.md for details on how to get started.
- On Slack, join great conversations around LLMs, their ecosystem and leveraging them to automate the previously unautomatable!
- Follow us on X/Twitter
- Follow us on LinkedIn
Be sure to copy the value of the `ENCRYPTION_KEY` config in either the `backend/.env` or `platform-service/.env` file to a secure location.
Adapter credentials are encrypted by the platform using this key. Its loss or change will make all existing adapters inaccessible!
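For example, a quick way to back the key up (the destination path below is purely illustrative; use whichever secure store you prefer):

```bash
# Illustrative only: copy the ENCRYPTION_KEY entry to a backup file and restrict its permissions
grep '^ENCRYPTION_KEY' backend/.env > ~/unstract-encryption-key.backup
chmod 600 ~/unstract-encryption-key.backup
```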
In the interest of full disclosure, Unstract integrates Posthog to track usage analytics. As you can verify by inspecting the relevant code here, we collect the minimum possible metrics. Posthog can be disabled if desired by setting `REACT_APP_ENABLE_POSTHOG` to `false` in the frontend's `.env` file.
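For instance, the following line in the frontend's `.env` file (assumed to live under the `frontend/` directory, alongside the `backend/.env` mentioned above) turns analytics off:

```bash
# Disable Posthog usage analytics in the frontend's .env file
REACT_APP_ENABLE_POSTHOG=false
```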