Open Source LLM Engineering Platform.
Traces, evals, prompt management and metrics to debug and improve your LLM application.
🪢 Open source LLM engineering platform: LLM Observability, metrics, evals, prompt management, playground, datasets. Integrates with LlamaIndex, Langchain, OpenAI SDK, LiteLLM, and more. YC W23
Prompt Engineering, Evaluation, and Observability for LLM apps.
Your End-to-End Collaborative Open Source End-to-End LLM Engineering Platform.
Agenta provides integrated tools for prompt engineering, versioning, evaluation, and observability—all in one place.
The open-source LLMOps platform: prompt playground, prompt management, LLM evaluation, and LLM Observability all in one place.
A Prompt Manager that focuses on On-Premise and developer experience.
Prompt optimization scratch.
Promptim is an experimental prompt optimization library to help you systematically improve your AI systems.
Promptim automates the process of improving prompts on specific tasks. You provide initial prompt, a dataset, and custom evaluators (and optional human feedback), and promptim runs an optimization loop to produce a refined prompt that aims to outperform the original.
Related contents:
Prompt design using JSX.
Priompt (priority + prompt) is a JSX-based prompting library. It uses priorities to decide what to include in the context window.
Priompt is an attempt at a prompt design library, inspired by web design libraries like React.
Guides, papers, lecture, notebooks and resources for prompt engineering.
Prompt engineering is a relatively new discipline for developing and optimizing prompts to efficiently use language models (LMs) for a wide variety of applications and research topics. Prompt engineering skills help to better understand the capabilities and limitations of large language models (LLMs).
BAML is an expressive language for structured text generation.
BAML is a domain-specific language to write and test LLM functions.
In BAML, prompts are treated like functions. An LLM function is a prompt template with some defined input variables, and a specific output type like a class, enum, union, optional string, etc.
With BAML you can write and test a complex LLM function in 1/10 of the time it takes to setup a python LLM testing environment.
Best ChatGPT Prompts & AI Prompts Community
automatically tests prompt injection attacks on ChatGPT instances.
Prompt injection is a type of security vulnerability that can be exploited to control the behavior of a ChatGPT instance. By injecting malicious prompts into the system, an attacker can force the ChatGPT instance to do unintended actions.
Your Guide to Communicating with Artificial Intelligence.
Learn how to use ChatGPT and other AI tools to accomplish your goals using our free and open source curriculum, designed for all skill levels!