LocalAI is the free, open-source OpenAI alternative. It acts as a drop-in replacement REST API compatible with the OpenAI API specification for local inference. It allows you to run LLMs, generate images, audio (and more) locally or on-prem on consumer-grade hardware, supporting multiple model families and architectures. It does not require a GPU. It is created and maintained by Ettore Di Giacinto.
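Because LocalAI exposes an OpenAI-compatible API, existing OpenAI client code can usually be pointed at it just by changing the base URL. A minimal sketch of building a chat-completions request payload (the localhost:8080 endpoint reflects LocalAI's documented default, and the model name is a placeholder for whatever model you have loaded):

```python
import json

# Assumed default local endpoint; adjust to match your deployment.
BASE_URL = "http://localhost:8080/v1"

def chat_completion_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-style /chat/completions payload for a local model."""
    return {
        "model": model,  # placeholder name of a model loaded into LocalAI
        "messages": [
            {"role": "user", "content": user_message},
        ],
        "temperature": 0.2,
    }

payload = chat_completion_request("llama-3.2-1b", "Say hello.")
# POST this JSON to f"{BASE_URL}/chat/completions" with any HTTP client.
print(json.dumps(payload, indent=2))
```

Since the request and response shapes match OpenAI's, the official client libraries also work by setting their base URL to the local server.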
A lightweight next-gen data explorer for Postgres, MySQL, SQLite, MongoDB, Redis, MariaDB, Elasticsearch, and ClickHouse, with a chat interface.
The first and best multi-agent framework. Finding the Scaling Law of Agents.
Building Multi-Agent Systems for Task Automation.
CAMEL is an open-source community dedicated to finding the scaling laws of agents. We believe that studying these agents on a large scale offers valuable insights into their behaviors, capabilities, and potential risks. To facilitate research in this field, we implement and support various types of agents, tasks, prompts, models, and simulated environments.
CAMEL emerged as the earliest LLM-based multi-agent framework and has since grown into a generic framework for building and using LLM-based agents to solve real-world tasks.
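CAMEL popularized role-playing between cooperating agents. A stripped-down sketch of that turn-taking pattern, with a stubbed model call standing in for real LLM requests (this illustrates the idea only, not CAMEL's actual API):

```python
def run_roleplay(ask_model, task: str, rounds: int = 2) -> list[str]:
    """Alternate messages between a 'user' agent and an 'assistant' agent.

    ask_model(role, history) stands in for a real LLM call; in a framework
    like CAMEL each role would be driven by its own system prompt.
    """
    transcript = [f"Task: {task}"]
    for _ in range(rounds):
        for role in ("user_agent", "assistant_agent"):
            reply = ask_model(role, transcript)
            transcript.append(f"{role}: {reply}")
    return transcript

# Stubbed model so the sketch runs without any API key.
stub = lambda role, history: f"step {len(history)}"
log = run_roleplay(stub, "design a trading bot", rounds=1)
print(log)
```

In a real run, each agent's reply is conditioned on the shared transcript, which is what lets the two roles decompose and solve the task cooperatively.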
Google AI for Developers
MCP server for your browser.
Browser MCP is a Model Context Protocol (MCP) server that allows AI applications to control your browser.
If you want to automate actions on a website, such as repeatedly filling out a form, you normally can't do it with AI apps like Cursor or Claude because they don't have access to a web browser. With Browser MCP, you can connect AI apps to your browser so they can automate tasks on your behalf.
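Under the hood, MCP clients and servers speak JSON-RPC 2.0, and a client invokes a server's tools with a tools/call request. A minimal sketch of serializing such a request (the browser_navigate tool name and its arguments are hypothetical, for illustration only):

```python
import json

def mcp_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical tool name and arguments, for illustration.
msg = mcp_tool_call(1, "browser_navigate", {"url": "https://example.com"})
print(msg)
```

The AI app never touches the browser directly; it sends messages like this to the MCP server, which performs the action and returns the result.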
Turn any GitHub repository into a comprehensive AI-powered documentation hub.
Generate beautiful, world-class documentation from any GitHub repository — instantly.
Just replace "hub" with "summarize" in any GitHub URL to generate a live, interactive documentation hub.
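The URL trick described above is a plain string substitution, turning a github.com address into its gitsummarize.com counterpart, which can be sketched as:

```python
def summarize_url(github_url: str) -> str:
    """Rewrite a github.com URL to its gitsummarize.com equivalent."""
    # Replacing "hub" with "summarize" turns github.com into gitsummarize.com.
    return github_url.replace("github.com", "gitsummarize.com", 1)

print(summarize_url("https://github.com/apache/airflow"))
# → https://gitsummarize.com/apache/airflow
```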
An SDK for working with LLMs and AI Agents from Apache Airflow, based on Pydantic AI.
It allows users to call LLMs and orchestrate agent calls directly within their Airflow pipelines using decorator-based tasks. The SDK leverages the familiar Airflow @task syntax with extensions like @task.llm, @task.llm_branch, and @task.agent.
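Conceptually, a decorator like @task.llm turns a plain function into a task whose return value is sent to a model. The toy stand-in below illustrates that shape with a stubbed model call; it is not the SDK's real implementation, and the decorator name and parameters here are invented for the sketch:

```python
from functools import wraps

def llm_task(model: str, call_model=None):
    """Toy stand-in for a @task.llm-style decorator (illustrative only).

    The wrapped function builds the prompt; the decorator routes it to a
    model. call_model is injectable so the sketch runs without an API key.
    """
    call_model = call_model or (lambda model, prompt: f"[{model}] {prompt}")

    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            prompt = fn(*args, **kwargs)     # task body produces the prompt
            return call_model(model, prompt)  # decorator handles the LLM call
        return wrapper
    return decorator

@llm_task(model="gpt-4o-mini")
def summarize(text: str) -> str:
    return f"Summarize: {text}"

print(summarize("Airflow DAG logs"))
# → [gpt-4o-mini] Summarize: Airflow DAG logs
```

The appeal of the decorator approach is that LLM calls become ordinary tasks, so Airflow's scheduling, retries, and observability apply to them unchanged.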
Transform AI Prototypes into Enterprise-Grade Products.
Langtrace is an Open Source Observability and Evaluations Platform for AI Agents.
LLMs for language and code + Time series and geospatial foundation models.
Achieve over 90% cost savings with Granite's smaller and open models, designed for developer efficiency.
Fit for purpose and open sourced, these enterprise-ready models deliver exceptional performance against safety benchmarks and across a wide range of enterprise tasks from cybersecurity to RAG.
A Copilot for Git.
Cocommit is a command-line tool that works with your HEAD commit and leverages an LLM of your choice to enhance commit quality.
A good commit consists of multiple elements, but at a minimum, it should have a well-crafted commit message. Cocommit analyzes the message from the last (HEAD) commit and suggests improvements, highlighting both strengths and areas for enhancement.
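As a rough illustration, the kinds of checks a tool in this space might run before handing the message to an LLM could look like the following. These heuristics are hypothetical examples, not Cocommit's actual rules:

```python
def lint_commit_message(message: str) -> list[str]:
    """Flag common commit-message problems (illustrative heuristics only)."""
    lines = message.splitlines() or [""]
    subject = lines[0]
    issues = []
    if len(subject) > 50:
        issues.append("subject longer than 50 characters")
    if subject.endswith("."):
        issues.append("subject ends with a period")
    if len(lines) > 1 and lines[1].strip():
        issues.append("missing blank line between subject and body")
    return issues

print(lint_commit_message("fix bug."))
# → ['subject ends with a period']
```

Cheap checks like these can be reported directly, while the LLM handles the harder judgment calls: whether the message actually explains the "why" behind the change.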
Your AI pair programmer.
Balance agent control with agency. Build resilient language agents as graphs.
Gain control with LangGraph to design agents that reliably handle complex tasks. Build and scale agentic applications with LangGraph Platform.
LangGraph (used by Replit, Uber, LinkedIn, GitLab, and more) is a low-level orchestration framework for building controllable agents. While LangChain provides integrations and composable components to streamline LLM application development, the LangGraph library enables agent orchestration, offering customizable architectures, long-term memory, and human-in-the-loop support to reliably handle complex tasks.
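The core idea of agents as graphs, nodes that update state connected by edges that decide where control flows next, can be sketched in a few lines of plain Python. This is a conceptual toy, not LangGraph's actual API:

```python
def run_graph(nodes, edges, state, start, max_steps=10):
    """Execute a tiny state graph: each node transforms the state, and
    each edge function picks the next node from the new state."""
    current = start
    for _ in range(max_steps):       # cap steps to guarantee termination
        state = nodes[current](state)
        current = edges[current](state)
        if current is None:          # a None edge result ends the run
            return state
    return state

nodes = {
    "plan": lambda s: {**s, "plan": f"answer {s['question']}"},
    "act": lambda s: {**s, "answer": s["plan"].upper()},
}
edges = {
    "plan": lambda s: "act",   # unconditional edge
    "act": lambda s: None,     # terminal edge
}
result = run_graph(nodes, edges, {"question": "2+2?"}, "plan")
print(result["answer"])
# → ANSWER 2+2?
```

Real frameworks add what this sketch omits: conditional routing back to earlier nodes (loops), persisted state for long-term memory, and interrupt points where a human can approve or edit the state before the graph continues.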
We are uncovering a new way of building software by embracing AI, iteration, and human intuition.
This is a cleanroom deobfuscation of the official Claude Code npm package.
Fleur is the app store for Claude.
The easiest way to discover and install MCPs.
Fleur is a desktop application that serves as an app marketplace for MCPs. It allows you to discover, install, and manage apps that extend the functionality of Claude Desktop and Cursor.
All without having to use a command line. Fleur is made with non-technical users in mind, but it is open source and extensible so developers can make it their own.
A Datacenter-Scale Distributed Inference Serving Framework.
NVIDIA Dynamo is a high-throughput, low-latency inference framework designed for serving generative AI and reasoning models in multi-node distributed environments. Dynamo is designed to be inference-engine agnostic (supporting TRT-LLM, vLLM, SGLang, and others) and captures LLM-specific capabilities.