<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
    <title>gemma</title>
    <link rel="self" type="application/atom+xml" href="https://links.biapy.com/guest/tags/2009/feed"/>
    <updated>2026-04-30T19:23:30+00:00</updated>
    <id>https://links.biapy.com/guest/tags/2009/feed</id>
            <entry>
            <id>https://links.biapy.com/links/4496</id>
            <title type="text"><![CDATA[Unsloth AI]]></title>
            <link rel="alternate" href="https://unsloth.ai/" />
            <link rel="via" type="application/atom+xml" href="https://links.biapy.com/links/4496"/>
            <author>
                <name><![CDATA[Biapy]]></name>
            </author>
            <summary type="text">
                <![CDATA[Finetune AI & LLMs faster.
Web UI for training and running open models like Gemma 4, Qwen3.6, DeepSeek, and gpt-oss locally.

Unsloth AI training & finetuning: get 30x faster with Unsloth. QLoRA finetuning is 5x faster with 60% less memory. Finetune Mistral, Gemma, and Llama 2-5x faster with 70% less memory!

- [Unsloth @ GitHub](https://github.com/unslothai/unsloth).

Related contents:

- [S4E10 - Quel destin pour l’Apple Vision Pro ? @ Underscore_'s Acast :fr:](https://shows.acast.com/micode-underscore/episodes/s4e10-quel-destin-pour-lapple-vision-pro).
- [7 Lessons from building a small-scale AI application @ Richard Li](https://www.thelis.org/blog/lessons-from-ai).]]>
            </summary>
            <updated>2026-04-30T09:05:43+00:00</updated>
        </entry>
            <entry>
            <id>https://links.biapy.com/links/4497</id>
            <title type="text"><![CDATA[Ollama]]></title>
            <link rel="alternate" href="https://ollama.com/" />
            <link rel="via" type="application/atom+xml" href="https://links.biapy.com/links/4497"/>
            <author>
                <name><![CDATA[Biapy]]></name>
            </author>
            <summary type="text">
                <![CDATA[Get up and running with large language models, locally.

Run Llama 2, Code Llama, Mistral, Gemma, and other models. Customize and create your own.

- [Ollama @ GitHub](https://github.com/ollama/ollama).
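
Once the Ollama server is running locally, it exposes a REST API on port 11434. A minimal sketch against the documented `/api/generate` endpoint (the model name `gemma` is an assumption — substitute whichever model you have pulled):

```python
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # Ollama's default local address

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming completion request for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        f"{OLLAMA_HOST}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# With a local server up (e.g. after `ollama pull gemma`):
# req = build_generate_request("gemma", "Why is the sky blue?")
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```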

Related contents:

- [Local RAG with Ollama, Mistral, and Turso @ Turso's blog](https://turso.tech/blog/local-rag-with-ollama-and-turso-sqlite).
- [S4E10 - Quel destin pour l’Apple Vision Pro ? @ Underscore_'s Acast :fr:](https://shows.acast.com/micode-underscore/episodes/s4e10-quel-destin-pour-lapple-vision-pro).
- [Ollama Course – Build AI Apps Locally @ freeCodeCamp.org's YouTube](https://www.youtube.com/watch?v=GWB9ApTPTv4).
- [Detecting Exposed LLM Servers: A Shodan Case Study on Ollama @ Cisco Blogs](https://blogs.cisco.com/security/detecting-exposed-llm-servers-shodan-case-study-on-ollama).
- [Ollama - 14 000 serveurs IA laissés en libre-service sur Internet @ Korben :fr:](https://korben.info/ollama-serveurs-vulnerabilites-secrete.html).
- [Faire tourner un LLM localement sur votre ordinateur @ Quoi de neuf les devs ? :fr:](https://happytodev.substack.com/p/brent-roose-est-linvite-du-n147-de?open=false#%C2%A7faire-tourner-un-llm-localement-sur-votre-ordinateur).
- [The Ultimate Beginner's Guide to Self-Hosting Your Own AI @ Arsturn](https://www.arsturn.com/blog/the-ultimate-beginners-guide-to-self-hosting-your-own-ai).
- [How to Run and Customize LLMs Locally with Ollama @ freeCodeCamp](https://www.freecodecamp.org/news/run-and-customize-llms-locally-with-ollama/).
- [LLMs on Kubernetes Part 1: Understanding the threat model @ CNCF](https://www.cncf.io/blog/2026/03/30/llms-on-kubernetes-part-1-understanding-the-threat-model/).
- [Faire tourner un modèle IA chez soi avec Ollama @ DomoPi :fr:](https://domopi.eu/faire-tourner-un-modele-ia-chez-soi-avec-ollama/).
- [Using AI for Terraform: running locally with Langflow, OpenSearch, & Ollama @ Rosemary Wang's dev.to](https://dev.to/joatmon08/using-ai-for-terraform-running-a-locally-with-langflow-opensearch-ollama-5co6).]]>
            </summary>
            <updated>2026-04-08T05:59:35+00:00</updated>
        </entry>
    </feed>
