<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
    <title>flink</title>
    <link rel="self" type="application/atom+xml" href="https://links.biapy.com/guest/tags/2023/feed"/>
    <updated>2026-05-01T00:30:35+00:00</updated>
    <id>https://links.biapy.com/guest/tags/2023/feed</id>
        <entry>
            <id>https://links.biapy.com/links/4559</id>
            <title type="text"><![CDATA[Apache Beam®]]></title>
            <link rel="alternate" href="https://beam.apache.org/" />
            <link rel="via" type="application/atom+xml" href="https://links.biapy.com/links/4559"/>
            <author>
                <name><![CDATA[Biapy]]></name>
            </author>
            <summary type="text">
                <![CDATA[The unified Apache Beam model: the easiest way to do batch and streaming data processing. Write once, run anywhere for mission-critical production workloads.

Apache Beam is a unified model for defining both batch and streaming data-parallel processing pipelines, together with a set of language-specific SDKs for constructing pipelines and Runners for executing them on distributed processing backends, including Apache Flink, Apache Spark, Google Cloud Dataflow, and Hazelcast Jet.

- [Beam @ GitHub](https://github.com/apache/beam).]]>
            </summary>
            <updated>2025-08-29T04:36:39+00:00</updated>
        </entry>
        <entry>
            <id>https://links.biapy.com/links/6707</id>
            <title type="text"><![CDATA[Delta Lake]]></title>
            <link rel="alternate" href="https://delta.io/" />
            <link rel="via" type="application/atom+xml" href="https://links.biapy.com/links/6707"/>
            <author>
                <name><![CDATA[Biapy]]></name>
            </author>
            <summary type="text">
                <![CDATA[Delta Lake is an open-source storage framework for building a
Lakehouse architecture with compute engines including Spark, PrestoDB, Flink, Trino, and Hive, and with APIs for Scala, Java, Rust, Ruby, and Python.]]>
            </summary>
            <updated>2025-08-29T10:34:47+00:00</updated>
        </entry>
</feed>
