
Building Data Pipelines With Java And Open Source
Session details

A few years ago, moving data between applications and data stores meant expensive, inflexible monolithic stacks from large software vendors.

Now, with open-source frameworks such as Apache Beam and Apache Airflow, we can define, schedule, and run data processing jobs for both streaming and batch workloads with the same underlying code.

This presentation demonstrates how these tools can glue your applications together. It shows how to run data pipelines as Java code, covers the use cases for such pipelines, and explains how to move them from a local machine to cloud services by changing just a few lines of Java in our Apache Beam code.
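As an illustration of the idea, here is a minimal sketch of a Beam word-count pipeline in Java, assuming the Apache Beam Java SDK and the in-process DirectRunner are on the classpath. The class name, input strings, and flags are illustrative; the point is that switching to a cloud runner (e.g. `--runner=DataflowRunner`) is a matter of pipeline options, not pipeline code.

```java
import java.util.Arrays;

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.FlatMapElements;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptors;

public class WordCountSketch {
  public static void main(String[] args) {
    // Parsing args lets a flag such as --runner=DataflowRunner (plus the
    // project/region options that runner needs) move the same pipeline to
    // the cloud without touching the transform code below.
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
    Pipeline p = Pipeline.create(options);

    p.apply(Create.of("beam runs java", "beam runs pipelines"))
     // Split each line into words.
     .apply(FlatMapElements.into(TypeDescriptors.strings())
         .via(line -> Arrays.asList(line.split(" "))))
     // Count occurrences of each word.
     .apply(Count.perElement())
     // Format and print each count (DirectRunner executes in-process).
     .apply(MapElements.into(TypeDescriptors.strings())
         .via((KV<String, Long> kv) -> {
           String formatted = kv.getKey() + ": " + kv.getValue();
           System.out.println(formatted);
           return formatted;
         }));

    p.run().waitUntilFinish();
  }
}
```

Run without arguments to execute locally on the DirectRunner; the same binary, given different options, runs the pipeline on a managed cloud service.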

Rustam Mehmandarov
Chief Engineer