Our Blog
Data Integration Platform
Blogs about Data Integration Platform
Data Integration Platform · 5 min read
Use Spark interims to troubleshoot and polish low-code Spark pipelines: Part 2
In Part 1, we learned an easy way to troubleshoot a data pipeline using historical, read-only metadata. Now, I want to dig in and polish my individual Spark DataFrames.
Data Integration Platform · 6 min read
Getting started with low-code SQL
Empowering business data users to quickly and easily build scalable data pipelines on the lakehouse
Data Integration Platform · 6 min read
How to implement ETL on Apache Spark
Learn the key steps to follow, how to define an ETL pipeline using Spark APIs and DataFrames, and best practices for testing and optimizing pipelines for maximum efficiency
Data Integration Platform · 5 min read
Hitting data-driven home runs: How the Texas Rangers win by harnessing Prophecy in their data mesh architecture
Learn how the Texas Rangers use Prophecy and Databricks Lakehouse as the foundation of their data mesh architecture to gain a competitive advantage with low-code data engineering.
Data Integration Platform · 11 min read
Empower all business data users with interactive SQL development
Leverage Prophecy’s low-code approach and integration with dbt Core to build SQL data pipelines without compromising on engineering best practices
Data Integration Platform · 15 min read
PySpark hands-on tutorial using a visual IDE
Let's get started with PySpark using a visual interface to generate open-source Spark code.