Make all your teams successful & productive
on Apache Spark & Apache Airflow
with Low-Code Development, Scheduling & Metadata
Prophecy enables many more users, including visual ETL developers and Data Analysts. All you need to do is point and click and write a few SQL expressions to create your pipelines.
Low-Code Spark for workflow development. Low-Code Airflow for scheduling. Metadata Search and Column-Level Lineage for management and governance. We support the complete development & deployment lifecycle.
As you use the Low-Code designer to build your workflows, you are developing high-quality, readable code for Spark and Airflow that is committed to your Git repository.
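To illustrate the style of code a low-code designer can generate, here is a minimal sketch in plain Python: one small, readable function per pipeline step. This is a hypothetical illustration, not actual Prophecy output; in real generated code the source step would be a Spark read (e.g. `spark.read.parquet(...)`) and the transforms would operate on DataFrames.

```python
# Hypothetical sketch of step-per-function pipeline code, the kind of
# readable output a visual designer can commit to Git. All names here
# (load_customers, filter_active, add_tier) are illustrative.

def load_customers():
    # Source step: inlined sample records stand in for a Spark read.
    return [
        {"id": 1, "name": "Ada", "active": True, "spend": 120.0},
        {"id": 2, "name": "Bob", "active": False, "spend": 30.0},
        {"id": 3, "name": "Cy", "active": True, "spend": 75.5},
    ]

def filter_active(rows):
    # Transform step: keep only active customers.
    return [r for r in rows if r["active"]]

def add_tier(rows):
    # Transform step: derive a column from a simple SQL-like expression,
    # e.g. CASE WHEN spend >= 100 THEN 'gold' ELSE 'standard' END.
    return [{**r, "tier": "gold" if r["spend"] >= 100 else "standard"}
            for r in rows]

def run_pipeline():
    # The designer wires the steps into one readable flow.
    return add_tier(filter_active(load_customers()))
```

Because each step is an ordinary function, the generated code can be reviewed, diffed, and unit-tested like any other code in your repository.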
Prophecy gives you a Gem Builder so you can quickly develop and roll out your own frameworks. Examples include Data Quality, Encryption, and new Sources and Targets that extend the built-in ones.
Prophecy works with your existing infrastructure. Our Data Engineering System gives you the primitives (Gems) that support Spark sources, targets, and transformations. Each Gem generates high-quality code on Git that you can see in the code editor. Search and Column-Level Lineage give you visibility for operations and governance. Low-Code Airflow enables you to deploy these workflows quickly and monitor them.
The Prophecy Data Engineering System can be extended with the Gem Builder to add your own standard sources, targets, and transforms to Spark and roll them out to your entire team. You can build your own frameworks, such as a custom Data Quality or auditing library. You can also add custom subgraphs: templates for common operations with a few fill-in-the-blanks for the end user.
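The extension pattern described above can be sketched as a registry of custom transforms plus a reusable template. This is a conceptual sketch under assumed names (`GEM_REGISTRY`, `gem`, `quality_then_mask`), not Prophecy's actual API; it only shows the shape of the idea: register team-wide building blocks, then compose them into a subgraph where end users fill in a few parameters.

```python
# Hypothetical sketch of a custom gem library. The registry, decorator,
# and gem names are illustrative assumptions, not a real Prophecy API.

GEM_REGISTRY = {}

def gem(name):
    # Decorator that registers a transform so the whole team can reuse it.
    def register(fn):
        GEM_REGISTRY[name] = fn
        return fn
    return register

@gem("not_null_check")
def not_null_check(rows, column):
    # Custom data-quality gem: fail fast if the column contains nulls.
    bad = [r for r in rows if r.get(column) is None]
    if bad:
        raise ValueError(f"{len(bad)} rows have null {column!r}")
    return rows

@gem("mask_column")
def mask_column(rows, column):
    # Custom encryption-style gem: mask values in a sensitive column.
    return [{**r, column: "***"} for r in rows]

def quality_then_mask(rows, check_col, mask_col):
    # Subgraph template: a common two-step pattern where the end user
    # only fills in the blanks (which columns to check and mask).
    rows = GEM_REGISTRY["not_null_check"](rows, check_col)
    return GEM_REGISTRY["mask_column"](rows, mask_col)
```

Once a gem is registered, every pipeline in the team can reference it by name, and the subgraph template keeps the two steps wired together consistently.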
Prophecy brings the agility of cloud and DevOps to Data Engineering, increasing the speed of analytics & your business.
Prophecy provides best practices and infrastructure as managed services – making your life and operations simple!
With Prophecy, your workflows are high-performance and leverage the scale-out performance and elasticity of the cloud.