You see a column in a dataset and want to understand how it came to be. How and where was this value computed? What does it represent? Let's debug why it is incorrect.
In Prophecy, you just type the name of the column, and the search matches it against your datasets to find any matching columns. It also searches the code of all your Spark workflows for any read or write of that column and points you to the exact expression that computes it.
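At its simplest, this kind of search is a scan over workflow source for expressions touching the column. Here's a minimal plain-Python sketch of the idea; the workflow snippet and column names are made up for illustration:

```python
import re

def find_column_uses(source: str, column: str) -> list[str]:
    """Return the lines of Spark workflow source that read or write `column`."""
    pattern = re.compile(rf'\b{re.escape(column)}\b')
    return [line.strip() for line in source.splitlines() if pattern.search(line)]

# A toy Spark workflow to search (illustrative, not real Prophecy output)
workflow_src = '''
df = spark.read.parquet("s3://sales/orders")
df = df.withColumn("amount", round(col("gross") - col("discount"), 2))
df.write.parquet("s3://sales/clean_orders")
'''

hits = find_column_uses(workflow_src, "amount")
print(hits)  # the exact expression that computes `amount`
```

Prophecy does this with full knowledge of the workflow graph rather than text matching, so it can distinguish reads from writes and jump straight to the defining expression.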
You look at some data and want to reformat it - how easy can it be on Spark?
In Prophecy, you click the dataset to see sample data. It has first and last name columns - let's combine those - and amount has too many decimal places.
You add a Reformat gem on the canvas and start typing a function in it - the ExpressionBuilder pops up and says: looks like you want reformat, should I add it? - and helps you along. You add the transform, run it, and check the output, all in under a minute.
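The two expressions such a gem would generate are a string concat and a round. Here's a plain-Python sketch of what they do to a row (column names like `first_name` and `last_name` are assumptions; the equivalent PySpark would be `concat_ws(" ", col("first_name"), col("last_name"))` and `round(col("amount"), 2)`):

```python
# Sample row, standing in for the dataset preview
rows = [
    {"first_name": "Ada", "last_name": "Lovelace", "amount": 1234.56789},
]

def reformat(row: dict) -> dict:
    """Apply the two transforms: combine names, round the amount."""
    out = dict(row)
    out["full_name"] = f'{row["first_name"]} {row["last_name"]}'
    out["amount"] = round(row["amount"], 2)
    return out

print([reformat(r) for r in rows])
```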
How simple are your Schedules to develop & deploy? Prophecy gives you low-code Airflow!
In Prophecy, you can see the workflows deployed to Airflow and how they've been doing - succeeding or failing. Click on one and it opens as a visual graph: this one starts with an S3 sensor and runs a Spark workflow when a new file shows up.
Click the code tab and you'll see the generated code in Python - prettier than you could ever write it!
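The control flow that generated DAG encodes is simple: a sensor waits for a key to appear, then the Spark workflow task runs. Here's a plain-Python sketch of that shape, with stand-in functions (the real generated code uses Airflow's S3 sensor and a Spark operator; all names below are illustrative):

```python
import pathlib
import tempfile

def file_sensor(path: pathlib.Path) -> bool:
    """Stand-in for an S3 key sensor: succeed once the file exists."""
    return path.exists()

def run_spark_workflow(path: pathlib.Path) -> str:
    """Stand-in for the Spark workflow task the sensor triggers."""
    return f"processed {path.name}"

# Simulate a file "landing" in the watched location
with tempfile.TemporaryDirectory() as d:
    landing = pathlib.Path(d) / "orders.csv"
    landing.write_text("id,amount\n1,9.99\n")
    if file_sensor(landing):
        print(run_spark_workflow(landing))
```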