If you do not have an existing init script, here is how to create one in Databricks:
Go to your workspace and click "Create -> File"
Paste the DataFlint init script snippet and save. Then, in your cluster configuration, go to Advanced -> Init scripts and add the newly created init script:
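For reference, such an init script is typically just a short bash file that drops the DataFlint jar into the cluster's jar directory before Spark starts. This is only a sketch: the download URL, artifact name, and version below are assumptions, so take the exact snippet and the latest version from the DataFlint documentation.

```shell
#!/bin/bash
# Hypothetical sketch of a DataFlint init script: download the DataFlint jar
# into Databricks' jar directory so Spark loads it on cluster start.
# The URL, artifact name, and version here are assumptions -- use the official
# snippet from the DataFlint docs with the newest published release.
DATAFLINT_VERSION="0.1.0"  # replace with the latest DataFlint version
wget -q -O /databricks/jars/dataflint-spark.jar \
  "https://repo1.maven.org/maven2/io/dataflint/spark_2.12/${DATAFLINT_VERSION}/spark_2.12-${DATAFLINT_VERSION}.jar"
```

Because init scripts run on every node at startup, the jar is in place before your Spark application begins, so no notebook-side installation step is needed.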
Install on Databricks from a notebook
This method works on both Databricks Community Edition and the paid Databricks tiers
Go to your cluster's "Libraries" tab and click the blue "Install new" button:
Choose "Maven" and enter the DataFlint coordinates
The image uses version 0.1.0; you should use the newest DataFlint version :)
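Maven coordinates follow the usual `groupId:artifactId:version` form. The coordinate below is an assumption based on DataFlint's published artifact; confirm the exact group, artifact, and latest version on Maven Central before installing:

```
io.dataflint:spark_2.12:0.1.0
```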
In your notebook or application, run these two lines (in a Python notebook, prefix the cell with %scala):
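For reference, the two lines look like the following. This matches DataFlint's documented Scala entry point at the time of writing, but verify it against the version you installed:

```scala
// Install DataFlint into the running Spark application.
// Assumes the DataFlint library is already attached to the cluster
// via the Maven library install above; `spark` is the SparkSession
// that Databricks provides in every notebook.
import io.dataflint.spark.SparkDataflint

SparkDataflint.install(spark.sparkContext)
```

In a Python notebook, put `%scala` on the first line of the cell so Databricks runs it through the Scala interpreter.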
DataFlint is now installed! Go to the "Spark UI" tab and you will see a "DataFlint" tab. It's highly recommended to use the "Open in new tab" link so you get the best experience.
DataFlint is only available while the cluster is running; once the cluster is terminated, DataFlint will no longer be accessible