This overview summarizes the Spark monitoring APIs available in Microsoft Fabric. It is intended for developers and data engineers who require robust monitoring and diagnostic capabilities for Spark applications.
Fabric Spark Monitoring APIs
Fabric provides APIs to monitor Spark applications at both the workspace and item levels, and to retrieve detailed diagnostics for individual Spark applications.
Workspace and Item-Level APIs
| APIs | Description |
| --- | --- |
| Spark applications in a workspace | Retrieve a list of Spark applications in the workspace. |
| Spark applications for a notebook | Retrieve a list of Spark applications associated with a notebook. |
| Spark applications for a Spark Job Definition | Retrieve a list of Spark applications associated with a Spark Job Definition. |
| Spark applications for a Lakehouse | Retrieve a list of Spark applications associated with a Lakehouse. |
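For example, the following minimal sketch lists the Spark applications (Livy sessions) in a workspace with Python. It assumes a Microsoft Entra access token with permission to call the Fabric REST API, and it assumes the workspace-level endpoint follows the `GET https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/spark/livySessions` pattern; verify the exact route, scopes, and response fields in the Workspace - List Sessions reference before relying on them.

```python
# Minimal sketch: list Spark applications (Livy sessions) in a workspace.
# Assumes a valid Microsoft Entra access token and the endpoint pattern noted above.
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"
workspace_id = "<your-workspace-id>"        # placeholder
access_token = "<your-entra-access-token>"  # placeholder; acquire via Microsoft Entra ID

resp = requests.get(
    f"{FABRIC_API}/workspaces/{workspace_id}/spark/livySessions",
    headers={"Authorization": f"Bearer {access_token}"},
)
resp.raise_for_status()

# The field names below are assumptions about the response shape; inspect the
# actual payload documented in the API reference.
for session in resp.json().get("value", []):
    print(session.get("livyId"), session.get("state"), session.get("itemType"))
```

The item-level list APIs work the same way; only the resource path changes to target a specific notebook, Spark Job Definition, or Lakehouse.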
Single Spark Application APIs
These APIs are used for deep-dive diagnostics, providing comprehensive details, metrics, and logs for individual Spark applications.
| APIs | Description |
| --- | --- |
| Notebook Run | Retrieve detailed information for the Spark application that executed a specific notebook run. |
| Spark Job Definition Submission | Retrieve detailed information for Spark applications initiated via Spark Job Definitions. |
| Lakehouse Operation | Retrieve detailed information for the Spark application triggered by a Lakehouse operation. |
| Spark open-source metrics APIs | Collect Spark metrics through APIs fully aligned with the open-source Spark History Server APIs. |
| Livy Log | Retrieve Spark Livy logs for detailed session-level information. |
| Driver Log | Access driver logs for debugging application-level issues. |
| Executor Log | Retrieve executor logs for troubleshooting distributed execution issues. |
| Resource usage APIs | Monitor resource usage for a Spark application. |
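As an illustration, the sketch below chains two of these calls for a notebook run: it first gets the Livy session details, then pulls the driver log for that application. The session-details route is an assumption modeled on the Notebook - Get Livy Session API name, and the driver-log route is purely hypothetical; confirm both paths, parameters, and response shapes in the individual API reference pages.

```python
# Illustrative sketch only: fetch details for a notebook-run Spark application,
# then download its driver log. Routes are assumptions based on the API names above.
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"
headers = {"Authorization": "Bearer <your-entra-access-token>"}  # placeholder token

workspace_id = "<workspace-id>"
notebook_id = "<notebook-id>"
livy_id = "<livy-session-id>"  # obtained from one of the list-sessions APIs

# 1) Application details for a notebook run (Notebook - Get Livy Session);
#    assumed endpoint pattern derived from the API name.
details = requests.get(
    f"{FABRIC_API}/workspaces/{workspace_id}/notebooks/{notebook_id}/livySessions/{livy_id}",
    headers=headers,
)
details.raise_for_status()
print(details.json())

# 2) Driver log for the application. This route is a hypothetical illustration of
#    the Driver Log API; look up the real path in its reference page.
app_id = "<spark-application-id>"
driver_log = requests.get(
    f"{FABRIC_API}/workspaces/{workspace_id}/notebooks/{notebook_id}/livySessions/{livy_id}"
    f"/applications/{app_id}/driverlog",
    headers=headers,
)
driver_log.raise_for_status()
print(driver_log.text[:2000])  # print only the beginning of the log
```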
Next steps
Use the following resources to quickly access the APIs for listing Livy sessions and retrieving detailed diagnostics for Spark applications:
- Workspace and Item-Level APIs: List all completed and active Livy sessions.
  - Workspace - List Sessions (Spark)
  - Notebook - List Sessions (Notebook)
  - Spark Job Definition - List Sessions (Spark Job Definition)
  - Lakehouse - List Sessions (Lakehouse)
- Single Spark Application APIs
  a. Get application details
     - Notebook - Get Livy Session (Notebook)
     - Spark Job Definition - Get Livy Session (SparkJobDefinition)
     - Lakehouse - Get Livy Session (Lakehouse)
  b. Retrieve logs and metrics