diff --git a/data-explorer/create-table-wizard.md b/data-explorer/create-table-wizard.md
index ad20e83827..f4b3c2238f 100644
--- a/data-explorer/create-table-wizard.md
+++ b/data-explorer/create-table-wizard.md
@@ -1,14 +1,15 @@
---
-title: Create a table in Azure Data Explorer
+title: Create a Table in Azure Data Explorer
description: Learn how to easily create a table and manually define the schema in Azure Data Explorer with the table creation wizard.
ms.reviewer: aksdi
ms.topic: how-to
-ms.date: 11/13/2024
+ms.date: 02/02/2026
# Customer intent: As a data engineer, I want to create an empty table in Azure Data Explorer so that I can ingest data and query it.
---
+
# Create a table in Azure Data Explorer

-Creating a table is an important step in the process of [data ingestion](ingest-data-overview.md) and [query](/azure/data-explorer/kusto/query/tutorials/learn-common-operators) in Azure Data Explorer. The following article shows how to create a table and schema mapping quickly and easily using the Azure Data Explorer web UI.
+Creating a table is an important step in the process of [data ingestion](ingest-data-overview.md) and [query](/azure/data-explorer/kusto/query/tutorials/learn-common-operators) in Azure Data Explorer. This article shows how to create a table and schema mapping quickly and easily by using the Azure Data Explorer web UI.

> [!NOTE]
> To create a new table based on existing data, see [Get data from file](get-data-file.md) or [Get data from Azure storage](get-data-storage.md).
@@ -21,7 +22,7 @@ Creating a table is an important step in the process of [data ingestion](ingest-
* Sign in to the [Azure Data Explorer web UI](https://dataexplorer.azure.com/) and [add a connection to your cluster](web-query-data.md#add-clusters).

> [!NOTE]
-> To enable access between a cluster and a storage account without public access (restricted to private endpoint/service endpoint), see [Create a Managed Private Endpoint](security-network-managed-private-endpoint-create.md).
+> To enable access between a cluster and a storage account without public access (restricted to private endpoint or service endpoint), see [Create a Managed Private Endpoint](security-network-managed-private-endpoint-create.md).

## Create a table

@@ -29,31 +30,33 @@ Creating a table is an important step in the process of [data ingestion](ingest-

1. Select **+ Add** > **Table** or right-click on the database where you want to create the table and select **Create table**.

+:::image type="content" source="media/create-table-wizard/add-table.png" alt-text="Screenshot of the Add Table option in the left navigation pane.":::
+
## Destination tab

The **Create table** window opens with the **Destination** tab selected.

-1. The **Cluster** and **Database** fields are prepopulated. You can select different values from the dropdown menu.
+1. The portal prepopulates the **Cluster** and **Database** fields. You can select different values from the dropdown menu.

1. In **Table name**, enter a name for your table.

    > [!TIP]
-    > Table names can be up to 1024 characters including alphanumeric, hyphens, and underscores. Special characters aren't supported.
+    > Table names can be up to 1,024 characters, including alphanumeric characters, hyphens, and underscores. Special characters aren't supported.

-1. Select **Next: Schema**
+1. Select **Next: Schema**.

## Schema tab

-1. Select **Add new column** and the **Edit columns** panel opens.
+1. Select **Add new column**. The **Edit columns** panel opens.

1. 
For each column, enter **Column name** and **Data type**. Create more columns by selecting **Add column**. :::image type="content" source="media/create-table-wizard/edit-columns.png" alt-text="Screenshot of Edit columns pane, in which you input the column name and data type in Azure Data Explorer."::: -1. Select **Save**. The schema is displayed. +1. Select **Save**. The portal displays the schema. 1. Select **Next: Create table**. - :::image type="content" source="media/create-table-wizard/create-table.png" alt-text="Screenshot of create emptytable wizard with schema input in Azure Data Explorer."::: + :::image type="content" source="media/create-table-wizard/create-table.png" alt-text="Screenshot of create empty table wizard with schema input in Azure Data Explorer."::: -A new table is created in your target destination, with the schema you defined. +The portal creates a new table in your target destination, using the schema you defined. ## Related content diff --git a/data-explorer/dashboard-visuals.md b/data-explorer/dashboard-visuals.md index 24d4a47df2..267a79e79c 100644 --- a/data-explorer/dashboard-visuals.md +++ b/data-explorer/dashboard-visuals.md @@ -1,13 +1,14 @@ --- -title: Dashboard-specific visuals +title: Dashboard-specific Visuals description: Visualizations available in Azure Data Explorer web UI or dashboards ms.reviewer: gabil ms.topic: how-to -ms.date: 02/21/2024 +ms.date: 02/02/2026 --- + # Dashboard-specific visuals -All visualizations that are created in the context of the [render operator](/azure/data-explorer/kusto/query/render-operator) are available in dashboard visualizations. However, the following visualizations are only available in [Azure Data Explorer dashboards](azure-data-explorer-dashboards.md), Real-Time Dashboards, or [Azure Data Explorer web UI](/azure/data-explorer/add-query-visualization), and not with the render operator. +All visualizations that you create in the context of the [render operator](/azure/data-explorer/kusto/query/render-operator) are available in dashboard visualizations. However, the following visualizations are only available in [Azure Data Explorer dashboards](azure-data-explorer-dashboards.md), Real-Time Dashboards, or [Azure Data Explorer web UI](/azure/data-explorer/add-query-visualization), and aren't available by using the render operator. To learn how to customize any dashboard visuals, see [Customize Azure Data Explorer dashboard visuals](dashboard-customize-visuals.md) @@ -15,9 +16,9 @@ For general information on dashboards in Azure Data Explorer, see [Visualize dat ## Funnel chart -A funnel chart visualizes a linear process that has sequential, connected stages. Each funnel stage represents a percentage of the total. So, in most cases, a funnel chart is shaped like a funnel, with the first stage being the largest, and each subsequent stage smaller than its predecessor. +A funnel chart visualizes a linear process that has sequential, connected stages. Each funnel stage represents a percentage of the total. In most cases, a funnel chart is shaped like a funnel, with the first stage being the largest, and each subsequent stage smaller than its predecessor. -The following example visualizes the progression of Server requests, showing the total number of sessions, requests, and their completion status. It highlights the drop-off from sessions to requests and the proportion of completed versus incomplete requests. 
+The following example visualizes the progression of server requests, showing the total number of sessions, requests, and their completion status. It highlights the drop-off from sessions to requests and the proportion of completed versus incomplete requests. ### Example query @@ -56,13 +57,13 @@ funnelData A heatmap shows values for a main variable of interest across two axis variables as a grid of colored squares. -To render a heatmap, the query must generate a table with three columns. The data used for the value field must be numeric. The columns that will be used for x and y values use the following rules: +To render a heatmap, the query must generate a table with three columns. The data used for the value field must be numeric. The columns that you use for x and y values must follow these rules: - If the values in column *x* are in the `string` format, the values in the column *y* must also be in the `string` format. - If the values in column *x* are in the `datetime` format, the values in the column *y* must be numeric. > [!NOTE] -> We recommend specifying each data field, instead of letting the tool infer the data source. +> Specify each data field instead of letting the tool infer the data source. The following example shows the distribution of the five most frequent SQL metrics across different metric types. It highlights which metric types are most common for each SQL metric, making it easy to identify activity patterns in the top metrics. @@ -72,7 +73,6 @@ The following example shows the distribution of the five most frequent SQL metri let topMetrics = TransformedServerMetrics | summarize TotalCount = count() by SQLMetrics | top 5 by TotalCount; // pick only the 5 most common metrics - TransformedServerMetrics | where SQLMetrics in (topMetrics | project SQLMetrics) | summarize Count = count() by SQLMetrics, MetricType diff --git a/data-explorer/get-data-file.md b/data-explorer/get-data-file.md index 244b93389c..f0080f612c 100644 --- a/data-explorer/get-data-file.md +++ b/data-explorer/get-data-file.md @@ -1,5 +1,5 @@ --- -title: Get data from a file +title: Get Data From a File description: Learn how to get data from a local file in Azure Data Explorer. ms.reviewer: sharmaanshul ms.topic: how-to @@ -8,7 +8,7 @@ ms.custom: sfi-image-nochange --- # Get data from file -Data ingestion is the process used to load data from one or more sources into a table in Azure Data Explorer. Once ingested, the data becomes available for query. In this article, you learn you how to get data from a local file into either a new or existing table. +Data ingestion is the process of loading data from one or more sources into a table in Azure Data Explorer. Once ingested, the data is available for query. In this article, you learn how to get data from a local file into either a new or existing table. For general information on data ingestion, see [Azure Data Explorer data ingestion overview](ingest-data-overview.md). @@ -20,37 +20,37 @@ For general information on data ingestion, see [Azure Data Explorer data ingesti ## Get data -1. From the left menu, select **Query**. +1. From the left navigation pane, select **Query**. 1. Right-click on the database where you want to ingest the data. Select **Get data**. :::image type="content" source="media/get-data-file/get-data.png" alt-text="Screenshot of query tab, with right-click on a database and the get options dialog open." 
lightbox="media/get-data-file/get-data.png"::: -## Source +## Select data source In the **Get data** window, the **Source** tab is selected. Select the data source from the available list. In this example, you're ingesting data from a **Local file**. -:::image type="content" source="media/get-data-file/select-data-source.png" alt-text="Screenshot of get data window with source tab selected." lightbox="media/get-data-file/select-data-source.png"::: +:::image type="content" source="media/get-data-file/source.png" alt-text="Screenshot of get data window with source tab selected." lightbox="media/get-data-file/source.png"::: [!INCLUDE [ingestion-size-limit](includes/cross-repo/ingestion-size-limit.md)] -## Configure +## Configure data ingestion -1. Select a target database and table. If you want to ingest data into a new table, select **+ New table** and enter a table name. +1. Select a target database and table. To ingest data into a new table, select **+ New table** and enter a table name. > [!NOTE] - > Table names can be up to 1024 characters including spaces, alphanumeric, hyphens, and underscores. Special characters aren't supported. + > Table names can be up to 1,024 characters, including spaces, alphanumeric characters, hyphens, and underscores. Special characters aren't supported. -1. Either drag files into the window, or select **Browse for files**. +1. Drag files into the window, or select **Browse for files**. > [!NOTE] - > You can add up to 1,000 files. Each file can be a max of 1 GB uncompressed. + > You can add up to 1,000 files. Each file can be a maximum of 1 GB uncompressed. - :::image type="content" source="media/get-data-file/configure-tab.png" alt-text="Screenshot of configure tab with new table entered and one sample data file selected." lightbox="media/get-data-file/configure-tab.png"::: + :::image type="content" source="media/get-data-file/configure.png" alt-text="Screenshot of configure tab with new table entered and one sample data file selected." lightbox="media/get-data-file/configure.png"::: -1. Select **Next** +1. Select **Next**. ## Inspect @@ -58,12 +58,10 @@ The **Inspect** tab opens with a preview of the data. To complete the ingestion process, select **Finish**. -:::image type="content" source="media/get-data-file/inspect-data.png" alt-text="Screenshot of the inspect tab." lightbox="media/get-data-file/inspect-data.png"::: +:::image type="content" source="media/get-data-file/inspect.png" alt-text="Screenshot of the inspect tab." lightbox="media/get-data-file/inspect.png"::: Optionally: -* Select **Command viewer** to view and copy the automatic commands generated from your inputs. -* Use the **Schema definition file** dropdown to change the file that the schema is inferred from. * Change the automatically inferred data format by selecting the desired format from the dropdown. See [Data formats supported by Azure Data Explorer for ingestion](ingestion-supported-formats.md). * [Edit columns](#edit-columns). * Explore [Advanced options based on data type](#advanced-options-based-on-data-type). @@ -78,7 +76,7 @@ Optionally: ## Summary -In the **Data preparation** window, all three steps are marked with green check marks when data ingestion finishes successfully. You can view the commands that were used for each step, or select a card to query, visualize, or drop the ingested data. +In the **Data preparation** window, all three steps show green check marks when data ingestion finishes successfully. 
You can view the commands that each step used, or select a card to query, visualize, or drop the ingested data. :::image type="content" source="media/get-data-file/summary.png" alt-text="Screenshot of summary page with successful ingestion completed." lightbox="media/get-data-file/summary.png"::: diff --git a/data-explorer/integrate-overview.md b/data-explorer/integrate-overview.md index 1dc0c16c10..726aea69e6 100644 --- a/data-explorer/integrate-overview.md +++ b/data-explorer/integrate-overview.md @@ -1,13 +1,13 @@ --- -title: Integrations overview +title: Integrations Overview description: Learn about the available data connectors, tools, and integrations, and their capabilities. ms.reviewer: aksdi -ms.topic: integration -ms.date: 02/04/2026 +ms.topic: conceptual +ms.date: 02/02/2026 --- # Integrations overview -There are many data connectors, tools, and integrations that work seamlessly with the platform for ingestion, orchestration, output, and data query. This document is a high level overview about the available connectors, tools, and integrations. Detailed information is provided for each connector, along with links to its full documentation. +Many data connectors, tools, and integrations work seamlessly with the platform for ingestion, orchestration, output, and data query. This article provides a high-level overview of the available connectors, tools, and integrations. For each connector, you can find detailed information and links to its full documentation. For overview pages on a specific type of integration, select one of the following buttons. @@ -28,7 +28,7 @@ For overview pages on a specific type of integration, select one of the followin ## Comparison tables -The following tables summarize the capabilities of each item. Select the tab corresponding to connectors or tools and integrations. Each item name is linked to its [detailed description](#detailed-descriptions). +The following tables summarize the capabilities of each item. Select the tab corresponding to connectors or tools and integrations. Each item name links to its [detailed description](#detailed-descriptions). 
### [Connectors](#tab/connectors) @@ -46,6 +46,7 @@ The following table summarizes the available connectors and their capabilities: | [Azure Event Hubs](#azure-event-hubs) | :heavy_check_mark: | | | | | [Azure Functions](#azure-functions) | :heavy_check_mark: | | | :heavy_check_mark: | | [Azure IoT Hubs](#azure-iot-hubs) | :heavy_check_mark: | | | | +| [Azure Monitor](#azure-monitor) | :heavy_check_mark: | | | | | [Azure Stream Analytics](#azure-stream-analytics) | :heavy_check_mark: | | | | | [Cribl Stream](#cribl-stream) | :heavy_check_mark: | | | | | [Fluent Bit](#fluent-bit) | :heavy_check_mark: | | | | @@ -62,7 +63,6 @@ The following table summarizes the available connectors and their capabilities: | [Splunk](#splunk) | :heavy_check_mark: | | | | | [Splunk Universal Forwarder](#splunk-universal-forwarder) | :heavy_check_mark: | | | | | [Telegraf](#telegraf) | :heavy_check_mark: | | | | -| [Azure Monitor](#azure-monitor) | :heavy_check_mark: | | | | ### [Tools and integrations](#tab/integrations) @@ -80,7 +80,7 @@ The following table summarizes the available tools and integrations and their ca | [KQL Parser](#kql-parser) | | :heavy_check_mark: | | | | | | | [Kusto.Explorer](#kustoexplorer) | | :heavy_check_mark: | | | | | :heavy_check_mark: | | [Kusto CLI](#kusto-cli) | :heavy_check_mark: | :heavy_check_mark: | | | | :heavy_check_mark: | | -| [Lightingest](#lightingest) | :heavy_check_mark: | | | | | | | +| [LightIngest](#lightingest) | :heavy_check_mark: | | | | | | | | [Microsoft Purview](#microsoft-purview) | | | | | :heavy_check_mark: | | | | [Monaco editor](#monaco-editor-pluginembed) | | :heavy_check_mark: | | | | | | | [PowerShell](#powershell) | | | | | | :heavy_check_mark: | | @@ -92,13 +92,13 @@ The following table summarizes the available tools and integrations and their ca ## Detailed descriptions -The following are detailed descriptions of connectors and tools and integrations. Select the tab corresponding to connectors or tools and integrations. All available items are summarized in the [Comparison tables](#comparison-tables) above. +The following sections provide detailed descriptions of connectors, tools, and integrations. Select the tab corresponding to connectors or tools and integrations. ### [Connectors](#tab/connectors) ### Apache Kafka -[Apache Kafka](https://kafka.apache.org/documentation/) is a distributed streaming platform for building real-time streaming data pipelines that reliably move data between systems or applications. Kafka Connect is a tool for scalable and reliable streaming of data between Apache Kafka and other data systems. The Kafka Sink serves as the connector from Kafka and doesn't require using code. This is gold certified by Confluent - has gone through comprehensive review and testing for quality, feature completeness, compliance with standards, and for performance. +[Apache Kafka](https://kafka.apache.org/documentation/) is a distributed streaming platform for building real-time streaming data pipelines that reliably move data between systems or applications. Kafka Connect is a tool for scalable and reliable streaming of data between Apache Kafka and other data systems. The Kafka Sink serves as the connector from Kafka and doesn't require using code. This connector is gold certified by Confluent, which means it goes through comprehensive review and testing for quality, feature completeness, compliance with standards, and performance. 
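The sink ingests into an existing table, optionally through a named ingestion mapping. As a minimal sketch, with a hypothetical table name, mapping name, and JSON paths, you can create the destination objects with KQL management commands like these before you start the connector:

```kusto
// Hypothetical destination table for events arriving from a Kafka topic.
.create table KafkaEvents (Timestamp: datetime, DeviceId: string, Payload: dynamic)

// Hypothetical JSON ingestion mapping that the sink configuration references by name.
.create table KafkaEvents ingestion json mapping "KafkaEventsMapping"
    '[{"column":"Timestamp","datatype":"datetime","Properties":{"Path":"$.timestamp"}},{"column":"DeviceId","datatype":"string","Properties":{"Path":"$.deviceId"}},{"column":"Payload","datatype":"dynamic","Properties":{"Path":"$"}}]'
```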
* **Functionality:** Ingestion
* **Ingestion type supported:** Batching, Streaming
@@ -110,7 +110,7 @@

### Apache Flink

-[Apache Flink](https://flink.apache.org/) is a framework and distributed processing engine for stateful computations over unbounded and bounded data streams. The connector implements data sink for moving data across Azure Data Explorer and Flink clusters. Using Azure Data Explorer and Apache Flink, you can build fast and scalable applications targeting data driven scenarios. For example, machine learning (ML), Extract-Transform-Load (ETL), and Log Analytics.
+[Apache Flink](https://flink.apache.org/) is a framework and distributed processing engine for stateful computations over unbounded and bounded data streams. The connector implements a data sink for moving data across Azure Data Explorer and Flink clusters. By using Azure Data Explorer and Apache Flink, you can build fast and scalable applications targeting data-driven scenarios, such as machine learning (ML), Extract-Transform-Load (ETL), and Log Analytics.

* **Functionality:** Ingestion
* **Ingestion type supported:** Streaming
@@ -121,7 +121,7 @@

### Apache Log4J 2

-[Log4J](https://logging.apache.org/log4j/2.x/) is a popular logging framework for Java applications maintained by the Apache Foundation. Log4j allows developers to control which log statements are output with arbitrary granularity based on the logger's name, logger level, and message pattern. The Apache Log4J 2 sink allows you to stream your log data to your database, where you can analyze and visualize your logs in real time.
+[Log4J](https://logging.apache.org/log4j/2.x/) is a popular logging framework for Java applications maintained by the Apache Foundation. By using Log4j, developers can control which log statements are output with arbitrary granularity based on the logger's name, logger level, and message pattern. By using the Apache Log4J 2 sink, you can stream your log data to your database, where you can analyze and visualize your logs in real time.

* **Functionality:** Ingestion
* **Ingestion type supported:** Batching, Streaming
@@ -133,7 +133,7 @@

### Apache Spark

-[Apache Spark](https://spark.apache.org/) is a unified analytics engine for large-scale data processing. The [Spark connector](spark-connector.md) is an open source project that can run on any Spark cluster. It implements data source and data sink for moving data to or from Spark clusters. Using the Apache Spark connector, you can build fast and scalable applications targeting data driven scenarios. For example, machine learning (ML), Extract-Transform-Load (ETL), and Log Analytics. With the connector, your database becomes a valid data store for standard Spark source and sink operations, such as read, write, and writeStream.
+[Apache Spark](https://spark.apache.org/) is a unified analytics engine for large-scale data processing. The [Spark connector](spark-connector.md) is an open source project that can run on any Spark cluster. It implements a data source and data sink for moving data to or from Spark clusters. By using the Apache Spark connector, you can build fast and scalable applications targeting data-driven scenarios, such as machine learning (ML), Extract-Transform-Load (ETL), and Log Analytics. 
By using the connector, your database becomes a valid data store for standard Spark source and sink operations, such as read, write, and writeStream.

* **Functionality:** Ingestion, Export
* **Ingestion type supported:** Batching, Streaming
@@ -143,9 +143,19 @@ The following are detailed descriptions of connectors and tools and integrations
* **Documentation:** [Apache Spark connector](spark-connector.md)
* **Community Blog:** [Data preprocessing for Azure Data Explorer for Azure Data Explorer with Apache Spark](https://techcommunity.microsoft.com/t5/azure-data-explorer-blog/data-pre-processing-for-azure-data-explorer-with-apache-spark/ba-p/2727993/)

+### Apache Spark for Azure Synapse Analytics
+
+[Apache Spark](https://spark.apache.org/) is a parallel processing framework that supports in-memory processing to boost the performance of big data analytic applications. [Apache Spark in Azure Synapse Analytics](/azure/synapse-analytics/spark/apache-spark-overview) is one of Microsoft's implementations of Apache Spark in the cloud. You can access a database from [Synapse Studio](/azure/synapse-analytics/) by using Apache Spark for Azure Synapse Analytics.
+
+* **Functionality:** Ingestion, Export
+* **Ingestion type supported:** Batching
+* **Use cases:** Telemetry
+* **Underlying SDK:** [Java](/kusto/api/java/kusto-java-client-library?view=azure-data-explorer&preserve-view=true)
+* **Documentation:** [Connect to an Azure Synapse workspace](/azure/synapse-analytics/quickstart-connect-azure-data-explorer)
+
### Azure Cosmos DB

The [Azure Cosmos DB](/azure/cosmos-db/) change feed data connection is an ingestion pipeline that listens to your Cosmos DB change feed and ingests the data into your database.

* **Functionality:** Ingestion
* **Ingestion type supported:** Batching, Streaming
@@ -154,7 +164,7 @@ The [Azure Cosmos DB](/azure/cosmos-db/) change feed data connection is an inges

### Azure Data Factory

-[Azure Data Factory](/azure/data-factory) (ADF) is a cloud-based data integration service that allows you to integrate different data stores and perform activities on the data.
+[Azure Data Factory](/azure/data-factory) (ADF) is a cloud-based data integration service that you can use to integrate different data stores and perform activities on the data.

* **Functionality:** Ingestion, Export
* **Ingestion type supported:** Batching
@@ -163,7 +173,7 @@ The [Azure Cosmos DB](/azure/cosmos-db/) change feed data connection is an inges

### Azure Event Grid

-Event Grid ingestion is a pipeline that listens to Azure storage, and updates your database to pull information when subscribed events occur. You can configure continuous ingestion from Azure Storage (Blob storage and ADLSv2) with an [Azure Event Grid](/azure/event-grid/overview) subscription for blob created or blob renamed notifications and streaming the notifications via Azure Event Hubs.
+Event Grid ingestion is a pipeline that listens to Azure storage and updates your database to pull information when subscribed events occur. You can configure continuous ingestion from Azure Storage (Blob storage and ADLSv2) by using an [Azure Event Grid](/azure/event-grid/overview) subscription for blob created or blob renamed notifications and streaming the notifications via Azure Event Hubs.
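While you validate a new Event Grid connection, it can help to check the cluster for recent ingestion errors. The following sketch uses the `.show ingestion failures` management command, assuming you have at least database monitor permissions:

```kusto
// List ingestion failures from the last hour and the source blobs that caused them.
.show ingestion failures
| where FailedOn > ago(1h)
| project FailedOn, Database, Table, IngestionSourcePath, Details
```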
* **Functionality:** Ingestion * **Ingestion type supported:** Batching, Streaming @@ -180,7 +190,7 @@ Event Grid ingestion is a pipeline that listens to Azure storage, and updates yo ### Azure Functions -[Azure Functions](/azure/azure-functions/functions-overview) allow you to run serverless code in the cloud on a schedule or in response to an event. With input and output bindings for Azure Functions, you can integrate your database into your workflows to ingest data and run queries against your database. +[Azure Functions](/azure/azure-functions/functions-overview) allows you to run serverless code in the cloud on a schedule or in response to an event. By using input and output bindings for Azure Functions, you can integrate your database into your workflows to ingest data and run queries against your database. * **Functionality:** Ingestion, Export * **Ingestion type supported:** Batching @@ -197,6 +207,19 @@ Event Grid ingestion is a pipeline that listens to Azure storage, and updates yo * **Use cases:** IoT data * **Documentation:** [IoT Hub data connection](ingest-data-iot-hub-overview.md) +### Azure Monitor + +Azure Monitor Agent can send the following virtual machine data to Azure Data Explorer by using a data collection rule: + +- Performance counters +- IIS logs +- Windows event logs +- Linux system logs +- Custom text logs +- Custom JSON logs + +For more information, see [Collect data from virtual machines to Azure Data Explorer](/azure/azure-monitor/vm/send-fabric-destination). + ### Azure Stream Analytics [Azure Stream Analytics](/azure/stream-analytics/stream-analytics-introduction) is a real-time analytics and complex event-processing engine that's designed to process high volumes of fast streaming data from multiple sources simultaneously. @@ -236,7 +259,7 @@ Java Database Connectivity (JDBC) is a Java API used to connect to databases and ### Logic Apps -The [Microsoft Logic Apps](/azure/logic-apps/logic-apps-what-are-logic-apps) connector allows you to run queries and commands automatically as part of a scheduled or triggered task. +By using the [Microsoft Logic Apps](/azure/logic-apps/logic-apps-what-are-logic-apps) connector, you can run queries and commands automatically as part of a scheduled or triggered task. * **Functionality:** Ingestion, Export * **Ingestion type supported:** Batching @@ -264,7 +287,7 @@ MATLAB is a programming and numeric computing platform used to analyze data, dev ### NLog -NLog is a flexible and free logging platform for various .NET platforms, including .NET standard. NLog allows you to write to several targets, such as a database, file, or console. With NLog you can change the logging configuration on-the-fly. The NLog sink is a target for NLog that allows you to send your log messages to your database. The plugin provides an efficient way to sink your logs to your cluster. +NLog is a flexible and free logging platform for various .NET platforms, including .NET standard. NLog allows you to write to several targets, such as a database, file, or console. By using NLog, you can change the logging configuration on-the-fly. The NLog sink is a target for NLog that allows you to send your log messages to your database. The plugin provides an efficient way to sink your logs to your cluster. * **Functionality:** Ingestion * **Ingestion type supported:** Batching, Streaming @@ -314,7 +337,7 @@ Open Database Connectivity ([ODBC](/sql/odbc/reference/odbc-overview)) is a wide ### Serilog -Serilog is a popular logging framework for .NET applications. 
Serilog allows developers to control which log statements are output with arbitrary granularity based on the logger's name, logger level, and message pattern. The Serilog sink, also known as an appender, streams your log data to your database, where you can analyze and visualize your logs in real time. +Serilog is a popular logging framework for .NET applications. By using Serilog, developers can control which log statements are output with arbitrary granularity based on the logger's name, logger level, and message pattern. The Serilog sink, also known as an appender, streams your log data to your database, where you can analyze and visualize your logs in real time. * **Functionality:** Ingestion * **Ingestion type supported:** Batching, Streaming @@ -348,7 +371,7 @@ Serilog is a popular logging framework for .NET applications. Serilog allows dev ### Telegraf -Telegraf is an open source, lightweight, minimal memory foot print agent for collecting, processing and writing telemetry data including logs, metrics, and IoT data. Telegraf supports hundreds of input and output plugins. It's widely used and well supported by the open source community. The output plugin serves as the connector from Telegraf and supports ingestion of data from many types of input plugins into your database. +Telegraf is an open source, lightweight agent with a minimal memory footprint for collecting, processing, and writing telemetry data, including logs, metrics, and IoT data. Telegraf supports hundreds of input and output plugins. It's widely used and well supported by the open source community. The output plugin serves as the connector from Telegraf and supports ingestion of data from many types of input plugins into your database. * **Functionality:** Ingestion * **Ingestion type supported:** Batching, Streaming @@ -358,18 +381,6 @@ Telegraf is an open source, lightweight, minimal memory foot print agent for col * **Documentation:** [Ingest data from Telegraf](ingest-data-telegraf.md) * **Community Blog:** [New Azure Data Explorer output plugin for Telegraf enables SQL monitoring at huge scale](https://techcommunity.microsoft.com/t5/azure-data-explorer-blog/new-azure-data-explorer-output-plugin-for-telegraf-enables-sql/ba-p/2829444) -### Azure Monitor - -Azure Monitor Agent can send the following virtual machine data to Azure Data Explorer using a data collection rule: - -- Performance counters -- IIS logs -- Windows event logs -- Linux system logs -- Custom text logs -- Custom JSON logs - -For more information, see [Collect data from virtual machines to Azure Data Explorer](/azure/azure-monitor/vm/send-fabric-destination). ### [Tools and integrations](#tab/integrations) @@ -380,6 +391,13 @@ Azure CLI lets you manage Kusto resources. * **Functionality:** Administration * **Documentation:** [az kusto](/cli/azure/kusto?view=azure-cli-latest&preserve-view=true) +### Azure Synapse Analytics + +Azure Synapse Data Explorer provides an interactive query experience to unlock insights from log and telemetry data. To complement existing SQL and Apache Spark analytics runtime engines, the Data Explorer analytics runtime is optimized for efficient log analytics by using powerful indexing technology to automatically index free-text and semi-structured data commonly found in telemetry data. 
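For example, the kind of exploratory query that benefits from this indexing combines a free-text term search with a drill-down into semi-structured data. The table and column names in the following sketch are hypothetical:

```kusto
// `has` performs an indexed term search; the JSON drill-down targets semi-structured data.
RawServiceLogs
| where Message has "timeout"
| where parse_json(Properties).statusCode == 500
| take 100
```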
+ +* **Functionality:** Ingestion, Query, Visualization +* **Documentation:** [What is Azure Synapse Data Explorer?](/azure/synapse-analytics/data-explorer/data-explorer-overview) + ### Azure Data Lake Azure Data Explorer integrates with Azure Blob Storage and Azure Data Lake Storage (Gen1 and Gen2), providing fast, cached, and indexed access to data stored in external storage. @@ -389,18 +407,25 @@ Azure Data Explorer integrates with Azure Blob Storage and Azure Data Lake Stora ### Azure Data Share -There are many traditional ways to share data, such as through file shares, FTP, e-mail, and APIs. These methods require both parties to build and maintain a data pipeline that moves data between teams and organizations. With Azure Data Explorer, you can easily and securely share your data with people in your company or external partners. Sharing occurs in near-real-time, with no need to build or maintain a data pipeline. All database changes, including schema and data, on the provider side are instantly available on the consumer side. +Many traditional ways exist to share data, such as through file shares, FTP, e-mail, and APIs. These methods require both parties to build and maintain a data pipeline that moves data between teams and organizations. By using Azure Data Explorer, you can easily and securely share your data with people in your company or external partners. Sharing occurs in near-real-time, with no need to build or maintain a data pipeline. All database changes, including schema and data, on the provider side are instantly available on the consumer side. * **Functionality:** Share data * **Documentation:** [Azure Data Share](data-share.md) ### Azure Monitor -The Azure Data Explorer supports cross-service queries between Azure Data Explorer, [Application Insights (AI)](/azure/azure-monitor/app/app-insights-overview), and [Log Analytics (LA)](/azure/azure-monitor/platform/data-platform-logs). You can query your Log Analytics or Application Insights workspace using Azure Data Explorer query tools and in a cross-service query. +Azure Data Explorer supports cross-service queries between Azure Data Explorer, [Application Insights (AI)](/azure/azure-monitor/app/app-insights-overview), and [Log Analytics (LA)](/azure/azure-monitor/platform/data-platform-logs). You can query your Log Analytics or Application Insights workspace by using Azure Data Explorer query tools and in a cross-service query. * **Functionality:** Query * **Documentation:** [Azure Monitor](query-monitor-data.md) +### Azure Notebooks + +Kqlmagic is a command that extends the capabilities of the Python kernel in Azure Data Studio notebooks. You can combine Python and Kusto query language (KQL) to query and visualize data by using the rich Plotly library integrated with render commands. Kqlmagic brings you the benefit of notebooks, data analysis, and rich Python capabilities all in the same location. Supported data sources with Kqlmagic include Azure Data Explorer, Application Insights, and Azure Monitor logs. 
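For instance, a query like the following sketch, which assumes the `StormEvents` table from the publicly available samples database, returns a chart that Kqlmagic renders inline in the notebook output:

```kusto
// Summarize sample data and render the result as a chart in the notebook.
StormEvents
| summarize EventCount = count() by State
| top 10 by EventCount
| render columnchart
```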
+ +* **Functionality:** Query, Visualization +* **Documentation:** [Azure Notebooks](/sql/azure-data-studio/notebooks/notebooks-kqlmagic?context=%252fazure%252fdata-explorer%252fcontext%252fcontext%253fcontext%253d%252fazure%252fdata-explorer%252fcontext%252fcontext) + ### Azure Pipelines [Azure DevOps Services](https://azure.microsoft.com/services/devops/) provides development collaboration tools such as high-performance pipelines, free private Git repositories, configurable Kanban boards, and extensive automated and continuous testing capabilities. [Azure Pipelines](https://azure.microsoft.com/services/devops/pipelines/) is an Azure DevOps capability that enables you to manage CI/CD to deploy your code with high-performance pipelines that work with any language, platform, and cloud. @@ -411,7 +436,7 @@ The Azure Data Explorer supports cross-service queries between Azure Data Explor ### DeltaKusto -Delta Kusto is a Command-line interface (CLI) enabling Continuous Integration / Continuous Deployment (CI / CD) automation with Kusto objects (for example: tables, functions, policies, security roles) in Azure Data Explorer databases. It can work on a single database, multiple databases, or an entire cluster. Delta Kusto also supports multitenant scenarios. +Delta Kusto is a command-line interface (CLI) that enables continuous integration and continuous deployment (CI/CD) automation with Kusto objects, such as tables, functions, policies, and security roles, in Azure Data Explorer databases. It can work on a single database, multiple databases, or an entire cluster. Delta Kusto also supports multitenant scenarios. * **Functionality:** Source control * **Documentation:** [Delta Kusto](https://github.com/microsoft/delta-kusto) @@ -420,7 +445,7 @@ Delta Kusto is a Command-line interface (CLI) enabling Continuous Integration / [Jupyter Notebook](https://jupyter.org/) is an open-source web application that allows you to create and share documents containing live code, equations, visualizations, and narrative text. It's useful for a wide range of tasks, such as data cleaning and transformation, numerical simulation, statistical modeling, data visualization, and machine learning. -[Kqlmagic](https://github.com/microsoft/jupyter-Kqlmagic) extends the capabilities of the Python kernel in Jupyter Notebook so you can run [Kusto Query Language (KQL)](/kusto/query/index?view=azure-data-explorer&preserve-view=true) queries natively. You can combine Python and KQL to query and visualize data using the rich Plot.ly library integrated with the [render](/kusto/query/render-operator?view=azure-data-explorer&preserve-view=true) operator. The kqlmagic extension is compatible with Jupyter Lab, and Visual Studio Code Jupyter extension, and supported data sources include Azure Data Explorer, Azure Monitor logs, and Application Insights. +[Kqlmagic](https://github.com/microsoft/jupyter-Kqlmagic) extends the capabilities of the Python kernel in Jupyter Notebook so you can run [Kusto Query Language (KQL)](/kusto/query/index?view=azure-data-explorer&preserve-view=true) queries natively. You can combine Python and KQL to query and visualize data using the rich Plot.ly library integrated with the [render](/kusto/query/render-operator?view=azure-data-explorer&preserve-view=true) operator. The kqlmagic extension is compatible with Jupyter Lab, Visual Studio Code Jupyter extension, and Azure Data Studio. Supported data sources include Azure Data Explorer, Azure Monitor logs, and Application Insights. 
* **Functionality:** Query, Visualization
* **Documentation:** [Notebooks with Kqlmagic](kqlmagic.md)
@@ -435,11 +460,11 @@ The open source repo contains C# parser and a semantic analyzer as well as a tra

### Kusto.Explorer

-Kusto.Explorer is free software for download and use on your Windows desktop. Kusto.Explorer allows you to query and analyze your data with Kusto Query Language (KQL) in a user-friendly interface.
+Kusto.Explorer is free software that you can download and use on your Windows desktop. Kusto.Explorer enables you to query and analyze your data by using Kusto Query Language (KQL) in a user-friendly interface.

* **Functionality:** Query, Visualization
* **Documentation:** [Installation and user interface](/kusto/tools/kusto-explorer?view=azure-data-explorer&preserve-view=true), [using Kusto.Explorer](/kusto/tools/kusto-explorer-using?view=azure-data-explorer&preserve-view=true)
-  * Additional articles include [options](/kusto/tools/kusto-explorer-options?view=azure-data-explorer&preserve-view=true), [troubleshooting](/kusto/tools/kusto-explorer-troubleshooting?view=azure-data-explorer&preserve-view=true), [keyboard shortcuts](/kusto/tools/kusto-explorer-shortcuts?view=azure-data-explorer&preserve-view=true), [code features](/kusto/tools/kusto-explorer-code-features?view=azure-data-explorer&preserve-view=true)
+  * Other articles include [options](/kusto/tools/kusto-explorer-options?view=azure-data-explorer&preserve-view=true), [troubleshooting](/kusto/tools/kusto-explorer-troubleshooting?view=azure-data-explorer&preserve-view=true), [keyboard shortcuts](/kusto/tools/kusto-explorer-shortcuts?view=azure-data-explorer&preserve-view=true), [code features](/kusto/tools/kusto-explorer-code-features?view=azure-data-explorer&preserve-view=true)

### Kusto CLI

on a Kusto cluster.

* **Functionality:** Query
* **Documentation:** [Kusto CLI](/kusto/tools/kusto-cli?view=azure-data-explorer&preserve-view=true)

-### Lightingest
+### LightIngest

-Lightingest is a command-line utility for ad-hoc data ingestion into Azure Data Explorer.
+LightIngest is a command-line utility for ad-hoc data ingestion into Azure Data Explorer.

* **Functionality:** Ingestion
* **Ingestion type supported:** Batching
@@ -468,7 +493,7 @@ Microsoft Purview simplifies data governance by offering a unified service to ma

### Monaco editor (plugin/embed)

-You can integrate the [Monaco Editor](https://microsoft.github.io/monaco-editor) with Kusto Query Language support (*monaco-kusto*) into your app. Integrating *monaco-kusto* into your app offers you an editing experience such as completion, colorization, refactoring, renaming, and go-to-definition
+You can integrate the [Monaco Editor](https://microsoft.github.io/monaco-editor) with Kusto Query Language support (*monaco-kusto*) into your app. Integrating *monaco-kusto* into your app offers you an editing experience such as completion, colorization, refactoring, renaming, and go-to-definition.

* **Functionality:** Query
* **Repository:** [Monaco Editor](https://microsoft.github.io/monaco-editor)
@@ -494,7 +519,7 @@ Real-Time Intelligence is a fully managed big data analytics platform optimized

### SyncKusto
Sync Kusto is a tool that enables users to synchronize various Kusto schema entities, such as table schemas and stored functions. This synchronization is done between the local file
-system, an Azure Data Explorer database, and Azure repos.
+system, an Azure Data Explorer database, and Azure Repos. 
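The schema entities being synchronized are the same ones you can script out yourself. As a sketch, with a hypothetical database name, the following management command emits the full database schema as a single KQL script:

```kusto
// Emit tables, mappings, functions, and policies as one reproducible KQL script.
.show database MyDatabase schema as csl script
```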
* **Functionality:** Source control
* **Repository:** [SyncKusto](https://github.com/microsoft/synckusto)
diff --git a/data-explorer/media/adx-dashboards/heatmap.png b/data-explorer/media/adx-dashboards/heatmap.png
index 81987db141..e8183f942b 100644
Binary files a/data-explorer/media/adx-dashboards/heatmap.png and b/data-explorer/media/adx-dashboards/heatmap.png differ
diff --git a/data-explorer/media/create-table-wizard/add-table.png b/data-explorer/media/create-table-wizard/add-table.png
new file mode 100644
index 0000000000..c7179a29aa
Binary files /dev/null and b/data-explorer/media/create-table-wizard/add-table.png differ
diff --git a/data-explorer/media/get-data-file/configure.png b/data-explorer/media/get-data-file/configure.png
new file mode 100644
index 0000000000..0f8c4ee734
Binary files /dev/null and b/data-explorer/media/get-data-file/configure.png differ
diff --git a/data-explorer/media/get-data-file/edit-columns.png b/data-explorer/media/get-data-file/edit-columns.png
index e91a58fef6..cc52f16d26 100644
Binary files a/data-explorer/media/get-data-file/edit-columns.png and b/data-explorer/media/get-data-file/edit-columns.png differ
diff --git a/data-explorer/media/get-data-file/get-data.png b/data-explorer/media/get-data-file/get-data.png
index c3f5119988..d69efd2e06 100644
Binary files a/data-explorer/media/get-data-file/get-data.png and b/data-explorer/media/get-data-file/get-data.png differ
diff --git a/data-explorer/media/get-data-file/inspect.png b/data-explorer/media/get-data-file/inspect.png
new file mode 100644
index 0000000000..3957c4f8cd
Binary files /dev/null and b/data-explorer/media/get-data-file/inspect.png differ
diff --git a/data-explorer/media/get-data-file/source.png b/data-explorer/media/get-data-file/source.png
new file mode 100644
index 0000000000..675ca479e9
Binary files /dev/null and b/data-explorer/media/get-data-file/source.png differ
diff --git a/data-explorer/media/get-data-file/summary.png b/data-explorer/media/get-data-file/summary.png
index ea2920f46e..af539a28fb 100644
Binary files a/data-explorer/media/get-data-file/summary.png and b/data-explorer/media/get-data-file/summary.png differ
diff --git a/data-explorer/media/using-metrics/add-diagnostic-settings.png b/data-explorer/media/using-metrics/add-diagnostic-settings.png
new file mode 100644
index 0000000000..f476b2117c
Binary files /dev/null and b/data-explorer/media/using-metrics/add-diagnostic-settings.png differ
diff --git a/data-explorer/media/using-metrics/diagnostic-settings.png b/data-explorer/media/using-metrics/diagnostic-settings.png
new file mode 100644
index 0000000000..77d4b1c93c
Binary files /dev/null and b/data-explorer/media/using-metrics/diagnostic-settings.png differ
diff --git a/data-explorer/media/using-metrics/metrics-pane.png b/data-explorer/media/using-metrics/metrics-pane.png
index de80a6701f..35ad839d95 100644
Binary files a/data-explorer/media/using-metrics/metrics-pane.png and b/data-explorer/media/using-metrics/metrics-pane.png differ
diff --git a/data-explorer/monitor-data-explorer.md b/data-explorer/monitor-data-explorer.md
index 6954ebbfa7..76b5213f6c 100644
--- a/data-explorer/monitor-data-explorer.md
+++ b/data-explorer/monitor-data-explorer.md
@@ -1,7 +1,7 @@
---
title: Monitor Azure Data Explorer
description: Learn how to monitor Azure Data Explorer using Azure Monitor, including data collection, analysis, and alerting.
-ms.date: 12/09/2024 +ms.date: 02/01/2026 ms.custom: horz-monitor ms.topic: how-to author: spelluru @@ -19,19 +19,19 @@ This table describes how you can collect data to monitor your service, and what |Data to collect|Description|How to collect and route the data|Where to view the data|Supported data| |---------|---------|---------|---------|---------| -|Metric data|Metrics are numerical values that describe an aspect of a system at a particular point in time. Metrics can be aggregated using algorithms, compared to other metrics, and analyzed for trends over time.|- Collected automatically at regular intervals.
- You can route some platform metrics to a Log Analytics workspace to query with other data. Check the **DS export** setting for each metric to see if you can use a diagnostic setting to route the metric data.|[Metrics explorer](/azure/azure-monitor/essentials/metrics-getting-started)| [Azure Data Explorer metrics supported by Azure Monitor](monitor-data-explorer-reference.md#metrics)| +|Metric data|Metrics are numerical values that describe an aspect of a system at a particular point in time. Aggregate metrics by using algorithms, compare metrics to other metrics, and analyze metrics for trends over time.|- Collected automatically at regular intervals.
- You can route some platform metrics to a Log Analytics workspace to query with other data. Check the **DS export** setting for each metric to see if you can use a diagnostic setting to route the metric data.|[Metrics explorer](/azure/azure-monitor/essentials/metrics-getting-started)| [Azure Data Explorer metrics supported by Azure Monitor](monitor-data-explorer-reference.md#metrics)| |Resource log data|Logs are recorded system events with a timestamp. Logs can contain different types of data, and be structured or free-form text. You can route resource log data to Log Analytics workspaces for querying and analysis.|[Create a diagnostic setting](/azure/azure-monitor/essentials/create-diagnostic-settings) to collect and route resource log data.| [Log Analytics](/azure/azure-monitor/learn/quick-create-workspace)|[Azure Data Explorer resource log data supported by Azure Monitor](monitor-data-explorer-reference.md#resource-logs) | |Activity log data|The Azure Monitor activity log provides insight into subscription-level events. The activity log includes information like when a resource is modified or a virtual machine is started.|- Collected automatically.
- [Create a diagnostic setting](/azure/azure-monitor/essentials/create-diagnostic-settings) to a Log Analytics workspace at no charge.|[Activity log](/azure/azure-monitor/essentials/activity-log)| | [!INCLUDE [azmon-horz-supported-data](~/../reusable-content/ce-skilling/azure/includes/azure-monitor/horizontals/azmon-horz-supported-data.md)] -## Built in monitoring for Azure Data Explorer +## Built-in monitoring for Azure Data Explorer Azure Data Explorer offers metrics and logs to monitor the service. -### Monitor Azure Data Explorer performance, health, and usage with metrics +### Monitor Azure Data Explorer performance, health, and usage by using metrics -Azure Data Explorer metrics provide key indicators as to the health and performance of the Azure Data Explorer cluster resources. Use the metrics to monitor Azure Data Explorer cluster usage, health, and performance in your specific scenario as standalone metrics. You can also use metrics as the basis for operational [Azure Dashboards](/azure/azure-portal/azure-portal-dashboards) and [Azure Alerts](/azure/azure-monitor/alerts/alerts-types#metric-alerts). +Azure Data Explorer metrics provide key indicators about the health and performance of the Azure Data Explorer cluster resources. Use the metrics to monitor Azure Data Explorer cluster usage, health, and performance in your specific scenario as standalone metrics. You can also use metrics as the basis for operational [Azure Dashboards](/azure/azure-portal/azure-portal-dashboards) and [Azure Alerts](/azure/azure-monitor/alerts/alerts-types#metric-alerts). To use metrics to monitor your Azure Data Explorer resources in the Azure portal: @@ -44,7 +44,7 @@ In the metrics pane, select specific metrics to track, choose how to aggregate y The **Resource** and **Metric Namespace** pickers are preselected for your Azure Data Explorer cluster. The numbers in the following image correspond to the numbered list. They guide you through different options in setting up and viewing your metrics. - :::image type="content" source="media/using-metrics/metrics-pane.png" alt-text="Screenshot shows different options for viewing metrics."::: + :::image type="content" source="media/using-metrics/metrics-pane.png" alt-text="Screenshot shows different options for viewing metrics." lightbox="media/using-metrics/metrics-pane.png"::: 1. To create a metric chart, select **Metric** name and relevant **Aggregation** per metric. For more information about different metrics, see [supported Azure Data Explorer metrics](monitor-data-explorer-reference.md#metrics). 1. Select **Add metric** to see multiple metrics plotted in the same chart. @@ -52,14 +52,14 @@ The **Resource** and **Metric Namespace** pickers are preselected for your Azure 1. Use the time picker to change the time range (default: past 24 hours). 1. Use [**Add filter** and **Apply splitting**](/azure/azure-monitor/platform/metrics-getting-started#apply-dimension-filters-and-splitting) for metrics that have dimensions. 1. Select **Pin to dashboard** to add your chart configuration to the dashboards so that you can view it again. -1. Set **New alert rule** to visualize your metrics using the set criteria. The new alerting rule includes your target resource, metric, splitting, and filter dimensions from your chart. Modify these settings in the [alert rule creation pane](/azure/azure-monitor/platform/metrics-charts#create-alert-rules). +1. Set **New alert rule** to visualize your metrics by using the set criteria. 
The new alerting rule includes your target resource, metric, splitting, and filter dimensions from your chart. Modify these settings in the [alert rule creation pane](/azure/azure-monitor/platform/metrics-charts#create-alert-rules). ### Monitor Azure Data Explorer ingestion, commands, queries, and tables using diagnostic logs Azure Data Explorer is a fast, fully managed data analytics service for real-time analysis on large volumes of data streaming from applications, websites, IoT devices, and more. [Azure Monitor diagnostic logs](/azure/azure-monitor/platform/diagnostic-logs-overview) provide data about the operation of Azure resources. Azure Data Explorer uses diagnostic logs for insights on ingestion, commands, query, and tables. You can export operation logs to Azure Storage, event hub, or Log Analytics to monitor ingestion, commands, and query status. Logs from Azure Storage and Azure Event Hubs can be routed to a table in your Azure Data Explorer cluster for further analysis. > [!IMPORTANT] -> Diagnostic log data may contain sensitive data. Restrict permissions of the logs destination according to your monitoring needs. +> Diagnostic log data might contain sensitive data. Restrict permissions of the logs destination according to your monitoring needs. [!INCLUDE [azure-monitor-vs-log-analytics](includes/azure-monitor-vs-log-analytics.md)] @@ -82,48 +82,51 @@ Diagnostic logs can be used to configure the collection of the following log dat ### [Commands and Queries](#tab/commands-and-queries) -- **Commands**: These logs have information about admin commands that have reached a final state. -- **Queries**: These logs have detailed information about queries that have reached a final state. +- **Commands**: These logs contain information about admin commands that reach a final state. +- **Queries**: These logs contain detailed information about queries that reach a final state. > [!NOTE] > The command and query log data contains the query text. ### [Tables](#tab/tables) -- **TableUsageStatistics**: These logs have detailed information about the tables whose extents were scanned during query execution. This log doesn't record statistics for queries that are part of commands, such as the [.set-or-append](/kusto/management/data-ingestion/ingest-from-query?view=azure-data-explorer&preserve-view=true) command. +- **TableUsageStatistics**: These logs contain detailed information about the tables whose extents are scanned during query execution. This log doesn't record statistics for queries that are part of commands, such as the [.set-or-append](/kusto/management/data-ingestion/ingest-from-query?view=azure-data-explorer&preserve-view=true) command. > [!NOTE] > The `TableUsageStatistics` log data doesn't contain the command or query text. -- **TableDetails**: These logs have detailed information about the cluster's tables. +- **TableDetails**: These logs contain detailed information about the cluster's tables. ### [Journal](#tab/journal) -- **Journal**: These logs have detailed information about metadata operations. +- **Journal**: These logs contain detailed information about metadata operations. --- -You can choose to send the log data to a Log Analytics workspace, a storage account, or stream it to an event hub. +You can send the log data to a Log Analytics workspace, a storage account, or stream it to an event hub. Diagnostic logs are disabled by default. Use the following steps to enable diagnostic logs for your cluster: 1. 
In the [Azure portal](https://portal.azure.com), select the cluster resource that you want to monitor. 1. Under **Monitoring**, select **Diagnostic settings**. - :::image type="content" source="media/using-diagnostic-logs/add-diagnostic-logs.png" alt-text="Screenshot shows the Diagnostic settings page where you can add a setting."::: + :::image type="content" source="media/using-metrics/diagnostic-settings.png" alt-text="Screenshot shows the Diagnostic settings tile." lightbox="media/using-metrics/diagnostic-settings.png"::: 1. Select **Add diagnostic setting**. + + :::image type="content" source="media/using-metrics/add-diagnostic-settings.png" alt-text="Screenshot shows the Diagnostic settings page where you can add a setting." lightbox="media/using-metrics/add-diagnostic-settings.png"::: + 1. In the **Diagnostic settings** window: - :::image type="content" source="media/using-diagnostic-logs/configure-diagnostics-settings.png" alt-text="Screenshot of the Diagnostic settings screen, on which you configure which monitoring data to collect for your Azure Data Explorer cluster."::: + :::image type="content" source="media/using-diagnostic-logs/configure-diagnostics-settings.png" alt-text="Screenshot of the Diagnostic settings screen, on which you configure which monitoring data to collect for your Azure Data Explorer cluster." lightbox="media/using-diagnostic-logs/configure-diagnostics-settings.png"::: 1. Enter a **Diagnostic setting name**. 1. Select one or more destination targets: a Log Analytics workspace, a storage account, or an event hub. - 1. Select logs to be collected: **Succeeded ingestion**, **Failed ingestion**, **Ingestion batching**, **Command**, **Query**, **Table usage statistics**, **Table details**, or **Journal**. - 1. Select [metrics](using-metrics.md#supported-azure-data-explorer-metrics) to be collected (optional). + 1. Select logs to collect: **Succeeded ingestion**, **Failed ingestion**, **Ingestion batching**, **Command**, **Query**, **Table usage statistics**, **Table details**, or **Journal**. + 1. Select [metrics](using-metrics.md#supported-azure-data-explorer-metrics) to collect (optional). 1. Select **Save** to save the new diagnostic logs settings and metrics. -Once the settings are ready, logs start to appear in the configured destination targets: a storage account, an event hub, or Log Analytics workspace. +After you create the settings, logs start to appear in the configured destination targets: a storage account, an event hub, or Log Analytics workspace. > [!NOTE] > If you send logs to a Log Analytics workspace, the `SucceededIngestion`, `FailedIngestion`, `IngestionBatching`, `Command`, `Query`, `TableUsageStatistics`, `TableDetails`, and `Journal` logs are stored in Log Analytics tables named: `SucceededIngestion`, `FailedIngestion`, `ADXIngestionBatching`, `ADXCommand`, `ADXQuery`, `ADXTableUsageStatistics`, `ADXTableDetails`, and `ADXJournal` respectively.
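For example, once the logs flow into a Log Analytics workspace, a query such as the following sketch charts command and query volume over time. It uses only the `TimeGenerated` and `Type` columns, which are common to all Log Analytics tables:

```kusto
// Chart hourly command and query counts from the routed diagnostic logs.
union ADXCommand, ADXQuery
| where TimeGenerated > ago(1d)
| summarize Count = count() by Type, bin(TimeGenerated, 1h)
| render timechart
```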