Managing & Collecting Data (part 2)

 




To read part 1, please click here



Configuring Log Analytics Storage Options

The amount of data you ingest and retain in Log Analytics directly drives cost, so once you have determined how much data you expect to ingest and store daily, you can tune these settings to control spend. To view current usage and costs, open the Log Analytics workspace and select Usage and estimated costs from the General menu.

This opens a dashboard showing the pricing tier and current costs on the left-hand side, and graphs on the right-hand side charting daily consumption over the last 31 days. Two options along the top menu bar are worth exploring:
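If you prefer the command line, the same settings the dashboard summarizes can be read with the Azure CLI. This is a sketch with placeholder resource names (`myResourceGroup`, `myWorkspace` are assumptions); the JMESPath property names match the Log Analytics workspace resource schema:

```shell
# Show the workspace's pricing tier (SKU), retention period, and daily cap.
# Replace the resource group and workspace name with your own.
az monitor log-analytics workspace show \
  --resource-group myResourceGroup \
  --workspace-name myWorkspace \
  --query "{sku: sku.name, retentionInDays: retentionInDays, dailyQuotaGb: workspaceCapping.dailyQuotaGb}"
```

A `dailyQuotaGb` of `-1` indicates that no daily cap is set.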

  • Daily cap – Limits the amount of data ingested into the Log Analytics workspace per day. Once the cap is reached, ingestion stops for the rest of the day, which risks losing security information valuable for detecting threats across your environment, so it should generally be used only in non-production environments.

  • Data retention – Configures the number of days data is retained in the Log Analytics workspace. The default is 31 days, but once Microsoft Sentinel is enabled the free retention period becomes 90 days. Retaining data beyond 90 days incurs a set fee per GB per month.
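Both of these settings can also be applied from the Azure CLI. A minimal sketch, again with placeholder resource names; verify the parameter names with `az monitor log-analytics workspace update --help` for your CLI version:

```shell
# --quota sets the daily cap in GB (use -1 to remove the cap);
# --retention-time sets retention in days (90 is free with Sentinel enabled).
az monitor log-analytics workspace update \
  --resource-group myResourceGroup \
  --workspace-name myWorkspace \
  --quota 10 \
  --retention-time 90
```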

Reviewing Alternative Storage Options

The following solutions can be considered as alternatives to Log Analytics for long-term storage:
  • Azure Blob Storage – You can create a query in Azure Monitor to select the data to move out of the Log Analytics workspace, filtering the information by type, and send it to the required Azure Storage Account. Azure Automation can then run a PowerShell runbook that executes the query, loads the results into a CSV file, and copies the file to Azure Blob Storage.

  • Azure SQL – The data in Azure Blob Storage can be ingested into Azure SQL, providing another way to search and analyze it. Azure Data Factory can connect the Azure Blob Storage location to an Azure SQL Database or Data Warehouse, automating the ingestion of any new data.

  • Azure Data Lake Storage Gen2 – Instead of Azure Blob Storage, you can use Azure Data Lake Storage Gen2, which makes the data accessible from open-source platforms such as HDInsight, Hadoop, Cloudera, Azure Databricks, and Hortonworks. This option is easier to manage, offers better performance, and is cost-effective.
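As a concrete sketch of the Blob Storage option above, the same query–export–copy flow can be driven from the Azure CLI instead of a PowerShell runbook. The workspace ID, query, storage account, and container names below are placeholders, not values from the article:

```shell
# Hypothetical workspace customer ID (a GUID) and resource names.
WORKSPACE_ID="00000000-0000-0000-0000-000000000000"

# 1. Run a filtering query against the workspace and save the results locally.
#    (Requires the Azure CLI log-analytics extension; the table and time
#    filter are examples only.)
az monitor log-analytics query \
  --workspace "$WORKSPACE_ID" \
  --analytics-query "SecurityEvent | where TimeGenerated > ago(1d)" \
  --output tsv > export.tsv

# 2. Copy the exported file to the archive container in Blob Storage.
az storage blob upload \
  --account-name mystorageaccount \
  --container-name archive \
  --name export.tsv \
  --file export.tsv
```

Scheduled regularly (for example, from Azure Automation), this moves older data out of the workspace before the retention window expires.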










