Log management
Log management is the process of generating, transmitting, storing, accessing, and disposing of log data. Log data is composed of entries, and each entry contains information related to a specific event that occurs within an organization's computing assets, including physical and virtual platforms, networks, services, and cloud environments.
The process of log management generally breaks down into:
- Log collection - the process of capturing data from log files, application standard output streams, network sockets, and other sources.
- Log aggregation - the process of bringing all the log data together in a single place for further analysis and/or retention (a minimal collection-and-aggregation sketch follows this list).
- Log storage and retention - the process of handling large volumes of log data in accordance with corporate or regulatory policies.
- Log analysis - the process that helps operations and security teams handle system performance issues and security incidents.
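A minimal sketch of the first two steps, assuming two hypothetical local log files and a JSON-lines file as the aggregation target (none of these names come from any particular tool):

```python
import json
import time
from pathlib import Path

# Hypothetical sources; real deployments also capture application
# standard output streams, network sockets, and other inputs.
SOURCES = [Path("logs/web.log"), Path("logs/db.log")]
AGGREGATED = Path("aggregated.jsonl")  # single place for analysis/retention

def collect(path: Path) -> list[str]:
    """Log collection: capture raw entries from one source (skip if absent)."""
    if not path.exists():
        return []
    return path.read_text().splitlines()

def aggregate() -> None:
    """Log aggregation: append every collected entry, tagged by source,
    to one JSON-lines file for later analysis or retention."""
    with AGGREGATED.open("a", encoding="utf-8") as out:
        for source in SOURCES:
            for raw in collect(source):
                entry = {"source": source.name,
                         "collected_at": time.time(),
                         "raw": raw}
                out.write(json.dumps(entry) + "\n")

if __name__ == "__main__":
    aggregate()
```

Production systems typically replace the local file with a network shipper and a central store, but the shape of the pipeline (collect, tag, aggregate) is the same.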
Overview
Effectively analyzing large volumes of diverse logs can pose many challenges, such as:
- Volume: log data can reach hundreds of gigabytes per day for a large organization. Simply collecting, centralizing, and storing data at this volume can be challenging.
- Normalization: logs are produced in multiple formats. The process of normalization is designed to provide a common output for analysis from diverse sources (see the sketch after this list).
- Velocity: the speed at which logs are produced by devices can make collection and aggregation difficult.
- Veracity: log events may not be accurate. This is especially problematic for systems that perform detection, such as intrusion detection systems.
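To illustrate normalization, the following sketch maps two differently formatted entries (an Apache-style access line and a JSON application event; both inputs and the shared schema are assumptions made for this example) onto one common output:

```python
import json
import re

# Matches an Apache-style access log line (hypothetical example format).
APACHE_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3})'
)

def normalize(raw: str) -> dict:
    """Reduce diverse input formats to one common schema for analysis."""
    common = {"timestamp": None, "source_host": None, "message": raw}
    m = APACHE_RE.match(raw)
    if m:
        common.update(timestamp=m.group("time"),
                      source_host=m.group("host"),
                      message=f'{m.group("request")} -> {m.group("status")}')
        return common
    try:  # fall back to JSON events with assumed "ts"/"host"/"msg" keys
        event = json.loads(raw)
        common.update(timestamp=event.get("ts"),
                      source_host=event.get("host"),
                      message=event.get("msg", raw))
    except json.JSONDecodeError:
        pass  # unknown format: keep the raw line as the message
    return common

print(normalize('192.0.2.1 - - [10/Oct/2000:13:55:36 -0700] "GET / HTTP/1.0" 200'))
print(normalize('{"ts": "2000-10-10T13:55:36Z", "host": "db1", "msg": "connection lost"}'))
```

Both calls yield records with the same keys, so downstream analysis can treat entries from diverse sources uniformly.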
Logging can produce technical information usable for the maintenance of applications or websites. It can serve:
- to determine whether a reported bug is actually a bug
- to help analyze, reproduce, and resolve bugs (illustrated in the sketch after this list)
- to help test new features during development
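A brief sketch of such maintenance-oriented logging, using Python's standard logging module around a hypothetical business function:

```python
import logging

logging.basicConfig(
    level=logging.DEBUG,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)
log = logging.getLogger("orders")  # hypothetical application component

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical business function instrumented for troubleshooting."""
    log.debug("apply_discount called with price=%r percent=%r", price, percent)
    if not 0 <= percent <= 100:
        # The logged values let a maintainer decide whether a reported
        # bug is real and reproduce it from the same inputs.
        log.warning("percent out of range: %r", percent)
    result = price * (1 - percent / 100)
    log.debug("apply_discount returning %r", result)
    return result

apply_discount(100.0, 150)  # leaves a trail showing the suspicious input
```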
Terminology
- Logging would then be defined as all instantly discardable data about the technical process of an application or website, produced as it represents and processes data and user input.
- Auditing, then, would involve data that is not immediately discardable: data that is assembled during the auditing process, stored persistently, protected by authorization schemes, and always connected to some end-user functional requirement (see the sketch below).
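The distinction can be shown in code. In this sketch, the debug message is discardable technical detail, while the audit record is written to persistent storage and tied to a user-facing requirement; the transfer function, record fields, and local file (standing in for an access-controlled audit store) are all assumptions made for illustration:

```python
import json
import logging
import time

log = logging.getLogger("transfer")

def transfer(user: str, amount: float) -> None:
    # Logging: instantly discardable detail about the technical process.
    log.debug("starting transfer, amount=%r", amount)

    # Auditing: a persistent record connected to an end-user functional
    # requirement (e.g. "every transfer must be attributable"). In practice
    # the store would also be protected by an authorization scheme.
    record = {"who": user, "what": "transfer",
              "amount": amount, "when": time.time()}
    with open("audit.jsonl", "a", encoding="utf-8") as audit_store:
        audit_store.write(json.dumps(record) + "\n")

transfer("alice", 250.0)
```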
Deployment life-cycle
- In the initial stages, organizations use different log analyzers to analyze the logs of the devices on the security perimeter. They aim to identify patterns of attack on the organization's perimeter infrastructure.
- With the increased use of integrated computing, organizations mandate logging to identify the access and usage of confidential data within the security perimeter.
- At the next level of maturity, the log analyzer can track and monitor the performance and availability of systems at the enterprise level, especially of those information assets whose availability organizations regard as vital.
- Organizations integrate the logs of various business applications into an enterprise log manager for a better value proposition.
- Organizations merge physical-access monitoring and logical-access monitoring into a single view.