LogPoint Collectors – The ingestion component of the LogPoint system
Collectors are responsible for the ingestion, normalization and enrichment of log data from disparate log sources into the LogPoint platform. This is achieved with LogPoint’s single taxonomy and compiled plugins that normalize any log file into LogPoint’s standardized key/value pair format for long-term storage. By normalizing the log data upon ingestion (rather than upon search), LogPoint can significantly speed up the search and correlation process.
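To illustrate the idea of ingest-time normalization, here is a minimal sketch in Python. The pattern plays the role of a compiled plugin, mapping one vendor-specific log line onto a fixed set of keys. The raw line and all field names (`ts`, `src_ip`, etc.) are illustrative assumptions, not LogPoint's actual taxonomy.

```python
import re

# Hypothetical raw line from an SSH daemon (illustrative only).
RAW = "Jan 12 09:14:02 host1 sshd[2211]: Failed password for admin from 10.0.0.5 port 52144"

# A compiled pattern stands in for a normalization plugin: it maps the
# vendor-specific text onto a fixed set of key names.
PATTERN = re.compile(
    r"(?P<ts>\w{3} +\d+ [\d:]+) (?P<host>\S+) (?P<proc>\w+)\[(?P<pid>\d+)\]: "
    r"Failed password for (?P<user>\S+) from (?P<src_ip>[\d.]+) port (?P<src_port>\d+)"
)

def normalize(line: str) -> dict:
    """Return a flat key/value dict, ready for storage and search."""
    m = PATTERN.match(line)
    return m.groupdict() if m else {"raw": line}

event = normalize(RAW)
print(event["user"], event["src_ip"])  # admin 10.0.0.5
```

Because the parsing cost is paid once at ingest, every later search operates on the same flat key/value structure rather than re-parsing raw text.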
There are hundreds of plugins available out-of-the-box when deploying the LogPoint software, further simplifying log ingestion by removing the hassle of configuring these yourself. Indeed, if there is a commercial-off-the-shelf (COTS) product that LogPoint doesn’t have a plugin for, LogPoint will commit to writing one for customers as part of the normal support process.
The Collector architecture also provides full data-enrichment capabilities, meaning that gathered events can be correlated with external metadata for contextual analytics. For instance, LogPoint can be used to correlate Threat Intelligence feeds against any collected data source, or to correlate incidents with CMDBs to quickly pinpoint the locations of devices. Any structured data can be used to enrich the collected data. These capabilities increase the performance and accuracy of analytics through ingest-time enrichment, without the need to import and fragment existing data.
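A minimal sketch of what ingest-time enrichment looks like in principle: each normalized event is checked against an in-memory threat-intelligence set and a CMDB-style lookup table. All names, addresses and data here are hypothetical, not real feeds or LogPoint internals.

```python
# Hypothetical lookup data (stand-ins for a TI feed and a CMDB export).
THREAT_IPS = {"203.0.113.7", "198.51.100.9"}
CMDB = {"host1": {"location": "DC-East", "owner": "finance"}}

def enrich(event: dict) -> dict:
    """Attach threat-intel and asset context to a normalized event."""
    enriched = dict(event)
    enriched["ti_match"] = event.get("src_ip") in THREAT_IPS
    enriched.update(CMDB.get(event.get("host"), {}))  # attach device context
    return enriched

e = enrich({"host": "host1", "src_ip": "203.0.113.7", "action": "login"})
print(e["ti_match"], e["location"])  # True DC-East
```

The design point is that the join happens once at ingest: the stored event already carries its context, so analytics never need a second lookup at query time.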
Alongside the Collector, LogPoint provides the LogPoint Agent, which enables the transmission of encrypted log data, system and integrity monitoring of log sources, and the fetching of log data from sources that lack a native means of sending data to the network security tool. The LogPoint Agent can be deployed and managed from a central LogPoint server to simplify rollout across the customer’s wider network.
LogPoint Backend – The storage component of the LogPoint solution
The Backend of the LogPoint system is a NoSQL-based storage solution in which all data is stored in flat files, enabling searches that complete in seconds. This architecture is split into individual repositories, defined by the customer to best suit business needs.
These repositories (repos) each carry their own retention policy, meaning that sensible use of repos can deliver significant savings on storage infrastructure. The LogPoint platform even allows you to automatically migrate data within a repo onto different storage tiers as it ages, until the data is no longer required and can be automatically deleted.
Finally, access control is integrated into the system so that administrators can control which of their security analysts can view which repo. This is particularly useful where customers wish to store sensitive log data (such as HR records) but restrict correlation of that data to a few trusted employees.
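Conceptually, per-repo access control is a mapping from repos to the analysts allowed to search them. A minimal sketch, with all repo and analyst names invented for illustration:

```python
# Hypothetical per-repo access-control list.
REPO_ACL = {
    "firewall": {"alice", "bob"},
    "hr_records": {"carol"},  # restricted to a few trusted staff
}

def can_search(analyst: str, repo: str) -> bool:
    """True only if the analyst has been granted access to the repo."""
    return analyst in REPO_ACL.get(repo, set())

print(can_search("alice", "firewall"))    # True
print(can_search("alice", "hr_records"))  # False
```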
LogPoint Search head – The analytics component of the LogPoint managed network security system
The Search head is where customers develop the custom content that extracts value from the raw and normalized log data. The built-in log analysis engine uses this content to automatically detect and alert on critical incidents on your systems. Monitored events can be very diverse and can, for example, include an ongoing attack, a compromised system, malicious insider activity, performance degradation, and much more.
To enable your organization to create value fast, LogPoint offers ready-made configurations based on 400+ Use Cases, allowing your team to orchestrate analytics and playbooks without countless hours of professional services.
Analysts produce content (dashboards, alerts, reports) through the search function of the LogPoint portal, which takes advantage of the single taxonomy to simplify the process. Everything stored within the LogPoint backend can be searched using the same key/value pair identifiers, so even when LogPoint releases new functionality, all existing content continues to work as originally intended.
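Because every stored event shares the same key/value identifiers, a search is conceptually just a filter over common keys. A minimal sketch (the events and query style below are hypothetical, not LogPoint's query language):

```python
# Two normalized events sharing the same taxonomy keys (illustrative data).
EVENTS = [
    {"device": "fw1", "action": "deny", "src_ip": "10.0.0.5"},
    {"device": "fw2", "action": "allow", "src_ip": "10.0.0.9"},
]

def search(events, **criteria):
    """Return events whose fields match every key=value criterion."""
    return [e for e in events if all(e.get(k) == v for k, v in criteria.items())]

print(search(EVENTS, action="deny"))
# [{'device': 'fw1', 'action': 'deny', 'src_ip': '10.0.0.5'}]
```

New source types add events with the same keys, which is why existing dashboards and alerts keep working unchanged.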
Within LogPoint, events can trigger alerts, but can also be configured to trigger an incident. Incidents within LogPoint can be assigned risk levels and then assigned to a user, who can jump directly through the GUI to the data that triggered the incident. Once the incident has been investigated, the analyst can comment on, resolve, close or re-open it.
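The incident lifecycle above can be sketched as a small state machine. The exact states and transitions here are an assumption for illustration, not LogPoint's internal model:

```python
# Hypothetical incident lifecycle: open -> resolved -> closed, with re-open.
TRANSITIONS = {
    "open": {"comment": "open", "resolve": "resolved"},
    "resolved": {"close": "closed", "re-open": "open"},
    "closed": {"re-open": "open"},
}

class Incident:
    def __init__(self, risk: str, assignee: str):
        self.risk, self.assignee, self.state = risk, assignee, "open"

    def apply(self, action: str) -> str:
        """Move to the next state; raises KeyError on invalid actions."""
        self.state = TRANSITIONS[self.state][action]
        return self.state

inc = Incident(risk="high", assignee="alice")
inc.apply("resolve")
print(inc.apply("close"))  # closed
```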
Finally, to further enable the segregation of data, the four-eyes principle can be applied to individual key/value pairs within logs through the implementation of Data Privacy mode. In LogPoint, any field that could identify a given user can be encrypted or obfuscated in the GUI so that no analyst sees the original data. The analyst can still run queries, view dashboards and reports and so on, but the encrypted field values are shown instead of the real data. If an analyst needs to see the unencrypted data, they can request time-based access from the company’s nominated Data Privacy officer.
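The core mechanism can be sketched as field-level pseudonymisation: identifying fields are replaced by a keyed hash before display, so queries and dashboards still group consistently while the real value stays hidden. The key, the choice of HMAC, and the field list are all assumptions for illustration; LogPoint's actual implementation is not specified here.

```python
import hashlib
import hmac

SECRET = b"rotate-me"               # hypothetical key held by the platform
PRIVATE_FIELDS = {"user", "src_ip"}  # hypothetical identifying fields

def mask(event: dict) -> dict:
    """Replace identifying field values with a short keyed hash."""
    out = {}
    for k, v in event.items():
        if k in PRIVATE_FIELDS:
            out[k] = hmac.new(SECRET, str(v).encode(), hashlib.sha256).hexdigest()[:12]
        else:
            out[k] = v
    return out

masked = mask({"user": "jdoe", "action": "login"})
print(masked["action"], masked["user"] != "jdoe")  # login True
```

Because the hash is deterministic, the same user produces the same masked token across events, so correlation still works; only someone granted time-based access would see the real value.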