Here’s a list of the 10 best log management tools:
- SolarWinds Papertrail
- Splunk
- Fluentd
- Logstash
- Kibana
- Graylog
- XpoLog
- ManageEngine Syslog Forwarder
- Managelogs
- DIY log archiving
Looking for an alternative to Cronolog?
Cronolog was a handy tool that reorganized your server logs by date. Andrew Ford wrote his own script to split out his Apache web server log files. He was so pleased with all of the time that little script saved him that he made it available to the systems administrators of the world by publishing the code online. He also set up a website, called cronolog.org, to promote the script. Unfortunately, Cronolog, which was written for Unix and Linux, is no longer available.
If you are looking for an alternative to Cronolog to manage your system log files, we’ll show you some other utilities to try. Very few utilities match Cronolog exactly, mainly because the task of copying files is very straightforward and most system administrators can create such a script in a minute. However, we found some newer log manager utilities that give you even better functionality than Cronolog.
What about Analog?
Analog was a free web log analysis program, first released in 1995 by Stephen Turner.
Analog provided analysis of metrics such as the number of requests, requests by method, requests per path, response time statistics, and much more.
Unfortunately, it was discontinued back in 2010, but you may find the following list of log management and analysis tools useful in finding an alternative.
1. SolarWinds Papertrail
Papertrail is a log management system produced by SolarWinds, a leading network software producer. The main purpose behind Papertrail is to centralize all log file data in one place, so it is a log aggregator. That makes it markedly different from Cronolog, a logfile parser. That said, Papertrail’s file content filtering capabilities can extract records by date.
You can use Papertrail to examine a range of log files, including Windows events, Ruby on Rails program messages, router and firewall notifications, and Apache server log files. The service is cloud-based, so you don’t need to worry whether it will run on your operating system. You access the dashboard through your web browser.
The price for the service varies depending on the search volume that you put through it. There is a free plan that gives you a data throughput allowance of 100 MB per month. That is not very much, but if you limit your service coverage to just Apache logs, you might be able to get away with it. The cheapest paid plan gives you a data allowance of 1 GB per month at a price of $7. The paid plans work on a subscription basis and you pay a monthly fee.
Each plan lets you view a period of data and allows you to archive data for a different length of time. For example, the free service lets you operate on data from the last 48 hours and you can archive data for seven days. This would be enough to emulate Cronolog, because for that, you only need to look at one day’s worth of data at a time.
2. Splunk
Splunk is a comprehensive log management system for macOS, Linux, and Windows. The system is a well-known utility within the system administration community. Splunk, Inc. produces three versions of its network data monitoring software. The top-of-the-line version is called Splunk Enterprise, which costs $173 per month. This is a network management system rather than just a log file organizer. Fortunately, Splunk is also available for free, which earns it a place in our list of Cronolog alternatives.
The free Splunk is restricted to input file analysis. You can feed in any of your standard logs or funnel live data through a file into the analyzer. The free utility can only have one user account and its data throughput is limited to 500 MB per day. The system doesn’t explicitly deal with network alerts, but you could force that functionality by getting alerts written to a file and then bounced into Splunk.
A data sorting and filtering utility is built into Splunk, and you can write out to files from the analyzer. These features can emulate Cronolog by dividing log records by date and writing each group out to new files.
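As a rough sketch of what that emulation could look like in Splunk’s search language (the index name, sourcetype, and date shown here are illustrative assumptions, not defaults):

```
index=web sourcetype=access_combined
| eval day=strftime(_time, "%Y-%m-%d")
| where day="2024-01-15"
| table _raw
| outputcsv apache_access_2024-01-15
```

The outputcsv command writes the filtered records to a file on the Splunk host, so scheduling a saved search along these lines once a day approximates Cronolog’s daily split.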
3. Fluentd
Like Cronolog, Fluentd runs on Linux systems such as Debian, CentOS, and Ubuntu. It can also be installed on Mac OS, Amazon Linux, RHEL, and Windows. The utility acts as a hub for log file information gathered by agents on your system. The tool can collect live data streams to create log files as well as monitor and manage existing files. One of the data sources that Fluentd is written to manage is the logging system of Apache.
Results from log record analysis can be made to trigger alerts, but these have to be processed by Nagios, or a Nagios-based monitoring system. Fluentd is an open-source project, so you can download the source code. This tool is free to use.
The Fluentd website is the source for the program and it is also the location of community pages where you can get help and advice on running the tool from other users. The core package can be extended through plugins written by other community members. Those plugins are usually free of charge.
You can use many other free interfaces as a front end for Fluentd, such as Kibana. The Fluentd utility can also be integrated with tools that include Elasticsearch, MongoDB, and InfluxDB for analysis.
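As an illustrative sketch (the file paths and tag name are assumptions for the example), a Fluentd configuration along these lines tails the Apache access log and writes one archive file per day via the time-keyed buffer:

```
<source>
  @type tail
  path /var/log/apache2/access.log
  pos_file /var/log/fluentd/apache-access.pos
  tag apache.access
  <parse>
    @type apache2
  </parse>
</source>

<match apache.access>
  @type file
  path /var/log/archive/apache.access
  <buffer time>
    timekey 1d
    timekey_use_utc true
  </buffer>
</match>
```

With timekey set to one day, the file output plugin rolls records into a new dated file each day, which is essentially what Cronolog did.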
4. Logstash
Logstash is a log creation facility produced by Elastic. This Dutch software organization has created a range of data exploration products that link together in the “Elastic Stack.” This suite of programs is open source and each product is available for free. The core element of the Elastic Stack is Elasticsearch. This is a searching and sorting utility that can process data from several files into unified results. Elasticsearch can be integrated into other tools and is available for use with many of the other utilities in this list.
Logstash is the Elastic Stack’s data gathering tool. The functions of Logstash can be tailored to emulate Cronolog. The facility creates source files for analysis by other tools, such as Elasticsearch. The power of this tool is that it can collate data from several different sources. However, if you want to reorganize your Apache log files, there is no reason why you can’t limit the data search to just one source log file.
The capabilities of Logstash include file parsing, so you can use this function to split up your log files by date. The output of Logstash can be formatted to suit a long list of utilities for analysis or display. It can also be written to a plain text file on disk, which is exactly what Cronolog used to do.
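A minimal Logstash pipeline in that spirit might look like the following (the file paths are assumptions; the %{+YYYY-MM-dd} date reference in the output path is what splits records into one file per day):

```
input {
  file { path => "/var/log/apache2/access.log" }
}
filter {
  grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
}
output {
  file {
    path  => "/var/log/archive/access-%{+YYYY-MM-dd}.log"
    codec => line { format => "%{message}" }
  }
}
```

The line codec writes each original record back out as plain text, so the dated output files look just like the Cronolog archives they replace.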
5. Kibana
Elastic produces Kibana, which is a great free front end for any data gathering tool. Other useful tools in this list can funnel data to Kibana, so you don’t have to rely just on the other Elastic Stack programs to source data for this application.
The full capabilities of Kibana go way beyond the file parsing function of Cronolog. However, the wide range of commands available with Kibana include basic file management that can split out any log file by date. Kibana has a command language console that lets you create scripts and programs to process files. However, if you don’t have programming skills, the preset data manipulation facilities of the interface give you a lot of powerful data sorting and filtering utilities that will help you manage your log files.
The interface includes time-based analysis tools including filters, so you can easily isolate records in a log file that relate to a specific date. Raw data, graphs, and other visualizations can be written out to files or used to generate reports. Standard reports can be scheduled to run periodically, so creating a filter by date and setting it to run daily and output to a plain text file would give you exactly the same results that you used to get from Cronolog.
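For example, a date-range query typed into Kibana’s console isolates one day’s worth of records (the index name apache-logs and the dates are assumptions for illustration):

```
GET apache-logs/_search
{
  "query": {
    "range": {
      "@timestamp": {
        "gte": "2024-01-15",
        "lt":  "2024-01-16"
      }
    }
  }
}
```

The results of a query like this can then be exported or fed into a scheduled report to reproduce Cronolog’s daily output.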
The benefit of using Kibana is that it can give much more assistance than Cronolog could. You can compare data from different sources and visualize the information from all of your system log files to analyze performance and forecast capacity requirements. To get a full data management facility, you should probably use Logstash to collate source data, Elasticsearch to sort data, and Kibana to display results. Kibana has plenty of data sourcing and manipulation facilities, so it could be used as a standalone data analysis tool.
6. Graylog
Graylog is a free, open-source log management system that can give you a lot more functionality than just a log archiving utility. This log analyzer has a graphical user interface and it can run on Ubuntu, Debian, CentOS, and SUSE Linux. You can also run it on a virtual machine on Microsoft Windows and you can install the Graylog system on Amazon AWS.
This log management facility can work with any logs. You can feed data into it from other sources by channeling system reports into a file, thus creating your own logs. The interface doesn’t acquire copies of logs, but sits on live logs, updating the information that feeds into the analyzing engine as new records are written to the log.
Action scripts can forward log data to the screen, to other logs, or on to other applications. The dashboard shows data in the form of histograms, pie charts, line graphs, and color-coded lists. The interface includes a search and query function, which allows you to filter log records to get information on specific types of events or specific sources.
The Graylog processes aggregate data to simplify displays on the Dashboard’s Home page and also to enable alert conditions to be specified across data sources and over time. Those overall views of data are not your only option because you can drill down and see the detailed records that created a summary. This makes Graylog a data mining tool.
Alert conditions can be customized and you can write actions to be performed in the event of alerts arising. These actions include executing scripts or notifying specific team members by email or by Slack message.
This is an amazing and very comprehensive tool that can automate your log file processing and automatically execute fault resolution.
7. XpoLog
The two essential elements of Cronolog are that it could split up log files by date and that it could be run automatically. XpoLog includes both those functions. XpoLog is a vast improvement on that discontinued log parsing tool, however, because it includes a lot of other functionality.
XpoLog can analyze data from a range of sources, including Apache server logs, AWS, Windows and Linux event logs, and Microsoft IIS. The utility can be installed on Mac OS X 10.11, macOS 10.12 and 10.13, Windows Server 2008 R2, Windows Server 2012, Windows Server 2016, Windows 8, 8.1, and 10. The software can also be installed on Linux Kernel 2.6 and later. You can opt for a cloud-based version if you don’t want to install the software. You can access it through Chrome, Firefox, Internet Explorer, or Microsoft Edge.
Apart from straightforward log file management, the XpoLog analysis engine detects unauthorized file access and helps optimize application and hardware usage. XpoLog gathers data from selected sources and will monitor those files that you include in its scope. Once data is centralized, XpoLog merges all data sources and creates its own database of records. Those records can be searched and filtered for analysis, and results can be written out to files. That functionality offers the same file parsing as Cronolog. Results can be written out to files or retained as archives for viewing through the XpoLog dashboard.
XpoLog is available for free. If you just want to split up your Apache log files, then the free version will be good enough. To deal with larger volumes of data and employ the system for analysis, then you might have to step up to one of the paid plans.
The free version allows you to process up to 1 GB of data per day, and the system will retain that data for five days. You could always write out the records to text files to get around that five-day limit. The cheapest paid plan offers exactly the same data throughput limit and data retention period as the free service, so it is difficult to see why anyone would pay the $9 per month price tag for that package. More expensive plans give you an unlimited data retention period, with the cheapest unlimited option including an allowance of 1 GB of data throughput per day for $39 per month. You get progressively larger daily data throughput allowances at each price point. The top plan gives you a data throughput of 8 GB per day and costs $534 per month. You have to pay for the service annually in advance, even though it is priced per month. You can also buy a perpetual license.
8. ManageEngine Syslog Forwarder
The Syslog Forwarder runs on the Windows operating system and it is completely free to use. It intercepts syslog records and forwards them on to different syslog servers, according to a rule base. The functions of the forwarder let you filter out irrelevant, mundane, or unimportant log messages. Blocked messages remain in the original log file but don’t get forwarded on to the destination log file.
The rule base of the Syslog Forwarder allows you to write to new log files each day, thus emulating the functionality of Cronolog. The big difference between Syslog Forwarder and Cronolog is that this existing log manager runs on Windows with a GUI interface, whereas Cronolog was a command line function for Unix and Linux systems.
9. Managelogs
Probably the closest alternative to Cronolog, Managelogs is written in C. Not only is the utility free, but the source code is available for you to read through. The program is specifically designed to manage Apache web server logs.
Managelogs has different operating modes activated by the variables specified when launching the program. You can set the utility to archive log files by date, or you can specify a maximum file size, which will copy over the log file to a new name and then clear out the current log file so it can start again from scratch and build up new records.
If you specify that logs should be split by date, Managelogs will ensure that files are consolidated across sessions, so stopping and restarting the server manager won’t wipe out existing records on an incomplete day.
10. DIY log archiving
You can write your own copy of Cronolog as a script for Unix or Unix-like operating systems such as Linux and Mac OS. Although there are plenty of clever things you can do with regular expressions and pattern matching to pick out records for a specific date, the easiest way to get log archives per day is to write a copy script and then schedule it to run at midnight. If the last instruction in the script empties the live log file, new records will accumulate in a fresh file throughout the day, to be archived off again at midnight.
LOGDIR=/opt/apache/logs; LOGARCH=/opt/apache/archive
DATE=$(date +%Y-%m-%d); FILES="access_log error_log"
for f in $FILES; do
    cp "$LOGDIR/$f" "$LOGARCH/$f.$DATE.log"    # keep a dated archive copy
    cat /dev/null > "$LOGDIR/$f"               # truncate the live log for the new day
done
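If the archiving script is saved as, say, /usr/local/bin/archive_logs.sh (a path chosen purely for illustration), a crontab entry like this runs it at midnight every day:

```
# min hour day month weekday  command
0 0 * * * /usr/local/bin/archive_logs.sh
```

Add the line with crontab -e under a user that has write access to the Apache log directory.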
Don’t get stressed that cronolog.org is no longer operating or that the download sites that used to deliver Cronolog no longer list it. Cronolog was not that great, and you could quite easily write your own version in just a couple of minutes.
Log management utilities are very useful and despite the limited capabilities of Cronolog, many systems administrators came to rely on its services. As you can see from this review, many other log management tools not only give you the ability to parse your log files by date, but also give you some amazing data visualization and analysis features.
Every one of the recommendations in our list of Cronolog replacements can be used for free. All of these facilities give you better service than the do-it-yourself replication of Cronolog. Try out any of these tools and see which of them gives you the extra features needed to improve log and facilities management.