In Novell Log Manager, a user can export search results as a CSV file. The generated CSV usually has more than 300 columns, corresponding to the 300-odd event fields in the Sentinel schema, and 200 or more of those columns may be empty.
So you may have data in the 1st column and data in the 300th column, but you have to scroll across all the empty columns in between just to see it. Moreover, OpenOffice, Excel, and Google Docs often throw an error when trying to open a CSV file with so many columns.
To solve this problem, I have written a simple Perl script that removes all the empty columns from the specified input file and writes the result to the specified output file. The resulting file can then be viewed and edited easily in any editor.
To run this script, simply open a terminal in Linux and type:
perl deleteEmptyColumnsFromCSV.pl Input_CSV_Filename Output_Filename
Substitute Input_CSV_Filename with the name of the CSV file from which you want the empty columns removed, and Output_Filename with the name of the file to which the output should be written.
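The Perl script itself is not reproduced in this post, but the core idea is straightforward: a column is dropped only if it is empty in every data row. A minimal sketch of that logic, written here in Python for illustration (the function name and the assumption that the first row is a header are mine, not taken from the original script):

```python
import csv
import sys

def delete_empty_columns(input_path, output_path):
    """Copy a CSV file, dropping columns whose data cells are all empty.

    The first row is assumed to be a header row; it is not counted when
    deciding whether a column is empty, since the field names would
    otherwise make every column look non-empty.
    """
    with open(input_path, newline="") as f:
        rows = list(csv.reader(f))
    if not rows:
        open(output_path, "w").close()  # empty input -> empty output
        return
    data = rows[1:]  # skip the header when testing for emptiness
    width = max(len(r) for r in rows)
    # Keep a column index if any data row has a non-blank cell there.
    keep = [i for i in range(width)
            if any(i < len(r) and r[i].strip() for r in data)]
    with open(output_path, "w", newline="") as f:
        writer = csv.writer(f)
        for r in rows:
            writer.writerow([r[i] if i < len(r) else "" for i in keep])

if __name__ == "__main__" and len(sys.argv) == 3:
    delete_empty_columns(sys.argv[1], sys.argv[2])
```

Note that a proper CSV parser is used rather than a plain split on commas, since Sentinel field values can themselves contain quoted commas.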