Calling all Perl Gurus and BASH Masters

Darkstar757

Diamond Member
Feb 1, 2003
3,190
6
81
OK, here is the issue: I need to draft a script that will parse a standard Unix mail file and split it so that any mail older than two months goes into a separate file. I'd really like to do this in Perl, though sed and awk are fine as well. I want to figure out as much of this as I can on my own, but I need some help. Right now my script can read in the entire mail file and print it back out; what I don't know is how to sort and split the file without damaging the standard Unix mail file format.
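
In case it helps, the read-and-print part is roughly like this (a simplified sketch rather than my exact script):

#!/usr/bin/perl
use strict;
use warnings;

# Read the whole mail file and print it back out, line by line.
my $mbox = shift @ARGV or die "usage: $0 mailfile\n";
open my $fh, '<', $mbox or die "can't open $mbox: $!\n";
while (my $line = <$fh>) {
    print $line;
}
close $fh;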


Thanks so much guys,
Darkstar
 

notfred

Lifer
Feb 12, 2001
38,241
4
0
What is the standard Unix mail file format? I forget; I haven't looked at it in a while.
 

notfred

Lifer
Feb 12, 2001
38,241
4
0
Yeah, I'm working on a Windows machine at the moment and don't remember the format off the top of my head.
 

Darkstar757

Diamond Member
Feb 1, 2003
3,190
6
81
I can't post a sample because there is sensitive email in it.


However, the file is on a Sun machine using the standard email format; here is a clip from the file header:


From MAILER-DAEMON Wed Jun 1 14:00:11 2005
Date: 01 Jun 2005 14:00:11 -0400
From: Mail System Internal Data <MAILER-DAEMON@sun0>
Subject: DON'T DELETE THIS MESSAGE -- FOLDER INTERNAL DATA
Message-ID: <1117648811@sun0>
X-IMAP: 1108847500 0000002506 Junk NonJunk $Forwarded $MDNSent $Label5 $Label1 $Label2 $Label3 $Label4
Status: RO
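
For reference, each message in this file starts with a separator line beginning with "From " plus a space (like the first line of the clip above). Something along these lines should count the messages (an untested sketch):

#!/usr/bin/perl
use strict;
use warnings;

# Count the messages in a mail file by counting "From " separator lines.
my $count = 0;
while (<>) {
    $count++ if /^From /;    # the separator line, not the "From:" header
}
print "$count messages\n";

The part I can't work out is checking each message's date and splitting the file without breaking that layout.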

 

urname7698

Senior member
Feb 2, 2004
479
0
0
Originally posted by: Darkstar757
I can't post a sample because there is sensitive email in it.


However, the file is on a Sun machine using the standard email format; here is a clip from the file header:


From MAILER-DAEMON Wed Jun 1 14:00:11 2005
Date: 01 Jun 2005 14:00:11 -0400
From: Mail System Internal Data <MAILER-DAEMON@sun0>
Subject: DON'T DELETE THIS MESSAGE -- FOLDER INTERNAL DATA
Message-ID: <1117648811@sun0>
X-IMAP: 1108847500 0000002506 Junk NonJunk $Forwarded $MDNSent $Label5 $Label1 $Label2 $Label3 $Label4
Status: RO

Seems simple enough..
read into an array
loop through the array reading lines and writing to a new file
check the date line
change the output file name (split)
keep reading, writing output..

or in bash you could probably just do:
LINE=`egrep -n 'Date: <FILLINTHEDATE>' <FILENAME> | head -1 | awk -F: '{ print $1 }'`
head -n `expr $LINE - 1` <FILENAME> > beforedate
tail -n +$LINE <FILENAME> > afterdate

OK, that's completely untested but you get the idea..
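
And if you'd rather stay in Perl, something along those lines might look like this (also completely untested; it keys off the date on the "From " separator line rather than the Date: header, treats "two months" as 60 days, and old.mbox / recent.mbox are just placeholder names):

#!/usr/bin/perl
use strict;
use warnings;
use Time::Local;

# Sketch: copy each message to old.mbox or recent.mbox depending on whether
# its "From " separator line is older than the cutoff (~two months = 60 days).
my %month_num = (Jan=>0, Feb=>1, Mar=>2, Apr=>3, May=>4,  Jun=>5,
                 Jul=>6, Aug=>7, Sep=>8, Oct=>9, Nov=>10, Dec=>11);
my $cutoff = time() - 60 * 24 * 60 * 60;

my $mbox = shift @ARGV or die "usage: $0 mailfile\n";
open my $in,  '<', $mbox         or die "can't open $mbox: $!\n";
open my $old, '>', 'old.mbox'    or die "can't write old.mbox: $!\n";
open my $new, '>', 'recent.mbox' or die "can't write recent.mbox: $!\n";

my $out = $new;    # handle the current message is being written to
while (my $line = <$in>) {
    # Separator looks like: From MAILER-DAEMON Wed Jun  1 14:00:11 2005
    if ($line =~ /^From \S+\s+\w{3}\s+(\w{3})\s+(\d+)\s+(\d+):(\d+):(\d+)\s+(\d{4})/) {
        my ($mname, $day, $h, $m, $s, $year) = ($1, $2, $3, $4, $5, $6);
        my $when = timelocal($s, $m, $h, $day, $month_num{$mname}, $year);
        $out = ($when < $cutoff) ? $old : $new;
    }
    print {$out} $line;
}
close $in;
close $old;
close $new;

Since new mail just gets appended to the end of the file, you could also find the first separator newer than the cutoff and split the file at that point, which is basically what the head/tail version above does.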
 