Solved

Joining rows in a csv file using Perl

Posted on 2006-11-16
Last Modified: 2010-05-19
Below is an example of an exported data file from an application that I am working with. I would like to skip the first row and then join the continuation rows onto the record they belong to, so that I can load the result into an Oracle external table.

Exported file:
(spsku),item,location
30759@ROCK,30759,ROCK
,indmd,Case
30759@RK07,30759,RK07
,indmd,Case
,,MIN_TIME,MAX_TIME,0.000000
750@GTWN,750,GTWN
,indmd,Case
468@GTWN,468,GTWN
,indmd,Case
779@GTWN,779,GTWN
,indmd,Case
780@GTWN,780,GTWN
,indmd,Case
85338@GTWN,85338,GTWN
,indmd,Case
,,10/ 2/2006  0:00:00,10/ 3/2006  0:00:00,3564.000000
,,10/ 3/2006  0:00:00,10/ 4/2006  0:00:00,4316.000000
,,10/ 4/2006  0:00:00,10/ 5/2006  0:00:00,8128.000000
,,10/ 5/2006  0:00:00,10/ 6/2006  0:00:00,8382.000000
,,10/ 6/2006  0:00:00,10/ 7/2006  0:00:00,5974.000000


Desired format:
30759@ROCK,30759,ROCK,indmd,Case
30759@RK07,30759,RK07,indmd,Case,,MIN_TIME,MAX_TIME,0.000000
750@GTWN,750,GTWN,indmd,Case
468@GTWN,468,GTWN,indmd,Case
779@GTWN,779,GTWN,indmd,Case
780@GTWN,780,GTWN,indmd,Case
85338@GTWN,85338,GTWN,indmd,Case,,10/ 2/2006  0:00:00,10/ 3/2006  0:00:00,3564.000000,,10/ 3/2006 0:00:00,10/ 4/2006  0:00:00,4316.000000,,10/ 4/2006  0:00:00,10/ 5/2006  0:00:00,8128.000000,,10/ 5/2006  0:00:00,10/ 6/2006  0:00:00,8382.000000,,10/ 6/2006  0:00:00,10/ 7/2006  0:00:00,5974.000000
Question by: ewgf2002
 
Expert Comment by: ozo (ID: 17961847)

$n=0;
while( <DATA> ){
    next unless $n++;                # skip the first (header) row
    print "\n" if $n>3 && !/^,/;     # start a new output line before each new record
    chomp;                           # drop the newline so the next piece joins on
    print;
}
print "\n";                          # terminate the last joined record
__DATA__
(spsku),item,location
30759@ROCK,30759,ROCK
,indmd,Case
30759@RK07,30759,RK07
,indmd,Case
,,MIN_TIME,MAX_TIME,0.000000
750@GTWN,750,GTWN
,indmd,Case
468@GTWN,468,GTWN
,indmd,Case
779@GTWN,779,GTWN
,indmd,Case
780@GTWN,780,GTWN
,indmd,Case
85338@GTWN,85338,GTWN
,indmd,Case
,,10/ 2/2006  0:00:00,10/ 3/2006  0:00:00,3564.000000
,,10/ 3/2006  0:00:00,10/ 4/2006  0:00:00,4316.000000
,,10/ 4/2006  0:00:00,10/ 5/2006  0:00:00,8128.000000
,,10/ 5/2006  0:00:00,10/ 6/2006  0:00:00,8382.000000
,,10/ 6/2006  0:00:00,10/ 7/2006  0:00:00,5974.000000
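
The counter $n simply skips the header row; a newline is printed before every line that does not start with a comma, so the continuation lines stay attached to the record they belong to. To run the same logic against the real export rather than the __DATA__ section, a minimal sketch (the file names export.csv and joined.csv are placeholders, not from the original post) could look like:

#!/usr/bin/perl
use strict;
use warnings;

# Placeholder file names -- adjust to the real export and target paths.
my $in_file  = 'export.csv';
my $out_file = 'joined.csv';

open my $in,  '<', $in_file  or die "Cannot read $in_file: $!";
open my $out, '>', $out_file or die "Cannot write $out_file: $!";

my $n = 0;
while ( my $line = <$in> ) {
    next unless $n++;                        # skip the header row
    chomp $line;
    # Start a new output record before every non-continuation line
    # (continuation lines begin with a comma).
    print {$out} "\n" if $n > 2 && $line !~ /^,/;
    print {$out} $line;
}
print {$out} "\n";

close $in;
close $out;

The joining logic is exactly the one in the snippet above; only the filehandles differ.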

Author Comment by: ewgf2002 (ID: 17965013)

Here is what I have, and I am not getting any results. I am pretty new to Perl.

##########################################################################
#
#   Function:       process_records
#
#   Description:      This function adds the records to an array.
#
#   Returns:      @RecordList
#
##########################################################################

sub process_records
{
    if ($ProdCount == 0)
    {
        while ($CurrentValue)
        {
            next unless $ProdCount++;
            print "\n" if $ProdCount > 3 && !/^,/;
            chomp;

            $RecordList[$ProdCount][0] = $Header;
            $RecordList[$ProdCount][1] = $Item;
            $RecordList[$ProdCount][2] = $Location;
            $RecordList[$ProdCount][3] = $IndDmd;
            $RecordList[$ProdCount][4] = $Uom;
            $RecordList[$ProdCount][5] = $Garbage;
            $RecordList[$ProdCount][6] = $StartDate;
            $RecordList[$ProdCount][7] = $EndDate;

            #$CurrentValue = <INPUTFILE>;
        } # End while
    } # End if

    return @RecordList;

} # End sub process_records

##########################################################################
#
#   Function: print_list
#
#   Description:      This subroutine prints the list of row values in the
#                  proper format:
#                  ProductCode~Location~Date~Qty
#
#   Returns:      @RecordList
#
##########################################################################

sub print_list
{
    # Print the records to the first 'wpl' file
    for $ProcessRow (1..$#RecordList)
    {
       print OUTPUTFILE "$RecordList[$ProcessRow][0]~$RecordList[$ProcessRow][1]~$RecordList[$ProcessRow][2]~$RecordList[$ProcessRow][3]~$RecordList[$ProcessRow][4]~$RecordList[$ProcessRow][5]~$RecordList[$ProcessRow][6]~$RecordList[$ProcessRow][7]\n";      
    } # End for

    close (OUTPUTFILE);

    return @RecordList;

} # End sub print_list
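
For comparison only (the accepted answer below is the solution actually used), here is a minimal sketch of a read loop that actually advances through the file and collects the joined rows. It assumes INPUTFILE has already been opened by the caller, as in the code above; everything else is a guess at the intent, and splitting each joined line back into the Header/Item/Location/... columns used by print_list would be a further step:

sub process_records
{
    my @RecordList;
    my $joined = '';

    while ( my $line = <INPUTFILE> )
    {
        chomp $line;
        next if $. == 1;                  # skip the header row

        if ( $line =~ /^,/ )
        {
            $joined .= $line;             # continuation line: append to the record in progress
        }
        else
        {
            push @RecordList, $joined if $joined ne '';
            $joined = $line;              # start a new record
        }
    }
    push @RecordList, $joined if $joined ne '';   # flush the last record

    return @RecordList;
}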

Accepted Solution by: nishayr (earned 500 total points, ID: 17976489)

Hi there,

Try this... (you may need to update the path to the Perl interpreter on the shebang line at the top.)
-------------------------------------------------------------------------------------------------
#!/usr/bin/perl

use strict;
use warnings;

my $dataFile = "data.txt";
my $outputFile = "formattedData.txt";

open (IN, "$dataFile") || die ("Could not open data file: $dataFile\n");
open (OUT, ">$outputFile") || die ("Could not open output file: $outputFile\n");

my $line;

$line = <IN>; # skip the first (header) line

# initialise with the first data line
$line = <IN>;
chomp($line);
print OUT $line;

while ($line = <IN>)
{
      chomp ($line); # get rid of the newline
      print OUT "\n" if ($line !~ /^,/); # start a new output line before each new record
      print OUT $line;
}
print OUT "\n";

close (IN);
close (OUT);
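
A quick usage note: assuming the export is saved as data.txt next to the script, running it writes formattedData.txt with one record per line, e.g. 30759@ROCK,30759,ROCK,indmd,Case. The same joining logic can also be expressed as a one-liner (the file names are placeholders):

perl -ne 'next if $. == 1; chomp; print "\n" if !/^,/ && $. > 2; print; END { print "\n" }' data.txt > formattedData.txt

Printing the newline before each new record, rather than after every line, is what lets the continuation lines concatenate without any look-ahead.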
