in reply to Re^3: Splitting the records into multiple worksheets
in thread Splitting the records into multiple worksheets

Hello. Do you have a specific example of pulling the raw data into a DB? Please check below how I am pulling the raw data and, if possible, let me know the next step: how I should export it to CSV and then to Excel.

    my $urlstringlog = $apiurllog . $sortor . $filter . $target . $appID . $date2;
    $strExcelFilename = "C:/" . "Analysis-Details-automatic-bk" . $date . ".xlsx";

    do {
        # Fetch the users
        run_api_call($urlstringlog);
        @responsetext = parse_json($client->responseContent());
        push @responsetextall, @responsetext;
        $linkheader = $client->responseHeader("link");
        if ($linkheader =~ m/next/) {
            (my $link1 = $linkheader) =~ s/ .*? (self)/$1/gx;
            (my $link2 = $link1) =~ s/self/$1/g;
            print "Post Pagination: $link1";
            $urlstringlog = $link2;
        }
    } while ($linkheader =~ m/next/);

    for my $i (0 .. $#responsetextall) {
        $responsetextall[$i] =~ s/]\[/,/g;
        for my $j (0 .. $#{ $responsetextall[$i] }) {
            @responsests1        = $responsetextall[$i][$j]{tat};
            $responseid          = $responsetextall[$i][$j]{act}{id};
            $responsdisp         = $responsetextall[$i][$j]{act}{Name};
            $responsalter        = $responsetextall[$i][$j]{act}{Id};
            $responseclientipadd = $responsetextall[$i][$j]{cln}{ipAd};
            $responseappsdebug   = $responsetextall[$i][$j]{deb}{Data};
            for my $m (0 .. $#responsests1) {
                for my $n (0 .. $#{ $responsests1[$m] }) {
                    $responseapps2id = $responsests1[$m][$n]{id};
                    $responseapps2   = $responsests1[$m][$n]{ty};
                    $responseapps1   = $responsests1[$m][$n]{Name};
                    if ($responseapps2 eq 'AppInstance') {
                        @responseapps3 = $responsests1[$m][$n]{displayName};
                        $responseapps5 = $responseapps3[0];
                        print "Its Matches Appinstances";
                        $responseapps4 = $responsests1[$m][$n]{Name};
                        if ($responseapps2 eq 'AppUser') {
                            $responseapps3id = $responsests1[$m][$n]{aId};
                        }
                    }
                    push @responseapps3id, $responseapps3id;
                    push @responseapps5,   $responseapps4;
                    push @responseapps7,   $responseapps7;
                }
            }
            push @responsalter,        $responsalter;
            push @responseclientipadd, $responseclientipadd;
        }
    }
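One way to get flattened records like the ones collected above into a CSV file, sketched in pure Perl so no extra module is needed. The record layout (id, name, ip) below is a hypothetical stand-in for the fields pulled out of @responsetextall; Text::CSV_XS from CPAN is the more robust choice if it is available.

```perl
use strict;
use warnings;

# Minimal CSV quoting: wrap each field in double quotes and double any
# embedded quotes (RFC 4180 style)
sub csv_line {
    my @fields = @_;
    return join(',',
        map { my $f = $_ // ''; $f =~ s/"/""/g; qq["$f"] } @fields) . "\n";
}

# Hypothetical flattened records standing in for the values
# extracted from the API response above
my @records = (
    [ 1, 'Alice "A."', '10.0.0.1' ],
    [ 2, 'Bob',        '10.0.0.2' ],
);

open my $fh, '>', 'analysis.csv' or die "Cannot open analysis.csv: $!";
print $fh csv_line('id', 'name', 'ip');    # header row
print $fh csv_line(@$_) for @records;      # one line per record
close $fh;
```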

Re^5: Splitting the records into multiple worksheets
by Bod (Parson) on Feb 20, 2021 at 22:31 UTC
    Do you have any specific example of pulling the raw data into DB?

    I've not had cause to use SQLite as I usually have another DB available, but this is the sort of thing you need. It is untested and does no error checking when creating connections, opening files, etc. Hopefully it will give you a start.

    use DBI;

    # DBI loads the DBD::SQLite driver named in the DSN automatically;
    # SQLite is file-based, so there is no host, port, user or password
    my $dbh = DBI->connect("dbi:SQLite:dbname=$db_name",
                           undef, undef, { RaiseError => 1 });

    # Create a temporary table
    $dbh->do("CREATE TEMPORARY TABLE IF NOT EXISTS ExcelData
              (name VARCHAR(80), data INT)");

    # Populate the temporary table from your data source,
    # using placeholders rather than interpolating values into the SQL
    my $insert = $dbh->prepare(
        "INSERT INTO ExcelData (name, data) VALUES (?, ?)");
    foreach my $row (@responsetext) {
        my ($name, $data) = split / +/, $row;
        $insert->execute($name, $data);
    }

    # Create a query to sort the data how you want it
    # and output to a CSV file
    open my $fh, '>', 'myfile.csv' or die "Cannot open myfile.csv: $!";
    my $query = $dbh->prepare(
        "SELECT name, data FROM ExcelData WHERE data > 20 ORDER BY data");
    $query->execute;
    while (my ($n, $d) = $query->fetchrow_array) {
        print $fh qq["$n",$d\n];
    }
    close $fh;

        I do limit the privileges of the script's DB account to SELECT, INSERT, UPDATE, DELETE and CREATE TEMPORARY TABLE

      Those are very good suggestions, but I will have to generate a .csv file with around 2,000,000 rows of data in order to compare it against another Excel file. I will not have any server on which to install a database. Can I create a temporary .db file on my system, save the data, and then fetch the data back out as .csv? How can I handle the .xlsx row limit if I have 2,000,000 rows of records? Can I create worksheets dynamically by checking the number of records in the array?

        "I will not have any server to install database."

        Several people have suggested you use DBD::SQLite; you could at least have looked at its one-line description:

        "DBD::SQLite is a Perl DBI driver for SQLite, that includes the entire thing in the distribution. So in order to get a fast transaction capable RDBMS working for your perl project you simply have to install this module, and nothing else."

        "How i can handle the .xlsx file's row limitations in case i will have 2000000 rows of records to handle? Can i create dynamic worksheets by checking the records in array?"

        What do you expect XLSX to do if it has more than the max rows per worksheet? Given the limits are well documented, find a way to work around them or use a tool/format that better suits your source data.
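        For the record, an XLSX worksheet tops out at 1,048,576 rows, so 2,000,000 records must be split across at least two sheets. A sketch of the splitting arithmetic in pure Perl, with the actual write loop shown only as comments, since it would use Excel::Writer::XLSX from CPAN and the @records array here is hypothetical:

```perl
use strict;
use warnings;

# XLSX allows 1,048,576 rows per worksheet; reserve one row for a header
my $MAX_DATA_ROWS = 1_048_575;

# Which worksheet (0-based) a given 0-based record index lands on
sub sheet_for_row {
    my ($row) = @_;
    return int($row / $MAX_DATA_ROWS);
}

# With ~2,000,000 records you need two worksheets:
my $total  = 2_000_000;
my $sheets = sheet_for_row($total - 1) + 1;
print "$total rows need $sheets worksheet(s)\n";

# With Excel::Writer::XLSX the write loop could look like:
#   my $workbook = Excel::Writer::XLSX->new($strExcelFilename);
#   my ($ws, $cur) = (undef, -1);
#   for my $i (0 .. $#records) {
#       my $s = sheet_for_row($i);
#       if ($s != $cur) {    # record would overflow: start a new sheet
#           $ws  = $workbook->add_worksheet("Data" . ($s + 1));
#           $cur = $s;
#       }
#       $ws->write_row($i % $MAX_DATA_ROWS, 0, $records[$i]);
#   }
#   $workbook->close;
```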