"...i'm worrying about is its speed...the data is already in the script...what about tens of thousands of strings?"
I assume you expect/want qweqwe rtyr tyr \'asdasd fghfghfgh as output, right?
An alternative might be to write your data to a file as JSON, with something like this:
use JSON::XS;
my $data = [qw( qweqwe rtyr tyr \'asdasd fghfghfgh )];
my $json = encode_json $data;
open my $out, '>', 'data.dat' or die $!;
print $out "$json\n";    # one JSON document per line
close $out;
Then process your data line by line as usual:
#!/usr/bin/env perl
use strict;
use warnings;
use feature qw(say);
use JSON::XS;
# use Data::Dump;

open my $fh, '<', 'data.dat' or die $!;
while ( my $line = <$fh> ) {
    chomp $line;
    say nikolay( decode_json($line) );
}
close $fh;

sub nikolay {
    my $data = shift;
    join " ", @$data;    # whatever your processing is...
}
__END__
karls-mac-mini:monks karl$ ./json.pl
qweqwe rtyr tyr \'asdasd fghfghfgh
...
Please see also JSON::XS, JSON and Benchmark.
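Since your concern is speed with tens of thousands of strings, here is a minimal Benchmark sketch comparing the XS decoder against the pure-Perl JSON::PP on a synthetic payload (the `string_$_` data and the 50_000 count are just placeholders I made up for illustration):

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use Benchmark qw(cmpthese);
use JSON::XS ();
use JSON::PP ();

# Synthetic payload: tens of thousands of strings, roughly
# the scale mentioned in the question.
my $data = [ map { "string_$_" } 1 .. 50_000 ];
my $json = JSON::XS::encode_json($data);

# Run each decoder for at least 2 CPU seconds and compare rates.
cmpthese( -2, {
    'JSON::XS' => sub { JSON::XS::decode_json($json) },
    'JSON::PP' => sub { JSON::PP::decode_json($json) },
} );
```

On my box the XS decoder is faster by a wide margin; run it against your real data to see whether the line-by-line approach is fast enough for you.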
Edit: Link added.
Best regards, Karl
«The Crux of the Biscuit is the Apostrophe»