#!/usr/bin/perl
# Convert hexdump -C like output back to binary. Supports * lines, which
# xxd -r does not (this is the reason for this script). Assumes good input.
use strict;
use warnings;
my ($off, $line, $asterisk);
while (<>) {
    if (/^([[:xdigit:]]{2,}0)\s+((?:[[:xdigit:]]{2}\s+){1,16})/) {
        if (defined($asterisk)) {
            my $nlines = (hex($1) - hex($off)) / 16 - 1;
            print map { chr hex } split /\s+/, $line x $nlines;
            undef $asterisk;
        }
        print map { chr hex } split /\s+/, $2;
        $off = $1, $line = $2;
    }
    elsif (/^\*/) {
        $asterisk = 1;
    }
    else {
        last;
    }
}
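As a quick illustration of the decoding step (using a made-up sample row, not real input), the regex captures the offset and the hex-byte columns of one hexdump -C row, and each two-digit pair is converted back to a byte:

```perl
#!/usr/bin/perl
# Minimal sketch of the decoding step on a hypothetical sample row:
# the regex captures the offset and the hex-byte columns, and each
# two-digit hex pair becomes one output byte.
use strict;
use warnings;

my $row = "00000000  48 65 6c 6c 6f 21                                 |Hello!|\n";
my ($offset, $bytes);
if ($row =~ /^([[:xdigit:]]{2,}0)\s+((?:[[:xdigit:]]{2}\s+){1,16})/) {
    $offset = $1;
    $bytes  = join '', map { chr hex } split /\s+/, $2;
}
print "$offset: $bytes\n";   # prints "00000000: Hello!"
```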

Re: hexdump2bin: convert hexdump -C like output back to binary
by Anonymous Monk on Oct 07, 2014 at 10:57 UTC
    When I try hexdump2bin.pl on a large 'hexdump -C' output with many '*' lines covering large offsets, it fails with "Out of memory!". I am not a perl expert. Can this be fixed?

      Most likely this line:

      print map { chr hex } split /\s+/, $line x $nlines;

      is problematic: the x operator constructs a string of $nlines * length($line) bytes in memory before it is split and printed.
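      To put rough numbers on that (assumed example sizes, not taken from the original post): a '*' line covering a 1 GiB gap corresponds to about 67 million 16-byte rows, so repeating a ~48-character row text that many times builds a multi-gigabyte string before split even runs:

```perl
#!/usr/bin/perl
# Back-of-the-envelope sketch (assumed sizes): how big the temporary
# string built by "$line x $nlines" gets for a 1 GiB '*' gap.
use strict;
use warnings;

my $gap_bytes = 2**30;                  # bytes covered by the '*' line
my $nlines    = $gap_bytes / 16;        # each hexdump -C row covers 16 bytes
my $row_chars = length("ff " x 16);     # 48 characters of row text
my $temp      = $nlines * $row_chars;   # size of the repeated string
printf "temporary string: %d bytes (~%.1f GiB)\n", $temp, $temp / 2**30;
```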

      The simplest fix would be to convert the statement into a loop:

      for my $repeat (1 .. $nlines) {
          print map { chr hex } split /\s+/, $line;
      }

      I haven't tested this change, so you're advised to compare the output of the modified script with that of the original before trusting it.
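      One way to gain some confidence in the change without real input (a small self-check, not from the original thread) is to compare both approaches on a sample row:

```perl
#!/usr/bin/perl
# Self-check on a made-up sample row: the per-row loop emits exactly
# the same bytes as the original "$line x $nlines" string repetition.
use strict;
use warnings;

my $line   = "de ad be ef " x 4;   # stand-in for one captured hexdump row
my $nlines = 3;

# Original approach: repeat the whole row text, then decode in one go.
my $big = join '', map { chr hex } split /\s+/, $line x $nlines;

# Looped approach: decode one row at a time, as suggested above.
my $loop = '';
for my $repeat (1 .. $nlines) {
    $loop .= join '', map { chr hex } split /\s+/, $line;
}

print $big eq $loop ? "outputs match\n" : "outputs differ\n";
```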

      Personally, I wonder how large your input hexdump and the resulting binary output are. Perl usually handles strings up to 2GB without any problem, so your machine likely has less RAM than that.