I could probably do it by brute force but also do not know how to approach that either.
Let's assume for a moment that the problem was to use each of your (small, but for discussion only) test values at most once to achieve the target value. With such a small set, and the once-only criterion, the problem is simple and fast:
#! perl -slw
use strict;
use Data::Dump qw[ pp ]; $Data::Dump::WIDTH = 1000;
use List::Util qw[ sum ];
use Algorithm::Combinatorics qw[ combinations ];
use Time::HiRes qw[ time ];
=comment
I have a number of sizes of weights, for argument purposes, say they are
.01 .12 1.1 2.2 5.3 10.5 50.1 100.6 gram.
Each weight will also have a known precision but that complexity will only
be used in the program once I figure out the basics.
I need to write a program that calculates the smallest total number of
weights to use to obtain a total that is as close to a desired as possible.
=cut
our $M //= 1;
my $start = time;
my %results;
my @vals = ( qw[ 0.01 0.12 1.1 2.2 5.3 10.5 50.1 100.6 ] ) x $M;   # -M=n on the command line replicates the set so each value can be used up to n times
for my $k ( 1 .. $#vals ) {
    my $iter = combinations( \@vals, $k );
    while( $_ = $iter->next ) {
        # key each combination by its sum; remember every combination that produces that sum
        push @{ $results{ sum @$_ } }, [ @$_ ];
    }
}
printf "Finding all %d results took %6.3f seconds\n", scalar( keys %re
+sults ), time() - $start;
printf "<Enter> to see them all"; <STDIN>;
pp \%results;
__END__
C:\test>1216334 -M=1
Finding all 254 results took 0.004 seconds
<Enter> to see them all
The results:
Now let's loosen my additional criterion and say that any of your values can be used at most twice. Here's the headline statistic (the full data is not interesting):
C:\test>1216334 -M=2
Finding all 4817 results took 1.284 seconds
<Enter> to see them all
Terminating on signal SIGINT(2)
Note how the number of combinations has risen from 254 to 4817, a 19x increase; but the significant thing is the time taken: an increase from 0.004 seconds to 1.284 seconds, a 321x increase.
So, extrapolating for -M=3 (ie. each value can be used at most 3 times) we would expect: number of results: 4817 x 19 = 91,523; and the time: 1.284 x 321 ≈ 412 seconds.
However, when we run the code we get:
C:\test>1216334 -M=3
Finding all 16777214 results took 173.459 seconds
4817 to 16,777,214 is a 3483x increase. (And don't get excited about the 173 seconds; a 16.7-million-key hash of arrays requires enough memory to push my 16GB machine into swapping, so I commented out the bit that stored the results.)
The point here is that the growth in size and time is not geometric (multiply); nor even exponential (powers); but geometrically exponential. It grows very^(n*m) quickly.
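To see where those counts come from without running the search: with -M=n the @vals array holds 8*n entries, and the loop visits every combination of size 1 .. 8*n-1, i.e. 2**(8*n) - 2 of them. That gives 254, 65,534 and 16,777,214 for n = 1, 2, 3 (the 4,817 reported above for -M=2 is smaller only because the program counts distinct sums, not combinations). A throwaway sketch of that arithmetic:

#! perl -slw
use strict;
# Number of combinations the search visits for repeat factor n, without
# generating them: sum of C(8n,k) for k = 1 .. 8n-1 is 2**(8n) - 2.
for my $n ( 1 .. 4 ) {
    printf "-M=%d : %.0f combinations\n", $n, 2 ** ( 8 * $n ) - 2;
}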
Imagine that the tolerance on a 0.01 part was, percentage-wise, fractionally less than the tolerance on a 100.6 part. In other words, to accumulate a total of 100.61, it would be better (according to your tolerance criteria) to use 10,061 x 0.01 parts than to use 1 x 100.6 part + 1 x 0.01 part.
If that were the case, then brute forcing a solution would require you to allow each of your 8 values to be (re)used up to 10,061 times.
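To put rough numbers on that scenario (entirely invented, just to check the arithmetic): give the 100.6 g part a 0.10% tolerance and the 0.01 g part a fractionally tighter 0.0994%, and treat worst-case errors as simply accumulating:

#! perl -slw
use strict;
# Invented tolerances purely to illustrate the scenario described above.
my $tolBig   = 100.6 * 0.0010;      # 0.1006 g     (0.10%   of 100.6)
my $tolSmall =  0.01 * 0.000994;    # 0.00000994 g (0.0994% of 0.01)

# Worst-case accumulated tolerance of the two ways to reach 100.61 g:
printf "10061 x 0.01 g          : +/- %.6f g\n", 10061 * $tolSmall;
printf "1 x 100.6 g + 1 x 0.01 g: +/- %.6f g\n", $tolBig + $tolSmall;

Under those made-up numbers the 10,061-part route really does come out (marginally) tighter, which is exactly the situation that forces the huge reuse counts.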
Now, if -M=3 would generate 16.7 million possibilities, imagine the size of the numbers that -M=10,061 would generate! (And the time required to generate them all.)
At this point you may be thinking this is impossible.
But ... the saving grace may be those tolerances.
If you told us the maximum final tolerance and the per-part tolerances, then we could limit the number of each individual part by simply dividing the maximum final tolerance by the tolerance per part. If the tolerance values are real(istic), rather than faux values chosen to deliberately complicate the solution, then those tolerances could constrain the per-component maximum reuse counts such that the brute-force time/memory requirements stay within the bounds of reality.
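As a minimal sketch of that capping step -- the %tol values and $maxTotalTol below are pure invention, standing in until the real tolerances are posted:

#! perl -slw
use strict;
# Hypothetical per-part tolerances in grams; the real values must come from the OP.
my %tol = (
    '0.01' => 0.0005, '0.12' => 0.001, '1.1'  => 0.002, '2.2'   => 0.005,
    '5.3'  => 0.01,   '10.5' => 0.02,  '50.1' => 0.05,  '100.6' => 0.1,
);
my $maxTotalTol = 0.5;    # hypothetical maximum tolerance allowed on the final total

# Worst case, every use of a part adds its full tolerance to the total, so no
# part is worth considering more than maxTotalTol / partTol times. Those caps
# are what keep the brute-force search space finite and (maybe) tractable.
for my $part ( sort { $a <=> $b } keys %tol ) {
    my $cap = int( $maxTotalTol / $tol{ $part } + 1e-9 );    # epsilon guards against FP truncation
    printf "%7s g : at most %4d uses\n", $part, $cap;
}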
Bottom line: if this is a real problem you need to solve -- rather than an academic exercise designed to frustrate solution -- then if you supply us with the full criteria, a reasonable (time/accuracy/machine specification) solution might be possible.
However, if this is an academic exercise designed to create a plausible, but ultimately insoluble, problem, then all you are doing by posting it is wasting all our time.
And that is the ultimate sin on a site that prides itself on providing solutions, rather than airy-fairy allusions to possible academic approaches to the problem, without either code or even an assessment of the probability that such academic objets d'art might lead to an at least feasible, if not practical, solution.
With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
In the absence of evidence, opinion is indistinguishable from prejudice.