This looks like a similar problem to one of my posts,
Re: match with elements in array. I recently ran some benchmarks with 100 million hash keys: it takes about 90 seconds on my not-so-fast machine to create a hash that big, and about 25 seconds on a more up-to-date machine.
If the hash table fits into memory without paging, running 10^7 queries will be very fast once the table is built. For the simplest approach to work, what matters is the number of unique values. Of course, some encoding of the ranges is possible (the problem above happened to have only non-overlapping ranges, but the hash value could be an array of range IDs, or a single ID that represents multiple ranges). Out of curiosity, how many unique values do you have that fall within one or more of the valid ranges?
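To sketch what I mean (the ranges here are made up for illustration): precompute one table entry per unique value, mapping it to the ID of every range containing it, so overlapping ranges just mean a list with more than one ID. Each query is then a single hash lookup.

```python
# Hypothetical ranges as (range_id, low, high), inclusive bounds.
# Ranges 0 and 1 deliberately overlap to show the list-of-IDs encoding.
ranges = [(0, 10, 20), (1, 15, 30), (2, 100, 105)]

# Build the table: one key per unique value, value is a list of range IDs.
lookup = {}
for rid, low, high in ranges:
    for value in range(low, high + 1):
        lookup.setdefault(value, []).append(rid)

# Each query is then one O(1) hash lookup, no matter how many ranges exist.
print(lookup.get(17))   # in ranges 0 and 1 -> [0, 1]
print(lookup.get(50))   # in no range -> None
```

Memory is the trade-off: the table size scales with the number of unique values covered by the ranges, not with the number of ranges, which is why that count is the key question.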