Sorry, no idea. I haven't tried. (Most likely it doesn't exist.)
Loading your file into postgres wasn't hard; I just tried it. YMMV, of course, especially if you don't have postgresql installed yet.
In case it helps, here is a quick-'n-dirty load of your file into a (postgres!) table.
    #!/bin/bash
    file=KJV_fixed.csv
    schema=public
    table=kjv
    t=$schema.$table

    # (re)create the target table
    echo "
      drop table if exists $t ;
      create table if not exists $t
        ( recordnum serial PRIMARY KEY
        , book int
        , chapter int
        , verse int
        , text text
        ) ;
    " | psql -X

    # turn the first three commas into tabs (limit 4 keeps commas
    # inside the verse text intact), then COPY the result in
    < $file perl -ne 'chomp; my @arr = split(/[,]/, $_, 4); print join("\t", @arr), "\n";' \
    | psql -c "
      copy $t(book, chapter, verse, text) from stdin (format csv, header false, delimiter E'\t') ;
      analyze $t ;
    "
Output from that is:
    DROP TABLE
    CREATE TABLE
    Timing is on.
    ANALYZE
    Time: 326.648 ms