It has started working now.  I don't know what I changed.  I dropped every single table from Hive, explicitly created a new directory on HDFS, moved the .csv file to that directory, ran Hive again, and created the table.  This time it worked: I can run queries against the directory.  Maybe Hadoop and Hive got confused about the old directory (it was corrupted or something)...or maybe I screwed something up, I dunno.
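For reference, the sequence that ended up working can be sketched roughly as follows.  The paths, table name, and column layout here are placeholders of my own invention, not the actual ones:

```shell
# Create a fresh directory on HDFS and copy the CSV into it
# (all names below are hypothetical).
hadoop fs -mkdir /user/keith/mydata
hadoop fs -put mydata.csv /user/keith/mydata/

# Recreate the table in Hive as an external table over that directory,
# then run a quick query to confirm rows actually come back.
hive -e "
  DROP TABLE IF EXISTS mydata;
  CREATE EXTERNAL TABLE mydata (col1 STRING, col2 INT)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
  LOCATION '/user/keith/mydata';
  SELECT COUNT(*) FROM mydata;
"
```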

I would have expected better error detection.  Instead of simply returning zero-row results, it would be nice if Hive actually produced an error message when a table is created in an invalid or incorrect fashion...but perhaps it couldn't tell; maybe the database just looked empty at Hive's level of abstraction.

I mean, even if I did screw something up (a possibility I am entirely open to), I never got an error about it.  Hive gladly wrapped a "hive" table around the directory, and the CSV file in question, without an error of any kind.  Hive could see the table; it could list and describe it, but queries against it returned empty results.  When I moved the exact same CSV file to a brand-new HDFS directory and tried again from scratch, everything started working.  I probably did something wrong, but some sort of error message would have been very helpful.
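For anyone hitting the same silent-empty-results symptom, one diagnostic is to compare where Hive *thinks* the data lives against what is actually in that directory.  This is just a sketch; the table name and path are placeholders:

```shell
# Show the table's metadata, including the HDFS Location field.
hive -e "DESCRIBE EXTENDED mydata;"

# Then list that location directly.  If it is empty, or points somewhere
# other than where the CSV actually sits, queries will return zero rows
# without any error.
hadoop fs -ls /user/keith/mydata
```

If the Location reported by Hive and the directory holding the file disagree, that would explain empty queries with no error, since Hive never validates that a location contains readable data.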

Anyway, these tools are still pretty young.  I understand that they will continue to evolve.  The ability to detect and report errors will almost certainly improve with time.
Thanks for the concerted efforts to help.


Keith Wiley     [EMAIL PROTECTED]

"What I primarily learned in grad school is how much I *don't* know.
Consequently, I left grad school with a higher ignorance to knowledge ratio than
when I entered."
                                           --  Keith Wiley