When running a crawl job with Nutch 1.8 + Hadoop 2.2.1, I get the following error:

15/01/12 22:35:23 ERROR crawl.Injector: Injector: java.lang.IllegalArgumentException: Wrong FS: hdfs://192.168.137.131:9000/user/haduser/crawld/crawldb/1501539946, expected: file:///
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:642)
        at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:69)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:516)
        at org.apache.hadoop.fs.FileSystem.isDirectory(FileSystem.java:1410)
        at org.apache.hadoop.fs.ChecksumFileSystem.rename(ChecksumFileSystem.java:496)
        at org.apache.nutch.crawl.CrawlDb.install(CrawlDb.java:159)
        at org.apache.nutch.crawl.Injector.inject(Injector.java:295)
        at org.apache.nutch.crawl.Injector.run(Injector.java:316)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.nutch.crawl.Injector.main(Injector.java:306)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Does anyone know how to fix this?
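In case it helps narrow things down: the "expected: file:///" part of the message suggests the JVM running the Injector is falling back to the default local filesystem, i.e. its Configuration is not picking up core-site.xml, so hdfs:// paths fail the FileSystem.checkPath scheme comparison. A typical Hadoop 2.x core-site.xml entry would look like the sketch below (the host and port are copied from the error message above; whether this matches the actual cluster config is an assumption):

```xml
<!-- core-site.xml: must be on the classpath of the process that runs the Nutch job -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- host/port taken from the error message; assumed to be the NameNode address -->
    <value>hdfs://192.168.137.131:9000</value>
  </property>
</configuration>
```

If this property is set correctly on the cluster, the same symptom can still appear when the job is launched with a HADOOP_CONF_DIR that does not contain this file.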