hadoop java.net.URISyntaxException: Relative path in absolute URI: rsrc:hbase-common-0.98.1-hadoop2.jar

Date: 2023-05-04
This article covers how to deal with "hadoop java.net.URISyntaxException: Relative path in absolute URI: rsrc:hbase-common-0.98.1-hadoop2.jar" and should serve as a useful reference for anyone hitting the same problem.

Problem description

I have a map reduce job that connects to HBASE and I can't figure out where I am running into this error:

Exception in thread "main" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader.main(JarRsrcLoader.java:58)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: rsrc:hbase-common-0.98.1-hadoop2.jar
        at org.apache.hadoop.fs.Path.initialize(Path.java:206)
        at org.apache.hadoop.fs.Path.<init>(Path.java:172)
        at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.findOrCreateJar(TableMapReduceUtil.java:703)
        at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addDependencyJars(TableMapReduceUtil.java:656)
        at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addHBaseDependencyJars(TableMapReduceUtil.java:573)
        at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addDependencyJars(TableMapReduceUtil.java:617)
        at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:398)
        at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:356)
        at com.ancestry.bigtree.hfile.JsonToHFileDriver.run(JsonToHFileDriver.java:117)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at com.ancestry.bigtree.hfile.JsonToHFileDriver.main(JsonToHFileDriver.java:69)
        ... 10 more
Caused by: java.net.URISyntaxException: Relative path in absolute URI: rsrc:hbase-common-0.98.1-hadoop2.jar
        at java.net.URI.checkPath(URI.java:1804)
        at java.net.URI.<init>(URI.java:752)
        at org.apache.hadoop.fs.Path.initialize(Path.java:203)

If I don't have Hbase libraries, the job runs fine. Where is the relative path being generated? How can I force the generated paths to be absolute?

In my code I have these two lines:

TableMapReduceUtil.addHBaseDependencyJars(conf);
HFileOutputFormat2.configureIncrementalLoad(job, htable);
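
For context, calls like these normally sit inside a Tool-style driver such as the JsonToHFileDriver named in the stack trace. Below is a minimal, hypothetical sketch of what such a driver might look like against the HBase 0.96/0.98-era API; the class name, argument layout, and the mapper (assumed to emit ImmutableBytesWritable/Put pairs, and not shown) are placeholders rather than the asker's actual code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

/** Hypothetical driver sketch: JSON input to HFiles for bulk loading. */
public class JsonToHFileDriverSketch extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // args[0] = input dir, args[1] = HFile output dir, args[2] = target table
        Configuration conf = HBaseConfiguration.create(getConf());
        Job job = Job.getInstance(conf, "json-to-hfile");
        job.setJarByClass(JsonToHFileDriverSketch.class);

        // A mapper emitting (ImmutableBytesWritable rowKey, Put) is assumed but not shown.
        // job.setMapperClass(JsonToPutMapper.class);
        job.setMapOutputKeyClass(ImmutableBytesWritable.class);
        job.setMapOutputValueClass(Put.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // The two calls from the question: ship the HBase jars with the job and
        // configure the partitioner, reducer and output format for incremental load.
        HTable htable = new HTable(conf, args[2]);
        TableMapReduceUtil.addHBaseDependencyJars(job.getConfiguration());
        HFileOutputFormat2.configureIncrementalLoad(job, htable);
        htable.close();

        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new JsonToHFileDriverSketch(), args));
    }
}

Note that, as the stack trace shows, configureIncrementalLoad itself ends up in TableMapReduceUtil.addDependencyJars and findOrCreateJar, which is where the Path constructor is invoked.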

If I remove them I am OK, but then the job does not do what I need it to do. I am ultimately trying to create HFiles to use with the HBase bulk loader.

Environment: HBase 0.96.1.2.0.10.0-1-hadoop2, Hadoop 2.2.0.2.0.10.0-1

Thank you in advance for any help or direction.

Recommended answer

The exception is a bit misleading; there's no real relative path being parsed. The issue here is that Hadoop's "Path" doesn't support ':' in filenames. In your case, "rsrc:hbase-common-0.98.1-hadoop2.jar" is being interpreted with "rsrc" as the URI "scheme", whereas I suspect you really intended to add the resource "file:///path/to/your/jarfile/rsrc:hbase-common-0.98.1-hadoop2.jar". Here's an old JIRA discussing the illegal character:

https://issues.apache.org/jira/browse/HADOOP-3257
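
To make the parsing behavior concrete, here is a small standalone sketch (the class name is made up for illustration, and it assumes hadoop-common is on the classpath) showing that Path treats everything before the first ':' as a URI scheme, so the bare jar-in-jar name fails while an absolute file: URI or a percent-escaped name gets past the constructor:

import org.apache.hadoop.fs.Path;

// Illustrative only: how org.apache.hadoop.fs.Path parses names containing ':'.
public class PathSchemeDemo {
    public static void main(String[] args) {
        try {
            // "rsrc" is taken as the URI scheme, leaving a relative path inside
            // an absolute URI, which triggers the exception from the question.
            new Path("rsrc:hbase-common-0.98.1-hadoop2.jar");
        } catch (IllegalArgumentException e) {
            System.out.println("Failed as expected: " + e.getMessage());
        }

        // A real scheme plus an absolute path parses without complaint.
        System.out.println(new Path("file:///tmp/hbase-common-0.98.1-hadoop2.jar"));

        // With the ':' percent-escaped there is no scheme at all, so the Path is
        // accepted, but whatever consumes it later may not resolve it as intended.
        System.out.println(new Path("rsrc%3Ahbase-common-0.98.1-hadoop2.jar"));
    }
}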

Note that you probably won't be able to use that absolute path either, since it still has ':' in the filename. You can try escaping the filename as "rsrc%3Ahbase-common-0.98.1-hadoop2.jar", but then it may not be found correctly on the other end where it is used.

The best way to fix this is to tackle the root cause of "rsrc:hbase-common-0.98.1-hadoop2.jar" being introduced in the first place; using Eclipse to build your runnable jar is one likely cause of the issue. If possible, try building your jar with something other than Eclipse and see if the same problem occurs; you can also try selecting "Package required libraries into generated jar" when creating your jar in Eclipse.

If the uber-jar ends up too large, you can also place the original dependency jars (such as hbase-common-0.98.1-hadoop2.jar) on the classpath of all your nodes, along with any other dependencies you may need, and then skip the call to TableMapReduceUtil.addHBaseDependencyJars(conf).

Here's an old thread of another user running into a similar problem to the one you're seeing:

http://lucene.472066.n3.nabble.com/Error-while-running-MapR-program-on-multinode-configuration-td4053610.html
