maven - Error building Hadoop 2.6 on Windows 8


I'm following this tutorial to build and install Hadoop: http://www.srccodes.com/p/article/38/build-install-configure-run-apache-hadoop-2.2.0-microsoft-windows-os

However, when I run the command below from the VS2010 command prompt:

 mvn package -Pdist,native-win -DskipTests -Dtar

I get the error below:

main:
    [mkdir] Skipping C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\native because it already exists.
     [exec] Current OS is Windows 8.1
     [exec] Executing 'cmake' with arguments:
     [exec] 'C:\hdfs\hadoop-hdfs-project\hadoop-hdfs/src/'
     [exec] '-DGENERATED_JAVAH=C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target/native/javah'
     [exec] '-DJVM_ARCH_DATA_MODEL=64'
     [exec] '-DREQUIRE_LIBWEBHDFS=false'
     [exec] '-DREQUIRE_FUSE=false'
     [exec] '-G'
     [exec] 'Visual Studio 10 Win64'
     [exec]
     [exec] The ' characters around the executable and arguments are
     [exec] not part of the command.
Execute:Java13CommandLauncher: Executing 'cmake' with arguments:
'C:\hdfs\hadoop-hdfs-project\hadoop-hdfs/src/'
'-DGENERATED_JAVAH=C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target/native/javah'
'-DJVM_ARCH_DATA_MODEL=64'
'-DREQUIRE_LIBWEBHDFS=false'
'-DREQUIRE_FUSE=false'
'-G'
'Visual Studio 10 Win64'

The ' characters around the executable and arguments are
not part of the command.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [  1.781 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  1.333 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.030 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.375 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  2.104 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  6.628 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  1.047 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  1.173 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  1.594 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [ 59.046 s]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  1.905 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [  6.491 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.150 s]
[INFO] Apache Hadoop HDFS ................................. FAILURE [ 19.351 s]
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SKIPPED
[INFO] hadoop-yarn ........................................ SKIPPED
[INFO] hadoop-yarn-api .................................... SKIPPED
[INFO] hadoop-yarn-common ................................. SKIPPED
[INFO] hadoop-yarn-server ................................. SKIPPED
[INFO] hadoop-yarn-server-common .......................... SKIPPED
[INFO] hadoop-yarn-server-nodemanager ..................... SKIPPED
[INFO] hadoop-yarn-server-web-proxy ....................... SKIPPED
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SKIPPED
[INFO] hadoop-yarn-server-resourcemanager ................. SKIPPED
[INFO] hadoop-yarn-server-tests ........................... SKIPPED
[INFO] hadoop-yarn-client ................................. SKIPPED
[INFO] hadoop-yarn-applications ........................... SKIPPED
[INFO] hadoop-yarn-applications-distributedshell .......... SKIPPED
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SKIPPED
[INFO] hadoop-yarn-site ................................... SKIPPED
[INFO] hadoop-yarn-registry ............................... SKIPPED
[INFO] hadoop-yarn-project ................................ SKIPPED
[INFO] hadoop-mapreduce-client ............................ SKIPPED
[INFO] hadoop-mapreduce-client-core ....................... SKIPPED
[INFO] hadoop-mapreduce-client-common ..................... SKIPPED
[INFO] hadoop-mapreduce-client-shuffle .................... SKIPPED
[INFO] hadoop-mapreduce-client-app ........................ SKIPPED
[INFO] hadoop-mapreduce-client-hs ......................... SKIPPED
[INFO] hadoop-mapreduce-client-jobclient .................. SKIPPED
[INFO] hadoop-mapreduce-client-hs-plugins ................. SKIPPED
[INFO] Apache Hadoop MapReduce Examples ................... SKIPPED
[INFO] hadoop-mapreduce ................................... SKIPPED
[INFO] Apache Hadoop MapReduce Streaming .................. SKIPPED
[INFO] Apache Hadoop Distributed Copy ..................... SKIPPED
[INFO] Apache Hadoop Archives ............................. SKIPPED
[INFO] Apache Hadoop Rumen ................................ SKIPPED
[INFO] Apache Hadoop Gridmix .............................. SKIPPED
[INFO] Apache Hadoop Data Join ............................ SKIPPED
[INFO] Apache Hadoop Ant Tasks ............................ SKIPPED
[INFO] Apache Hadoop Extras ............................... SKIPPED
[INFO] Apache Hadoop Pipes ................................ SKIPPED
[INFO] Apache Hadoop OpenStack support .................... SKIPPED
[INFO] Apache Hadoop Amazon Web Services support .......... SKIPPED
[INFO] Apache Hadoop Client ............................... SKIPPED
[INFO] Apache Hadoop Mini-Cluster ......................... SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............. SKIPPED
[INFO] Apache Hadoop Tools Dist ........................... SKIPPED
[INFO] Apache Hadoop Tools ................................ SKIPPED
[INFO] Apache Hadoop Distribution ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:47 min
[INFO] Finished at: 2015-03-28T21:18:11+00:00
[INFO] Final Memory: 78M/363M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "native-bin" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\native"): CreateProcess error=2, The system cannot find the file specified
[ERROR] around Ant part ...<exec failonerror="true" dir="C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake">... @ 5:107 in C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\native"): CreateProcess error=2, The system cannot find the file specified
around Ant part ...<exec failonerror="true" dir="C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake">... @ 5:107 in C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:216)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
    at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
    at org.apache.maven.cli.MavenCli.execute(MavenCli.java:862)
    at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:286)
    at org.apache.maven.cli.MavenCli.main(MavenCli.java:197)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\native"): CreateProcess error=2, The system cannot find the file specified
around Ant part ...<exec failonerror="true" dir="C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake">... @ 5:107 in C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml
    at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:355)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
    ... 20 more
Caused by: C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml:5: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\native"): CreateProcess error=2, The system cannot find the file specified
    at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:675)
    at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:498)
    at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
    at sun.reflect.GeneratedMethodAccessor19.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
    at org.apache.tools.ant.Task.perform(Task.java:348)
    at org.apache.tools.ant.Target.execute(Target.java:390)
    at org.apache.tools.ant.Target.performTasks(Target.java:411)
    at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1399)
    at org.apache.tools.ant.Project.executeTarget(Project.java:1368)
    at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:327)
    ... 22 more
Caused by: java.io.IOException: Cannot run program "cmake" (in directory "C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\native"): CreateProcess error=2, The system cannot find the file specified
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
    at java.lang.Runtime.exec(Runtime.java:620)
    at org.apache.tools.ant.taskdefs.Execute$Java13CommandLauncher.exec(Execute.java:862)
    at org.apache.tools.ant.taskdefs.Execute.launch(Execute.java:481)
    at org.apache.tools.ant.taskdefs.Execute.execute(Execute.java:495)
    at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:631)
    at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:672)
    ... 34 more
Caused by: java.io.IOException: CreateProcess error=2, The system cannot find the file specified
    at java.lang.ProcessImpl.create(Native Method)
    at java.lang.ProcessImpl.<init>(ProcessImpl.java:386)
    at java.lang.ProcessImpl.start(ProcessImpl.java:137)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
    ... 40 more

I don't understand this error at all. I have CMake installed, and C:\cmake\bin is added to the PATH as well.

Hmm... this error makes me think you have a typo in your Maven command:

[WARNING] The requested profile "native-bin" could not be activated because it does not exist.

Check your command; make sure it says native-win, not native-bin.
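
For comparison, this is roughly how the standard invocation reads once the casing is right (a sketch; keep whatever extra flags you actually need):

 mvn package -Pdist,native-win -DskipTests -Dtar

Maven profile names are case-sensitive, and a misspelled profile (native-bin) is simply reported as "does not exist" rather than failing the build outright, which is why the warning is easy to miss.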

I wrote a guide recently on building Hadoop on Server 2012 R2 (the steps should be identical on 8.1). Let me know how it goes.
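
As for the cmake part of the stack trace: CreateProcess error=2 just means Windows could not find the cmake executable from that particular prompt. A quick sanity check from the same VS2010 command prompt, assuming CMake is installed under C:\cmake as you describe:

 rem Should print the full path, e.g. C:\cmake\bin\cmake.exe
 where cmake
 rem Should print the CMake version if it resolves from this prompt
 cmake --version
 rem If both fail, add it to PATH for this session and retry the build
 set PATH=%PATH%;C:\cmake\bin

A prompt opened before PATH was edited won't see the change, so re-opening the VS2010 command prompt (or setting PATH in the current session as above) is often all it takes.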

