
Spark on Windows 10 not working

Ask Time: 2016-09-03T00:15:08    Author: Marko Taht


I'm trying to get Spark working on Windows 10. When I try to run spark-shell I get this error:

'Spark\spark-2.0.0-bin-hadoop2.7\bin..\jars""\ is not recognized as an internal or external command, operable program or batch file.

Failed to find Spark jars directory. You need to build Spark before running this program.

I am using a pre-built Spark for Hadoop 2.7 or later. I have installed Java 8, Eclipse Neon, Python 2.7, and Scala 2.11, and I got winutils for Hadoop 2.7.1. And I still get this error.

When I downloaded Spark it came as a .tgz; when extracted there was another .tgz inside, so I extracted that too, and then I got all the bin folders and so on. I need to access spark-shell. Can anyone help?
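For reference, the usual Windows setup looks roughly like this from the command prompt. This is a minimal sketch; the extract locations C:\Spark\spark-2.0.0-bin-hadoop2.7 and C:\hadoop (holding bin\winutils.exe) are assumptions, not paths confirmed by the question:

rem assumed extract locations - adjust to wherever the archives actually landed
set SPARK_HOME=C:\Spark\spark-2.0.0-bin-hadoop2.7
rem HADOOP_HOME must be the folder that contains bin\winutils.exe
set HADOOP_HOME=C:\hadoop
set PATH=%PATH%;%SPARK_HOME%\bin;%HADOOP_HOME%\bin
rem launch the interactive shell
spark-shell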

EDIT: the solution I ended up using:

1) VirtualBox

2) Linux Mint

Author: Marko Taht. Reproduced under the CC 4.0 BY-SA copyright license with a link to the original source and this disclaimer.
Link to original article: https://stackoverflow.com/questions/39296802/spark-on-windows-10-not-working
Priti Singh :

I got the same error while building Spark. You can move the extracted folder to C:\.

Refer this:
http://techgobi.blogspot.in/2016/08/configure-spark-on-windows-some-error.html
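In other words, reinstall Spark into a path with no spaces; a space somewhere in the install path is a plausible cause of the mangled bin..\jars path quoted in the question. A minimal sketch, where the Downloads source path and user name are purely assumptions:

rem move the extracted folder out of a path that may contain spaces
move "C:\Users\yourname\Downloads\spark-2.0.0-bin-hadoop2.7" C:\Spark
cd /d C:\Spark\bin
spark-shell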
2016-09-24T15:00:17
Ani Menon :

You are probably giving the wrong folder path to Spark bin.

Just open the command prompt and change directory to the bin inside the spark folder.

Type spark-shell to check.

Refer: Spark on win 10
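Concretely, the check looks like this; the extract location below is an assumption, so substitute the real one:

rem change into the bin directory inside the extracted Spark folder
cd /d C:\Spark\spark-2.0.0-bin-hadoop2.7\bin
rem a Spark banner followed by a scala> prompt means the shell works
spark-shell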
2016-09-03T18:18:22