I am new to the Hadoop world and have just started learning about Hadoop.
I am getting the error below while importing data from MySQL to HDFS using Sqoop:
sqoop:000> sqoop import --connect jdbc:mysql://localhost/books --username root --password thanks --table authors --m 1;
Exception has occurred during processing command
Exception: org.codehaus.groovy.control.MultipleCompilationErrorsException Message: startup failed:
groovysh_parse: 1: expecting EOF, found 'import' @ line 1, column 7.
sqoop import --connect jdbc:mysql://localhost/books --username root --password thanks --table authors --m 1;
^
1 error
Could you help me fix this error?
Mb-One:
It seems that you are using Sqoop2, but the command you typed uses Sqoop1 syntax, which the Sqoop2 shell does not understand (hence the Groovy parse error). In Sqoop2 an import is set up through a connection and a job instead. Follow these steps:

Step 1: check that you have installed Sqoop correctly:

sqoop:000> show version --all

You should get a response something like this:

Server version: Sqoop 2.0.0-SNAPSHOT revision Unknown
  Compiled by jarcec on Wed Nov 21 16:15:51 PST 2012
Client version: Sqoop 2.0.0-SNAPSHOT revision Unknown
  Compiled by jarcec on Wed Nov 21 16:15:51 PST 2012
Protocol version: [1]

Step 2: check which connectors are available on your Sqoop server:

sqoop:000> show connector --all
1 connector(s) to show:
Connector with id 1:
  Name: generic-jdbc-connector
  Class: org.apache.sqoop.connector.jdbc.GenericJdbcConnector
  Supported job types: [EXPORT, IMPORT]

Step 3: create a connection:

sqoop:000> create connection --cid 1
Creating connection for connector with id 1
Please fill following values to create new connection object
Name: First connection

Configuration configuration
JDBC Driver Class: com.mysql.jdbc.Driver
JDBC Connection String: jdbc:mysql://mysql.server/database
Username: sqoop
Password: *****
JDBC Connection Properties:
There are currently 0 values in the map:
entry#

Security related configuration options
Max connections: 0
New connection was successfully created with validation status FINE and persistent id 1

Step 4: now create a job for importing data. At the end it will also ask for extractors and loaders; use 1 as the value for both.

sqoop:000> create job --xid 1 --type import
Creating job for connection with id 1
Please fill following values to create new job object
Name: First job

Database configuration
Table name: users
Table SQL statement:
Table column names:
Partition column name:
Boundary query:

Output configuration
Storage type:
  0 : HDFS
Choose: 0
Output directory: /user/jarcec/users
New job was successfully created with validation status FINE and persistent id 1

Step 5: now start the job:

sqoop:000> start job --jid 1

and import your data.
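Alternatively, if you actually meant to use Sqoop1 (where the one-line import syntax is valid), a sketch of what that would look like, assuming a Sqoop1 installation with the MySQL JDBC driver on its classpath: the command is run from the OS shell, not from the sqoop:000> prompt, and there is no trailing semicolon.

$ sqoop import \
    --connect jdbc:mysql://localhost/books \
    --username root --password thanks \
    --table authors \
    -m 1

Note the single-dash form -m 1 for the number of map tasks; the connection string, credentials, and table name here are the ones from your original command.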
2014-09-11T10:05:26