When importing a CSV file into a table with the same structure on a different machine, using mysqlimport as the MySQL root user, it is common to end up with access-denied error messages.
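As a reference point, here is a minimal sketch of the kind of command involved, with the usual fix. The host, database, and file names are hypothetical; a frequent culprit is the server-side FILE privilege check, which the --local flag sidesteps by reading the file on the client side:

```shell
# Hypothetical host/db/file names; adjust for your setup.
# --local reads the CSV on the client side, avoiding the server-side
# FILE privilege check that commonly produces "access denied" here.
# mysqlimport maps data.csv onto the table named "data" in mydb.
mysqlimport --host=remote-host --user=root -p --local \
    --fields-terminated-by=',' mydb /tmp/data.csv

# Alternatively, grant the missing privilege on the server:
#   GRANT FILE ON *.* TO 'root'@'%'; FLUSH PRIVILEGES;
```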
While building a Node.js application (npm install), I came across some weird issues, which in a nutshell are as follows.
v8.h:336:1: error: expected unqualified-id before ‘using’, node-gyp
The Node.js version I was using was v5.10.1, and the reason for this issue was an incompatible gcc; updating gcc to the correct version solved the problem. The following steps are what I did to fix the issue.
sudo apt-get install python-software-properties
sudo add-apt-repository ppa:ubuntu-toolchain-r/test
sudo apt-get update
sudo apt-get install gcc-5 g++-5
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-5 80 --slave /usr/bin/g++ g++ /usr/bin/g++-5
sudo update-alternatives --config gcc (choose gcc-5 from the list)
Check that the correct versions of gcc and g++ are being used by default by running gcc -v; g++ -v
When we execute a Sqoop export action via Oozie (moving data from HDFS to MySQL) using the <command> tag, we may come across the error java.lang.RuntimeException: Can't parse input data: 'NULL'. The same Sqoop export command, which executed successfully through the terminal, may raise this issue when it is run through Oozie.
The fix for the above issue is to remove all single/double quotes from the command. E.g., instead of --input-null-string 'NULL', we need to use --input-null-string NULL.
Sqoop command: The Sqoop command can be specified either using the command element or multiple arg elements.
- When using the command element, Oozie will split the command on every space into multiple arguments.
- When using the arg elements, Oozie will pass each argument value as an argument to Sqoop. The arg variant should be used when there are spaces within a single argument.
- All the above elements can be parameterized (templatized) using EL expressions.
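To make the command/arg distinction concrete, here is a sketch of an Oozie sqoop action that uses arg elements and the unquoted NULL handling described above. The connection string, table, and export directory are placeholders, not values from the original job:

```xml
<action name="sqoop-export">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <arg>export</arg>
        <arg>--connect</arg>
        <arg>jdbc:mysql://db-host/mydb</arg>
        <arg>--table</arg>
        <arg>mytable</arg>
        <arg>--export-dir</arg>
        <arg>/user/hive/warehouse/mytable</arg>
        <arg>--input-null-string</arg>
        <arg>NULL</arg>
    </sqoop>
    <ok to="end"/>
    <error to="fail"/>
</action>
```

Because each argument is its own arg element, no quoting is needed even when a value contains spaces.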
Also, I think the following Apache documentation on Sqoop is highly relevant for beginners in the Sqoop domain:
The sqoop action runs a Sqoop job synchronously.
- The information to be included in the Oozie sqoop action is the job-tracker, the name-node, and the Sqoop command or arg elements, as well as configuration.
- A prepare node can be included to do any prep work, including HDFS actions. This will be executed prior to execution of the Sqoop job.
- Sqoop configuration can be specified with a file, using the job-xml element, and inline, using the configuration elements.
- Oozie EL expressions can be used in the inline configuration. Property values specified in the configuration element override values specified in the job-xml file.
Note that Hadoop mapred.job.tracker and fs.default.name properties must not be present in the inline configuration. As with Hadoop map-reduce jobs, it is possible to add files and archives in order to make them available to the Sqoop job.
This was a bit scary for me, as it was the first time I was facing something like this. I even tried adding the hostname of the Ubuntu server and its IP address directly to my /etc/hosts file, and, further to my frustration, it did not work. The actual fix that worked for me is as follows:
It is common to get a "Page not found" error when you open the above link. Generally this is related to the NameNode of the application. To confirm, type the command jps in a terminal and see if NameNode is present in the output. Most probably it will not be there, and that is why you are not able to launch that URL.
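The jps check can be scripted. The snippet below simulates typical jps output in a variable so the logic is self-contained; on a real machine, replace the simulated value with the output of jps itself:

```shell
# Simulated jps output (on a real cluster: jps_output="$(jps)").
# Note the NameNode line is absent here, matching the failure case.
jps_output="2101 DataNode
2290 SecondaryNameNode
2516 ResourceManager"

# -w matches NameNode as a whole word, so SecondaryNameNode does not count.
if echo "$jps_output" | grep -qw "NameNode"; then
  echo "NameNode is running"
else
  echo "NameNode missing - restart HDFS"
fi
```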
A. As a solution to this error, stop all the services, either by running stop-all.sh (deprecated) or by running stop-dfs.sh and stop-yarn.sh.
B. Then restart Hadoop by typing in start-dfs.sh and start-yarn.sh.
C. Now try executing the jps command and see if NameNode is there. In all probability it will be. Then try http://localhost:50070 in a browser and you should see the NameNode window.
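Steps A to C can be sketched as a single sequence, assuming Hadoop's sbin scripts are on the PATH (adjust paths for your installation):

```shell
# Stop HDFS and YARN (stop-all.sh is deprecated), then bring them back up.
stop-dfs.sh
stop-yarn.sh
start-dfs.sh
start-yarn.sh

# NameNode should now appear in the process list; if it does,
# http://localhost:50070 will load the NameNode web UI.
jps | grep -w NameNode
```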
D. If NameNode is still missing when executing the jps command even after restarting the entire Hadoop system (step C), you will have to do something about the HDFS file system, which has been specified in the previous post. Please go through it if you are interested.