In general, this is what I do to load files onto HDFS using the same profile you are using:
# first, create my HDFS home directory (**NOTE** you'll need to change the username)
sudo /usr/local/hadoop/bin/hdfs dfs -mkdir /user/ballard
sudo /usr/local/hadoop/bin/hdfs dfs -chown ballard /user/ballard
# second, EVERY TIME I open a new shell, I issue the following 5 commands:
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:/usr/local/hadoop/bin:/usr/local/spark/bin
. /usr/local/hadoop/etc/hadoop/hadoop-env.sh
. /usr/local/hadoop/etc/hadoop/yarn-env.sh
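Those settings only last for the current shell session. If you get tired of retyping them, one option (assuming your login shell is bash) is to append the same lines to your ~/.bashrc so every new shell picks them up automatically:

```shell
# Hypothetical ~/.bashrc additions -- paths match the cluster layout above
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:/usr/local/hadoop/bin:/usr/local/spark/bin
. /usr/local/hadoop/etc/hadoop/hadoop-env.sh
. /usr/local/hadoop/etc/hadoop/yarn-env.sh
```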
# then, load some files
hdfs dfs -copyFromLocal some_filename_on_my_local_directory filename_in_hdfs
Since the destination path is relative, the file will land in your HDFS home directory created above (mine is /user/ballard).
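To double-check that the copy worked, you can list your HDFS home directory and peek at the file (this is a sketch that assumes the environment set up above and uses filename_in_hdfs as a placeholder name, same as the copy command):

```shell
# list the contents of your HDFS home directory (e.g. /user/ballard)
hdfs dfs -ls
# print the first few lines of the uploaded file to confirm its contents
hdfs dfs -cat filename_in_hdfs | head
```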
Hope this helps,
-Jeff