ssh-keygen -t dsa   # it will prompt for a passphrase and for the name of the key file
1. Suppose the file name is mykey. It will generate two files: mykey and mykey.pub.
2. A hidden .ssh folder is also created (if it does not already exist); the known_hosts file appears in it once you have connected to a host.
3. Copy these two files into the .ssh folder:
cd .ssh
cp ../mykey ../mykey.pub .
4. Keep this folder with the three files for future use.
5. Once the admin gives you access to the server, type:
ssh alamt@cluster.cbrc.kaust.edu.sa
The first time it will prompt for a password; after that it will not, provided your public key has been installed on the server (see the sketch below).
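Putting the key setup together, a minimal sketch assuming standard OpenSSH tools (if ssh-copy-id is not available, appending the contents of mykey.pub to ~/.ssh/authorized_keys on the server by hand does the same thing; note that newer OpenSSH versions prefer -t rsa or -t ed25519 over dsa):
ssh-keygen -t dsa -f mykey                                 # creates mykey and mykey.pub
ssh-copy-id -i mykey.pub alamt@cluster.cbrc.kaust.edu.sa   # installs the public key on the server; asks for your password once
ssh -i mykey alamt@cluster.cbrc.kaust.edu.sa               # later logins need no password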
For cluster
==================
1. cd user
2. cd DMF6
3. cd ../user/alamt
4. cp -rf DMF6 ../alamt/.                     # copy the DMF6 folder into your own home directory
5. cp run_job_vlad.sh run_job_tanvir.sh       # start from a copy of Vlad's run script
6. chmod a+x run_job_tanvir.sh                # make it executable
7. ./run_job_tanvir.sh node25 CGI tanvir_01   # arguments: node, dataset, run tag (see the sketch after this list)
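The contents of run_job_vlad.sh are not shown in these notes. Purely as an illustration, here is what a wrapper taking <node> <dataset> <tag> might look like; the pipeline command dmf6_pipeline and the log redirection are hypothetical, and only the argument order and the <dataset>.<tag>.results naming (used in the scp example below) come from the notes:
#!/bin/sh
# usage: ./run_job_tanvir.sh <node> <dataset> <tag>  -- hypothetical sketch, not the real script
NODE=$1
DATASET=$2
TAG=$3
# launch the (hypothetical) pipeline on the chosen node; results are assumed
# to land in ~/DMF6/$DATASET.$TAG.results, matching the scp example below
ssh "$NODE" "cd ~/DMF6 && ./dmf6_pipeline $DATASET $TAG" > "$DATASET.$TAG.log" 2>&1 &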
scp: copy files from remote --> local
======================
Open a terminal on the local machine. The general form is:
scp source destination
> scp alamt@cluster.cbrc.kaust.edu.sa:/home/alamt/DMF6/CGI.tanvir_01.results/*.rank /home/tanviralam/tst
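To pull the whole results directory instead of only the .rank files, scp's -r flag copies recursively:
> scp -r alamt@cluster.cbrc.kaust.edu.sa:/home/alamt/DMF6/CGI.tanvir_01.results /home/tanviralam/tst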
scp: copy files from local --> remote
======================
scp local_file user@host:remote_path
> scp /home/tanviralam/myfile.txt alamt@cluster.cbrc.kaust.edu.sa:/home/alamt/store/myfile.txt
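Shell wildcards work on the local side too, so several files can be pushed in one command:
> scp /home/tanviralam/*.txt alamt@cluster.cbrc.kaust.edu.sa:/home/alamt/store/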
sftp: connect to the remote PC using a GUI
======================
1. Log in once using ssh alamt@cluster.cbrc.kaust.edu.sa.
2. Type the following in the location bar of your file manager:
sftp://alamt@cluster.cbrc.kaust.edu.sa
3. You can now browse and use the remote machine's files through the GUI.
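sftp also works from a terminal without any GUI; inside the interactive session, get downloads and put uploads (the file names below are made up for illustration):
> sftp alamt@cluster.cbrc.kaust.edu.sa
sftp> get DMF6/CGI.tanvir_01.results/top.rank
sftp> put myfile.txt
sftp> bye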
Cluster commands
==================
1. See the number of nodes in the cluster:
bhosts
2. See the number of processors on a node (a loop over several nodes is sketched after this list):
ssh node25 "cat /proc/cpuinfo | grep -c processor"
Or you can log in to the node first and then check it there. For example, to log in to node25:
type> ssh node25
type> cat /proc/cpuinfo | grep -c processor
3. See the running jobs:
type> top
(top shows the processes on the node you are logged into; on an LSF cluster, which bhosts suggests this is, bjobs lists your submitted batch jobs.)
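The per-node processor check from item 2 extends to a small shell loop; the node names here are assumed, so adjust the list to your cluster:
for n in node24 node25 node26; do
    echo -n "$n: "
    ssh "$n" "grep -c processor /proc/cpuinfo"
done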