HPC cluster with Exceed onDemand visualization tool

Aishwarya Venkatesh on 23 Jul 2019
Answered: Saurabh Chaudhary on 19 Jul 2022
Hi all,
I am working with the fitrgp function in MATLAB, doing some computations on large datasets. The computation time is far too long (parfor does not seem to help much), so I have installed an HPC cluster with Exceed onDemand as the visualization tool. The question is: how do I submit MATLAB scripts as jobs to the HPC cluster in order to reduce the computation time? It would be great if you could give some suggestions.
Thanks a lot in advance!

Answers (2)

Sahithi Kanumarlapudi on 5 Aug 2019
If your MATLAB code is in the file myScript.m, you might use a job submission script like:
#!/bin/bash
#SBATCH -t 2:00                  # walltime limit (here 2 minutes); adjust as needed
#SBATCH -N 1                     # one node
#SBATCH -n 12                    # twelve tasks
#SBATCH --mem-per-cpu=1024       # memory per CPU in MB
#SBATCH -L matlab                # request a MATLAB license
. ~/.profile
module load matlab
matlab -nodisplay -nosplash -r "run('myScript.m'); exit" > myScript.out
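Once the script above is saved (as run_myScript.sh, a name assumed here just for illustration), it can be submitted and monitored from the terminal roughly like this:
# Hypothetical usage, assuming the batch script above is saved as run_myScript.sh:
sbatch run_myScript.sh      # submit the job; Slurm prints a job ID
squeue -u $USER             # check whether the job is pending or running
cat myScript.out            # inspect MATLAB's output once the job completes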

Saurabh Chaudhary on 19 Jul 2022
You can create a Slurm batch file like the one below, save it as xyz_slurm.sl, and submit it by entering sbatch xyz_slurm.sl in the terminal:
#!/bin/bash
#SBATCH --job-name=matlab_test
#SBATCH --time=0-03:00 # adjust this to match the walltime of your job
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=1 # adjust this if you are using parallel commands
#SBATCH --mem=4000 # adjust this according to the memory requirement per node you need
#SBATCH --partition=small
# Choose a version of MATLAB by loading a module:
module load matlab/R2022a
matlab -nodisplay -nosplash -softwareopengl < /pathofyourdirectory/cosplot.m
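If your MATLAB script uses parfor, a rough sketch of a variant that requests several CPU cores and sizes a parallel pool to match the allocation might look like the following; the job name, core count, memory, and script path are assumptions to adapt to your own cluster:
#!/bin/bash
# Hedged sketch: a parallel variant of the script above (all values are assumptions).
#SBATCH --job-name=matlab_parfor
#SBATCH --time=0-03:00
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=8          # one CPU core per parfor worker
#SBATCH --mem=16000                # adjust to your memory needs
#SBATCH --partition=small
module load matlab/R2022a
# Open a pool sized to the Slurm allocation, then run the script:
matlab -nodisplay -nosplash -r "parpool('local', str2double(getenv('SLURM_CPUS_PER_TASK'))); run('/pathofyourdirectory/cosplot.m'); exit"
Because parpool uses the local profile here, all workers run on the single node requested by --nodes=1.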


