ssh - how to automatically run a bash script when my qsub jobs are finished on a server?
I would like to run a script when all of the jobs that I have sent to a server are done.
For example, I send:
ssh server "for i in config*; do qsub ./run 1 $i; done"
and a list of jobs gets started. I would like to automatically start another script on the server to process the output of these jobs once they are all completed.
I would appreciate any advice that helps me avoid the following inelegant solution:
If I save each of the 1000 job IDs from the above call in a separate file, I could check the contents of each file against the current list of running jobs, i.e. the output of a call to:
ssh server qstat
I would only need to check every half hour, but I imagine there is a better way.
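For reference, this polling approach can be sketched in a few lines, assuming the job IDs were collected into one file (one ID per line) at submission time; `process_output.sh` is a hypothetical post-processing script:

```shell
#!/bin/bash
# wait_for_jobs: block until none of the saved job IDs still
# appear in `qstat` output, checking every half hour.
wait_for_jobs() {
    local ids_file=$1
    while qstat 2>/dev/null | grep -q -f "$ids_file"; do
        sleep 1800
    done
}

# usage (on the server):
#   wait_for_jobs jobids.txt && ./process_output.sh
```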
It depends a bit on which job scheduler you're using and which version, but there's another approach that can be taken if your results-processing can also be done on the same queue as the jobs.
One quite handy way of managing lots of related jobs in more recent versions of Torque (and Grid Engine, and others) is to launch the individual jobs as a job array (cf. http://docs.adaptivecomputing.com/torque/4-1-4/content/topics/commands/qsub.htm#-t). This requires mapping the individual runs to numbers somehow, which may or may not be convenient; but if you can do it for your jobs, it does simplify managing them: you can qsub them all in one line, and you can qdel or qhold them all at once (while still having the capability to deal with jobs individually).
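As a sketch, the per-config invocation from the question could be wrapped in an array-aware script and submitted as one array; the name `run-array.sh` and the 1-1000 range are assumptions here:

```shell
#!/bin/bash
# Sketch: wrap the per-config run in a script that reads the
# Torque array index, then submit the whole set as one job array.
cat > run-array.sh <<'EOF'
#!/bin/bash
# Each array element handles one config file via $PBS_ARRAYID.
./run 1 "config${PBS_ARRAYID}"
EOF
chmod +x run-array.sh

# Submit all 1000 runs in one line (commented out here,
# since it needs a Torque installation):
#   qsub -t 1-1000 run-array.sh
```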
If you do this, you could then submit an analysis job that has a dependency on the array of jobs, so that it will only run once all of the jobs in the array are complete (cf. http://docs.adaptivecomputing.com/torque/4-1-4/content/topics/commands/qsub.htm#dependencyexamples). Submitting the job would look like:
qsub analyze.sh -W depend=afterokarray:427[]
where analyze.sh contains the script for the analysis, and 427 is the job ID of the array of jobs launched above. (The [] means to run only after all jobs in the array are completed.) The syntax differs for other schedulers (e.g., SGE/OGE) but the ideas are the same.
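Putting the two pieces together, a minimal sketch of the full flow might look like this (Torque syntax; the script names and the 1-1000 range are assumptions):

```shell
#!/bin/bash
# Sketch: submit the array, capture its job id, then queue the
# analysis job to run only after every array element succeeds.
submit_with_analysis() {
    local array_id
    # qsub prints the full id of an array job, e.g. "427[].server";
    # keep just the "427[]" part for the dependency spec.
    array_id=$(qsub -t 1-1000 run-array.sh | cut -d. -f1)
    qsub -W depend="afterokarray:${array_id}" analyze.sh
}
```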
Getting this right can take some doing, and Tristan's approach has the advantage of being simple and working with any scheduler; but learning to use job arrays may well be worth your time if you'll be doing a lot of this.