[[_TOC_]]
## Pyrene front-end (login01)
Interactive work on the Pyrene front-end `login01` is dedicated to tasks with a short walltime and low CPU-time and memory requirements, such as:

* Text editing.
* Code compilation and (short) execution.
* Graphical applications.
* Batch job preparation and submission.
* Manipulating data and result files.
* Sharing temporary files (**for less than 1 day**) in the /tmp space.
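
Sharing a file through /tmp can be as simple as copying it there and making it world-readable. A minimal sketch (the filename is illustrative; remember to delete the file within a day):

```shell
# Write a result file to /tmp and make it readable by other users.
# "shared_results.dat" is an example name; delete the file within 1 day.
echo "some results" > /tmp/shared_results.dat
chmod o+r /tmp/shared_results.dat
ls -l /tmp/shared_results.dat
```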
For instance:
```
# The gedit text editor
gedit &

# GaussView
module load gaussview/6.0
gaussview &

# MATLAB
module load MATLAB/R2020b
matlab &
```
Some limits apply on the Pyrene front-end:

* Instant usage: 4 CPUs and 8 GB of RAM per user.
* 2 hours of CPU time per user process.
* The /scratch space is mounted read-only.
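
Since each process is limited to 2 hours of CPU time, it can help to keep an eye on the accumulated CPU time of your processes before the limit is reached. A possible check with standard Linux `ps`:

```shell
# List your own processes sorted by accumulated CPU time (TIME column),
# to check them against the 2-hour-per-process limit.
ps -u "$USER" -o pid,etime,time,%mem,comm --sort=-time
```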
## Visualization node
As on the [Pyrene front-end](#pyrene-front-end-login01), interactive work can be done on the visualization node (`visu01`), in this case by launching interactive jobs.
Interactive jobs give access to more resources than the front-end node:

* Max walltime: 4 hours.
* Max memory per job: 184000 MB.
* Read/write access to the /scratch space through the `SCRATCHDIR` variable (see the [usage of the scratch space in jobs](https://git.univ-pau.fr/num-as/pyrene-cluster/-/wikis/2-Slurm/2.1-Batch-job-submission#usage-of-the-scratch-space-in-jobs)).
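
Inside such a job, a typical pattern is to create a working directory under `SCRATCHDIR` and run from there. A minimal sketch (with a `/tmp` fallback so the commands also run outside a job, purely for illustration):

```shell
# SCRATCHDIR is set by the job environment on visu01; outside a job it is
# unset, so this sketch falls back to /tmp for illustration only.
workdir="${SCRATCHDIR:-/tmp}/demo_dir"
mkdir -p "$workdir"
cd "$workdir"
pwd
```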
Interactive jobs can be either [`salloc` jobs](#salloc-jobs) or [remote visualization jobs](#remote-vizualisation).
### `salloc` jobs
The `salloc` command opens an interactive session on the visualization node. From this session, interactive work can be done as on the front-end node:
![salloc](salloc.png)
By default, an `salloc` interactive job provides 1h of walltime and 1000 MB of RAM. [Slurm options](https://slurm.schedmd.com/salloc.html) can be added to modify this behavior. For instance, to request 2 hours, 4 cores and 8000 MB per core:
```
salloc --time=120 --cpus-per-task=4 --mem-per-cpu=8000
```
### Remote vizualisation
Remote visualization on Pyrene is based on the coupling of VirtualGL and TurboVNC.
On the server side (visualization node):

* VirtualGL redirects the 3D rendering commands of Linux OpenGL applications to the 3D accelerator hardware (GPU) of the node.
* TurboVNC compresses the rendered 3D scenes into 2D images before sending them to the client.
On the client side (the user's personal computer), the rendered 2D images sent by TurboVNC from the visualization node can be displayed with the TurboVNC Viewer.
#### Create a session
On the Pyrene front-end, create a visualization session with the following commands:
```
module load pyrene-visu/current
runVisu.sh
```
By default, the `runVisu.sh` command provides 1h of walltime and 1000 MB of RAM. [Slurm options](https://slurm.schedmd.com/salloc.html) can be added to modify this behavior. For instance, to request 2 hours, 4 cores and 8000 MB per core:
```
runVisu.sh --time=120 --cpus-per-task=4 --mem-per-cpu=8000
```
![runVisu](runVisu.png)
**Do not close this window: it must stay open to keep the session alive!**
The `VNC_URL` line provides the URL needed to connect to the session (see below).
#### Connect to the session
On your personal user computer:
* **Only needed once**: install a recent version of TurboVNC for your OS from https://sourceforge.net/projects/turbovnc/files: `.deb` for Debian/Ubuntu, `.rpm` for RHEL-based Linux (Fedora, CentOS, etc.), `.dmg` for macOS, `.exe` for Windows.
* Launch the TurboVNC Viewer, specifying the `VNC_URL` provided by the session. For instance, on Linux:
```
/opt/TurboVNC/bin/vncviewer cspnr-visu01.univ-pau.fr:1
```
* Enter your UPPA username and password.
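
For reference, a VNC URL of the form `host:display` corresponds to TCP port `5900 + display` on the host, which is useful when setting up an SSH tunnel or a firewall rule. A quick sketch using the example URL above:

```shell
# A VNC display "host:N" listens on TCP port 5900 + N.
url="cspnr-visu01.univ-pau.fr:1"      # example VNC_URL
host="${url%:*}"
display="${url##*:}"
echo "host=$host port=$((5900 + display))"
```

This prints `host=cspnr-visu01.univ-pau.fr port=5901`.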
#### Launch applications
From the TurboVNC viewer, use the `vglrun` command in an xterm terminal to launch applications:
```
# GaussView
module load gaussview/6.0
vglrun gaussview &

# MATLAB
module load MATLAB/R2020b
vglrun matlab -nosoftwareopengl &

# ParaView
module load paraview/4.4.0-mpi
vglrun paraview &
```
![turboVNC](turboVNC.png)