Url | http://hpcuniversity.org/trainingMaterials/157 |
Creator | William McGrath |
Contributor | TACC |
Publisher | TACC |
Description | Prerequisites: Linux, C/Fortran/Python, HPC cluster programming environment. This one-day workshop will introduce participants to the four data-intensive computing modes Stampede is designed to provide: 1) high-throughput data processing, 2) parallel data analysis, 3) large shared-memory applications, and 4) large-scale visualization. The material will focus on using the job launchers, parallel R, and visualization tools available on Stampede. The workshop will use a mixture of lecture and hands-on training to explore how users can apply Stampede to their data-driven computing needs (a minimal illustrative sketch follows this record). |
Format | |
Language | English |
Subject | Computational Science |
Keyword | GridFTP, Data transfer, Big data, Globus Online |
Audience | Researcher, Educator, Learner/Student, Professional/Practitioner |
Education Level | Higher Education, Graduate/Professional, Undergraduate (Lower Division), Undergraduate (Upper Division) |
HPCU Subject | Programming/Algorithms, Storage |
HPCU Subject 2 | Storage, Computer Systems Organization, Architectures, Devices, Hardware, Accelerators, Data, Input/Output, Data Processing, Data Types |
Type | Tutorial, Instructional Material, Lecture/Presentation |
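The workshop itself demonstrates these modes with Stampede's job launchers and parallel R; as a rough analogue only, the sketch below uses Python (one of the listed prerequisites) and the standard multiprocessing module to show the basic parallel data analysis pattern of fanning one analysis function out over many data chunks. The chunk data and the summarize() helper are hypothetical illustrations, not material from the course.

# Minimal sketch of the "parallel data analysis" mode described above.
# The actual workshop uses parallel R and Stampede's job launchers; this
# Python analogue (the data chunks and summarize() are hypothetical) only
# illustrates fanning one analysis function out over many inputs.
from multiprocessing import Pool
import statistics

def summarize(values):
    # Compute simple summary statistics for one chunk of data.
    return {
        "n": len(values),
        "mean": statistics.fmean(values),
        "stdev": statistics.pstdev(values),
    }

if __name__ == "__main__":
    # Hypothetical data chunks; on a cluster each chunk would typically be
    # one file or partition handled by one launcher task.
    chunks = [[float(i + j) for i in range(1000)] for j in range(8)]

    # Fan the analysis out across local cores, the same pattern a job
    # launcher applies across the nodes of a batch allocation.
    with Pool(processes=4) as pool:
        results = pool.map(summarize, chunks)

    for idx, summary in enumerate(results):
        print(f"chunk {idx}: {summary}")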