Url: http://hpcuniversity.org/trainingMaterials/157
Creator: William McGrath
Contributor: TACC
Publisher: TACC
Description:

Prerequisites: Linux, C/Fortran/Python, HPC Cluster Programming Environment

This one-day workshop will introduce participants to the four data-intensive computing modes Stampede is designed to provide: 1) high-throughput data processing, 2) parallel data analysis, 3) large shared-memory applications, and 4) large-scale visualization. The material will focus on using the job launchers, parallel R, and visualization tools available on Stampede. The workshop will use a mixture of lecture and hands-on training to explore how users can apply Stampede to their data-driven computing needs.
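As a flavor of mode 2 (parallel data analysis), the sketch below shows the general pattern of splitting a dataset into chunks and summarizing them with a pool of workers. It is illustrative only, not taken from the workshop materials: the workshop uses parallel R on Stampede, while this sketch uses Python's standard multiprocessing module, and the chunk data and summary statistics are made up.

```python
# Illustrative sketch of parallel data analysis: the workshop itself uses
# parallel R on Stampede; here the same split/map pattern is shown with
# Python's standard multiprocessing module. Data and statistics are made up.
from multiprocessing import Pool

def summarize(values):
    """Compute a simple summary (count, mean) for one chunk of data."""
    n = len(values)
    return n, sum(values) / n

if __name__ == "__main__":
    # Pretend each chunk is one partition of a large dataset.
    chunks = [[1.0, 2.0, 3.0], [4.0, 5.0], [6.0, 7.0, 8.0, 9.0]]
    with Pool(processes=3) as pool:
        # One worker summarizes each chunk in parallel.
        results = pool.map(summarize, chunks)
    print(results)  # [(3, 2.0), (2, 4.5), (4, 7.5)]
```

On a real cluster, the job launcher distributes such work across nodes rather than across local processes, but the decompose/analyze/collect structure is the same.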
Format: PDF
Language: English
Subject: Computational Science
Keywords: GridFTP, Data transfer, Big data, Globus online
Audience: Researcher, Educator, Learner/Student, Professional/Practitioner
Education Level: Higher Education, Graduate/Professional, Undergraduate (Lower Division), Undergraduate (Upper Division)
HPCU Subject: Programming/Algorithms, Storage
HPCU Subject 2: Storage, Computer Systems Organization, Architectures, Devices, Hardware, Accelerators, Data, Input/Output, Data Processing, Data Types
Type: Tutorial, Instructional Material, Lecture/Presentation
