
Genomic Sequencing at Children’s Mercy: Saving Time to Save Lives

Rapid genomic sequencing is instrumental in diagnosing and treating critically ill patients, and managing the high data volumes involved is essential to the process. Children’s Mercy Hospital in Kansas City, MO (354 beds, not-for-profit, treating children from birth through age 21) operates what it says is the world’s first whole genome sequencing center in a pediatric setting, where physicians, clinical laboratory scientists, molecular geneticists, bioinformaticians and software engineers work to sequence and analyze rare inherited dise...

Read More



3-D simulations and NASA supercomputer advance research of the origin of stars

What processes are involved in the formation of individual stars and stellar clusters in our own galaxy and other galaxies? Scientists at the University of California, Berkeley, and Lawrence Livermore National Laboratory are using NASA's most powerful supercomputer, Pleiades, to create unique star-formation simulations to answer this fundamental scientific question. Like something from a video game, the simulations zoom through the entire evolution of young star clusters. A giant cloud of interstellar gas and dust collapses under the forces of gravity. Inside the cloud, turbulent cl...

Read More



International Summer School on Web Science and Technology

WebST 2016 is a research training event aimed at graduates and postgraduates in the early stages of their academic careers. Global in scope, it aims to update them on the most recent advances in the critical, multidisciplinary and fast-developing area of web studies. The web covers a broad spectrum of exciting current research and industrial innovation, from computing and technology to the social sciences and the humanities, and has become the largest socio-technical infrastructure in human history. Renowned academics and industry pioneers will lecture and share their views...

Read More



3rd SC Workshop on HPC Training Best Practices

The International HPC Training Consortium is organizing a workshop, "Third SC Workshop on Best Practices for HPC Training," to be held at the SC16 Conference on Monday, November 14, 2016, from 9:00 AM to 12:30 PM. The goal of the workshop is to engage individuals around the globe in sharing expertise and best practices for in-person, web-based and asynchronous HPC training for the community. Everyone interested is welcome to attend; you must register and pay the SC16 workshop fee to attend this and other SC16 workshops. This year's SC16 workshop will begin with lightnin...

Read More



Call for Papers: MLHPC 2016: Machine Learning in High Performance Computing Environments Workshop - Deadline: July 30, 2016

The intent of this workshop is to bring together researchers, practitioners, and scientific communities to discuss methods that utilize extreme-scale systems for machine learning. The workshop will focus on the greatest challenges in using HPC for machine learning and on methods for exploiting data parallelism, model parallelism, ensembles, and parameter search. We invite researchers and practitioners to participate in this workshop to discuss the challenges in using HPC for machine learning and to share the wide range of applications that would benefit from HPC-powered machine learning...

Read More
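Data parallelism, one of the workshop topics above, can be illustrated with a minimal sketch: each worker computes a gradient on its own shard of a batch, and the gradients are averaged before the model is updated. The example below uses plain NumPy with invented synthetic data; it shows the idea only, not any particular HPC framework.

```python
import numpy as np

def gradient(w, X, y):
    """Gradient of mean squared error for a linear model y ~ X @ w."""
    return 2 * X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 4))           # synthetic features
true_w = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ true_w                          # synthetic targets

w = np.zeros(4)
n_workers = 4
for step in range(200):
    # Data parallelism: shard the batch across workers,
    # compute local gradients, then average them.
    shards_X = np.array_split(X, n_workers)
    shards_y = np.array_split(y, n_workers)
    grads = [gradient(w, Xi, yi) for Xi, yi in zip(shards_X, shards_y)]
    w -= 0.1 * np.mean(grads, axis=0)   # single synchronized update

print(np.round(w, 2))  # converges toward true_w
```

Because the shards here are equal-sized, the averaged gradient equals the full-batch gradient, which is why synchronized data-parallel training reproduces single-node results.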



SDSC, UC San Diego Health Sciences to Launch Year 2 of Mentoring Program

The San Diego Supercomputer Center (SDSC) at the University of California San Diego, in collaboration with the UC San Diego Division of Health Sciences, is preparing to launch the second year of a new mentoring program designed to provide a pathway for high school students to gain access to experts in their field of interest. The first phase of the Mentor Assistance Program (MAP), co-founded by Ange Mason, SDSC’s education program manager, and Kellie Church, assistant professor in the Department of Reproductive Medicine within UC San Diego’s School of Medicine, was recently celebrated...

Read More



NCAR Awards 42 Million Core Hours on Yellowstone Supercomputer

Nine science projects were recently chosen to receive computational time and storage space on the Yellowstone supercomputer in Cheyenne. The most recent recommended allocations total 42.6 million core hours, 270 terabytes of archival storage, and 47,000 hours on data analysis and visualization systems, Shader says. To provide some perspective on what these numbers mean, here are some useful comparisons. In simplest terms, Yellowstone can be thought of as 72,576 personal computers that are cleverly interconnected to perform as one computer. The computational time allocated is equivalent to...

Read More
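A back-of-the-envelope check on the comparison above, using only the two figures quoted (42.6 million core hours and 72,576 cores): dividing the allocation by the core count gives roughly how long Yellowstone would need, running flat out, to deliver it.

```python
core_hours = 42.6e6   # allocated core hours (from the article)
cores = 72_576        # Yellowstone core count (from the article)

machine_hours = core_hours / cores  # wall-clock hours at full machine
machine_days = machine_hours / 24

print(f"{machine_hours:.0f} hours ≈ {machine_days:.1f} days of the full machine")
# → 587 hours ≈ 24.5 days of the full machine
```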



University of Melbourne releases Spartan HPC service

The University of Melbourne has launched a new high performance computing (HPC) service called Spartan. It combines traditional HPC with a cloud computing component, and the university claims that no other university has put a system like this into production. Bernard Meade, the University of Melbourne's head of research computer services, said the application of HPC and cloud techniques would increase research productivity across a wide range of disciplines. Spartan can grow and evolve according to the demands of researchers, expanding physically or virtually as required. The design features a ...

Read More



Computer Is Trained to Recognize Events in YouTube Videos

Using deep learning techniques, a group of researchers has trained a computer to recognize events in YouTube videos, even ones the software has never seen before, such as riding a horse, baking cookies or eating at a restaurant. Researchers from Disney Research and Shanghai's Fudan University used both scene and object features from the videos and enabled the links between these visual elements and each type of event to be determined automatically by a machine-learning architecture known as a neural network. "Notably, this approach not only works better than other methods in recognizing event...

Read More
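The approach described, linking scene and object features to event types through a learned network, can be sketched in miniature. The sketch below (plain NumPy, with invented feature dimensions and random untrained weights) shows only the shape of the computation: two feature streams are concatenated and mapped through a small network to a probability over event types. It is not the researchers' actual model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented dimensions: 128-d scene features, 64-d object features, 5 event types.
scene_feat = rng.normal(size=128)
object_feat = rng.normal(size=64)
events = ["riding a horse", "baking cookies", "eating at a restaurant",
          "playing guitar", "birthday party"]

# A tiny two-layer network with random (untrained) weights.
W1 = rng.normal(scale=0.1, size=(32, 128 + 64))
W2 = rng.normal(scale=0.1, size=(len(events), 32))

def predict(scene, obj):
    """Concatenate the two feature streams and score each event type."""
    x = np.concatenate([scene, obj])
    h = np.maximum(0, W1 @ x)              # ReLU hidden layer
    scores = W2 @ h
    e = np.exp(scores - scores.max())      # softmax over event types
    return e / e.sum()

probs = predict(scene_feat, object_feat)
print(events[int(np.argmax(probs))])
```

In a real system the weights would be trained on labeled videos, so that the network discovers which scene/object combinations indicate each event.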



The Inventors of the Internet Are Trying to Build a Truly Permanent Web

If you wanted to write a history of the Internet, one of the first things you would do is dig into the email archives of Vint Cerf. In 1973, he co-created the protocols that Internet servers use to communicate with each other without the need for any kind of centralized authority or control. He has spent the decades since shaping the Internet’s development, most recently as Google’s “chief Internet evangelist.” Thankfully, Cerf says he has archived about 40 years of old email—a first-hand history of the Internet stretching back almost as far as the Internet itself. But you’d a...

Read More



'Cut!' - the AI director

At the Cannes Lions advertising festival on Thursday morning, an audience was shown a series of short films in the annual New Directors Showcase, which highlights emerging talent. One of the entries had AI as a director. A few days ago, I saw Eclipse, a pop video featuring a French band, at the offices of Saatchi and Saatchi, which runs the Cannes showcase and commissioned the AI entry. What is remarkable about it is not the production values - it is actually a rather dull piece of work - but a process that involved AI at every stage. All the computer had been given was the track, Saatchi...

Read More




©1994-2016 Shodor