April 2019 — May 2019

Some AI Just Shouldn't Exist (May 1, 2019)
Human bias can seep into AI systems. Amazon abandoned a recruiting algorithm after it was shown to favor men’s resumes over women’s; researchers concluded an algorithm used in courtroom sentencing was more lenient to white people than to black people; a study found that mortgage algorithms discriminate against Latino and African American borrowers.



Optimizing Network Software to Advance Scientific Discovery (May 1, 2019)
High-performance computing (HPC)—the use of supercomputers and parallel processing techniques to solve large computational problems—is of great use in the scientific community. For example, scientists at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory rely on HPC to analyze the data they collect at the large-scale experimental facilities on site and to model complex processes that would be too expensive or impossible to demonstrate experimentally.



Student-Led Women in Computer Science at UO: ‘Community Building is Very Important' (Apr 30, 2019)
Women are graduating with computer science degrees at a much lower rate than men. But experts say they've found the disconnect and are taking action for gender equality in the industry. A computer science degree was a no-brainer for these two students. “Hey, I can do this, I can be one of those cool people that you hear about,” UO computer science student Sierra Battan says.



The FIRST Step in U.S. Technology Competitiveness (Apr 30, 2019)
One of the unsung heroes in the U.S. battle to maintain its science and technology competitiveness is FIRST, the organization known for its youth robotics competitions. With the transition to 5G communications and advancements in artificial intelligence (AI), the competitive battle between countries for technology leadership is growing. President Trump has gone so far as to sign an executive order to spur investment in artificial intelligence and has called 5G a race “America must win.”



Teaching Old Codes New Tricks (Apr 29, 2019)
Can a single weather model be run globally at a resolution high enough to start resolving individual thunderstorms — whether in the American Midwest, African rainforests, or anywhere else? The answer depends on the horsepower of your supercomputer. Running extremely complicated models over the entire Earth at such high resolution is a herculean task for any machine, which has made a predictive global, storm-scale model impractical.



Girls Who Code Helps Draft 'Landmark' Legislation Aimed at Closing the Gender Gap (Apr 29, 2019)
In the 10 months since Girls Who Code announced a set of policy recommendations aimed at closing the gender gap in K-12 computer science, the national nonprofit has been working with states to promote legislation that would help measure the extent of the gender disparity in U.S. classrooms.



If Data is the New Oil, We're About to Bust (Apr 28, 2019)
You’ve heard it before: Data is the new oil. The oft-quipped adage regained traction last year when Intel CEO Brian Krzanich repeated it in a Fortune interview. When enterprise execs and AI experts say data is the new oil, they mean it’s fuel for our information economy; the single largest driver of innovation. And the proof is all around us. You’d be hard pressed to find a company that doesn’t capture and mobilize data to some extent. Imagine running an ad campaign without metrics. Thin...



Supercomputer Mixes Streams with CPU, GPU, and FPGA (Apr 28, 2019)
If this is truly the age of heterogeneous supercomputers, then the system installed earlier this month at the University of Tsukuba is its poster child. The new NEC-built machine, which is now up and running at the university’s Center for Computational Sciences (CCS), is powered by Intel Xeon CPUs, Nvidia Tesla GPUs, and Intel Stratix 10 FPGAs. Known as Cygnus, the 80-node cluster gets its name from Cygnus A, a galaxy that features twin jets of material shooting out from its center.



A Novel Data-Compression Technique for Faster Computer Programs (Apr 27, 2019)
Data compression leverages redundant data to free up storage capacity, boost computing speeds, and provide other perks. In current computer systems, accessing main memory is very expensive compared to actual computation. Because of this, using data compression in the memory helps improve performance, as it reduces the frequency and amount of data programs need to fetch from main memory.
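The article's core point is that redundancy is what compression exploits: the more repetition in the data, the fewer bytes need to move from main memory. A toy illustration of that general principle (using Python's standard `zlib`, not the technique the article describes) shows how dramatically redundant data shrinks:

```python
import zlib

# Highly redundant data compresses extremely well; this is the same
# property an in-memory compression scheme exploits to cut the amount
# of data fetched from main memory.
redundant = b"ABCD" * 25_000            # 100 KB of obvious repetition
compressed = zlib.compress(redundant)

print(f"{len(redundant)} -> {len(compressed)} bytes")

# Compression is lossless: the original is fully recoverable.
assert zlib.decompress(compressed) == redundant
```

The payoff in a memory system comes from the same ratio: if a cache line compresses to a tenth of its size, roughly a tenth of the traffic crosses the memory bus.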



Dr. Debora Sijacki Wins 2019 PRACE Ada Lovelace Award for HPC (Apr 27, 2019)
The European PRACE initiative announced that Dr. Debora Sijacki from the University of Cambridge will receive the 2019 PRACE Ada Lovelace Award for HPC for her outstanding contributions to and impact on high performance computing in Europe. As a computational cosmologist, she has achieved numerous high-impact results in astrophysics based on numerical simulations on state-of-the-art supercomputers.



Universities Leverage High-Performance Computing for Multiple Returns on Investment (Apr 26, 2019)
If you build it, they will come — or, at the very least, if you don’t build it, they’ll probably go somewhere else. The “it” here refers to high-performance computing resources, and “they” refers to talented faculty members. More and more, HPC is becoming a competitive differentiator for institutions vying for top researchers.



New Technique Cuts AI Training Time by More Than 60 Percent (Apr 26, 2019)
North Carolina State University researchers have developed a technique that reduces training time for deep learning networks by more than 60 percent without sacrificing accuracy, accelerating the development of new artificial intelligence (AI) applications.



Is Data Science the Fourth Pillar of the Scientific Method? (Apr 25, 2019)
Nvidia CEO Jensen Huang revived a decade-old debate last month when he said that modern data science (AI plus HPC) has become the fourth pillar of the scientific method. While some disagree with the notion that statistical analysis alone can reveal undiscovered laws, the argument may be moot if data science continues on its current course as an extremely useful and in-demand tool for all manner of scientific discovery.



Google's AI Experts Try to Automate Themselves (Apr 25, 2019)
Just before 9 am last Thursday, an unusual speed dating scene sprang up in San Francisco. A casually dressed crowd, mostly male, milled around a gilt-edged Beaux Arts ballroom on Nob Hill. Pairs and trios formed quickly, but not in search of romance. Ice breakers were direct: What’s your favorite programming language? Which data analysis framework are you most expert in? More delicately, conversations drifted toward rankings on Kaggle.com, a site that has turned data science into a kind of spo...



Democratizing Insights from Fast Data (Apr 24, 2019)
Organizations are overwhelmed by boundless streams of data from suppliers, products, assets, apps, IT infrastructure, employees and customers. If they could quickly make sense of it all, they could respond faster, cut costs, improve service and find new sources of revenue. But they struggle for many reasons. Big-data analytics (on-prem or in the cloud) demands hard-to-find developer, data science and IT skill sets.



Adding Artificial Intelligence to Drug Discovery (Apr 24, 2019)
Scientists face slim odds when trying to turn a molecule into a medicine. Most studies put the batting average at about 0.100—or 1 in 10. Some go a little higher, some a little lower, but the success rate for drug discovery is never “good.” Some scientists believe that the success rate could be improved if drug discovery were to apply artificial intelligence (AI), that is, if it were to use advanced computational tools such as machine learning (ML) and molecular dynamics simulation.



11 Ways to Avert a Data-Storage Disaster (Apr 23, 2019)
Tracy Teal was a graduate student when she executed what should have been a routine command in her Unix terminal: rm -rf *. That command instructs the computer to delete everything in the current directory recursively, including all subdirectories. There was just one problem — she was in the wrong directory.
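The mistake, and one way to guard against it, can be sketched in Python. `guarded_rmtree` below is a hypothetical helper invented for illustration, not a tool from the article: it refuses to delete anything unless the resolved path matches what the caller expects, so being in the wrong directory fails loudly instead of silently destroying data.

```python
import os
import shutil
import tempfile

def guarded_rmtree(path, expected_suffix):
    """Recursively delete `path`, but only after confirming the resolved
    path ends with `expected_suffix` -- a cheap sanity check against
    running a recursive delete in the wrong directory."""
    path = os.path.abspath(path)
    if not path.endswith(expected_suffix):
        raise RuntimeError(f"refusing to delete {path!r}: "
                           f"does not end with {expected_suffix!r}")
    shutil.rmtree(path)

# Demo in a throwaway temporary directory.
root = tempfile.mkdtemp()
scratch = os.path.join(root, "scratch")
os.makedirs(scratch)
open(os.path.join(scratch, "junk.txt"), "w").close()

guarded_rmtree(scratch, "scratch")   # path matches: deletion proceeds
```

Calling `guarded_rmtree(root, "scratch")` instead would raise `RuntimeError`, which is exactly the loud failure you want before an irreversible delete.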



Behind the Scenes of the First Black Hole Photo (Apr 23, 2019)
Last Wednesday, scientists from the Event Horizon Telescope project unveiled the first ever photo of a black hole. While the photo itself is incredible, the feats of human ingenuity the project’s scientists used to capture it are just as impressive, if not more so.



AI Agent Offers Rationales Using Everyday Language to Explain Its Actions (Apr 22, 2019)
Georgia Institute of Technology researchers, in collaboration with Cornell University and University of Kentucky, have developed an artificially intelligent (AI) agent that can automatically generate natural language explanations in real-time to convey the motivations behind its actions. The work is designed to give humans engaging with AI agents or robots confidence that the agent is performing the task correctly and can explain a mistake or errant behavior.



Scientists from NUST MISIS Create a Super-Fast Robot Microscope to Search for Dark Matter (Apr 22, 2019)
Researchers from the National University of Science and Technology MISIS (NUST MISIS, Moscow, Russia) and the National Institute for Nuclear Physics (INFN, Naples, Italy) have developed a simple and cost-effective technology that increases the speed of automated microscopes (AM) by 10-100 times. The speed gain will help scientists in many fields: medicine, nuclear physics, astrophysics, neutrino physics, archeology, geology, and volcanology.



12 Key Tips for Learning Data Science (Apr 21, 2019)
Data scientist ranks as the best job for 2019 in America on Glassdoor. With a median base salary of $108,000 and a job satisfaction rank of 4.3 out of 5, plus a fair number of openings predicted, that is not surprising. The question is: What does one have to do to get on track to qualify for this job?



First Machine-Generated Book Published (Apr 21, 2019)
Springer Nature published its first machine-generated book, compiled using an algorithm developed by researchers from Goethe University. This collaboration broke new ground with the first machine-generated book to be published by a scholarly publisher. The book offers an overview of new research publications on lithium-ion batteries—a structured, automatically generated summary of a large number of current research articles.



Student Demand for Computer Science Straining UW-Madison Department Resources (Apr 20, 2019)
UW-Madison students are signing up in record numbers to study computer science, elevating the program to be the most popular undergraduate major on campus in each of the last two years. Student demand is nine times larger than it was a decade ago, from 168 in 2009 to nearly 1,600 in the program this academic year.



Scientists Build a Machine to See All Possible Futures (Apr 20, 2019)
In the 2018 movie Avengers: Infinity War, a scene featured Dr. Strange looking into 14 million possible futures to search for a single timeline where the heroes would be victorious. Perhaps he would have had an easier time with help from a quantum computer. A team of researchers from Nanyang Technological University, Singapore (NTU Singapore) and Griffith University in Australia has constructed a prototype quantum device that can generate all possible futures in a simultaneous quantum superposition.



Here's a Roadmap for Increasing Access to Computer Science (Apr 19, 2019)
The economic case for getting computer science education built into public school curriculum is a glaring one: There are a half-million tech jobs currently unfilled around the country, largely due to a technical skills gap. But given the tech industry’s diversity problem, equity and inclusion must be key components of the conversation around those high-paying jobs, and who will have access to the benefits they represent.

©1994-2024 | Shodor