Understanding the strongest electromagnetic fields in the universe

Alec Thomas is part of the team from the U-M College of Engineering Gérard Mourou Center for Ultrafast Optical Science that is building the most powerful laser in the U.S.

Dubbed “ZEUS,” the laser will deliver 3 petawatts of power. That’s a ‘3’ followed by 15 zeros. All the power generated in the entire world totals about 10 terawatts, roughly 300 times less than the ZEUS laser.

The team’s goal is to use the laser to explore how matter behaves in the most extreme electric and magnetic fields in the universe, and also to generate new sources of radiation beams, which may lead to developments in medicine, materials science, and national security. 

A simulation of a plasma wake.

This simulation shows a plasma wake behind a laser pulse. The plasma behaves like water waves generated behind a boat. In this image, the “waves” are extremely hot plasma matter, and the “boat” is a short burst of powerful laser light. (Image courtesy of Daniel Seipt.)

“In the strong electric fields of a petawatt laser, matter becomes ripped apart into a ‘plasma,’ which is what the sun is made of. This work involves very complex and nonlinear physical interactions between matter particles and light. We create six-dimensional models of particles to simulate how they might behave in a plasma in the presence of these laser fields to learn how to harness it for new technologies. This requires a lot of compute power,” Thomas said.

That compute power comes from the Great Lakes HPC cluster, the university’s fastest high-performance computing cluster. The team wrote equations of motion for each six-dimensional particle. The equations run on Great Lakes and help Thomas and his team learn how particles might behave within a simulation cell. Once the motion is understood, solutions can be developed.
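Those equations of motion are the heart of particle-in-cell plasma codes. As an illustrative sketch only (not the team’s actual code, and non-relativistic, whereas petawatt physics is strongly relativistic), the widely used Boris scheme advances one charged particle through electric and magnetic fields like this:

```python
import numpy as np

def boris_push(x, v, E, B, q_over_m, dt):
    """Advance one particle's position x and velocity v by one time
    step dt using the Boris scheme (non-relativistic sketch)."""
    # First half-acceleration from the electric field
    v_minus = v + 0.5 * q_over_m * E * dt
    # Rotation from the magnetic field
    t = 0.5 * q_over_m * B * dt
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)
    # Second half-acceleration, then position update
    v_new = v_plus + 0.5 * q_over_m * E * dt
    x_new = x + v_new * dt
    return x_new, v_new
```

The magnetic step is a pure rotation, so it conserves the particle’s kinetic energy exactly; that numerical stability is one reason schemes like this scale to the billions of particles and time steps such simulations demand.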

“On the computing side, this is a very complex physical interaction. Great Lakes is designed to handle this type of work,” said Brock Palen, director of Advanced Research Computing, a division of Information and Technology Services. 

Thomas has signed up for allocations on the Great Lakes HPC cluster and Data Den storage. “I just signed up for the no-cost allocations offered by the U-M Research Computing Package. I am planning to use those allocations to explore ideas and concepts in preparation for submitting grant proposals.”

Learn more and sign up for the no-cost U-M Research Computing Package (UMRCP).

Prof. Thomas’ work is funded by a grant from the National Science Foundation.

No-cost research computing allocations now available

Researchers on all university campuses can now sign up for the U-M Research Computing Package, a new package of no-cost supercomputing resources provided by Information and Technology Services.

As of Sept. 1, university researchers have access to a base allocation of 80,000 CPU hours of high-performance computing and research storage services at no cost. This includes 10 terabytes of high-speed storage and 100 terabytes of archival storage.

These base allocations will meet the needs of approximately 75 percent of current high-performance-computing users and 90 percent of current research storage users. Researchers must sign up on ITS’s Advanced Research Computing website to receive the allocation.

“With support from President (Mark) Schlissel and executive leadership, this initiative provides a unified set of resources, both on campus and in the cloud, that meet the needs of the rich diversity of disciplines. Our goal is to encourage the use, support and availability of high-performance computing resources for the entire research community,” said Ravi Pendse, vice president for information technology and chief information officer.

The computing package was developed to meet needs across a diversity of disciplines: to provide options for long-term data management, to support sharing and protecting sensitive data, and to offer more competitive cost structures that give faculty and research teams the flexibility to procure resources on short notice.

“It is incredibly important that we provide our research community with the tools necessary so they can use their experience and expertise to solve problems and drive innovation,” said Rebecca Cunningham, vice president for research and the William G. Barsan Collegiate Professor of Emergency Medicine. “The no-cost supercomputing resources provided by ITS and Vice President Pendse will greatly benefit our university community and the countless individuals who are positively impacted by their research.”

Ph.D. students may qualify for their own UMRCP resources depending on who is overseeing their research and their adviser relationship. Students should consult with their Ph.D. program administrator to determine their eligibility. ITS will confirm this status when a UMRCP request is submitted.

Undergraduate and master’s students do not currently qualify for their own UMRCP, but they can be added as users or administrators of another person’s UMRCP. Students can also access other ITS programs such as Great Lakes for Course Accounts and Student Teams.

“If you’re a researcher at Michigan, these resources are available to you without financial impact. We’re going to make sure you have what you need to do your research. We’re investing in you as a researcher because you are what makes Michigan Research successful,” said Brock Palen, Advanced Research Computing director.

Services that are needed beyond the base allocation provided by the UMRCP are available at reduced rates and are automatically available for all researchers on the Ann Arbor, Dearborn, Flint and Michigan Medicine campuses.

More Information

Access the sensitive data HPC cluster via web browser

Researchers, data scientists, and students can now more easily analyze sensitive data on the Armis2 High-Performance Computing (HPC) Cluster. No Linux knowledge is required: just a web browser, an account, and a login.

This is made possible by Open OnDemand, a web interface provided by Advanced Research Computing (ARC).

“It is now much easier to analyze sensitive data, without investing hours in training. This makes the Open OnDemand tool more accessible and user-friendly. I’m excited to see the research breakthroughs that happen now that a significant barrier has been removed,” said Matt Britt, ARC HPC manager. 

Open OnDemand offers easy file management, command-line access to the Armis2 HPC cluster, job management and monitoring, and graphical desktop environments and desktop interactive applications such as RStudio, MATLAB, and Jupyter Notebook.

Resource: Getting started (Web-based Open OnDemand) – section 1.2. For assistance or questions, please contact ARC at arc-support@umich.edu.

ARC is a division of Information and Technology Services (ITS).

HPC, storage now more accessible for researchers

Information and Technology Services has launched a new package of supercomputing resources for researchers and Ph.D. students on all U-M campuses: the U-M Research Computing Package.

The U-M Research Computing Package will reduce the current rates for high performance computing and research storage services provided by ITS by an estimated 35-40 percent, effective July 1. 

In addition, beginning Sept. 1, university researchers will have access to a base allocation for high-performance computing and research storage services (including high-speed and archival storage) at no cost, thanks to an additional investment from ITS. These base allocations will meet the needs of approximately 75 percent of current high-performance computing users and 90 percent of current research storage users.

Learn more about the U-M Research Computing Package

 

RMP new feature alert: View Great Lakes HPC account information

Advanced Research Computing (ARC), a division of ITS, has been developing a self-service tool called the Resource Management Portal (RMP) to give researchers and their delegates the ability to directly manage the IT research services they consume from ARC. 

Customers who use the Great Lakes High-Performance Computing Cluster now have the ability to view their account information via the RMP, including the account name, resource limits (CPUs and GPUs), scratch space usage, and the user access list.

“We are excited to be able to offer this tool for customers. It should make their busy lives easier,” said Todd Raeker, ARC research experience manager. 

The RMP is a self-service-only user portal with tools and APIs for research managers, unit support staff, and delegates to manage their ARC IT resources. The RMP team is slowly adding capabilities over time. 

To get started or find help, contact arc-support@umich.edu.

Using tweets to understand climate change sentiment

A team from the Urban Sustainability Research Group of the School for Environment and Sustainability (UM-SEAS) has been studying public tweets to understand climate change and global warming attitudes in the U.S. 

Dimitris Gounaridis is a fellow with the study. The team is mentored by Joshua Newell, and combines work on climate change perceptions by Jianxun Yang with property-level vulnerability assessment by Wanja Waweru.

“This research is timely and urgent. It helps us identify hazards, and elevated risks of flooding and heat, for socially vulnerable communities across the U.S. This risk is exacerbated especially for populations that do not believe climate change is happening,” Dimitris stated. 

The research team used a deep learning algorithm that reads a tweet’s text and predicts whether the person tweeting believes in climate change or not. The algorithm analyzed a total of 7 million public tweets drawn from the U-M Twitter Decahose and the George Washington University Libraries Dataverse. The combined dataset consists of a historical archive of Decahose tweets and an ongoing collection from the Decahose. The current deep learning model has an 85% accuracy rate and has been validated at multiple levels.
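The article does not describe the model’s architecture, so here is only a miniature stand-in for the overall pattern (tokenize labeled tweets, train, predict a stance): a tiny Naive Bayes classifier on a handful of invented example tweets. The real system is a deep learning model trained on millions of tweets.

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """Train a tiny multinomial Naive Bayes. docs: list of (text, label)."""
    word_counts = defaultdict(Counter)   # per-label token counts
    label_counts = Counter()             # documents per label
    vocab = set()
    for text, label in docs:
        tokens = text.lower().split()
        word_counts[label].update(tokens)
        label_counts[label] += 1
        vocab.update(tokens)
    return word_counts, label_counts, vocab

def predict_nb(model, text):
    """Return the most probable label under add-one smoothing."""
    word_counts, label_counts, vocab = model
    total_docs = sum(label_counts.values())
    best_label, best_score = None, -math.inf
    for label in label_counts:
        score = math.log(label_counts[label] / total_docs)  # log prior
        total_words = sum(word_counts[label].values())
        for tok in text.lower().split():
            score += math.log(
                (word_counts[label][tok] + 1) / (total_words + len(vocab))
            )
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Invented toy tweets for illustration only
docs = [
    ("climate change is real and caused by humans", "believer"),
    ("global warming threatens coastal cities now", "believer"),
    ("rising seas show climate change is happening", "believer"),
    ("climate change is a hoax invented for money", "skeptic"),
    ("global warming is fake news and a scam", "skeptic"),
    ("the climate hoax is a scam pushed by elites", "skeptic"),
]
model = train_nb(docs)
print(predict_nb(model, "this warming hoax is a scam"))  # -> skeptic
```

Only the overall shape carries over to the team’s deep learning model; the word-count statistics here are simply an easy-to-inspect proxy for learned features.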

The map below shows which users are predicted to believe in, or be skeptical of, climate change and global warming. Dimitris used geospatial modeling techniques to identify clusters of American skepticism and belief to create the map.

A map of the United States with blue and red dots indicating climate change acceptance.

(Image courtesy Dimitris Gounaridis.)

The tweet stream is sampled in real-time. Armand Burks, a research data scientist with ARC, wrote the Python code that is responsible for continuously collecting the data and storing it in Turbo Research Storage. He says that many researchers across the university are using this data for various research projects as well as classes. 
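The collection code itself is not shown in the article, but the general shape of a continuous collector, batching a tweet stream into compressed archive files, can be sketched as follows. The stream, file naming, and batch size are invented stand-ins; the real pipeline writes into Turbo Research Storage.

```python
import gzip
import json
import time
from pathlib import Path

def collect(stream, out_dir, batch_size=1000):
    """Continuously write tweets from `stream` (any iterable of dicts,
    standing in for the live Decahose connection) into gzipped,
    newline-delimited JSON batch files under `out_dir`."""
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    batch, seq = [], 0
    for tweet in stream:
        batch.append(tweet)
        if len(batch) >= batch_size:
            seq = _flush(batch, out_dir, seq)
    if batch:                      # write any trailing partial batch
        _flush(batch, out_dir, seq)

def _flush(batch, out_dir, seq):
    """Write one batch and clear it; a sequence number keeps names unique."""
    path = out_dir / f"decahose-{int(time.time())}-{seq:06d}.json.gz"
    with gzip.open(path, "wt", encoding="utf-8") as f:
        for tweet in batch:
            f.write(json.dumps(tweet) + "\n")
    batch.clear()
    return seq + 1
```

Batching keeps file counts manageable on shared storage, and newline-delimited JSON lets downstream researchers stream the archive back without loading whole files.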

“We are seeing an increased demand for shared community data sets like the Decahose. ARC’s platforms like Turbo, ThunderX, and Great Lakes hold and process that data, and our data scientists are available, in partnership with CSCAR, to assist in deriving meaning from such large data. 

“This is proving to be an effective way to combine compute services, methodology, and campus research mission leaders to make an impact quickly,” said Brock Palen, director of ARC.

In the future, Dimitris plans to refine the model to increase its accuracy, and then combine that with climate change vulnerability for flooding and heat stress.

“MIDAS is pleased that so many U-M faculty members are interested in using the Twitter Decahose. We currently have over 40 projects with faculty in the Schools of Information, Kinesiology, Social Work, and Public Health, as well as at Michigan Ross, the Ford School, LSA and more,” said H.V. Jagadish, MIDAS director and professor of Electrical Engineering and Computer Science.

The Twitter Decahose is co-managed and supported by MIDAS, CSCAR, and ARC, and is available to all researchers without any additional charge. For questions about the Decahose, email Kristin Burgard, MIDAS outreach and partnership manager.

Global research uses computing services to advance parenting and child development

Andrew Grogan-Kaylor, professor of Social Work, has spent the past 15 years studying the impact of physical discipline on children within the United States. 

Working with a team of other researchers at the School of Social Work, co-led by professors Shawna Lee and Julie Ma, he recently expanded his research to include children from all over the world, rather than exclusively the U.S. Current data for 62 low- and middle-income countries has been provided by UNICEF, a United Nations agency responsible for providing humanitarian and developmental aid to children worldwide. This data provides a unique opportunity to study the positive things that parents do around the world.

a group of smiling children

(Image by Eduardo Davad from Pixabay)

“We want to push research on parenting and child development in new directions. We want to do globally-based, diversity-based work, and we can’t do that without ARC services,” said Grogan-Kaylor. “I needed a bigger ‘hammer’ than my laptop provided.” 

The “hammer” he’s referring to is the Great Lakes HPC cluster. It can handle processing the large data set easily. When Grogan-Kaylor first heard about ARC, he thought it sounded like an interesting way to grow his science, and that included the ability to run more complicated statistical models that were overwhelming his laptop and department desktop computers. 

He took a workshop led by Bennet Fauber, ARC senior applications programmer/analyst, and found Bennet to be sensible and friendly. Bennet made HPC resources feel within reach to a newcomer. Typically, Grogan-Kaylor says, this type of resource is akin to learning a new language, and he’s found that being determined and persistent and finding the right people are key to maximizing ARC services. Bennet has explained error messages, how to upload data, and how to schedule jobs on Great Lakes. He also found a friendly and important resource at the ARC Help Desk, which is staffed by James Cannon. Lastly, departmental IT director Ryan Bankston has been of enormous help in learning about the cluster.

“We’re here to help researchers do what they do best. We can handle the technology, so they can solve the world’s problems,” said Brock Palen, ARC director. 

“Working with ARC has been a positive, growthful experience, and has helped me contribute significantly to the discussion around child development and physical punishment,” said Grogan-Kaylor. “I have a vision of where I’d like our research to go, and I’m pleased to have found friendly, dedicated people to help me with the pragmatic details.” 

More information

ARC, LSA support groundbreaking global energy tracking

How can technology services like high-performance computing and storage help a political scientist contribute to more equal access to electricity around the world? 

Brian Min, associate professor of political science and research associate professor with the Center for Political Studies, and lead researcher Zachary O’Keeffe have been using nightly satellite imagery to generate new indicators of electricity access and reliability across the world as part of the High-Resolution Electricity Access (HREA) project. 

The collection of satellite imagery is unique in its temporal and spatial coverage. For more than three decades, images have captured nighttime light output over every corner of the globe, every single night. By studying small variations in light output over time, the team aims to identify patterns and anomalies that determine whether an area is electrified, when it was electrified, and when the power is out. This work yields the highest-resolution estimates of energy access and reliability anywhere in the world.
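The HREA project’s statistical recalibration is far more sophisticated than this, but the core intuition, flagging nights whose light output drops well below a location’s norm, can be sketched with a simple z-score test (the threshold and data below are invented for illustration):

```python
from statistics import mean, stdev

def flag_outages(brightness, z_thresh=2.0):
    """Return indices of nights whose light output falls well below
    this location's norm. `brightness` is a list of nightly radiance
    values for one place; the real pipeline recalibrates across
    sensors and seasons before any comparison like this."""
    mu, sigma = mean(brightness), stdev(brightness)
    return [
        i for i, b in enumerate(brightness)
        if sigma > 0 and (b - mu) / sigma < -z_thresh
    ]
```

For example, `flag_outages([10.0]*20 + [0.0])` returns `[20]`, the index of the one anomalously dark night.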

A satellite image of Kenya in 2017

This image of Kenya from 2017 shows a model-based classification of electrification status based upon all-night, statistically recalibrated 2017 VIIRS light output. (Image courtesy Dr. Min. Sources: NOAA, VIIRS DNB, Facebook/CIESIN HRSL.)

LSA Technology Services and ARC both worked closely with Min’s team to relieve pain points and design highly-optimized, automated workflows. Mark Champe, application programmer/analyst senior, LSA Technology Services, explained that, “a big part of the story here is finding useful information in datasets that were created and collected for other purposes. Dr. Min is able to ask these questions because the images were previously captured, and then it becomes the very large task of finding a tiny signal in a huge dataset.”

There are more than 250 terabytes of satellite imagery and data, across more than 3 million files. And with each passing night, the collection continues to grow. Previously, the images were not easily accessible because they were archived in deep storage in multiple locations. ARC provides processing and storage at a single place, an important feature for cohesive and timely research. 

The research team created computational models that run on the Great Lakes High-Performance Computing Cluster and that can be easily replicated and validated. They archive the files on the Locker Large-File Storage service.

One challenge Min and O’Keeffe chronically face is data management. Images can be hundreds of megabytes each, so just moving files from the storage service to the high-performance computing cluster can be challenging, let alone finding the right storage service. Using Turbo Research Storage and Globus File Transfer, Min and O’Keeffe found secure, fast, and reliable solutions to easily manage their large, high-resolution files.

Brock Palen, director of ARC, said that transfers from Great Lakes to Turbo reached top speeds of 1,400 megabytes per second. 

Min and team used Globus extensively in acquiring historical data from the National Oceanic and Atmospheric Administration (NOAA). Champe worked with the research team to set up a Globus connection to ARC storage services. The team at NOAA was then able to push the data to U-M quickly and efficiently. Rather than uploading the data to later be downloaded by Min’s team, Globus streamlined and sped up the data transfer process. 

Champe noted, “Over 100TB of data was being unarchived from tape and transferred between institutions. Globus made that possible and much less painful to manage.”

“The support we’ve gotten from ARC and LSA Technology has been incredible. They have made our lives easier by removing bottlenecks and helping us see new ways to draw insights from this unique data,” said Min. 

Palen added, “We are proud to partner with LSA Technology Services and ITS Infrastructure networking services to provide support to Dr. Min’s and O’Keeffe’s work. Their work has the potential to have a big impact in communities around the world.” 

“We should celebrate work such as this because it is a great example of impactful research done at U-M that many people helped to support,” Champe continued.

Min expressed his gratitude to the project’s partners. “We have been grateful to work with the World Bank and NOAA to generate new insights on energy access that will hopefully improve lives around the world.”

These images are now available via open access (free and available to all).

This is made possible by a partnership between the University of Michigan, the World Bank, Amazon Web Services, and NOAA.

DNA sequencing productivity increases with ARC-TS services

The Advanced Genomics Core’s Illumina NovaSeq 6000 sequencing platform. It’s about the size of a large laser printer.

On the cutting edge of research at U-M is the Advanced Genomics Core’s Illumina NovaSeq 6000 sequencing platform. The AGC is one of the first academic core facilities to optimize this exciting and powerful instrument. 

The Advanced Genomics Core (AGC), part of the Biomedical Research Core Facilities within the Medical School Office of Research, provides high-quality, low-cost next generation sequencing analysis for research clients on a recharge basis. 

One NovaSeq run can generate as much as 4TB of raw data. So how is the AGC able to generate, process, analyze, and transfer so much data for researchers? They have partnered with Advanced Research Computing – Technology Services (ARC-TS) to leverage the speed and power of the Great Lakes High-Performance Computing Cluster.

With Great Lakes, AGC can process the data, and then store the output on other ARC-TS services: Turbo Research Storage and Data Den Research Archive, and share with clients using Globus File Transfer. All three services work together. Turbo offers the capacity and speed to match the computational performance of Great Lakes, Data Den provides an archive of raw data in case of catastrophic failure, and Globus has the performance needed for the transfer of big data. 

“Thanks to Great Lakes, we were able to process dozens of large projects simultaneously, instead of being limited to just a couple at a time with our in-house system,” said Olivia Koues, Ph.D., AGC managing director. 

“In calendar year 2020, the AGC delivered nearly a half petabyte of data to our research community. We rely on the speed of Turbo for storage, the robustness of Data Den for archiving, and the ease of Globus for big data file transfers. Working with ARC-TS has enabled incredible research such as making patients resilient to COVID-19. We are proudly working together to help patients.”

“Our services process more than 180,000GB of raw data per year for the AGC. That’s the same as streaming the three original Star Wars movies and the three prequels more than 6,000 times,” said Brock Palen, ARC-TS director. “We enjoy working with AGC to assist them in the next step of their big data journey.”

ARC-TS is a division of Information and Technology Services (ITS). The Advanced Genomics Core (AGC) is part of the Biomedical Research Core Facilities (BRCF) within the Medical School Office of Research.

Using machine learning and the Great Lakes HPC Cluster for COVID-19 research

A researcher in the College of Literature, Science, and the Arts (LSA) is pioneering two separate, ongoing efforts for measuring and forecasting COVID-19: pandemic modeling and a risk tracking site.

The projects are led by Sabrina Corsetti, a senior undergraduate student pursuing dual degrees in honors physics and mathematical sciences, and supervised by Thomas Schwarz, Ph.D., associate professor of physics. 

The modeling uses a machine learning algorithm that can forecast future COVID-19 cases and deaths. The weekly predictions are made using the ARC-TS Great Lakes High-Performance Computing Cluster, which provides the speed and dexterity to run the modeling algorithms and data analysis needed for data-informed decisions that affect public health. 

Each week, 51 processes (one for each state and one for the U.S.) are run in parallel (at the same time). “Running all 51 analyses on our own computers would take an extremely long time. The analysis places heavy demands on the hardware running the computations, which makes crashes somewhat likely on a typical laptop. We get all 51 done in the time it would take to do 1,” said Corsetti. “It is our goal to provide accurate data that helps our country.”
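The fan-out Corsetti describes is embarrassingly parallel: 51 analyses with no shared state. Here is a minimal local sketch of that structure, with a placeholder forecast function and a thread pool standing in for what would be separate scheduled tasks on Great Lakes:

```python
from concurrent.futures import ThreadPoolExecutor

def forecast(region):
    """Placeholder for one region's analysis; the real job fits a
    machine-learning model to that region's case and death counts."""
    return region, f"predicted cases for {region}"

# 50 states plus a national aggregate -> 51 independent analyses
regions = [f"state_{i:02d}" for i in range(50)] + ["US"]

# Because no analysis depends on another, they can all run at once
with ThreadPoolExecutor(max_workers=8) as pool:
    results = dict(pool.map(forecast, regions))

print(len(results))  # 51
```

Because each region is independent, total wall time approaches the cost of the single slowest analysis rather than the sum of all 51, which is exactly the speedup the team reports.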

The predictions for the U.S. at the national and state levels are fed into the COVID-19 Forecasting Hub, which is led by the UMass-Amherst Influenza Forecasting Center of Excellence based at the Reich Lab. The weekly predictions generated by the hub are then read out by the Centers for Disease Control and Prevention (CDC) for their weekly forecast updates.

The second project, a risk tracking site, involves COVID-19 data acquisition from a Johns Hopkins University repository and the Michigan Safe Start Map. This is done daily, and the process runs quickly: it takes only about five minutes, but the impact is great. The data populates the COVID-19 risk tracking site for the State of Michigan, which shows, by county, the total number of COVID-19 cases, the average number of new cases in the past week, and the risk level.
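For a single county, those three displayed fields reduce to simple arithmetic over a cumulative case series, plus a risk level passed through from the Safe Start Map. A sketch, with invented field names and data:

```python
def county_summary(cumulative_cases, risk_level):
    """cumulative_cases: daily cumulative case counts, oldest to newest.
    risk_level: the county's rating taken from the Safe Start Map.
    Returns the per-county fields shown on the tracking site (sketch)."""
    total = cumulative_cases[-1]
    # New cases over the last 7 days = difference of cumulative counts
    if len(cumulative_cases) >= 8:
        new_last_week = total - cumulative_cases[-8]
    else:
        new_last_week = total
    return {
        "total_cases": total,
        "avg_new_cases_7d": round(new_last_week / 7, 1),
        "risk_level": risk_level,
    }

print(county_summary(
    [100, 110, 120, 130, 140, 150, 160, 170, 180, 190], "C"
))
```

Using cumulative counts makes the weekly average robust to late-arriving daily corrections: only the endpoints of the window matter.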

“Maintaining the risk tracking site requires us to reliably update its data every day. We have been working on implementing these daily updates using Great Lakes so that we can ensure that they happen at the same time each day. These updates consist of data pulls from the Michigan Safe Start Map (for risk assessments) and the Johns Hopkins COVID-19 data repository (for case counts),” remarked Corsetti.

“We are proud to support this type of impactful research during the global pandemic,” said Brock Palen, director of Advanced Research Computing – Technology Services. “Great Lakes provides quicker answers and optimized support for simulation, machine learning, and more. It is designed to meet the demands of the University of Michigan’s most intensive research.”

ARC is a division of Information and Technology Services (ITS). 

Related information