2023 Winter Maintenance & Globus File Transfer upgrade 

By | Feature, General Interest, Great Lakes, HPC, News, Systems and Services

Winter maintenance is coming up! See the details below. Reach out to arc-support@umich.edu with questions or if you need help. 

These services will be unavailable: 

  • Great Lakes – We will be updating Great Lakes on a rolling basis throughout December and the beginning of January. If successful, there should be no downtime or impact, with the following exceptions: 
    • Single-precision GPUs (SPGPU) will be down Jan. 4-5 for networking maintenance. Those nodes will return to production once maintenance has been completed and the nodes have been reloaded.
    • Customers will be notified via email of any changes to Great Lakes maintenance that will require downtime.
    • If the rolling updates are unsuccessful, Great Lakes maintenance will begin Jan. 4-5, starting at 8 a.m. In either case, we will email everyone with the updated maintenance status.
  • Globus on the storage transfer nodes: Jan. 17-18.

Maintenance notes:

  • No downtime for ARC storage systems maintenance (Turbo, Locker, and Data Den).
  • Open OnDemand (OOD) users will need to re-login. Any existing jobs will continue to run and can be reconnected in the OOD portal.
  • Login servers will be updated; the maintenance should not affect most users. Those who are affected will be contacted directly by ARC. 
  • Copy any data and files that may be needed during maintenance to your local drive using Globus File Transfer before maintenance begins. 
  • Slurm email notifications will be improved, providing more detailed information about completed jobs.

Countdown to maintenance 

For Great Lakes HPC jobs, use the command “maxwalltime” to discover the amount of time remaining until maintenance begins. 

Jobs that request more walltime than remains until maintenance will automatically be queued and start once maintenance is complete. If the plan for Great Lakes maintenance is successful, any queued jobs will be able to run as usual (except for the SPGPU nodes as discussed above). Customers will be notified via email if downtime is required for Great Lakes.
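
As a rough illustration, the short Python sketch below shows one way to check the remaining time from a login node before submitting work. It is a minimal sketch, assuming only that the maxwalltime command described above is on your PATH and prints the remaining time to standard output; the exact output format may vary.

    # Minimal sketch: query how much walltime remains before the maintenance window.
    # Assumes the `maxwalltime` command is available on a Great Lakes login node and
    # prints the remaining time to stdout (exact output format not specified here).
    import subprocess

    result = subprocess.run(["maxwalltime"], capture_output=True, text=True, check=True)
    print("Time remaining until maintenance:", result.stdout.strip())

If a job requests more walltime than the value reported (for example, via the Slurm --time option), expect it to wait in the queue until maintenance is complete, as noted above.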

Status updates and additional information

How can we help you?

For assistance or questions, please contact ARC at arc-support@umich.edu.

XSEDE is now ACCESS 

By | Feature, HPC, News

The national HPC resource known as XSEDE has now fully transitioned to ACCESS, the Advanced Cyberinfrastructure Coordination Ecosystem: Services & Support. 

ARC staff members Todd Raeker and Shelly Johnson are here to help you understand what is happening and how you can take advantage of the national resources and allocations. 

ACCESS has streamlined the allocation application process. Many projects that formerly had to submit an annual allocation proposal under XSEDE may now fit within one of the new ACCESS opportunities for small or medium-sized projects. You can now make one request with a much shorter allocations proposal. You can also make these requests at any time, rather than wait for the quarterly deadlines. Allocations are awarded for the duration of a supporting grant or for 12 months, with possible extensions up to five years.

There are new tiers as well. The new Discover tier offers 1.5 million available credits and is closer in line with an annual allocation from XSEDE. Researchers can submit the one-page application anytime. Check the overview of the ACCESS opportunities to see where your project fits. If your research activities are considered a large-scale project, you will need to submit a proposal for a Maximize ACCESS award. Requests for Maximize ACCESS are accepted every six months. More details are available on the Maximize ACCESS web page.

Are you a grad student researcher? You now qualify to apply as the principal investigator (PI) for an ACCESS allocation in the Explore ACCESS tier to obtain resources to help complete your dissertation. Graduate students will need to include a letter from their advisor supporting the request, and the advisor must be added as a co-PI.

If you are interested in assisting the national research and educational community by reviewing ACCESS allocation requests, you can sign up to join the ACCESS allocations review committee.

Need help? Contact arc-support@umich.edu, or visit the ACCESS web page on the ARC website. You can also keep up with ongoing updates on the ACCESS webpage, access-ci.org.

Attend a listening session with ARC Director Brock Palen

By | Feature, HPC, News

You’re invited! ARC Director Brock Palen would like to hear from researchers. 

Are ARC services meeting their needs? What is not working well for them? Is something technical impeding their ability to do their research? Do they like Turbo Research Storage, the HPC web interface Open OnDemand, or the no-cost allocations offered by the U-M Research Computing Package?

This is an open, virtual, drop-in office hour. All are welcome. 

Three sessions are available: 

Researchers can also email Brock Palen at brockp@umich.edu or reach out to ARC at arc-support@umich.edu.

Precision Health and ARC team up on a self-service tool for genetic research

By | Great Lakes, HPC, News

Encore is a self-serve genetic analysis tool that researchers can now run using a point-and-click interface without the need to directly manipulate the genetic data. Only a phenotype file is needed to build a GWAS model with SAIGE (genetics analysis software), launch and monitor job progress, and interactively explore results.

It is geared for a range of disciplines and specialties including biostatistics, epidemiology, neuroscience, gastroenterology, anesthesiology, clinical pharmacy, and bioinformatics.

The tool was developed at the U-M School of Public Health Center for Statistical Genetics and is managed by Precision Health and supported by ITS’s Advanced Research Computing (ARC).  

Brock Palen, ARC director, said, “When someone uses Encore they are actually running on Great Lakes, and we are happy to provide the computational performance behind Encore.”

Using Encore is easy. No coding or command-line/Linux knowledge is required to run a GWAS in Encore. Researchers also do not need knowledge of batch job submission or scheduling, or direct access to a high-performance computing cluster. Encore automatically prepares job submission scripts and submits the analysis to the Great Lakes High-Performance Computing Cluster. 

Great Lakes is the university’s flagship open-science high-performance computing cluster. It is much faster and more powerful than a laptop, and provides quicker answers and optimized support for simulation, genomics, machine learning, life science, and more. The platform provides a balanced combination of computing power, I/O performance, storage capability, and accelerators.

Visit the Encore wiki page to learn more.

To get started, send an email to PHDataHelp@umich.edu.

For questions about Great Lakes, contact arc-support@umich.edu.

New Resource Management Portal feature for Armis2 HPC Clusters

By | Armis2, HPC, News

Advanced Research Computing (ARC), a division of Information and Technology Services (ITS), has been developing a self-service tool called the Resource Management Portal (RMP) to give researchers and their delegates the ability to directly manage the IT research services they consume from ARC. 

Customers who use the Armis2 High-Performance Computing Cluster now have the ability to view their account information via the RMP, including the account name, resource limits (CPUs and GPUs), and the user access list.

“We are proud to be able to offer this tool for customers who use the HIPAA-certified Armis2 cluster,” said Brock Palen, ARC director. 

The RMP is a self-service-only user portal with tools and APIs for research managers, unit support staff, and delegates to manage their ARC IT resources. The RMP team is slowly adding capabilities over time. 

To get started or find help, contact arc-support@umich.edu.

Understanding the strongest electromagnetic fields in the universe

By | Data, Great Lakes, HPC, Research, Uncategorized

Alec Thomas is part of the team from the U-M College of Engineering Gérard Mourou Center for Ultrafast Optical Science that is building the most powerful laser in the U.S.

Dubbed “ZEUS,” the laser will deliver 3 petawatts of power. That’s a ‘3’ followed by 15 zeros. All the power generated in the entire world is about 10 terawatts, roughly 300 times less than the ZEUS laser. 

The team’s goal is to use the laser to explore how matter behaves in the most extreme electric and magnetic fields in the universe, and also to generate new sources of radiation beams, which may lead to developments in medicine, materials science, and national security. 

This simulation shows a plasma wake behind a laser pulse. The plasma behaves like water waves generated behind a boat. In this image, the “waves” are extremely hot plasma matter, and the “boat” is a short burst of powerful laser light. (Image courtesy of Daniel Seipt.)

“In the strong electric fields of a petawatt laser, matter becomes ripped apart into a ‘plasma,’ which is what the sun is made of. This work involves very complex and nonlinear physical interactions between matter particles and light. We create six-dimensional models of particles to simulate how they might behave in a plasma in the presence of these laser fields to learn how to harness it for new technologies. This requires a lot of compute power,” Thomas said. 

That compute power comes from the Great Lakes HPC cluster, the university’s fastest high-performance computing cluster. The team created equations to solve the field of motion for each six-dimensional particle. The equations run on Great Lakes and help Thomas and his team learn how the particles might behave within a cell. Once the field of motion is understood, solutions can be developed. 
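
To make the idea concrete, here is a minimal, hypothetical sketch (not the ZEUS team’s actual code) of the kind of update such a simulation performs: advancing one particle through six-dimensional phase space (three position and three momentum coordinates) under the relativistic Lorentz force.

    # Illustrative sketch only: one explicit-Euler step of the relativistic Lorentz-force
    # equations of motion for a single electron in prescribed E and B fields.
    # Real plasma codes use more careful integrators and track enormous numbers of particles.
    import numpy as np

    q, m, c = -1.602e-19, 9.109e-31, 2.998e8   # electron charge (C), mass (kg), speed of light (m/s)

    def push(x, p, E, B, dt):
        """Advance position x and momentum p by one time step dt."""
        gamma = np.sqrt(1.0 + np.dot(p, p) / (m * c) ** 2)   # relativistic factor
        v = p / (gamma * m)                                   # velocity from momentum
        p_new = p + q * (E + np.cross(v, B)) * dt             # Lorentz force
        x_new = x + v * dt
        return x_new, p_new

    # One step for an electron starting at rest in a strong, laser-scale electric field
    x, p = np.zeros(3), np.zeros(3)
    x, p = push(x, p, E=np.array([1e12, 0.0, 0.0]), B=np.zeros(3), dt=1e-18)
    print(x, p)

Repeating an update like this for vast numbers of particles over many time steps is what makes the work so computationally demanding.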

“On the computing side, this is a very complex physical interaction. Great Lakes is designed to handle this type of work,” said Brock Palen, director of Advanced Research Computing, a division of Information and Technology Services. 

Thomas has signed up for allocations on the Great Lakes HPC cluster and Data Den storage. “I just signed up for the no-cost allocations offered by the U-M Research Computing Package. I am planning to use those allocations to explore ideas and concepts in preparation for submitting grant proposals.”

Learn more and sign up for the no-cost U-M Research Computing Package (UMRCP).

Prof. Thomas’ work is funded by a grant from the National Science Foundation.

No-cost research computing allocations now available

By | HPC, News, Research, Systems and Services, Uncategorized

Researchers on all university campuses can now sign up for the U-M Research Computing Package, a new package of no-cost supercomputing resources provided by Information and Technology Services.

As of Sept. 1, university researchers have access to a base allocation of 80,000 CPU hours of high-performance computing and research storage services at no cost. This includes 10 terabytes of high-speed storage and 100 terabytes of archival storage.

These base allocations will meet the needs of approximately 75 percent of current high-performance-computing users and 90 percent of current research storage users. Researchers must sign up on ITS’s Advanced Research Computing website to receive the allocation.

“With support from President (Mark) Schlissel and executive leadership, this initiative provides a unified set of resources, both on campus and in the cloud, that meet the needs of the rich diversity of disciplines. Our goal is to encourage the use, support and availability of high-performance computing resources for the entire research community,” said Ravi Pendse, vice president for information technology and chief information officer.

The computing package was developed to meet needs across a diversity of disciplines and to provide options for long-term data management, sharing and protecting sensitive data, and more competitive cost structures that give faculty and research teams more flexibility to procure resources on short notice.

“It is incredibly important that we provide our research community with the tools necessary so they can use their experience and expertise to solve problems and drive innovation,” said Rebecca Cunningham, vice president for research and the William G. Barsan Collegiate Professor of Emergency Medicine. “The no-cost supercomputing resources provided by ITS and Vice President Pendse will greatly benefit our university community and the countless individuals who are positively impacted by their research.”

Ph.D. students may qualify for their own UMRCP resources depending on who is overseeing their research and their adviser relationship. Students should consult with their Ph.D. program administrator to determine their eligibility. ITS will confirm this status when a UMRCP request is submitted.

Undergraduate and master’s students do not currently qualify for their own UMRCP, but they can be added as users or administrators of another person’s UMRCP. Students can also access other ITS programs such as Great Lakes for Course Accounts, and Student Teams.

“If you’re a researcher at Michigan, these resources are available to you without financial impact. We’re going to make sure you have what you need to do your research. We’re investing in you as a researcher because you are what makes Michigan Research successful,” said Brock Palen, Advanced Research Computing director.

Services that are needed beyond the base allocation provided by the UMRCP are available at reduced rates and are automatically available for all researchers on the Ann Arbor, Dearborn, Flint and Michigan Medicine campuses.

More Information

Access the sensitive data HPC cluster via web browser

By | Armis2, HPC, News

Researchers, data scientists, and students can now more easily analyze sensitive data on the Armis2 High-Performance Computing (HPC) Cluster. No Linux knowledge is required, just a web browser, an account, and a login. 

This is made possible by a web interface called Open OnDemand, provided by Advanced Research Computing (ARC). 

“It is now much easier to analyze sensitive data, without investing hours in training. This makes the Open OnDemand tool more accessible and user-friendly. I’m excited to see the research breakthroughs that happen now that a significant barrier has been removed,” said Matt Britt, ARC HPC manager. 

Open OnDemand offers easy file management, command-line access to the Armis2 HPC cluster, job management and monitoring, and graphical desktop environments and desktop interactive applications such as RStudio, MATLAB, and Jupyter Notebook.

Resource: Getting started (Web-based Open OnDemand) – section 1.2. For assistance or questions, please contact ARC at arc-support@umich.edu.

ARC is a division of Information and Technology Services (ITS).

HPC, storage now more accessible for researchers

By | HPC, News, Systems and Services

Information and Technology Services (ITS) has launched a new package of supercomputing resources for researchers and PhD students on all U-M campuses: the U-M Research Computing Package.

The U-M Research Computing Package will reduce the current rates for high performance computing and research storage services provided by ITS by an estimated 35-40 percent, effective July 1. 

In addition, beginning Sept. 1, university researchers will have access to a base allocation for high-performance computing and research storage services (including high-speed and archival storage) at no cost, thanks to an additional investment from ITS. These base allocations will meet the needs of approximately 75 percent of current high-performance computing users and 90 percent of current research storage users.

Learn more about the U-M Research Computing Package

 

RMP new feature alert: View Great Lakes HPC account information

By | General Interest, HPC, News, Systems and Services

Advanced Research Computing (ARC), a division of ITS, has been developing a self-service tool called the Resource Management Portal (RMP) to give researchers and their delegates the ability to directly manage the IT research services they consume from ARC. 

Customers who use the Great Lakes High-Performance Computing Cluster now have the ability to view their account information via the RMP, including the account name, resource limits (CPUs and GPUs), scratch space usage, and the user access list.

“We are excited to be able to offer this tool for customers. It should make their busy lives easier,” said Todd Raeker, ARC research experience manager. 

The RMP is a self-service-only user portal with tools and APIs for research managers, unit support staff, and delegates to manage their ARC IT resources. The RMP team is slowly adding capabilities over time. 

To get started or find help, contact arc-support@umich.edu.