U-M Research Computing Package automatic renewal begins July 1

By | News, Research, Uncategorized

The no-cost bundle of supercomputing resources known as the U-M Research Computing Package (UMRCP) automatically renews for most researchers on July 1.

Provided by Information and Technology Services, the UMRCP offers qualified researchers on all campuses (Ann Arbor, Dearborn, Flint, and Michigan Medicine) allocations of high-performance computing, secure enclave, and research storage services. (Many units, including Michigan Medicine, provide additional resources to researchers. Be sure to check with your school or college.)

If a faculty researcher has left the university (or is about to) and their research remains at the university, an alternative administrator must be assigned via the ARC Resource Management Portal (RMP) so that the allocations can continue uninterrupted. ARC is available to help researchers make this transition.

Don’t have the UMRCP? Here’s how to request resources 

Faculty, as well as staff and PhD students with their own funded research on all campuses (Ann Arbor, Dearborn, Flint, and Michigan Medicine), are welcome to request allocations. Full details are available on the Advanced Research Computing website.

PhD researchers who do not have their own funded research can work with their advisor to be added to the advisor's allocations via the ARC Resource Management Portal (RMP).

“The UMRCP was launched in 2021 to meet the needs of a diversity of disciplines and to provide options for long-term data management, sharing, and protecting sensitive data,” said Brock Palen, director, ITS Advanced Research Computing. “The UMRCP alleviates a lot of the pressure that researchers feel in terms of managing the technology they need to achieve breakthroughs.”

More information

Globus can now be used with Armis2

By | Armis2, HPC, News, Uncategorized

Researchers who have an Armis2 High-Performance Computing account can now move data to and from other Protected Health Information (PHI)-approved systems using Globus File Transfer. (The endpoint is umich#armis2.) 

To learn more about your responsibilities and approved services, visit the Sensitive Data Guide and the Protected Health Information (PHI) webpage on the Safe Computing website. Send an email to ARC at arc-support@umich.edu to get started using Globus with PHI on your own system (this is not needed for researchers using ARC services, including Armis2, Data Den, and Turbo with Sensitive Data).

“With the addition of Globus on Armis2, researchers using ITS ARC services can use the same Globus tools and processes to securely and reliably move their data on all ARC systems and across the university and beyond,” said Matt Britt, ARC HPC systems manager.

Globus allows the transfer and collaborative access of data between different storage systems, lab computers, and personal desktops and laptops. Globus enables researchers to use a web browser to submit transfer and data synchronization requests between destinations. 

As a robust, cloud-based, file transfer service, Globus is designed to securely move your data, ranging from megabytes to petabytes. ARC is a Globus Subscription Provider for the U-M community, which allows U-M resources to serve as endpoints or collections for file transfers.
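
For researchers who prefer the command line to the web interface, the Globus CLI can submit the same kinds of transfers. The following is a minimal sketch, assuming the Globus CLI is installed and you are logged in; the collection UUIDs and paths are placeholders rather than actual ARC values.

    # Find the UUID of a collection by name (for example, the umich#armis2 endpoint)
    globus endpoint search "umich#armis2"

    # Submit an asynchronous, recursive transfer between two collections;
    # Globus runs the transfer in the background and recovers from faults.
    globus transfer \
      "$SOURCE_COLLECTION_UUID:/path/on/source/" \
      "$DEST_COLLECTION_UUID:/path/on/destination/" \
      --recursive \
      --label "example transfer"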

“There are many interesting research collaborations happening at U-M, as well as nationally and internationally. Globus can facilitate all of those interactions securely,” said Brock Palen, ARC director. “Globus is the go-to tool we recommend for data transfer.”

Learn more 

How can we help you?

For assistance or questions, contact ARC at arc-support@umich.edu.

PFAS research in the Michigan mother-infant pairs study, supported by ITS, SPH, MM, AGC

By | News

[Photo: Three mothers holding their infants, seated on a couch.]

PFAS (per- and polyfluoroalkyl substances) are a class of chemicals that have been around since the 1940s and became more broadly used in the post-war 1960s era. PFAS are in our homes, offices, water, and even our food and blood. PFAS break down slowly and are difficult to process, both in the environment and in our bodies.

Scientific studies have shown that exposure to some PFAS in the environment may be linked to harmful health effects in humans and animals. Because there are thousands of PFAS chemicals found in many different consumer, commercial, and industrial products, it is challenging to study and assess the human health and environmental risks. 

Fortunately, some of the most persistent PFAS are being phased out. The EPA has been working on drinking water protections, scientists are working on ways to break down and eliminate PFAS, and PFAS are being addressed at a national level.

A team of University of Michigan researchers from the School of Public Health DoGoodS-Pi Environmental Epigenetics Lab and Michigan Medicine is working to understand how behaviors and environments during pregnancy can cause changes to the way genes work in offspring. This emerging field is known as toxicoepigenetics.

Jackie Goodrich, Ph.D., research associate professor at the U-M School of Public Health, led the team. “PFAS may impact the development of something we all have called the epigenome. The epigenome is a set of modifications on top of our DNA that controls normal development and function. Environmental exposures like PFAS can alter how the epigenome forms, and this impacts development and health. Our study expands on current knowledge about PFAS and the epigenome by focusing on a type of epigenetic mark that is not usually measured.”

Vasantha Padmanabhan, Ph.D., M.S., professor emerita (in service), Department of Pediatrics, Michigan Medicine, built the Michigan Mother-Infant Pairs study over the past decade with an emphasis on identifying harmful exposures during pregnancy that impact women and their newborns. “I am so grateful to those who engaged in this study. PFAS are complex, and mothers’ and infants’ involvement helped us work toward a solution that impacts us all. I want to acknowledge the contributions of the U-M Department of Obstetrics and Gynecology, Michigan Institute for Clinical & Health Research (MICHR), and the Von Voigtlander Women’s Hospital that made this study possible.” 

Rebekah Petroff, Ph.D., a research fellow with Environmental Health Sciences, led the computational portion of the research. She said that using Turbo to store the raw data and Great Lakes for high-performance computing (HPC) enabled the much faster analysis the study required, given the volume of data to analyze.

Turbo and Great Lakes are services provided by Advanced Research Computing, a division of Information and Technology Services (ITS). ARC facilitates powerful approaches to complex research challenges in fields ranging from physics to linguistics, and from engineering to medicine.

Petroff said, “This analysis would have taken over a month straight of computing time on a regular desktop computer. The first job we submitted to Great Lakes ran so fast—I had results the next morning! Great Lakes made this research possible, and I believe that our study results can be broadly impactful to public health and toxicoepigenetics going forward.”

Support for using this complex technology also came from Dan Barker, a UNIX systems administrator with the U-M School of Public Health Biostatistics Department. Barker assisted with the code needed to use Great Lakes. “We started with a test run of a few hundred pairs of genomes. Once we were successful with that, we ran the entire nearly 750,000 epigenetic marks across 141 people and seven different PFAS.”

Barker also helped design and submit array jobs, which are a series of identical, or nearly identical, tasks that are run many times. This is a common technique used by researchers when leveraging HPC, and it allows for the essential analytical comparisons among the test results. Petroff said, “In our study, we used an array job to split up our computations so that they ran much more efficiently!”
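
As an illustration only (not the study's actual script), a Slurm array job on an ARC cluster might look like the sketch below; the account name, resource requests, and analysis script are hypothetical placeholders.

    #!/bin/bash
    # Hypothetical Slurm array job: each array task analyzes one chunk of the data.
    #SBATCH --job-name=pfas-array
    #SBATCH --account=example_account     # placeholder account
    #SBATCH --partition=standard
    #SBATCH --time=02:00:00
    #SBATCH --mem=8g
    #SBATCH --array=1-100                 # 100 near-identical tasks

    # Each task picks up its own slice of the input based on its array task ID.
    python analyze_chunk.py --input "chunk_${SLURM_ARRAY_TASK_ID}.csv"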

The U-M Advanced Genomics Core (AGC) performed the epigenetic assays, a kind of laboratory technique that measures marks on DNA, for this project. AGC is part of the Biomedical Research Core Facilities (BRCF), a group of campus-wide laboratories that develop and provide state-of-the-art scientific resources to enable biomedical research. Other BRCF cores also worked on this project, including the Epigenomics Core and the Bioinformatics Core.

Genotyping is similar to reading a few words scattered on a page. This process gives researchers small packets of data to compare. Genotyping looks for information at a specific place in the DNA where we know important data will be. This project used a type of genotyping called microarrays (also known as “arrays”) to help researchers understand how the regulation of DNA, including the methylation and hydroxymethylation measured in this study, is impacted by exposures like PFAS.

Brock Palen, ARC director, said, “This research is of human interest and impacts all of us. I’m pleased that ARC assisted their research with staff expertise, equipment, and no-cost allocations from the U-M Research Computing Package.”

Petroff said that follow-up studies are needed to better understand whether the results are universal or specific to this cohort of infants and parents. If the results hold, they would represent a significant discovery that could lead to more comprehensive PFAS mitigation solutions. “Although steps are being taken to mitigate PFAS, exposure is still prevalent, and a deeper understanding of how it impacts humans is needed,” said Dana Dolinoy, Ph.D., NSF International Department Chair of Environmental Health Sciences and epigenetics expert.

Read the full article: Mediation effects of DNA methylation and hydroxymethylation on birth outcomes after prenatal per- and polyfluoroalkyl substances (PFAS) exposure in the Michigan mother–infant pairs cohort.

Funding was provided by grants from the National Institutes of Health, the U.S. Environmental Protection Agency, and the National Institute of Environmental Health Sciences Children’s Health Exposure Analysis Resource program.

ARC Summer 2023 Maintenance happening in June

By | HPC, News, Systems and Services

Summer maintenance will be happening earlier this year (June instead of August). Updates will be made to software, hardware, and operating systems to improve the performance and stability of services. ARC works to complete these tasks quickly to minimize the impact of the maintenance on research.

The dates listed below indicate the weeks during which the work will occur; the exact dates will be confirmed as planning continues.

HPC clusters and storage systems (/scratch) will be unavailable:

  • June 5-9: Great Lakes, Armis2, and Lighthouse

Storage systems will be unavailable:

  • June 6-7: Turbo, Locker, and Data Den

Queued jobs and maintenance reminders

Jobs will remain queued and will automatically begin after the maintenance is completed. The command “maxwalltime” will show the amount of time remaining until maintenance begins on each cluster, so you can size your jobs appropriately. The countdown to maintenance will also appear on the ARC homepage.
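
For example, a quick way to size a job around the maintenance window might look like the following sketch; the job script name is a placeholder, and “maxwalltime” is the ARC command described above.

    # Show how much time remains before maintenance begins on this cluster
    maxwalltime

    # Request a walltime that fits within the remaining window (here, 24 hours).
    # Jobs that request more time than remains will stay queued until maintenance ends.
    sbatch --time=24:00:00 my_job.sbat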

Status updates

How can we help you?

For assistance or questions, contact ARC at arc-support@umich.edu.

Data Den now supports sensitive data

By | News, Uncategorized

Data Den Research Archive is a service for preserving electronic data generated from research activities. It is a low-cost, highly durable storage system and is the largest storage system operated by ARC. Storing sensitive data (including HIPAA, PII, and FERPA data) is now supported (visit the Sensitive Data Guide for full details). This service is part of the U-M Research Computing Package (UMRCP), which provides storage allocations to researchers. Most researchers will not have to pay for Data Den.

A disk-caching, tape-backed archive, this storage service is best for data that researchers do not need regularly, but still need to keep because of grant requirements. 

“Data Den is a good place to keep research data past the life of the grant,” said Jeremy Hallum, ARC research computing manager. “ARC can store data that researchers need to keep for five to ten years.” 

Hallum goes on to say that Data Den is only available in a replicated format. “Volumes of data are duplicated between servers or clusters for disaster recovery so research data is very safe.”

Data Den can be part of a well-organized data management plan providing international data sharing, encryption, and data durability. Jerome Kinlaw, ARC research storage lead, said that the Globus File Transfer service works well for data management. “Globus is easy to use for moving data in and out of Data Den.”

The ITS U-M Research Computing Package (UMRCP) provides 100 terabytes (TB) of Data Den storage to qualified researchers. This 100 TB can be divided between restricted and non-restricted variants of Data Den for use as needed. (The ITS Data Storage Finder can help researchers find the right storage solutions to meet their needs.)

“I’m pleased that Data Den now offers options for sensitive data, and that researchers can take advantage of the UMRCP allocations,” said Brock Palen, ARC director. “We want to lighten the load so that researchers can do what they do best, and our services are now more cost effective than ever.”

Globus maintenance happening at 9 a.m. on March 11

By | Armis2, Data, General Interest, Great Lakes, HPC, News, Research, Uncategorized

Due to planned maintenance by the vendor, Globus services will be unavailable for up to two hours beginning at 9 a.m. U.S. Eastern Time (8 a.m. Central Time) on Saturday, March 11, 2023.

Customers will not be able to authenticate or initiate any transfers during that time. Any transfers in progress when the outage begins will be paused and will resume once maintenance is complete.

More details are available on the Globus blog.

For assistance or questions, please contact ARC at arc-support@umich.edu.

Protein structure prediction team achieved top rankings

By | Great Lakes, News, Uncategorized

CASP15 is a biennial competition that assesses methods of protein structure modeling. Independent assessors compared the submitted models with experimental results, and the results and their implications were discussed at the CASP15 Conference, held in December 2022 in Turkey.

A joint team with members from the labs of Dr. Peter Freddolino and Dr. Yang Zhang took first place in the Multimer and Interdomain Prediction categories, and was again the top-ranked server in the Regular (domains) category according to the CASP assessor’s criteria.

These wins are well-earned. Freddolino noted, “This is a highly competitive event, against some of the very best minds and powerful companies in the world.”

The Zhang/Freddolino team competed against nearly 100 other groups, including other academic institutions as well as major cloud and commercial companies. Groups from around the world submitted more than 53,000 models on 127 modeling targets in five prediction categories.

“Wei’s predictions did amazingly well in CASP15!,” said Freddolino. Wei Zheng, Ph.D., is a lab member and a research fellow with the Department of Computational Medicine and Bioinformatics (DCMB). 

Zheng said that the team participates in the regular protein structure prediction and protein complex structure prediction categories. “The results are assessed as regular protein domain modeling, regular protein inter-domain modeling, and protein complex modeling. In all categories, our models performed very well!” 

The technology that supported this impressive work 

The resources to achieve these results were grant-funded, which allowed the team to leverage a number of university resources, including:  

  • The Lighthouse High-Performance Computing (HPC) cluster service. Lighthouse is managed by the Advanced Research Computing (ARC) team, and ARC is a division of Information and Technology Services (ITS).
  • The algorithms were GPU-intensive and ran on the Great Lakes HPC Cluster. Graphics processing units (GPUs) are specialized processors designed to accelerate highly parallel workloads such as graphics rendering. The Great Lakes cluster provided additional capacity for running compute cycles. Kenneth Weiss, IT project manager senior with DCMB and HITS, said that many of the algorithms used by Zheng benefited from the increased performance of computing the data on a GPU.
  • Multiple storage systems, including Turbo Research Storage. High-speed storage was crucial for storing AI-trained models and sequence libraries used by the methods developed by Zhang, Freddolino, and Zheng called D-I-TASSER/DMFold-Multimer. 
  • Given the scale of the CASP targets, the grant-funded compute was augmented with capacity on the Great Lakes cluster. Freddolino and his team took advantage of the allocations provided by the ITS U-M Research Computing Package (UMRCP) and the HITS Michigan Medicine Research Computing Investment (MMRCI) programs, which substantially defrayed the cost of computing.
  • The collaboration tool Slack was used to keep Freddolino and Zheng in close contact with ARC and the DCMB teams. This provided the ability to deal with issues promptly, avoiding delays that would have had a detrimental impact on meeting CASP targets.

Technology staff from ARC, DCMB, and Health Information and Technology Services (HITS) provided assistance to the research team. All of these teams helped mitigate bottlenecks affecting the speed and throughput that Zheng needed for results. Staff also located and helped leverage resources, including those on Great Lakes, by utilizing available partitions and queues on the clusters.

“Having the flexibility and capacity provided by Great Lakes was instrumental in meeting competition deadlines,” said Weiss.

DCMB staff and the HITS HPC team took the lead on triaging software problems, giving Freddolino’s group high priority.

ARC Director Brock Palen provided monitoring and guidance on real-time impact and utilization of resources. “It was an honor to support this effort. It has always been ARC’s goal to take care of the technology so researchers can do what they do best. In this case, Freddolino and Zheng knocked it out of the park.”

Jonathan Poisson, technical support manager with DCMB, was instrumental in helping to select and configure the equipment purchased by the grant. “This assistance was crucial in meeting the tight CASP15 targets, as each target is accompanied by a deadline for results.” 

Read more on the Computational Medicine and Bioinformatics website and the Department of Biological Chemistry website.

Related presentation: D-I-TASSER: Integrating Deep Learning with Multi-MSAs and Threading Alignments for Protein Structure Prediction

The resources to achieve these results were provided by an NIH-funded grant (“High-Performance Computing Cluster for Biomedical Research,” SIG: S10OD026825). 

Flux transfer servers will be decommissioned on Jan. 9; use new endpoints

By | Feature, General Interest, News

To provide faster transfers of data from ARC services, ARC will be decommissioning the Flux-Xfer servers on January 9, 2023. You will need to update how you migrate your data. 

For everyone who uses ARC storage services, especially Data Den users: This message is VERY important! This change includes the use of scp from Flux-Xfer, as well as the Globus endpoint umich#flux. Any shared endpoints that you have created on umich#flux will be automatically migrated to a new Globus collection on January 9. Those who use Data Den should take special interest in Item 1 listed below. 

Action item – Use the new endpoints

  1. If you currently use Globus and umich#flux to access your Locker or Data Den volume, you should use the Globus Collection ‘UMich ARC Locker Non-Sensitive Volume Collection’ for Locker, and the Globus Collection ‘UMich ARC Data Den Non-Sensitive Volume Collection’ for Data Den.
  2. If you currently use Globus and umich#flux to access your Turbo volumes, you should use the Globus Collection ‘UMich ARC Turbo Non-Sensitive Volume Collection’.
  3. If you currently use Globus and umich#flux to access other storage volumes, you should use the Globus Collection ‘umich#greatlakes’.
  4. If you currently use scp on flux-xfer to copy data to/from Turbo, you should use ‘globus-xfer1.arc-ts.umich.edu’ (see the example after this list).
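
As an illustration of item 4, an scp copy to Turbo might change as shown below; the uniqname, Turbo volume path, and old hostname are placeholders rather than actual values.

    # Before (decommissioned Jan. 9, 2023): copying to Turbo via a Flux-Xfer server
    # scp ./results.tar.gz uniqname@flux-xfer.arc-ts.umich.edu:/nfs/turbo/example-lab/

    # After: copy through the new transfer server instead
    scp ./results.tar.gz uniqname@globus-xfer1.arc-ts.umich.edu:/nfs/turbo/example-lab/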

User guide 

How can we help you?

For assistance or questions, please contact ARC at arc-support@umich.edu

2023 Winter Maintenance & Globus File Transfer upgrade

By | Feature, General Interest, Great Lakes, HPC, News, Systems and Services

Winter maintenance is coming up! See the details below. Reach out to arc-support@umich.edu with questions or if you need help. 

These services will be unavailable: 

  • Great Lakes – We will be updating Great Lakes on a rolling basis throughout December and the beginning of January; if successful, there should be no downtime or impact, with the following exceptions: 
    • Single-precision GPUs (SPGPU) will be down Jan. 4-5 for networking maintenance. Those nodes will return to production when maintenance has been completed and the nodes have been reloaded.
    • Customers will be notified via email of any changes to Great Lakes maintenance that will require downtime.
    • If unsuccessful, the Great Lakes maintenance will begin Jan. 4-5, starting at 8 a.m. In either case, we will email everyone with the updated maintenance status.
  • Globus on the storage transfer nodes: Jan. 17-18.

Maintenance notes:

  • No downtime for ARC storage systems maintenance (Turbo, Locker, and Data Den).
  • Open OnDemand (OOD) users will need to re-login. Any existing jobs will continue to run and can be reconnected in the OOD portal.
  • Login servers will be updated, and the maintenance should not have any effect on most users. Those who are affected will be contacted directly by ARC. 
  • Copy any data and files that may be needed during maintenance to your local drive using Globus File Transfer before maintenance begins. 
  • Slurm email notifications will be improved, providing more detailed information about completed jobs.

Countdown to maintenance 

For Great Lakes HPC jobs, use the command “maxwalltime” to discover the amount of time remaining until maintenance begins. 

Jobs that request more walltime than remains until maintenance will automatically be queued and start once maintenance is complete. If the plan for Great Lakes maintenance is successful, any queued jobs will be able to run as usual (except for the SPGPU nodes as discussed above). Customers will be notified via email if downtime is required for Great Lakes.

Status updates and additional information

How can we help you?

For assistance or questions, please contact ARC at arc-support@umich.edu.

ARC Director Brock Palen spoke about Globus at SC22

By | General Interest, News

At the recent International Conference for High Performance Computing, Networking, Storage, and Analysis (SC22) in Dallas, Texas, ARC Director Brock Palen spoke about how the University of Michigan is using Globus File Transfer. Globus allows the transfer of data between different storage systems, lab computers, and personal desktops and laptops.

“U-M is the largest public research university in North America by research spend. There are over a hundred top 10 graduate programs, and this drives a lot of interesting collaborations,” said Palen.  

Globus is a robust, cloud-based, file transfer service designed to move many large files, ranging from 10s of GBs to 10s of TBs. ARC is a Globus Subscription Provider for the U-M community, which allows U-M resources to serve as endpoints or collections for file transfers.

“Globus is the tool we use for data transfer, and it’s pretty much ubiquitous across Internet2, the national sites, other academic sites, and it interoperates everywhere,” Palen noted.  

Watch the YouTube video featuring Brock Palen

Check out the Globus user guide on the ARC website.