U-M Research Computing Package automatic renewal begins July 1


The no-cost bundle of supercomputing resources known as the U-M Research Computing Package (UMRCP) automatically renews for most researchers on July 1.

Offered by Information and Technology Services (ITS), the UMRCP provides qualified researchers on all campuses (Ann Arbor, Dearborn, Flint, and Michigan Medicine) with allocations of high-performance computing, secure enclave, and research storage services. (Many units, including Michigan Medicine, provide additional resources to researchers. Be sure to check with your school or college.)

If a faculty researcher has left the university (or is about to) and their research remains at the university, an alternate administrator must be assigned via the ARC Resource Management Portal (RMP) so that the allocations can continue uninterrupted. ARC is available to help researchers make this transition.

Don’t have the UMRCP? Here’s how to request resources 

Faculty, as well as staff and PhD students with their own funded research, on all campuses (Ann Arbor, Dearborn, Flint, and Michigan Medicine), are welcome to request allocations. Full details are available on the Advanced Research Computing website.

PhD researchers who do not have their own funded research can work with their advisor to be added to the advisor’s allocations via the ARC Resource Management Portal (RMP).

“The UMRCP was launched in 2021 to meet the needs of a diversity of disciplines and to provide options for long-term data management, sharing, and protecting sensitive data,” said Brock Palen, director, ITS Advanced Research Computing. “The UMRCP alleviates a lot of the pressure that researchers feel in terms of managing the technology they need to achieve breakthroughs.”

More information

Globus can now be used with Armis2 


Researchers who have an Armis2 High-Performance Computing account can now move data to and from other Protected Health Information (PHI)-approved systems using Globus File Transfer. (The endpoint is umich#armis2.) 

To learn more about your responsibilities and approved services, visit the Sensitive Data Guide and the Protected Health Information (PHI) webpage on the Safe Computing website. Send an email to ARC at arc-support@umich.edu to get started using Globus with PHI on your own system. (This is not needed for researchers using ARC services, including Armis2, Data Den, and Turbo with Sensitive Data.)

“With the addition of Globus on Armis2, researchers using ITS ARC services can use the same Globus tools and processes to securely and reliably move their data on all ARC systems and across the university and beyond,” said Matt Britt, ARC HPC systems manager.

Globus allows the transfer and collaborative access of data between different storage systems, lab computers, and personal desktops and laptops. Globus enables researchers to use a web browser to submit transfer and data synchronization requests between destinations. 

As a robust, cloud-based, file transfer service, Globus is designed to securely move your data, ranging from megabytes to petabytes. ARC is a Globus Subscription Provider for the U-M community, which allows U-M resources to serve as endpoints or collections for file transfers.

“There are many interesting research collaborations happening at U-M, as well as nationally and internationally. Globus can facilitate all of those interactions securely,” said Brock Palen, ARC director. “Globus is the go-to tool we recommend for data transfer.”

Learn more 

How can we help you?

For assistance or questions, contact ARC at arc-support@umich.edu.

Data Den now supports sensitive data


Data Den Research Archive is a service for preserving electronic data generated from research activities. It is a low-cost, highly durable storage system and is the largest storage system operated by ARC. Storage of sensitive data (including HIPAA, PII, and FERPA data) is now supported; visit the Sensitive Data Guide for full details. This service is part of the U-M Research Computing Package (UMRCP), which provides storage allocations to researchers. Most researchers will not have to pay for Data Den.

A disk-caching, tape-backed archive, this storage service is best for data that researchers do not need regularly, but still need to keep because of grant requirements. 

“Data Den is a good place to keep research data past the life of the grant,” said Jeremy Hallum, ARC research computing manager. “ARC can store data that researchers need to keep for five to ten years.” 

Hallum goes on to say that Data Den is only available in a replicated format. “Volumes of data are duplicated between servers or clusters for disaster recovery so research data is very safe.”

Data Den can be part of a well-organized data management plan providing international data sharing, encryption, and data durability. Jerome Kinlaw, ARC research storage lead, said that the Globus File Transfer service works well for data management. “Globus is easy to use for moving data in and out of Data Den.”

The ITS U-M Research Computing Package (UMRCP) provides 100 terabytes (TB) of Data Den storage to qualified researchers. This 100 TB can be divided between restricted and non-restricted variants of Data Den for use as needed. (The ITS Data Storage Finder can help researchers find the right storage solutions to meet their needs.)

“I’m pleased that Data Den now offers options for sensitive data, and that researchers can take advantage of the UMRCP allocations,” said Brock Palen, ARC director. “We want to lighten the load so that researchers can do what they do best, and our services are now more cost effective than ever.”

Globus maintenance happening at 9 a.m. on March 11


Due to planned maintenance by the vendor, Globus services will be unavailable for up to two hours beginning at 9 a.m. U.S. Eastern Time (8 a.m. Central Time) on Saturday, March 11, 2023.

Customers will not be able to authenticate or initiate any transfers during that time. Any transfers started before the outage will be suspended, and will resume once maintenance is complete.

More details are available on the Globus blog.

For assistance or questions, please contact ARC at arc-support@umich.edu.

Protein structure prediction team achieved top rankings


CASP15 is the 15th round of CASP, a biennial community assessment of protein structure modeling methods. Independent assessors compared the submitted models with experimental structures, and the results and their implications were discussed at the CASP15 conference, held in December 2022 in Turkey.

A joint team with members from the labs of Dr. Peter Freddolino and Dr. Yang Zhang took first place in the Multimer and Interdomain Prediction categories, and was again the top-ranked server in the Regular (domains) category according to the CASP assessors’ criteria.

These wins are well-earned. Freddolino noted, “This is a highly competitive event, against some of the very best minds and powerful companies in the world.”

The Zhang/Freddolino team competed against nearly 100 other groups, including teams from other academic institutions as well as major cloud and commercial companies. Groups from around the world submitted more than 53,000 models for 127 modeling targets in five prediction categories.

“Wei’s predictions did amazingly well in CASP15!” said Freddolino. Wei Zheng, Ph.D., is a lab member and a research fellow with the Department of Computational Medicine and Bioinformatics (DCMB).

Zheng said that the team participated in the regular protein structure prediction and protein complex structure prediction categories. “The results are assessed as regular protein domain modeling, regular protein inter-domain modeling, and protein complex modeling. In all categories, our models performed very well!”

The technology that supported this impressive work 

The resources to achieve these results were grant-funded, which allowed the team to leverage a number of university resources, including:  

  • The Lighthouse High-Performance Computing (HPC) cluster service. Lighthouse is managed by the Advanced Research Computing (ARC) team, and ARC is a division of Information and Technology Services (ITS).
  • The Great Lakes HPC cluster, which provided additional compute capacity. The algorithms were GPU-intensive and were run on Great Lakes; graphics processing units (GPUs) are specialized processors originally designed to accelerate graphics rendering. Kenneth Weiss, senior IT project manager with DCMB and HITS, said that many of the algorithms used by Zheng benefited from the increased performance of computing the data on a GPU.
  • Multiple storage systems, including Turbo Research Storage. High-speed storage was crucial for storing AI-trained models and sequence libraries used by the methods developed by Zhang, Freddolino, and Zheng called D-I-TASSER/DMFold-Multimer. 
  • Allocations from the ITS U-M Research Computing Package (UMRCP) and the HITS Michigan Medicine Research Computing Investment (MMRCI) programs. Given the scale of the CASP targets, the grant-funded hardware was augmented with capacity from the Great Lakes cluster, and these allocations substantially defrayed the cost of that computing.
  • The collaboration tool Slack was used to keep Freddolino and Zheng in close contact with ARC and the DCMB teams. This provided the ability to deal with issues promptly, avoiding delays that would have had a detrimental impact on meeting CASP targets.

Technology staff from ARC, DCMB, and Health Information and Technology Services (HITS) assisted the research team. All three teams helped mitigate bottlenecks affecting the speed and throughput that Zheng needed for results. Staff also located and helped leverage resources on Great Lakes, utilizing available partitions and queues on the clusters.

“Having the flexibility and capacity provided by Great Lakes was instrumental in meeting competition deadlines,” said Weiss.

DCMB staff and the HITS HPC team took the lead on triaging software problems, giving Freddolino’s group high priority.

ARC Director Brock Palen provided monitoring and guidance on real-time impact and utilization of resources. “It was an honor to support this effort. It has always been ARC’s goal to take care of the technology so researchers can do what they do best. In this case, Freddolino and Zheng knocked it out of the park.”

Jonathan Poisson, technical support manager with DCMB, was instrumental in helping to select and configure the equipment purchased by the grant. “This assistance was crucial in meeting the tight CASP15 targets, as each target is accompanied by a deadline for results.” 

Read more on the Computational Medicine and Bioinformatics website and the Department of Biological Chemistry website.

Related presentation: D-I-TASSER: Integrating Deep Learning with Multi-MSAs and Threading Alignments for Protein Structure Prediction

The resources to achieve these results were provided by an NIH-funded grant (“High-Performance Computing Cluster for Biomedical Research,” SIG: S10OD026825). 

Dailey receives U-M Robotics’ first-ever alumni award 


Meghan Dailey will present “The Future of Machine Learning in Robotics” on September 23 at 2 p.m. in the Ford Robotics Building (FMCRB) or via Zoom.

Meghan Dailey is the U-M Robotics department’s first Alumni Merit Award recipient!

Dailey is a member of the first-ever class in U-M Robotics. She earned a Master of Science degree in 2015 with a focus in artificial intelligence. She is currently a machine learning specialist with Advanced Research Computing (ARC), a division of Information and Technology Services (ITS).

You’re invited 

In honor of the award, Dailey will be presenting “The Future of Machine Learning in Robotics” on Friday, September 23, 2 p.m., Ford Robotics Building (FMCRB) or on Zoom (meeting ID: 961 1618 4387, passcode: 643563). Machine learning is becoming widely prevalent in many different fields, including robotics. In a future where robots and humans assist each other in completing tasks, what is the role of machine learning, and how should it evolve to effectively serve both humans and robots? Dailey will discuss her past experiences in robotics and machine learning, and how she envisions machine learning contributing to the growth of the robotics field.

About Dailey

A member of the ARC Scientific Computing and Research Consulting Services team, Dailey helps researchers with machine learning and artificial intelligence programming. She has consulted with student and faculty teams to build neural networks for image analysis and classification. She also has extensive experience in natural language processing and has worked on many projects analyzing text sentiment and intent.

Image courtesy U-M Robotics

Maintenance on May 3: Great Lakes, Armis2, and Lighthouse 


On Tuesday, May 3, from 7-7:30 a.m., ARC will perform a vendor upgrade to Slurm on Great Lakes, Armis2, and Lighthouse HPC Clusters and their storage systems. ARC apologizes for the inconvenience.

Impact

  • This maintenance will cause a Slurm outage from 7-7:30 a.m.

  • Access to the cluster file systems will be available during the update.

  • Jobs that cannot complete before 7 a.m. on May 3 will not start until maintenance has ended; they will become eligible to start once maintenance is complete.
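The scheduling behavior described above follows a simple rule: a job starts before the window only if its requested walltime fits entirely ahead of the maintenance start. A minimal sketch in Python, assuming the times shown (the year 2022 and the `earliest_start` helper are illustrative, not part of Slurm):

```python
from datetime import datetime, timedelta

# Maintenance window from the announcement (year assumed for illustration).
MAINT_START = datetime(2022, 5, 3, 7, 0)   # 7:00 a.m., May 3
MAINT_END = datetime(2022, 5, 3, 7, 30)    # 7:30 a.m., May 3

def earliest_start(now: datetime, walltime: timedelta) -> datetime:
    """Earliest time a job could start under the rule above: it must
    either finish before maintenance begins, or wait until it ends."""
    if now + walltime <= MAINT_START:
        return now                 # fits entirely before the window
    return max(now, MAINT_END)     # otherwise it waits for the window to close

# A 20-minute job submitted at 6:30 a.m. can run immediately,
# but a 2-hour job submitted at the same time waits until 7:30 a.m.
print(earliest_start(datetime(2022, 5, 3, 6, 30), timedelta(minutes=20)))
print(earliest_start(datetime(2022, 5, 3, 6, 30), timedelta(hours=2)))
```

In practice, requesting a shorter `--time` limit when submitting can let a job squeeze in before a maintenance window rather than waiting behind it.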

Status updates

Check the ITS Service Status page and follow ARC on Twitter for progress updates.

How can we help you?

For assistance or questions, please contact ARC at arc-support@umich.edu, or visit Virtual Drop-in Office Hours (CoderSpaces) for hands-on help, available 9:30-11 a.m. and 2-3:30 p.m. on Tuesdays; 1:30-3 p.m. on Wednesdays; and 2-3:30 p.m. on Thursdays.

For other topics, contact the ITS Service Center.

Understanding the strongest electromagnetic fields in the universe


Alec Thomas is part of the team from the U-M College of Engineering Gérard Mourou Center for Ultrafast Optical Science that is building the most powerful laser in the U.S.

Dubbed “ZEUS,” the laser will produce 3 petawatts of power, a 3 followed by 15 zeros (in watts). All the power generated in the entire world totals about 10 terawatts, 300 times less than the ZEUS laser.

The team’s goal is to use the laser to explore how matter behaves in the most extreme electric and magnetic fields in the universe, and also to generate new sources of radiation beams, which may lead to developments in medicine, materials science, and national security. 

A simulation of a plasma wake.

This simulation shows a plasma wake behind a laser pulse. The plasma behaves like water waves generated behind a boat. In this image, the “waves” are extremely hot plasma matter, and the “boat” is a short burst of powerful laser light. (Image courtesy of Daniel Seipt.)

“In the strong electric fields of a petawatt laser, matter becomes ripped apart into a ‘plasma,’ which is what the sun is made of. This work involves very complex and nonlinear physical interactions between matter particles and light. We create six-dimensional models of particles to simulate how they might behave in a plasma in the presence of these laser fields to learn how to harness it for new technologies. This requires a lot of compute power,” Thomas said.

That compute power comes from the Great Lakes HPC cluster, the university’s fastest high-performance computing cluster. The team wrote equations of motion for each six-dimensional particle; the equations run on Great Lakes and help Thomas and his team learn how the particles might behave within a simulation cell. Once the motion is understood, solutions can be developed.
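To give a flavor of what solving those equations of motion involves: plasma simulation codes typically advance each charged particle through the electric and magnetic fields with a scheme such as the Boris push. The sketch below is a minimal, non-relativistic illustration in Python with made-up field values and normalized units; it is not the group’s actual code.

```python
import math

def boris_push(v, E, B, qm, dt):
    """Advance a particle's velocity one timestep through fields E and B
    using the standard Boris scheme (non-relativistic, normalized units).
    v, E, B are 3-vectors; qm is the charge-to-mass ratio."""
    # Half acceleration from the electric field.
    vm = [v[i] + 0.5 * qm * dt * E[i] for i in range(3)]
    # Rotation around the magnetic field direction.
    t = [0.5 * qm * dt * B[i] for i in range(3)]
    t2 = sum(c * c for c in t)
    s = [2.0 * c / (1.0 + t2) for c in t]
    vprime = [vm[0] + (vm[1] * t[2] - vm[2] * t[1]),
              vm[1] + (vm[2] * t[0] - vm[0] * t[2]),
              vm[2] + (vm[0] * t[1] - vm[1] * t[0])]
    vp = [vm[0] + (vprime[1] * s[2] - vprime[2] * s[1]),
          vm[1] + (vprime[2] * s[0] - vprime[0] * s[2]),
          vm[2] + (vprime[0] * s[1] - vprime[1] * s[0])]
    # Second half acceleration from the electric field.
    return [vp[i] + 0.5 * qm * dt * E[i] for i in range(3)]

# With E = 0 and a uniform B field, the push is a pure rotation, so the
# particle gyrates and its speed is conserved, a hallmark of the scheme.
v = [1.0, 0.0, 0.0]
for _ in range(1000):
    v = boris_push(v, E=[0.0, 0.0, 0.0], B=[0.0, 0.0, 1.0], qm=1.0, dt=0.05)
print(round(math.sqrt(sum(c * c for c in v)), 6))  # prints 1.0
```

Production codes do this relativistically for billions of particles per step, which is why GPU-heavy clusters like Great Lakes matter for this work.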

“On the computing side, this is a very complex physical interaction. Great Lakes is designed to handle this type of work,” said Brock Palen, director of Advanced Research Computing, a division of Information and Technology Services. 

Thomas has signed up for allocations on the Great Lakes HPC cluster and Data Den storage. “I just signed up for the no-cost allocations offered by the U-M Research Computing Package. I am planning to use those allocations to explore ideas and concepts in preparation for submitting grant proposals.”

Learn more and sign up for the no-cost U-M Research Computing Package (UMRCP).

Prof. Thomas’ work is funded by a grant from the National Science Foundation.

Yottabyte (Blue) to retire April 2022  


The Yottabyte Research Cloud (YBRC), powered by Verge.io, provides U-M researchers with high-performance, secure, and flexible computing environments that enable the analysis of data sets and the hosting of databases for research purposes. Yottabyte (Blue) will retire on April 4, 2022. Yottabyte (Maize), for sensitive data, will continue to be offered as a service.

To determine if a virtual server is hosted in YBRC ‘Blue,’ check the hostname for the word ‘blue’ in its name, such as ‘yb-hostname.blue.ybrc.umich.edu.’
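For teams checking many machines, that hostname test can be scripted. The sketch below is a hypothetical Python illustration based only on the naming convention described above; the Maize branch assumes the convention is symmetric, which is an assumption, and the hostnames are made up.

```python
def ybrc_environment(fqdn: str) -> str:
    """Classify a YBRC virtual server by the environment label embedded
    in its fully qualified domain name, per the '.blue.' convention.
    (The 'maize' label is an assumed, symmetric convention.)"""
    labels = fqdn.lower().split(".")
    if "blue" in labels:
        return "Blue (retiring April 4, 2022)"
    if "maize" in labels:
        return "Maize (continuing service)"
    return "not a recognized YBRC hostname"

print(ybrc_environment("yb-hostname.blue.ybrc.umich.edu"))
# prints: Blue (retiring April 4, 2022)
```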

Members of the ARC Yottabyte team or unit IT support staff will reach out to customers before the end of 2021 to determine customer needs and develop migration plans. Customers should review the data and projects currently utilizing Yottabyte (Blue) and delete anything that is not needed.

Visit the YBRC webpage on the ARC website for additional information about the retirement.

Leveraging technology to improve education outcomes


Researchers from U-M campuses and across the country are using education data provided by the State of Michigan to study a wide variety of topics, ranging from the effects of COVID-19 on public school enrollment, to the role of neighborhood instability in student educational outcomes, to the ways that financial assets can change youth’s lives.

An arm of the Education Policy Initiative (EPI), the Michigan Education Data Center (MEDC) is a secure data clearinghouse that helps researchers use the State of Michigan’s education data to answer critical questions that improve outcomes for students. 

“Improving public education is one of the most pressing challenges facing our country today,” stated Kyle Kwaiser, EPI data architect and manager. “We’re using tools meant for research support, but using them as the foundation of a data clearinghouse serving researchers nationwide, and they’re working well.”

“Our researchers cover a breadth of topics for which Michigan education data are being used across all campuses. We think that the findings of these projects are powerful or will be very useful to policymakers,” said Nicole Wagner Lam, associate director for the Education Policy Initiative, the MEDC sponsor. 

Lam goes on to say that there are currently about 60 active research projects, about half of which are being conducted by U-M researchers. Researchers affiliated with U-M or Michigan State University leverage this restricted data, which is stored on Turbo Research Storage provided by ARC.

Researchers also need a secure way to transfer, store, and analyze restricted data. MEDC affiliates also use Globus File Transfer and Yottabyte Research Cloud (YBRC) along with Turbo. Together, these three services enable productive and impactful research. 

Steve Wolodkin, ARC research cloud designer, says that, as a private cloud environment, YBRC gives researchers both good data protection and a familiar desktop environment. It also provides an easy mechanism to give multiple users access to the same desktop. Further, it is easy to add and remove users, which benefits this group particularly, as many of the people using it are students and change regularly. 

Jeremy Hallum, ARC research computing manager, explains how it all works together. YBRC provides a Windows virtual machine pool with various statistical software, configured in a way that supports this group’s research. User profiles and shared storage are integrated with Turbo storage, which allows researchers to access their data on any machine that they use. Globus is designed to move many large files, ranging from tens of gigabytes to tens of terabytes.

Lam said, “We are lucky to be at U-M and to work with researchers from all over the country. There are a lot of low- or no-cost resources at U-M to leverage, and many units that provide support.”

Kwaiser says that MEDC worked closely with ARC two years ago when the center was getting started. ARC helped create the structure, ensure security, and train MEDC staff on how to use the services. It was a lot of work to get started; now, everything is running smoothly. When needed, staff attend one of four weekly office hours or an Office of Research training session.

Several other ITS services have also been valuable for database hosting, access control, security, and VPN network connections. MiDatabase is a hosting service that provides campus with a centrally managed, on-premises cloud environment that reduces the cost, risk, and overhead involved in running services independently. MCommunity APIs are used to monitor and control access: to see who is logging on and when, to revoke access, and to control access to ARC resources. ARC services are core to the work, but the MEDC team also relies on other ITS services. ITS Information Assurance was instrumental during startup, particularly in gaining data access approvals from the State of Michigan. ITS worked with MEDC and the State of Michigan to set up a standing Virtual Private Network (VPN) connection.

“I knew we could make solid progress with these ITS experts on board. I appreciate being able to work directly with them. It took months to gain trust from all parties when we were setting up, and now the resulting research is amazing,” Kwaiser said. “We’ve been able to get really far with resources at U-M.”