The U-M Research Computing Package is available to all researchers on all campuses (Ann Arbor, Dearborn, Flint, and Michigan Medicine), including:
Faculty and Staff
All professors and lecturers with an active appointment
Emeritus faculty
Guest faculty who are visiting for one to two years
Principal investigators (e.g., research scientists) on self-funded research activities
Postdocs who have their own research grant
PhD Students
PhD students may qualify for their own UMRCP resources, depending on who oversees their research and on their relationship with their advisor. Students should consult with their PhD program administrator to determine their eligibility. ITS will confirm this status when a UMRCP request is submitted.
Undergraduate and Masters Students
Undergraduate and Masters students do not currently qualify for their own UMRCP. They can be added as users or administrators of another person’s UMRCP. Students can also access other ITS programs such as Great Lakes Course Accounts and Student Teams.
I already have some of the ARC services listed. Do I still need to apply?
Yes. Existing ARC resources are not moved under UMRCP support until the owner of those resources applies. Once the UMRCP is created, existing resources can be migrated to it without moving data or changing workflows.
Can I sign someone else up for the UMRCP?
No. To ensure that we can best meet your needs, it is important that researchers sign up directly for the allocations. It is not possible at this time to sign up on behalf of someone else.
I have multiple students/collaborators in my research team. Can they share resources?
Yes. All UMRCP and ARC resources are designed to be collaborative and multidisciplinary capable. UMRCP recipients can delegate both administrative and user access to anyone with a valid campus uniquename.
I have used up my UMRCP resources, or I need more than the UMRCP provides. Can I get more?
I’m a faculty member who already has the UMRCP. How do I renew?
No action is required. Your no-cost allocations automatically renew on July 1.
I’m a PhD student with my own funded research. How do I renew my UMRCP?
No action is required. Your no-cost allocations automatically renew on July 1.
I’m a PhD, Master’s, or undergraduate student, and my advisor has left the university, but I am continuing my research to finish my dissertation. What should I do?
If your faculty advisor has left the university (or is about to) but you remain, an alternative administrator must be assigned to the account so that the allocations can continue uninterrupted. The administrator must be a staff or faculty member who can verify the ongoing need for the allocation while activities of the former research group are wrapped up. See How do I delegate administrative management of my UMRCP or other ARC resources?
I’m an active faculty member but received a UMRCP offboarding message warning that my resources won’t be renewed. What do I need to do to keep them?
Most likely your MCommunity entry is private or shows another status (e.g., Retired/Emeritus), which makes your active research status ambiguous and in need of confirmation. Reply to the automated message or contact arc-support@umich.edu to confirm your active status and renew your resources for the upcoming year.
How do I make changes if I have already submitted the UMRCP request form?
If you have already submitted the UMRCP request form and you want to modify it, open a ticket by sending an email to ARC Support, arc-support@umich.edu. Be sure to include the name of your project.
How do I delegate administrative management of my UMRCP or other ARC resources?
Visit the ARC Resource Management Portal. On your project overview page, look for Project Admins. Select the Add button and add a trusted university member by entering their uniqname. This is helpful when the project owner is leaving the university but the resources must remain available so students can finish their dissertation work; the newly delegated admin can confirm the ongoing need each June for UMRCP renewal. Admins can make any change to your project, so only add members you trust.
The national HPC resource known as XSEDE has now fully transitioned to Advanced Cyberinfrastructure Coordination Ecosystem: Services & Support (ACCESS).
ACCESS is an open scientific discovery infrastructure combining leadership class resources at partner sites across the United States to create an integrated, persistent computational resource. The National Science Foundation’s ACCESS program builds upon the successes of the 11-year XSEDE project, while also expanding the ecosystem with capabilities for new modes of research and further democratizing participation.
ARC staff is here to answer questions and to help you take advantage of the national resources available through the ACCESS program. Contact arc-support@umich.edu for assistance from ARC staff, Shelly Johnson and Todd Raeker, or visit the ACCESS website for more information.
The Great Lakes Slurm cluster is a campus-wide computing cluster that serves the broad needs of researchers across the university. The Great Lakes HPC Cluster has replaced Flux, the shared research computing cluster that previously served over 300 research projects and 2,500 active users.
The Great Lakes HPC Cluster is available to all researchers on campus for simulation, modeling, machine learning, data science, genomics, and more. The platform provides a balanced combination of computing power, I/O performance, storage capability, and accelerators.
Based on extensive input from faculty and other stakeholders across campus, the Great Lakes HPC Cluster is designed to deliver similar services and capabilities as Flux, including access to GPUs and large-memory nodes and improved support for emerging uses such as machine learning and genomics. The Great Lakes HPC Cluster consists of approximately 13,000 cores.
U-M Research Computing Package
The University of Michigan Research Computing Package (UMRCP) is an investment in the U-M research community, providing simple, dependable access to several ITS-provided high-performance computing clusters and data storage resources. CPU credits are allocated on the Great Lakes cluster and can be used for standard, large-memory, or GPU resources.
Grant Support
See the Grant Resources page for information regarding grant proposals using the Great Lakes HPC Cluster.
LSA-specific Information
See the LSA funding page for information on funding courses at the College of Literature, Science, and the Arts. LSA researchers who do not have access to any other account may be eligible to use the accounts provided centrally by LSA. The usage policy and restrictions on these accounts are described in detail on LSA’s public Great Lakes accounts page.
LSA increased its cost-sharing for the remainder of the 2021-2022 fiscal year.
Questions about access or use of these accounts should be sent to arc-support@umich.edu.
If you would like to create a Great Lakes Cluster account or have any questions, contact arc-support@umich.edu with lists of users, admins, and a shortcode. UMRCP accounts are also available to eligible researchers. For more information, please visit our UMRCP page.
The 2021-22 rates for the Lighthouse Cluster have been approved. These rates represent cost recovery for the Lighthouse Cluster, and do not include any support your unit may choose to provide.
The Lighthouse cluster is meant to support researchers with grants that require the purchase of computing hardware or hardware configurations not available on Great Lakes. Lighthouse allows researchers to work with ARC to purchase their own hardware and run it within the Lighthouse Slurm environment, with ARC providing the data center, staff, networking, storage, and software. More information is available on the Lighthouse webpage. In addition to the hardware purchase cost, there is a monthly charge to support the hardware in the Lighthouse environment.
Lighthouse Service Cost
$743.07/node/year ($61.92/node/month)
COLLEGE OF LITERATURE, SCIENCE, & THE ARTS LIGHTHOUSE SUPPORT
To support research using high-performance computing, the College of Literature, Science, and the Arts (LSA) will cost-share the monthly operating charge for researchers paying on LSA unit shortcodes. For fiscal years 2022-23, 2023-24, and 2024-25, LSA will provide an 88% subsidy on the monthly service charge for existing LSA Lighthouse nodes. Replacement nodes will also be cost-shared for this period, provided the replacement is a 1:1 replacement with similar or lower physical space, power, and cooling requirements. Any additional Lighthouse nodes will not be cost-shared with the College. The table below provides an estimate of Lighthouse operating expenses using published ITS rates for FY2021-22. The rates for LSA researchers will change, and the table will be updated, if ITS rates change. LSA does not anticipate continuing the Lighthouse subsidy after FY2025 due to the small number of people who use the service across U-M and LSA. Please plan accordingly for the change in cost that would occur at the start of FY2026 if you continue to use Lighthouse.
The University of Michigan Research Computing Package (UMRCP), provided by ITS, is an investment in the U-M research community, providing simple, dependable access to several high-performance computing clusters and data storage resources.
As a research faculty member at the University of Michigan, you are very likely eligible for Advanced Research Computing (ARC) resources through the University of Michigan Research Computing Package program.
This program makes research computing resources widely available to eligible faculty and researchers. Every UMRCP is customized:
The specific UMRCP allocations available to you depend on the selections you make in the UMRCP request form.
Once your selections are approved and the resources are created, each research project enrolled in the UMRCP program has one or more cost-covered annual allocations of research computing or storage resources from ARC.
Some research scientists and PhD students are also eligible for U-M Research Computing Package allocations with ARC. The UMRCP program is designed so that most U-M researchers will have access to at least one set of UMRCP resources.
What is included with the UMRCP
The U-M Research Computing Package provides:
80,000 CPU HOURS OF HIGH-PERFORMANCE COMPUTING
CPU credits are allocated on the Great Lakes cluster or the Armis2 protected-data cluster; they can be used for standard, large-memory, or GPU resources and are renewed every July 1. Please note: if you have an existing high-performance computing account, it cannot be included under the UMRCP. Any request for computing hours through the UMRCP requires the creation of a new account.
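To make the allocation concrete, here is a minimal sketch of a Slurm batch script that charges a job against a UMRCP account on Great Lakes. The account name (uniqname0), module name, and program are hypothetical placeholders, and the partition choices follow the standard/large-memory/GPU resource types described above.

```bash
#!/bin/bash
# Minimal sketch of a Slurm batch script charged to a UMRCP account.
# The account name, module, and program below are placeholders.
#SBATCH --job-name=umrcp-example
#SBATCH --account=uniqname0      # hypothetical UMRCP Slurm account
#SBATCH --partition=standard     # large-memory or GPU jobs use other partitions
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=1
#SBATCH --cpus-per-task=4
#SBATCH --mem=8g
#SBATCH --time=02:00:00          # walltime (hh:mm:ss)

module load python               # illustrative module name
python my_analysis.py            # placeholder workload
```

Submitting this with sbatch deducts the CPU hours the job consumes from the allocation named in --account.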
10 TB OF REPLICATED, HIGH-PERFORMANCE STORAGE
The Turbo Research Storage service can be used on many ARC/ITS, unit, Globus.org, and laboratory systems. Turbo is best for short-term, active storage that researchers access regularly. Capacity can be split across sensitive and non-sensitive science data.
100 TB OF ARCHIVE STORAGE
Replicated, durable storage for massive data volumes. The Data Den Research Archive storage service also supports reliable data transfer and international data sharing via Globus.org. Capacity can be split across sensitive and non-sensitive science data. Data Den is best for storing data that researchers do not need regularly, but need to retain because of NIH and NSF grant requirements.
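Since Data Den supports data transfer via Globus.org, moving a dataset into the archive might look like the following Globus CLI sketch. The endpoint UUIDs and paths are hypothetical placeholders, not real ARC collection IDs.

```bash
# Hypothetical Globus CLI sketch: archive a results directory.
# Endpoint UUIDs and paths are placeholders only.
SRC="aaaaaaaa-1111-2222-3333-444444444444"   # e.g., a Turbo collection
DST="bbbbbbbb-5555-6666-7777-888888888888"   # e.g., a Data Den collection

globus login                                 # authenticate once per session
globus transfer --recursive \
    "$SRC:/project/results/" \
    "$DST:/archive/results/" \
    --label "archive project results"
```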
ADDITIONAL HPC ALLOCATIONS FOR COURSEWORK
Additional allocations are available for instructors and student teams. This is not part of the UMRCP, but these allocations are covered by ITS.
The UMRCP request form allows you to create a customized UMRCP by opting in to the ARC services that will best meet your research needs. If you decide to submit a request, the first step is determining which resources to select; the form includes multiple options for each type of resource. Please note that only the resources specifically selected in the form will be created and allocated for your project.
Determine research needs: HPC, HPC storage, longer-term data retention requirements, data sensitivity
To make sure all ARC resource options that would best meet your research needs are selected in the UMRCP request form, please review the information on this site:
In addition to an annual high-performance computing allocation, for which you can select the Great Lakes cluster, the Armis2 (HIPAA-aligned) cluster, or both, the form also offers high-performance computing storage and archival storage options.
The UMRCP Requesting Resources Guide includes ARC resource overviews, as well as additional information through links, that describe the differences between the options that are available.
Depending on your research needs, you can plan on at least 30 minutes to review the information in the Requesting Resources Guide and complete the request form.
Complete the UMRCP request form
Only the eligible PI can submit the UMRCP request form:
The form validation process creates groups, and if the request is approved, these groups and all allocated ARC resources are linked to the person who is logged in, who must be the eligible primary faculty member or researcher.
After you submit the UMRCP form, an approval process is initiated. You will be contacted if any additional information is needed to determine your eligibility.
Once approved, the creation and allocation process for each ARC resource involves multiple steps for each resource, and coordination between multiple teams.
As each ARC resource allocation that was requested is completed, a notification is sent to let you know that the resource is ready to be accessed.
The notification you receive will include information about your ARC resource.
If after reviewing the information provided in one of these notifications you have any questions, or there are any issues, please feel free to contact us directly.
Manage access to ARC resources provided through the UMRCP
Once UMRCP resource allocations have been created, project owners (PIs) and any designated project administrators can grant access to those ARC resources to any person with a fully active U-M account.
Access to ARC resources is restricted:
Only project owners and any designated ARC project administrators are permitted to manage user access.
Most ARC projects can be managed by project owners and designated administrators in the Resource Management Portal application.
Advanced Research Computing (ARC), a division of ITS, is pleased to offer a pilot called Scientific Computing and Research Consulting Services to help researchers implement data analytics and workflows within their research projects. This includes navigating technical resources like high-performance computing and storage.
The ARC Scientific Computing and Research Consulting Services team will be your guide to navigating the complex technical world: from implementing data-intensive projects, to teaching you how the technical systems work, to identifying the proper tools, to guiding you on how to hire a programmer.
If you have any questions or wish to set up a consultation, please contact us at arc-consulting@umich.edu. Be sure to include as much information as possible from the “Get started” section noted above.
If you have more general questions about ARC services or software, please contact us at arc-support@umich.edu.
We are available to assist researchers along the entire lifecycle of the data workflow, from the conceptual stage to ingest, preprocessing, cleansing, and storage solutions. We can advise in the following areas:
Establishing and troubleshooting dataflows between systems
Selecting the appropriate systems for short-term and long-term storage
Transformation of raw data into structured formats
Data deduplication and cleansing
Conversion of data between different formats to aid in analysis
Automation of dataflow tasks (see the sketch after this list)
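As a minimal illustration of that last area, a scheduled rsync can mirror new instrument output to project storage. The hostname, paths, and schedule below are hypothetical placeholders.

```bash
#!/bin/bash
# Hypothetical dataflow-automation sketch: mirror instrument output
# to project storage. Hostname and paths are placeholders.
set -euo pipefail

SRC="/data/instrument/output/"
DST="labserver.example.umich.edu:/project/raw/"

# --archive preserves permissions and timestamps; --partial lets
# interrupted transfers resume; the log file makes failures auditable.
rsync --archive --partial --compress \
      --log-file="$HOME/dataflow-rsync.log" \
      "$SRC" "$DST"
```

A cron entry such as "0 * * * * $HOME/bin/sync-data.sh" would run the mirror hourly.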
Analytics
The data science consulting team can assist with data analytics to support research:
Choosing the appropriate tools and techniques for performing analysis
Development of data analytics in a variety of frameworks
Cloud-based (Hadoop) analytics development
Machine Learning
Machine learning is an application of artificial intelligence (AI) that focuses on developing computer programs that learn from data.
We are available to consult on the topics in the table below. This can include a general overview of concepts, a discussion of which tools and architectures best fit your needs, or technical support for implementation.
| Language | Tools/Architectures | Models |
|----------|---------------------|--------|
| Python | Python data tools (scikit, numpy, etc.) | Neural networks |
| C++ | TensorFlow | Decision trees |
| Java | Jupyter notebooks | Support vector machines |
| Matlab | | |
Programming
We also provide consulting on programming in a variety of programming languages (including but not limited to C++, Java, and Python) to support your data science needs. We can assist in algorithm design and implementation, as well as in optimizing and parallelizing code to efficiently use high-performance computing (HPC) resources where possible or necessary. We can also help identify available commercial and open-source software packages to simplify your data analysis.
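As one illustration of using HPC resources efficiently, an embarrassingly parallel parameter sweep can be expressed as a Slurm job array, as in the sketch below. The account name, input layout, and program are hypothetical placeholders.

```bash
#!/bin/bash
# Sketch of an embarrassingly parallel sweep as a Slurm job array.
# Account name, file layout, and program are placeholders.
#SBATCH --job-name=param-sweep
#SBATCH --account=uniqname0
#SBATCH --partition=standard
#SBATCH --array=1-50             # 50 independent array tasks
#SBATCH --cpus-per-task=1
#SBATCH --mem=4g
#SBATCH --time=01:00:00

# Each array task handles the one input selected by its index.
./simulate --input  "inputs/case_${SLURM_ARRAY_TASK_ID}.dat" \
           --output "results/case_${SLURM_ARRAY_TASK_ID}.out"
```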
The Great Lakes HPC Cluster is the university-wide, shared computational discovery and high-performance computing (HPC) service. It is designed to support both compute- and data-intensive research.
Great Lakes is available without charge to student teams and organizations that need HPC resources. This program aims to give students access to high-performance computing to advance their team’s mission; it is not meant to be used for faculty-led research. Jobs submitted from a student team account have lower priority and run when sufficient resources are available. Currently, we limit the resources available to each team (see below), but we expect to increase the available resources in the future. If your team or organization needs more resources or access to the large-memory nodes, an administrator for your team can also request a paid Slurm account, which has none of the restrictions mentioned below and can work in conjunction with the no-cost team account.
Access
Your student team/organization must be registered as a Sponsored Student Organization (SSO). If your team is an SSO and would like an account on Great Lakes, please have a sponsoring administrator email us at arc-support@umich.edu with your team name, the uniqnames of the users who are allowed to use the account, and a list of the uniqnames of the account administrators who can make decisions about the account.
If you are a member of a team or organization that isn’t registered as an SSO, please email us at arc-support@umich.edu with the details of your organization and what your HPC needs are.
Limits
Once your account is created, your team will be able to use up to 100 CPUs in the standard partition and 1 GPU in the gpu partition. Jobs are limited to 24 hours. If members of your team do not have experience with the Linux command line, they can also use Great Lakes through their browser via Open OnDemand.
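For a concrete sense of those limits, here is a hypothetical job script that stays within them; the team account name, module, and program are placeholders.

```bash
#!/bin/bash
# Hypothetical student-team job within the stated limits:
# up to 100 CPUs in standard, 1 GPU in gpu, 24-hour walltime.
#SBATCH --job-name=team-train
#SBATCH --account=studentteam0   # placeholder team account
#SBATCH --partition=gpu
#SBATCH --gres=gpu:1             # the team cap is one GPU
#SBATCH --cpus-per-task=8
#SBATCH --mem=32g
#SBATCH --time=24:00:00          # at the 24-hour limit

module load python               # illustrative module name
python train_model.py            # placeholder workload
```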
Billing for the Great Lakes service began on January 6, 2020. Existing, active Flux accounts and logins have been added to the Great Lakes Cluster. Complete this form to get a new Great Lakes cluster login.
If you would like to create a Great Lakes Cluster account or have any questions, contact arc-support@umich.edu with lists of users, admins, and a shortcode. Trial accounts are also available for new PIs.