Technology supports researchers’ quest to understand parental discipline behaviors


Image by Rajesh Balouria from Pixabay

How do different types of parental discipline behaviors affect children’s development in low- and middle-income countries (LMICs)? A group of researchers set out to answer that question using a large UNICEF data set covering several hundred thousand families, drawn from the fourth (2009–2013) and fifth (2012–2017) rounds of the UNICEF Multiple Indicator Cluster Surveys.

“The majority of parenting research is conducted in higher income and Westernized settings. We need more research that shows what types of parenting behaviors are most effective at promoting children’s development in lower resourced settings outside of the United States. I wanted to conduct an analysis that provided helpful direction for families and policymakers in LMICs regarding what parents can do to raise healthy, happy children,” said Kaitlin Paxton Ward, People Analytics Researcher at Google and Research Affiliate at the University of Michigan.

Dr. Paxton Ward is the lead author of the recently released paper, “Associations between 11 parental discipline behaviors and child outcomes across 60 countries.” Her co-authors are Andrew Grogan-Kaylor, Julie Ma, Garrett T. Pace, and Shawna Lee.

Together, they tested associations between 11 parental discipline behaviors and child outcomes (aggression, distraction, and prosocial peer relations) for children under five years old in 60 LMICs:

  • Verbal reasoning (i.e., explaining why the misbehavior was wrong)
  • Shouting
  • Name calling
  • Shaking
  • Spanking
  • Hitting/slapping the body
  • Hitting with an object 
  • Beating as hard as one could
  • Removing privileges 
  • Explaining
  • Giving the child something else to do

Results

Verbal reasoning and shouting were the most common parental discipline behaviors towards young children. Psychological and physical aggression were associated with higher child aggression and distraction. Verbal reasoning was associated with lower odds of aggression, and higher odds of prosocial peer relations. Taking away privileges was associated with higher odds of distraction, and lower odds of prosocial peer relations. Giving the child something else to do was associated with higher odds of distraction. The results indicated that there was some country-level variation in the associations between parenting behaviors and child socioemotional outcomes, but also that no form of psychological or physical aggression benefitted children in any country.

Conclusion 

Parental use of psychological and physical aggression was disadvantageous for children’s socioemotional development across countries. Only verbal reasoning was associated with positive child socioemotional development. The authors suggest that greater emphasis should be placed on reducing parental use of psychological and physical aggression across cultural contexts, and on increasing parental use of verbal reasoning.

The technology used to analyze the data

The researchers relied on a complex Bayesian multilevel model. This type of analysis incorporates knowledge from previous studies to inform the current analysis, and it also let the researchers examine variation across countries in more detail. To accomplish this task, the team turned to ITS Advanced Research Computing (ARC) and the Great Lakes High-Performance Computing Cluster. Great Lakes is the largest and fastest HPC service on U-M’s campus.
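
The paper’s exact model specification is not reproduced here, but a minimal sketch of a Bayesian multilevel logistic model with country-level effects, written in Python with PyMC, illustrates the general approach. The variable names, priors, and synthetic data below are illustrative assumptions, not the authors’ specification.

```python
import numpy as np
import pymc as pm

# Illustrative synthetic data: one binary child outcome (e.g., aggression),
# one binary discipline behavior (e.g., spanking), and a country index.
rng = np.random.default_rng(0)
n_children, n_countries = 2000, 60
country = rng.integers(0, n_countries, size=n_children)
spanking = rng.integers(0, 2, size=n_children)
aggression_obs = rng.integers(0, 2, size=n_children)

with pm.Model() as model:
    # Population-level (fixed) effects
    intercept = pm.Normal("intercept", mu=0.0, sigma=1.0)
    beta_spank = pm.Normal("beta_spank", mu=0.0, sigma=1.0)

    # Country-level (random) slopes: partial pooling across the 60 countries
    sigma_country = pm.HalfNormal("sigma_country", sigma=0.5)
    country_offset = pm.Normal(
        "country_offset", mu=0.0, sigma=sigma_country, shape=n_countries
    )

    # Logistic regression for the binary outcome
    logit_p = intercept + (beta_spank + country_offset[country]) * spanking
    pm.Bernoulli("aggression", logit_p=logit_p, observed=aggression_obs)

    # Sampling models like this across many outcomes, behaviors, and hundreds
    # of thousands of families is what motivates running on an HPC cluster.
    idata = pm.sample(1000, tune=1000, chains=4, cores=4)
```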

“I know for me as a parent of young children, you want the best outcome. I have known people to grow up with different forms of discipline and what the negative or positive influence of those are,” said Brock Palen, ARC director. 

The researchers also created a visual interpretation of their paper for public outreach using a web app called ArcGIS StoryMaps. This software helps researchers tell the story of their work. With no coding required, StoryMaps combine images, text, audio, video, and interactive maps in a captivating web experience. StoryMaps can be shared with groups of users, with an organization, or with the world. 

All students, faculty, and staff have access to ArcGIS StoryMaps. Since 2014, members of the U-M community have authored over 7,500 StoryMaps, and the number produced each year continues to grow. Explore examples of how people around the world are using this technology in the StoryMaps Gallery.

“This intuitive software empowers the U-M community to author engaging, multimedia, place-based narratives, without involving IT staff,” said Peter Knoop, research consultant with LSA Technology Services. 

Correspondence to Dr. Kaitlin Paxton Ward, kpward@umich.edu.


Secure Enclave Service rate approved, shortcode needed by July 25 


The Yottabyte Research Cloud (YBRC) migrated to the Secure Enclave Services (SES) in 2022. The new system provides improved performance for researcher workloads. As part of this transition, on July 1, 2023, ARC began billing researchers who consume more than 16 gigabytes (GB) of RAM (memory) per month.

The first 16 GB of RAM (memory) is covered by the U-M Research Computing Package (UMRCP). If you have not already requested or been granted the UMRCP, learn more and request it on the UMRCP service page on the ARC website.

Approved rate 

The approved rate for a Secure Enclave Services machine is $7.00 per GB of RAM (memory) per machine, per month. Visit the Rates page on the ARC website for information about billing for all ARC services. 
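
As a worked example (illustrative only): a Secure Enclave Services machine configured with 32 GB of RAM, with the full 16 GB UMRCP allocation applied to it, would be billed for the remaining 16 GB, or $112.00 per month (16 GB × $7.00). This assumes no additional school or college subsidy; see below.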

Action requested: Submit a shortcode 

A shortcode is needed to accommodate billing for any resources consumed that are not covered by the UMRCP. Please submit a shortcode no later than July 25, 2023. Access to your machine will be removed or reduced if a shortcode is not on file by July 25. Contact us at arc-support@umich.edu to submit your shortcode, or make any changes to the configuration or use of your machines. 

Some schools and colleges (including the U-M Medical School) are subsidizing the use of Secure Enclave Services beyond the 16 GB of RAM (memory). Talk to your unit’s IT staff or email ARC to learn more. 

Contact ARC (arc-support@umich.edu) if you would like to meet with the ARC storage manager to ask questions or get clarification.

U-M Research Computing Package automatic renewal begins July 1


The no-cost bundle of supercomputing resources known as the U-M Research Computing Package (UMRCP) automatically renews for most researchers on July 1.

Provided by Information and Technology Services, the UMRCP gives qualified researchers on all campuses (Ann Arbor, Dearborn, Flint, and Michigan Medicine) allocations of high-performance computing, secure enclave, and research storage services. (Many units, including Michigan Medicine, provide additional resources to researchers. Be sure to check with your school or college.)

If a faculty researcher has left the university (or is about to) and their research remains at the university, an alternative administrator must be assigned via the ARC Resource Management Portal (RMP) so that the allocations can continue uninterrupted. ARC is available to help researchers make this transition.

Don’t have the UMRCP? Here’s how to request resources 

Faculty, as well as staff and PhD students with their own funded research on all campuses (Ann Arbor, Dearborn, Flint, and Michigan Medicine), are welcome to request allocations. Full details are available on the Advanced Research Computing website.

PhD researchers who do not have their own funded research can work with their advisor to be added to the advisor’s allocations via the ARC Resource Management Portal (RMP).

“The UMRCP was launched in 2021 to meet the needs of a diversity of disciplines and to provide options for long-term data management, sharing, and protecting sensitive data,” said Brock Palen, director, ITS Advanced Research Computing. “The UMRCP alleviates a lot of the pressure that researchers feel in terms of managing the technology they need to achieve breakthroughs.”


Globus can now be used with Armis2 


Researchers who have an Armis2 High-Performance Computing account can now move data to and from other Protected Health Information (PHI)-approved systems using Globus File Transfer. (The endpoint is umich#armis2.) 

To learn more about your responsibilities and approved services, visit the Sensitive Data Guide and the Protected Health Information (PHI) webpage on the Safe Computing website. Send an email to ARC at arc-support@umich.edu to get started using Globus with PHI on your own system (this is not needed for researchers using ARC services, including Armis2, and Data Den and Turbo with Sensitive Data).

“With the addition of Globus on Armis2, researchers using ITS ARC services can use the same Globus tools and processes to securely and reliably move their data on all ARC systems and across the university and beyond,” said Matt Britt, ARC HPC systems manager.

Globus allows the transfer of, and collaborative access to, data across different storage systems, lab computers, and personal desktops and laptops. Researchers can use a web browser to submit transfer and data synchronization requests between destinations.

As a robust, cloud-based, file transfer service, Globus is designed to securely move your data, ranging from megabytes to petabytes. ARC is a Globus Subscription Provider for the U-M community, which allows U-M resources to serve as endpoints or collections for file transfers.
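
For researchers who prefer to script transfers rather than use the web interface, the Globus Python SDK can submit the same kinds of requests. The sketch below is illustrative only: the client ID, collection UUIDs, and paths are placeholders you would replace with your own values (for example, the UUID of the umich#armis2 collection, found in the Globus web app), and it assumes a registered native app client.

```python
import globus_sdk

# Placeholder values (assumptions): substitute your own registered app client
# ID and the UUIDs of the source and destination collections.
CLIENT_ID = "YOUR-NATIVE-APP-CLIENT-ID"
SOURCE_COLLECTION = "SOURCE-COLLECTION-UUID"
DEST_COLLECTION = "DEST-COLLECTION-UUID"

# Interactive login: prints a URL to visit, then prompts for the code shown.
auth_client = globus_sdk.NativeAppAuthClient(CLIENT_ID)
auth_client.oauth2_start_flow()
print("Log in at:", auth_client.oauth2_get_authorize_url())
tokens = auth_client.oauth2_exchange_code_for_tokens(input("Code: ").strip())
transfer_tokens = tokens.by_resource_server["transfer.api.globus.org"]

# Build a transfer client and submit a one-item transfer request.
tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer(transfer_tokens["access_token"])
)
task = globus_sdk.TransferData(
    source_endpoint=SOURCE_COLLECTION,
    destination_endpoint=DEST_COLLECTION,
    label="Example transfer",
)
task.add_item("/path/on/source/data.tar", "/path/on/destination/data.tar")
result = tc.submit_transfer(task)
print("Submitted task:", result["task_id"])
```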

“There are many interesting research collaborations happening at U-M, as well as nationally and internationally. Globus can facilitate all of those interactions securely,” said Brock Palen, ARC director. “Globus is the go-to tool we recommend for data transfer.”


How can we help you?

For assistance or questions, contact ARC at arc-support@umich.edu.

Data Den now supports sensitive data


Data Den Research Archive is a service for preserving electronic data generated from research activities. It is a low-cost, highly durable storage system and is the largest storage system operated by ARC. Storing sensitive data (including HIPAA, PII, and FERPA data) is now supported (visit the Sensitive Data Guide for full details). The service is part of the U-M Research Computing Package (UMRCP), which provides storage allocations to researchers, so most researchers will not have to pay for Data Den.

A disk-caching, tape-backed archive, this storage service is best for data that researchers do not need regularly, but still need to keep because of grant requirements. 

“Data Den is a good place to keep research data past the life of the grant,” said Jeremy Hallum, ARC research computing manager. “ARC can store data that researchers need to keep for five to ten years.” 

Hallum goes on to say that Data Den is only available in a replicated format. “Volumes of data are duplicated between servers or clusters for disaster recovery so research data is very safe.”

Data Den can be part of a well-organized data management plan providing international data sharing, encryption, and data durability. Jerome Kinlaw, ARC research storage lead, said that the Globus File Transfer service works well for data management. “Globus is easy to use for moving data in and out of Data Den.”

The ITS U-M Research Computing Package (UMRCP) provides 100 terabytes (TB) of Data Den storage to qualified researchers. This 100 TB can be divided between restricted and non-restricted variants of Data Den for use as needed. (The ITS Data Storage Finder can help researchers find the right storage solutions to meet their needs.)

“I’m pleased that Data Den now offers options for sensitive data, and that researchers can take advantage of the UMRCP allocations,” said Brock Palen, ARC director. “We want to lighten the load so that researchers can do what they do best, and our services are now more cost effective than ever.”

Globus maintenance happening at 9 a.m. on March 11


Due to planned maintenance by the vendor, Globus services will be unavailable for up to two hours beginning at 9 a.m. U.S. Eastern Time (8 a.m. Central Time) on Saturday, March 11, 2023.

Customers will not be able to authenticate or initiate any transfers during that time. Transfers already in progress when the outage begins will be stalled and will resume once maintenance is complete.

More details are available on the Globus blog.

For assistance or questions, please contact ARC at arc-support@umich.edu.

Protein structure prediction team achieved top rankings


CASP15 is the latest round of a biennial community experiment that assesses methods of protein structure modeling. Independent assessors compared the submitted models with experimentally determined structures, and the results and their implications were discussed at the CASP15 Conference, held in December 2022 in Turkey.

A joint team with members from the labs of Dr. Peter Freddolino and Dr. Yang Zhang took first place in the Multimer and Interdomain Prediction categories, and was again the top-ranked server in the Regular (domains) category according to the CASP assessor’s criteria.

These wins are well-earned. Freddolino noted, “This is a highly competitive event, against some of the very best minds and powerful companies in the world.”

The Zhang/Freddolino team competed against nearly 100 other groups, including other academic institutions as well as major cloud and commercial companies. Groups from around the world submitted more than 53,000 models for 127 modeling targets in five prediction categories.

“Wei’s predictions did amazingly well in CASP15!” said Freddolino. Wei Zheng, Ph.D., is a lab member and a research fellow with the Department of Computational Medicine and Bioinformatics (DCMB).

Zheng said that the team participated in the regular protein structure prediction and protein complex structure prediction categories. “The results are assessed as regular protein domain modeling, regular protein inter-domain modeling, and protein complex modeling. In all categories, our models performed very well!”

The technology that supported this impressive work 

The work was grant-funded, which allowed the team to leverage a number of university resources, including:

  • The Lighthouse High-Performance Computing (HPC) Cluster service. Lighthouse is managed by the Advanced Research Computing (ARC) team, and ARC is a division of Information and Technology Services (ITS).
  • The Great Lakes HPC Cluster, where the team’s GPU-intensive algorithms ran and which provided additional compute capacity. Graphics processing units (GPUs) are specialized processors originally designed to accelerate graphics rendering and now widely used to accelerate machine learning workloads. Kenneth Weiss, IT project manager senior with DCMB and HITS, said that many of the algorithms used by Zheng benefited from the increased performance of computing the data on a GPU. (A minimal sketch of this GPU-offload pattern appears after this list.)
  • Multiple storage systems, including Turbo Research Storage. High-speed storage was crucial for holding the AI-trained models and sequence libraries used by D-I-TASSER/DMFold-Multimer, the methods developed by Zhang, Freddolino, and Zheng.
  • Allocations from the ITS U-M Research Computing Package (UMRCP) and the HITS Michigan Medicine Research Computing Investment (MMRCI) programs. Given the scale of the CASP targets, the grant-funded compute was augmented with capacity on the Great Lakes cluster, and these allocations substantially defrayed the cost of computing.
  • The collaboration tool Slack, which kept Freddolino and Zheng in close contact with the ARC and DCMB teams. This made it possible to deal with issues promptly, avoiding delays that would have had a detrimental impact on meeting CASP targets.
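
The team’s D-I-TASSER/DMFold-Multimer code is not reproduced here, but a minimal Python sketch shows the general GPU-offload pattern that such deep-learning pipelines rely on: the model and its inputs are moved to a GPU when one is available (as on Great Lakes GPU nodes) and fall back to the CPU otherwise. The network and data below are stand-ins, not the team’s actual architecture.

```python
import torch

# Use a GPU if the node provides one (e.g., a Great Lakes GPU partition);
# otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Stand-in network: a small fully connected model, not D-I-TASSER/DMFold.
model = torch.nn.Sequential(
    torch.nn.Linear(128, 256),
    torch.nn.ReLU(),
    torch.nn.Linear(256, 64),
).to(device)

# Move a batch of synthetic input features to the same device and run it.
features = torch.randn(32, 128, device=device)
with torch.no_grad():
    predictions = model(features)

print(f"Ran inference on {device}; output shape: {tuple(predictions.shape)}")
```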

Technology staff from ARC, DCMB, and Health Information and Technology Services (HITS) assisted the research team. Together they helped mitigate bottlenecks affecting the speed and throughput Zheng needed for results, and they located and helped leverage resources, including available partitions and queues on the Great Lakes cluster.

“Having the flexibility and capacity provided by Great Lakes was instrumental in meeting competition deadlines,” said Weiss.

DCMB staff and the HITS HPC team took the lead on triaging software problems, giving Freddolino’s group high priority.

ARC Director Brock Palen provided monitoring and guidance on the real-time impact and utilization of resources. “It was an honor to support this effort. It has always been ARC’s goal to take care of the technology so researchers can do what they do best. In this case, Freddolino and Zheng knocked it out of the park.”

Jonathan Poisson, technical support manager with DCMB, was instrumental in helping to select and configure the equipment purchased by the grant. “This assistance was crucial in meeting the tight CASP15 targets, as each target is accompanied by a deadline for results.” 

Read more on the Computational Medicine and Bioinformatics website and the Department of Biological Chemistry website.

Related presentation: D-I-TASSER: Integrating Deep Learning with Multi-MSAs and Threading Alignments for Protein Structure Prediction

The resources to achieve these results were provided by an NIH-funded grant (“High-Performance Computing Cluster for Biomedical Research,” SIG: S10OD026825). 

Dailey receives U-M Robotics’ first-ever alumni award 


Meghan Dailey will present “The Future of Machine Learning in Robotics” on September 23 at 2 p.m. in the Ford Robotics Building (FMCRB) or on Zoom.

Meghan Dailey is the U-M Robotics department’s first Alumni Merit Award recipient!

Dailey is a member of the first-ever class in U-M Robotics. She earned a Master of Science degree in 2015 with a focus on artificial intelligence. She is currently a machine learning specialist with Advanced Research Computing (ARC), a division of Information and Technology Services (ITS).

You’re invited 

In honor of the award, Dailey will present “The Future of Machine Learning in Robotics” on Friday, September 23, at 2 p.m. in the Ford Robotics Building (FMCRB) or on Zoom (meeting ID: 961 1618 4387, passcode: 643563). Machine learning is becoming widely prevalent in many different fields, including robotics. In a future where robots and humans assist each other in completing tasks, what is the role of machine learning, and how should it evolve to effectively serve both humans and robots? Dailey will discuss her past experiences in robotics and machine learning, and how she envisions machine learning contributing to the growth of the robotics field.

About Dailey

A member of the ARC Scientific Computing and Research Consulting Services team, Dailey helps researchers with machine learning and artificial intelligence programming. She has consulted with student and faculty teams to build neural networks for image analysis and classification. She also has extensive experience in natural language processing and has worked on many projects analyzing text sentiment and intent.

Image courtesy U-M Robotics

Maintenance on May 3: Great Lakes, Armis2, and Lighthouse 


On Tuesday, May 3, from 7-7:30 a.m., ARC will perform a vendor upgrade of Slurm on the Great Lakes, Armis2, and Lighthouse HPC clusters and their storage systems. ARC apologizes for the inconvenience.

Impact

  • This maintenance will cause a Slurm outage from 7-7:30 a.m.

  • Access to the cluster file systems will be available during the update.

  • Jobs that cannot complete before 7 a.m. on May 3 will be held and will become eligible to start once maintenance has been completed.

Status updates

Check the ITS Service Status page and follow ARC on Twitter for progress updates.

How can we help you?

For assistance or questions, please contact ARC at arc-support@umich.edu, or visit Virtual Drop-in Office Hours (CoderSpaces) for hands-on help, available 9:30-11 a.m. and 2-3:30 p.m. on Tuesdays; 1:30-3 p.m. on Wednesdays; and 2-3:30 p.m. on Thursdays.

For other topics, contact the ITS Service Center.

Understanding the strongest electromagnetic fields in the universe


Alec Thomas is part of the team from the U-M College of Engineering Gérard Mourou Center for Ultrafast Optical Science that is building the most powerful laser in the U.S.

Dubbed “ZEUS,” the laser will deliver 3 petawatts of peak power; in watts, that’s a 3 followed by 15 zeros. All the power generated in the entire world amounts to about 10 terawatts, roughly 300 times less than the peak power of ZEUS.

The team’s goal is to use the laser to explore how matter behaves in the most extreme electric and magnetic fields in the universe, and also to generate new sources of radiation beams, which may lead to developments in medicine, materials science, and national security. 

This simulation shows a plasma wake behind a laser pulse. The plasma behaves like water waves generated behind a boat. In this image, the “waves” are extremely hot plasma matter, and the “boat” is a short burst of powerful laser light. (Image courtesy of Daniel Seipt.)

“In the strong electric fields of a petawatt laser, matter becomes ripped apart into a ‘plasma,’ which is what the sun is made of. This work involves very complex and nonlinear physical interactions between matter particles and light. We create six-dimensional models of particles to simulate how they might behave in a plasma in the presence of these laser fields to learn how to harness it for new technologies. This requires a lot of compute power,” Thomas said.

That compute power comes from the Great Lakes HPC cluster, the university’s fastest high-performance computing cluster. The team wrote equations of motion for each six-dimensional particle; these equations run on Great Lakes and help Thomas and his team learn how a particle might behave within a simulation cell. Once the motion is understood, solutions can be developed.
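
The group’s production plasma codes are not shown here, but a minimal sketch of the kind of particle update at the heart of such simulations, a non-relativistic Boris push of one charged particle through prescribed electric and magnetic fields, illustrates why tracking millions of six-dimensional particles (three position and three velocity coordinates each) demands HPC-scale resources. The fields and step size below are arbitrary illustrative values, and the real simulations add relativistic and self-consistent field effects.

```python
import numpy as np

def boris_push(x, v, E, B, q, m, dt):
    """Advance one charged particle by one time step with the Boris scheme.

    x, v: position and velocity 3-vectors; E, B: field 3-vectors at the
    particle; q, m: charge and mass; dt: time step.
    """
    # Half acceleration by the electric field
    v_minus = v + (q * E / m) * (dt / 2.0)

    # Rotation by the magnetic field
    t = (q * B / m) * (dt / 2.0)
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)

    # Second half acceleration, then position update
    v_new = v_plus + (q * E / m) * (dt / 2.0)
    x_new = x + v_new * dt
    return x_new, v_new

# Illustrative values only: a single electron in uniform fields.
x = np.zeros(3)
v = np.array([1.0e5, 0.0, 0.0])            # m/s
E = np.array([0.0, 1.0e3, 0.0])            # V/m
B = np.array([0.0, 0.0, 1.0e-2])           # T
q, m, dt = -1.602e-19, 9.109e-31, 1.0e-12  # electron charge/mass, 1 ps step

for _ in range(1000):
    x, v = boris_push(x, v, E, B, q, m, dt)
print("Final position (m):", x)
```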

“On the computing side, this is a very complex physical interaction. Great Lakes is designed to handle this type of work,” said Brock Palen, director of Advanced Research Computing, a division of Information and Technology Services. 

Thomas has signed up for allocations on the Great Lakes HPC cluster and Data Den storage. “I just signed up for the no-cost allocations offered by the U-M Research Computing Package. I am planning to use those allocations to explore ideas and concepts in preparation for submitting grant proposals.”

Learn more and sign up for the no-cost U-M Research Computing Package (UMRCP).

Prof. Thomas’ work is funded by a grant from the National Science Foundation.