
Understanding the strongest electromagnetic fields in the universe

By | Data, Great Lakes, HPC, Research, Uncategorized

Alec Thomas is part of the team from the U-M College of Engineering Gérard Mourou Center for Ultrafast Optical Science that is building the most powerful laser in the U.S.

Dubbed “ZEUS,” the laser will produce 3 petawatts of power. That’s a ‘3’ followed by 15 zeros, in watts. All the power generated in the entire world totals about 10 terawatts, 300 times less than the ZEUS laser.

The team’s goal is to use the laser to explore how matter behaves in the most extreme electric and magnetic fields in the universe, and also to generate new sources of radiation beams, which may lead to developments in medicine, materials science, and national security. 

A simulation of a plasma wake.

This simulation shows a plasma wake behind a laser pulse. The plasma behaves like water waves generated behind a boat. In this image, the “waves” are extremely hot plasma matter, and the “boat” is a short burst of powerful laser light. (Image courtesy of Daniel Seipt.)

“In the strong electric fields of a petawatt laser, matter is ripped apart into a ‘plasma,’ which is what the sun is made of. This work involves very complex and nonlinear physical interactions between matter particles and light. We create six-dimensional models of particles to simulate how they might behave in a plasma in the presence of these laser fields to learn how to harness it for new technologies. This requires a lot of compute power,” Thomas said.

That compute power comes from the Great Lakes HPC cluster, the university’s fastest high-performance computing cluster. The team’s code solves the equations of motion for each six-dimensional particle. The equations run on Great Lakes and help Thomas and his team learn how particles might behave within each simulation cell. Once the motion is understood, solutions can be developed.
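A simulation of this kind advances each particle through six-dimensional phase space (three position and three velocity coordinates) under the Lorentz force. As a rough illustration only, not the team’s actual code, here is a minimal, non-relativistic Boris-style particle push in Python; the field values, step size, and units are arbitrary placeholders:

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def boris_push(x, v, E, B, q_m, dt):
    """Advance one particle (position x, velocity v) by one time step
    in electric field E and magnetic field B, for charge-to-mass
    ratio q_m, using the standard Boris scheme."""
    # first half of the electric-field kick
    v_minus = tuple(v[i] + 0.5 * q_m * E[i] * dt for i in range(3))
    # rotation around the magnetic field
    t = tuple(0.5 * q_m * B[i] * dt for i in range(3))
    t2 = sum(c * c for c in t)
    vxt = cross(v_minus, t)
    v_prime = tuple(v_minus[i] + vxt[i] for i in range(3))
    s = tuple(2.0 * c / (1.0 + t2) for c in t)
    vps = cross(v_prime, s)
    v_plus = tuple(v_minus[i] + vps[i] for i in range(3))
    # second half of the electric-field kick
    v_new = tuple(v_plus[i] + 0.5 * q_m * E[i] * dt for i in range(3))
    # advance the position
    x_new = tuple(x[i] + v_new[i] * dt for i in range(3))
    return x_new, v_new
```

In a real particle-in-cell code, the fields would be interpolated to each particle’s position from the grid every step, and a relativistic momentum update would be needed at petawatt intensities.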

“On the computing side, this is a very complex physical interaction. Great Lakes is designed to handle this type of work,” said Brock Palen, director of Advanced Research Computing, a division of Information and Technology Services. 

Thomas has signed up for allocations on the Great Lakes HPC cluster and Data Den storage. “I just signed up for the no-cost allocations offered by the U-M Research Computing Package. I am planning to use those allocations to explore ideas and concepts in preparation for submitting grant proposals.”

Learn more and sign up for the no-cost U-M Research Computing Package (UMRCP).

Prof. Thomas’ work is funded by a grant from the National Science Foundation.

Preserving Michigan’s musical history and culture

By | Feature, News, Research

From Kentucky bluegrass to Louisiana Zydeco to German hurdy-gurdy to East European Klezmer to Indian Manipuri dancing to Native American pow wows, and much more, these musical traditions from around the country and around the world have found their way to Michigan. Beginning in 2014, the Musical Heritage Project has been documenting Michigan’s folk music history.

Lester Monts

Lester Monts specializes in ethnomusicology and has been documenting Michigan’s folk cultural heritage since 2014. (Image courtesy Lester Monts)

The project is led by ethnomusicologist Dr. Lester P. Monts, Arthur F. Thurnau Professor Emeritus of Music, who began his musical journey as an orchestral trumpet player. He earned bachelor’s and master’s degrees in trumpet performance and taught trumpet at the college level before completing a doctoral degree in ethnomusicology and embarking on a research career. In the mid-1970s, Monts began to focus his research on music and culture in Liberia and Sierra Leone in West Africa. The fourteen-year Liberian civil war thwarted his fieldwork in that region.

Noting that there had been no systematic effort to collect and archive Michigan’s rich folk music heritage, Monts launched the Michigan Musical Heritage Project. He has embraced the study of music through the cultural and social lives of the people who make it. He notes that “music brings people together; it has the power to create community, and we witnessed this occurring throughout our many journeys around the state.”

Using his charm, passion, likeability, and keen musical knowledge to cultivate trust with his interviewees, Monts captured more than 400 hours of audio and video data over the years, amassing a total of 80 terabytes of data. He believes this to be the most extensive collection of Michigan folk music in the state and that U-M is the right place to house this collection.

The Michigan Musical Heritage Project crew.

The Michigan Musical Heritage Project crew wraps up at the end of a recording session. (Image courtesy Lester Monts)

With a videography crew consisting primarily of former U-M students, Monts traveled all around the state to record performances at folk music festivals and cultural gatherings, such as the Celtic Festival (Saline), Irish Folk Music Festival (Muskegon), Hispanic Heritage Festival (Hart), Hiawatha Traditional Music Festival (Marquette), Port Sanilac Blues Festival (Port Sanilac), Africa World Festival (Detroit), Aura Jamboree (Aura), and the Oldtime Fiddlers Convention and Traditional Music Festival (Hillsdale).

He says, “The creative talents of the state’s outstanding musicians must be preserved, not only for my research but for that of others as well. If properly preserved, I’m confident that in the future, the ethnomusicology program and the American Cultures department will find these data provide important insights into Michigan’s diverse musical heritage.”

How technology supports this project 

Monts’ crew benefits from a strong partnership with Tom Bray, converging technologies consultant and adjunct assistant professor of Art and Design, Penny W. Stamps School of Art and Design. Bray has been instrumental in selecting the right technology for the long-term preservation of this collection, which includes converting older footage to digital media.

Tom Bray

Tom Bray (image courtesy LSA)

Bray has collaborated with Monts to convert older formats, such as VHS, 8mm, and Hi8 video, to digital files. The files, both compressed and uncompressed, are very large and of high resolution.

All of this wonderful and important audio and video footage needs to be preserved somewhere. But where do you turn when you have 80 terabytes of data? Monts said, “I’ve been desperately searching for a way to archive the video data collected under the auspices of the Michigan Musical Heritage Project.” 

Enter the U-M Research Computing Package (UMRCP) and the team from Advanced Research Computing (ARC), a division of Information and Technology Services. The UMRCP offers researchers across all campuses several resources at no additional cost to researchers, including 100 terabytes of long-term storage.

Bray said, “I had to read the UMRCP email announcement twice because I couldn’t believe my eyes. I was so excited that ITS and the university are supporting researchers in this way. We jumped on this opportunity right away.” 

ARC Director Brock Palen is excited about this work, too. “This is super interesting, and not like the usual types of research ARC normally sees, like climate and genomics. We’re happy to help Dr. Monts and Mr. Bray, and anyone who needs it, anytime. The archive is intentionally built for holding large-volume, raw data such as 4k video, and we are proud to be their go-to for this important cultural preservation project.” 

Old media in Dr. Monts' office

Hours and hours of media is being converted to a digital format. (Photo by Stephanie Dascola)

ARC replicates and encrypts data in two secure locations miles apart, so those who use ARC services need not worry about the crashes they might experience on their own equipment. The UMRCP also includes technical expertise from talented ARC staff to further remove barriers so researchers can do what they do best.

Monts and Bray also leverage the university’s network and WiFi services to transfer the files from their studio in the Duderstadt Center to storage. The network is designed to minimize bottlenecks so that data transfers quickly and efficiently. 

Dr. Monts said, “Although the pandemic temporarily disrupted my plans to complete the video documentary, I take solace in knowing that the many hours of data we collected is in a much safer environment than we had. The UMRCP storage resource is truly a boon!”

Related links

An old reel-to-reel tape player.

A reel-to-reel tape player. (Photo by Stephanie Dascola)

Lester Monts plays footage from a special women-only dance in Liberia.

Dr. Monts shows footage from a special women-only dance in Liberia. He earned permission to record this rarely documented group of women. (Photo by Stephanie Dascola)

No-cost research computing allocations now available

By | HPC, News, Research, Systems and Services, Uncategorized

U-M Research Computing Package

Researchers on all university campuses can now sign up for the U-M Research Computing Package, a new package of no-cost supercomputing resources provided by Information and Technology Services.

As of Sept. 1, university researchers have access to a base allocation of 80,000 CPU hours of high-performance computing and research storage services at no cost. This includes 10 terabytes of high-speed storage and 100 terabytes of archival storage.

These base allocations will meet the needs of approximately 75 percent of current high-performance-computing users and 90 percent of current research storage users. Researchers must sign up on ITS’s Advanced Research Computing website to receive the allocation.

“With support from President (Mark) Schlissel and executive leadership, this initiative provides a unified set of resources, both on campus and in the cloud, that meet the needs of the rich diversity of disciplines. Our goal is to encourage the use, support and availability of high-performance computing resources for the entire research community,” said Ravi Pendse, vice president for information technology and chief information officer.

The computing package was developed to meet needs across a diversity of disciplines and to provide options for long-term data management, sharing and protecting sensitive data, and more competitive cost structures that give faculty and research teams more flexibility to procure resources on short notice.

“It is incredibly important that we provide our research community with the tools necessary so they can use their experience and expertise to solve problems and drive innovation,” said Rebecca Cunningham, vice president for research and the William G. Barsan Collegiate Professor of Emergency Medicine. “The no-cost supercomputing resources provided by ITS and Vice President Pendse will greatly benefit our university community and the countless individuals who are positively impacted by their research.”

Ph.D. students may qualify for their own UMRCP resources depending on who oversees their research and on their adviser relationship. Students should consult with their Ph.D. program administrator to determine their eligibility. ITS will confirm this status when a UMRCP request is submitted.

Undergraduate and master’s students do not currently qualify for their own UMRCP, but they can be added as users or administrators of another person’s UMRCP. Students can also access other ITS programs such as Great Lakes for Course Accounts, and Student Teams.

“If you’re a researcher at Michigan, these resources are available to you without financial impact. We’re going to make sure you have what you need to do your research. We’re investing in you as a researcher because you are what makes Michigan Research successful,” said Brock Palen, Advanced Research Computing director.

Services that are needed beyond the base allocation provided by the UMRCP are available at reduced rates and are automatically available for all researchers on the Ann Arbor, Dearborn, Flint and Michigan Medicine campuses.

More Information

Using tweets to understand climate change sentiment

By | HPC, News, Research, Systems and Services

A team from the Urban Sustainability Research Group of the School for Environment and Sustainability (UM-SEAS) has been studying public tweets to understand climate change and global warming attitudes in the U.S.

Dimitris Gounaridis is a fellow with the study. The team is mentored by Joshua Newell, and the project combines work on climate change perceptions by Jianxun Yang with property-level vulnerability assessment by Wanja Waweru.

“This research is timely and urgent. It helps us identify hazards, and elevated risks of flooding and heat, for socially vulnerable communities across the U.S. This risk is exacerbated especially for populations that do not believe climate change is happening,” Dimitris stated. 

The research team used a deep learning algorithm that reads a tweet’s text and predicts whether the person tweeting believes in climate change or not. The algorithm analyzed a total of 7 million public tweets drawn from two sources: the U-M Twitter Decahose and the George Washington University Libraries Dataverse. Together these consist of a historical archive of Decahose tweets and an ongoing collection from the Decahose. The current deep learning model has an 85% accuracy rate and has been validated at multiple levels.
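The team’s actual classifier is a deep neural network trained on labeled tweets; purely to illustrate the interface such a model exposes (text in, stance out), here is a toy keyword-weighted sketch in which every cue phrase and weight is invented for the example:

```python
# Invented illustrative cues; a real model learns its own features.
BELIEF_CUES = {"climate crisis": 2.0, "carbon emissions": 1.0, "warming is real": 2.5}
SKEPTIC_CUES = {"hoax": -2.5, "alarmist": -1.5, "climate scam": -2.0}

def stance_score(tweet: str) -> float:
    """Sum the weights of any cue phrases found in the tweet."""
    text = tweet.lower()
    score = 0.0
    for phrase, weight in {**BELIEF_CUES, **SKEPTIC_CUES}.items():
        if phrase in text:
            score += weight
    return score

def classify(tweet: str) -> str:
    """Map a tweet to a predicted stance label."""
    return "believer" if stance_score(tweet) >= 0 else "skeptic"
```

A production model replaces the hand-set weights with parameters learned from millions of labeled examples, which is where the reported 85% accuracy comes from.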

The map below shows predictions for individual users who believe in or are skeptical of climate change and global warming. Dimitris used geospatial modeling techniques to identify clusters of skepticism and belief across the U.S. to create the map.

A map of the United States with blue and red dots indicating climate change acceptance.

(Image courtesy Dimitris Gounaridis.)

The tweet stream is sampled in real-time. Armand Burks, a research data scientist with ARC, wrote the Python code that is responsible for continuously collecting the data and storing it in Turbo Research Storage. He says that many researchers across the university are using this data for various research projects as well as classes. 
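Burks’s collection code is not published; conceptually, though, a continuous collector buffers incoming tweets and periodically flushes them to storage. A simplified sketch, with an invented file layout and batch size:

```python
import json
import os

def _flush(batch, out_dir, idx):
    """Write one batch of tweets as a newline-delimited JSON file."""
    path = os.path.join(out_dir, f"batch_{idx:06d}.jsonl")
    with open(path, "w") as f:
        f.writelines(json.dumps(tweet) + "\n" for tweet in batch)

def collect_stream(stream, out_dir, batch_size=1000):
    """Consume an iterable of tweet dicts, flushing every batch_size
    tweets to a new file; returns the number of files written."""
    os.makedirs(out_dir, exist_ok=True)
    batch, idx = [], 0
    for tweet in stream:
        batch.append(tweet)
        if len(batch) >= batch_size:
            idx += 1
            _flush(batch, out_dir, idx)
            batch = []
    if batch:  # flush the final partial batch
        idx += 1
        _flush(batch, out_dir, idx)
    return idx
```

In production the stream would be the live Decahose feed and the output directory would sit on Turbo; batching keeps file counts manageable for downstream analysis.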

“We are seeing an increased demand for shared community data sets like the Decahose. ARC platforms like Turbo, ThunderX, and Great Lakes hold and process that data, and our data scientists are available, in partnership with CSCAR, to assist in deriving meaning from such large data.

“This is proving to be an effective way to combine compute services, methodology, and campus research mission leaders to make an impact quickly,” said Brock Palen, director of ARC.

In the future, Dimitris plans to refine the model to increase its accuracy, and then combine that with climate change vulnerability for flooding and heat stress.

“MIDAS is pleased that so many U-M faculty members are interested in using the Twitter Decahose. We currently have over 40 projects with faculty in the Schools of Information, Kinesiology, Social Work, and Public Health, as well as at Michigan Ross, the Ford School, LSA and more,” said H.V. Jagadish, MIDAS director and professor of Electrical Engineering and Computer Science.

The Twitter Decahose is co-managed and supported by MIDAS, CSCAR, and ARC, and is available to all researchers without any additional charge. For questions about the Decahose, email Kristin Burgard, MIDAS outreach and partnership manager.

Global research uses computing services to advance parenting and child development

By | General Interest, Great Lakes, HPC, News, Research, Uncategorized

Andrew Grogan-Kaylor, professor of Social Work, has spent the past 15 years studying the impact of physical discipline on children within the United States. 

Working with a team of other researchers at the School of Social Work, co-led by professors Shawna Lee and Julie Ma, he recently expanded his research to include children from all over the world, rather than exclusively the U.S. Current data for 62 low- and middle-income countries has been provided by UNICEF, a United Nations agency responsible for providing humanitarian and developmental aid to children worldwide. This data provides a unique opportunity to study the positive things that parents do around the world.

a group of smiling children

(Image by Eduardo Davad from Pixabay)

“We want to push research on parenting and child development in new directions. We want to do globally-based, diversity-based work, and we can’t do that without ARC services,” said Grogan-Kaylor. “I needed a bigger ‘hammer’ than my laptop provided.” 

The “hammer” he’s referring to is the Great Lakes HPC cluster. It can handle processing the large data set easily. When Grogan-Kaylor first heard about ARC, he thought it sounded like an interesting way to grow his science, and that included the ability to run more complicated statistical models that were overwhelming his laptop and department desktop computers. 

He took a workshop led by Bennet Fauber, ARC senior applications programmer/analyst, and found Bennet to be sensible and friendly. Bennet made HPC resources feel within reach to a newcomer. Typically, Grogan-Kaylor says, this type of resource is akin to learning a new language, and he’s found that being determined and persistent and finding the right people are key to maximizing ARC services. Bennet has explained error messages, how to upload data, and how to schedule jobs on Great Lakes. He also found a friendly and important resource at the ARC Help Desk, which is staffed by James Cannon. Lastly, departmental IT director Ryan Bankston has been of enormous help in learning about the cluster.

“We’re here to help researchers do what they do best. We can handle the technology, so they can solve the world’s problems,” said Brock Palen, ARC director. 

“Working with ARC has been a positive, growthful experience, and has helped me contribute significantly to the discussion around child development and physical punishment,” said Grogan-Kaylor. “I have a vision of where I’d like our research to go, and I’m pleased to have found friendly, dedicated people to help me with the pragmatic details.” 

More information

ARC, LSA support groundbreaking global energy tracking

By | General Interest, Great Lakes, HPC, News, Research, Uncategorized

How can technology services like high-performance computing and storage help a political scientist contribute to more equal access to electricity around the world? 

Brian Min, associate professor of political science and research associate professor with the Center for Political Studies, and lead researcher Zachary O’Keeffe have been using nightly satellite imagery to generate new indicators of electricity access and reliability across the world as part of the High-Resolution Electricity Access (HREA) project. 

The collection of satellite imagery is unique in its temporal and spatial coverage. For more than three decades, images have captured nighttime light output over every corner of the globe, every single night. By studying small variations in light output over time, the team aims to identify patterns and anomalies that reveal whether an area is electrified, when it became electrified, and when the power is out. This work yields the highest-resolution estimates of energy access and reliability anywhere in the world.
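The HREA classification itself is a statistical model over recalibrated satellite radiances; purely to illustrate the underlying idea, the sketch below reduces it to simple thresholding of one site’s nightly light output, with invented threshold values:

```python
import statistics

def classify_site(nightly_radiance, threshold=2.0):
    """Toy illustration of the HREA idea: decide whether a site is
    electrified from its nightly light output, and flag candidate
    outage nights as sharp dips below the site's typical level.
    The real project uses statistically recalibrated radiances and
    model-based classification, not fixed thresholds."""
    typical = statistics.median(nightly_radiance)
    electrified = typical > threshold
    outages = []
    if electrified:
        for night, value in enumerate(nightly_radiance):
            if value < 0.25 * typical:  # dip well below the baseline
                outages.append(night)
    return electrified, outages
```

Run over decades of nights and millions of locations, even this simple per-site computation makes clear why the project needs cluster-scale processing rather than a desktop.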

A satellite image of Kenya in 2017

This image of Kenya from 2017 shows a model-based classification of electrification status based upon all night statistically recalibrated 2017 VIIRS light output. (Image courtesy Dr. Min. Sources: NOAA, VIIRS DNB, Facebook/CIESIN HRSL).

LSA Technology Services and ARC both worked closely with Min’s team to relieve pain points and design highly-optimized, automated workflows. Mark Champe, application programmer/analyst senior, LSA Technology Services, explained that, “a big part of the story here is finding useful information in datasets that were created and collected for other purposes. Dr. Min is able to ask these questions because the images were previously captured, and then it becomes the very large task of finding a tiny signal in a huge dataset.”

There are more than 250 terabytes of satellite imagery and data, across more than 3 million files. And with each passing night, the collection continues to grow. Previously, the images were not easily accessible because they were archived in deep storage in multiple locations. ARC provides processing and storage at a single place, an important feature for cohesive and timely research. 

The research team created computational models that run on the Great Lakes High-Performance Computing Cluster and that can be easily replicated and validated. They archive the files on the Locker Large-File Storage service.

One challenge Min and O’Keeffe chronically face is data management. Images can be hundreds of megabytes each, so just moving files from the storage service to the high-performance computing cluster can be challenging, let alone finding the right storage service. Using Turbo Research Storage and Globus File Transfer, Min and O’Keeffe found secure, fast, and reliable solutions to easily manage their large, high-resolution files.

Brock Palen, director of ARC, said that transfers from Great Lakes to Turbo reached top speeds of 1,400 megabytes per second.

Min and team used Globus extensively in acquiring historical data from the National Oceanic and Atmospheric Administration (NOAA). Champe worked with the research team to set up a Globus connection to ARC storage services. The team at NOAA was then able to push the data to U-M quickly and efficiently. Rather than uploading the data to later be downloaded by Min’s team, Globus streamlined and sped up the data transfer process. 

Champe noted, “Over 100TB of data was being unarchived from tape and transferred between institutions. Globus made that possible and much less painful to manage.”

“The support we’ve gotten from ARC and LSA Technology has been incredible. They have made our lives easier by removing bottlenecks and helping us see new ways to draw insights from this unique data,” said Min. 

Palen added, “We are proud to partner with LSA Technology Services and ITS Infrastructure networking services to provide support to Dr. Min’s and O’Keeffe’s work. Their work has the potential to have a big impact in communities around the world.” 

“We should celebrate work such as this because it is a great example of impactful research done at U-M that many people helped to support,” Champe continued.

Min expressed his gratitude to the project’s partners. “We have been grateful to work with the World Bank and NOAA to generate new insights on energy access that will hopefully improve lives around the world.”

These images are now available via open access (free and available to all).

This is made possible by a partnership between the University of Michigan, the World Bank, Amazon Web Services, and NOAA.

DNA sequencing productivity increases with ARC-TS services

By | HPC, News, Research, Systems and Services
NovaSeq, the DNA sequencer that is about the size of a large laser printer.

The Advanced Genomics Core’s Illumina NovaSeq 6000 sequencing platform. It’s about the size of a large laser printer.

On the cutting edge of research at U-M is the Advanced Genomics Core’s Illumina NovaSeq 6000 sequencing platform. The AGC is one of the first academic core facilities to optimize this exciting and powerful instrument, which is about the size of a large laser printer.

The Advanced Genomics Core (AGC), part of the Biomedical Research Core Facilities within the Medical School Office of Research, provides high-quality, low-cost next generation sequencing analysis for research clients on a recharge basis. 

One NovaSeq run can generate as much as 4TB of raw data. So how is the AGC able to generate, process, analyze, and transfer so much data for researchers? They have partnered with Advanced Research Computing – Technology Services (ARC-TS) to leverage the speed and power of the Great Lakes High-Performance Computing Cluster.

With Great Lakes, AGC can process the data, and then store the output on other ARC-TS services: Turbo Research Storage and Data Den Research Archive, and share with clients using Globus File Transfer. All three services work together. Turbo offers the capacity and speed to match the computational performance of Great Lakes, Data Den provides an archive of raw data in case of catastrophic failure, and Globus has the performance needed for the transfer of big data. 

“Thanks to Great Lakes, we were able to process dozens of large projects simultaneously, instead of being limited to just a couple at a time with our in-house system,” said Olivia Koues, Ph.D., AGC managing director. 

“In calendar year 2020, the AGC delivered nearly a half petabyte of data to our research community. We rely on the speed of Turbo for storage, the robustness of Data Den for archiving, and the ease of Globus for big data file transfers. Working with ARC-TS has enabled incredible research such as making patients resilient to COVID-19. We are proudly working together to help patients.”

“Our services process more than 180,000GB of raw data per year for the AGC. That’s the same as streaming the three original Star Wars movies and the three prequels more than 6,000 times,” said Brock Palen, ARC-TS director. “We enjoy working with AGC to assist them in the next step of their big data journey.”

ARC-TS is a division of Information and Technology Services (ITS). The Advanced Genomics Core (AGC) is part of the Biomedical Research Core Facilities (BRCF) within the Medical School Office of Research.

Using machine learning and the Great Lakes HPC Cluster for COVID-19 research

By | General Interest, Great Lakes, HPC, News, Research, Uncategorized

A researcher in the College of Literature, Science, and the Arts (LSA) is pioneering two separate, ongoing efforts for measuring and forecasting COVID-19: pandemic modeling and a risk tracking site.

The projects are led by Sabrina Corsetti, a senior undergraduate student pursuing dual degrees in honors physics and mathematical sciences, and supervised by Thomas Schwarz, Ph.D., associate professor of physics. 

The modeling uses a machine learning algorithm that can forecast future COVID-19 cases and deaths. The weekly predictions are made using the ARC-TS Great Lakes High-Performance Computing Cluster, which provides the speed and dexterity to run the modeling algorithms and data analysis needed for data-informed decisions that affect public health. 

Each week, 51 processes (one for each state and one for the U.S.) are run in parallel (at the same time). “Running all 51 analyses on our own computers would take an extremely long time. The analysis places heavy demands on the hardware running the computations, which makes crashes somewhat likely on a typical laptop. We get all 51 done in the time it would take to do 1,” said Corsetti. “It is our goal to provide accurate data that helps our country.”
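On the cluster this fan-out is handled by the batch scheduler; the same idea can be sketched with Python’s standard multiprocessing pool. The region labels and the stand-in analysis function below are placeholders, not the team’s actual model code:

```python
from multiprocessing import Pool

def run_forecast(region):
    """Stand-in for one region's analysis; the real code fits a
    machine learning model and emits weekly case/death forecasts."""
    return region, f"forecast for {region}"

def run_all():
    """Run one national analysis plus one per state, in parallel."""
    # placeholder labels: one national entry plus 50 states = 51 jobs
    regions = ["US"] + [f"state_{i:02d}" for i in range(1, 51)]
    with Pool() as pool:  # one worker per available core
        return dict(pool.map(run_forecast, regions))

if __name__ == "__main__":
    run_all()
```

With enough cores, all 51 analyses finish in roughly the time one takes, which is the speedup Corsetti describes.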

The predictions for the U.S. at the national and state levels are fed into the COVID-19 Forecasting Hub, which is led by the UMass-Amherst Influenza Forecasting Center of Excellence based at the Reich Lab. The weekly predictions generated by the hub are then read by the Centers for Disease Control and Prevention (CDC) for its weekly forecast updates.

The second project, a risk tracking site, involves daily COVID-19 data acquisition from a Johns Hopkins University repository and the Michigan Safe Start Map. The process runs quickly, taking only about five minutes, but the impact is great. The data populates the COVID-19 risk tracking site for the State of Michigan, which shows, by county, the total number of COVID-19 cases, the average number of new cases in the past week, and the risk level.

“Maintaining the risk tracking site requires us to reliably update its data every day. We have been working on implementing these daily updates using Great Lakes so that we can ensure that they happen at the same time each day. These updates consist of data pulls from the Michigan Safe Start Map (for risk assessments) and the Johns Hopkins COVID-19 data repository (for case counts),” remarked Corsetti.
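The Johns Hopkins repository publishes cumulative case counts, so a daily update must difference them to get new cases before averaging over the past week. A minimal sketch of the per-county computation; the risk cutoffs here are invented for the example, not the Michigan Safe Start Map’s real categories:

```python
def weekly_average_new_cases(cumulative):
    """Average daily new cases over the last 7 days, computed from a
    county's cumulative case series (oldest first)."""
    if len(cumulative) < 8:
        raise ValueError("need at least 8 days of cumulative counts")
    last_week = [cumulative[i] - cumulative[i - 1]
                 for i in range(len(cumulative) - 7, len(cumulative))]
    return sum(last_week) / 7.0

def risk_level(avg_new_cases_per_day):
    """Hypothetical risk buckets for illustration; the real site uses
    the Michigan Safe Start Map's own risk assessments."""
    if avg_new_cases_per_day < 10:
        return "low"
    if avg_new_cases_per_day < 70:
        return "medium"
    return "high"
```

Scheduling this pull at a fixed time each day on Great Lakes is what guarantees the site’s counts stay current without manual intervention.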

“We are proud to support this type of impactful research during the global pandemic,” said Brock Palen, director of Advanced Research Computing – Technology Services. “Great Lakes provides quicker answers and optimized support for simulation, machine learning, and more. It is designed to meet the demands of the University of Michigan’s most intensive research.”

ARC is a division of Information and Technology Services (ITS). 

Related information 

SEAS team studies wildlife refuge wetland habitats using machine learning

By | General Interest, News, Research

This article was written by Taylor Gribble, the ARC summer 2020 intern. 

A U-M School for Environment and Sustainability (SEAS) student team is working with the Shiawassee National Wildlife Refuge to study how fish move through different wetland habitats. Their work depends primarily on being in the field, but in March the pandemic delayed fieldwork. In June, the team of SEAS master’s students was allowed to begin socially distant fieldwork. But the question was: How?

With the help of the ARC Scientific Computing and Research Consulting Services, the SEAS students were able to pivot their research methodology and develop advanced analysis approaches for hydroacoustic data using strategically placed cameras and machine learning.

The Shiawassee refuge is divided into separately managed wetland units, created through human-made dikes and water control structures. These units can be connected to or cut off from one another and from the Shiawassee River. An Adaptive Resolution Imaging Sonar (ARIS) camera has been placed at the connection point between the refuge’s “control” wetland units and the river to track fish movements between these two ecosystems.

In order to find answers about fish movement, the SEAS team’s work is divided into three parts: 

  1. In-the-field monitoring of fish, macroinvertebrates, water quality, and vegetation
  2. ARIS camera work: understanding how to use ARIS footage to answer ecological questions using machine learning facilitated by the ARC Data Consultation Service
  3. Community education and outreach regarding restoration work at the refuge

Meghan Dailey, machine learning specialist, and Armand Burks, research data scientist, are part of the ARC Data Science Consultation team. Together they are working to see the project through by understanding the needs of the SEAS research team and providing the necessary coding expertise. They are also equipping the SEAS team with the tools to become independent programmers so they can incorporate programming into their future research endeavors.

Richey works with the machine learning team. Machine learning is a tool for turning information into knowledge: it automatically finds patterns in complex data that would be difficult for a human to find. While traditional problem solving uses data and rules to find an answer, machine learning uses data and answers to find the rules that apply to a problem. Together, the team counts the number of fish moving in front of a camera that was originally placed in mid-March but removed in mid-May due to flooding from the dam breaches in Midland, Mich. The camera has since been placed back into the “avenue” between one of the managed wetland pools and the river. With the help of a machine learning algorithm, Richey and the SEAS team are able to count the number of fish they see in the camera feed. One camera placed in the water takes underwater images of the fish; as the fish swim by, the team captures those frames.

Burks is responsible for the data conversion stages of the project. “They have a large amount of data that’s generated from the underwater camera. These aren’t the typical cameras as we would think of; they work with sonar, which is based on sound. It is generating a lot of data in this sound-based sonar format that needs to be converted into something that is usable by the machine learning model.”

In order for the program to run smoothly and count the fish, Burks and the SEAS team had to develop a tool that turns the raw data into an actual video feed. Once the conversion is complete, the SEAS research team watches a series of pre-recorded videos saved to files. Because the camera produces raw sonar data rather than video, a large data conversion must transform that data into videos; from there, the machine learning algorithms can be built and analyzed.
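The project’s real counting is done by a learned model; as a simplified stand-in, counting fish-like bright regions in one converted frame can be done with connected-component labeling. The threshold and frame values below are invented for the example:

```python
def count_blobs(frame, threshold=0.5):
    """Count bright connected regions in one sonar-derived frame.
    A simplified stand-in for the project's learned fish counter:
    each 4-connected group of above-threshold pixels is counted as
    one candidate fish."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                count += 1
                stack = [(r, c)]  # iterative flood fill
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and frame[y][x] >= threshold
                            and not seen[y][x]):
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return count
```

A trained detector improves on this by distinguishing fish from debris and tracking individuals across frames so the same fish is not counted twice.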

The ARC team plans to continue working with the students and the team at the refuge to refine their methods and test with recently collected footage.

Beta tool helps researchers manage IT services

By | General Interest, News, Research, Uncategorized

Since August 2019, ARC-TS has been developing a tool that gives researchers and their delegates the ability to directly manage the IT research services they consume from ARC-TS, such as user access and usage stats.

The ARC-TS Resource Management Portal (RMP) beta tool is now available for U-M researchers.

The RMP is a self-service-only user portal with tools and APIs for research managers, unit support staff, and delegates to manage their ARC-TS IT resources. Common activities such as managing user access (adding and removing users), viewing historical usage to make informed decisions about lab resource needs, and determining volume capacity at a glance are just some of the functionality the ARC-TS RMP provides.

The portal currently provides tools for use with Turbo Research Storage, a high-capacity, reliable, secure, and fast storage solution. Longer-term, the RMP will scale to include the other storage and computing services offered by ARC-TS. It is currently read-only.

To get started or find help, contact arcts-support@umich.edu.