Robotics / en Learning rewired: U of T researcher sparks kids’ interest in tech with animatronic critters /news/learning-rewired-u-t-researcher-sparks-kids-interest-tech-animatronic-critters <span class="field field--name-title field--type-string field--label-hidden">Learning rewired: U of T researcher sparks kids’ interest in tech with animatronic critters</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/2024-07/UofT95338_2024-04-26-Paul-Dietz_Polina-Teif-8-crop.jpg?h=235aba82&amp;itok=MkfLbn0X 370w, /sites/default/files/styles/news_banner_740/public/2024-07/UofT95338_2024-04-26-Paul-Dietz_Polina-Teif-8-crop.jpg?h=235aba82&amp;itok=CBI6GjsG 740w, /sites/default/files/styles/news_banner_1110/public/2024-07/UofT95338_2024-04-26-Paul-Dietz_Polina-Teif-8-crop.jpg?h=235aba82&amp;itok=zA141Z86 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/2024-07/UofT95338_2024-04-26-Paul-Dietz_Polina-Teif-8-crop.jpg?h=235aba82&amp;itok=MkfLbn0X" alt="Dietz holds up animatronic paper cutouts"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>bresgead</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2024-07-16T14:22:39-04:00" title="Tuesday, July 16, 2024 - 14:22" class="datetime">Tue, 07/16/2024 - 14:22</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item"><p><em>Paul Dietz, a&nbsp;distinguished engineer in residence and director of fabrication in U of T’s computer science department, hopes his paper animatronic creations can engage more kids in STEM 
through the power of storytelling&nbsp;(photo by Polina Teif)</em></p> </div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/authors-reporters/adina-bresge" hreflang="en">Adina Bresge</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/computer-science" hreflang="en">Computer Science</a></div> <div class="field__item"><a href="/news/tags/education" hreflang="en">Education</a></div> <div class="field__item"><a href="/news/tags/faculty-arts-science" hreflang="en">Faculty of Arts &amp; Science</a></div> <div class="field__item"><a href="/news/tags/ontario-institute-studies-education" hreflang="en">Ontario Institute for Studies in Education</a></div> <div class="field__item"><a href="/news/tags/robotics" hreflang="en">Robotics</a></div> <div class="field__item"><a href="/news/tags/stem" hreflang="en">STEM</a></div> </div> <div class="field field--name-field-subheadline field--type-string-long field--label-above"> <div class="field__label">Subheadline</div> <div class="field__item">Paul Dietz says robotic paper creations are a creative – and more inclusive – way to get kids interested in STEM fields</div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>Could a talking paper octopus be the key to igniting kids' curiosity about technology?</p> <p>University of Toronto engineer <strong>Paul Dietz</strong> certainly thinks so.
With the help of a menagerie of mechanically controlled puppets, he has a plan to help students learn to think creatively across a wide range of fields.</p> <p>All it takes is some simple circuitry, a few arts and crafts supplies – and a lot of imagination.</p> <p>A distinguished engineer in residence and director of fabrication in the Faculty of Arts and Science’s computer science department, Dietz is the whimsical mind behind the <a href="http://animatronicsworkshop.com/">Animatronics Workshop</a>. The program collaborates with schools to provide opportunities for children to create, design and build their own robotic shows.</p> <p>Dietz has been partnering with schools where kids create their own animatronic stories – from staging <a href="https://www.youtube.com/watch?v=il2lIbSpHzM&amp;list=UUfg1rcYPNw4o7QziVaprF8Q&amp;index=20&amp;ab_channel=PaulDietz">pre-programmed puppet shows</a> to <a href="https://www.youtube.com/watch?v=LRjBil0Z2rM&amp;list=UUfg1rcYPNw4o7QziVaprF8Q&amp;index=6&amp;t=77s&amp;ab_channel=PaulDietz">hosting Q-and-As with Shakespeare</a> – departing from the competition-based format typical of many youth robotics efforts.</p> <div> <div class="field field--name-field-media-oembed-video field--type-string field--label-hidden field__item"><iframe src="/media/oembed?url=https%3A//youtu.be/LRjBil0Z2rM%3Fsi%3D-Ym3yp883AtnExY7&amp;max_width=0&amp;max_height=0&amp;hash=3VAr-AYOVJtz9YDQyEwBXSiMl16kIvR40CMvFOzsoP0" width="200" height="113" class="media-oembed-content" loading="eager" title="Colbert Questionert with William Shakespeare"></iframe> </div> </div> <p>&nbsp;</p> <p>Dietz’s program has been his passion project for a decade and a half, developed on the side while he worked day jobs engineering innovations for companies like Microsoft, Mitsubishi and Disney, as well as his own startups.</p> <p>Now, at U of T, Dietz is focusing on bringing accessible and affordable animatronics to classrooms across Canada.
The goal, he says, is to teach kids to use technology as a tool for storytelling, dismantling what he sees as a false divide between the arts and sciences.</p> <p>“One of the first participants in this program was a young girl who was really into writing creative stories and really loved science. And she saw these as two conflicting parts of her world,” says Dietz, who is also a faculty affiliate at the Schwartz Reisman Institute for Technology and Society.</p> <p>“After what she did in animatronics, it suddenly dawned on her that you can do both. If you do engineering right, it is a creative art.”</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-07/UofT95342_2024-04-26-Paul-Dietz_Polina-Teif-12-crop.jpg?itok=eWI6UDuC" width="750" height="500" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>In a capstone course on physical computing in K-12, Dietz encouraged undergraduate students to explore how computer-based systems can bring stories to life in the classroom (photo by Polina Teif)</em></figcaption> </figure> <p>Dietz had a similar realization as a teenager in the late 1970s, when a behind-the-scenes tour of Walt Disney Imagineering got him tinkering with an animatronic robot penguin.&nbsp;</p> <p>This early fusion of technical skills and storytelling sensibilities set Dietz on a path that turned flights of imagination into real-world breakthroughs that shape our engagement with technology.</p> <p>A prolific inventor and researcher, Dietz is best known for co-creating <a href="https://www.youtube.com/watch?v=PpldnaOHjqk&amp;ab_channel=PaulDietz">an early progenitor of the multi-touch display technology</a> that’s ubiquitous in today’s smartphones and tablets. 
Other innovations include&nbsp;<a href="https://en.wikipedia.org/wiki/Pal_Mickey">'Pal Mickey,'</a>&nbsp;an interactive plush toy that guided visitors through Disney theme parks,&nbsp;and&nbsp;<a href="https://www.youtube.com/watch?v=vwRO16n7hVA">parallel reality displays</a> that <a href="https://www.youtube.com/watch?v=p1b3wEsFlCY&amp;ab_channel=TUX">allow multiple viewers to see individualized content on the same screen</a>.</p> <p>Dietz says his storied career debunks the common misconception – often reinforced in schools – that creativity is exclusive to artistic pursuits, while science is the domain of strict rationality, where there are prescribed methods of inquiry to arrive at a single correct answer.</p> <p>As Dietz sees it, weaving a narrative and programming a robot are propelled by the same creative impulse – they just exercise different skills. He believes a well-rounded education should equip students with a diverse arsenal of tools to explore new ideas.</p> <p>“If you’re an artist, you have to learn the mechanics of sculpting or painting or whatever your medium is,” he says. “We should be looking at engineering and technology as those tools, and the key is … learning how to use them creatively to achieve things that are actually positive for our society.”</p> <p>The universal appeal of storytelling also serves to make technology accessible and exciting to kids of all ages and genders, Dietz adds.</p> <p>Bridging the gender divide in STEM has been core to Dietz’s animatronics mission since its inception.</p> <p>When his daughter was in middle school, Dietz took her to a robotics competition – but she was turned off by the contest, which seemed pointless to her. However, when the two of them worked together on an animatronic raccoon, he saw her passion for creating ignite.</p> <p>“This light bulb went off in my head: Maybe the problem isn’t that we’re doing tech,” says Dietz. 
“Maybe kids like my daughter need to see some application that makes sense to them – like telling a story.”</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-07/jics-group-crop-2.jpg?itok=PLmkIb9q" width="750" height="500" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>Kids at the Eric Jackman Institute of Child Study are encouraged to develop creative and computer science skills (photo courtesy of JICS)</em></figcaption> </figure> <p>Over the years, Dietz has partnered with several schools to set up animatronics workshops that attracted equal numbers of boys and girls and ensured every kid participated in all aspects of the projects – from storytelling and character design to robot building and programming.</p> <p>But as his career took him across the U.S., Dietz found it difficult to sustain and replicate the success of the programs because of the prohibitive costs of full-scale animatronic robots and the significant technical expertise required from teachers.</p> <p>At U of T, Dietz is working to bring animatronics to schools at all resource levels, allowing students to develop creative and computer science skills by harnessing the endless storytelling possibilities of paper.</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-07/UofT95332_2024-04-26-Paul-Dietz_Polina-Teif-2-crop.jpg?itok=amwQqKwU" width="750" height="500" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>Undergraduate students demo an interactive diorama during a capstone
showcase at the Bahen Centre for Information Technology (photo by Polina Teif)</em></figcaption> </figure> <p>At the <a href="https://www.oise.utoronto.ca/jics">Dr. Eric Jackman Institute of Child Study</a> (JICS) at U of T’s Ontario Institute for Studies in Education, students from kindergarten through Grade 6 have put Dietz’s paper animatronics kits to the test, bringing characters to life with kinetic, vocal creations.</p> <p>The laboratory school has hosted a series of pilot projects where kids fashioned characters out of construction paper, recorded voices and wired motorized movements to animate creations ranging from a chomping, sharp-toothed maw to a bouncing kitten.</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-07/86ec45_3b7b8cc0e1ea454098ebea496ee7419e-crop.jpg?itok=X5gRDYsR" width="750" height="500" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>Dietz hopes the pilot program at JICS, pictured, can be scaled up to schools across the country (photo courtesy of JICS)</em></figcaption> </figure> <p><strong>Nick Song</strong>, a special education and technology teacher at JICS, says he sees enormous educational potential for paper animatronics to engage students in hands-on, interactive learning that simultaneously develops technology skills and fosters creative expression.</p> <p>“The kids love doing things with technology because it gives them a really cool feedback loop where they can try something and see it work immediately,” says Song. 
“All of this is very motivating for kids, seeing something pick up their voice and start moving, and you almost feel like it’s coming to life.”</p> <p>Building on the pilots at JICS, Dietz is aiming to scale up the program to schools across the country in hopes of nurturing the next generation of out-of-the-box innovators.</p> <p>“It’s very different from the technical work that I’ve generally done … but it feels very right,” says Dietz. “I think we’re doing something important for Canada.”</p> <p>&nbsp;</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">On</div> </div> Tue, 16 Jul 2024 18:22:39 +0000 bresgead 308452 at U of T researchers enhance object-tracking abilities of self-driving cars /news/u-t-researchers-enhance-object-tracking-abilities-self-driving-cars <span class="field field--name-title field--type-string field--label-hidden">U of T researchers enhance object-tracking abilities of self-driving cars</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/2024-05/PXL_20230608_181335793-crop.jpg?h=7575563c&amp;itok=mDJZAkzx 370w, /sites/default/files/styles/news_banner_740/public/2024-05/PXL_20230608_181335793-crop.jpg?h=7575563c&amp;itok=VS33Oojz 740w, /sites/default/files/styles/news_banner_1110/public/2024-05/PXL_20230608_181335793-crop.jpg?h=7575563c&amp;itok=lwAIt_Pp 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/2024-05/PXL_20230608_181335793-crop.jpg?h=7575563c&amp;itok=mDJZAkzx" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>rahul.kalvapalle</span></span> 
<span class="field field--name-created field--type-created field--label-hidden"><time datetime="2024-05-29T10:59:42-04:00" title="Wednesday, May 29, 2024 - 10:59" class="datetime">Wed, 05/29/2024 - 10:59</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item"><p><em>Sandro Papais, a PhD student, is the co-author of a new paper that introduces a graph-based optimization method to improve object tracking for self-driving cars&nbsp;(photo courtesy of aUToronto)</em></p> </div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/taxonomy/term/6738" hreflang="en">Safa Jinje</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/breaking-research" hreflang="en">Breaking Research</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/artificial-intelligence" hreflang="en">Artificial Intelligence</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/robotics" hreflang="en">Robotics</a></div> <div class="field__item"><a href="/news/tags/self-driving-cars" hreflang="en">Self-Driving Cars</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="field field--name-field-subheadline field--type-string-long field--label-above"> <div class="field__label">Subheadline</div> <div 
class="field__item">The new tools could help robotic systems of autonomous vehicles better track the position and motion of vehicles, pedestrians and cyclists<br> </div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>Researchers at the University of Toronto Institute for Aerospace Studies (UTIAS) have introduced a pair of high-tech tools that could improve the safety and reliability of autonomous vehicles by enhancing the reasoning ability of their robotic systems.</p> <p>The innovations address multi-object tracking, a process used by robotic systems to track the position and motion of objects – including vehicles, pedestrians and cyclists – to plan the path of self-driving cars in densely populated areas.</p> <p>Tracking information is collected from computer vision sensors (2D camera images and 3D LIDAR scans) and filtered at each time stamp, 10 times a second, to predict the future movement of moving objects.&nbsp;&nbsp;</p> <p>“Once processed, it allows the robot to develop some reasoning about its environment. For example, there is a human&nbsp;crossing the street at the intersection, or a cyclist changing lanes up ahead,” says&nbsp;<strong>Sandro Papais</strong>, a PhD student at UTIAS in the Faculty of Applied Science &amp; Engineering.
“At each time stamp, the robot’s software tries to link the current detections with objects it saw in the past, but it can only go back so far in time.”&nbsp;</p> <p><a href="https://arxiv.org/pdf/2402.17892">In a new paper</a> presented at the 2024 International Conference on Robotics and Automation in Yokohama, Japan, Papais and co-authors <strong>Robert Ren</strong>, a third-year engineering science student, and Professor <strong>Steven Waslander</strong>, director of UTIAS’s <a href="https://www.trailab.utias.utoronto.ca/">Toronto Robotics and AI Laboratory</a>, introduce Sliding Window Tracker (SWTrack) – a graph-based optimization method that uses additional temporal information to prevent missed objects.</p> <p>The tool is designed to improve the performance of tracking methods, particularly when objects are occluded from the robot’s point of view.&nbsp;</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-05/Objects%20and%20Labels.jpg?itok=mTZFj1NL" width="750" height="426" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>A visualization of a nuScenes dataset used by the researchers. The image is a mosaic of the six different camera views around the car with the object bounding boxes rendered overtop of the images (image courtesy of the Toronto Robotics and AI Laboratory)</em></figcaption> </figure> <p>&nbsp;</p> <p>“SWTrack widens how far into the past a robot considers when planning,” says Papais.
“So instead of being limited by what it just saw one frame ago and what is happening now, it can look over the past five seconds and then try to reason through all the different things it has seen.” &nbsp;&nbsp;</p> <p>The team tested, trained and validated their algorithm on field data from nuScenes, a public, large-scale autonomous-driving dataset collected from vehicles that have operated on roads in cities around the world. The data includes human annotations that the team used to benchmark the performance of SWTrack.&nbsp;&nbsp;</p> <p>They found that each time they extended the temporal window, to a maximum of five seconds, the tracking performance got better. But past five seconds, the algorithm’s performance was slowed by computation time.&nbsp;&nbsp;&nbsp;</p> <p>“Most tracking algorithms would have a tough time reasoning over some of these temporal gaps. But in our case, we were able to validate that we can track over these longer periods of time and maintain more consistent tracking for dynamic objects around us,” says Papais.&nbsp;</p> <p>Papais says he’s looking forward to building on the idea of improving robot memory and extending it to other areas of robotics infrastructure.&nbsp;“This is just the beginning,” he says. “We’re working on the tracking problem, but also other robot problems, where we can incorporate more temporal information to enhance perception and robotic reasoning.”&nbsp;&nbsp;</p> <p>Another paper, <a href="https://arxiv.org/pdf/2402.12303">co-authored by master’s student <strong>Chang Won (John) Lee</strong> and Waslander</a>, introduces UncertaintyTrack, a collection of extensions for 2D tracking-by-detection methods that leverages probabilistic object detection.&nbsp;&nbsp;&nbsp;</p> <p>“Probabilistic object detection quantifies the uncertainty estimates of object detection,” explains Lee.
“The key thing here is that for safety-critical tasks, you want to be able to know when&nbsp;the predicted detections are likely to cause errors in downstream tasks such as multi-object tracking. These errors can occur because of low-lighting conditions or heavy object occlusion.&nbsp;&nbsp;</p> <p>“Uncertainty estimates give us an idea of when the model is in doubt, that is, when it is highly likely to give errors in predictions. But there’s this gap because probabilistic object detectors aren’t currently used in multi-object tracking.” &nbsp;&nbsp;</p> <p>Lee worked on the paper as part of his undergraduate thesis in engineering science. Now a master’s student in Waslander’s lab, he is researching visual anomaly detection for the Canadarm3, Canada’s contribution to the U.S.-led Gateway lunar outpost.&nbsp;&nbsp;“In my current research, we are aiming to come up with a deep-learning-based method that detects objects floating in space that pose a potential risk to the robotic arm,” Lee says.</p> <p>Waslander says the advancements outlined in the two papers build on work that his lab has been focusing on for a number of years.</p> <p>“[The Toronto Robotics and AI Laboratory] has been working on assessing perception uncertainty and expanding temporal reasoning for robotics for multiple years now, as they are the key roadblocks to deploying robots in the open world more broadly,” Waslander says.</p> <p>“We desperately need AI methods that can understand the persistence of objects over time, and ones that are aware of their own limitations and will stop and reason when something new or unexpected appears in their path.
This is what our research aims to do.”&nbsp;</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Wed, 29 May 2024 14:59:42 +0000 rahul.kalvapalle 307958 at U of T 'self-driving lab' to focus on next-gen human tissue models /news/u-t-self-driving-lab-focus-next-gen-human-tissue-models <span class="field field--name-title field--type-string field--label-hidden">U of T 'self-driving lab' to focus on next-gen human tissue models</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/2023-10/organ-on-a-chip-well-plate_Rick-Lu-crop_0.jpg?h=afdc3185&amp;itok=HnIQjx4h 370w, /sites/default/files/styles/news_banner_740/public/2023-10/organ-on-a-chip-well-plate_Rick-Lu-crop_0.jpg?h=afdc3185&amp;itok=S9Vdg4Km 740w, /sites/default/files/styles/news_banner_1110/public/2023-10/organ-on-a-chip-well-plate_Rick-Lu-crop_0.jpg?h=afdc3185&amp;itok=dDzw8E-g 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/2023-10/organ-on-a-chip-well-plate_Rick-Lu-crop_0.jpg?h=afdc3185&amp;itok=HnIQjx4h" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2023-10-26T11:15:29-04:00" title="Thursday, October 26, 2023 - 11:15" class="datetime">Thu, 10/26/2023 - 11:15</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item"><p><em>The 
Self-Driving Lab for Human Organ Mimicry will use organoids and organs-on-chips –&nbsp;a well plate is pictured here – to allow researchers to move potential therapeutics to human clinical trials more rapidly&nbsp;(photo by&nbsp;Rick Lu)</em></p> </div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/authors-reporters/anika-hazra" hreflang="en">Anika Hazra</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/acceleration-consortium" hreflang="en">Acceleration Consortium</a></div> <div class="field__item"><a href="/news/tags/institutional-strategic-initiatives" hreflang="en">Institutional Strategic Initiatives</a></div> <div class="field__item"><a href="/news/tags/princess-margaret-cancer-centre" hreflang="en">Princess Margaret Cancer Centre</a></div> <div class="field__item"><a href="/news/tags/temerty-faculty-medicine" hreflang="en">Temerty Faculty of Medicine</a></div> <div class="field__item"><a href="/news/tags/donnelly-centre-cellular-biomolecular-research" hreflang="en">Donnelly Centre for Cellular &amp; Biomolecular Research</a></div> <div class="field__item"><a href="/news/tags/artificial-intelligence" hreflang="en">Artificial Intelligence</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/research-innovation" hreflang="en">Research &amp; Innovation</a></div> <div class="field__item"><a href="/news/tags/robotics" 
hreflang="en">Robotics</a></div> <div class="field__item"><a href="/news/tags/university-health-network" hreflang="en">University Health Network</a></div> </div> <div class="field field--name-field-subheadline field--type-string-long field--label-above"> <div class="field__label">Subheadline</div> <div class="field__item">The Self-Driving Laboratory for Human Organ Mimicry is one of six self-driving labs launched by the Acceleration Consortium to drive research across a range of fields</div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>The University of Toronto is home to a new “self-driving lab” that will allow researchers to better understand health and disease&nbsp;– and to more rapidly test the efficacy and toxicity of new drugs and materials.</p> <p>Based at the Donnelly Centre for Cellular and Biomolecular Research, the Self-Driving Laboratory for Human Organ Mimicry is the latest self-driving lab to spring from <a href="/news/u-t-receives-200-million-grant-support-acceleration-consortium-s-self-driving-labs-research">a historic $200-million grant</a> from the Canada First Research Excellence Fund&nbsp;to the&nbsp;<a href="https://acceleration.utoronto.ca/">Acceleration Consortium</a>&nbsp;– a global effort to speed the discovery of materials and molecules that is one of&nbsp;several U of T <a href="https://isi.utoronto.ca/">institutional strategic initiatives</a>.</p> <p>The new lab will be led by&nbsp;<strong>Milica Radisic</strong>, Canada Research Chair in Organ-on-a-Chip Engineering and professor of&nbsp;biomedical engineering in the Faculty of Applied Science &amp; Engineering, and&nbsp;<strong>Vuk Stambolic</strong>, senior scientist at the&nbsp;Princess Margaret Cancer Centre, University Health Network, and a professor of&nbsp;medical biophysics in the Temerty Faculty of Medicine.</p> <p>“The lab will innovate new complex cellular models of human tissues, such as from the heart, liver,
kidney and brain, through stem-cell-derived organoids and organ-on-a-chip technologies,” said Radisic. “In partnership with the Princess Margaret Cancer Centre, the lab will also enable automation of patient-derived tumour organoid cultures to accelerate the discovery of new cancer treatments.”</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2023-10/tumour%20organoids%20stained%20for%20a%20couple%20of%20markers%20with%20fluorescent%20dyes_Laura%20Tamblyn%20and%20Nikolina%20Radulovich.jpg?itok=RwyEZc1Q" width="750" height="395" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>Tumour organoids stained with fluorescent dyes (image courtesy of Nikolina Radulovich and Laura Tamblyn)</em></figcaption> </figure> <p>The Self-Driving Laboratory for Human Organ Mimicry is one of six self-driving labs launched by the Acceleration Consortium at U of T to drive research across a range of fields, including materials,&nbsp;drug formulation, drug discovery and sustainable energy.</p> <p>How does a self-driving lab work? Once set up, it runs with robots and artificial intelligence performing as much as 90 per cent of the work. That, in turn, speeds up the process of discovery by freeing researchers from&nbsp;the tedious process of trial and error so they can focus on&nbsp;higher-level analysis.</p> <p>“The Self-Driving Lab for Human Organ Mimicry will enable other self-driving labs to develop new materials and drugs by rapidly determining their efficacy, as well as their potential toxic effects and other impacts on human tissues,” said Stambolic. 
“While animal testing is typically the go-to method to assess the safety of new molecules made for humans, this lab will replace trials involving animals with organoids and organs-on-chips. This will allow us to advance to human clinical trials much more quickly.”</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2023-10/Headshots-of-Milica-Radisic-and-Vuk-Stamboli-crop_0.jpg?itok=Ih638T_n" width="750" height="500" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>Professors Milica Radisic and Vuk Stambolic (supplied images)</em></figcaption> </figure> <p>“The goal of our self-driving labs is to use AI to move the discovery process forward at the necessary pace to tackle global issues,” said&nbsp;<strong>Alán Aspuru-Guzik</strong>, director of the Acceleration Consortium and professor of&nbsp;chemistry&nbsp;and&nbsp;computer science in the Faculty of Arts &amp; Science. “The Human Organ Mimicry SDL, as well as other self-driving labs launched through the Acceleration Consortium, will establish U of T and our extended research community as a global leader in AI for science.”</p> <p>Donnelly Centre Director <strong>Stephane Angers</strong> says the centre is an ideal environment for the new lab, citing the international hub for cross-disciplinary health and medical research’s reputation as a hotspot for technological innovation&nbsp;– one that offers resources to the wider research community.</p> <p>“The Donnelly Centre is a thriving research community because it was founded on the principle of interdisciplinary collaboration,” said Angers, a professor of&nbsp;biochemistry&nbsp;and&nbsp;pharmaceutical sciences.
“Our research strengths in computational biology, functional genomics and stem cell biology will catalyze the development and success of the Self-Driving Lab for Human Organ Mimicry.”</p> <p>The launch of the new lab will also expand the Donnelly Centre’s team of experts with the hiring of five new staff who will work to make the self-driving lab fully automated. The lab is expected to be operational by the end of the year.</p> <p>“The Donnelly Centre is one of the foremost research institutes in the world, with outstanding strength in genomics, model organisms, organoids, computational biology and many other areas,” said&nbsp;<strong>Justin Nodwell</strong>, vice-dean of research and health science education at the Temerty Faculty of Medicine.</p> <p>“I’m delighted to hear about the addition of the Acceleration Consortium’s artificial intelligence-powered self-driving lab to the centre’s existing technical base. It will facilitate new lines of research by some of the best minds in the country.”</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Thu, 26 Oct 2023 15:15:29 +0000 Christopher.Sorensen 304034 at Robotic nano-surgery shown to be effective at treating brain cancer in pre-clinical models /news/robotic-nano-surgery-shown-be-effective-treating-brain-cancer-pre-clinical-models <span class="field field--name-title field--type-string field--label-hidden">Robotic nano-surgery shown to be effective at treating brain cancer in pre-clinical models</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/2023-04/GettyImages-179795271-crop.jpeg?h=afdc3185&amp;itok=eNiLUXfv 370w, 
/sites/default/files/styles/news_banner_740/public/2023-04/GettyImages-179795271-crop.jpeg?h=afdc3185&amp;itok=h0vetkOg 740w, /sites/default/files/styles/news_banner_1110/public/2023-04/GettyImages-179795271-crop.jpeg?h=afdc3185&amp;itok=mo4T1gGt 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/2023-04/GettyImages-179795271-crop.jpeg?h=afdc3185&amp;itok=eNiLUXfv" alt="A scan depicting brain cancer."> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2023-04-14T10:45:25-04:00" title="Friday, April 14, 2023 - 10:45" class="datetime">Fri, 04/14/2023 - 10:45</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item"><p>(Photo by BSIP/Universal Images Group/Getty Images)</p> </div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/breaking-research" hreflang="en">Breaking Research</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/institutional-strategic-initiatives" hreflang="en">Institutional Strategic Initiatives</a></div> <div class="field__item"><a href="/news/tags/temerty-faculty-medicine" hreflang="en">Temerty Faculty of Medicine</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/hospital-sick-children" 
hreflang="en">Hospital for Sick Children</a></div> <div class="field__item"><a href="/news/tags/research-innovation" hreflang="en">Research &amp; Innovation</a></div> <div class="field__item"><a href="/news/tags/robotics" hreflang="en">Robotics</a></div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>Researchers at The Hospital for Sick Children (SickKids) and the&nbsp;<a href="http://robotics.utoronto.ca/">University of Toronto Robotics Institute</a>&nbsp;– an&nbsp;<a href="https://isi.utoronto.ca/">institutional strategic initiative</a>&nbsp;– have teamed up to develop a new treatment option for patients diagnosed with glioblastoma (GBM).&nbsp;</p> <p>Glioblastoma is the most common and aggressive form of brain cancer – the average life expectancy after a diagnosis is around 15 months.&nbsp;&nbsp;</p> <p><a href="https://www.mie.utoronto.ca/faculty_staff/sun/"><strong>Yu Sun</strong></a>, a professor in U of T's&nbsp;department of mechanical and industrial engineering in the Faculty of Applied Science and Engineering,&nbsp;and&nbsp;<a href="https://moleculargenetics.utoronto.ca/faculty/xi-huang"><strong>Xi Huang</strong></a>, a senior scientist at&nbsp;SickKids and an associate professor in the department of molecular genetics at the Temerty Faculty of Medicine,&nbsp;hope to change this dire statistic with the help of magnetically guided robotic nano-scalpels that can precisely target cancer cells and kill them. Findings from their research were recently shared in a&nbsp;new&nbsp;<a href="https://www.science.org/doi/10.1126/sciadv.ade5321">study published in&nbsp;<em>Science Advances</em></a>.&nbsp;</p> <p>For decades, scientists have searched for ways to treat GBM, including&nbsp;conventional surgery, radiation, chemotherapy and targeted therapy. 
GBM cells quickly reproduce and invade nearby brain tissue and are notoriously difficult to eradicate by conventional surgery.&nbsp;These cells also develop resistance to chemotherapy or targeted therapy. As a result, patients usually relapse after undergoing currently available treatment protocols.&nbsp;&nbsp;</p> <p>Sun and&nbsp;Huang&nbsp;believe that a mechanical nano-surgical approach targeting tumour cells could provide a new and effective treatment option.&nbsp;&nbsp;</p> <p>&nbsp;</p> <p><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen frameborder="0" height="422px" src="https://www.youtube.com/embed/NFujNA0Ugj8" title="YouTube video player" width="750px"></iframe></p> <p>&nbsp;</p> <p>Sun, who is jointly appointed to the department of electrical and computer engineering as well as the department of computer science in the Faculty of Arts and Science&nbsp;and is director of the&nbsp;U of T Robotics Institute,&nbsp;has spent more than 20 years developing micro- and nano-robotic systems&nbsp;– including infertility treatment systems that can select sperm with high DNA integrity and inject it into a human egg.&nbsp;Huang, whose&nbsp;<a href="https://lab.research.sickkids.ca/huang/">lab at SickKids</a>&nbsp;specializes in developmental and stem-cell biology, investigates the physical properties and mechano-electrical-chemical signaling of brain cancer to develop new therapeutic strategies.</p> <p>Together, they designed a precision control system that applies a rotating magnetic field to mobilize magnetic carbon nanotubes (mCNTs) filled with iron oxide particles and demonstrated that mCNT swarms could be activated inside a single cell to function as nano-scalpels.&nbsp;&nbsp;</p> <p>They showed that mechanical stimulations provided by mobilized mCNTs inside GBM cells disrupt cancer cells’ internal structures, leading to cell death. 
Importantly, the team demonstrated that the nano-surgical treatment reduced tumour size and extended the survival of mice bearing chemotherapy-resistant GBM.&nbsp;&nbsp;&nbsp;</p> <p>With evidence from multiple preclinical models confirming the effectiveness of their approach, the researchers are next optimizing the material compositions of mCNTs, the control strategy and the treatment protocol.&nbsp;</p> <p><img alt src="/sites/default/files/2023-04/Robotic%20Brain%20Surgery%20Story%5B1%5D.jpeg"></p> <p><em>As a PhD student at the U of T&nbsp;Robotics Institute, Xian Wang worked with Professor Yu Sun to develop a magnetic nano-scale robot that can be moved anywhere inside a human cell&nbsp;(photo by&nbsp;Tyler Irving)</em></p> <p><strong>Xian&nbsp;Wang</strong>&nbsp;–&nbsp;a&nbsp;former post-doctoral researcher&nbsp;in Huang’s lab&nbsp;and a recent graduate of Sun’s lab, where&nbsp;he began this work building&nbsp;magnetic nano-tweezers –&nbsp;is&nbsp;the first author of the paper. His work developing&nbsp;the&nbsp;nano-tweezers laid&nbsp;the research foundations for the nano-scalpels&nbsp;used in the&nbsp;current&nbsp;study.&nbsp;He&nbsp;recently joined Queen’s University&nbsp;as an assistant professor.</p> <p>“In addition to physically disrupting cellular structures, mechanically mobilized mCNTs can also modulate specific biomedical pathways,” Wang says. 
“Based on this, we are now developing a combination therapy to tackle untreatable brain tumours.”&nbsp;&nbsp;</p> <p>While there is still more&nbsp;research to conduct before human trials are initiated, this innovation in mechanical nano-surgery is giving patients, families&nbsp;and the medical community hope that new treatment options are on the horizon for an&nbsp;otherwise untreatable disease.&nbsp;</p> <p>The research was supported by the Natural Sciences and Engineering Research Council of Canada and the Canadian Institutes of Health Research, among others.</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> <div class="field field--name-field-add-new-author-reporter field--type-entity-reference field--label-above"> <div class="field__label">Add new author/reporter</div> <div class="field__items"> <div class="field__item"><a href="/news/authors-reporters/hallie-siegel" hreflang="en">Hallie Siegel</a></div> </div> </div> <div class="field field--name-field-add-new-story-tags field--type-entity-reference field--label-above"> <div class="field__label">Add new story tags</div> <div class="field__items"> <div class="field__item"><a href="/news/tags/robotics-institute" hreflang="en">Robotics Institute</a></div> </div> </div> Fri, 14 Apr 2023 14:45:25 +0000 Christopher.Sorensen 301068 at Raquel Urtasun’s self-driving startup Waabi brings on Volvo as strategic investor: Reports /news/raquel-urtasun-s-self-driving-startup-waabi-brings-volvo-strategic-investor-reports <span class="field field--name-title field--type-string field--label-hidden">Raquel Urtasun’s self-driving startup Waabi brings on Volvo as strategic investor: Reports</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" 
srcset="/sites/default/files/styles/news_banner_370/public/RaquelUrtasun_2021_86-crop.jpg?h=afdc3185&amp;itok=YfYM8Dgz 370w, /sites/default/files/styles/news_banner_740/public/RaquelUrtasun_2021_86-crop.jpg?h=afdc3185&amp;itok=sFbAgf-T 740w, /sites/default/files/styles/news_banner_1110/public/RaquelUrtasun_2021_86-crop.jpg?h=afdc3185&amp;itok=WSVlBhoe 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/RaquelUrtasun_2021_86-crop.jpg?h=afdc3185&amp;itok=YfYM8Dgz" alt="A portrait of Raquel Urtasun with her arms crossed, wearing a Waabi T-shirt"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>bresgead</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2023-01-20T16:00:46-05:00" title="Friday, January 20, 2023 - 16:00" class="datetime">Fri, 01/20/2023 - 16:00</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item">(Photo courtesy of Waabi)</div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/artificial-intelligence" hreflang="en">Artificial Intelligence</a></div> <div class="field__item"><a href="/news/tags/computer-science" hreflang="en">Computer Science</a></div> <div class="field__item"><a href="/news/tags/faculty-arts-science" hreflang="en">Faculty of Arts &amp; Science</a></div> <div class="field__item"><a 
href="/news/tags/research-innovation" hreflang="en">Research &amp; Innovation</a></div> <div class="field__item"><a href="/news/tags/robotics" hreflang="en">Robotics</a></div> <div class="field__item"><a href="/news/tags/self-driving-cars" hreflang="en">Self-Driving Cars</a></div> <div class="field__item"><a href="/news/tags/vector-institute" hreflang="en">Vector Institute</a></div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p style="margin-bottom:11px">University of Toronto researcher <b>Raquel Urtasun’s </b>self-driving startup Waabi <a href="https://waabi.ai/welcoming-volvo-group-venture-capital-as-a-strategic-investor-in-waabi/">has added the venture capital arm of Swedish carmaker Volvo&nbsp;to its list of high-profile investors</a>.</p> <p style="margin-bottom:11px">Volvo Group Venture Capital AB is investing an undisclosed amount in Toronto-based Waabi’s AI-powered autonomous trucking technology, according to <a href="https://techcrunch.com/2023/01/18/self-driving-truck-startup-waabi-brings-on-volvo-vc-as-strategic-investor/">TechCrunch</a>.</p> <p style="margin-bottom:11px">“We’ve been extremely selective in terms of who we bring on board as an investor and this is the right time for Waabi to bring on a strategic OEM [original equipment manufacturer],” Urtasun, a U of T professor of computer science and Waabi’s founder and CEO, tells the high-profile U.S. tech website.</p> <p style="margin-bottom:11px">Waabi has already raised more than $100 million from investors including Khosla Ventures, Uber and other Silicon Valley giants. 
Also among its backers are AI luminaries <b>Geoffrey Hinton</b>, a U of T <a href="https://www.provost.utoronto.ca/awards-funding/university-professors/">University Professor</a> Emeritus of computer science, and <b>Sanja Fidler</b>, an associate professor of computer science.</p> <p style="margin-bottom:11px">Urtasun tells the <i><a href="https://www.theglobeandmail.com/business/article-volvo-invests-in-toronto-driverless-vehicle-startup-waabi/?utm_medium=Referrer:+Social+Network+/+Media&amp;utm_campaign=Shared+Web+Article+Links">Globe and Mail</a></i> that Volvo’s investment “significantly” increases Waabi’s valuation.</p> <h3 style="margin-bottom: 11px;"><a href="http://techcrunch.com/2023/01/18/self-driving-truck-startup-waabi-brings-on-volvo-vc-as-strategic-investor/">Read more at<i>&nbsp;</i>TechCrunch</a></h3> <h3 style="margin-bottom: 11px;"><a href="http://www.theglobeandmail.com/business/article-volvo-invests-in-toronto-driverless-vehicle-startup-waabi/?utm_medium=Referrer:+Social+Network+/+Media&amp;utm_campaign=Shared+Web+Article+Links">Read more at the <em>Globe and Mail</em></a><i></i></h3> <p style="margin-bottom:11px">&nbsp;</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Fri, 20 Jan 2023 21:00:46 +0000 bresgead 179294 at Researchers help robots navigate crowded spaces with new visual perception method /news/researchers-help-robots-navigate-crowded-spaces-new-visual-perception-method <span class="field field--name-title field--type-string field--label-hidden">Researchers help robots navigate crowded spaces with new visual perception method</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/iStock-1279493735-crop.jpg?h=afdc3185&amp;itok=FnXXVi6F 370w, 
/sites/default/files/styles/news_banner_740/public/iStock-1279493735-crop.jpg?h=afdc3185&amp;itok=7k3rU_TC 740w, /sites/default/files/styles/news_banner_1110/public/iStock-1279493735-crop.jpg?h=afdc3185&amp;itok=mtI0yfdN 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/iStock-1279493735-crop.jpg?h=afdc3185&amp;itok=FnXXVi6F" alt="crowded downtown city street with many people walking across an intersection"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2022-11-09T15:10:52-05:00" title="Wednesday, November 9, 2022 - 15:10" class="datetime">Wed, 11/09/2022 - 15:10</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item">Researchers from the U of T Institute for Aerospace Studies have developed a system that improves how robots stitch together a set of images taken from a moving camera to build a 3D model of their environments (photo by iStock/LeoPatrizi)</div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/taxonomy/term/6738" hreflang="en">Safa Jinje</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/breaking-research" hreflang="en">Breaking Research</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/alumni" 
hreflang="en">Alumni</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/research-innovation" hreflang="en">Research &amp; Innovation</a></div> <div class="field__item"><a href="/news/tags/robotics" hreflang="en">Robotics</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>A team of researchers at the University of Toronto&nbsp;has found a way to enhance the visual perception of robotic systems by coupling two different types of neural networks.</p> <p>The innovation could help autonomous vehicles navigate busy streets or enable medical robots to work effectively in crowded hospital hallways.&nbsp;</p> <p>“What tends to happen in our field is that when systems don’t perform as expected, the designers make the networks bigger – they add more parameters,” says <strong>Jonathan Kelly</strong>, an assistant professor at the&nbsp;<a href="https://www.utias.utoronto.ca/">University of Toronto Institute for Aerospace Studies</a> in the Faculty of Applied Science &amp; Engineering.</p> <p>“What we’ve done instead is to carefully study how the pieces should fit together. Specifically, we investigated how two pieces of the motion estimation problem – accurate perception of depth and motion – can be joined together in a robust way.”&nbsp;&nbsp;</p> <p>Researchers in Kelly’s&nbsp;<a href="https://starslab.ca/">Space and Terrestrial Autonomous Robotic Systems</a>&nbsp;lab aim to build reliable systems that can help humans accomplish a variety of tasks. 
For example, they’ve designed&nbsp;<a href="https://news.engineering.utoronto.ca/wheelchairs-get-robotic-retrofit-become-self-driving/">an electric wheelchair that can automate some common tasks</a>&nbsp;such as navigating through doorways.&nbsp;&nbsp;</p> <p>More recently, they’ve focused on techniques that will help robots move out of the carefully controlled environments in which they are commonly used today and into the less predictable world&nbsp;humans are accustomed to navigating.&nbsp;&nbsp;</p> <p>“Ultimately, we are looking to develop situational awareness for highly dynamic environments where people operate, whether it’s a crowded hospital hallway, a busy public square&nbsp;or a city street full of traffic and pedestrians,” says Kelly.&nbsp;&nbsp;</p> <p>One challenging problem that robots must solve in all of these spaces is known to the robotics community as “structure from motion.” This is the process by which robots stitch together a set of images taken from a moving camera to build a 3D model of the environment they are in. The process is analogous to the way humans use their eyes to perceive the world around them.&nbsp;&nbsp;</p> <p>In today’s robotic systems, structure from motion is typically achieved in two steps, each of which uses different information from a set of monocular images. One is depth perception, which tells the robot how far away the objects in its field of vision are. The other, known as egomotion, describes the 3D movement of the robot in relation to its environment.&nbsp;</p> <p>“Any robot navigating within a space needs to know how far static and dynamic objects are in relation to itself, as well as how its motion changes a scene,” says Kelly. 
“For example, when a train moves along a track, a passenger looking out a window can observe that objects at a distance appear to move slowly, while objects nearby zoom past.”&nbsp;&nbsp;</p> <p>&nbsp;</p> <div class="media_embed" height="500px" width="750px"><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen frameborder="0" height="500px" src="https://www.youtube.com/embed/8Oij81bEoH0" title="YouTube video player" width="750px"></iframe></div> <p>&nbsp;</p> <p>The challenge is that in many current systems, depth estimation is separated from motion estimation – there is no explicit sharing of information between the two neural networks. Joining depth and motion estimation together ensures that each&nbsp;is consistent with the other.&nbsp;&nbsp;&nbsp;</p> <p>“There are constraints on depth that are defined by motion, and there are constraints on motion that are defined by depth,” says Kelly. “If the system doesn’t couple these two neural network components, then&nbsp;the end result is an inaccurate estimate of where everything is in the world and where the robot is in relation.”&nbsp;</p> <p>In a recent study, two of Kelly’s&nbsp;students –&nbsp;<strong>Brandon Wagstaff</strong>, a PhD candidate, and former PhD student&nbsp;<strong>Valentin Peretroukhin</strong>&nbsp;–&nbsp;investigated and improved on existing structure from motion methods.&nbsp;</p> <p>Their new system makes the egomotion prediction a function of depth, increasing the system’s overall accuracy and reliability.&nbsp;<a href="https://www.youtube.com/watch?v=6QEDCooyUjE">They recently presented their work</a> at the International Conference on Intelligent Robots and Systems (IROS) in Kyoto, Japan.&nbsp;&nbsp;</p> <p>“Compared with existing learning-based methods, our new system was able to reduce the motion estimation error by approximately 50 per cent,” says Wagstaff.&nbsp;&nbsp;</p> <p>“This improvement in motion estimation 
accuracy was demonstrated not only on data similar to that used to train the network, but also on significantly different forms of data, indicating that the proposed method was able to generalize across many different environments.”&nbsp;</p> <p>Maintaining accuracy when operating within novel environments is challenging for neural networks. The team has since expanded their research beyond visual motion estimation to include inertial sensing – an extra sensor that is akin to the vestibular system in the human ear.&nbsp;&nbsp;</p> <p>“We are now working on robotic applications that can mimic a human’s eyes and inner ears, which provides information about balance, motion and acceleration,” says Kelly.&nbsp;&nbsp;&nbsp;</p> <p>“This will enable even more accurate motion estimation to handle situations like dramatic scene changes — such as an environment suddenly getting darker when a car enters a tunnel, or a camera failing when it looks directly into the sun.”&nbsp;&nbsp;</p> <p>The potential applications for such new approaches are diverse, from improving the handling of self-driving vehicles to enabling aerial drones to fly safely through crowded environments to deliver goods or carry out environmental monitoring.&nbsp;&nbsp;</p> <p>“We are not building machines that are left in cages,” says Kelly. 
“We want to design robust robots that can move safely around people and environments.”&nbsp;</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Wed, 09 Nov 2022 20:10:52 +0000 Christopher.Sorensen 177980 at Students push the boundaries of research and innovation: Groundbreakers S2 Ep.4 /news/students-push-boundaries-research-and-innovation-groundbreakers-s2-ep4 <span class="field field--name-title field--type-string field--label-hidden">Students push the boundaries of research and innovation: Groundbreakers S2 Ep.4</span> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2022-11-08T10:04:32-05:00" title="Tuesday, November 8, 2022 - 10:04" class="datetime">Tue, 11/08/2022 - 10:04</time> </span> <div class="field field--name-field-youtube field--type-youtube field--label-hidden field__item"><figure class="youtube-container"> <iframe src="https://www.youtube.com/embed/AnvrorYh_pY?wmode=opaque" width="450" height="315" id="youtube-field-player" class="youtube-field-player" title="Embedded video for Students push the boundaries of research and innovation: Groundbreakers S2 Ep.4" aria-label="Embedded video for Students push the boundaries of research and innovation: Groundbreakers S2 Ep.4: https://www.youtube.com/embed/AnvrorYh_pY?wmode=opaque" frameborder="0" allowfullscreen></iframe> </figure> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div 
class="field__item"><a href="/news/tags/groundbreakers" hreflang="en">Groundbreakers</a></div> <div class="field__item"><a href="/news/tags/institutional-strategic-initiatives" hreflang="en">Institutional Strategic Initiatives</a></div> <div class="field__item"><a href="/news/tags/temerty-faculty-medicine" hreflang="en">Temerty Faculty of Medicine</a></div> <div class="field__item"><a href="/news/tags/alumni" hreflang="en">Alumni</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/medicine-design" hreflang="en">Medicine by Design</a></div> <div class="field__item"><a href="/news/tags/research-innovation" hreflang="en">Research &amp; Innovation</a></div> <div class="field__item"><a href="/news/tags/robotics" hreflang="en">Robotics</a></div> <div class="field__item"><a href="/news/tags/u-t-mississauga" hreflang="en">U of T Mississauga</a></div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p class="xx"><span style="background:white"><span style="border:1pt none windowtext; padding:0cm">From launching rovers into space to exploring whether green cannabinoids can treat epilepsy in children, students&nbsp;at the University of Toronto are taking research and innovation in bold new directions.</span></span></p> <p class="xx"><span style="background:white"><span style="border:1pt none windowtext; padding:0cm">In Ep. 
4 of the&nbsp;<i>Groundbreakers</i>&nbsp;video series,&nbsp;</span><span style="border:1pt none windowtext; padding:0cm">host&nbsp;<b>Ainka&nbsp;Jess</b></span><b>&nbsp;</b><span style="border:1pt none windowtext; padding:0cm">goes behind the scenes with student researchers from&nbsp;</span><span style="border:1pt none windowtext; padding:0cm">the&nbsp;</span><a href="https://robotics.utoronto.ca/" target="_blank"><span style="border:1pt none windowtext; padding:0cm">Robotics Institute</span></a><span style="border:1pt none windowtext; padding:0cm">&nbsp;and&nbsp;</span><a href="https://mbd.utoronto.ca/" target="_blank"><span style="border:1pt none windowtext; padding:0cm">Medicine by Design</span></a><span style="border:1pt none windowtext; padding:0cm">&nbsp;strategic initiatives, as well as the&nbsp;</span><a href="https://entrepreneurs.utoronto.ca/for-entrepreneurs/black-founders-network/" target="_blank"><span style="border:1pt none windowtext; padding:0cm">Black Founders Network</span></a><span style="border:1pt none windowtext; padding:0cm">.</span></span></p> <p class="xx"><span style="background:white"><span style="border:1pt none windowtext; padding:0cm">Some of their work is literally out of this world.</span></span></p> <p class="x"><span style="background:white">“I think as humans we are very curious creatures and I see planetary robots as a way to extend our reach in the solar system, so I’m actually really excited about what these rovers can do in the future,” says <b>Olivier Lamarre</b>, a PhD candidate in planetary robotics at U of T’s&nbsp;<a href="https://starslab.ca/" target="_blank"><span style="border:1pt none windowtext; padding:0cm">STARS Laboratory</span></a>.</span></p> <p class="x"><span style="background:white"><span style="border:1pt none windowtext; padding:0cm">The episode also features&nbsp;<b>Kareem Abdur-Rashid</b> and<b>&nbsp;Kamaluddin Abdur-Rashid</b> – both alumni of U of T and co-founders and <a 
href="https://www.chemistry.utoronto.ca/news/father-and-son-team-create-green-cannabinoids">co-directors of Kare Chemical Technologies</a> – and<b>&nbsp;Justine&nbsp;Bajohr</b>,&nbsp;a PhD candidate <a href="https://www.faiz-lab.com/">in the lab of <b>Maryam Faiz</b></a>, an assistant professor in the department of surgery in the Temerty Faculty of Medicine.</span></span></p> <p class="xx"><span style="background:white"><i><span style="border:1pt none windowtext; padding:0cm">Groundbreakers</span></i><span style="border:1pt none windowtext; padding:0cm">&nbsp;is a multimedia series that includes articles at <i>U of T News</i> and features research leaders involved with U of T’s <a href="https://isi.utoronto.ca/">Institutional Strategic Initiatives</a>, whose work will transform lives.</span></span></p> <h3 class="xx"><a href="https://www.youtube.com/watch?v=AnvrorYh_pY"><span style="background:white"><span style="border:1pt none windowtext; padding:0cm">Watch S2 Ep.4 of Groundbreakers</span></span></a></h3> <p>&nbsp;</p> <p>&nbsp;</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Tue, 08 Nov 2022 15:04:32 +0000 Christopher.Sorensen 178033 at ‘It’s a really cool place’: U of T Mississauga students get hands-on experience in new robotics teaching lab /news/it-s-really-cool-place-u-t-mississauga-undergrads-get-hands-experience-new-robotics-teaching <span class="field field--name-title field--type-string field--label-hidden">‘It’s a really cool place’: U of T Mississauga students get hands-on experience in new robotics teaching lab</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/0907RoboticsLabOpen017%20%282%29.jpg?h=81d682ee&amp;itok=aKl9NMNJ 370w, 
/sites/default/files/styles/news_banner_740/public/0907RoboticsLabOpen017%20%282%29.jpg?h=81d682ee&amp;itok=6GA4F7l4 740w, /sites/default/files/styles/news_banner_1110/public/0907RoboticsLabOpen017%20%282%29.jpg?h=81d682ee&amp;itok=V4Rgo6CX 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/0907RoboticsLabOpen017%20%282%29.jpg?h=81d682ee&amp;itok=aKl9NMNJ" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>rahul.kalvapalle</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2022-09-29T14:49:46-04:00" title="Thursday, September 29, 2022 - 14:49" class="datetime">Thu, 09/29/2022 - 14:49</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item">Students and visitors get a hands-on demonstration during the official opening of U of T Mississauga's Undergraduate Robotics Teaching Laboratory (photo by Nick Iwanyshyn)</div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/authors-reporters/kristy-strauss" hreflang="en">Kristy Strauss</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/institutional-strategic-initiatives" hreflang="en">Institutional Strategic Initiatives</a></div> <div class="field__item"><a 
href="/news/tags/computer-science" hreflang="en">Computer Science</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/faculty-arts-science" hreflang="en">Faculty of Arts &amp; Science</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/mechanical-industrial-engineering" hreflang="en">Mechanical &amp; Industrial Engineering</a></div> <div class="field__item"><a href="/news/tags/robotics" hreflang="en">Robotics</a></div> <div class="field__item"><a href="/news/tags/u-t-mississauga" hreflang="en">U of T Mississauga</a></div> <div class="field__item"><a href="/news/tags/undergraduate-students" hreflang="en">Undergraduate Students</a></div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p style="margin-bottom:16px"><span style="box-sizing:border-box"><span style="box-sizing:border-box"><span style="box-sizing:border-box"><span style="box-sizing:border-box"><span style="box-sizing:border-box"><span style="font-weight:bolder"><span style="box-sizing:border-box"><span style="box-sizing:border-box">Laura Maldonado</span></span></span></span><span style="box-sizing:border-box"><span style="box-sizing:border-box">&nbsp;beams as she describes her first day learning in U of T Mississauga’s Undergraduate Robotics Teaching Laboratory.</span></span></span></span></span></span></p> <p style="margin-bottom:16px"><span style="box-sizing:border-box"><span style="box-sizing:border-box">“It’s a really cool place,” she says, as she&nbsp;takes out her phone to show a video of one of the lab’s robots in action. 
“We actually play with these robots and get hands-on experience.”</span></span></p> <p style="margin-bottom:16px"><span style="box-sizing:border-box"><span style="box-sizing:border-box">The computer science specialist student was among the first group of undergraduates in the third-year Fundamentals of Robotics class to use the new lab, which officially opened its doors on Sept.&nbsp;7.</span></span></p> <p><b>Jessica Burgner-Kahrs</b>, an associate professor in the department of mathematical and computational sciences who spearheaded the lab's creation, said the lab will be used primarily for computer science students who take robotics courses in their third and fourth years.</p> <p>However, the lab will also be available&nbsp;to the broader U of T community – including graduate students in computer science, mechanical engineering and aerospace studies – says Burgner-Kahrs, who is cross-appointed to the department&nbsp;of computer science in the Faculty of Arts &amp; Science and the department of mechanical and industrial engineering in the Faculty of Applied Science &amp; Engineering,&nbsp;and is founding director&nbsp;of the&nbsp;<a href="https://crl.utm.utoronto.ca/" target="_blank">Continuum Robotics Laboratory</a>.</p> <p>She adds that it will also be a resource for the <a href="http://robotics.utoronto.ca/">U of T&nbsp;Robotics Institute</a>, where she is an associate director.</p> <p>“I’m very thankful and very grateful that we now have this teaching lab for students, and you can tell by the students’ faces how happy they are,” she says. “I think it’s, by far, the most up-to-date teaching lab I’ve seen anywhere in Canada.”</p> <p>Burgner-Kahrs, who teaches Maldonado’s “Fundamentals of Robotics” class, says the lab includes revolutionary types of robots called “cobots,” or “collaborative robots,” that have multiple movable joints and safety sensors so they can safely interact with humans. 
She says these kinds of robots have become more prevalent in recent years.</p> <p>“They are more used for co-operative tasks, where humans and robots work alongside (each other),” explains Burgner-Kahrs, adding that they can be used to automate some tasks or perform more delicate tasks. “Robotics engineering is one of the fastest-growing job markets, and all these new jobs will entail being familiar with these collaborative robots. It will put our students at an advantage in the job market.”</p> <p><b>Sven Lilge</b>, a PhD student and teaching assistant for&nbsp;the robotics fundamentals course, says the lab offers a rare opportunity for undergraduate students to learn&nbsp;how robots work through practical experience.&nbsp;“What’s really unique is we have the ability for all of our students in our class, which is more than 60 students, to work hands-on with a robot,” he says. “It’s really a game-changer.”</p> <p>Maldonado says the lab is also helping her apply learning from previous math courses – a subject that she admits isn’t her strongest.</p> <p>“I used to think, ‘Why are we learning linear algebra? Why are we learning calculus? What’s the point of this?’” she says. “Now, I understand that I need to know how the robot moves. I need to understand three dimensions. I can physically see it with the robot.”</p> <p>Maldonado adds that she feels lucky to be able to use a robotics lab during her undergraduate degree.</p> <p>“The only way (students before me) really got to work on robotic stuff was if they maybe got research positions, or maybe they tried getting some internships – which was hard if you didn’t have physical experience,” she says. “Now, we get this hands-on experience. 
I think it’s a privilege.”</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Thu, 29 Sep 2022 18:49:46 +0000 rahul.kalvapalle 177047 at South Korean President Yoon Suk-yeol visits U of T for AI roundtable /news/south-korean-president-yoon-suk-yeol-visits-u-t-ai-roundtable <span class="field field--name-title field--type-string field--label-hidden">South Korean President Yoon Suk-yeol visits U of T for AI roundtable</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/2022-09-22-AI-Leaders-Roundtable-Polina-Teif--13-crop.jpg?h=afdc3185&amp;itok=Y0qiQPcq 370w, /sites/default/files/styles/news_banner_740/public/2022-09-22-AI-Leaders-Roundtable-Polina-Teif--13-crop.jpg?h=afdc3185&amp;itok=KQtKeFQj 740w, /sites/default/files/styles/news_banner_1110/public/2022-09-22-AI-Leaders-Roundtable-Polina-Teif--13-crop.jpg?h=afdc3185&amp;itok=cjQy4gIA 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/2022-09-22-AI-Leaders-Roundtable-Polina-Teif--13-crop.jpg?h=afdc3185&amp;itok=Y0qiQPcq" alt="South Korean president Yoon Suk-yeol shakes hands with U of T President Meric Gertler outside of Simcoe Hall at the University of Toronto, St. 
George campus"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2022-09-26T10:02:18-04:00" title="Monday, September 26, 2022 - 10:02" class="datetime">Mon, 09/26/2022 - 10:02</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item">South Korean President Yoon Suk-yeol, left, shakes hands with U of T President Meric Gertler outside of Simcoe Hall (photo by Polina Teif)</div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/authors-reporters/rahul-kalvapalle" hreflang="en">Rahul Kalvapalle</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/global-lens" hreflang="en">Global Lens</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/leah-cowen" hreflang="en">Leah Cowen</a></div> <div class="field__item"><a href="/news/tags/artificial-intelligence" hreflang="en">Artificial Intelligence</a></div> <div class="field__item"><a href="/news/tags/cifar" hreflang="en">CIFAR</a></div> <div class="field__item"><a href="/news/tags/computer-science" hreflang="en">Computer Science</a></div> <div class="field__item"><a href="/news/tags/deep-learning" hreflang="en">Deep Learning</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a 
href="/news/tags/faculty-arts-science" hreflang="en">Faculty of Arts &amp; Science</a></div> <div class="field__item"><a href="/news/tags/geoffrey-hinton" hreflang="en">Geoffrey Hinton</a></div> <div class="field__item"><a href="/news/tags/global" hreflang="en">Global</a></div> <div class="field__item"><a href="/news/tags/meric-gertler" hreflang="en">Meric Gertler</a></div> <div class="field__item"><a href="/news/tags/research-innovation" hreflang="en">Research &amp; Innovation</a></div> <div class="field__item"><a href="/news/tags/robotics" hreflang="en">Robotics</a></div> <div class="field__item"><a href="/news/tags/south-korea" hreflang="en">South Korea</a></div> <div class="field__item"><a href="/news/tags/vector-institute" hreflang="en">Vector Institute</a></div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p style="margin-bottom:11px">The University of Toronto welcomed South Korean President Yoon Suk-yeol to campus last week to discuss artificial intelligence (AI) – its rise, potential applications and opportunities for further collaboration between U of T and South Korean partners.</p> <p style="margin-bottom:11px">President Yoon hailed Toronto as an AI powerhouse, saying that Canada’s status as a world leader in AI and a centre of the global AI supply chain was the result of the country recognizing the potential economic and social impacts of the technology early on.</p> <p style="margin-bottom:11px">He also said the tenacity and persistence of researchers such as <a href="https://www.provost.utoronto.ca/awards-funding/university-professors/">University Professor</a> Emeritus <b>Geoffrey Hinton</b>, a pioneer of the AI field of deep learning, served as a “benchmark” for South Korean efforts to advance the technologies of the future, adding that he was delighted to visit U of T, which he described as “one of the most prestigious universities in North America.”</p> <p style="margin-bottom:11px">U of T President <b>Meric 
Gertler</b>, for his part, said he was “deeply honoured” to welcome President Yoon, who, he said, “has made it a priority to work closely with South Korea’s allies and partners, advancing openness, human rights, democracy and the rule of law, with clear purpose and integrity.”</p> <p style="margin-bottom:11px"><span id="cke_bm_324S" style="display: none;">&nbsp;</span><img alt="South Korean President Yoon Suk-yeol and U of T President Meric Gertler stand in front of a sign welcoming the South Koreans in South Korean text at Simcoe Hall" src="/sites/default/files/2022-09-22-AI-Leaders-Roundtable-Polina-Teif--16-crop.jpg" style="width: 750px; height: 500px;"></p> <p style="margin-bottom:11px"><em>(Photo by Polina Teif)</em></p> <p style="margin-bottom:11px">President Gertler noted that the South Korean delegation’s visit comes at a time when Toronto has emerged as the <a href="/news/toronto-quietly-experiences-massive-tech-boom-new-york-times">third-largest tech hub in North America</a>, with the city’s AI and machine learning ecosystem at the heart of this growth.</p> <p style="margin-bottom:11px">“Together with the Vector Institute, the Canadian Institute for Advanced Research (CIFAR), MaRS and other partners – all within walking distance of this room – we have created one of the world’s richest pools of talent,” President Gertler said.</p> <p style="margin-bottom:11px">He added that U of T, its local partners and South Korean organizations stand to learn much from each other when it comes to AI research, development, innovation and education.</p> <p style="margin-bottom:11px">“Partnering with Korea’s leading universities, innovative firms and exceptionally talented researchers is an extraordinary opportunity for all parties to benefit as we deepen our collective commitment to excellence and to tackling the world’s most pressing challenges.”</p> <p style="margin-bottom:11px"><span id="cke_bm_1273S" style="display: none;">&nbsp;</span><img alt="Minister of 
Science and ICT Lee Jong-ho speaks at the roundtable in Simcoe Hall. Elissa Strome, Lisa Austin, President Yoon Suk-yeol, Garth Gibson, Meric Gertler and Leah Cowen are present at the table." src="/sites/default/files/2022-09-22-AI-Leaders-Roundtable-%2820%29-crop.jpg" style="width: 750px; height: 500px;"></p> <p style="margin-bottom:11px"><em>(Photo by Johnny Guatto)</em></p> <p style="margin-bottom:11px">President Yoon’s visit to U of T took place during the first day of his two-day visit to Canada, which included a meeting with Prime Minister Justin Trudeau in Ottawa the following day.</p> <p style="margin-bottom:11px">It also came less than two weeks after the government of Ontario concluded a trade mission to South Korea and Japan, led by Vic Fedeli, the province’s minister of economic development, job creation and trade.</p> <p style="margin-bottom:11px">Fedeli, who attended the U of T event, said Toronto’s reputation as a global hub in AI was regularly impressed upon him during his time in South Korea.</p> <p style="margin-bottom:11px">“At every single stop that we made, we heard people talk about Canada, AI, U of T, the Vector Institute – they see Canada as a real leader in AI and they’re very eager to learn,” Fedeli said.</p> <p style="margin-bottom:11px">He noted there was a strong desire in South Korea to see more Korean students come to Canada to further their education in STEM fields, including in AI. “They want a bigger influx of Korean students – and we told them, ‘The door’s open,’ because we really believe this is going to help society. 
We’ve seen some examples of what AI has done and we’re very eager to continue to see the development of AI.”</p> <p style="margin-bottom:11px"><span id="cke_bm_4560S" style="display: none;">&nbsp;</span><img alt="President Yoon Suk-yeol shakes hands with guests inside Simcoe Hall" src="/sites/default/files/2022-09-22-AI-Leaders-Roundtable-%2850%29-crop.jpg" style="width: 750px; height: 500px;"></p> <p style="margin-bottom:11px"><em>(Photo by Johnny Guatto)</em></p> <p style="margin-bottom:11px">Fedeli added that he hoped the high-level meeting would further strengthen the economic relationship between Ontario and South Korea, helping to spark AI advances that give both Ontarian and Korean companies a competitive edge on the global stage.</p> <p style="margin-bottom:11px">Held at Simcoe Hall, the meeting included a roundtable discussion titled “AI for the Better Future of Humanity” that featured AI leaders and luminaries, including Hinton and Lee Jong-ho, the Republic of Korea’s Minister of Science and ICT (information and communication technology).</p> <p style="margin-bottom:11px">The talk, moderated by <b>Leah Cowen </b>(pictured below), U of T’s vice-president, research and innovation, and strategic initiatives, also included contributions from <b>Garth Gibson</b>, president and CEO of the Vector Institute for Artificial Intelligence; <b>Elissa Strome</b>, executive director of the Pan-Canadian AI Strategy at CIFAR; and Professor <b>Lisa Austin</b>, chair in law and technology at U of T’s Faculty of Law and associate director at the <a href="https://srinstitute.utoronto.ca/">Schwartz Reisman Institute for Technology and Society</a>.</p> <p style="margin-bottom:11px"><span id="cke_bm_1983S" style="display: none;">&nbsp;</span><img alt="Professor Leah Cowen speaks at the roundtable" src="/sites/default/files/2022-09-22-AI-Leaders-Roundtable-%2839%29-crop.jpg" style="width: 750px; height: 500px;"></p> <p style="margin-bottom:11px"><em>(Photo by Johnny Guatto)</em></p> 
<p style="margin-bottom:11px">Attendees watched demonstrations by U of T professors and graduate students from the U of T Robotics Institute, as well as presentations by South Korean companies, including Samsung and LG – both of which have expanded their presence and <a href="/news/samsung-chooses-u-t-s-sven-dickinson-lead-new-toronto-ai-centre">connections with Toronto</a> <a href="/news/lg-expands-research-partnership-u-t-focuses-ai-applications-businesses">and U of T</a> in recent years. The event was also used to announce a new U of T exchange program with the South Korean government’s Institute for Information &amp; Communication Technology Planning &amp; Evaluation (IITP).&nbsp;</p> <p style="margin-bottom:11px"><span id="cke_bm_5118S" style="display: none;">&nbsp;</span><img alt="Guests including Scott Mabury, Kelly Hannah Moffat and Wisdom Tettey applaud following remarks by Geoffrey Hinton" src="/sites/default/files/2022-09-22-AI-Leaders-Roundtable-Polina-Teif--41-crop.jpg" style="width: 750px; height: 500px;"></p> <p style="margin-bottom:11px"><em>(Photo by Polina Teif)</em></p> <p style="margin-bottom:11px">On the subject of AI, Hinton said he believes the deep learning revolution is just getting underway and that he expects tremendous growth in the years ahead.</p> <p style="margin-bottom:11px">“We now know that if you take a neural net and you just make it bigger and give it more data and more computing power, it’ll work better. So even with no new scientific insights, things are going to improve,” Hinton said during the roundtable discussion. 
“But we also know there are tens of thousands of brilliant young minds now thinking about how to make these networks better, so there will be many new scientific insights.”</p> <p style="margin-bottom:11px"><span id="cke_bm_2621S" style="display: none;">&nbsp;</span><img alt="Geoffrey Hinton speaks at the podium at Simcoe Hall" src="/sites/default/files/2022-09-22-AI-Leaders-Roundtable-%2841%29-crop.jpg" style="width: 750px; height: 500px;"></p> <p style="margin-bottom:11px"><em>(Photo by Johnny Guatto)</em></p> <p style="margin-bottom:11px">In the long term, Hinton (pictured at the lectern above)&nbsp;said he envisions a revolution in AI hardware led by advancements in “neuromorphic hardware” – computers and hardware that model artificial neural networks.</p> <p style="margin-bottom:11px">“I think Korea may have a big role to play in this,” Hinton said, noting one of the world’s leading experts in this area is Sebastian Seung, Samsung’s president and head of research – who attended the Simcoe Hall event.</p> <p style="margin-bottom:11px">When asked to share his thoughts on how Canada achieved its leadership position in AI, Hinton cited three foundational factors: a tolerant, liberal society that encourages leading researchers to settle here; the federal government’s funding for curiosity-driven basic research; and CIFAR’s funding, in 2004, of the Neural Computation and Adaptive Perception program, which is credited with kickstarting the revolution in deep learning.</p> <p style="margin-bottom:11px">Following the discussion, event attendees, including U of T students, watched presentations on avenues for AI research and collaboration by representatives of five South Korean companies: LG, Samsung, Naver, KT (formerly Korea Telecom) and SK Telecom.</p> <p style="margin-bottom:11px"><span id="cke_bm_3226S" style="display: none;">&nbsp;</span><img alt="Brokoslaw Laschowski runs a robotics demonstration for Alex Mihailidis, President Yoon Suk-yeol and President Meric Gertler" 
src="/sites/default/files/2022-09-22-AI-Leaders-Roundtable-%284%29-crop.jpg" style="width: 750px; height: 500px;"></p> <p style="margin-bottom:11px"><em>(Photo by Johnny Guatto)</em></p> <p style="margin-bottom:11px"><b>Alex Mihailidis</b>, U of T’s associate vice-president, international partnerships, then announced that U of T had signed a memorandum of understanding with IITP, based in Seoul, to launch a bi-national education program in AI.</p> <p style="margin-bottom:11px">“We expect that in the fall of 2023, we will be accepting 30 students from Korea who will be going through a custom-made program around AI and its applications,” Mihailidis said. “This is a groundbreaking program that we expect will not only flourish here in Toronto but will grow – hopefully across our two great countries and around the world.”</p> <p style="margin-bottom:11px"><span id="cke_bm_3794S" style="display: none;">&nbsp;</span><img alt="Xinyu Liu runs a robotic hand demonstration for President Yoon Suk-yeol and President Meric Gertler" src="/sites/default/files/2022-09-22-AI-Leaders-Roundtable-%289%29-crop.jpg" style="width: 750px; height: 500px;"></p> <p style="margin-bottom:11px"><em>(Photo by Johnny Guatto)</em></p> <p style="margin-bottom:11px">Earlier, Mihailidis and President Gertler led President Yoon and Fedeli through four demonstrations showcasing some of the cutting-edge technologies being developed by U of T professors and their graduate students.</p> <p style="margin-bottom:11px">The technologies included: a wearable robotic exoskeleton for walking assistance and rehab demonstrated by Mihailidis and post-doctoral researcher <b>Brokoslaw Laschowski</b>; a sensory soft robotic hand for human-robot interaction demonstrated by Professor <b>Xinyu Liu </b>of the department of mechanical and industrial engineering in the Faculty of Applied Science &amp; Engineering, graduate student <b>Zhanfeng Zhou </b>and post-doctoral researcher <b>Peng Pan</b>; a multimodal perception 
system for autonomous vehicles showcased by <b>Jiachen (Jason) Zhou</b>, a graduate student in robotics and aerospace engineering; and a nanorobot for precision manipulation under an electron microscope that was demonstrated by <b>Yu Sun</b>, a professor in the department of mechanical and industrial engineering and director of the U of T Robotics Institute.</p> <p style="margin-bottom:11px"><span id="cke_bm_5676S" style="display: none;">&nbsp;</span><img alt="Professor Yu Sun shows President Yoon Suk-yeol an electronic device" src="/sites/default/files/2022-09-22-AI-Leaders-Roundtable-%2810%29-crop.jpg" style="width: 750px; height: 500px;"></p> <p style="margin-bottom:11px"><em>(Photo by Johnny Guatto)</em></p> <h3 style="margin-bottom: 11px;"><a href="https://www.koreatimes.co.kr/www/nation/2022/09/356_336616.html">Read a story about the visit in the <i>Korea Times</i></a></h3> <p style="margin-bottom:11px">&nbsp;</p> <p style="margin-bottom:11px">&nbsp;</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Mon, 26 Sep 2022 14:02:18 +0000 Christopher.Sorensen 176928 at Researchers design 'socially aware' robots that can anticipate – and safely avoid – people on the move /news/researchers-design-socially-aware-robots-can-anticipate-and-safely-avoid-people-move <span class="field field--name-title field--type-string field--label-hidden">Researchers design 'socially aware' robots that can anticipate – and safely avoid – people on the move</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/Hugues-Thomas-robotics-story-weblead.jpg?h=afdc3185&amp;itok=92aueC8y 370w, /sites/default/files/styles/news_banner_740/public/Hugues-Thomas-robotics-story-weblead.jpg?h=afdc3185&amp;itok=POt2dsrM 
740w, /sites/default/files/styles/news_banner_1110/public/Hugues-Thomas-robotics-story-weblead.jpg?h=afdc3185&amp;itok=weHgrGz7 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/Hugues-Thomas-robotics-story-weblead.jpg?h=afdc3185&amp;itok=92aueC8y" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2022-05-17T12:54:39-04:00" title="Tuesday, May 17, 2022 - 12:54" class="datetime">Tue, 05/17/2022 - 12:54</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item">Hugues Thomas and his collaborators at the U of T Institute for Aerospace Studies created a new method for robot navigation based on self-supervised deep learning (photo by Safa Jinje)</div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/taxonomy/term/6738" hreflang="en">Safa Jinje</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/artificial-intelligence" hreflang="en">Artificial Intelligence</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div 
class="field__item"><a href="/news/tags/machine-learning" hreflang="en">machine learning</a></div> <div class="field__item"><a href="/news/tags/research-innovation" hreflang="en">Research &amp; Innovation</a></div> <div class="field__item"><a href="/news/tags/robotics" hreflang="en">Robotics</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>A team of researchers led by University of Toronto Professor&nbsp;<strong>Tim Barfoot&nbsp;</strong>is using a&nbsp;new strategy that allows robots to&nbsp;avoid colliding&nbsp;with people by predicting the future locations of dynamic obstacles in their path.&nbsp;</p> <p>The project, which is supported by&nbsp;Apple Machine Learning, will be presented at the International Conference on Robotics and Automation in Philadelphia at the end of May.</p> <p>The results from a simulation, which are not yet peer-reviewed,&nbsp;<a href="https://arxiv.org/abs/2108.10585">are available on the arXiv preprint service</a>.&nbsp;</p> <p>“The principle of our work is to have a robot predict what people are going to do in the immediate future,” says&nbsp;<strong>Hugues Thomas</strong>, a post-doctoral researcher in Barfoot’s lab at the U of T&nbsp;Institute for Aerospace Studies in the Faculty of Applied Science &amp; Engineering. “This allows the robot to anticipate the movement of people it encounters rather than react once confronted with those obstacles.”&nbsp;</p> <p>To decide where to move, the robot makes use of Spatiotemporal Occupancy Grid Maps (SOGM). 
These are 3D grid maps maintained in the robot’s processor, with each 2D grid cell containing predicted information about the activity in that space at a specific time.&nbsp;The robot chooses its future actions by processing these maps through existing trajectory-planning algorithms.&nbsp;&nbsp;</p> <p>Another key tool used by the team is light detection and ranging (lidar), a remote sensing technology similar to radar&nbsp;except that it uses light instead of radio waves. Each ping&nbsp;of the lidar creates a point stored in the robot’s memory.&nbsp;Previous work by the team has focused on labelling these points based on their dynamic properties. This helps the robot recognize different types of objects within its surroundings.&nbsp;</p> <p>The team’s SOGM network is currently able to recognize four lidar point categories:&nbsp;the ground; permanent fixtures, such as walls; things that are movable but motionless, such as chairs and tables; and dynamic obstacles, such as people. No human labelling of the data is needed.&nbsp;&nbsp;</p> <p>“With this work, we hope to enable robots to navigate through crowded indoor spaces in a more socially aware manner,” says Barfoot. “By predicting where people and other objects will go, we can plan paths that anticipate what dynamic elements will do.”&nbsp;&nbsp;</p> <p>In the paper, the team reports successful results from the algorithm carried out in simulation. The next challenge is to show similar performance&nbsp;in real-world settings, where&nbsp;human actions can be difficult to predict. As part of this effort, the team has tested their design on the first floor of U of T’s Myhal Centre for Engineering Innovation &amp; Entrepreneurship, where the robot was able to move past busy students.&nbsp;&nbsp;</p> <p>“When we do experiment in simulation, we have agents that are encoded to a certain behaviour&nbsp;and they will go to a certain point by following the best trajectory to get there,” says Thomas. 
“But that’s not what people do in real life.”&nbsp;</p> <p>&nbsp;</p> <div class="media_embed" height="422px" width="750px"><iframe allow="autoplay" height="422px" src="https://drive.google.com/file/d/1wbq3lVdHZbU_4WSIz7-ArQN-g9fah-gL/preview" width="750px"></iframe></div> <p>&nbsp;</p> <p>When people move through spaces, they may hurry or stop abruptly to talk to someone else or turn in a completely different direction. To deal with this kind of behaviour,&nbsp;the network employs a machine learning technique known as self-supervised learning.&nbsp;&nbsp;</p> <p>Self-supervised learning contrasts with other machine-learning techniques, such as reinforcement learning, where the algorithm learns to perform a task by maximizing a notion of reward in a trial-and-error manner. While this approach works well for some tasks – for example, a computer learning to play a game&nbsp;such as chess or Go – it is not ideal for this type of navigation.&nbsp;</p> <p>“With reinforcement learning, you create a black box that makes it difficult to understand the connection between the input – what the robot sees – and the output, or what the robot does,” says Thomas. “It would also require the robot to fail many times before it learns the right calls, and we didn’t want our robot to learn by crashing into people.”&nbsp;&nbsp;&nbsp;</p> <p>By contrast, self-supervised learning is simple and comprehensible, meaning that it’s easier to see how the robot is making its decisions. This approach is also point-centric rather than object-centric, which means the network has a closer interpretation of the raw sensor data, allowing for multimodal predictions.&nbsp;&nbsp;</p> <p>“Many traditional methods detect people as individual objects and create trajectories for them.&nbsp;But since our model is point-centric, our algorithm does not quantify people as individual objects, but recognizes areas where people should be. 
And if you have a larger group of people, the area gets bigger,” says Thomas.&nbsp;&nbsp;&nbsp;</p> <p>“This research offers a promising direction that&nbsp;could have positive implications in areas such as autonomous driving and robot delivery, where an environment is not entirely predictable.”&nbsp;&nbsp;</p> <p>In the future, the team wants to see if they can scale up their network to learn more subtle cues from dynamic elements in a scene.&nbsp;</p> <p>“This will take a lot more training data,” says Barfoot. “But it should be possible because we’ve set ourselves up to generate the data in a more automatic way: where the robot can gather more data itself while navigating, train better predictive models when not in operation&nbsp;and then use these the next time it navigates a space.”&nbsp;&nbsp;</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Tue, 17 May 2022 16:54:39 +0000 Christopher.Sorensen 174762 at