Chris Heckman

Robots can save us if they can see us: Heckman receives CAREER award

Oct. 15, 2024 | By Grace Wilson | Research

A radar breakthrough in robotic sensing, helping systems see and act in smoke and darkness, has been recognized with a $600,000 National Science Foundation award.

[Photo: A SPOT robot with a light enters a dark mine tunnel.]

Autonomous robots could save human lives more easily if they could "see" and react better in adverse environmental conditions. By pursuing the possibilities of using millimeter wave radar for robotic perception, Christoffer Heckman (/cs/christoffer-heckman) is making this fundamental shift possible.

An associate professor of computer science at CU Boulder, Heckman will receive $600,000 over the next five years through the National Science Foundation's CAREER award for this research.

Currently, most robots use sensors based on the visible spectrum of light, like cameras or lasers. In environments with smoke, fog or dust, however, visible light bounces off these particles.

Robots, like humans, can't plan their movements accurately if they don't know where they are or what is around them.

"Humans operating in a visually degraded environment are in trouble. We cannot solve that problem, but incorporating millimeter wave radar could enable our robots to do things that even humans can't do," Heckman said.

This is because millimeter waves pass through smoke, fog and dust.

A new path

Traditionally, Heckman explained, radar has been viewed with skepticism for these kinds of tasks. The sensors have been too large and energy-intensive for agile robots, and the long wavelength of radar creates complex, confused signals.
With the advent of new, smaller system-on-a-chip radar sensors, the traditional energy and size limitations have been removed. This leaves the complexity of radar waveform signals.

"This is a fascinating problem," Heckman explained. "People really understand how radar works, down to equations that have existed for almost a century, but radar can be difficult to precisely interpret in cluttered environments. It bounces around within an enclosed area, and can pass right through small objects."

Heckman's solution is to fuse the knowledge we have about electromagnetic waves with supervised machine learning.

Datasets from high-fidelity optical sensors are paired with low-fidelity radar signals of the same scene, and machine learning then cleans the radar signal to match the high-fidelity scene. That training can then be used to build clear radar reconstructions of environments where optical sensors are obscured (see the sketch at the end of this article).

This powerful synthesis of physics and computer science stands to dramatically improve the capability of radar as a perception sensor.

Beyond sensing

Heckman has further plans as well. He wants to use this advance to support quick and accurate actions and replanning for autonomous systems.

Robotic thinking has traditionally followed the saying "sense, plan, act": a robot understands a scene, plans its route according to its inputs, and acts on that plan. Segmenting these activities, however, can lead to slow movement and an inability to react to changes.

Heckman seeks to use radar in conjunction with optical and lidar sensors to improve navigation strategies on the fly as a robot moves through a space, allowing it to respond more quickly to changes.

Robots that can plan for themselves better and can see into obscured spaces have a valuable role in search-and-rescue, firefighting and space missions.

Heckman's MARBLE team has used robots to explore dark caves (/engineering/2023/11/17/building-next-generation-autonomous-robots-serve-humanity) in the DARPA Subterranean Challenge and to assist firefighters by finding active embers. As the research advances made possible by this CAREER Award take shape, where will robots be able to see next?
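To make the paired-training idea above concrete, here is a minimal sketch of how that kind of supervision could be set up. It assumes paired radar frames and optical-derived depth maps of the same scenes; the data shapes, toy network and training settings are illustrative assumptions, not the MARBLE team's actual pipeline.

```python
# Hypothetical sketch: train a small network to "clean" low-fidelity radar
# frames so they match depth maps derived from high-fidelity optical/lidar
# sensors of the same scene. Shapes, architecture and settings are
# illustrative assumptions only.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Toy paired data: radar "images" (input) and optical-derived depth (target),
# both H x W grids covering the same scene.
radar = torch.randn(256, 1, 64, 64)          # noisy, low-fidelity radar frames
optical_depth = torch.randn(256, 1, 64, 64)  # high-fidelity ground truth

loader = DataLoader(TensorDataset(radar, optical_depth), batch_size=16, shuffle=True)

# A small convolutional model that maps a radar frame to a cleaned depth estimate.
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)   # penalize mismatch with the optical scene
        loss.backward()
        optimizer.step()

# At deployment only the radar input is needed, so the trained model can
# produce optical-quality reconstructions in smoke, fog or dust.
```

In practice the "ground truth" would come from lidar or camera reconstructions captured in clear conditions and the model would be far more sophisticated, but the pairing logic is the same: clear-condition sensors supervise the radar.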
Is the World Ready for Self-Driving Cars?

Nov. 27, 2023 | By Daniel Oberhaus

Tags: Chris Heckman, Leanne Hirschfield, Majid Zamani, Sidney D'mello

[Photo: An illustration of two people talking in a futuristic autonomous vehicle.]

Autonomous vehicles are hitting the road in cities across the U.S. Can they be trusted? Researchers from the Department of Computer Science weigh in.
Read the full story in the Coloradan Alumni Magazine: /coloradan/2023/11/06/world-ready-self-driving-cars

Building next generation autonomous robots to serve humanity

Nov. 17, 2023 | By Jeff Zhender

[Photo: Edgar Mine robot]
One thousand feet underground, a four-legged creature scavenges through tunnels in pitch darkness. The sound of its movements echoes eerily off the walls, but it is not to be feared – this is no wild animal; it is an autonomous rescue robot.

Read the full story: /engineering/2023/11/17/building-next-generation-autonomous-robots-serve-humanity

Keeping water on the radar: Machine learning to aid in essential water cycle measurement

May 20, 2022 | By Grace Wilson | News

[Photo: Water and radar abstract image]

Department of Computer Science assistant professor Chris Heckman and CIRES research hydrologist Toby Minear have been awarded a Grand Challenge Research & Innovation Seed Grant (/researchinnovation/research-development/funding/rio-funding-opportunities-competitions/seed-grant-winners-2022) to create an instrument that could revolutionize our understanding of the amount of water in our rivers, lakes, wetlands and coastal areas by greatly increasing the places where we measure it.

The new low-cost instrument would use radar and machine learning to quickly and safely measure water levels in a variety of scenarios.
This work could prove vital: the USDA recently proclaimed (https://www.denverpost.com/2022/04/27/colorado-drought-natural-disaster-wildfire-usda/) the entire state of Colorado a "primary natural disaster area" due to an ongoing drought that has made the American West potentially the driest it has been in over a millennium (https://www.denverpost.com/2022/02/19/colorado-megadrought-study-years/). Other climate records across the globe also continue to be broken, year after year. Our understanding of the changing water cycle has never been more essential at a local, national and global level.

A fundamental part of developing this understanding is knowing changes in the surface height of bodies of water. Currently, measuring changing water surface levels involves high-cost sensors that are easily damaged by floods, difficult to install and time-consuming to maintain.

"One of the big issues is that we have limited locations where we take measurements of surface water heights," Minear said.

A new method

Heckman and Minear are aiming to change this by building a low-cost instrument that doesn't need to be in a body of water to read its average water surface level. It can instead be placed several meters away, safely elevated above floods.

The instrument, roughly the size of two credit cards stacked on one another, relies on high-frequency radio waves, often referred to as "millimeter wave," which have only been made commercially accessible in the last decade.

Through radar, these short waves can be used to measure the distance between the sensor and the surface of a body of water with great specificity. As the water's surface level rises or falls over time, the distance between the sensor and the water's surface changes (a sketch of this range calculation appears below).

The instrument's small form factor and potential off-the-shelf usability separate it from previous efforts to identify water through radar.

It also streamlines data transmitted over often limited and expensive cellular and satellite networks, lowering the cost.

In addition, the instrument will use machine learning to determine whether a change in measurements could be a temporary outlier, like a bird swimming by, and whether or not a surface is liquid water (a toy outlier filter is sketched at the end of this article).

Machine learning is a form of data analysis that seeks to identify patterns in data in order to make decisions with little human intervention.

While radar has traditionally been used to detect solid objects, liquids require different considerations to avoid being misidentified. Heckman believes that traditional ways of processing radar may not be enough to measure liquid surfaces at such close proximity.

"We're considering moving further up the radar processing chain and reconsidering how some of these algorithms have been developed in light of new techniques in this kind of signal processing," Heckman said.
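As for how a small radar chip measures distance "with great specificity": commercial millimeter-wave chips typically use frequency-modulated continuous-wave (FMCW) ranging, and assuming such a sensor, the core range calculation is a one-liner. The chirp parameters and beat frequency below are made-up illustrative values, not the actual instrument's configuration.

```python
# Hypothetical sketch: converting an FMCW radar beat frequency into the
# distance from the sensor to the water surface. Parameters are illustrative.

C = 299_792_458.0  # speed of light, m/s

def fmcw_range(beat_freq_hz: float, bandwidth_hz: float, chirp_time_s: float) -> float:
    """Range R = c * f_beat / (2 * S), where S = bandwidth / chirp duration."""
    slope = bandwidth_hz / chirp_time_s       # chirp slope, Hz per second
    return C * beat_freq_hz / (2.0 * slope)   # meters

# Example: a 4 GHz chirp swept over 40 microseconds, and a measured beat
# frequency of about 1.33 MHz.
bandwidth = 4e9      # Hz
chirp_time = 40e-6   # s
beat = 1.33e6        # Hz

distance_to_water = fmcw_range(beat, bandwidth, chirp_time)
print(f"Estimated distance to water surface: {distance_to_water:.2f} m")  # ~2 m

# If the sensor is mounted at a known height above a reference datum, the
# water surface level is simply that height minus the measured distance.
```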
Citizen science

In addition to possible fundamental shifts in radar processing, the project could empower communities of citizen scientists, according to Minear.

"Right now, many of the systems that we use need an expert installer. Our idea is to internalize some of those expert decisions, which takes out a lot of the cost and makes this instrument more friendly to a citizen science approach," he said.

By lowering the barrier to entry for water surface level measurement through low-cost devices with smaller data requirements, the researchers broaden opportunities for communities, even in areas with limited cellular networks, to measure their own water sources.

The team is also committing to open-source principles to ensure that anyone can use and build on the technology, allowing new innovations to happen more quickly and democratically.

Broader applications

Minear, who is a Science Team and Cal/Val Team member for the upcoming NASA Surface Water and Ocean Topography (SWOT) Mission, also hopes that the new instrument could help check the accuracy of water surface level measurements made by satellites.

These sensors could also give local, regional and national communities more insight into their water usage and supply over time and could be used to help make evidence-informed policy decisions about water rights and usage.

"I'm very excited about the opportunities that are presented by getting data in places that we don't currently get it. I anticipate that this could give us better insight into what is happening with our water sources, even in our backyard," said Heckman.
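The article does not describe the team's outlier-detection model in detail, so as a rough stand-in for the learned classifier they envision, here is a deliberately simple rolling-median screen that flags water-level readings deviating sharply from recent history (for instance, a bird passing under the sensor). The window size and threshold are made-up values.

```python
# Hypothetical sketch: flag temporary outliers in a stream of water-level
# readings using a rolling median. This is a simple stand-in, not the
# machine learning approach described in the article.
from collections import deque
from statistics import median

def flag_outliers(levels_m, window=11, threshold_m=0.15):
    """Yield (reading, is_outlier) pairs based on deviation from a rolling median."""
    history = deque(maxlen=window)
    for reading in levels_m:
        if len(history) >= window // 2:
            baseline = median(history)
            is_outlier = abs(reading - baseline) > threshold_m
        else:
            is_outlier = False       # not enough history yet to judge
        yield reading, is_outlier
        if not is_outlier:           # only trust plausible readings as history
            history.append(reading)

# Example: a slowly rising river with one spurious reading.
readings = [1.20, 1.21, 1.21, 1.22, 1.23, 1.23, 0.70, 1.24, 1.25]
for level, bad in flag_outliers(readings):
    print(f"{level:.2f} m {'<- outlier' if bad else ''}")
```

A deployed instrument would need to distinguish genuine rapid changes (a flood wave) from transient clutter, which is where a learned model, rather than a fixed threshold, earns its keep.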
Robotics researchers ready to start field-testing their code

June 24, 2020

[Photo: Dan Torres wears a mask while working on one of the DARPA Subterranean Challenge robots.]

Out of 20 students who work in Chris Heckman's lab, five have been approved to head back to their space in the ECES wing of the Engineering Center. There, they'll be able to field-test the software they've been developing in a simulation platform, which they also had to build from scratch to accommodate remote teamwork.

Read the full story: /engineering/2020/06/24/robotics-researchers-ready-start-field-testing-their-code

It has to work: Sub T Challenge sharpens students' skill in the field

April 20, 2020

[Photo: Michael Miles and Daniel Torres talk with another challenge participant during a break outside of the DARPA Subterranean Challenge course area.]
CU Boulder is one of several funded teams in the Subterranean Challenge, a competition launched by the U.S. Defense Advanced Research Projects Agency to stimulate and test ideas around autonomous robot use in difficult underground environments.

Read the full story: /engineering/2020/04/17/it-has-work-sub-t-challenge-sharpens-students-skill-field

Drones go underground in high-stakes competition

Feb. 6, 2020

[Photo: A close-up look at the LIDAR sensors on one of the MARBLE team's robots.]

Assistant Professor Chris Heckman and his team are helping to design robots that view their surroundings using three different types of sensors, including a traditional camera, radar and a laser-based system called Light Detection and Ranging (LIDAR).

Read the full story: /today/2020/02/05/drones-go-underground-high-stakes-competition

Robotics researchers have a duty to prevent autonomous weapons

Dec. 4, 2019

[Photo: Three operators work on a drone in the middle of a grassy field.]
"> </div> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cs/taxonomy/term/478" hreflang="en">Chris Heckman</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-content-media ucb-article-content-media-above"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> <div class="ucb-article-text d-flex align-items-center" itemprop="articleBody"> </div> </div> </div> </div> <div>Assistant Professor Christoffer Heckman writes about the ethical challenges of AI-enhanced autonomous systems in The Conversation. </div> <script> window.location.href = `https://theconversation.com/robotics-researchers-have-a-duty-to-prevent-autonomous-weapons-126483`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Wed, 04 Dec 2019 16:50:52 +0000 Anonymous 1373 at /cs