{"id":2217,"date":"2021-01-11T18:57:51","date_gmt":"2021-01-11T18:57:51","guid":{"rendered":"https:\/\/thenextweb.com\/?p=1333660"},"modified":"2021-01-11T18:57:51","modified_gmt":"2021-01-11T18:57:51","slug":"ai-devs-claim-theyve-created-a-robot-that-demonstrates-a-primitive-form-of-empathy","status":"publish","type":"post","link":"https:\/\/www.londonchiropracter.com\/?p=2217","title":{"rendered":"AI devs claim they\u2019ve created a robot that demonstrates a \u2018primitive form of empathy\u2019"},"content":{"rendered":"\n<p>Columbia University researchers have developed a robot that displays&nbsp;a \u201cglimmer of empathy\u201d by visually predicting how another&nbsp;machine will behave.<\/p>\n<p>The robot learns to forecast its partner\u2019s future actions and goals by observing a few video frames of its&nbsp;actions.<\/p>\n<p>The researchers first programmed the partner robot to move towards green circles in a playpen around 3\u00d72 feet in size. It would sometimes move directly towards a green circle spotted by its cameras, but if the circles were hidden by an obstacle, it would either roll towards a different circle or not move at all.<\/p>\n<figure class=\"post-image post-mediaBleed aligncenter\"><img decoding=\"async\" loading=\"lazy\" class=\"size-full wp-image-1333661 lazy\" src=\"https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2021\/01\/Screenshot-2021-01-11-at-16.58.57.png\" alt width=\"1206\" height=\"680\" sizes=\"(max-width: 1206px) 100vw, 1206px\" data-lazy=\"true\" data-srcset=\"https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2021\/01\/Screenshot-2021-01-11-at-16.58.57.png 1206w, https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2021\/01\/Screenshot-2021-01-11-at-16.58.57-280x158.png 280w, https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2021\/01\/Screenshot-2021-01-11-at-16.58.57-479x270.png 479w, 
https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2021\/01\/Screenshot-2021-01-11-at-16.58.57-239x135.png 239w, https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2021\/01\/Screenshot-2021-01-11-at-16.58.57-796x449.png 796w\"><figcaption>Credit: Creative Machines Lab\/Columbia Engineering<\/figcaption><figcaption>The machine uses visual observations to predict how its partner robot will try to reach the green dots.<\/figcaption><\/figure>\n<p>After the observer robot watched the actor\u2019s&nbsp;behavior for&nbsp;roughly two hours, it started guessing its partner\u2019s future movements. 
It eventually managed to predict the subject\u2019s goal and path 98 out of 100 times.<\/p>\n<p><em>[Read:&nbsp;<a href=\"https:\/\/thenextweb.com\/dutch-disruptors\/2020\/12\/15\/meet-the-4-scale-ups-using-data-to-save-the-planet\/\">Meet the 4 scale-ups using data to save the planet<\/a>]<\/em><\/p>\n<p>Boyuan Chen, the lead author of the study, <a href=\"https:\/\/www.engineering.columbia.edu\/press-release\/lipson-robot-displays-empathy\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">said<\/a> the initial results were \u201cvery exciting:\u201d<\/p>\n<blockquote readability=\"13\">\n<p><span>Our findings begin to demonstrate how robots can see the world from another robot\u2019s perspective. The ability of the observer to put itself in its partner\u2019s shoes, so to speak, and understand, without being guided, whether its partner could or could not see the green circle from its vantage point, is perhaps a primitive form of empathy.<\/span><\/p>\n<\/blockquote>\n<p>The team believes their approach could help pave the way towards a&nbsp;robotic \u201cTheory of Mind,\u201d the faculty humans use to&nbsp;understand other people\u2019s thoughts and feelings.<\/p>\n<p><span>\u201cWe hypothesize that such visual behavior modeling is an essential cognitive ability that will allow machines to understand and coordinate with surrounding agents, while sidestepping the notorious <a href=\"http:\/\/www.scholarpedia.org\/article\/Symbol_grounding_problem\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">symbol grounding problem<\/a>,\u201d&nbsp;the researchers said in <a href=\"https:\/\/www.nature.com\/articles\/s41598-020-77918-x\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">their study paper<\/a>.<\/span><\/p>\n<p><iframe loading=\"lazy\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/f2U7_jZVxcU?start=12&amp;feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; 
picture-in-picture\" allowfullscreen><\/iframe><\/p>\n<p>The researchers admit that there are many limitations to their project. They note that the observer&nbsp;hasn\u2019t yet handled complex actor behavior, and that they gave it a full overhead view, when in practice it would typically only have a first-person perspective or partial information.<\/p>\n<p>They also warn that giving robots the ability to anticipate how humans think could lead them to manipulate our thoughts.<\/p>\n<p>\u201cWe recognize that robots aren\u2019t going to remain passive instruction-following machines for long,\u201d said study lead&nbsp;Professor Hod Lipson.<\/p>\n<p>\u201cLike other forms of advanced AI, we hope that policymakers can help keep this kind of technology in check, so that we can all benefit.\u201d<\/p>\n<p>Nonetheless, the team believes the \u201cvisual foresight\u201d they\u2019ve demonstrated could deepen our understanding of human social behavior \u2014 and lay the foundations for more socially adept machines.<\/p>\n<p>You can read <a href=\"https:\/\/www.nature.com\/articles\/s41598-020-77918-x\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">their study paper<\/a>&nbsp;in&nbsp;<em>Nature Scientific Reports<\/em>&nbsp;and find all their code and data&nbsp;<a href=\"https:\/\/github.com\/BoyuanChen\/visual_behavior_modeling\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">on GitHub<\/a>.<\/p>\n<p class=\"c-post-pubDate\"> Published January 11, 2021 \u2014 18:57 UTC <\/p>\n<p> <a href=\"https:\/\/thenextweb.com\/neural\/2021\/01\/11\/ai-devs-claim-theyve-created-a-robot-that-demonstrates-a-primitive-form-of-empathy\/\">Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Columbia University researchers have developed a robot that displays&nbsp;a \u201cglimmer of empathy\u201d by visually predicting how another&nbsp;machine will behave. 
The robot learns to forecast its partner\u2019s future actions and goals by observing&#8230;<\/p>\n","protected":false},"author":1,"featured_media":2218,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[1],"tags":[],"_links":{"self":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts\/2217"}],"collection":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=2217"}],"version-history":[{"count":0,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts\/2217\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/media\/2218"}],"wp:attachment":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=2217"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=2217"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=2217"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}