{"id":15,"date":"2025-04-28T16:12:18","date_gmt":"2025-04-28T07:12:18","guid":{"rendered":"https:\/\/www.rs.mech.tohoku.ac.jp\/?page_id=15"},"modified":"2025-04-29T14:41:18","modified_gmt":"2025-04-29T05:41:18","slug":"%e7%a0%94%e7%a9%b6%e7%b4%b9%e4%bb%8b","status":"publish","type":"page","link":"https:\/\/www.rs.mech.tohoku.ac.jp\/?page_id=15","title":{"rendered":"Projects (EN)"},"content":{"rendered":"\n<p class=\"has-global-color-8-color has-text-color has-background has-link-color wp-elements-2416623e9ccd8eac998c0246f5aa44bc\" style=\"background-color:#3f168721\"><strong>See also our&nbsp;<a href=\"https:\/\/www.youtube.com\/channel\/UCo2lNxVWWJLTi2M9P4oCQmA\" target=\"_blank\" rel=\"noreferrer noopener\">YouTube channel<\/a>&nbsp;for more information.<\/strong><\/p>\n\n\n\n<h4 class=\"wp-block-heading has-text-align-center has-base-color has-global-color-8-background-color has-text-color has-background has-link-color has-medium-font-size wp-elements-4c86d60841bc5c1ad461d1c8d020f14e\"><strong>3 main research themes<\/strong><\/h4>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Motion Generation for Mobile Manipulators to Assist in Putting on a Jacket While Walking\" width=\"1200\" height=\"675\" src=\"https:\/\/www.youtube.com\/embed\/THbDrQR8Yt0?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">Manipulation planning and learning of deformable objects<\/h4>\n\n\n\n<p>Tasks 
involving deformable objects arise in many environments, such as living spaces, factories, and logistics sites, but automating them remains difficult. Our laboratory aims to research and systematize the modeling, recognition, manipulation, and behavioral learning of deformable objects. We are tackling this with a range of methods, including traditional image processing, extensions of motion planning, deep learning, reinforcement learning, and imitation learning.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Generation of Cloth Manipulation Procedures Based on Predictions<\/li>\n\n\n\n<li>Automatic Acquisition of Folding Tasks<\/li>\n\n\n\n<li>Simulations for Deformable Objects<\/li>\n\n\n\n<li>Differentiable Simulation<\/li>\n<\/ul>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Viewpoint Planning for Object Identification Using Visual Experience According to Long-Term Activity\" width=\"1200\" height=\"675\" src=\"https:\/\/www.youtube.com\/embed\/5K3yOI3n8SM?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">Motion learning and planning for mobile manipulators<\/h4>\n\n\n\n<p>This research focuses on the intelligence of robots equipped with robotic arms mounted on mobile platforms (mobile manipulators). Mobile manipulators combine mobility with object manipulation capabilities, enabling a wide range of applications. However, they must cope with redundant degrees of freedom and with the complexity of the surrounding environment. 
Our laboratory is advancing several studies from the perspectives of vision, motion planning, and behavioral learning. <\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>System Integration of Household Support Robots<\/li>\n\n\n\n<li>Efficiency of Object Recognition Actions Based on Long-Term Activity Experience<\/li>\n\n\n\n<li>Simultaneous Execution of Task Planning and Motion Planning<\/li>\n<\/ul>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Assistance in dressing a hemiplegic patient with pants using a branching arm robot\" width=\"1200\" height=\"675\" src=\"https:\/\/www.youtube.com\/embed\/h2Jwj6Se7ug?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">Action support by autonomous robots<\/h4>\n\n\n\n<p>This research focuses on robot systems that support human tasks while staying close to people. Such robots must not proceed at their own pace; instead, they must recognize and predict human actions and intentions and move in coordination with the person. Our laboratory is working on supporting daily activities such as dressing and fetching objects. 
Here are some of our research achievements.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Dressing Assistance System for Individuals with Hemiplegia<\/li>\n\n\n\n<li>Proposal and Implementation of Branch-Type Robots for Task Support<\/li>\n\n\n\n<li>Object Manipulation Skills by Imitating Human Actions<\/li>\n<\/ul>\n<\/div>\n<\/div>\n\n\n\n<h4 class=\"wp-block-heading has-text-align-center has-base-color has-global-color-8-background-color has-text-color has-background has-link-color has-medium-font-size wp-elements-5d531c8d36b422f199a6d3fb7d8767cd\"><strong>Other research themes<\/strong><\/h4>\n\n\n\n<p class=\"has-text-align-center\">Other research topics include the following.<\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"[IROS2023] Recognising Affordances in Predicted Futures\" width=\"1200\" height=\"675\" src=\"https:\/\/www.youtube.com\/embed\/lHMZTLFtX44?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">Autonomous Behavior Control of Robots Using Foundation Models<\/h4>\n\n\n\n<p>Robots need the ability to autonomously make decisions and act according to the environment and situation. Foundation models provide a broad base of knowledge that supports a robot&#8217;s decision-making. 
By using this, robots are endowed with high flexibility in generating actions, enabling them to efficiently perform complex tasks.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Consideration of Dynamic Phenomena in Action Generation for Procedural Tasks<\/li>\n\n\n\n<li>Generation of Appropriate Pushing and Avoidance Actions in Obstructed Situations<\/li>\n<\/ul>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"975\" height=\"589\" src=\"https:\/\/www.rs.mech.tohoku.ac.jp\/wp-content\/uploads\/2025\/04\/suzuki.png\" alt=\"\" class=\"wp-image-33\" srcset=\"https:\/\/www.rs.mech.tohoku.ac.jp\/wp-content\/uploads\/2025\/04\/suzuki.png 975w, https:\/\/www.rs.mech.tohoku.ac.jp\/wp-content\/uploads\/2025\/04\/suzuki-300x181.png 300w, https:\/\/www.rs.mech.tohoku.ac.jp\/wp-content\/uploads\/2025\/04\/suzuki-768x464.png 768w\" sizes=\"auto, (max-width: 975px) 100vw, 975px\" \/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">Task and motion planning for multi-DoF robots<\/h4>\n\n\n\n<p>Robots with many degrees of freedom can perform tasks in various postures. However, it becomes necessary to carefully consider which posture to adopt. 
Therefore, we have proposed evaluation criteria and methods for selecting appropriate postures, as well as motion generation methods that leverage the high degrees of freedom.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Grasp Planning Using Automatically Generated Object Models<\/li>\n\n\n\n<li>Grasp Posture Planning Considering Possible Errors<\/li>\n\n\n\n<li>Obstacle Avoidance Method Using Virtual Arms<\/li>\n<\/ul>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-4-3 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Autonomous robot navigation\" width=\"1200\" height=\"900\" src=\"https:\/\/www.youtube.com\/embed\/GdyuwQnC7wI?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">Vision system for field robots<\/h4>\n\n\n\n<p>This research theme aims to contribute to the automation of outdoor tasks, which are facing labor shortages. 
Our laboratory has been advancing research on recognition functions to assist with search operations, inspection tasks, and the harvesting of agricultural products.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Image Recognition System to Support Search and Situation Confirmation Activities<\/li>\n\n\n\n<li>Detection of Harvested Produce for Automatic Harvesting Machines of Head-Type Vegetables<\/li>\n\n\n\n<li>Navigation Algorithm for Autonomous Mobile Robots<\/li>\n<\/ul>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"A Versatile End-Effector for Pick-and-Release of Fabric Parts\" width=\"1200\" height=\"675\" src=\"https:\/\/www.youtube.com\/embed\/2T4vI_pzroo?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">Hardware fabrication suitable for object manipulation in the manufacturing process<\/h4>\n\n\n\n<p>In the context of automating product manufacturing, we are exploring new robot hands and sensors that are not bound by conventional thinking.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>End Effector for Fabric Winding<\/li>\n\n\n\n<li>End Effector for Cable Wiring<\/li>\n\n\n\n<li>Pair of End Effectors to Facilitate Fabric Unfolding Tasks<\/li>\n<\/ul>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 
wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"UnifiedFootMotionMapGeneration\" width=\"1200\" height=\"675\" src=\"https:\/\/www.youtube.com\/embed\/gF0s8I2cghc?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">Using robotics in human work measurement<\/h4>\n\n\n\n<p>We are conducting research that analyzes human work skills so they can be transferred to robots and, conversely, uses robotics technology to elucidate those skills.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Measurement of Finger Movements and Force Information During Sewing Tasks<\/li>\n\n\n\n<li>Construction of Measurement Systems for Wall Painting Tasks and Analysis of Skilled Workers&#8217; Abilities<\/li>\n\n\n\n<li>Estimation of Walking Movements of People Moving Over Wide Areas<\/li>\n<\/ul>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n","protected":false},"excerpt":{"rendered":"<p>See also our&nbsp;YouTube channel&nbsp;for more information. 
3 main research themes Manipulation planning and  &#8230; <a title=\"Projects (EN)\" class=\"read-more\" href=\"https:\/\/www.rs.mech.tohoku.ac.jp\/?page_id=15\" aria-label=\"Projects (EN) \u306b\u3064\u3044\u3066\u3055\u3089\u306b\u8aad\u3080\">\u7d9a\u304d\u3092\u8aad\u3080<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"parent":78,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-15","page","type-page","status-publish"],"_links":{"self":[{"href":"https:\/\/www.rs.mech.tohoku.ac.jp\/index.php?rest_route=\/wp\/v2\/pages\/15","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.rs.mech.tohoku.ac.jp\/index.php?rest_route=\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/www.rs.mech.tohoku.ac.jp\/index.php?rest_route=\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/www.rs.mech.tohoku.ac.jp\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.rs.mech.tohoku.ac.jp\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=15"}],"version-history":[{"count":12,"href":"https:\/\/www.rs.mech.tohoku.ac.jp\/index.php?rest_route=\/wp\/v2\/pages\/15\/revisions"}],"predecessor-version":[{"id":136,"href":"https:\/\/www.rs.mech.tohoku.ac.jp\/index.php?rest_route=\/wp\/v2\/pages\/15\/revisions\/136"}],"up":[{"embeddable":true,"href":"https:\/\/www.rs.mech.tohoku.ac.jp\/index.php?rest_route=\/wp\/v2\/pages\/78"}],"wp:attachment":[{"href":"https:\/\/www.rs.mech.tohoku.ac.jp\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=15"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}