  {"id":661,"date":"2026-04-22T18:08:08","date_gmt":"2026-04-22T18:08:08","guid":{"rendered":"https:\/\/cmblog.neuroscience.queensu.ca\/?p=661"},"modified":"2026-04-22T18:08:09","modified_gmt":"2026-04-22T18:08:09","slug":"i-hung-out-with-a-g1-humanoid-robot-for-a-day","status":"publish","type":"post","link":"https:\/\/cmblog.neuroscience.queensu.ca\/i-hung-out-with-a-g1-humanoid-robot-for-a-day","title":{"rendered":"I Hung Out With a G1 Humanoid Robot for a Day"},"content":{"rendered":"\n<p>Last semester, I watched a YouTube video of the Ingenuity Labs unboxing their new Unitree G1 humanoid robot. The robot looked slightly larger than a child and as the team gathered around it, they watched it walk, run, and even dance. I immediately had one thought: I need to meet this robot. A few months later, I found myself standing inside Ingenuity Labs, about to do exactly that.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<p class=\"responsive-video-wrap clr\"><iframe title=\"We have a G1 humanoid!\" width=\"1200\" height=\"675\" src=\"https:\/\/www.youtube.com\/embed\/Dr6Gd6RaXFY?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/p>\n<\/div><\/figure>\n\n\n\n<p>Before seeing the robot up close, I sat down with Ramzi Asfour, Associate Director (Administration) at Ingenuity Labs Research Institute, to learn more about their newest addition. For Asfour, bringing a new robot into the lab is always an exciting moment. \u201cIt\u2019s usually a fun experience when we open up boxes and there\u2019s a new robot to get to do stuff,\u201d recalls Asfour.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>What is a G1 Humanoid?<\/strong><\/h2>\n\n\n\n<p>When I finally saw the robot in person, the first thing that struck me was its size. Standing a little over four feet tall, the G1 looks almost like a small human figure: arms, legs, and joints designed to move in ways that mimic our own.<\/p>\n\n\n\n<p>\u201cIt\u2019s a robot that looks like a person\u2026 about the size of an average 10-year-old kid. It\u2019s a humanoid form factor,\u201d says Asfour. \u201cThe promise with it is that it can kind of do tasks that humans normally have done.\u201d<\/p>\n\n\n\n<p>Speaking with Asfour, I learned that the initial focus for the robot will be an agricultural application. But how exactly does a humanoid robot fit into agriculture?<\/p>\n\n\n\n<p>The design of the robot, standing at about 4\u20194\u201d (1.3m) and weighing around 35kg, allows it to perform tasks originally designed for humans. Its mobility allows it to walk, run, and navigate uneven terrain.<\/p>\n\n\n\n<p>For instance, it would be useful for repetitive and tedious tasks often associated with agriculture. As Asfour explains, \u201cif you\u2019re in a greenhouse and want to bag produce, it\u2019s very repetitive. You\u2019re doing the same task over and over again.\u201d<\/p>\n\n\n\n<p>The robot is also equipped with advanced sensing and computing abilities. \u201cIt has computer vision. It has a built-in computer. It has some AI capability. You can talk to it. 
*Ramzi Asfour, Associate Director (Administration) at Ingenuity Labs Research Institute*

## Humans and Robots Working Together

Seeing the robot move makes its humanoid design immediately clear. The G1 doesn’t glide on wheels like many robots; it walks. Watching it shift its weight from one leg to the other, I could easily imagine how robots like this might eventually operate in environments built for humans.

Asfour puts it plainly: “robotics and AI are going to be everywhere.” So a key question for Ingenuity Labs becomes, “how do you successfully roll out robots into a community situation or workplace situation and have it be a positive experience rather than people worrying about safety or job security?”

A first step is to study how humans behave around robots. Asfour says, “[we ask] how do people behave differently in the presence of a robot?”

Researchers also study humans to improve robotic motion. “You study the biomechanics of how people walk… and then analyze that and come up with control schemes to have the robot walk better,” explains Asfour. “You want it to be more stable and look more natural while it’s moving around.”

*Meeting the robot!*

## AI and Robotics

The robot also provides a unique avenue to integrate artificial intelligence (AI) with physical machines.
“People have called it embodied AI or physical AI… where AI and robotics come together,” Asfour explains.

For years, AI has largely existed behind screens, powering algorithms, recommendations, and data analysis. But robots like the G1 are beginning to shift that, bringing AI into physical spaces where it can interact with the world in real time.

“AI exists on computers… but they needed a way to go out into the real world,” says Asfour. “You could see the potential to talk to a robot and ask it to do something in an environment.”

The idea of simply talking to a machine and having it carry out a task still feels slightly futuristic. But standing in the lab, watching the G1 move, it becomes easier to see how that future might not be so far away.

## Researchers, Students, and Collaborators

The robot also creates new opportunities for students.

“All our robots here are industrial-grade… these are robots that industry will be using,” says Asfour. That means students gain experience with the kinds of systems they may encounter in their future careers.

Graduate students use the robots in research projects, while undergraduate engineering students get hands-on experience through capstone design projects. For many, it’s a rare chance to work directly with advanced robotic systems before entering the workforce.

Beyond individual projects, Ingenuity Labs is also encouraging collaboration across disciplines. Asfour notes that “anybody who’s part of the Connected Minds program can make use of it through our collaboration.”

In a space like this, the robot becomes a shared platform for ideas, bringing together researchers, students, and different fields to explore what these technologies can do.

## Conclusion: Robots Everywhere

Standing next to the robot, it’s easy to imagine a future where machines like this are no longer unusual lab equipment but are working alongside people on tasks like harvesting, packaging, and other everyday work.

“At some point, robots are going to have to fill the jobs and do the tasks that humans normally do now,” Asfour explained. But rather than replacing expertise, these robots could expand what people are able to do.

As these systems develop, another key focus is making them easier for people to use. Rather than requiring specialized coding knowledge, researchers are working toward ways of interacting with robots more naturally, for example through language or simple instructions. “You could have someone who knows a lot about metalworking and a little bit of robotics, and the robot becomes very intuitive to program.”

Ultimately, Asfour believes the future will involve a world where intelligent machines and humans work together. “We think robots and AI are going to be everywhere… and we want to put together systems for the benefit of society.”
ected-minds","entry","has-media"],"rttpg_featured_image_url":{"full":["https:\/\/cmblog.neuroscience.queensu.ca\/wp-content\/uploads\/2026\/04\/image.png",1920,1080,false],"landscape":["https:\/\/cmblog.neuroscience.queensu.ca\/wp-content\/uploads\/2026\/04\/image.png",1920,1080,false],"portraits":["https:\/\/cmblog.neuroscience.queensu.ca\/wp-content\/uploads\/2026\/04\/image.png",1920,1080,false],"thumbnail":["https:\/\/cmblog.neuroscience.queensu.ca\/wp-content\/uploads\/2026\/04\/image-150x150.png",150,150,true],"medium":["https:\/\/cmblog.neuroscience.queensu.ca\/wp-content\/uploads\/2026\/04\/image-300x169.png",300,169,true],"large":["https:\/\/cmblog.neuroscience.queensu.ca\/wp-content\/uploads\/2026\/04\/image-1024x576.png",1024,576,true],"1536x1536":["https:\/\/cmblog.neuroscience.queensu.ca\/wp-content\/uploads\/2026\/04\/image-1536x864.png",1536,864,true],"2048x2048":["https:\/\/cmblog.neuroscience.queensu.ca\/wp-content\/uploads\/2026\/04\/image.png",1920,1080,false],"ocean-thumb-m":["https:\/\/cmblog.neuroscience.queensu.ca\/wp-content\/uploads\/2026\/04\/image-600x600.png",600,600,true],"ocean-thumb-ml":["https:\/\/cmblog.neuroscience.queensu.ca\/wp-content\/uploads\/2026\/04\/image-800x450.png",800,450,true],"ocean-thumb-l":["https:\/\/cmblog.neuroscience.queensu.ca\/wp-content\/uploads\/2026\/04\/image-1200x700.png",1200,700,true]},"rttpg_author":{"display_name":"Jaspreet Dodd","author_link":"https:\/\/cmblog.neuroscience.queensu.ca\/author\/jaspreet"},"rttpg_comment":0,"rttpg_category":"<a href=\"https:\/\/cmblog.neuroscience.queensu.ca\/category\/connected-minds\" rel=\"category tag\">Connected Minds<\/a>","rttpg_excerpt":"Last semester, I watched a YouTube video of the Ingenuity Labs unboxing their new Unitree G1 humanoid robot. The robot looked slightly larger than a child and as the team gathered around it, they watched it walk, run, and even dance. I immediately had one thought: I need to meet this robot. A few months&hellip;","_links":{"self":[{"href":"https:\/\/cmblog.neuroscience.queensu.ca\/wp-json\/wp\/v2\/posts\/661","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/cmblog.neuroscience.queensu.ca\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/cmblog.neuroscience.queensu.ca\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/cmblog.neuroscience.queensu.ca\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/cmblog.neuroscience.queensu.ca\/wp-json\/wp\/v2\/comments?post=661"}],"version-history":[{"count":2,"href":"https:\/\/cmblog.neuroscience.queensu.ca\/wp-json\/wp\/v2\/posts\/661\/revisions"}],"predecessor-version":[{"id":668,"href":"https:\/\/cmblog.neuroscience.queensu.ca\/wp-json\/wp\/v2\/posts\/661\/revisions\/668"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/cmblog.neuroscience.queensu.ca\/wp-json\/wp\/v2\/media\/666"}],"wp:attachment":[{"href":"https:\/\/cmblog.neuroscience.queensu.ca\/wp-json\/wp\/v2\/media?parent=661"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/cmblog.neuroscience.queensu.ca\/wp-json\/wp\/v2\/categories?post=661"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/cmblog.neuroscience.queensu.ca\/wp-json\/wp\/v2\/tags?post=661"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}