https://www.youtube.com/watch?v=wJqD6yEoRjQ
The demo, whether 100% autonomous or even 50% autonomous, gives us a hint at what 2026 is going to look like. As I've said previously, getting this hardware into the hands of as many people and organizations as possible is key to advancing the tech.
2026 is going to be fire. :-)

(notebooklm) Executive Summary
A recent demonstration video from Mind on Tech, a nascent Chinese startup, has generated significant excitement among AI experts and the robotics community. The video showcases a Unitree G1 humanoid robot, powered by Mind on Tech's software "brain," performing a variety of complex household chores with remarkable fluidity and full-body control. The demo's most significant strategic implication is its departure from the industry trend of vertical integration; by developing software for a third-party hardware platform, Mind on Tech introduces a potential "Android vs. Apple" paradigm for the humanoid robotics market. While the robot's capabilities—including dynamic balancing, object manipulation, and apparent semantic understanding—are highly impressive, the demonstration has also drawn scrutiny regarding the true extent of its autonomy, with some experts suggesting certain segments may be pre-programmed or teleoperated. Nevertheless, the progress shown by a company only six months old highlights the potential for rapid advancement through a decoupled hardware-software approach, a model that could fundamentally alter and expand the landscape of humanoid robotics.
--------------------------------------------------------------------------------
1. The Mind on Tech Demonstration
A video released by Mind on Tech has captured widespread attention for showcasing a humanoid robot performing household tasks at a level of speed and agility previously unseen. The demonstration is notable for several key factors:
• The Company: Mind on Tech is a startup founded in May 2023 by two former Tencent researchers, making its demonstrated progress in approximately six months highly significant.
• The Technology Stack: The demo features a crucial separation of "brain" and "body." Mind on Tech developed the advanced software controller, while the physical platform is a Unitree G1 humanoid robot. This represents a significant departure from the vertically integrated approach common among robotics firms.
• The Claims: The video frequently displays text asserting "No speed up no teleoperation," indicating the robot is operating autonomously and in real-time.
2. Analysis of Demonstrated Capabilities
The robot in the video performs a wide range of tasks that demonstrate advanced physical control, environmental interaction, and cognitive processing.
Advanced Mobility and Body Control
The demo is defined by the robot's fluid, confident, and human-like movements, which experts have praised as exceptional "whole body manipulation."
• Dynamic Actions: The robot executes a "violent sort of sweeping motion" to open a curtain, runs to throw a frisbee, and squats under a table at an awkward angle to retrieve an item.
• Balancing and Complex Maneuvering: A particularly impressive sequence involves the robot stepping onto a small bench, maintaining its balance while watering plants, and then stepping off. The step-down maneuver is described as "very human," involving the robot spreading its legs in a "man spread" and pivoting away.
• Human-like Postures: It kneels on a bed to steam a mattress, using one hand for balance, and crawls across the bed's surface.
Task Execution and Object Manipulation
Despite using simple two-finger grippers rather than complex, multi-fingered hands, the robot successfully completes numerous chores. This suggests that a "good enough" hardware stack can be highly effective when paired with advanced software.
• Household Chores: Tasks shown include watering plants, ironing sheets, steam cleaning a mattress, wiping crumbs from a table into a cloth, and sorting toys.
• Handling Deformable Objects: The robot demonstrates proficiency in manipulating soft objects like curtains, cloth, and stuffed animals, which is a traditionally difficult challenge in robotics.
• Co-bot Interaction: The robot operates safely in close proximity to a child, picking up toys the child adds to a pile, suggesting it can function as a collaborative robot, or "cobot."
Cognitive and Interactive Abilities
The demo implies a significant level of intelligence and environmental awareness built into the system.
• Semantic Understanding: The robot appears to understand concepts necessary for its tasks, such as identifying plants, knowing how to use a watering can, distinguishing between hard and soft toys for sorting, and recognizing trash to be placed in a wastebasket.
• Real-time Correction: In one key sequence, a person uses a pole to move a teddy bear as the robot reaches for it. The robot continuously tracks the moving target and adjusts its trajectory to grab the bear, demonstrating perception-in-the-loop control (illustrated in the sketch after this list).
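The bear-grabbing sequence is the part of the video that most clearly looks like perception-in-the-loop control. As a generic illustration of the idea (not Mind on Tech's actual stack, which is not public), the sketch below recomputes the commanded motion every cycle from the latest perceived target position, so the end effector keeps converging even while the target is dragged away:

```python
import numpy as np

# Hypothetical illustration of perception-in-the-loop control: the commanded
# motion is recomputed every cycle from the *latest* perceived target position,
# so the gripper keeps converging even while a person drags the target away.
# Nothing here reflects Mind on Tech's actual controller, which is not public.

def perceive_target(t: float) -> np.ndarray:
    """Stand-in for a vision system: the teddy bear is being pulled away over time."""
    return np.array([0.6 + 0.1 * np.sin(0.5 * t), 0.3 + 0.05 * t])

def control_step(gripper: np.ndarray, target: np.ndarray,
                 gain: float = 2.0, dt: float = 0.02) -> np.ndarray:
    """Proportional servo toward the most recent observation of the target."""
    error = target - gripper
    return gripper + gain * error * dt

gripper = np.array([0.0, 0.0])
for step in range(200):
    t = step * 0.02
    target = perceive_target(t)        # fresh observation every control cycle
    gripper = control_step(gripper, target)

print("final tracking error:", np.linalg.norm(perceive_target(4.0) - gripper))
```

An open-loop replay, by contrast, would execute a trajectory recorded against the bear's original position and miss once the bear moves, which is why this scene carries so much weight in the autonomy debate below.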
3. Expert Commentary and Technical Scrutiny
The video has prompted both praise and critical analysis from experts in the AI and robotics fields.
• Chawi Pon (AI at Meta, CMU PhD): Called the demo the "best whole body manipulation demo ever" and speculated that the training method could be reinforcement learning bootstrapped with initial teleoperation data (a minimal sketch of that recipe follows this list).
• Chris Paxton (Agility Robotics): Highlighted the rapid progress made by Mind on Tech, noting the company was only founded in May 2023 by two researchers from Tencent.
• Scott Walter (affiliation unspecified): Questioned the robot's oscillation ("vertigo") while standing on the bench, jokingly asking if it was a motion-capture artifact; the joke points to a real control-system challenge.
• Brett Adcock (affiliation unspecified): Offered a "dose of reality," suggesting the demo uses an "open-loop replay reinforcement learning controller" (i.e., a pre-programmed sequence), and noted that the "no teleoperation" text is absent from the dynamic bear-grabbing scene, implying that segment may have been human-controlled.
The Oscillation Anomaly
While the robot stands on the bench, it exhibits a noticeable swaying or oscillation. The video's narrator notes this is a common challenge in robotics control systems, even for those based on neural networks. This behavior is compared to oscillation loops seen in traditional PID controllers or the "brake stabbing" behavior observed in early versions of Tesla's Full Self-Driving software, suggesting an issue with integrating information over time.
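The PID comparison is easy to reproduce with a toy system. The sketch below is a generic illustration, not a model of the Unitree G1 or Mind on Tech's controller: a unit mass driven toward a setpoint settles cleanly with well-damped gains, but rings around the target, much like the swaying on the bench, when the proportional gain is cranked up relative to the damping:

```python
# Generic PID illustration of the "oscillation loop" comparison; this is not a
# model of the Unitree G1 or of Mind on Tech's controller.

def simulate_pid(kp, ki, kd, setpoint=1.0, steps=400, dt=0.01):
    """Drive a unit mass toward `setpoint` with a PID force; report final pos and peak."""
    pos, vel, integral, prev_err = 0.0, 0.0, 0.0, setpoint
    peak = 0.0
    for _ in range(steps):
        err = setpoint - pos
        integral += err * dt
        deriv = (err - prev_err) / dt
        force = kp * err + ki * integral + kd * deriv
        prev_err = err
        vel += force * dt                  # unit mass; kd provides the only damping
        pos += vel * dt
        peak = max(peak, pos)
    return pos, peak

print("well damped:", simulate_pid(kp=20.0, ki=1.0, kd=10.0))    # little overshoot
print("oscillatory:", simulate_pid(kp=200.0, ki=1.0, kd=2.0))    # rings around target
```

The narrator's point is that learned controllers are not immune to the same failure mode: whether the gains live in an explicit PID block or implicitly in a neural network's weights, poorly integrated feedback over time shows up as ringing around the setpoint.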
The Autonomy Debate
The most significant point of contention is the claim of full autonomy.
• Adcock's Skepticism: Brett Adcock argues that much of the demo appears to be a pre-set routine where the robot is "moving blind" without real-time perception. He specifically points out that during the most complex interactive task—grabbing the moving teddy bear—the "no teleoperation" text is missing from the screen, suggesting it was controlled by a human.
• The Counter-Argument: The narrator pushes back by pointing to the toy-sorting scene. In that segment a child actively adds new toys to the pile, forcing the robot to react to a changing environment, and the scene does carry the "no teleoperation" text. A pre-programmed, open-loop sequence cannot account for that behavior, which strongly suggests genuine autonomy (the open-loop versus closed-loop distinction is sketched below). The difficulty of precisely placing soft objects within a pre-programmed routine is also cited as evidence for genuine autonomy.
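The technical crux of the dispute is the difference between replaying a recorded action sequence and computing each action from the current observation. The sketch below uses hypothetical stand-ins (a stub Robot class and a toy policy, nothing from Mind on Tech's system) to show why only the closed-loop pattern can cope with a child adding a toy mid-task:

```python
from dataclasses import dataclass, field
from typing import Callable, List

# Illustrative contrast between open-loop replay and closed-loop control.
# The Robot stub and the policy are hypothetical; no claim about Mind on Tech's system.

@dataclass
class Robot:
    toy_positions: List[float] = field(default_factory=lambda: [0.2, 0.5])
    log: List[float] = field(default_factory=list)

    def observe(self) -> List[float]:
        return list(self.toy_positions)        # what the cameras see right now

    def execute(self, target: float) -> None:
        self.log.append(target)                # pretend to reach toward `target`

def run_open_loop(robot: Robot, recorded_targets: List[float]) -> None:
    """Open-loop replay: execute a stored trajectory and never look at the world."""
    for target in recorded_targets:
        robot.execute(target)                  # a toy added mid-run is simply missed

def run_closed_loop(robot: Robot, policy: Callable[[List[float]], float],
                    steps: int = 3) -> None:
    """Closed-loop autonomy: every action is recomputed from the latest observation."""
    for step in range(steps):
        if step == 1:
            robot.toy_positions.append(0.9)    # the child drops in a new toy
        obs = robot.observe()
        robot.execute(policy(obs))             # reacts to the changed pile

robot = Robot()
run_closed_loop(robot, policy=lambda obs: max(obs))   # toy policy: reach for farthest toy
print(robot.log)                                      # [0.5, 0.9, 0.9] -- it adapted
```

The open-loop version keeps reaching for where the toys used to be; the closed-loop version re-observes the pile every step and adapts, which is the behavior the toy-sorting scene appears to show.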
4. Strategic Implications: The Hardware vs. Software Paradigm Shift
The most profound insight from the Mind on Tech demo is its business and development model, which could reshape the entire humanoid robotics industry.
The "Apple vs. Android" Analogy
The narrator posits that Mind on Tech's approach—building a "brain" for another company's "body"—could split the market, much like the smartphone industry.
1. The Proprietary "Apple" Model: Most current robotics companies (e.g., Tesla, Figure) are vertically integrated, building both the hardware and software as a single, proprietary "turnkey solution."
2. The Open Platform "Android" Model: Mind on Tech's strategy mirrors the Android ecosystem, where a base hardware platform (e.g., Unitree's robot) can be licensed by numerous independent software companies.
Potential Market Impact
This division could have transformative effects:
• Accelerated Innovation: By focusing solely on software, companies like Mind on Tech can innovate at "software speeds" without being slowed by the immense challenges of hardware development.
• Market Expansion: An open platform model could "expand the pie," allowing hundreds or thousands of software companies to enter the market and create niche applications for a few standardized hardware platforms. This contrasts with a market limited to a handful of vertically integrated players.
• Future Trajectory: The narrator predicts that even if the industry is too nascent for this model now, a separation of hardware and software is inevitable in the near future. This would lead to hardware companies providing a base robot and operating system, upon which a vast ecosystem of third-party software applications could be built. Companies that remain strictly proprietary risk being "sidelined" in this potential future.
5. Overall Takeaway
The Mind on Tech demonstration represents "amazing progress" from a company that has been in existence for only six months. Its core strategic advantage is the decision to decouple software development from hardware, allowing it to race ahead on a predefined platform. This approach not only yielded a highly impressive technical demo but also introduced a paradigm that could fundamentally change the competitive landscape of humanoid robotics. If the open platform model takes hold, it promises to democratize development, accelerate innovation, and significantly broaden the market for autonomous systems.