Apple Exploring Foray Into Home Robotics With Two Products in the Works: Report

Apple is said to be exploring an entry into the home robotics sector, shortly after it reportedly cancelled its project to develop a self-driving car. According to a Bloomberg report citing unnamed sources familiar with the company's plans, Apple is developing two robotic devices designed for indoor use. Earlier this year, the company began selling the Vision Pro mixed reality headset, its first new product category since the Apple Watch launched in 2015.

According to the report, the iPhone maker is working on two potential products. The first is a device that sits atop a table and uses robotics to move a display around. The second, also in development, is a mobile robot that can follow a user around their home. Neither product is ready to launch, but the tabletop device is said to be at a more advanced stage than the mobile robot.

Development of hardware for the robotics project was reportedly overseen by Matt Costello and Brian Lynch, Apple executives who work on the company’s home products. The company is said to be looking for its next product to generate additional revenue, following the launch of the Apple Vision Pro earlier this year.

The Bloomberg report also states that Apple was planning a three-pronged approach for new products in the coming years. One of these, an autonomous car said to have been in development for over a decade, has been cancelled, while the company continues to focus on smart home and mixed reality products, such as the first-generation Apple Vision Pro.

The rumoured robotics-enabled products from Apple would likely compete with existing devices such as Amazon's $1,599 (roughly Rs. 1.33 lakh) invitation-only Astro robot, which is available in the US. There is currently no word from Apple, which is known for keeping its products and services under wraps until they are ready to be announced, on plans to develop or sell smart home products with robotic features, and it is possible that these initial devices will never go on sale.






Google Demonstrates AI Robots Fetching Soda, Snacks Using Voice Commands

Google is combining the eyes and arms of physical robots with the knowledge and conversation skills of virtual chatbots to help its employees fetch soda and chips from breakrooms with ease. The mechanical waiters, shown in action to reporters last week, embody an artificial intelligence breakthrough that paves the way for multipurpose robots as easy to control as ones that perform single, structured tasks such as vacuuming or standing guard.

The company’s robots are not ready for sale. They perform only a few dozen simple actions, and the company has not yet embedded them with the “OK, Google” summoning feature familiar to consumers.

While Google says it is pursuing development responsibly, adoption could ultimately stall over concerns such as robots becoming surveillance machines, or being equipped with chat technology that can give offensive responses, as Meta and others have experienced in recent years.

Microsoft and Amazon are pursuing comparable research on robots.

“It’s going to take a while before we can really have a firm grasp on the direct commercial impact,” said Vincent Vanhoucke, senior director for Google’s robotics research.

When asked to help clean a spill, Google’s robot recognises that grabbing a sponge is a doable and more sensible response than apologising for creating the mess.

The robots interpret naturally spoken commands, weigh possible actions against their capabilities and plan smaller steps to achieve the task.

The chain is made possible by infusing the robots with language technology that draws understanding of the world from Wikipedia, social media and other webpages. Similar AI underlies chatbots or virtual assistants, but has not been applied to robots this expansively before, Google said.
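To illustrate that chain, here is a minimal, hypothetical Python sketch of how a spoken command might be matched against a robot's available skills by combining a language-model-style relevance score with the robot's own feasibility estimate. The skill names, scores and weighting are invented for illustration and do not represent Google's actual system.

```python
# Hypothetical sketch: ranking candidate robot skills for a spoken command.
# The relevance scores stand in for a language model's judgement of how well
# each skill answers the command; the feasibility scores stand in for the
# robot's estimate of whether it can perform the skill right now.
# All names and numbers are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Skill:
    name: str
    relevance: float    # how well the skill matches the command (0..1)
    feasibility: float  # how likely the robot can execute it now (0..1)

def choose_skill(skills: list[Skill]) -> Skill:
    """Pick the skill with the highest combined score (relevance * feasibility)."""
    return max(skills, key=lambda s: s.relevance * s.feasibility)

if __name__ == "__main__":
    command = "I spilled my drink, can you help?"
    candidates = [
        Skill("find a sponge", relevance=0.9, feasibility=0.8),
        Skill("apologise for the mess", relevance=0.6, feasibility=1.0),
        Skill("bring a soda", relevance=0.2, feasibility=0.9),
    ]
    best = choose_skill(candidates)
    print(f"Command: {command!r} -> chosen skill: {best.name}")
```

Multiplying the two scores captures the behaviour described above: fetching a sponge is both relevant and doable, so it beats merely apologising for the mess.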

It unveiled the effort in a research paper in April. Incorporating more sophisticated language AI since then boosted the robots’ success on commands to 74 percent from 61 percent, according to a company blog post on Tuesday.

Fellow Alphabet subsidiary Everyday Robots designs the robots, which for now will stay confined to grabbing snacks for employees.

© Thomson Reuters 2022

