How Robots Can Help Us Act and Feel Younger - IEEE Spectrum



Toyota’s Gill Pratt on enhancing independence in old age

By 2050, the global population aged 65 or more will be nearly double what it is today. The number of people over the age of 80 will triple, approaching half a billion. Supporting an aging population is a worldwide concern, but this demographic shift is especially pronounced in Japan, where more than a third of Japanese will be 65 or older by midcentury.

Toyota Research Institute (TRI), which was established by Toyota Motor Corp. in 2015 to explore autonomous cars, robotics, and “human amplification technologies,” has also been focusing a significant portion of its research on ways to help older people maintain their health, happiness, and independence as long as possible. While an important goal in itself, improving self-sufficiency for the elderly also reduces the amount of support they need from society more broadly. And without technological help, sustaining this population in an effective and dignified manner will grow increasingly difficult—first in Japan, but globally soon after.

Gill Pratt, Toyota’s Chief Scientist and the CEO of TRI, believes that robots have a significant role to play in assisting older people by solving physical problems as well as providing mental and emotional support. With a background in robotics research and five years as a program manager at the Defense Advanced Research Projects Agency, during which time he oversaw the DARPA Robotics Challenge in 2015, Pratt understands how difficult it can be to bring robots into the real world in a useful, responsible, and respectful way. In an interview earlier this year in Washington, D.C., with IEEE Spectrum’s Evan Ackerman, he said that the best approach to this problem is a human-centric one: “It’s not about the robot, it’s about people.”

What are the important problems that we can usefully and reliably solve with home robots in the relatively near term?

Gill Pratt: We are looking at the aging society as the No. 1 market driver of interest to us. Over the last few years, we’ve come to the realization that an aging society creates two problems. One is within the home for an older person who needs help, and the other is for the rest of society—for younger people who need to be more productive to support a greater number of older people. The dependency ratio is the fraction of the population that works relative to the fraction that does not. As an example, in Japan, in not too many years, it’s going to get pretty close to 1:1. And we haven’t seen that, ever.

Solving physical problems is the easier part of assisting an aging society. The bigger issue is actually loneliness. This doesn’t sound like a robotics thing, but it could be. Related to loneliness, the key issue is having purpose, and feeling that your life is still worthwhile.

What we want to do is build a time machine. Of course we can’t do that, that’s science fiction, but we want to be able to have a person say, “I wish I could be 10 years younger” and then have a robot effectively help them as much as possible to live that kind of life.

There are many different robotic approaches that could be useful to address the problems you’re describing. Where do you begin?

Pratt: Let me start with an example, and this is one we talk about all of the time because it helps us think: Imagine that we built a robot to help with cooking. Older people often have difficulty with cooking, right?

Well, one robotic idea is to just cook meals for the person. This idea can be tempting, because what could be better than a machine that does all the cooking? Most roboticists are young, and most roboticists have all these interesting, exciting, technical things to focus on. And they think, “Wouldn’t it be great if some machine made my meals for me and brought me food so I could get back to work?”

But for an older person, what they would truly find meaningful is still being able to cook, and still being able to have the sincere feeling of “I can still do this myself.” It’s the time-machine idea—helping them to feel that they can still do what they used to be able to do and still cook for their family and contribute to their well-being. So we’re trying to figure out right now how to build machines that have that effect—that help you to cook but don’t cook for you, because those are two different things.

A robot for your home may not look much like this research platform, but it’s how TRI is learning to make home robots that are useful and safe. Tidying and cleaning are physically repetitive tasks that are ideal for home robots, but still a challenge since every home is different, and every person expects their home to be organized and cleaned differently. Toyota Research Institute

How can we manage this temptation to focus on solving technical problems rather than more impactful ones?

Pratt: What we have learned is that you start with the human being, the user, and you say, “What do they need?” And even though all of us love gadgets and robots and motors and amplifiers and hands and arms and legs and stuff, just put that on the shelf for a moment and say: “Okay. I want to imagine that I’m a grandparent. I’m retired. It’s not quite as easy to get around as when I was younger. And mostly I’m alone.” How do we help that person have a truly better quality of life? And out of that will occasionally come places where robotic technology can help tremendously.

A second point of advice is to try not to look for your keys where the light is. There’s an old adage about a person who drops their keys on the street at night, and so they go look for them under a streetlight, rather than the place they dropped them. We have an unfortunate tendency in the robotics field—and I’ve done it too—to say, “Oh, I know some mathematics that I can use to solve this problem over here.” That’s where the light is. But unfortunately, the problem that actually needs to get solved is over there, in the dark. It’s important to resist the temptation to use robotics as a vehicle for only solving problems that are tractable.

It sounds like social robots could potentially address some of these needs. What do you think is the right role for social robots for elder care?

Pratt: For people who have advanced dementia, things can be really, really tough. There are a variety of robotic-like things or doll-like things that can help a person with dementia feel much more at ease and genuinely improve the quality of their life. They sometimes feel creepy to people who don’t have that disability, but I believe that they’re actually quite good, and that they can serve that role well.

There’s another huge part of the market, if you want to think about it in business terms, where many people’s lives can be tremendously improved even when they’re simply retired. Perhaps their spouse has died, they don’t have much to do, and they're lonely and depressed. Typically, many of them are not technologically adept the way that their kids or their grandkids are. And the truth is their kids and their grandkids are busy. And so what can we really do to help?

Here there’s a very interesting dilemma, which is that we want to build a social-assistive technology, but we don’t want to pretend that the robot is a person. We’ve found that people will anthropomorphize a social machine, which shouldn’t be a surprise, but it’s very important to not cross a line where we are actively trying to promote the idea that this machine is actually real—that it’s a human being, or like a human being.

So there are a whole lot of things that we can do. The field is just beginning, and much of the improvement to people's lives can happen within the next 5 to 10 years. In the social robotics space, we can use robots to help connect lonely people with their kids, their grandkids, and their friends. We think this is a huge, untapped potential.

A robot for your home may not look much like this research platform, but it’s how TRI is learning to make home robots that are useful and safe. Perceiving and grasping transparent objects like drinking glasses is a particularly difficult task. Toyota Research Institute

Where do you draw the line with the amount of connection that you try to make between a human and a machine?

Pratt: We don’t want to trick anybody. We should be very ethically stringent, I think, to not try to fool anyone. People will fool themselves plenty—we don't have to do it for them.

To whatever extent that we can say, “This is your mechanized personal assistant,” that’s okay. It’s a machine, and it’s here to help you in a personalized way. It will learn what you like. It will learn what you don’t like. It will help you by reminding you to exercise, to call your kids, to call your friends, to get in touch with the doctor, all of those things that it's easy for people to miss on their own. With these sorts of socially assistive technologies, that’s the way to think of it. It’s not taking the place of other people. It’s helping you to be more connected with other people, and to live a healthier life because of that.

How much do you think humans should be in the loop with consumer robotic systems? Where might it be most useful?

Pratt: We should be reluctant to do person-behind-the-curtain stuff, although from a business point of view, we absolutely are going to need that. For example, say there's a human in an automated vehicle that comes to a double-parked car, and the automated vehicle doesn’t want to go around by crossing the double yellow line. Of course the vehicle should phone home and say, “I need an exception to cross the double yellow line.” A human being, for all kinds of reasons, should be the one to decide whether it’s okay to do the human part of driving, which is to make an exception and not follow the rules in this particular case.

However, having the human actually drive the car from a distance assumes that the communication link between the two of them is so reliable it’s as if the person is in the driver’s seat. Or, it assumes that the competence of the car to avoid a crash is so good that even if that communications link went down, the car would never crash. And those are both very, very hard things to do. So human beings that are remote, that perform a supervisory function, that’s fine. But I think that we have to be careful not to fool the public by making them think that nobody is in that front seat of the car, when there’s still a human driving—we’ve just moved that person to a place you can’t see.

In the robotics field, many people have spoken about this idea that we’ll have a machine to clean our house operated by a person in some part of the world where it would be good to create jobs. I think pragmatically it’s actually difficult to do this. And I would hope that the kinds of jobs we create are better than sitting at a desk and guiding a cleaning machine in someone’s house halfway around the world. It’s certainly not as physically taxing as having to be there and do the work, but I would hope that the cleaning robot would be good enough to clean the house by itself almost all the time and just occasionally when it’s stuck say, “Oh, I’m stuck, and I’m not sure what to do.” And then the human can help. The reason we want this technology is to improve quality of life, including for the people who are the supervisors of the machine. I don’t want to just shift work from one place to the other.

These bubble grippers are soft to the touch, making them safe for humans to interact with, but they also include the necessary sensing to be able to grasp and identify a wide variety of objects. Toyota Research Institute

Can you give an example of a specific technology that TRI is working on that could benefit the elderly?

Pratt: There are many examples. Let me pick one that is very tangible: the Punyo project.

In order to truly help elderly people live as if they are younger, robots not only need to be safe, they also need to be strong and gentle, able to sense and react to both expected and unexpected contacts and disturbances the way a human would. And of course, if robots are to make a difference in quality of life for many people, they must also be affordable.

Compliant actuation, where the robot senses physical contact and reacts with flexibility, can get us part way there. To get the rest of the way, we have developed instrumented, functional, low-cost compliant surfaces that are soft to the touch. We started with bubble grippers that have high-resolution tactile sensing for hands, and we are now adding compliant surfaces to all other parts of the robot's body to replace rigid metal or plastic. Our hope is to enable robot hardware to have the strength, gentleness, and physical awareness of the most able human assistant, and to be affordable by large numbers of elderly or disabled people.

What do you think the next DARPA challenge for robotics should be?

Pratt: Wow. I don’t know! But I can tell you what ours is [at TRI]. We have a challenge that we give ourselves right now in the grocery store. This doesn't mean we want to build a machine that does grocery shopping, but we think that trying to handle all of the difficult things that go on when you’re in the grocery store—picking things up even though there’s something right next to it, figuring out what the thing is even if the label that’s on it is half torn, putting it in the basket—this is a challenge task that will develop the same kind of capabilities we need for many other things within the home. We were looking for a task that didn’t require us to ask for 1,000 people to let us into their homes, and it turns out that the grocery store is a pretty good one. We have a hard time helping people to understand that it’s not about the store, it’s actually about the capabilities that let you work in the store, and that we believe will translate to a whole bunch of other things. So that’s the sort of stuff that we're doing work on.

As you’ve gone through your career from academia to DARPA and now TRI, how has your perspective on robotics changed?

Pratt: I think I’ve learned that lesson that I was telling you about before—I understand much more now that it’s not about the robot, it’s about people. And ultimately, taking this user-centered design point of view is easy to talk about, but it’s really hard to do.

As technologists, the reason we went into this field is that we love technology. I can sit and design things on a piece of paper and feel great about it, and yet I’m never thinking about who it is actually going to be for, and what am I trying to solve. So that’s a form of looking for your keys where the light is.

The hard thing to do is to search where it’s dark, and where it doesn’t feel so good, and where you actually say, “Let me first of all talk to a lot of people who are going to be the users of this product and understand what their needs are. Let me not fall into the trap of asking them what they want and trying to build that because that’s not the right answer.” So what I’ve learned most of all is the need to put myself in the user’s shoes, and to really think about it from that point of view.

Evan Ackerman is a senior editor at IEEE Spectrum. Since 2007, he has written over 6,000 articles on robotics and technology. He has a degree in Martian geology and is excellent at playing bagpipes.

Greg Munson, cofounder of the tournament, on the tech that’s made a difference in combat

Stephen Cass is the special projects editor at IEEE Spectrum. He currently helms Spectrum's Hands On column, and is also responsible for interactive projects such as the Top Programming Languages app. He has a bachelor's degree in experimental physics from Trinity College Dublin.

Earlier this year, friend-of-IEEE Spectrum and fashiontech designer Anouk Wipprecht gave a peek at what it’s like to be a competitor on “BattleBots,” the 22-year-old robot-combat competition, from the preparation “pit” to the arena. Her team, Ghostraptor, was knocked out of the regular competition after losing its first and second fights, though it regained some glory by winning a round in the bonus Golden Bolt tournament, which recently finished airing on the TBS TV channel.

This week, tickets went on sale for audience seating for the next season of “BattleBots”; filming will commence in October in Las Vegas. We thought it was a good moment to get a different perspective on the show, so Spectrum asked one of the founders of “BattleBots” and its current executive producer, Greg Munson, about how two decades’ worth of technological progress has impacted the competition.

What are the biggest changes you’ve seen, technology-wise, over 20 years or so?

Greg Munson: Probably the biggest is battery technology. “BattleBots” premiered on Comedy Central in, I think it was, 2000. Now we’re 22 years later. In the early days, people were using car batteries. Then NiCad packs became very popular. But with the advent of lithium technology, when the battery packs could be different sizes and shapes, that’s when things just took off in terms of power-to-weight ratio. Now you can have these massively spinning disk weapons, or bar weapons, or drum weapons that can literally obliterate the other robot.

Greg Munson. Gabe Ginsberg/Getty Images

Second is the [improvement in electronic speed control (ESC) circuitry]. We built a robot called Bombmachine back in the day. And besides its giant gel cell batteries, which were probably a third of the [bot’s total] weight, we had this big old Vantec speed controller with a big giant heat sink. The ESC form factors have gotten smaller. They’ve gotten more efficient. They’re able to handle way more amperage through the system, so they don’t blow up. They’ve got more technology built into them, so the team can have a person monitoring things like heat, and they’ll know when to, for instance, shut a weapon down. You see this a lot now on the show where they’re spinning up really fast, going in for a hit. And then they actually back off the weapon. And watchers will think, “Oh, the weapon’s dead.” But no, they’re actually just letting it cool down because the monitor guy has told his driver, “Hey, the weapon’s hot. I’m getting some readings from the ESC. The weapon’s hot. Give me five seconds.” That kind of thing. And that’s a tremendous strategy boon.

So instead of just one-way remote control, teams are getting telemetry back from the robots now as well?

Munson: A lot of that is starting to happen more and more, and teams like Ribbot are using that. I think they’re influencing other teams to go that route as well, which is great. Just having that extra layer of data during the fight is huge.


What other technologies have made a big difference?

Munson: CAD is probably just as big a technology boost from the ’90s to now. In the early “BattleBots” era, a lot of teams were using pencil and paper or little wooden prototypes. Only the most elite, fancy teams back then would use some early version of SolidWorks or Autodesk. We were actually being hit up by the CAD companies to get more builders into designing in CAD. Back in the day, if you’re going to build a robot without CAD, you think very pragmatically and very form-follows-function. So you saw a lot of robots that were boxes with wheels and a weapon on top. That’s something you can easily just draw on a piece of paper and figure out. And now CAD is just a given. High-school students are designing things in CAD. But when you’ve got CAD, you can play around and reshape items, and you can get a robot like HyperShock—it looks like there are no right-angled pieces on HyperShock.

CAD gives the robots more personality and character, which is perfect for a TV show because we want the audience to go, “Hey, that’s HyperShock, my favorite!” Because of the silhouettes, because of the shape, it’s branded, it’s instantly identifiable—as opposed to a silver aluminum box that has no paint.


When Anouk was writing about being a competitor, she pointed out that there’s quite a strict safety regime teams have to follow, especially with regard to batteries, which are stored and charged in a separate area where competitors have to bring their robots before a fight. How did those rules evolve?

Munson: It’s part “necessity is the mother of invention” and part you just know the lithium technology is more volatile. We have a really smart team that helps us do the rules—there are some EEs on there and some mechanical engineers. They know about technology issues even before they hit the awareness of the general public. The warning shots were there from the beginning—lithium technology can burn, and it keeps on burning. We started out with your basic bucket full of sand and special fire extinguishers along the arena side and in the pit where people were fixing the robots. Every row had a bucket of sand and a protocol for disposing of the batteries properly and safely. But it quickly became obvious that if there’s a battery fire in the pit, with the smoke and whatnot, that’s a no-go. So we quickly pivoted away from that [to a separate] battery charging pit.

We’ve seen batteries just go up, and they don’t happen in the main pit; they happen in the battery pit—which is a huge, huge win for us because that’s a place where we know exactly how to deal with that. There’s staff at the ready to put the fires out and deal with them. We also have a battery cool-down area for after a fight. When the batteries have just discharged massive amounts of energy, they’re hot and some of them are puffing. They get a full inspection. You can’t go back to the pit after your match. You have to go to the battery cool-down area—it’s outside, it’s got fans, it’s cool. A dedicated safety inspector is there inspecting the batteries, making sure they’re not on the verge of causing a fire or puffing in any kind of way. If it’s all good, they let them cool down and stay there for 10, 15 minutes, and then they can go back to the battery-charging tent, take the batteries out and recharge them, and then go back to fixing the robot. If the batteries are not good, they are disposed of properly.

The technology has become more flexible, but how do you prevent competitors from just converging on a handful of optimal design solutions, and all start looking alike?

Munson: That’s a constant struggle. Sometimes we win, and sometimes we lose. A lot of it is in the judging rules, the criteria. We’ve gone through so many iterations of the judging rules because builders love to put either a fork, a series of forks, or a wedge on their bot. Makes total sense because you can scoop the guy up and hit them with your weapon or launch them in the air. So okay, if you’re just wedging the whole fight, is that aggressive? Is that control? Is that damage? And so back in the day, we were probably more strict and ruled that if all you do is just wedge, we would actually count it against you. We’ve loosened up there. Now, if all you do is wedge, it only counts against you just a little bit. But you’ll never win the aggression category if all you’re going to do is wedge.

Because a wedge can beat everything. We often saw the finals would be between a big gnarly spinner and a wedge. Wedges are a very effective, simple machine that can clean up in robot combat. So we’re tweaking how we count the effectiveness of wedges and our judging guide if the fight goes to judges. Meanwhile, we don’t want it to go to judges. We want to see a knockout. So we demand that you have to have an active weapon. You can’t just have a wedge. It has to be a robust, active weapon that can actually cause damage. You just can’t put a Home Depot drill on the top of your robot and call it a day. That was just something we knew we needed to have to push the sport forward. What seems to be happening is the vertical spinners are now sort of the dominant class.

We don’t want the robots to be homogenized. That’s one of the reasons why we allow modifications during the actual tournament. Certain fans have gotten mad at us, like, “Why’d you let them add this thing during the middle of the tournament?” Because we want that. We want that spirit of ingenuity and resourcefulness. We want to break any idea of “vertical spinners will always win.” We want to see different kinds of fights because people will get bored otherwise. Even if there’s massive amounts of destruction, which always seems to excite us, if it’s the same kind of destruction over and over again, it starts to be like an explosion in Charlie’s Angels that I’ve seen 100 times, right? A lot of robots are modular now, where they can swap out a vertical spinner for a horizontal undercutter and so on. This will be a constant evolution for our entire history. If you ask me this question 20 years from now, I’m going to still be saying it’s a struggle!

Insights from IEEE-USA’s annual salary survey, in six charts

Tekla S. Perry is a senior editor at IEEE Spectrum. Based in Palo Alto, Calif., she's been covering the people, companies, and technology that make Silicon Valley a special place for more than 40 years. An IEEE member, she holds a bachelor's degree in journalism from Michigan State University.

How much does a tech professional in the United States earn? In 2021, the median income of U.S. engineers and other tech professionals who were IEEE members hit US $160,097, up from $154,443 in 2020. That bump in pay is revealed in the IEEE-USA 2022 Salary & Benefits Survey.

This apparent increase turns into a nearly $3,500 dip, however, when converted to real dollars [see chart, below]. It’s the first significant dip in median tech salary in terms of spending power recorded by IEEE-USA since 2013.
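To make the real-dollars arithmetic concrete, here is a minimal sketch of a standard nominal-to-real conversion. The CPI index values used below are illustrative placeholders, not the survey's actual deflator, so the exact size of the dip depends on the price index and period IEEE-USA used.

```python
# Minimal sketch of converting a nominal salary to base-year dollars.
# The CPI values below are illustrative placeholders, NOT the survey's
# actual deflator; the real-dollar change depends on the index used.

def to_real_dollars(nominal: float, cpi_current: float, cpi_base: float) -> float:
    """Deflate a nominal amount into base-year dollars."""
    return nominal * (cpi_base / cpi_current)

median_2020 = 154_443.0          # reported 2020 median (2020 dollars)
median_2021_nominal = 160_097.0  # reported 2021 median (2021 dollars)

cpi_2020, cpi_2021 = 258.8, 271.0  # placeholder annual CPI index values

median_2021_real = to_real_dollars(median_2021_nominal, cpi_2021, cpi_2020)
print(f"2021 median in 2020 dollars: ${median_2021_real:,.0f}")
print(f"Real change vs. 2020: ${median_2021_real - median_2020:,.0f}")
```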

These numbers—and 65 more pages of detailed 2021 salary and job-satisfaction statistics—give readers of the salary and benefits survey a good sense of the United States’ tech employment landscape. The analysis is based on 3,057 responses from professionals working full time in their areas of technical competence; they reported their income, excluding overtime pay, bonuses, profit sharing, and side hustles. (When those are considered, the 2021 median income for these tech professionals was $167,988, according to the report.)

The IEEE-USA 2022 Salary & Benefits Survey chronicles bad news for women in engineering, as their incomes fell further behind men’s in 2021. The gap in salaries between genders grew $5,900 (not adjusted for inflation) to $33,900. The gap is tricky to measure, given that men responding to the survey had more years of experience, as a group, than the women, and more women entering the engineering workforce could skew the median salary downward. However, the proportion of female engineers in the workforce remained flat (on a plateau at under 10 percent, where it’s been for the past 10 years), the survey report noted.

The salary gap between Caucasian and African American engineers decreased by $11,000, to $13,000, in 2021, while the disparity between Caucasian and Hispanic engineers’ incomes fell by nearly $6,000, to $12,278.

2021 was a good time to be an engineer working with solid-state circuitry; salaries in that technical field continued a steep climb and claimed the No. 1 spot on the salaries-by-specialty list. Last year’s No. 1 on that chart, consumer electronics, saw a decline in average salary. Engineers working with other circuits and devices, machine learning, image and video processing, and engineering in medicine and biology recorded big gains.

Overall job satisfaction for engineers surveyed by IEEE-USA fell in 2021, with the biggest drop-offs related to compensation and advancement opportunities. Satisfaction with the technical challenge of engineering jobs was up significantly, however.

Median salaries for engineers in the Pacific region increased dramatically compared with the rest of the United States, climbing faster than supposedly booming regions like the West South Central area, which includes Texas. These numbers were not adjusted for regional costs of living, however.

Enhance your development efficiency with myBuddy, the most cost-effective dual-arm collaborative robot

This is a sponsored article brought to you by Elephant Robotics.

In July 2022, Elephant Robotics released myBuddy—a dual-arm, 13-axis humanoid collaborative robot powered by Raspberry Pi with multiple functions—at an incredible price. It works with multiple accessories such as suction pumps, grippers, and more. Additionally, users can speed up their own secondary development with the artificial-intelligence and myAGV kits and the detailed tutorials published by Elephant Robotics. As a collaborative robotic arm, myBuddy helps users build a wider range of applications.

Elephant Robotics has been committed to the R&D, manufacturing, and production of collaborative robots such as myCobot, mechArm, myPalletizer, and myAGV. To meet the expectations of users in more than 50 countries and allow everyone to enjoy the world of robotics, Elephant Robotics continues to push for breakthroughs in product R&D and manufacturing capacity.

In 2020, the Elephant Robotics team saw that demand for robotics applications was increasing, so it decided to produce a robot with multiple functions that could meet more requirements. The team ran into many difficulties during development and production. At least three auxiliary control chips were needed to support the additional functions, increasing production difficulty by more than 300 percent compared to myCobot, a 6-axis collaborative robot (cobot). The biggest problem was how to build a multifunction robot at an affordable and reasonable price.

After more than two years of continuous effort, Elephant Robotics upgraded the myCobot series into the new myBuddy cobot, building on its highly integrated product design and self-developed robot control platform. The industrial design of myBuddy follows the myCobot series, combined with rounded corners, for a look that is simple and clean. A robot at an affordable price means the development of dual-arm cobot applications is no longer out of reach.

The features and functions described below show what applications myBuddy can support.

The working radius of a single arm of myBuddy is 280 millimeters, and the maximum payload is 250 grams. It is light and flexible, with 13 degrees of freedom. The built-in axis in the torso of myBuddy improves the working range by more than 400 percent compared to myCobot's single robotic arm, so it can perform more complicated tasks such as flag waving, kinematics practice, and AI recognition.

More than 100 API interfaces are available, and myBuddy’s low-level control interfaces are open: joint angles, coordinates, running speeds, and other values can be controlled freely, so users can dig into dual-arm robot applications, motion-path planning, the development of actions, and visual recognition. On the hardware side, myBuddy provides a variety of input and output interfaces, including HDMI, USB, Grove, 3.3V IO, LEGO, RJ45, and more.

On the software side, myBuddy supports multiple programming environments. myBlockly, a visual tool for graphical programming with multiple built-in robot application cases, makes it simple for users to develop their own projects. Users can also control myBuddy in Python, setting joint angles and robot coordinates and reading back speed and position in real time (with response times as fast as 20 milliseconds). Moreover, myBuddy supports the ROS simulation and development environment. With the built-in ROS environment, users can work on robot motion-path-planning algorithms, dual-arm interference-avoidance algorithms, robot vision learning, and other artificial-intelligence applications.
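As a rough illustration of that Python workflow, here is a minimal sketch based on the pymycobot package that Elephant Robotics publishes for its cobots. The import path, serial-port settings, and the arm-ID convention (1 for the left arm, 2 for the right arm) are assumptions drawn from the vendor's documented conventions rather than from this article, so check the current API reference before running it.

```python
# Minimal sketch: moving one of myBuddy's arms and reading joint angles
# via Python. Import path, port, baud rate, and arm IDs are assumptions;
# verify against the current pymycobot / Elephant Robotics API docs.
import time

from pymycobot.mybuddy import MyBuddy  # assumed import path

# Typical Raspberry Pi serial settings for myBuddy (assumption).
robot = MyBuddy("/dev/ttyAMA0", 115200)

LEFT_ARM, RIGHT_ARM = 1, 2  # assumed arm IDs

# Send the left arm's six joints to a target pose at 50 percent speed.
robot.send_angles(LEFT_ARM, [0, 30, -30, 0, 0, 0], 50)
time.sleep(2)  # give the motion time to finish

# Read back the current joint angles of both arms.
print("left arm angles:", robot.get_angles(LEFT_ARM))
print("right arm angles:", robot.get_angles(RIGHT_ARM))
```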

myBuddy has a 7-inch interactive display, two 2-megapixel HD cameras, and more than 20 built-in dynamic facial expressions. Users can conduct research in human-robot interaction, robot vision, robotics learning, artificial intelligence, action planning, mechatronics, manufacturing, and automation with myBuddy. The built-in cameras support area positioning, object recognition, and QR-code recognition, and myBuddy can perform face and body recognition, motion simulation, and trajectory tracking with them.
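For readers who want a sense of what such a camera pipeline involves, here is a short, generic face-detection sketch using OpenCV's bundled Haar cascade. It is illustrative only and is not Elephant Robotics' implementation; the camera index and display loop are assumptions about a typical desktop setup.

```python
# Illustrative only: generic face detection on camera frames with OpenCV.
# This is not Elephant Robotics' implementation; camera index 0 is assumed.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect faces and draw a box around each one.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```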

As VR technology matures into an area of research and development in its own right, Elephant Robotics decided to build a VR wireless-control function into myBuddy. With it, users can not only experience human-robot interaction and carry out hazardous experiments remotely, they can also explore the principles and basic applications of wireless control in cobots, such as underwater exploration, remotely piloted vehicles, and space exploration. In the future, myBuddy could even support a virtual surgical system.

Elephant Robotics has developed more than 20 robotic-arm accessories, including end-effectors, bases, cameras, a mobile-phone gripper, and more. myBuddy has more flexibility, maneuverability, and load capacity than myCobot’s single robotic arm. Its ability to grasp and move both rigid and flexible objects has improved markedly, and it avoids collisions between its two arms while working. With these accessories, myBuddy can take on more applications in science and education; for example, after installing a gripper and a suction pump, myBuddy can grab test tubes and pour liquids.

A dual-arm robot at an affordable price is a preferred choice for many individual developers, especially teachers and students in robotics and engineering. With its many supported functions, myBuddy will help people explore and develop more possibilities in the world of robotics.

myBuddy 280-Pi | The most compact collaborative dual-arm robot in the world