The Most Advanced Inventions in Robotics

Unimate: The first industrial robot

In the 1960s, UNIMATE became the first digital, programmable robot to replace humans in a factory. In 1961 it was installed at General Motors, where it carried out assembly-line tasks that were dangerous for humans, like picking up hot metal. UNIMATE was basically a big mechanical arm, far less advanced than some of the technology we have today, but even now robots are often built to take over the dangerous, mundane, or difficult tasks that humans perform.

To do that, many of these machines are designed to physically replicate our actions and behavior: a balanced bipedal walk, a large range of motion, and the ability to perceive and interact with the environment. Perhaps unsurprisingly, that is much harder to achieve than some viral robot videos might make you think, and without these five groundbreaking inventions, much of modern robotics would not be possible.

Machine Vision

One of humanity’s most noteworthy traits is our ability to sense our physical environment and react intelligently to that information. Our brains are constantly working to synthesize input from our eyes, nose, skin, and even internal organs. Robots, however, work on a much simpler level. In many cases, the main information we need them to understand and respond to is visual: Is the path to their destination open, or is something in the way? Can they climb over that thing, or move it? To make these decisions, robots use machine vision, which combines cameras with image-processing algorithms to measure and inspect the environment.

The first image-processing programs applied to real-life images and environments appeared in the 1970s, but they were pretty inefficient. Since then, advances in both cameras and image processing have led to breakthroughs in machine vision. Today, robots can use multiple 3D and 2D cameras to sense the world around them, and those images can be processed and analyzed to detect objects. This information is passed to the “brain,” the robot’s centralized computer, which then decides, based on its programming, what action to take. This kind of tech has allowed robotics to move forward by leaps and bounds. Today it’s often used on assembly lines for quality control, but it also has flashier applications: a robot called Pepper, for example, uses its machine vision for social purposes.
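To make the idea concrete, here is a minimal sketch of an obstacle-detection step using the open-source OpenCV library in Python. It is purely illustrative, not Pepper’s actual software; the camera index, brightness threshold, and minimum blob size are all assumptions.

```python
# Minimal machine-vision sketch: flag large objects in the camera's view.
# Thresholds are illustrative; a real robot would fuse several 2D/3D cameras.
import cv2

def detect_obstacles(frame, min_area=5000):
    """Return bounding boxes of large dark blobs that might block the path."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    # Anything darker than the threshold is treated as a potential object.
    _, mask = cv2.threshold(blurred, 100, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > min_area]

cap = cv2.VideoCapture(0)          # camera index 0 is an assumption
ok, frame = cap.read()
if ok:
    for (x, y, w, h) in detect_obstacles(frame):
        print(f"Possible obstacle at x={x}, y={y}, size={w}x{h}")
cap.release()
```

The bounding boxes would then be handed to the robot’s central computer, which decides what to do about whatever is in the way.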


Pepper robot

 

Pepper uses complex algorithms to analyze the facial expressions of people around it, guess their emotions, and modify its conversation tactics accordingly. Because of these unique skills, Pepper is being snapped up for customer service, where it helps customers resolve simple issues that don’t require a human.

 

Closed-Loop Control

With machine vision, we can build robots that accurately see and respond to their environment, but that alone isn’t enough to be practical. For robots to really mimic and help us, we also need them to monitor their world continuously, especially if the robot moves. That way they can detect if something blocks their path and adjust accordingly, and that brings us to our second innovation. Robots that continuously monitor their world are said to have closed-loop control: they’re constantly on the lookout for changes and can update their internal maps and actions if new obstacles arise.

Now, this general idea has been around for a long time: watchmakers were building mechanisms with closed-loop control in the 18th century, and it has kept growing with engineering advances. Nowadays we have computers with huge processing capabilities and efficient algorithms, and they’re great at calculating and incorporating feedback. This is helpful for stopping robots from running into people, but it’s also super important for simpler things, like keeping a robot balanced as it moves.
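Here is a minimal sketch of the classic closed-loop controller, a PID loop, written in Python. The gains and the sensor and actuator functions are hypothetical placeholders, not code from any particular robot.

```python
# Minimal closed-loop (PID) control sketch: read a sensor, compare to the
# target, and correct the actuator every cycle. Gains are illustrative.
import time

def pid_loop(read_tilt, apply_torque, target=0.0,
             kp=8.0, ki=0.5, kd=1.2, dt=0.01, steps=1000):
    integral, prev_error = 0.0, 0.0
    for _ in range(steps):
        error = target - read_tilt()            # how far off balance are we?
        integral += error * dt                  # accumulated past error
        derivative = (error - prev_error) / dt  # how fast the error is changing
        apply_torque(kp * error + ki * integral + kd * derivative)
        prev_error = error
        time.sleep(dt)                          # wait for the next sensor reading
```

The key point is that the loop never stops checking: every cycle the robot senses its state, computes a correction, and applies it, which is what keeps a walking machine upright.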

 

The machine needs constant feedback about its weight distribution so it can adjust its alignment and center of gravity and not fall over. Honestly, a lot of robots fail at this: precise navigation is hard, and small errors in calculation or coding bugs can cause big problems in locomotion. To achieve balance and self-adjustment, the most advanced robots are absolutely decked out in sensors that measure everything from weight to temperature to posture, and that helps them achieve incredibly dexterous, human-like movements. A great example of this is NASA’s Valkyrie robot.


Robot Valkyrie by NASA

 

Engineers began developing Valkyrie in 2013, originally intending it as an emergency-response robot, but it has since been redeveloped to one day set up habitats on Mars before humans arrive. To help us, Valkyrie would have to perform detailed tasks without any assistance, like picking up boxes and walking up stairs. To do this, it is covered in sensors; there are dozens in Valkyrie’s hands alone, which is why its fine motor skills are so advanced for a bot. So although this thing is a long way from the Red Planet, it’s still one of the coolest robots being developed today.

 

Closed-loop systems keep getting more advanced as we develop faster, more powerful computers to help with their calculations. But computers aren’t the only factor here: control theory in general is still being developed, which means there may be fundamental lessons left to learn, and someday robots like Valkyrie might just be the norm. Now, just because your robot has closed-loop control doesn’t mean it can move like a human. It might be able to adjust its position in real time, but that doesn’t mean it has the grace that we do.
Reproducing that kind of fluid motion is actually one of the unique obstacles for many robots, and it’s an important one. We need robots to have precise motions so they can do tasks that require a little finesse, like holding a glass beaker or even shaking someone’s hand; if you have the power to crush a person’s hand, you don’t want to make a mistake there. A lot of robots are getting closer to this fine movement with the help of modern linear actuators.

 

What is an actuator?

An actuator is a mechanical component that converts energy into physical motion, and a linear actuator is one that focuses on creating precise forces in a single direction. Many actuators are hydraulic or pneumatic, using pressurized fluids or gases to create large amounts of force. But over the last few decades there have also been great improvements in electromechanical linear actuators, ones that use electric motors to produce motion. Their size, cost, and energy usage have been significantly reduced compared to other types of actuators, and that’s allowed us to pack more punch into a smaller area.
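As a back-of-the-envelope illustration (not tied to any specific product), here is a sketch of how one common type of electromechanical linear actuator turns motor rotation into small, repeatable linear moves through a lead screw. The screw pitch and steps-per-revolution values are assumptions chosen for the example.

```python
# Sketch: convert a desired linear move into motor steps for a hypothetical
# lead-screw linear actuator. Pitch and steps/rev are illustrative values.
STEPS_PER_REV = 200        # a typical stepper motor: 1.8 degrees per step
LEAD_MM_PER_REV = 2.0      # the nut advances 2 mm per full screw revolution

def steps_for_move(distance_mm):
    """How many motor steps produce the requested linear travel."""
    revolutions = distance_mm / LEAD_MM_PER_REV
    return round(revolutions * STEPS_PER_REV)

# Each step moves the load only 0.01 mm, which is what makes
# fine, repeatable motion possible.
print(steps_for_move(0.5))   # -> 50 steps for a 0.5 mm move
print(steps_for_move(25.0))  # -> 2500 steps for a 25 mm move
```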

Having smaller, more precise actuators allows for more degrees of freedom and finer control of a robotic appendage. By combining these pieces in just the right way, you can get a robot that moves with almost eerie, fluid grace. If you want to see an example for yourself, watch a video of Boston Dynamics’ humanoid robot Atlas: it uses four hydraulic limbs, 28 hydraulic joints, and numerous actuators to perform acrobatic jumps, somersaults, and other robot parkour. It’s pretty cool.


Atlas, a humanoid robot by Boston Dynamics

3D Printed Parts

As you might guess, modern robots have a lot of working parts, and 3D-printed ones are becoming more and more commonplace. Before 3D printing, machine parts were generally made either by pouring material into a mold or by removing material to achieve the desired shape. But in 2009 a key patent for a type of 3D printing expired, and the market began to grow as new companies were finally able to develop and release 3D printers. Since then, this kind of construction has really picked up steam because of its ability to cheaply and quickly produce uniquely constructed parts with a high strength-to-weight ratio. In other words, they’re both light and strong, which is ideal for building robots.

In 3D printing, different strength-to-weight ratios are achieved by using filaments with different properties. Thermoplastics like polycarbonate are often used because their long chains of molecules are durable yet easy to manipulate at high temperatures: the chains are loosely packed, so when heated, the bonds between them quickly weaken and the material liquefies. This pliability allows engineers to precisely design objects and optimize them for load-bearing, so their robots can be as sturdy and strong as possible. You can find 3D-printed parts in all kinds of machines, but to go back to a previous example, Atlas from Boston Dynamics can achieve such unique movement partly thanks to its 3D-printed parts; they allow the bot to be light enough to do all those cool flips.

Deep Learning Algorithms

Finally, let’s wrap up with one of the most exciting ongoing advancements in robotics: the development of deep learning algorithms. These algorithms allow robots not only to sense and react to their immediate environment but also to remember and recall past experiences to inform future decisions. In the past, classical programming used static directions to inform robot decision-making, rules like “if you arrive at a fork in the road, always turn right.” Deep learning algorithms take this a step further: they use accumulated experience to modify their directions. For example, after turning right and walking into a bunch of walls, a robot might revise its rule to “if you arrive at a fork in the road, turn right unless there is an obstacle.” A toy sketch of that difference follows below.
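Here is that contrast as a toy Python sketch, a static rule versus a policy that revises itself after bumping into walls. It is purely illustrative and far simpler than any real learning system.

```python
# Toy contrast between a static rule and one that learns from experience.
# Purely illustrative; real robots use far richer learned policies.

def static_policy(at_fork):
    """Classical programming: the rule never changes."""
    return "turn right" if at_fork else "go straight"

class LearnedPolicy:
    """Keeps a memory of bad outcomes and adjusts its rule accordingly."""
    def __init__(self):
        self.collisions_turning_right = 0

    def decide(self, at_fork, obstacle_on_right):
        if at_fork:
            # Experience modifies the original "always turn right" rule.
            if obstacle_on_right and self.collisions_turning_right > 0:
                return "turn left"
            return "turn right"
        return "go straight"

    def record_outcome(self, action, collided):
        if action == "turn right" and collided:
            self.collisions_turning_right += 1

policy = LearnedPolicy()
action = policy.decide(at_fork=True, obstacle_on_right=True)  # "turn right" at first
policy.record_outcome(action, collided=True)                  # ...and it hits a wall
print(policy.decide(at_fork=True, obstacle_on_right=True))    # now: "turn left"
```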

The biggest advancement in deep learning is the implementation of neural networks. Inspired by neurons in the brain, these networks are made up of computer nodes that store information and send it to other nodes. A neural network gradually adjusts the weight assigned to each signal as it learns to map inputs to the correct outputs; the more weight a signal has, the more it affects decisions in other parts of the robot’s programming. Using neural networks, robots can put more weight on choices that result in a positive outcome, allowing them to remember good decisions and preferentially perform those actions later.
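To show what “adjusting the weights” looks like in code, here is a minimal single-neuron example written with NumPy. The training data, learning rate, and the meaning of the inputs are made up for illustration; real networks have many layers and millions of weights.

```python
# Minimal single-neuron "network": nudge the weights until inputs map
# to the desired outputs. Data and learning rate are illustrative.
import numpy as np

# Inputs: [obstacle ahead?, goal to the right?]  Target: 1 = turn right.
X = np.array([[0, 1], [1, 1], [0, 0], [1, 0]], dtype=float)
y = np.array([1, 0, 0, 0], dtype=float)

rng = np.random.default_rng(0)
weights = rng.normal(size=2)
bias = 0.0
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):                       # simple gradient-descent training loop
    pred = sigmoid(X @ weights + bias)      # forward pass
    grad = pred - y                         # error signal
    weights -= 0.5 * (X.T @ grad) / len(y)  # strengthen or weaken each connection
    bias -= 0.5 * np.mean(grad)

print(np.round(sigmoid(X @ weights + bias), 2))  # outputs approach [1, 0, 0, 0]
```

The weight updates are the “memory”: connections that led to correct outputs get stronger, so the network favors those decisions in the future.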
Many robots being developed today have some form of deep learning algorithm integrated into them, but a really cute example is Paro, a therapeutic robot that looks like an adorable baby seal. Paro is meant as an alternative to animal-assisted therapy: it’s programmed to respond to people the way a dog or cat does, but it doesn’t require the upkeep of an animal. It’s also helpful in situations where someone has had a bad experience with another kind of pet.
Paro was specifically designed to aid elderly patients with dementia, and although more rigorous studies are needed, a pilot study indicates it might improve their quality of life. The seal can be trained to respond to its name and to respond to touch by cuddling, and it can even try to repeat actions that are positively reinforced so it receives more pets. That’s deep learning at work, everyone; if this is how the robot apocalypse goes down, I for one welcome our adorable baby seal overlords.

When you see a new video of a robot doing something ridiculous, it can be easy to give all the credit to the group that built it, but in reality most of the things that make robotics possible are the result of decades or even centuries of research. It really has been a team effort, and there’s no doubt that many more advancements like these are coming in the future.

After all, it only took 50 years to get from UNIMATE to a robot like Valkyrie that could someday work on Mars, so who knows what the next 50 years might bring?
