
Is it possible to make robots feel as we do? A hard question, and for now the answer is no, though it is possible to teach robots to analyze our behavior and understand us. Digitalization can help with that, and not only with robots: it has brought plenty of new devices and possibilities, as well as new problems. We can travel the world and visit temples or casinos without leaving home.
Recently, Google engineers asked a robot to make a burger. The robot knew all the steps but still got it wrong: it placed all the ingredients into a bottle. It would make a terrible cook. Still, the engineers noted that the robot responds to commands given by voice rather than by code.
Earlier, they had asked the robot to bring some snacks. The robot went to the kitchen, opened a drawer, grabbed a pack of chips, and brought it back. According to Google executives and researchers, this is the first time language models have been integrated into robots.
Robots Among Us
Robots have already conquered our world: millions of them work in factories, but they follow specific instructions and usually focus on just one or two tasks, such as moving a product along a conveyor belt or welding two pieces of metal together. Creating a robot that can perform a range of everyday tasks and learn on the spot is much more difficult.
Language models work in the following way: they take a huge amount of text uploaded to the Internet and use it to train an AI to guess which words are likely to follow a given question or comment.
Models have become so good at predicting the right answer that talking to them often feels like talking to a well-read person. Google and other companies, including OpenAI and Microsoft, are investing heavily in improving these models and training them on ever larger sets of text in multiple languages.
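As a rough illustration of that "guess what comes next" idea, here is a minimal sketch of next-word prediction using simple bigram counts over a tiny invented corpus (the corpus and function names are made up for the example); real language models use large neural networks trained on vastly more text, but the underlying goal of predicting the next word is the same.

```python
from collections import Counter, defaultdict

# Tiny made-up "corpus" standing in for text scraped from the internet.
corpus = (
    "the robot brings the chips . "
    "the robot makes the burger . "
    "the engineer asks the robot ."
).split()

# Count how often each word follows each other word (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # 'robot' -- the most common follower of 'the' here
```

The more text such a system sees, the better its guesses become, which is why the companies keep feeding their models ever larger collections of writing.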
One of the problems troubling engineers is working with the huge amount of text from the internet: nobody controls what gets published there, yet the AI can draw on it without restriction.
Complicated Tasks
If robots could handle complex tasks, distribution centers could shrink, employing fewer people and more robots. That could mean job cuts. Usually, when automation eliminates jobs in one area, new jobs appear in others, but that might not happen this time.
Besides, that moment is probably still a long way off. Techniques such as neural networks and reinforcement learning have been used to train robots for many years, which has led to some breakthroughs, but progress is still slow. Google's robots are not yet ready for the real world, and in interviews researchers and Google executives have repeatedly stated that they are simply running a research laboratory and do not plan to make money from the technology yet.
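To give a sense of what "reinforcement learning" means here, below is a minimal, purely illustrative sketch of tabular Q-learning on an invented one-dimensional "hallway" task, a toy stand-in for a robot learning to reach a snack; every name and number in it is made up for the example, and real robot-learning systems are far more complex.

```python
import random

# Toy "hallway": positions 0..4, with a snack at position 4.
# Actions: 0 = step left, 1 = step right; reaching the snack gives reward 1.
N_STATES, N_ACTIONS, GOAL = 5, 2, 4
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1   # learning rate, discount, exploration

q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]   # table of action values

def step(state, action):
    """Move within the hallway; return (next_state, reward, done)."""
    nxt = max(0, min(N_STATES - 1, state + (1 if action == 1 else -1)))
    return nxt, (1.0 if nxt == GOAL else 0.0), nxt == GOAL

for _ in range(200):                     # training episodes
    state, done = 0, False
    while not done:
        if random.random() < EPSILON:    # occasionally explore at random
            action = random.randrange(N_ACTIONS)
        else:                            # otherwise pick a best-known action
            best = max(q[state])
            action = random.choice([a for a in range(N_ACTIONS) if q[state][a] == best])
        nxt, reward, done = step(state, action)
        # Q-learning update: move the estimate toward reward + discounted future value.
        q[state][action] += ALPHA * (reward + GAMMA * max(q[nxt]) - q[state][action])
        state = nxt

# After training, the learned policy steps right toward the snack from every position.
print([max(range(N_ACTIONS), key=lambda a: q[s][a]) for s in range(GOAL)])  # expected: [1, 1, 1, 1]
```

The agent starts out moving almost at random, and the table of values gradually steers it toward the reward; scaling that basic loop up to messy real-world kitchens is exactly where progress remains slow.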
But it’s clear that Google and other big tech companies are taking a serious interest in robotics.