TechStory
How Do Robots See?

by Shruti Jani
September 8, 2018
in Future Tech, Robotics
Reading Time: 5 mins read

Credits: University of Maryland


In 2018, we are surrounded by technological evolution, and from the smartphone to artificial intelligence, we as a species have become extremely reliant on it. A significant, albeit still nascent, development in this tech-driven world is robotics. The world is looking to robots for solutions, there is a craze buzzing in the air, and a sizable robot revolution is said to be underway.

This revolution is speculated to overhaul the technological landscape and the overall economy over the next two decades. And robots will not be confined to a niche part of our environment like factories or labs. They will be omnipresent – from taking orders at your nearest McDonald's to replacing your stylist for your next haircut.

All these robots will look, feel, and function very differently, but one characteristic has to be present in every one of them: the ability to analyze a situation and act on it using built-in software.

There are a lot of things we as humans take for granted without realizing it. When performing a task, we interpret it, analyze the possible outcomes, decide which variables work best together, and then take action.

But before all of this, there is something essential we do that comes effortlessly – we look. We look at the situation, the problem, or our surroundings using light reflected from a source, and then decide how to tackle it. These visual cues give us the ability to tangibly assess a situation.

So, going back to our earlier point: robots are meant to analyze and solve situations. But while doing so, how do they see? More to the point, how do they see enough to understand the situation? Let's take a deeper dive.

First things first – robots today cannot actually see in the conventional sense. The way we perform the act of seeing is not the way they do, and the technological advances that could make them see as we do are far in the future. So what do they actually do when they "see"?

The hardware part is easy: robots are fitted with cameras. Robotics cameras are a huge area of development in their own right, but for now, think of one as a phone camera. These cameras gather light from the surroundings and capture images of the things around the robot. The images can be single snapshots, or a string of frames that together form a moving image – a video.
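To make the idea concrete, here is a minimal sketch of how a camera's output can be modeled in software: a snapshot is just a grid of RGB pixels, and a video is just a sequence of such snapshots. The dimensions, frame rate, and pixel values below are invented purely for illustration.

```python
# A 'camera frame' modeled as a grid of RGB pixels; a 'video' as a list of frames.

def make_snapshot(width, height, fill=(0, 0, 0)):
    """Return a height x width grid of RGB tuples (one camera frame)."""
    return [[fill for _ in range(width)] for _ in range(height)]

# A single snapshot: 4 pixels wide, 3 pixels tall, all black.
frame = make_snapshot(4, 3)

# A 'video' is simply a sequence of frames captured over time.
video = [make_snapshot(4, 3) for _ in range(30)]  # roughly 1 second at 30 fps

print(len(frame), len(frame[0]))  # rows and columns of one frame
print(len(video))                 # number of frames in the clip
```

Everything the robot later does with its vision starts from data shaped like this.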

https://youtu.be/2Ypt4BdQ3Co

But the tricky part comes after the image has been taken: what should the robot do with it?

PICTURE TO PIXEL TO PATTERN

Since the 1970s, robotics engineers have dissected the image into what it actually is in essence – a collection of pixels. Each pixel has a color on the RGB spectrum and joins its neighbors to form something useful. Programmers take the data from each pixel, note its color, contrast, shadow, and highlight, and then code a suitable course of action into the robot.
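The pixel-by-pixel bookkeeping described above can be sketched as follows. The tiny two-row "image" and the averaging rule are stand-ins, invented for illustration, for the kind of per-pixel notes a programmer would take.

```python
# A toy image: each pixel is an (R, G, B) tuple on the 0-255 scale.
image = [
    [(255, 255, 255), (255, 255, 255), (10, 10, 10)],
    [(255, 255, 255), (200, 200, 200), (12, 12, 12)],
]

def brightness(pixel):
    """Collapse the R, G, B channels into one 0-255 brightness value."""
    r, g, b = pixel
    return (r + g + b) // 3

# 'Noting' every pixel for brightness, as an engineer might before
# hand-coding a rule for the robot.
brightness_map = [[brightness(p) for p in row] for row in image]
print(brightness_map)
```

Even for this six-pixel image, that is a lot of raw numbers per decision – which is exactly the problem the next paragraph describes.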

However, there was a problem: pixels provide far too much data for a small, simple task, and a few decades ago, storing that data efficiently was also an issue. To tackle this, engineers started coding algorithms around universal features like lines, corners, and textures. Images that share the same features form patterns, and similar actions can be taken for images following the same pattern. This reduces the data from millions of pixels to a few hundred feature-based rules.
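A toy version of that feature-based idea might look like this: instead of keeping every pixel, the robot keeps only the locations where brightness jumps sharply – a crude "edge" feature. The grayscale values and the threshold are invented for the sketch.

```python
# A small grayscale image: bright region on the left, dark region on the right.
gray = [
    [200, 200, 200, 20, 20],
    [200, 200, 200, 20, 20],
]

def find_vertical_edges(img, threshold=100):
    """Return (row, col) positions where brightness changes sharply left-to-right."""
    edges = []
    for r, row in enumerate(img):
        for c in range(len(row) - 1):
            if abs(row[c] - row[c + 1]) > threshold:
                edges.append((r, c))
    return edges

features = find_vertical_edges(gray)
print(features)  # far fewer numbers than the full pixel grid
```

Ten pixels collapse into two edge coordinates – the same compression, in miniature, that lets feature-based vision scale where raw pixels could not.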

WHAT DOES THE ROBOT NEED TO SEE?

Like a human, a robot doesn't need to see everything in its surroundings – it needs just enough to perform its function. For example, a floor-cleaning robot like the Roomba only needs to identify what is unobstructed open floor and what is not, and move on. A shadow-and-highlight algorithm lets it perform this function seamlessly. It doesn't need to know what a chair or a table is, or any details of the furniture around it.
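One plausible shape for that shadow-and-highlight rule is a simple brightness check: does this patch of the camera image look like the floor the robot was calibrated on? The calibration value and tolerance below are made up for the sketch – a real cleaning robot's logic is more involved.

```python
FLOOR_BRIGHTNESS = 180  # assumed calibration value for this robot's floor
TOLERANCE = 40

def is_open_floor(patch_brightness):
    """True if a camera patch is close enough to the expected floor brightness."""
    return abs(patch_brightness - FLOOR_BRIGHTNESS) <= TOLERANCE

print(is_open_floor(190))  # bright, uniform patch: keep moving
print(is_open_floor(60))   # dark patch (the shadow of a chair leg?): steer away
```

Note that the robot never learns what the chair *is* – a dark patch is simply "not floor", and that is all the function requires.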

However, the same strategies cannot be applied to every kind of function. Driverless cars or dog-walking robots, for instance, need more refined technology and strategies, which come as additions on top of the camera's base technology.

NEW ROBOT VISION LEARNING

Robots with cameras and pixel- or feature-based algorithms are a foundational tier. They are manual and labor-intensive to program – engineers are essentially spoon-feeding the robots. Over time, this approach doesn't scale. This is where our earlier examples, like driverless cars, come in.

These advanced robots use vision systems, developed by scientists over the years, that do not program the robot to interpret specific objects. Instead, scientists are teaching the robots how to see – and not only to see, but to exhibit more evolved behavior and self-learn along the way. A robot is then not just a chassis for a camera, but self-evolving software.

One of the ways they do this is by creating a software-based neural network. To state it simply, the engineers give the robot data and the structure for its functions, but do not write the decision-making algorithm themselves. That is left to the robot's own capacity.
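This division of labor – engineers supply data and structure, the rule emerges from training – can be illustrated with the smallest possible learner: a single artificial neuron. The toy dataset (classify a pixel as dark, 0, or bright, 1, from its brightness) and the learning rate are invented; a real robotic vision network is vastly larger, but the principle is the same.

```python
# (brightness, label) pairs supplied by the engineer.
data = [(0.1, 0), (0.2, 0), (0.8, 1), (0.9, 1)]

weight, bias = 0.0, 0.0  # the robot starts knowing nothing

def predict(x):
    """One artificial neuron: fire (1) if the weighted input clears the bias."""
    return 1 if weight * x + bias > 0 else 0

# Perceptron-style updates: the decision rule is learned, never hand-coded.
for _ in range(20):
    for x, label in data:
        error = label - predict(x)
        weight += 0.5 * error * x
        bias += 0.5 * error

print([predict(x) for x, _ in data])  # -> [0, 0, 1, 1]
```

No line of this code says "0.5 is the boundary between dark and bright" – the weights settle there on their own, which is the whole point the paragraph above makes.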

By performing its functions, the robot learns more and more, and soon becomes more capable than a scientist could have directly programmed it to be. For example, there are robots learning how to cook by watching YouTube videos, which also lets them evolve recipes. The robot knew what cooking was, what the vessels were, and what techniques existed, but how to coordinate all of these aspects into a perfect action is something it learned on its own.

However, these types of robots are still in their early evolutionary stages. At present they are experimental, but they are a soon-to-be reality.

THE FUTURE

There are systems called hive minds in which the neural networks of multiple robots are connected. If a new robot joins, it doesn't need to learn from scratch – it can simply build on the information the robots before it have gathered.
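The hive-mind idea can be sketched in a few lines: learned knowledge lives in shared storage rather than inside any one robot, so a newcomer starts with the swarm's accumulated experience. The shared dictionary here is a deliberately crude stand-in for connected neural networks, and all the names are invented.

```python
shared_knowledge = {}  # stands in for the swarm's connected networks

class Robot:
    def __init__(self, hive):
        self.hive = hive  # every robot reads and writes the same store

    def learn(self, thing, meaning):
        self.hive[thing] = meaning

    def recall(self, thing):
        return self.hive.get(thing, "unknown")

veteran = Robot(shared_knowledge)
veteran.learn("chair", "obstacle")

rookie = Robot(shared_knowledge)  # joins later, has learned nothing itself
print(rookie.recall("chair"))     # -> obstacle: inherits the veteran's experience
```

The rookie never saw a chair, yet it knows to treat one as an obstacle – learning happens once for the whole hive instead of once per robot.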

The future of robotics is bright, and the field has never evolved faster. Soon, most systems will be automated or robotized. Whether this is a boon or a bane for humans is something only the future can show. So far, we're in too deep to turn back, and maybe robotics is indeed a change for the better.

Tags: Robot, Robotics, Tech