
AI for Improving Touch Screens
CapContact to Enable Super Resolution

AI has been adopted across many sectors thanks to its accessibility and ease of use. AI gives machines the ability to imitate human intelligence and to learn from the data fed to them and from experience. Artificial intelligence has proved beneficial in many industries, making work easier and reducing the load on human workers.

Video Credits: ACM SIGCHI, YouTube

ETH computer scientists Christian Holz and Paul Streli have now developed an AI solution that allows touchscreens to sense touch at eight times the resolution of current devices.

When typing quickly on a smartphone, tablet, or any other touchscreen, words often get misspelled or the wrong keys are hit. The on-screen keyboard is quite small, and the touch sensors responsible for detecting input have changed little since they were introduced in the mid-2000s.

Christian Holz, an ETH computer science professor at the Sensing, Interaction & Perception Lab (SIPLAB), said, “And here we are, wondering why we make so many typing errors on the small keyboard? We think that we should be able to select objects with pixel accuracy through touch, but that’s certainly not the case.”

Image Credits: CHI/SIPLAB

The AI solution developed by Holz and Streli is called CapContact. CapContact gives touchscreens super-resolution, allowing them to detect exactly where the fingers have touched the display. It has also proved far more accurate than currently available devices.

The CapContact system combines two approaches: it uses the touchscreen as an image sensor, and it detects the contact areas between the fingers and the surface. It performs this with the help of a new deep-learning algorithm developed by Holz and Streli.

Explaining the process in more detail, Christian Holz said, “First, CapContact estimates the actual contact areas between fingers and touchscreens upon touch.”

He added, “Second, it generates these contact areas at eight times the resolution of current touch sensors, enabling our touch devices to detect touch much more precisely.”
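The two steps can be illustrated with a minimal NumPy sketch. The real CapContact system uses a trained deep network for the super-resolution step; here, simple nearest-neighbour upsampling and a fixed threshold stand in for the learned model, and all array sizes and values are invented for illustration.

```python
import numpy as np

def upsample_8x(frame: np.ndarray) -> np.ndarray:
    """Placeholder for the learned super-resolution step: nearest-neighbour
    upsampling by a factor of 8. (CapContact uses a trained deep network.)"""
    return np.kron(frame, np.ones((8, 8)))

def contact_mask(frame: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Estimate the finger-screen contact area by thresholding
    the super-resolved capacitance image."""
    return upsample_8x(frame) > threshold

# A toy 4x4 low-resolution capacitance frame with one strong touch reading.
lowres = np.zeros((4, 4))
lowres[1, 1] = 0.9

mask = contact_mask(lowres)
print(mask.shape)  # (32, 32): eight times the sensor resolution per axis
print(mask.sum())  # 64 super-resolved pixels marked as contact
```

The key idea is that the raw capacitance frame is treated as a small image, and the output lives on a grid eight times finer in each dimension.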

The researchers presented their new AI solution at ACM CHI 2021, a premier conference on Human Factors in Computing Systems.

Summarizing the paper, Paul Streli said, “In our paper, we show that from the contact area between your finger and a smartphone’s screen as estimated by CapContact, we can derive touch-input locations with higher accuracy than current devices do.”
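One simple way to derive an input location from an estimated contact area is to take the centroid of the contact pixels. The sketch below assumes this centroid approach for illustration; the exact method in the paper may differ.

```python
import numpy as np

def touch_location(mask: np.ndarray) -> tuple:
    """Return the centroid (row, col) of a boolean contact mask."""
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# A 4x4 patch of contact pixels on a 32x32 super-resolved grid.
mask = np.zeros((32, 32), dtype=bool)
mask[8:12, 8:12] = True

print(touch_location(mask))  # (9.5, 9.5)
```

Because the contact mask lives on the super-resolved grid, the centroid is located with sub-sensor-pixel precision relative to the original sensor.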

Thanks to its new deep-learning algorithm, CapContact can reduce the number of errors that occur on current devices with low-resolution input sensing. A third of the errors on current devices stem from this low-resolution sensing.

Compared to versions currently on the market, CapContact proved more accurate at detecting input locations and touches thanks to its higher resolution. This could be fundamental to introducing new touch-sensing screens for smartphones and other gadgets.

To allow others to improve or build on their findings, the researchers have released their deep-learning model, code, and dataset on their project page.
