Team Extra Touch



The Opportunity:

Advanced Input Systems’ existing touch-screen offerings only support single-touch controls.  We are to develop a software navigation engine that will enable AIS to offer multi-touch control surfaces.

Multi-touch Control Surfaces?


Think touchscreens taken to the next level -- touchscreens that can sense touches at more than one point.  Or, think of the iPhone, Microsoft Surface, or Perceptive Pixel's products -- see the teaser video on our Multi-touch Introduction page.
Multi-touch is "the future of input technology"!  (Or at least part of it.)  With natural finger or hand gestures, you can control digital devices in simple yet elegant ways.  

Project Extra Touch


So where do we fit in?  This is a senior design project, not Microsoft or Apple or a well-funded high-tech startup!  Our project isn't quite as complicated as a whole system with gesture interpretation.  We're just going to read multiple touches from a touchscreen and report them to a PC.

Our project goal is to develop software/firmware that will:

  • Locate touches on a touchscreen 
  • Filter out unwanted touches (e.g., the heel of a hand)
  • Map touches to separate control channels based on location
  • Report X,Y positions, per channel, to a host PC (see the sketch below)
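
To make that last goal concrete, here is a rough C sketch of what a per-channel report to the host PC could look like.  The struct fields, the channel count, and the send_to_host() stub are placeholders for illustration, not our final protocol.

#include <stdint.h>
#include <stdio.h>

#define NUM_CHANNELS 4   /* assumed number of control channels */

/* One report per control channel: channel index, touch state, position. */
typedef struct {
    uint8_t  channel;    /* control channel index (0..NUM_CHANNELS-1) */
    uint8_t  active;     /* 1 if a touch is present in this channel   */
    uint16_t x;          /* X position in touchscreen coordinates     */
    uint16_t y;          /* Y position in touchscreen coordinates     */
} channel_report_t;

/* Stand-in for the real serial link to the PC: just print the report. */
static void send_to_host(const channel_report_t *r)
{
    printf("ch %u: active=%u x=%u y=%u\n",
           (unsigned)r->channel, (unsigned)r->active,
           (unsigned)r->x, (unsigned)r->y);
}

/* Send one report per channel after each processed frame. */
static void report_channels(const channel_report_t reports[NUM_CHANNELS])
{
    for (int i = 0; i < NUM_CHANNELS; i++) {
        send_to_host(&reports[i]);
    }
}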

Components

  • NXP LPC2148 ARM7 microcontroller on an evaluation board from IAR Systems
  • A "navigation engine" running on the micro
  • Zytronic Zypos 12.1" capacitive touchscreen and controller (pictured below)
  • Desktop PC to visualize the data (in Matlab or a custom app)

Hardware overview

Navigation Engine

The navigation engine is a piece of firmware running on the microcontroller.  It consists of three main pieces:
  • Data interpolation
  • Pattern recognition
  • Channel assignment
Project block diagram
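
Here is a rough C skeleton of how the three stages could fit together on each frame of sensor data.  The function names, the 16x16 frame type, and the touch limit are placeholders, not the actual firmware interfaces.

#include <stdint.h>

#define GRID_SIZE   16   /* 16 sense wires per axis                */
#define MAX_TOUCHES 4    /* assumed upper bound on tracked touches */

typedef struct { uint16_t x, y; } touch_t;

/* Stage 1: smooth the raw 16x16 readings and sharpen touch peaks. */
void interpolate(uint8_t raw[GRID_SIZE][GRID_SIZE],
                 uint8_t smoothed[GRID_SIZE][GRID_SIZE]);

/* Stage 2: find real touch points and reject palms and ghosts.    */
int recognize(uint8_t smoothed[GRID_SIZE][GRID_SIZE],
              touch_t touches[MAX_TOUCHES]);

/* Stage 3: map each touch to a control channel by screen region.  */
void assign_channels(const touch_t *touches, int count);

/* One pass of the navigation engine: run the three stages on a
 * frame of sensor data received from the touchscreen controller.  */
void navigation_engine_step(uint8_t raw[GRID_SIZE][GRID_SIZE])
{
    uint8_t smoothed[GRID_SIZE][GRID_SIZE];
    touch_t touches[MAX_TOUCHES];

    interpolate(raw, smoothed);
    int count = recognize(smoothed, touches);
    assign_channels(touches, count);
}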


Touchscreen

Touch panel

The screen's controller can send X,Y coordinates of a single touch, or the raw data from its sensor wires.  A simplified version is shown; there are actually 16 wires on each axis:

The touchscreen's sensor grid (simplified)

When the screen is touched, the capacitive wires are activated and we can find the touch location:

The sensor grid with a touch detected (simplified)
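
As a sketch of the single-touch case, the code below picks the strongest activated wire on each axis and reports their intersection as the touch point.  The threshold value and function name are made up for illustration; a real engine would also interpolate between wires for finer resolution.

#include <stdint.h>

#define NUM_WIRES 16          /* sense wires per axis         */
#define TOUCH_THRESHOLD 40    /* assumed activation threshold */

/* Given one reading per X wire and per Y wire, find a single touch:
 * the intersection of the strongest activated wire on each axis.
 * Returns 1 if a touch was found, 0 otherwise.                      */
int locate_single_touch(const uint8_t x_wires[NUM_WIRES],
                        const uint8_t y_wires[NUM_WIRES],
                        int *touch_x, int *touch_y)
{
    int best_x = 0, best_y = 0;

    for (int i = 1; i < NUM_WIRES; i++) {
        if (x_wires[i] > x_wires[best_x]) best_x = i;
        if (y_wires[i] > y_wires[best_y]) best_y = i;
    }

    if (x_wires[best_x] < TOUCH_THRESHOLD || y_wires[best_y] < TOUCH_THRESHOLD)
        return 0;   /* nothing strong enough to count as a touch */

    *touch_x = best_x;   /* wire indices, not screen pixels */
    *touch_y = best_y;
    return 1;
}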

However, multiple touches are trickier.  See below.


Interim Demo

For our design review at the end of first semester, we prepared a demo showcasing the basic touchscreen functionality.  It uses a Rabbit 3000 microcontroller since our team was already familiar with this platform and boards were readily available in a department lab.  We will port to the ARM platform described above during second semester.

Hardware architecture of the demo 

The PC periodically requests a frame of raw sensor data; the micro reads it from the touchscreen controller and assembles a 16x16 array giving the signal magnitude at each wire intersection.  The PC then plots the array as a 3D surface in Matlab.

3D Matlab plot of sensor data
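
For a rough idea of the processing step, the C sketch below shows one way the micro could assemble that 16x16 frame from the per-wire readings.  Taking the smaller of the two wire values as the intersection magnitude is an assumption made for this sketch, not necessarily what the demo firmware does.

#include <stdint.h>

#define NUM_WIRES 16   /* sense wires per axis */

/* Build the 16x16 frame sent to the PC.  Each cell holds an estimate
 * of the signal at one wire intersection; here we use the smaller of
 * the two wire readings as that estimate.                            */
void build_frame(const uint8_t x_wires[NUM_WIRES],
                 const uint8_t y_wires[NUM_WIRES],
                 uint8_t frame[NUM_WIRES][NUM_WIRES])
{
    for (int y = 0; y < NUM_WIRES; y++) {
        for (int x = 0; x < NUM_WIRES; x++) {
            uint8_t xv = x_wires[x];
            uint8_t yv = y_wires[y];
            frame[y][x] = (xv < yv) ? xv : yv;
        }
    }
}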


Challenges So Far

In creating the demo, we realized that the way the touchscreen works causes extra "ghost touches" to appear.  For example, the 3D image shown above was actually created with two touches.


Simulated sensor grid showing two touches
Simulated sensor grid showing two touches and ghost points


If we simply calculate the intersections of the wires that are activated, we will end up with these extra points.  One way to get around this problem might be to sense which touch occurred first.  Also, we know the average value of the sensors, so we could compare it to some baseline to help distinguish between two touches with "ghosts" and three or four real touches.
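
The sketch below shows where the ghosts come from: intersecting every active X wire with every active Y wire yields all candidate points, real and ghost alike.  Two touches on different rows and columns activate two wires on each axis, so this produces four candidates, only two of which are real.  The threshold and names are placeholders; sorting real touches from ghosts is exactly the problem described above.

#include <stdint.h>

#define NUM_WIRES 16          /* sense wires per axis         */
#define TOUCH_THRESHOLD 40    /* assumed activation threshold */

typedef struct { int x, y; } point_t;

/* List every intersection of an active X wire with an active Y wire.
 * With multiple touches, this list contains ghost points as well as
 * the real ones; a later step must decide which are which.           */
int candidate_points(const uint8_t x_wires[NUM_WIRES],
                     const uint8_t y_wires[NUM_WIRES],
                     point_t out[NUM_WIRES * NUM_WIRES])
{
    int count = 0;

    for (int x = 0; x < NUM_WIRES; x++) {
        if (x_wires[x] < TOUCH_THRESHOLD) continue;
        for (int y = 0; y < NUM_WIRES; y++) {
            if (y_wires[y] < TOUCH_THRESHOLD) continue;
            out[count].x = x;
            out[count].y = y;
            count++;
        }
    }
    return count;   /* real touches plus ghosts */
}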