Virtual touch screen (3D Ultrasonic Radar).

10 Aug

First things first: there are no servo motors. In fact, there are no motors or mechanical moving parts in this project at all.

There are 4 HC-SR04 ultrasonic sensor units; only one works as the transmitting (pinging) module and a receiver simultaneously, the other 3 are receivers only. The Arduino DUE is, of course, the first violin of the whole project. A small prototype board carries an additional 4-channel buffer amplifier (MCP6024).

Technical specification (approximate):

  • Scanning range: 3 m, limited by the very low power and sensitivity of the HC-SR04 devices.
  • Spatial XY resolution depends on distance and has yet to be determined. Two objects should be positioned at least 5 cm apart and not on the same spherical surface around the sensor board.
  • Directivity diagram: ±30 degrees, or so.
  • Spatial Z (distance) resolution: 12 um. No, that is not a typo: micrometers.
  • Time to complete a full scan: 16 milliseconds; the frame rate may vary from 60 Hz down to 10 Hz if there is strong reverberation.
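As a quick sanity check, the headline numbers above (the 8.5 mm wavelength mentioned later, the 4 mm TOA limit, the 12 um Z resolution) all follow from the speed of sound; here is a few lines of Python, assuming c ≈ 343 m/s (dry air at about 20 °C):

```python
# Sanity check of the spec numbers: wavelength of 40 kHz ultrasound in air,
# TOA-limited resolution, and phase-limited resolution at 1-degree phase steps.
# Assumes c = 343 m/s (speed of sound in dry air at ~20 C).

C_AIR = 343.0          # m/s
F_TX = 40_000.0        # Hz, HC-SR04 carrier frequency

wavelength = C_AIR / F_TX                  # ~8.6 mm, matches the quoted 8.5 mm
toa_resolution = wavelength / 2            # ~4.3 mm, the classic HC-SR04 limit
phase_resolution = wavelength / 2 / 360    # ~11.9 um with integer (1-degree) phase

print(f"wavelength       : {wavelength * 1000:.2f} mm")
print(f"TOA resolution   : {toa_resolution * 1000:.2f} mm")
print(f"phase resolution : {phase_resolution * 1e6:.1f} um")
```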

I have to say that the ultrasonic units were slightly modified to bring out the analog signal (40 kHz) before it gets quantized by the local uCPU. After amplification, the 4 channels are digitized by the Arduino's ADC (12 bits, 1 MSPS).

The Fast Fourier Transform does not differ much from the library published on this blog. I'm not ready to disclose the complete signal-processing algorithm, and I'm not publishing the code at this time. Nevertheless, I don't mind answering reasonable / meaningful questions.
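The actual algorithm is not disclosed, but the general idea of recovering the phase of a 40 kHz tone from ADC samples can be illustrated with a single-bin DFT (a Goertzel-style correlation). The sample rate, window length and signal below are my own assumptions for the sketch, not the real firmware:

```python
import cmath
import math

# Hypothetical parameters: 1 MSPS ADC (as on the DUE), 250-sample window,
# which is exactly 10 carrier cycles, so the 40 kHz bin is perfectly aligned.
FS = 1_000_000.0   # samples per second
F0 = 40_000.0      # carrier frequency, Hz
N = 250            # window length in samples

def tone_phase(samples):
    """Correlate the window against exp(-j*2*pi*F0*n/FS): one DFT bin.
    Returns the phase of the 40 kHz component in radians, range (-pi, pi]."""
    acc = sum(s * cmath.exp(-2j * math.pi * F0 * n / FS)
              for n, s in enumerate(samples))
    return cmath.phase(acc)

# Synthesize a clean 40 kHz tone with a known phase shift and recover it.
true_phase = 0.7  # radians
wave = [math.cos(2 * math.pi * F0 * n / FS + true_phase) for n in range(N)]
print(round(tone_phase(wave), 3))  # → 0.7
```

With the bin aligned to an integer number of cycles, the correlation is exact; on a real echo the window would slide along the received burst and the phase difference between channels would carry the fine distance information.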

Video: I have to practice more ;-)


A few comments on the video clip. I intentionally used a pen to draw the picture, even though it almost violates the basics of physics, because the reflective area of the pen is practically equal to the wavelength: 8.5 mm for 40 kHz in air. You can see that the Arduino loses track on a few occasions. Distance ~1 m.

The computer is running Linux with a regular “mtPaint 3.40” from the official source. The software is receiving mouse commands and has no idea where these commands come from; it is the same as if you drew the picture manually. To interface with the host, the Arduino emulates a left button and XY mouse-move commands using the built-in Mouse driver, and I copied the ButtonMouseControl example from the IDE.
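The Arduino side is not published, but the mapping from tracked object coordinates to the relative moves that a Mouse.move() call sends can be sketched as follows. Standard HID mouse reports carry signed 8-bit deltas; the scale factor here is a made-up tuning constant, not a value from the real firmware:

```python
def to_mouse_delta(prev_xy, cur_xy, scale=400.0):
    """Convert a change in tracked object position (meters) into a relative
    HID mouse move. HID mouse reports carry signed 8-bit deltas, so each
    component is clamped to [-127, 127]. 'scale' (counts per meter) is a
    hypothetical tuning constant, not from the original firmware."""
    def clamp(v):
        return max(-127, min(127, int(round(v))))
    dx = clamp((cur_xy[0] - prev_xy[0]) * scale)
    dy = clamp((cur_xy[1] - prev_xy[1]) * scale)
    return dx, dy

print(to_mouse_delta((0.10, 0.20), (0.15, 0.18)))  # → (20, -8)
```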

The surface of the touch screen is “virtual”. The first thing the Arduino does, after I send a command over the serial monitor console to start drawing, is to “search-scan” for an object. Whatever it finds first (the closest one) will be “locked”, and the distance to this object is interpreted as “touched / untouched”. This is a first try, and I have not pushed hard on gesture pattern recognition yet. But as you can guess, there is no limit to discriminating “slide”, “rotate”, “scroll”, etc. movements, with only one exception: there is no “multi-touch”. As I mentioned in the specification section, two objects have to be 5 cm apart. This limitation comes from two shortcomings of the current hardware design. First, there is no phased array; only one unit transmits (in the middle on the right side), so there is no way the Arduino could distinguish two objects on the same sphere. Second, the low sampling rate of the ADC: in order to “shrink” the XY spatial resolution down to the wavelength (8.5 mm), the sampling rate would have to be at least 6 MSPS or so.
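The lock-and-touch logic described above could look roughly like this toy model; the threshold values and the hysteresis band are my own assumptions for illustration, not the actual firmware behavior:

```python
class VirtualTouch:
    """Toy model of the lock-and-touch logic: lock onto the first (closest)
    object found, then flip between touched/untouched as it crosses a
    distance threshold. Threshold values are illustrative guesses."""
    def __init__(self, touch_at=1.00, release_at=1.03):
        # Hysteresis: "touch" closer than 1.00 m, "release" beyond 1.03 m,
        # so noise near the virtual surface doesn't chatter the button state.
        self.touch_at, self.release_at = touch_at, release_at
        self.locked = False
        self.touching = False

    def update(self, distance):
        if not self.locked:      # "search-scan": lock whatever shows up first
            self.locked = True
        if self.touching:
            if distance > self.release_at:
                self.touching = False
        elif distance < self.touch_at:
            self.touching = True
        return self.touching

vt = VirtualTouch()
print([vt.update(d) for d in (1.10, 0.99, 1.01, 1.05)])
# → [False, True, True, False]
```

The hysteresis band is the standard way to keep a noisy distance reading from rapidly toggling the emulated mouse button at the virtual surface.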

The tracking update rate (scan frame rate, not to be confused with the video frame rate) is set to 32 fps.




edited: 14 Aug. 2014       “New technology is rising!”

A second video clip is posted; it demonstrates better tracking stability over a bigger distance range.

Distance is 1.2 m, same pen. I think that's all for the Virtual Touch Screen demonstration. Any improvements I could make in the code would introduce only small changes in the overall representativeness of the project.

This HID technology is a completely new area for me, and I'm not a professional programmer. Being curious, I looked into a “regular” touch screen (resistive / capacitive) library freely accessible on the internet. I found over 100 variables that are initialized and updated in order to keep track of the bunch of real-time parameters that a “normal” TS supplies at 10 ms intervals. Hundreds more variables are buried inside the proprietary driver in the OS. It would take years of testing and debugging to understand the effect of each of these variables on stability, smoothness, susceptibility, etc. And moreover, my invention, the Virtual TS-3D, would require a lot more than a few hundred….


edited: 26 Aug. 2014   Answering the question: modification to the HC-SR04 module.

Here are the electrical drawings I was able to locate online:

And a photo:                  circuits_mod

As you can see, the analog 40 kHz output is taken from pin 7 of the LM324. Conveniently, it's the rightmost one, close to the board edge. The transmitter module has a jumper wire over the B-E junction of the transistor; the other 3 units may not have this wire. I found out that a unit doesn't transmit anything until it gets a response, which may never happen for an echo reflected from a tiny object like a pen. It looks like the on-board uCPU is waiting for a transition in polling mode. The additional amplification stage I built with the MCP6024 is similar to the first stage of the LM324 (U2D), with a gain of x50. In my first try, I connected the output of the LM324 directly to the Arduino DUE analog inputs; basically it's not really safe, as the voltage goes close to 3.6-3.7 V. But then, introducing the MCP6024 (rail-to-rail), I decreased the supply voltage of the OPA down to 3.3 V, so as not to worry about my DUE.
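The headroom point at the end is worth spelling out: a rail-to-rail op-amp powered from 3.3 V simply cannot drive the DUE's analog pin above its rail. Assuming a standard non-inverting configuration for the x50 stage (the resistor values below are hypothetical, not read from the actual board):

```python
# Headroom check for the extra gain stage: with the MCP6024 powered from
# 3.3 V (rail-to-rail output), the stage can never drive the DUE's analog
# input above 3.3 V, unlike the LM324 running from a higher supply.
# Resistor values are hypothetical, chosen only to realize the quoted x50.

def noninverting_gain(rf, rg):
    """Ideal non-inverting op-amp gain: 1 + Rf/Rg."""
    return 1 + rf / rg

gain = noninverting_gain(rf=49_000, rg=1_000)  # = 50.0
VDD_MCP = 3.3       # op-amp supply rail, volts
max_out = VDD_MCP   # rail-to-rail output saturates here: safe for a 3.3 V DUE pin
print(gain, max_out)
```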




Posted on August 10, 2014 in Uncategorized



7 responses to “Virtual touch screen (3D Ultrasonic Radar).”

  1. Apprentice

    August 19, 2014 at 3:09 pm


    My mind is open!

    I have a simple question:

    If you know the frequency you are transmitting, 40 kHz, and if you are only interested in detecting the position via more than one receiver (which responds only to a narrow band, a natural filter), and you are not interested in the velocity (detected via the Doppler effect), why do you need an FFT in this project? Why can't this problem be solved by measuring the delays in the arrival time of the signal at the different receivers?

    Best regards,
    João Carvalho

  2. magiciant

    August 19, 2014 at 7:39 pm

    Measuring TOA (time of arrival), you may get precision only down to the wavelength, or half of it. The classic HC-SR04 and a bunch of similar devices available on the market measure TOA, and this is why 0.4 cm error is the best they can get: 0.85 cm / 2.
    Applying an FFT to the analog waveform (extracted in front of the on-board comparator), I'm calculating a PHASE. And the phase is the key. It makes an enormous difference compared to primitive TOA-based devices, because the precision is better by 360 (!!!) times. I mentioned 12 micrometers in the blog; that comes from 0.85 cm / 2 / 360. Using a real-world analogy: with TOA you can tell whether a bus arrived in the daytime or at night, while the phase (FFT) would tell you the exact hours, minutes and seconds. 360 is not the final result; I simply do integer math at this time, still at the prototyping stage,
    but there is a way to relax performance and get 360.0, and consequently a 3600-times improvement in precision. Maybe not so high due to noise limitations, but you get the idea.
    Using the FFT is a CPU-costly process, but it does its tricks like nothing else in this world.

  3. dovoto

    August 26, 2014 at 3:41 pm

    This is pretty slick. How did you modify the transceivers? I'm working on a similar project that only needs bearing (not range).

    • magiciant

      August 26, 2014 at 5:50 pm

      I replied in the topic, at the end, thinking others may have similar questions.

      • dovoto

        August 26, 2014 at 8:09 pm

        Awesome, thanks.

