AR · HMD · Android

Blurry (Sticky) Finger

A novel AR interaction technique: aim at distant real-world objects using an intentionally unfocused finger, leveraging proprioception and ocular dominance

Published at ISMAR 2016 (Demo) & ICAT-EGVE 2016 — Blurry (Sticky) Finger: Proprioceptive Pointing and Selection of Distant Objects for Optical See-through based Augmented Reality · Yu & Kim

Abstract


Most AR interaction techniques target close-range objects. We propose Blurry (Sticky) Finger — the user aims at a distant real-world object with an unfocused finger, relying on proprioception and ocular dominance, without switching focal planes. A 3×3×3 controlled experiment (16 participants, 891 selections) showed BF outperforms cursor-based pointing for small targets, is size-insensitive, and was preferred by 13/16 participants as the most natural method.

Keywords: AR, pointing gesture, proprioception, optical see-through display, vergence-accommodation conflict

1. Motivation

The Multi-Focus Problem in OST Displays

Optical see-through (OST) AR displays force users to constantly shift focus between virtual overlay planes and real-world objects — a significant source of eye strain known as the vergence-accommodation conflict (VAC). Pre-studies at INTERACT 2015 and VRST 2015 quantified this strain and established the formulation for correcting the view offset between the user's eye and the head-mounted camera.

Current cursor-based methods compound the problem and are confined to the small coverage area of the AR glasses' display (e.g., Google Glass overlays only a corner of the full visual field). Blurry Finger removes both constraints.

2. The Technique

Proprioceptive Aiming with an Unfocused Finger

The user keeps their pointing finger out of focus while focusing only on the target. The body's innate proprioceptive sense and ocular dominance allow accurate aiming — with both eyes open, no focal plane switching required. The system tracks the fingertip via OpenCV and applies a geometric eye-to-camera offset correction:

Offset Formulation

dist = x₀ + x₁    where   x₀ = [tan(Θ₀) / tan(φ₀/2)] × (f₀/2)
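A minimal Python sketch of the formulation above, with symbol meanings assumed from context (they are not spelled out here): Θ₀ is the angular eye-to-camera offset, φ₀ the camera's horizontal field of view (40° in this system), f₀ the camera frame width in pixels, and x₁ an additional offset term left unspecified. The function name and defaults are hypothetical.

```python
import math

def lateral_offset_px(theta0_deg, fov0_deg=40.0, frame_w_px=640, x1_px=0.0):
    """Map the angular eye-to-camera offset to a pixel displacement.

    Assumed symbol meanings (not stated explicitly in this summary):
      theta0_deg -- angular offset Θ₀ between the eye's line of sight
                    and the camera's optical axis
      fov0_deg   -- horizontal camera field of view φ₀ (40° here)
      frame_w_px -- camera frame width f₀ in pixels
      x1_px      -- additional offset term x₁, already in pixels
    """
    x0 = (math.tan(math.radians(theta0_deg))
          / math.tan(math.radians(fov0_deg) / 2.0)) * (frame_w_px / 2.0)
    return x0 + x1_px
```

Sanity check of the mapping: when Θ₀ equals half the field of view, the offset reaches the frame edge (f₀/2 pixels), and Θ₀ = 0 gives no displacement.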

Blurry Finger concept diagram showing user view and camera view with offset correction
Figure 2. Blurry (Sticky) Finger — the user focuses on the distant target while the finger remains blurred (user view, left). The lateral eye-to-camera offset must be corrected to identify the target in the camera view (right). (ICAT 2016)
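The fingertip itself is tracked with OpenCV; the exact pipeline is not detailed here. As a sketch of the final localization step only, under my own assumptions (an upward-pointing finger and a binary hand mask already produced by skin segmentation), the fingertip can be taken as the topmost foreground pixel:

```python
import numpy as np

def fingertip_from_mask(mask):
    """Heuristic fingertip localization (assumption, not the paper's method).

    mask : (H, W) binary array where nonzero pixels belong to the hand.
    Returns the (x, y) of the topmost hand pixel -- a common heuristic
    when the finger points upward -- or None if the mask is empty.
    """
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    top = ys.argmin()          # row index of the highest hand pixel
    return int(xs[top]), int(ys[top])
```

The resulting pixel coordinate is what the offset correction above would then shift into the camera frame's coordinates.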

3. Experiment

Comparative Evaluation: Three Distant Selection Methods

Condition 1

Hand-Cursor

Logitech MX Air 3D air mouse. Indirect cursor control requiring focus on the on-screen cursor.

Condition 2 · Most Preferred

Blurry Finger (BF)

Proprioceptive pointing, no cursor. User focuses only on target; finger is the natural proxy.

Condition 3

BF-Cursor

Blurry Finger with a supplementary cursor (offset-corrected position) for assurance.

Three methods of distant object selection: hand-cursor, blurry finger, blurry finger-cursor
Figure 5. The three compared methods: (a) cursor-based Hand-Cursor with air mouse, (b) Blurry Finger with cursor, (c) Blurry Finger without cursor. (ICAT 2016)

Design: 3×3×3 (selection method × object size × moving distance), fully counterbalanced, within-subjects repeated measures. 16 paid participants (ages 23–34). Object sizes: 120/60/30 mm. Moving distances: 200/400/600 mm. 891 selections per participant. The hand started from a resting position on the knee at the start of each block.

4. Results

Key Findings

Completion Time by Object Size

Hand-Cursor was fastest for large targets (F(2,4292)=303.68, p<0.012), but BF and BF-Cursor outperformed it for medium and small targets. BF's completion time was stable across all object sizes (F(2,4250)=0.5, p=0.607), a key differentiator.

Task completion times by object size
Figure 8. Completion time by target size. BF's flat curve vs. Hand-Cursor's sharp degradation for small objects. (ICAT 2016)
Usability survey responses for all three conditions
Figure 11. Usability survey responses. BF rated highest for naturalness and future use; cursor conditions rated higher for confidence. (ICAT 2016)

Initial Object Selection & Preference

For the first selection from a resting hand position, Hand-Cursor was consistently the worst across all sizes (F(2,141)=41.097, p<0.001). This startup cost is practically significant, since real-world selections are infrequent. In the post-experiment debriefing, 13/16 participants rated BF the most intuitive and natural method; BF-Cursor was often found distracting because the finger and the cursor competed for attention.

Error Rate Summary

Overall error rates were very low (1–2 errors per 891 trials). BF-Cursor was insensitive to object size (the synergy of proprioception and the cursor), while BF and Hand-Cursor both degraded slightly for the smallest targets. Hypothesis H2 was therefore not rejected.

5. Application Demo — ISMAR 2016

AR Object Inquiry System

A practical demonstration: the user encircles a real 3D object with Blurry Finger. The system segments the image (with offset correction), queries Google Cloud Vision, and displays the result on a Liteye LE-500 OST display. Even with a nominal fixed depth (no depth sensor), selection accuracy was maintained; the target's actual depth had little practical effect.

AR object inquiry system setup and usage scene
Figure 12. The Blurry Finger AR image search system. User encircles a real object; the result is shown through the OST display. (ICAT 2016)
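A minimal sketch of the cropping step, under my own assumptions (the fingertip trajectory is already tracked and the eye-to-camera offset already computed; `encircled_bbox` and its parameters are hypothetical names, not from the paper). The offset-corrected trajectory's bounding box, plus a margin, selects the region sent to the image-search backend:

```python
import numpy as np

def encircled_bbox(trajectory, offset_px=(0, 0), pad=8, frame_shape=(480, 640)):
    """Bounding box of the region the user 'encircled' with the fingertip.

    trajectory : list of (x, y) fingertip positions over the gesture
    offset_px  : eye-to-camera correction applied before cropping (assumption)
    pad        : margin in pixels around the traced region
    frame_shape: (height, width) of the camera frame, for clamping
    Returns (x0, y0, x1, y1) in camera-frame pixel coordinates.
    """
    pts = np.asarray(trajectory, dtype=float) + np.asarray(offset_px, dtype=float)
    h, w = frame_shape
    x0 = max(int(pts[:, 0].min()) - pad, 0)
    y0 = max(int(pts[:, 1].min()) - pad, 0)
    x1 = min(int(pts[:, 0].max()) + pad, w - 1)
    y1 = min(int(pts[:, 1].max()) + pad, h - 1)
    return x0, y0, x1, y1
```

The resulting crop is what would be submitted to the recognition service; the actual system's segmentation and its C/C++ server-client plumbing are not reproduced here.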
Android · Java / OpenCV
40° × 32° camera FOV
C/C++ server-client architecture
13/16 participants preferred BF