Abstract
Most AR interaction techniques target close-range objects. We propose Blurry (Sticky) Finger (BF): the user aims at a distant real-world object with an unfocused finger, relying on proprioception and ocular dominance, without switching focal planes. A 3×3×3 controlled experiment (16 participants, 891 selections per participant) showed that BF outperforms cursor-based pointing for small targets, is insensitive to target size, and was preferred by 13 of 16 participants as the most natural method.
Keywords: AR, pointing gesture, proprioception, optical see-through display, vergence-accommodation conflict
1. Motivation
The Multi-Focus Problem in OST Displays
Optical see-through (OST) AR displays force users to shift focus constantly between virtual overlay planes and real-world objects, which induces significant eye strain through the vergence-accommodation conflict (VAC). Pre-studies at INTERACT 2015 and VRST 2015 quantified this strain and established the formulation for correcting the view offset between the user's eye and the head-mounted camera.
Current cursor-based methods compound the problem and are limited to the small coverage area of the AR glasses (e.g., Google Glass overlays only a corner of the full visual field). Blurry Finger eliminates both constraints.
2. The Technique
Proprioceptive Aiming with an Unfocused Finger
The user keeps the pointing finger out of focus while focusing only on the target. Innate proprioception and ocular dominance enable accurate aiming with both eyes open, with no focal-plane switching required. The system tracks the fingertip via OpenCV and applies a geometric eye-to-camera offset correction:
Offset Formulation
dist = x₀ + x₁ where x₀ = [tan(Θ₀) / tan(φ₀/2)] × (f₀/2)
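As a rough illustration, the x₀ term above can be computed from the pointing angle Θ₀, the camera field of view φ₀, and the frame size f₀ along one axis. This is a minimal sketch under the assumption that the formula maps an angular offset onto a pixel offset (the second term x₁, not defined here, is assumed analogous for the other component):

```python
import math

def offset_px(theta0, fov0, frame0):
    """Pixel offset x0 for one axis: tan(theta0) / tan(fov0/2) * (frame0/2).

    theta0 -- angular offset between eye ray and camera axis (radians)
    fov0   -- camera field of view along this axis (radians)
    frame0 -- frame size along this axis (pixels)
    """
    return math.tan(theta0) / math.tan(fov0 / 2) * (frame0 / 2)

# Example: a 10-degree offset seen by a 60-degree-FOV, 640-px-wide camera
x0 = offset_px(math.radians(10), math.radians(60), 640)
```

The ratio of tangents normalizes the offset angle against the half-angle of the camera's view, so multiplying by the half-frame width yields the correction in pixels.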
3. Experiment
Comparative Evaluation: Three Distant Selection Methods
Condition 1
Hand-Cursor
Logitech MX Air 3D air mouse. Indirect cursor control requiring focus on the on-screen cursor.
Condition 2 (Most Preferred)
Blurry Finger (BF)
Proprioceptive pointing, no cursor. User focuses only on target; finger is the natural proxy.
Condition 3
BF-Cursor
Blurry Finger with a supplementary cursor (offset-corrected position) for assurance.
Design: 3×3×3 (method × object size × moving distance), fully counterbalanced, within-subjects repeated measures. 16 paid participants (ages 23–34). Object sizes: 120/60/30 mm. Moving distances: 200/400/600 mm. 891 selections per participant. The hand started from a resting position on the knee at the start of each block.
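Full counterbalancing of the three method conditions means every presentation order appears; with three methods that is 3! = 6 orders. The sketch below illustrates the idea with a hypothetical cyclic assignment of the 16 participants to orders (the paper's actual assignment scheme is not stated here):

```python
from itertools import cycle, islice, permutations

METHODS = ["Hand-Cursor", "BF", "BF-Cursor"]

# Full counterbalancing: enumerate all 3! = 6 method orders.
ORDERS = list(permutations(METHODS))

# Hypothetical scheme: assign 16 participants to the 6 orders cyclically,
# so order effects are spread as evenly as the participant count allows.
assignment = list(islice(cycle(ORDERS), 16))
```

With 16 participants and 6 orders, four orders are used three times and two are used twice, which is as balanced as the counts permit.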
4. Results
Key Findings
Completion Time by Object Size
Hand-Cursor was fastest for large targets (F(2,4292)=303.68, p<0.012). BF and BF-Cursor outperformed it for medium and small targets. BF performance was stable across all object sizes (F(2,4250)=0.5, p=0.607), a key differentiator.
Initial Object Selection & Preference
For the first selection from a resting hand position, Hand-Cursor was consistently the worst across all sizes (F(2,141)=41.097, p<0.001). This startup cost is practically significant because real-world selections are infrequent. In post-experiment debriefing, 13 of 16 participants rated BF the most intuitive and natural method; BF-Cursor was often found distracting because the finger and the cursor competed for attention.
Error Rate Summary
Overall errors were very low (1–2 per 891 trials). BF-Cursor was insensitive to target size (proprioception plus cursor synergy), while BF and Hand-Cursor both degraded slightly for the smallest targets. H2 was not rejected.
5. Application Demo — ISMAR 2016
AR Object Inquiry System
A practical demonstration: the user encircles a real 3D object with Blurry Finger. The system segments the image (with offset correction), queries Google Cloud Vision, and displays the result on a Liteye LE-500 OST display. Even with a nominal fixed depth (no depth sensor), selection accuracy was maintained; target depth had little practical effect.
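The encircling step can be sketched as follows: the tracked fingertip trail (a list of offset-corrected pixel coordinates) is reduced to a padded bounding box, which defines the crop sent to the cloud query. This is a minimal sketch; the actual segmentation and the Google Cloud Vision request are omitted, and the margin and frame dimensions are illustrative assumptions:

```python
def encircle_bbox(trail, margin=10, frame_w=640, frame_h=480):
    """Bounding box (x0, y0, x1, y1) of a fingertip trail, padded by
    `margin` pixels and clamped to the frame, suitable for cropping the
    region of interest before the image-inquiry request."""
    xs = [p[0] for p in trail]
    ys = [p[1] for p in trail]
    return (max(min(xs) - margin, 0),
            max(min(ys) - margin, 0),
            min(max(xs) + margin, frame_w),
            min(max(ys) + margin, frame_h))

# Example trail traced around an object near the frame centre
box = encircle_bbox([(300, 200), (360, 210), (350, 280), (290, 260)])
```

Because a bounding box is depth-independent in image space, this step is consistent with the observation that a nominal fixed depth suffices for selection.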