GB2345538A - Optical tracker - Google Patents
- Publication number
- GB2345538A (application GB9824339A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- optical tracker
- ordinates
- computer
- control
- optically tracked
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
A tracker having a purely optical interface comprises a monocular video camera to observe the actions of a user, a video capture card to convert the video image into numbers, and a computer with dedicated software for video analysis and subsequent 3D and 2D tracking of incoming images of the body or a body part, e.g. for controlling a computer cursor or a virtual reality image and sound. The computer decodes 3D body information via intelligent computer algorithms and optical image recognition, which determine the position and size of recognised objects. There is no need for a body-worn transmitter or reflectors, VR dataglove or headset, mouse or lightpen. A mouse pointer may be moved around the screen merely by pointing at the desired position in free space. The viewer's head may also be optically tracked. The viewer can view a computer object from different angles and move it in 2 or 3 dimensions merely by pointing at it.
Description
Optical Tracker
Video cameras are being connected to computers in increasing numbers, and new uses for these camera-computer interfaces are being found.
An optical tracker is a method of interacting with a computer by allowing the computer to observe the actions of the user. This interface is purely optical in nature and relies on a video camera, which observes the user's actions; a video capture card, which converts the video image into numbers for the computer to analyse; and a computer to perform the video analysis and subsequent body or body-part position tracking.
The primary functions of such a system are as a virtual mouse pointer (V-mouse) and as virtual reality (VR) image and sound control (3D object manipulation and stereo and 3D sound positioning).
All this is done with no body-worn transmitter or detectable device such as reflective pads or objects. There is no need to wear a VR dataglove or VR headset, and no need to touch any pointing device such as a mouse, lightpen or the like.
The computer is programmed to decode 3D body information via intelligent computer algorithms and optical image recognition.
The Optical Tracker described here allows the mouse pointer to be moved around the computer screen merely by pointing at the desired position in free space.
The user can also view 3D computer objects from different angles: the viewer's head is optically tracked by the video camera. The user can also move any computer object in 2 or 3 dimensions merely by pointing at it. This is all achieved without the user having to wear or touch any special equipment. The system is completely passive and relies on the computer intelligently analysing the viewer's limb and body positions.
GENERAL DESCRIPTION
The Optical Tracker is a video image analysis system that allows a single video camera to locate a known object in 3D space. Co-ordinates received in this manner can be used to control a mouse pointer, a VR environment, and 3D and 2D computer-controlled/generated objects.
The equipment needed is: a video camera, a video capture card, a computer and operating software. The video camera sits on the computer monitor and watches the user(s). The video information is received by the on-board video capture card, then decoded and analysed by the Optical Tracker software. The computer can be programmed to continually search for particular objects, such as a human head or hand. Once it has detected, say, a hand, it can then track the hand as it is moved around the camera's field of view.
The software has four main levels: 1] Video capture card interface, 2] Background detection, 3] Object edge detection, 4] Known object comparison, movement and 3D position detection.
It contains a 3D model of the objects it is trying to locate. It emulates the way we match our memorised visual models onto incoming images: it attempts to determine an incoming object's position by overlaying its internal model of the object type it is trying to detect on top of the incoming object.
It tries to move its internal 3D object overlay until the best match with the incoming object is found, thereby determining the incoming object's 3D position.
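The overlay-and-search idea described above can be sketched in miniature: slide a stored template over the incoming object's pixel set and keep the offset with the greatest overlap. This is a hypothetical 2D simplification; the function names and the brute-force grid search are illustrative assumptions, not the patent's implementation.

```python
# Slide a template shape over an incoming binary shape (both sets of
# (x, y) pixel co-ordinates) and score each offset by pixel overlap.
def best_match_offset(incoming, template):
    """Return the (dx, dy) offset of `template` that best overlaps
    `incoming`, together with the overlap score (shared pixel count)."""
    best_score, best_offset = -1, (0, 0)
    for dx in range(-8, 9):
        for dy in range(-8, 9):
            shifted = {(x + dx, y + dy) for (x, y) in template}
            score = len(shifted & incoming)
            if score > best_score:
                best_score, best_offset = score, (dx, dy)
    return best_offset, best_score

# A 3x3 square template, and the "incoming" object shifted by (2, 1):
template = {(x, y) for x in range(3) for y in range(3)}
incoming = {(x + 2, y + 1) for (x, y) in template}
offset, score = best_match_offset(incoming, template)
print(offset, score)  # → (2, 1) 9
```

A real tracker would search over scale, rotation and articulation as well as translation, as the text goes on to describe.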
Optical Tracker
This invention relates to an optical tracker.
The optical tracker is a 3D object tracker that requires a monocular optical input, e.g. a video camera.
An example of the use of the optical tracker is as a VR body tracker that does not require the user to wear a trackable device, such as a helmet-worn transmitter, or to be exposed to trackable force fields such as electrostatic fields.
The computer is programmed to see and recognise the person's body position, in much the same way that a human being can determine the position of someone they are looking at.
The optical tracker operates on visual information alone.
Because the optical tracker can track 2D and 3D body position, it can also be used to track a pointed finger for use as a computer cursor pointer, i.e. a virtual mouse.
It can generally be used to track the 2D or 3D position of any known object.
Images viewed by a video camera are fed into a digitising video capture card.
The video capture card produces a colour level for each pixel (picture element) in the camera's field of view.
There are 4 levels of software control required to perform this optical tracking: 1] Video capture card interface, 2] Background detection, 3] Object edge detection, 4] Known object comparison and 3D position detection.
Level 1 of the optical tracking software allows access to the hardware registers containing the video pixel colour information, so that a complete digital picture of video events is available to the optical tracking software for further analysis.
Level 2 of the optical tracking software measures relatively static pixel levels and assigns these to the background, i.e. anything in the camera's field of view that does not move for a prolonged period is assumed to be background information.
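A minimal sketch of this background-detection rule: any pixel whose value is identical across a run of recent frames is classified as background. The frame representation and the stability threshold are illustrative assumptions, not the patent's implementation.

```python
# Classify as background every pixel that has not changed over the
# last `stable_for` frames (the "prolonged period" in the text).
def detect_background(frames, stable_for=3):
    """Return a mask (nested lists of bool): True where the pixel value
    was identical in the last `stable_for` frames."""
    h, w = len(frames[0]), len(frames[0][0])
    recent = frames[-stable_for:]
    return [[all(f[y][x] == recent[0][y][x] for f in recent)
             for x in range(w)]
            for y in range(h)]

# Three 2x3 greyscale frames: the pixel at row 0, column 1 keeps
# changing (a moving object); every other pixel is static.
frames = [
    [[5, 1, 5], [5, 5, 5]],
    [[5, 2, 5], [5, 5, 5]],
    [[5, 3, 5], [5, 5, 5]],
]
mask = detect_background(frames)
print(mask)  # → [[True, False, True], [True, True, True]]
```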
Level 3 of the optical tracking software measures any changes in pixel colour levels, scanning pixels in turn for numerical colour level changes.
If a pixel level change is found between consecutive video frames AND no prior pixel level changes are found, then this constitutes finding an object PRIMARY EDGE. Object SECONDARY EDGES are found by reversing this process.
Objects are scanned left to right for boundary information, starting in the top left corner of a video frame. It is immaterial where the scan starts and in which direction it proceeds as long as the picture area containing the object/s for detection is scanned.
Once object boundaries are determined relative to background, then an object shape is known and stored in a numerical array.
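The primary/secondary edge scan above can be rendered as a short sketch: for each row, take the first non-background pixel as the PRIMARY EDGE and the last as the SECONDARY EDGE, storing the pair per row as the object's shape array. The frame layout and background value are illustrative assumptions.

```python
# Scan each row left to right for the object's boundary columns,
# relative to a known background value.
def scan_edges(frame, background=0):
    """Return, per row, a (primary, secondary) pair of column indices
    bounding the object, or None for rows of pure background."""
    edges = []
    for row in frame:
        cols = [i for i, v in enumerate(row) if v != background]
        edges.append((cols[0], cols[-1]) if cols else None)
    return edges

# A 3x5 frame with a small object (value 7) on a zero background:
frame = [
    [0, 0, 0, 0, 0],
    [0, 7, 7, 7, 0],
    [0, 0, 7, 0, 0],
]
print(scan_edges(frame))  # → [None, (1, 3), (2, 2)]
```

The list of per-row edge pairs is the "numerical array" of the object shape that the next level compares against known shapes.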
This INCOMING OBJECT SHAPE or 2D slice is then compared to a KNOWN OBJECT SHAPE or 2D slice.
A comparison is made of two 2D slices or PROJECTIONS of 3D objects.
One is the incoming video of 3D objects and the other is the internally held 3D object.
Level 4 of the optical tracking software measures the diameters of the incoming edge detected object and adjusts the internal 3D model scale to fit the incoming object.
Edge overlap is then measured for each vertex of the computer model relative to the incoming object.
If object vertices overlap within the required tolerance, then the software decides that it has found an incoming object that matches its internal model.
If the incoming object is then moved, the internal model can be moved to try and re-match the incoming object.
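Level 4's scale-and-compare step might be sketched as follows: rescale a stored model outline so its width matches the incoming object's width, then accept a match only if every model vertex lies within a tolerance of some incoming vertex. The width-only scaling, vertex representation and tolerance value are hypothetical simplifications of the scheme described above.

```python
# Rescale a model outline to the incoming object's width, then test
# per-vertex overlap within a tolerance.
def matches(model, incoming, tol=0.5):
    """Return True if the scaled `model` vertices each overlap some
    `incoming` vertex within `tol` (both are lists of (x, y) points)."""
    def width(pts):
        return max(p[0] for p in pts) - min(p[0] for p in pts)
    scale = width(incoming) / width(model)
    scaled = [(x * scale, y * scale) for (x, y) in model]
    def near(p, q):
        return abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol
    return all(any(near(v, w) for w in incoming) for v in scaled)

model = [(0, 0), (1, 0), (1, 1), (0, 1)]      # unit square outline
incoming = [(0, 0), (2, 0), (2, 2), (0, 2)]   # same shape, twice the size
print(matches(model, incoming))  # → True
```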
In this way the computer can detect changes in incoming object orientation [TRANSLATION, SCALE, ROTATION, ARTICULATION].
By way of example: A fully articulated human hand 3D model is held in computer memory and can be changed into any configuration at high speed to try and match incoming images of the human hand.
Once a match is found between the incoming hand and the virtual hand, the software knows the TRANSLATION, SCALE, ROTATION, and ARTICULATION of the incoming hand.
This means that the software knows where, for example, a hand is pointing in 3D space.
THESE COORDINATES CAN MOVE A MOUSE POINTER CURSOR ON THE SCREEN.
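One plausible way tracked pointing co-ordinates could drive the cursor, as the sentence above suggests: map the tracked fingertip position from the camera's normalised field of view onto screen pixels. The [0, 1] normalisation, the mirroring, and the screen resolution are assumptions for illustration.

```python
# Map a normalised camera co-ordinate (origin top-left, values in
# [0, 1]) to screen pixel co-ordinates. The x-axis is mirrored because
# the camera faces the user, so pointing right moves the cursor right.
def to_screen(norm_x, norm_y, width=1024, height=768):
    """Return the (x, y) pixel position for a tracked pointing target."""
    x = int((1.0 - norm_x) * (width - 1))  # mirror horizontally
    y = int(norm_y * (height - 1))
    return x, y

print(to_screen(0.0, 0.0))   # → (1023, 0)
print(to_screen(0.5, 0.5))   # → (511, 383)
```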
Furthermore, because incoming 3D object information is detectable, full 3D virtual world control is possible, e.g. operator head and body movement can control 3D worlds and objects in the computer.
According to the present invention there is provided an optical tracker comprising a visual sensing device (e.g. a video camera), video capture hardware (e.g. a video capture card), a computer, and optical tracking software for 3D and 2D video image position analysis.
A specific embodiment will now be described, by way of example, with reference to the accompanying drawings, in which:- Figure 1 shows in block diagram form the optical tracker, where a person (item 5) points towards a video camera (item 4).
The output from the video camera (item 4) is fed as an electronic video signal into a video digitiser (item 3), as shown in both Figures 1 and 2.
The output from the digitiser is fed to a computer (item 2), which is programmed to analyse the video camera image, as shown in both Figures 1 and 2.
The position of the person relative to the background is calculated in 3D space, as shown in Figure 1.
The captured image of the person is used to move 3D computer-generated object/s, as shown in Figure 1.
The person may move the computer generated images (item 6) on the computer monitor (item 1) by moving their body as shown in Figure 1.
The computer generated 3D image/s move in sympathy to detected body motion as shown in Figure 1.
Claims (1)
- Optical Tracker Claims
1) An optical tracker comprising a monocular video camera, video capture card, computer and dedicated software for 3D and 2D tracking of incoming images, used to control the computer cursor, i.e. used in lieu of a mouse.
2) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of an incoming object are used to control virtual world configuration and positioning.
3) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of an incoming object are used to control a 3D cursor, i.e. a computer pointer that can be moved around the computer display and in and out of the computer display, i.e. along the z-axis.
4) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of the head are used to control virtual world configuration and positioning.
5) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of the head are used to control the computer cursor.
6) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of the hand are used to control virtual world configuration and positioning.
7) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of the hand are used to control the computer cursor.
8) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of any part of a body are used to control virtual world configuration and positioning.
9) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of any part of a body are used to control the computer cursor.
10) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of a wand, i.e. a pointing tool, are used to control virtual world configuration and positioning.
11) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of a wand, i.e. a pointing tool, are used to control the computer cursor.
12) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of a body are used to position image overlays in virtual-reality-assisted surgery.
13) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of a body are used to control stereo and 3D sound positioning.

Amendments to the claims have been filed as follows:
1) An optical tracker comprising a monocular video camera, video digitiser (internal or external to the camera), computer and dedicated software for 3D and 2D tracking of incoming images, used to control the computer cursor, i.e. used in lieu of a mouse.
2) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of an incoming object are used to control virtual world configuration and positioning.
3) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of an incoming object are used to control a 3D cursor, i.e. a computer pointer that can be moved around the computer display and in and out of the computer display, i.e. along the z-axis.
4) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of the head are used to control virtual world configuration and positioning.
5) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of the head are used to control the computer cursor.
6) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of the hand are used to control virtual world configuration and positioning.
7) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of the hand are used to control the computer cursor.
8) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of any part of a body are used to control virtual world configuration and positioning.
9) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of any part of a body are used to control the computer cursor.
10) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of a wand, i.e. a pointing tool, are used to control virtual world configuration and positioning.
11) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of a wand, i.e. a pointing tool, are used to control the computer cursor.
12) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of a body are used to position image overlays in virtual-reality-assisted surgery.
13) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of a body are used to control stereo and 3D sound positioning.
14) An optical tracker as claimed in claim 1, wherein optically tracked co-ordinates of an incoming object are used to move in 3D projection an articulated 3D computer-generated jointed object, in which the computer-generated joints move in sympathy to the real-world camera-viewed jointed object.
15) An optical tracker as claimed in claim 14, wherein the computer 3D jointed object is used as a puppet (avatar) for the purpose of moving through a 3D game or VR environment.
16) An optical tracker as claimed in claim 1 or claim 14, wherein the computer 3D jointed or non-jointed, articulated or non-articulated object is used as a 3D video target object recognition and (3D or 2D) tracking system.
17) An optical tracker as claimed in claim 1, wherein the capture card analogue-to-digital converter electronics is internal to or integral with the video camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9824339A GB2345538B (en) | 1998-11-06 | 1998-11-06 | Optical tracker |
Publications (3)
Publication Number | Publication Date |
---|---|
GB9824339D0 GB9824339D0 (en) | 1998-12-30 |
GB2345538A true GB2345538A (en) | 2000-07-12 |
GB2345538B GB2345538B (en) | 2003-12-17 |
Family
ID=10841956
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB9824339A Expired - Fee Related GB2345538B (en) | 1998-11-06 | 1998-11-06 | Optical tracker |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2345538B (en) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---|
GB2388418A (en) * | 2002-03-28 | 2003-11-12 | Marcus James Eales | Input or pointing device with a camera |
DE10225077A1 (en) * | 2002-06-05 | 2003-12-24 | Vr Magic Gmbh | Operating theater object tracking system has moveable optical sensors with position measured in fixed reference system |
DE10225077B4 (en) * | 2002-06-05 | 2007-11-15 | Vr Magic Gmbh | Object tracking device for medical operations |
WO2004088994A1 (en) * | 2003-04-02 | 2004-10-14 | Daimlerchrysler Ag | Device for taking into account the viewer's position in the representation of 3d image contents on 2d display devices |
US8253801B2 | 2008-12-17 | 2012-08-28 | Sony Computer Entertainment Inc. | Correcting angle error in a tracking system |
US8761434B2 | 2008-12-17 | 2014-06-24 | Sony Computer Entertainment Inc. | Tracking system calibration by reconciling inertial data with computed acceleration of a tracked object in the three-dimensional coordinate system |
US8970707B2 | 2008-12-17 | 2015-03-03 | Sony Computer Entertainment Inc. | Compensating for blooming of a shape in an image |
US9058063B2 | 2009-05-30 | 2015-06-16 | Sony Computer Entertainment Inc. | Tracking system calibration using object position and orientation |
US9354719B2 | 2011-02-28 | 2016-05-31 | Stmicroelectronics (Research & Development) Limited | Optical navigation devices |
CN105739703A | 2016-02-02 | 2016-07-06 | 北方工业大学 | Virtual reality somatosensory interaction system and method for wireless head-mounted display equipment |
US9535516B2 | 2010-02-23 | 2017-01-03 | Muv Interactive Ltd. | System for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith |
US9880619B2 | 2010-02-23 | 2018-01-30 | Muy Interactive Ltd. | Virtual reality system with a finger-wearable control |
US10528154B2 | 2010-02-23 | 2020-01-07 | Touchjet Israel Ltd | System for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0571702A2 (en) * | 1992-05-26 | 1993-12-01 | Takenaka Corporation | Hand pointing type input unit and wall computer module |
US5297061A (en) * | 1993-05-19 | 1994-03-22 | University Of Maryland | Three dimensional pointing device monitored by computer vision |
US5454043A (en) * | 1993-07-30 | 1995-09-26 | Mitsubishi Electric Research Laboratories, Inc. | Dynamic and static hand gesture recognition through low-level image analysis |
US5563988A (en) * | 1994-08-01 | 1996-10-08 | Massachusetts Institute Of Technology | Method and system for facilitating wireless, full-body, real-time user interaction with a digitally represented visual environment |
EP0823683A1 (en) * | 1995-04-28 | 1998-02-11 | Matsushita Electric Industrial Co., Ltd. | Interface device |
EP0913790A1 (en) * | 1997-10-29 | 1999-05-06 | Takenaka Corporation | Hand pointing apparatus |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3226271B2 (en) * | 1989-07-27 | 2001-11-05 | オリンパス光学工業株式会社 | Digital electronic still camera |
JPH08131659A (en) * | 1994-11-07 | 1996-05-28 | Hitachi Ltd | Virtual reality generating device |
- 1998-11-06: GB application GB9824339A filed; patent GB2345538B; status: not active (Expired - Fee Related)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4768196B2 (en) | Apparatus and method for pointing a target by image processing without performing three-dimensional modeling | |
US6775014B2 (en) | System and method for determining the location of a target in a room or small area | |
Berman et al. | Sensors for gesture recognition systems | |
US6198485B1 (en) | Method and apparatus for three-dimensional input entry | |
US6697072B2 (en) | Method and system for controlling an avatar using computer vision | |
JP3114813B2 (en) | Information input method | |
Sato et al. | Fast tracking of hands and fingertips in infrared images for augmented desk interface | |
O'Hagan et al. | Visual gesture interfaces for virtual environments | |
Starner et al. | The perceptive workbench: Computer-vision-based gesture tracking, object tracking, and 3D reconstruction for augmented desks | |
Jennings | Robust finger tracking with multiple cameras | |
US20050206610A1 (en) | Computer-"reflected" (avatar) mirror | |
Leibe et al. | Toward spontaneous interaction with the perceptive workbench | |
KR101892735B1 (en) | Apparatus and Method for Intuitive Interaction | |
JPH0844490A (en) | Interface device | |
US20180239428A1 (en) | Remote perception of depth and shape of objects and surfaces | |
KR20000017755A (en) | Method for Acquisition of Data About Motion | |
GB2345538A (en) | Optical tracker | |
Dorfmüller et al. | Real-time hand and head tracking for virtual environments using infrared beacons | |
KR20030037692A (en) | System and Method of Soft Remote Controller Using Hand Pointing Recognition | |
O'Hagan et al. | Visual gesture interfaces for virtual environments | |
KR20190036864A (en) | VR observation telescope, driving method and application for VR observation using the same | |
Chung et al. | Postrack: A low cost real-time motion tracking system for vr application | |
Maidi et al. | Interactive media control using natural interaction-based Kinect | |
Gope et al. | Interaction with Large Screen Display using Fingertip & Virtual Touch Screen | |
Salti et al. | Real-time 3d arm pose estimation from monocular video for enhanced HCI |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
20081229 | 746 | Register noted 'licences of right' (sect. 46/1977) | |
20141106 | PCNP | Patent ceased through non-payment of renewal fee | |