2009 8th IEEE International Symposium on Mixed and Augmented Reality
DOI: 10.1109/ismar.2009.5336472

Streaming mobile augmented reality on mobile phones

Abstract: Continuous recognition and tracking of objects in live video captured on a mobile device enables real-time user interaction. We demonstrate a streaming mobile augmented reality system with 1 second latency. User interest is automatically inferred from camera movements, so the user never has to press a button. Our system is used to identify and track book and CD covers in real time on a phone's viewfinder. Efficient motion estimation is performed at 30 frames per second on a phone, while fast search through a d…
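The button-free interaction the abstract describes hinges on detecting, from frame-to-frame motion, when the user holds the camera still over an object. Below is a minimal sketch of that idea, not the authors' implementation: the mean-absolute-difference metric, the threshold values, and the class name are all illustrative assumptions.

```python
import numpy as np

def motion_magnitude(prev, curr):
    """Mean absolute difference between two grayscale frames (a crude
    stand-in for the paper's efficient 30 fps motion estimation)."""
    return float(np.mean(np.abs(curr.astype(np.int16) - prev.astype(np.int16))))

class InterestDetector:
    """Signals a recognition query once the camera holds still for `hold` frames."""
    def __init__(self, threshold=2.0, hold=5):
        self.threshold = threshold  # motion below this counts as "still"
        self.hold = hold            # consecutive still frames before firing
        self.still_frames = 0

    def update(self, prev, curr):
        if motion_magnitude(prev, curr) < self.threshold:
            self.still_frames += 1
        else:
            self.still_frames = 0   # camera moving: user not dwelling yet
        if self.still_frames >= self.hold:
            self.still_frames = 0   # fire once, then re-arm
            return True             # inferred interest: issue a recognition query
        return False
```

A real pipeline would feed this from the viewfinder stream and rate-limit queries so the 1-second recognition latency is hidden behind continuous local tracking.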

Cited by 46 publications (22 citation statements)
References 8 publications
“…A vocabulary tree with K = 10^6 leaf nodes is built from randomly chosen database descriptors, and then the corresponding inverted index is constructed. To test image matching performance, we use a set of 1000 query images exhibiting challenging photometric and geometric distortions [22]. In Sec.…”
Section: Results
Confidence: 99%
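The pipeline that citation describes — quantize local descriptors to visual words, then retrieve candidates through an inverted index — can be sketched with a toy flat vocabulary. The cited system uses a hierarchical tree with K = 10^6 leaves; the vocabulary size, descriptor dimension, and function names below are illustrative assumptions.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)
vocab = rng.normal(size=(32, 8))  # toy flat vocabulary: 32 visual words, 8-D descriptors

def quantize(desc):
    """Map each local descriptor to its nearest visual word (brute force here;
    a vocabulary tree does this with O(log K) comparisons per descriptor)."""
    d2 = ((desc[:, None, :] - vocab[None, :, :]) ** 2).sum(axis=-1)
    return d2.argmin(axis=1)

def build_index(database):
    """database: {image_id: (n, 8) descriptor array} -> {word: {image_id: count}}."""
    index = defaultdict(lambda: defaultdict(int))
    for img_id, desc in database.items():
        for w in quantize(desc):
            index[w][img_id] += 1
    return index

def best_match(index, query_desc):
    """Vote for database images that share visual words with the query."""
    votes = defaultdict(int)
    for w in quantize(query_desc):
        for img_id, cnt in index[w].items():
            votes[img_id] += cnt
    return max(votes, key=votes.get) if votes else None
```

The inverted index is what makes the search fast: only database images sharing at least one visual word with the query are ever touched, instead of comparing raw descriptors against every image.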
“…Usually these applications require a client/server architecture where computationally intensive image processing and classification are carried out on backend servers (e.g. [Lim et al., 2007], [Chen et al., 2009]). One example is the study by Takacs et al. [Takacs et al., 2008] on augmented reality for mobile phones, where camera-phone images are processed on the phone to be matched against a large database of location-tagged images on a back-end server. Sometimes picture frames are used directly with no further processing (e.g.…”
Section: Camera
Confidence: 99%
“…Therefore, the image of each page in the database is stored with 1024 pixels in the longer edge. Compared for example to the system of Chen et al. [2] with images of 320 × 240 pixels, the images in our database are an order of magnitude larger. Especially for the pages full of text, the number of local features is up to 7000 in one page.…”
Section: Feature Description and Database Setup
Confidence: 99%
“…Images or video captured by the mobile phone can be analyzed to recognize the object [2] or scene [11,3] appearing in the recording. Compared with searching by text in a browser, taking a picture of an interesting object or scene by the mobile phone and then sending it to the server can be a much more convenient method to get extra information about the objects and scenes.…”
Section: Introduction
Confidence: 99%