Facial Palsy 2021
DOI: 10.1007/978-3-030-50784-8_38
3D, 4D, Mobile APP, VR, AR, and MR Systems in Facial Palsy


Cited by 1 publication (2 citation statements)
References 104 publications
“…Primarily, this analysis provides quantitative criteria that ensure an efficient follow-up for patients with facial pathology, e.g., facial paralysis [5]. Several techniques for assessing facial movement have been developed [6], with a view to quantifying the extent of facial paralysis and facilitating diagnosis and therapy, e.g., plastic or reconstructive surgery. Generally speaking, techniques for assessing facial movement can be categorized as either subjective or objective [7].…”
Section: Introduction
confidence: 99%
“…In practice, analysis through 3D scans [6] can be used for planning future maxillofacial surgery [29], quantifying soft tissue changes [30], and assessing variations in facial mimicry of patients before and after treatment [31]. This class can be further subdivided depending on the sensor: laser-based scanning, stereophotogrammetry, structured-light scanning, or RGB-D (red, green, blue-depth) sensors [32].…”
Section: Introduction
confidence: 99%