- dentistry (English)
- A branch of medicine involving the diagnosis, prevention, and treatment of diseases of the teeth, oral cavity, and associated structures.