Dentistry is the branch of medicine focused on the oral cavity. A dentist can help whiten your teeth and protect you from tooth decay. Around 91% of adults in the United States have experienced tooth decay at some point in their lives. That's a very large number of adults who don't brush their teeth, right? Wrong. Tooth decay can happen to someone who brushes their teeth 80 times a day (figuratively speaking, of course). Don't get me wrong: brushing your teeth absolutely helps fight off all the bad tooth decay monsters. The thing is, sometimes it isn't the full solution. That's why it is important to visit a dentist regularly.
Around 5000 BC, it was believed that tooth decay was caused by tooth worms. These worms were thought to enter the mouth while humans slept and eat away at their teeth. It was later discovered that this claim was false.
Aristotle and Hippocrates wrote about dentistry, including how tooth decay came about and possible remedies for it. It is fascinating that they were able to recognize these common medical issues and then determine treatments for them. As time progressed, dentistry developed more thoroughly. Doctors began to specialize in the field and came to be referred to as dentists, a title based on the French and Latin words for tooth. By the 1700s, dentists had begun writing books on their medical practices and the anatomy of the mouth. Nowadays, dentists attend a university and receive a degree before they can begin their careers, studying the same basic concepts that were developed centuries ago.
Why Dentistry is so Important
Dentistry is the backbone of maintaining healthy teeth and fresh breath. Sustaining both is essential in today's society. If you suffer from unhealthy teeth, many people will notice it throughout the day, and it can invite judgment. The best way to avoid this is to see a dental care provider regularly.