Health insurance is essentially a plan where you pay regular fees (premiums) to be covered in case you need medical services, which often cost a great deal of money. Do you think it is important to have health insurance? Why or why not? HELP PLZ