There are currently no laws requiring companies to provide health insurance for their employees. However, a wide variety of state and federal laws regulate employer-provided health plans once a company chooses to offer one.