Workers’ compensation is a mandatory insurance program that provides benefits to employees who are injured on the job. With rare exceptions, every California employer is required either to purchase workers’ compensation insurance or to establish a self-insurance program approved by the State of California.