Originally posted by King Kandy
It's my opinion.
And your opinion doesn't agree with the current laws of this country, and hopefully never will.
Originally posted by King Kandy
Employees are often forced into taking jobs they don't want by the job market.
Employees are forced into taking jobs they don't want due to a lack of education, determination, and willpower. You're not forced into doing anything.
Originally posted by King Kandy
Actually, many times they can't.
See above; this is their own fault, nobody else's.
Originally posted by King Kandy
Yes, the laws allow people to totally rip you off. I didn't realize that the law was always right.
Yeah, what a travesty it is that a company pays an employee to work for them.
Someone just got a job working for Wal-Mart... WHAT?! They aren't going to buy them a house?! And a new car?! You want this employee to use THEIR OWN MONEY? Madness!
Originally posted by King Kandy
If people got laid off by a tanking company, they deserve to die.
Strawman.
Originally posted by Symmetric Chaos
Actually they're required to uphold anything they put in a contract or promise to provide . . .
Yeah... exactly.
The employer and employee agree on a contract. If that contract does not include benefits, and the employee wants benefits... tough luck, go look someplace else.