When did America work best in your opinion and why?

The 1980s, maybe. I was only a child then, but it seemed like a fairly peaceful time. Most people seemed to have jobs, most kids I knew were brought up with good values, racism wasn't as prevalent, and schools weren't as bad. But people have always been evil. I get sick of people talking about how peaceful the '50s and '60s were. Bad stuff happened; you just didn't hear about it all the time like you do now. As far as my own life is concerned, the '90s up through 2004 were the best.