I read so many blogs that complain left and right about the USA. If you
took them to heart, you would think that this is the most evil place on
Earth. After talking with somebody I know who lives outside the country, it is
obvious that the press there portrays the US the same way. This person was under
the impression that nobody here had healthcare and that you would go broke in the
US if you got sick.
My family has been in the US a *long* time. Like, "Mayflower" long.
I grew up in a very small town with barely any opportunities.
However, living in this great country, I knew that I could bend my life
into anything that I wanted to.
My Mom was a stay-at-home mom, and we lived on a low-to-medium
income. We had a comfortable home that was within our means. We never had
flashy cars, fancy vacations, or anything like that, but we lived comfortably.
Through the years, I moved out of that small town because I knew the job
market there wasn't right for what I wanted to do. I moved to a bigger city. I
started working and supporting myself a little over 9 years ago. I
started out making $9.00 an hour. Then, a job opportunity became available at a
start-up company. Though it was a risk for me, I took the job because I
believed in the vision of the company. I took a pay cut down to $7.00 per
hour. I got rid of all expenses that I didn't *need*, and then got to work
making sure that the company grew and succeeded.
Over the years, that risk has paid off. The start-up company employs
many people and has millions of customers. Everyone gets paid a fair wage,
gets health insurance, and the work environment is great. I am also
very happily married, have a great child, and have a comfortable home and
lifestyle. I work full time to maintain that life, and it all makes me
happy.
What made that possible? It's the way America works. It gives
people with a vision the ability to start a business no matter how small it
is. If that person works hard enough, that business will thrive.
America has the land, resources, and opportunities that allow you to work for
and obtain the life you want.
And, no, it's not handed to you. America doesn't mean a "free
ride". It is here to give you the freedom to work for what you want.
It is here to protect you and give you a fair chance. *No* country is
perfect. There is no way to please everyone. But, in my view,
America is as close to perfect as you can get.
My life is comfortable, and I work in a place that I love and
appreciate. I know its roots and I never forget its beginnings. I
take the same view with America: learn its roots, don't forget its beginnings,
and try to appreciate the way it currently stands.