first of all,
i guess most people have lied before.
they had their reasons, i guess.
Whether for themselves or for others.
Anyway, here is the main thing today.
Okay, when we were young,
we were told by our parents and teachers not to lie.
Basically most adults told you not to tell lies.
And so some of you did and some did not.
The thing is this,
As we grew up...
you know, you got bigger and met more people.
Suddenly you were told to lie, and that the truth should not be told at times.
People fear things.
adults tell you to be truthful when you are young.
But as you grow up, when you want to tell the truth about certain things,
you are told not to do so.
Instead, you are asked to tell a lie.
Very contradictory, isn't it?
For example, when you are working for your boss.
Although it's his fault at times, you get the blame.
People then tell you not to say it's your boss's fault, as you might get fired.
Your relative did something wrong, and you want to point it out.
However, your parents tell you not to do so, as it will hurt the relative's pride
and might sever the relationship between families.
Sometimes you need to please certain people by lying to them.
Why is it that the things we learned when we were young are somehow different in the real world??
Is it because this world is getting corrupted or our teachings are wrong?
anyway, sometimes i gotta be the "bad guy"
to make certain things happen.
I admit, i lied. what about you?