Why can't people just tell the truth anymore? Is the truth really that bad?
It's so much worse when you find out the truth from other people and realise you've been lied to. However small the lie, being exposed as a liar makes it huge. People who tell you one thing to your face but are blatantly up to something else really fuck me off. I don't know what's worse, the lying or the fact they expect you to believe them.
I'm so disappointed.
Who's been telling porkies? Is this in regard to the 'Drama' of late?