I'm just wondering whether we really need to tell the truth to people, especially those closest to us. Doing so would mean our lives are an open book. And what about white lies? There are times when we don't want to hurt someone's feelings, so we tell a white lie instead. What do you think?
Image Credit: rd.com