Plastic surgery has definitely transitioned from something only the rich and famous did (and then tried to keep secret) to something for the masses that women are proud of.
I’m sure we all know someone who’s either gone under the knife or at the very least indulged in a bit of Botox or fillers.
While I’ve never indulged myself (although I’d get a tummy tuck in a heartbeat!), I love that people are feeling confident enough to do whatever it is that makes them feel better.
Unfortunately, it also seems that as plastic surgery grows in popularity, new beauty ideals that are perhaps not entirely natural are taking hold, and women are now going under the knife to conform to this standard of beauty.
Would you do it?