(Contributed by Kota)
Culture Shock at the Drug Store
I occasionally go to a local drug store. One thing that surprises me is that, although it is a drug store, it sells all sorts of things, such as food and liquor. That alone is interesting, but I noticed something else as well. While I was walking around the store, I heard some in-store commercial announcements: brief messages that tell you the results of medical research.
For example, I think I heard something like this: Japanese people do not get enough eicosapentaenoic acid (EPA) because they do not eat enough fish. It is good to know what recent research has revealed, but here is the thing… After mentioning the scientific finding, the commercial went on to describe the illnesses we could potentially develop. On one hand, I appreciate the information, because it makes me more health conscious. On the other hand, I do not enjoy listening to these announcements, because I feel they are trying to frighten me. At the very least, I feel they are trying to make me worried so that I will buy nutritional supplements and other products.
Have you had such an experience before? Or have you only just realized that drug stores make these kinds of commercial announcements? As far as I can remember, I have never heard announcements like these in North America, Europe, or Oceania. This drug store experience was a culture shock for me. I understand that business owners need to sell more products to make a profit, but I wonder if this is a good way to do it. I think people want to enjoy their lives without worrying too much about their health. What do you think?