It seems that HIV/AIDS awareness programs and campaigns in the U.S., at least, are almost non-existent these days. New generations are coming up with no real idea of what HIV/AIDS is or how to prevent it. The focus now seems to be on foreign regions such as Africa, where billions are poured in yearly. What I find absurd is that here in the U.S., many people do not have insurance, and many are on state-funded medication programs with long waiting lists just to get antiretrovirals. In addition, many HIV/AIDS programs are losing funding or closing altogether due to lack of funds. There is something wrong with this picture. What are your thoughts?