Wouldn't it be nice if Hollywood did something to help society rather than plague it with perpetual ignorance? Something as simple as requiring all sex scenes to depict the use of condoms: a small, subliminal message conveying safety and responsibility. In a world that relies so heavily on the televised "arts" for its notions of maturity, such a message could do a great deal for our near and distant futures. Imagine the drop in unwanted and teen pregnancies, and with it the reduced dependence on welfare and other government aid; imagine the rise not just in average educational attainment but, over time, in the standard of our education itself. Something so small and inconsequential to Hollywood could be drastic for our standard of living here in the country most of us claim to love so much. Just show an empty wrapper on the nightstand. A brief discussion of precaution before commencement. Any other ideas?